SQL Azure vs SQL Server 2016

Here is a list of some of the differences between SQL Azure (Azure SQL Database) and SQL Server 2016.


Each difference below is listed with the SQL Azure behavior first, followed by the SQL Server 2016 behavior.



Backups

In SQL Azure, backups are scheduled automatically; it does not support conventional BACKUP and RESTORE statements.

SQL Server supports conventional BACKUP and RESTORE statements.


Recovery Models

Recovery models are not available in SQL Azure. However, Azure SQL Database offers the following capabilities for recovering from an outage:

· Active geo-replication

· Geo-restore

SQL Server supports three different recovery models:

· Simple

· Full

· Bulk logged


SQL Server Agent

SQL Azure does not support SQL Server Agent but Azure Automation can be used for simplifying cloud management through process automation.

SQL Server Agent is a component of SQL Server which schedules jobs and handles other automated tasks. It runs as a Windows service so it can start automatically when the system boots or it can be started manually.



FileStream

Due to the nature of the PaaS service, SQL Azure does not support FileStream.

SQL Server has support for FileStream.


Table Partitioning

Table partitioning is allowed in Azure SQL Database. However, because of the PaaS nature of the product, you cannot allocate partitions to different filegroups; you have to create the partition scheme on PRIMARY, and the storage is taken care of by the Azure SQL database itself. You still benefit from partition elimination.

SQL Server has full support for table partitioning, and you can create and allocate partitions to different filegroups.


Linked Servers

In SQL Azure you cannot create or access linked servers.

In SQL Server you can create a linked server and access data from another SQL Server.


Default Transaction Isolation Level

The default transaction isolation level in SQL Azure is READ_COMMITTED_SNAPSHOT.

The default transaction isolation level in SQL Server is READ_COMMITTED.


Global Temporary Tables


In SQL Azure you can create only local temporary tables. It does not support global temporary tables.

SQL Server has support for both local temporary tables and global temporary tables.


Stored Procedures

SQL Azure:

· Extended SP = No

· User-defined SP = Yes

· System SP = Some

SQL Server 2016:

· Extended SP = Yes

· User-defined SP = Yes

· System SP = Yes


SQL Trace and Profiler

In SQL Azure, SQL Trace and Profiler cannot be used to trace events. Currently there is no direct alternative other than using DMVs, monitoring through the Azure portal, and Extended Events.

In SQL Server you can use SQL Trace and Profiler to trace events.



SQL CLR

In SQL Azure, the Common Language Runtime (SQL CLR) was initially supported and then removed due to security concerns.

SQL Server still has support for SQL CLR.


USE Statement

The USE [DatabaseName] statement for switching the database context is not supported in SQL Azure.

Supported in SQL Server 2016.


Log Shipping

SQL Azure does not support log shipping.

SQL Server supports log shipping.


Resource Governor

This feature is not available in Azure SQL Database.

Azure SQL Database comes with different service tiers, each suitable for different workloads. You should first evaluate which performance tier your application workload fits in, and provision the database for that performance tier accordingly.

Supported in SQL Server. Resource Governor allows you to throttle or limit resources (CPU, memory, I/O) for different SQL Server workloads.


Attach Database

This feature is not supported in SQL Azure.

Supported in SQL Server.


Database Mail

This feature is not supported in SQL Azure.

Supported in SQL Server.


Data Quality Services

Not Supported in SQL Azure

This feature has been available since SQL Server 2012.


Event Notifications

Event notifications are not supported in SQL Azure. However, one of the easiest ways to gain automated insight into what is going on in your Azure SQL databases is to set up alerts on your databases.

Supported in SQL Server.


Master Data Services

This feature is not supported in SQL Azure

It is supported in SQL Server.


Analysis Services (SSAS)

It is now supported in the cloud as Azure Analysis Services.

It is supported in SQL Server.


Integration Services

It is not supported directly, but Microsoft provides the SQL Server Integration Services (SSIS) Feature Pack for Azure.

It is supported in SQL Server.


Failover Clustering

Not supported in SQL Azure. However, alternative options are available: failover groups and active geo-replication.

It is supported in SQL Server.



Auditing

Server-level auditing is not supported in SQL Azure, but you can use Azure SQL Database auditing, which tracks database events and writes them to an audit log in your Azure storage account.

It is supported in SQL Server.


Server Configuration Settings

In SQL Azure you cannot manage server configuration settings; you only have rights to manage database settings.

Server configuration settings can be managed in SQL Server.


Azure Logic Apps – August 2017 Update

  1. Azure Event Grid

Azure Event Grid is the event-based routing-as-a-service offering from Microsoft that aligns with their “Serverless” strategy. Azure Event Grid simplifies event consumption by making it a “push” mechanism rather than a “pull” mechanism: you can simply listen and react to events from different Azure services and other sources without having to constantly poll each service looking for events.

The best examples of where you can use Azure Event Grid are getting notified automatically when any user makes a modification in the production subscription, or handling telemetry events from multiple IoT devices.
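To make the push model concrete, here is a minimal sketch (in Python, with illustrative payloads) of a webhook handler that completes Event Grid's subscription-validation handshake and then reacts to resource events:

```python
import json

SUBSCRIPTION_VALIDATION = "Microsoft.EventGrid.SubscriptionValidationEvent"

def handle_event_grid_post(body: str) -> dict:
    """Handle one Event Grid webhook delivery (a JSON array of events)."""
    events = json.loads(body)
    for event in events:
        if event["eventType"] == SUBSCRIPTION_VALIDATION:
            # Echo the validation code back so Event Grid trusts this endpoint.
            return {"validationResponse": event["data"]["validationCode"]}
        # Any other event: react to it (here we just record what happened where).
        print(f"resource event: {event['eventType']} on {event['subject']}")
    return {}
```

The handler never polls anything; Event Grid delivers the events to it as they happen, which is exactly the inversion the service provides.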

Azure Event Grid Connectors for Logic Apps

At present, there is an Azure Event Grid connector with only one trigger: “When a resource event occurs”. You can use this connector to trigger a Logic App whenever a resource event occurs.



  2. Custom HTML and CSV headers

If you have an array of data, you can easily convert the information into a CSV document or HTML table by using the “Create CSV Table” action. Later, you can pick up this CSV table and easily embed it in an email.
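The shape of that transformation is easy to sketch outside Logic Apps; for example, a small Python illustration of the concept (not the connector itself):

```python
import csv
import io

def create_csv_table(rows):
    """Turn an array of objects into a CSV document, like the 'Create CSV Table' action."""
    if not rows:
        return ""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=list(rows[0].keys()), lineterminator="\n")
    writer.writeheader()   # header row comes from the object's property names
    writer.writerows(rows)
    return buffer.getvalue()

print(create_csv_table([{"name": "Contoso", "region": "West US"},
                        {"name": "Fabrikam", "region": "East US"}]))
```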




  3. Enable Log Analytics at creation

There is now an easier way to enable Log Analytics: toggle the status while creating the Logic App. You no longer need to go to the Diagnostics section to enable Log Analytics.

Link to more details on how you can integrate Azure Logic Apps and Log Analytics:




  4. OMS Workspace Dashboard

Create a global dashboard for all the Logic Apps available under your subscription. View the status of each Logic App, the number of runs, and additional details.



  5. Peek at code view

Say you are working with Logic Apps and you add a connector. You can now easily switch between the code view and the designer view by clicking “Peek code” from the Options drop-down.

Note: At present, Peek code is available only in read-only mode. If you need to edit the code directly from here, you can send the Logic Apps team feedback on Twitter or through UserVoice.


  6. Advanced scheduling in the Logic Apps Designer

There are new options to schedule Logic App execution on a daily and weekly basis. This was available in the code view, but now you get this experience right in the designer. A monthly option will be rolled out soon!

In the Schedule trigger, you will notice that when you click on Week, there are a few advanced options available for defining when you want the trigger to execute during the week. Say you want your trigger to execute every Monday, Wednesday, and Friday at 9:35 AM, 1:35 PM, and 5:35 PM; the screenshot below depicts this example. The preview section displays the actual Logic App trigger condition based on the previous selections.
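In the underlying workflow definition, those selections correspond to a Recurrence trigger roughly like the following (a sketch of the trigger JSON; the schedule values match the example above):

```json
"triggers": {
    "Recurrence": {
        "type": "Recurrence",
        "recurrence": {
            "frequency": "Week",
            "interval": 1,
            "schedule": {
                "weekDays": [ "Monday", "Wednesday", "Friday" ],
                "hours": [ 9, 13, 17 ],
                "minutes": [ 35 ]
            }
        }
    }
}
```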



New Connectors

  • Azure table storage
  • Azure Event Grid
  • Azure Log analytics
  • Azure SQL Data Warehouse
  • Microsoft StaffHub
  • MySQL (R/W)
  • ServiceNow (East US 2 region)
  • Amazon Redshift
  • DocFusion 365


What’s in progress?

As usual, there is another long list of features that the Logic Apps team is currently working on, which should be available in the coming weeks.

  1. Concurrency Control (code-view live)

Say your Logic App is executing faster than you want it to. In this case, you can make Logic Apps slow down by restricting the number of runs executing in parallel. This is possible today in the code view, where you can specify that, say, only 10 runs can execute in parallel at a particular time. When 10 runs are executing in parallel, the trigger stops polling until one of those runs finishes, and then starts polling for data again.

NOTE: This works with polling triggers (not with request triggers, such as the Twitter connector) without SplitOn enabled.
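In the code view, the limit is set through the trigger's runtime configuration; a sketch of what that looks like (trigger name and connection details are illustrative):

```json
"triggers": {
    "When_a_message_arrives": {
        "type": "ApiConnection",
        "runtimeConfiguration": {
            "concurrency": {
                "runs": 10
            }
        }
    }
}
```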

  2. Custom Connectors

Get your own connector within your subscription so that it shows up in the connector list. This is currently in private preview and should be available to the public in September.

  3. Large Files

Ability to move large files, up to 1 GB, between specific connectors (Blob, FTP). This is almost ready for release!

  4. SOAP

Native SOAP support to consume cloud and on-premises SOAP services. This is one of the most requested features on UserVoice.

  5. Variables (code-view live)

Append capability to aggregate data within loops. AppendToArray will be shipped soon, and AppendToString will come in the next few weeks.


  6. Expression IntelliSense

This functionality will go live on August 25th. When you type an expression, you will see the same IntelliSense view that you see when typing in Visual Studio.

  7. Expression Tracing

You will be able to see the intermediate values of complex expressions.

  8. Foreach nesting in the designer

This capability will be incorporated into the designer in the coming weeks.

  9. Foreach failure navigation

If there are 1,000 iterations in a foreach loop and 10 of them failed, instead of having to look for which ones actually failed, you can easily navigate to the next failed action inside the foreach loop to see what happened.

  10. Functions + Swagger

Azure Functions annotated with Swagger can be rendered automatically. This functionality will go live by the end of August.

  11. Publish Logic Apps to PowerApps and Flow in an easy way

Many users who use PowerApps want to talk to Logic Apps. This will make it a lot easier to call a Logic App; previously you had to use some strange mechanisms to light up that scenario.

  12. Time-based batching

One of those hidden features everyone should know about is that you can create batches in a Logic App. With the new functionality being added, a batch will also be flushed automatically after the time interval you specify, even if it is not yet full.

For example, flush the batch every 10 minutes.





New features in Logic Apps – May 2017

Visual Studio 2017 Tools

We now have Logic App designer support and tooling in Visual Studio 2017.

Download from Cloud Explorer

You can browse live Logic Apps and their run history using Cloud Explorer.

If you go to Cloud Explorer and select a Logic App run, you can download the full template version of it using the Download button in Visual Studio, and then deploy the app directly.

Parallel creation in designer

The designer now supports parallel creation. You can click the plus button in the designer and add a parallel action. This feature was available in the code view before; now there is full support in the Logic App designer.


Select and Join Actions

There are more actions you can perform on arrays. Select lets you project an array of objects: if each object has properties A, B, C, and D, you can select A and B and get a new array whose objects have only properties A and B. Join takes an array of strings or objects and, given a delimiter, concatenates the entries into a single string. This is very useful with one of the new connectors, the Computer Vision service (one of the Cognitive Services): its describe-image operation returns all the tags for an image, for example “dog” or “building”. Using Select and Join, you can filter the tags and turn them into a string that you can send in an email.
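The semantics of the two actions are easy to picture with a small sketch (Python, illustrating the behavior rather than the connectors themselves):

```python
def select(items, *props):
    """Like the Select action: project each object down to the chosen properties."""
    return [{p: item[p] for p in props} for item in items]

def join(items, delimiter):
    """Like the Join action: concatenate entries into a single string."""
    return delimiter.join(str(item) for item in items)

# Tags as a vision service might return them: objects with extra properties.
tags = [{"name": "dog", "confidence": 0.98}, {"name": "building", "confidence": 0.87}]
names = [t["name"] for t in select(tags, "name")]
print(join(names, ", "))  # → dog, building
```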

Retry information in History

What is happening in the background? How long did it take? What errors occurred in the intermediate calls? If you go to update something in MS Dynamics and MS Dynamics is down for a sub-second interval, instead of simply failing your update, Logic Apps will retry it according to the retry policies; it will retry up to three times. Now you can actually see what happened behind the scenes.

Service Bus Sessions

Service Bus sessions allow you to do things like correlation and sequential convoys, for all the BizTalk historians out there. Now you can do that in Logic Apps using sessions.

A sequential convoy is the ability to have in-order, correlated messages locked to a particular Logic App instance, so that no other Logic App instance can receive messages correlated to that context. The set of correlated messages is called a convoy, and all of them go to one instance. When a Logic App locks the convoy (using the session), it continues processing all the messages related to that context.



Run navigation when looking at history

When you look at the Run History blade and open one of the entries, you get a compact view where you can monitor the run in the designer and click through all the items you have filtered.

B2B OMS Message Download

From OMS you can find your run instances and download them, and see all the input and output messages and their payloads.




Variables

A set of variable actions is now available. You can put any value you want in a variable and reference that variable anywhere in the Logic App.

Release Notes on Blade

Now, right from the management portal, you can click on Release Notes and see a blade with the history of all Logic App releases, to find what new features have been added.

Run After Configuration in Designer

You can configure an action to determine when it will run. If you look at the code view, “run after succeeded” is the default setting, but sometimes you want exception handling, so now you can have a branch that runs after success and a branch that runs after failure, and react differently in each. In the designer, you click the “…” on an action and choose after which outcomes of the previous step it should run. There is some styling too: when an action runs after a failure, the arrow looks different, so you can tell which branch is catching an exception.


Azure BizTalk Rules Alternatives


All preview API Apps, including the BizTalk Rules V1 API App, were deprecated in Azure on January 18th, 2017. The migration path is to redeploy a V1 API App as an App Service, but this is not possible for the built-in BizTalk Rules API App.

Here are a few alternatives to the BizTalk Rules V1 API App:

Option 1: On-Premises BizTalk Rules Engine

We can use the BizTalk Business Rules Engine (BRE) to define rules in the Rules Editor, deploy them to a BizTalk Server installation, and write a Web API app that executes the policy through code; we then access the Web API app from Azure through a Hybrid Connection. Logic Apps does not support Hybrid Connections yet, so we have to create a relay Web App in Azure which takes HTTP requests from the Logic App and passes them on to the on-premises rules Web API.


Logic App –> App Service | Azure Hosted Relay Web API [Relay request and Cache Rules Result] –> Hybrid Connection –> On-Premises Web API –> BizTalk BRE Call


With this approach, we have access to the full capabilities of the BizTalk Rules Engine to create and execute complex rules, and to store and retrieve facts from an on-premises SQL Server database. However, we need a BizTalk 2010+ installation on an on-premises server or an Azure VM.


Option 2: CodeEffects Rules Engine

The CodeEffects rules engine provides a web-based rule editing and testing platform. We can choose this option if we don’t have a BizTalk Server installation but still want a UI-based rule editor and complex rule processing.

Logic App –> App Service | Azure Hosted Relay Web API [Relay request and Cache Rules Result] –> Hybrid Connection –> Code Effects Rule Engine

Option 3: Azure Functions as Rules Engine


Azure Functions is a solution for easily running small pieces of code in the cloud. The code can be edited and deployed within the Azure portal. The idea is to use Azure Functions to define and execute the rules logic; the functions are invoked over an HTTP call from Logic Apps or Web Apps.


Logic App | Web App –> HTTP call to Azure Function –> Azure Function implementation
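A minimal sketch of what “rules as code” might look like inside such a function (plain Python, with a made-up discount rule; in practice this body would sit behind the Azure Function's HTTP trigger):

```python
def evaluate_rules(order: dict) -> dict:
    """Apply simple business rules to an order and return the decision."""
    discount = 0.0
    if order.get("total", 0) >= 1000:
        discount = 0.10                     # rule: large orders get 10% off
    if order.get("customer_tier") == "gold":
        discount = max(discount, 0.15)      # rule: gold customers get at least 15%
    return {"discount": discount, "approved": order.get("total", 0) > 0}

print(evaluate_rules({"total": 1200, "customer_tier": "gold"}))
# → {'discount': 0.15, 'approved': True}
```

The trade-off versus the other two options is clear from the sketch: rule changes require a code edit and redeploy, but there is no engine to license or server to maintain.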



Comparing the three options on rules authoring, rules complexity, and dependencies:

1. BizTalk Rules Engine: UI-based rules authoring, supports complex rules; requires a BizTalk installation.

2. CodeEffects: UI-based rules authoring, supports complex rules; requires a CodeEffects rules license (a free version is available).

3. Azure Functions: rules are authored as custom code.


Azure Service Bus: Securing message content

When working with web services published over the Internet, it is immensely important to secure the data sent to and received from them. To provide data security at the transport level, we can use Secure Sockets Layer (SSL)/TLS. SSL provides encryption and signing of the contents of the packets sent over TCP, so that data is transmitted securely over the network.
But this is not enough when the data being sent is sensitive and no one except the receiver should be able to read it. We also need to consider data archiving or tracking at intermediate storage, and the security of in-flight messages between the two systems. In such scenarios, it is very important to have message-level security along with transport-level security (SSL) in order to keep the data safe and secure.

To implement message-level security, the data should be encrypted before being sent over the wire to the web service, and decrypted when it reaches the destination. RSA and AES are among the most commonly used modern cryptosystems.

AES (Advanced Encryption Standard) is a symmetric cryptosystem widely used to encrypt large amounts of data. In our implementation, AES makes use of a password and a salt to encrypt and decrypt data.

RSA, by contrast, is an asymmetric cryptographic algorithm: two keys are generated, a public key used to encrypt the data and a private key used to decrypt it. Unlike AES, RSA has a maximum data size limitation. With a 2048-bit key, a single RSA operation works on a 256-byte block, and the data plus padding must fit within it. This is why we used RSA to encrypt only small pieces of data, i.e. the password and the salt, and used the AES algorithm to encrypt the larger data payload.

Using only RSA would have a negative impact on overall performance. As stated earlier, RSA can only encrypt and decrypt data of a limited size, so we would have to partition the data into chunks, each not exceeding the 256-byte maximum including the padding, encrypt each chunk, and then chain the encrypted chunks together in their original sequence. The same strategy would need to be applied in reverse at decryption time, at the cost of further performance degradation. All this would add huge overhead to the overall complexity and performance of encryption and decryption.
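The arithmetic makes the overhead concrete. Assuming a 2048-bit key with PKCS#1 v1.5 padding (11 bytes of overhead per block, leaving 245 bytes of plaintext per operation), a sketch of the chunk count for a given payload:

```python
import math

KEY_BYTES = 2048 // 8                          # one RSA block is 256 bytes
PADDING_OVERHEAD = 11                          # PKCS#1 v1.5 padding, assumed here
MAX_PLAINTEXT = KEY_BYTES - PADDING_OVERHEAD   # 245 bytes of data per operation

def rsa_chunks_needed(payload_bytes: int) -> int:
    """Number of RSA operations needed to encrypt a payload of this size."""
    return math.ceil(payload_bytes / MAX_PLAINTEXT)

# A modest 1 MB payload would need thousands of RSA operations...
print(rsa_chunks_needed(1_000_000))  # → 4082
# ...while an AES password plus salt fits in a single one.
print(rsa_chunks_needed(48))         # → 1
```

This is exactly why the hybrid approach wins: one RSA operation for the key material, and fast symmetric AES for everything else.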

We therefore used RSA to encrypt and decrypt the small pieces of data (the password and salt used in AES encryption), and AES encryption and decryption for the data payload.


In our scenario, a producer console application picks up an HL7 eligibility EDI file stored in a local folder, encrypts the data, and posts it to a Logic App hosted in Azure.

The Logic App then stores the encrypted data in a Blob storage for archiving and sends the encrypted data to the Service Bus queue.

The consumer is an on-premises web application which picks up the data from the Service Bus queue, decrypts it on premises, and displays it on the screen.


We created a self-signed certificate and generated a .cer file that holds the public key, used for encryption, and a .pfx file that stores both the private and public keys, used for decryption.

The producer console application used the public key to encrypt, and the consumer web application used the private key to decrypt.

Implementation Steps

1. We used the AES cryptosystem to encrypt the data, using a randomly generated password and a salt.

2. Then we encrypted the randomly generated password and the salt using the RSA cryptosystem, with the public key stored in a self-signed public certificate (.cer file).

3. After all three pieces of information are encrypted, a JSON message is created to send the encrypted content, encrypted password, and encrypted salt to the Logic App. The data is Base64 encoded before being put into the JSON message.
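A sketch of building that message in Python (the field names are illustrative, not the exact ones used in the demo):

```python
import base64
import json

def build_envelope(ciphertext: bytes, enc_password: bytes, enc_salt: bytes) -> str:
    """Package the three encrypted pieces as a Base64-encoded JSON message."""
    b64 = lambda data: base64.b64encode(data).decode("ascii")
    return json.dumps({
        "encryptedData": b64(ciphertext),        # AES-encrypted payload
        "encryptedPassword": b64(enc_password),  # RSA-encrypted AES password
        "encryptedSalt": b64(enc_salt),          # RSA-encrypted AES salt
    })

envelope = build_envelope(b"\x01\x02", b"\x03", b"\x04")
```

The consumer reverses this: parse the JSON, Base64-decode each field, RSA-decrypt the password and salt, then AES-decrypt the payload.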







4. Once the message is received at the Logic App, it is stored in Blob storage and sent to a Service Bus queue.


5. A consumer web app gets the encrypted message from the Service Bus queue and decrypts the encrypted password and salt using the private key stored in the .pfx certificate, which carries the private key along with the public key. Then, using the decrypted password and salt, it decrypts the AES-encrypted data and displays it on the screen.


How to Generate Self Signed certificate

There are multiple ways to generate a self-signed certificate; we used IIS Manager to generate one for our demo.

1. Open IIS Manager


2. Scroll down to the IIS pane and double-click the ‘Server Certificates’ icon


3. A new window will open; click Create Self-Signed Certificate on the right, under the ‘Actions’ pane


4. Give the new certificate any name and create it


5. Select the newly created certificate and click Export in the right pane


6. Set a password and press OK. This password is required when you load the certificate in code or import it into the certificate store.


Secure Azure Service Bus Relays with SAS Token

This article explains how to secure Service Bus relays using Shared Access Signatures (SAS) to prevent unauthorized or anonymous access.

Shared Access Signature

Shared Access Signatures are based on SHA-256 secure hashes of URIs. In Azure, all Service Bus services provide this authentication mechanism to control access to a resource, which may be a Service Bus relay, a messaging queue, or a topic. The SAS is composed of two components:

1. A shared access policy.

2. A Shared Access Signature, which is also called a token.

To secure and access a Service Bus relay endpoint, we first need to create a Service Bus relay namespace in the Azure portal. After the namespace has been created, create a new policy under it. We created a new Service Bus namespace and a new policy named RelayPolicy, as shown in the picture below.


Note: we will use the policy name and the primary key to generate a SAS token (Shared Access Signature) in the console application we are going to create shortly.

Create a console application in C# to generate SAS token

Now we will create a C# script to generate a SAS token.

Create a console application in Visual Studio and name it whatever you like.

Replace the code in the Program.cs class with the following code. Note that the primary key and the policy name will vary; you need to put your own policy name and primary key here.

using System;
using System.Globalization;
using System.Security.Cryptography;
using System.Text;
using System.Web; // add a reference to System.Web for HttpUtility

class Program
{
    static void Main(string[] args)
    {
        var strAuthorizationHeader = GenerateToken("https://Dev-Relays.servicebus.windows.net/",
            "RelayPolicy", "*********************=");
        Console.WriteLine(strAuthorizationHeader);
    }

    public static string GenerateToken(string resourceUri, string sasKeyName, string sasKey)
    {
        // set the token lifespan
        TimeSpan sinceEpoch = DateTime.UtcNow - new DateTime(1970, 1, 1);
        var expiry = Convert.ToString((int)sinceEpoch.TotalSeconds + 3600); // 1 hour

        // sign the URL-encoded resource URI and expiry with the policy's key
        string stringToSign = HttpUtility.UrlEncode(resourceUri) + "\n" + expiry;
        HMACSHA256 hmac = new HMACSHA256(Encoding.UTF8.GetBytes(sasKey));
        var signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

        // format the SAS token
        var sasToken = String.Format(CultureInfo.InvariantCulture,
            "SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}",
            HttpUtility.UrlEncode(resourceUri), HttpUtility.UrlEncode(signature), expiry, sasKeyName);

        return sasToken;
    }
}



We will use this token in the Logic App to send HTTP requests to the relay service endpoint.

Now open the Logic App, go to the HTTP POST action, and paste the SAS token string as the value of the Authorization HTTP header.
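In the code view, that HTTP action would look roughly like this (a sketch; the URI and the token values are placeholders):

```json
"HTTP": {
    "type": "Http",
    "inputs": {
        "method": "POST",
        "uri": "https://Dev-Relays.servicebus.windows.net/myservice",
        "headers": {
            "Authorization": "SharedAccessSignature sr=...&sig=...&se=...&skn=RelayPolicy"
        },
        "body": "@triggerBody()"
    }
}
```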


Configurations for BasicHttp Relay transport properties

In the BizTalk BasicHttpRelay adapter, set the transport properties as follows.

1. Set the Security mode to Transport.

2. Set the Relay Client Authentication Type to RelayAccessToken.

The following picture shows the configuration settings of the BasicHttpRelay transport properties.