New features in Logic Apps – May 2017

Visual Studio 2017 Tools

Logic app designer support and tooling are now available in Visual Studio 2017.

Download from Cloud Explorer

You can browse live logic apps and their run history using Cloud Explorer.

If you go to Cloud Explorer and open a logic app, you can download the full template version of it using the Download button in Visual Studio, and then deploy your app directly.

Parallel creation in designer

Parallel branch creation in the designer is another new logic app feature. You can click the plus button on the designer and add a parallel action. This was available in code view before; now there is full support in the logic app designer.


Select and join Action

There are more actions you can perform on arrays. Select lets you project an array: if you have an array of objects with properties A, B, C, and D, you can select just A and B and get a new array containing only those properties. Join lets you take an array of strings or objects and, using a delimiter, combine the entries in that array into a single string.

These are very useful with one of the new connectors, the Computer Vision API, which is one of the Cognitive Services. Its operation that describes an image gives you all the tags for it; for example, that the image has a dog, it has a building, and so on. With Select and Join you can filter those tags and turn them into a string that you can send in an email.
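In code view, a Select followed by a Join might look like the sketch below. This is a hypothetical fragment of a workflow definition: the action names and the shape of the Computer Vision tags array are assumptions for illustration, not taken from a real app.

```json
"Select_tag_names": {
  "type": "Select",
  "inputs": {
    "from": "@body('Describe_Image')?['tags']",
    "select": "@item()?['name']"
  },
  "runAfter": { "Describe_Image": [ "Succeeded" ] }
},
"Join_tag_names": {
  "type": "Join",
  "inputs": {
    "from": "@body('Select_tag_names')",
    "joinWith": ", "
  },
  "runAfter": { "Select_tag_names": [ "Succeeded" ] }
}
```

The output of the Join action is a single comma-separated string that can be dropped straight into an email body.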

Retry information in History

What is happening in the background? How long did it take? What errors are you getting in the intermediate calls? If you go to update something in MS Dynamics and MS Dynamics is down for a few seconds, then instead of your update simply failing, we will retry it according to the retry policy; the call is retried up to three times. Now you can actually see in run history what happened behind the scenes for each attempt.

Service Bus Sessions

Service Bus sessions allow you to do things like correlation and sequential convoys, for all the BizTalk historians out there. Now you can do that in Logic Apps using sessions.

A sequential convoy is the ability to have in-order, correlated messages locked to a particular logic app instance, so that no other logic app instance can get a message correlated to that context. The set of correlated messages is called a convoy, and all of them go to one instance. When a logic app locks the convoy, which uses the session, it continues to process all the messages related to that context.



Run navigation when looking at history

When you open one of the entries in the Run History blade, you now get a compact view: you can monitor the run in the designer and click through each of the items in the run, including those you have filtered.

B2B OMS Message Download

With OMS (Operations Management Suite), you can find your run instances, download them, and see all the input and output messages and their payloads.




Variables

A set of variable actions is now available. You can put any value you want in a variable and reference that variable anywhere in the logic app.

Release Notes on Blade

Now, right from the management portal, you can click Release Notes and see a blade with the history of all Logic Apps releases and find what new features have been added.

Run After Configuration in Designer

You can configure an action to determine when it will run. In code view, run after Succeeded is the default setting, but sometimes you want exception handling, so now you can have a branch that runs after success and a branch that runs after failure, and react to the different outcomes. In the designer you hit the … menu on an action and choose the condition under which it runs after the previous step. There is some styling too: when an action is configured to run after a failure, the arrow is drawn differently, so you can tell which branch is catching an exception.
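In code view, this is expressed with the runAfter property on each action. A hypothetical fragment (the action names and URIs are made up for illustration):

```json
"Update_record": {
  "type": "Http",
  "inputs": { "method": "PATCH", "uri": "https://example.com/records/1" },
  "runAfter": { "Get_record": [ "Succeeded" ] }
},
"Send_failure_alert": {
  "type": "Http",
  "inputs": { "method": "POST", "uri": "https://example.com/alerts" },
  "runAfter": { "Update_record": [ "Failed", "TimedOut" ] }
}
```

Here Send_failure_alert only runs when Update_record fails or times out, so it acts as the exception handler for that step.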


Azure BizTalk Rules Alternatives


All preview API Apps, including the BizTalk Rules V1 API App, were deprecated from Azure on Jan 18th, 2017. The migration path is to redeploy a V1 API App as an App Service, but this is not possible for the built-in BizTalk Rules API App.

Here are a few alternatives to the BizTalk Rules V1 API App:

Option 1: On-Premises BizTalk Rules Engine

We can use the BizTalk Business Rules Engine (BRE) to define rules in the Rules Editor, deploy them on a BizTalk Server installation, and write a Web API app that executes the policy through code; we then access that Web API app from Azure through a Hybrid Connection. Logic Apps does not support Hybrid Connections yet, so we have to create a relay Web App in Azure that takes HTTP requests from the logic app and passes them on to the on-premises rules Web API app.


Logic App –> App Service | Azure Hosted Relay Web API [Relay request and Cache Rules Result] –> Hybrid Connection –> On-Premises Web API –> BizTalk BRE Call


With this approach, we have access to the full capabilities of the BizTalk Rules Engine to create and execute complex rules, and to store and retrieve facts from an on-premises SQL Server database. However, we need a BizTalk 2010+ installation on an on-premises server or on an Azure VM.


Option 2: CodeEffects Rules Engine

CodeEffects Rules Engine provides a web-based rules editing and testing platform. We can choose this option if we don't have a BizTalk Server installation but still want a UI-based rule editor and complex rules processing.

Logic App –> App Service | Azure Hosted Relay Web API [Relay request and Cache Rules Result] –> Hybrid Connection –> Code Effects Rule Engine

Option 3: Azure Functions as Rules Engine


Azure Functions is a solution for easily running small pieces of code in the cloud; the code can be edited and deployed within the Azure portal. The idea is to use Azure Functions to define and execute the rules logic, with the functions invoked over an HTTP call from Logic Apps or Web Apps.


Logic App | Web App –> HTTP call to Azure Function –> Azure Function implementation
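As a rough illustration of the idea (not an actual Azure Functions project), the rule logic inside such a function could look like the minimal Python sketch below. The rule set and payload fields are invented for the example.

```python
# Minimal sketch of rules-engine logic as it might run inside an
# HTTP-triggered function: rules are (name, condition, result) triples
# evaluated against an incoming JSON payload.

RULES = [
    # Each rule: name, predicate over the payload, result to merge in.
    ("high_value_order", lambda p: p.get("amount", 0) > 1000, {"approval": "manager"}),
    ("new_customer",     lambda p: p.get("customer_age_days", 0) < 30, {"risk": "high"}),
]

def evaluate_rules(payload):
    """Apply every matching rule and return the merged decision."""
    decision = {"matched": []}
    for name, predicate, result in RULES:
        if predicate(payload):
            decision["matched"].append(name)
            decision.update(result)
    return decision

if __name__ == "__main__":
    print(evaluate_rules({"amount": 2500, "customer_age_days": 10}))
```

A real function would deserialize the HTTP request body into the payload and return the decision as the HTTP response; the sketch only shows the rule evaluation itself.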



Here is a quick comparison of the three options in terms of rules authoring, rules complexity, and what each one requires:

1: BizTalk Rules Engine – requires a BizTalk installation

2: CodeEffects – requires a CodeEffects Rules license (or the free version)

3: Azure Functions – requires custom code


Azure Service Bus: Securing message content

When working with web services published over the Internet, it is immensely important to secure the data sent to and received from them. To provide data security at the transport level, we can use Secure Sockets Layer (SSL)/TLS, which provides encryption and signing of the contents of packets sent over TCP, so that data is securely transmitted at the network layer.
But this is not enough when the data being sent is of a sensitive nature and no one except the receiver should be able to read it. We also need to consider data archiving or tracking at intermediate storage, and the security of in-flight messages between the two systems. In such scenarios, it is very important to have message-level security as well, along with transport-level security (SSL), in order to keep the data safe and secure.

To implement message-level security, the data should be encrypted before being sent over the wire to the web service and decrypted when it reaches the destination. RSA and AES are the most commonly used modern cryptosystems.

AES (Advanced Encryption Standard) is a symmetric cryptosystem widely used to encrypt large amounts of data. AES makes use of a password and a salt to encrypt and decrypt data.

RSA, in contrast, is an asymmetric cryptographic algorithm: two keys are generated, a public key used to encrypt the data and a private key used to decrypt it. Unlike AES, RSA has a maximum data size limitation; the maximum amount of data that can be encrypted using RSA with a 2048-bit key is 256 bytes, including padding. This is the reason we used RSA to encrypt only small pieces of data, namely the password and salt, whereas for the larger data payload we used AES.

Using only RSA would have a negative impact on overall performance. As stated earlier, RSA can only encrypt and decrypt data of limited size, so we would have to partition the data into chunks, each not exceeding the 256-byte maximum (including the padding size), encrypt each chunk, and then chain the encrypted chunks back together in their original sequence. The same strategy would be needed again at decryption time, at the cost of performance degradation. All this would add a huge overhead to the overall complexity and performance of encryption and decryption.

So we used RSA to encrypt and decrypt the small pieces of data, i.e. the password and salt used in AES encryption, and used AES encryption and decryption for the data payload itself.


In our scenario, a producer console application picks up an HL7 Eligibility EDI file stored in a local folder, encrypts the data, and posts it to a Logic App hosted in Azure.

The Logic App then stores the encrypted data in blob storage for archiving and sends it to the Service Bus queue.

The consumer is an on-premises web application that picks the data up from the Service Bus queue, decrypts it on premises, and displays it on the screen.


We created a self-signed certificate and generated a .cer file, which holds the public key used for encryption, and a .pfx file, which stores both the private and public keys and is used for decryption.

The producer console application used the public key to encrypt, and the consumer web application used the private key to decrypt.

Implementation Steps

1. We used the AES cryptosystem to encrypt the data using a randomly generated password and salt.

2. We then encrypted the randomly generated password and the salt using the RSA cryptosystem, with the public key stored in a self-signed public certificate (.cer file).

3. After getting all three pieces of information encrypted, a JSON message is created to send the encrypted content, encrypted password, and encrypted salt to the Logic App. The data is base64 encoded before being put in the JSON message.
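The resulting JSON message could take a shape like the following. The property names here are our own illustration, not a required schema; each value is base64-encoded ciphertext:

```json
{
  "encryptedData": "base64-encoded AES ciphertext of the EDI payload",
  "encryptedPassword": "base64-encoded RSA ciphertext of the AES password",
  "encryptedSalt": "base64-encoded RSA ciphertext of the AES salt"
}
```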







4. Once the message is received at the Logic App, it stores it in blob storage and sends it to a Service Bus queue.


5. A consumer web app gets the encrypted message from the Service Bus queue and decrypts the encrypted password and salt using the private key stored in the .pfx certificate (which carries the private key along with the public key). Then, using the decrypted password and salt, it decrypts the AES-encrypted data and displays it on the screen.


How to Generate a Self-Signed Certificate

There are multiple ways to generate a self-signed certificate, but we used IIS Manager to generate one for our demo.

1. Open IIS Manager.

2. Scroll down to the IIS pane and double-click the 'Server Certificates' icon.

3. A new window will open; click Create Self-Signed Certificate on the right, under the 'Actions' pane.

4. Give the new certificate any name and create it.

5. Select the newly created certificate and click Export in the right pane.

6. Set a password and press OK. This password is required when you load the certificate in code or import it into the certificate store.


Secure Azure Service Bus Relays with SAS Token

This article explains how to secure Service Bus Relays using Shared Access Signatures (SAS) to prevent unauthorized/anonymous access.

Shared Access Signature

Shared Access Signatures are based on SHA-256 secure hashes of URIs. In Azure, all Service Bus services provide this authentication mechanism to control access to a resource, which may be a Service Bus relay, a Service Bus messaging queue, or a topic. SAS authentication is composed of two components:

1. A Shared Access Policy.

2. A Shared Access Signature, which is also called a token.

To secure and access a Service Bus Relay endpoint, first we need to create a Service Bus Relay namespace in the Azure portal. After the namespace has been created, create a new policy under it. We created a new Service Bus namespace and a new policy named RelayPolicy, as shown in the picture below.


Note: we will use the policy name and the primary key to generate a SAS token (Shared Access Signature) in the console application we are going to create shortly.

Create a console application in C# to generate SAS token

Now we will write the C# code to generate a SAS token.

Create a console application in Visual Studio and name it whatever you like.

Replace the code in the Program.cs class with the following code. Note that the policy name and primary key may vary; you need to put your own policy name and primary key here.

static void Main(string[] args)
{
    // replace the resource URI with your own relay endpoint URI
    var strAuthorizationHeader = GenerateToken("https://",
        "RelayPolicy", "*********************=");
}

public static string GenerateToken(string resourceUri, string sasKeyName, string sasKey)
{
    // set the token lifespan
    TimeSpan sinceEpoch = DateTime.UtcNow - new DateTime(1970, 1, 1);
    var expiry = Convert.ToString((int)sinceEpoch.TotalSeconds + 3600); // 1 hour

    string stringToSign = HttpUtility.UrlEncode(resourceUri) + "\n" + expiry;
    HMACSHA256 hmac = new HMACSHA256(Encoding.UTF8.GetBytes(sasKey));
    var signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

    // format the SAS token
    var sasToken = String.Format(CultureInfo.InvariantCulture,
        "SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}",
        HttpUtility.UrlEncode(resourceUri), HttpUtility.UrlEncode(signature), expiry, sasKeyName);

    return sasToken;
}
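For quick experimentation, the same token-generation logic can be sketched in Python using only the standard library. The namespace URI and key below are placeholders, not real values:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(resource_uri, sas_key_name, sas_key, lifetime_seconds=3600):
    """Build a Service Bus SAS token: HMAC-SHA256 over the encoded URI and expiry."""
    expiry = str(int(time.time()) + lifetime_seconds)
    string_to_sign = urllib.parse.quote_plus(resource_uri) + "\n" + expiry
    signature = base64.b64encode(
        hmac.new(sas_key.encode("utf-8"),
                 string_to_sign.encode("utf-8"),
                 hashlib.sha256).digest()
    ).decode("utf-8")
    return "SharedAccessSignature sr={}&sig={}&se={}&skn={}".format(
        urllib.parse.quote_plus(resource_uri),
        urllib.parse.quote_plus(signature),
        expiry,
        sas_key_name,
    )

if __name__ == "__main__":
    print(generate_sas_token("https://mynamespace.servicebus.windows.net/myrelay",
                             "RelayPolicy", "placeholder-key="))
```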



We will use this token in the logic app to send HTTP request to the Relay Service endpoint.

Now open the logic app, go to the HTTP action, and paste the SAS token string as the value of the Authorization HTTP header.


Configurations for BasicHttp Relay transport properties

In the BizTalk BasicHttpRelay adapter, set the transport properties as follows:

1. Set the Security mode to Transport.

2. Set the Relay Client Authentication Type to RelayAccessToken.

The following picture shows the configuration settings of the BasicHttpRelay transport properties.