New features in Logic Apps – May 2017

Visual Studio 2017 Tools

Logic Apps designer support and tooling are now available in Visual Studio 2017.

Download from Cloud Explorer

You can browse live logic apps and their run history using Cloud Explorer.

If you go to Cloud Explorer and open a logic app run, you can download the full template version of the logic app using the Download button in Visual Studio, and then deploy the app directly.

Parallel creation in designer

A new Logic Apps feature is parallel branch creation in the designer. You can click the plus button in the designer and add a parallel action. This was previously available only in code view; now it is fully supported in the Logic Apps designer.


Select and Join actions

There are now more actions for working with arrays. Select lets you project an array of objects: if each object has properties A, B, C and D, you can select just A and B and get a new array containing only those properties. Join takes an array of strings or objects and, given a delimiter, concatenates the entries into a single string. This is very useful with one of the new connectors, the Computer Vision service (one of the Cognitive Services): its describe-image operation returns all the tags for an image, for example that it contains a dog, a building and so on. With Select and Join you can filter those tags and turn them into a single string that you can send in an email.
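For readers who think in C#, the combination is roughly the same filter-project-join flow you would write with LINQ. The snippet below is only an analogy, with a made-up Tag type and confidence values; it is not the Logic Apps workflow definition itself.

using System;
using System.Linq;

// Hypothetical Tag type standing in for the objects a describe-image call returns.
class Tag
{
    public string Name { get; set; }
    public double Confidence { get; set; }
}

class SelectJoinAnalogy
{
    static void Main()
    {
        var tags = new[]
        {
            new Tag { Name = "dog", Confidence = 0.98 },
            new Tag { Name = "building", Confidence = 0.85 },
            new Tag { Name = "grass", Confidence = 0.40 }
        };

        // "Select": project each object down to just the property you care about.
        var names = tags.Where(t => t.Confidence > 0.5)   // "Filter array"
                        .Select(t => t.Name);             // "Select"

        // "Join": turn the array into a single delimited string.
        string summary = string.Join(", ", names);

        Console.WriteLine(summary); // dog, building
    }
}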

Retry information in History

What happened in the background? How long did it take? What errors came back from the intermediate calls? If you update something in MS Dynamics and Dynamics is down for a moment, instead of your update simply failing, Logic Apps retries it according to the retry policy, for example three times. Now you can see in the run history exactly what happened during those retries.

Service Bus Sessions

Service Bus sessions allow you to do things like correlation and sequential convoys, for all the BizTalk veterans out there. You can now do this in Logic Apps using the sequential convoy pattern.

A sequential convoy is the ability to have in-order, correlated messages locked to a particular logic app instance so that no other instance can receive messages correlated to that context. The set of correlated messages is called a convoy, and they all go to one instance. When a logic app locks the convoy's session, it continues to process all the messages that belong to that context.
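To make the convoy idea concrete, here is a minimal C# sketch using the Microsoft.ServiceBus.Messaging SDK of that era. The connection string, queue name and session id are placeholders, and the queue is assumed to have been created with "Requires Session" enabled.

using System;
using Microsoft.ServiceBus.Messaging; // WindowsAzure.ServiceBus NuGet package

class ConvoySketch
{
    static void Main()
    {
        const string connectionString = "<service-bus-connection-string>"; // placeholder
        const string queuePath = "orders";                                 // placeholder, sessions enabled

        var client = QueueClient.CreateFromConnectionString(connectionString, queuePath);

        // Sender side: every message stamped with the same SessionId belongs to the
        // same convoy and will be delivered to whoever holds that session lock.
        client.Send(new BrokeredMessage("line item 1") { SessionId = "order-123" });
        client.Send(new BrokeredMessage("line item 2") { SessionId = "order-123" });

        // Receiver side: accepting the session locks the whole convoy to this receiver,
        // which is what a session-aware logic app relies on.
        MessageSession session = client.AcceptMessageSession("order-123");
        BrokeredMessage message;
        while ((message = session.Receive(TimeSpan.FromSeconds(5))) != null)
        {
            Console.WriteLine(message.GetBody<string>());
            message.Complete();
        }
        session.Close();
    }
}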

clip_image003

clip_image004

Run navigation when looking at history

When you look at the run history blade and open one of the entries, you get a compact view where you can monitor the run in the designer and click through each of the items you have filtered.

B2B OMS Message Download

In OMS (Log Analytics) you can find your B2B run instances, download them, and see all the input and output messages along with their payloads.

clip_image008

clip_image010

Variables-Set

The Set Variable action is now available: you can put any value you want into a variable and reference that variable anywhere in the logic app.

Release Notes on Blade

Right from the management portal, you can click Release Notes and see a blade with the history of all Logic Apps releases and the new features that have been added.

Run After Configuration in Designer

You can configure when an action runs relative to the previous step. In code view, "run after succeeded" is the default, but sometimes you want exception handling, so you can now have one branch that runs after success and another that runs after failure, and react differently in each. In the designer, click the "…" on an action and choose which results of the previous step it should run after. There is some styling too: when an action runs after a failure, the connecting arrow looks different, so you can tell at a glance which branch is catching the exception.


Azure BizTalk Rules Alternatives

 

All preview API Apps, including the BizTalk Rules V1 API App, have been deprecated in Azure since January 18th, 2017. The migration path is to redeploy a V1 API App as an App Service, but this is not possible for the built-in BizTalk Rules API App.

Here are a few alternatives to the BizTalk Rules V1 API App:

Option 1: On-Premises BizTalk Rules Engine

We can use the BizTalk Business Rules Engine (BRE) to define rules in the Rules Editor, deploy them to a BizTalk Server installation, and write a Web API app that executes the policy through code; we then access that Web API app from Azure through a Hybrid Connection. Logic Apps does not support Hybrid Connections yet, so we have to create a relay Web App in Azure that takes HTTP requests from the Logic App and passes them on to the on-premises rules Web API.

 

Logic App –> App Service | Azure Hosted Relay Web API [Relay request and Cache Rules Result] –> Hybrid Connection –> On-Premises Web API –> BizTalk BRE Call

 

With this approach, we have access to the full capabilities of the BizTalk Rules Engine to create and execute complex rules, and to store and retrieve facts from an on-premises SQL Server database. However, we need a BizTalk 2010+ installation on an on-premises server or an Azure VM.
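As a rough illustration of the on-premises piece, here is a minimal ASP.NET Web API controller that executes a BRE policy through the Microsoft.RuleEngine API. The policy name ("DiscountPolicy"), the Order fact class and the request/response contracts are hypothetical placeholders for whatever your rules are actually written against.

using System.Web.Http;       // ASP.NET Web API 2
using Microsoft.RuleEngine;  // referenced from the BizTalk Server installation

// Hypothetical request/response contracts for the relay.
public class RuleRequest  { public decimal OrderTotal { get; set; } }
public class RuleResponse { public decimal Discount { get; set; } }

// Hypothetical .NET fact class that the deployed policy reasons over.
public class Order
{
    public decimal Total { get; set; }
    public decimal Discount { get; set; }
}

public class RulesController : ApiController
{
    [HttpPost]
    public RuleResponse Execute(RuleRequest request)
    {
        var fact = new Order { Total = request.OrderTotal };

        // "DiscountPolicy" stands in for a policy deployed to the BRE rule store.
        var policy = new Policy("DiscountPolicy");
        try
        {
            policy.Execute(fact); // the engine evaluates the rules and updates the fact
        }
        finally
        {
            policy.Dispose();
        }

        return new RuleResponse { Discount = fact.Discount };
    }
}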

 

Option 2: CodeEffects Rules Engine

CodeEffects Rules Engine provides a web-based rule editing and testing platform. We can choose this option if we don't have a BizTalk Server installation but still want a UI-based rule editor and complex rules processing.

Logic App –> App Service | Azure Hosted Relay Web API [Relay request and Cache Rules Result] –> Hybrid Connection –> Code Effects Rule Engine

Option 3: Azure Functions as Rules Engine

 

Azure Functions is a solution for easily running small pieces of code in the cloud; the code can be edited and deployed within the Azure portal. The idea is to use Azure Functions to define and execute the rules logic, with the functions invoked over an HTTP call from Logic Apps or Web Apps.

 

Logic App | Web App –> HTTP call to Azure Function –> Azure Function implementation
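Here is a minimal sketch of what such a function could look like, in the portal-editable C# script (run.csx) style of that time. The discount rule, the Order class and its field names are purely illustrative.

// run.csx - HTTP-triggered C# script function edited in the Azure portal.
using System.Net;

// Illustrative message shape; real rule inputs would differ.
public class Order
{
    public decimal Total { get; set; }
    public decimal Discount { get; set; }
}

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    Order order = await req.Content.ReadAsAsync<Order>();

    // The "rules" live here as plain code; changing a rule means editing and redeploying the function.
    order.Discount = order.Total > 1000m ? order.Total * 0.10m : 0m;

    log.Info($"Applied discount {order.Discount} to order total {order.Total}");
    return req.CreateResponse(HttpStatusCode.OK, order);
}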

 

Option/Feature | Rules Authoring | Rules Complexity | Dependency
1: BizTalk Rules Engine | UI | Complex | BizTalk installation
2: CodeEffects | UI | Complex | CodeEffects Rules license / free version
3: Azure Functions | Code | Custom code | None


Azure Service Bus: Securing message content

When working with web services published over the Internet, it is immensely important to secure the data sent to and received from those services. To provide security at the transport level, we can use SSL/TLS: Secure Sockets Layer encrypts and signs the contents of the packets sent over TCP, so data is transmitted securely at the network layer.
But this is not enough when the data is sensitive and no one except the receiver should be able to read it. We also need to consider data archiving or tracking at intermediate storage, and the security of in-flight messages between the two systems. In such scenarios, it is very important to have message-level security alongside transport-level security (SSL) to keep the data safe and secure.

To implement message-level security, the data should be encrypted before it is sent over the wire to the web service and decrypted when it reaches the destination. RSA and AES are the most commonly used modern cryptosystems.

AES (Advanced Encryption Standard) is a symmetric cryptosystem widely used to encrypt large amounts of data. In our solution, AES uses a password and a salt to derive the key that encrypts and decrypts the data.

RSA, on the other hand, is an asymmetric algorithm: two keys are generated, a public key used to encrypt the data and a private key used to decrypt it. Unlike AES, RSA has a maximum data size limitation: with a 2048-bit key, it can encrypt at most 256 bytes (less the padding overhead). This is why we used RSA to encrypt only small pieces of data, namely the password and the salt, and AES for the larger data payload.

Using only RSA would have had a negative impact on overall performance. As stated earlier, RSA can only encrypt and decrypt data of limited size, so we would have to partition the data into chunks no larger than the 256-byte maximum (including padding), encrypt each chunk, and then chain the encrypted chunks back together in their original order. The same strategy would be needed at decryption time, at the cost of further performance degradation. All of this would add considerable complexity and overhead to encryption and decryption.

So we used RSA to encrypt and decrypt the small pieces of data, the password and salt used in the AES encryption, and AES to encrypt and decrypt the data payload itself.
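Here is a minimal C# sketch of the producer-side hybrid scheme. The file paths, key size, iteration count, padding choice and the use of Rfc2898DeriveBytes to turn the password and salt into an AES key and IV are assumptions made for illustration, not the exact demo code.

using System;
using System.IO;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;
using System.Text;

class HybridEncryptionSketch
{
    static void Main()
    {
        string plainText = File.ReadAllText(@"C:\demo\eligibility.edi"); // placeholder path

        // 1. Random password and salt for the AES step.
        byte[] password = new byte[32];
        byte[] salt = new byte[16];
        using (var rng = RandomNumberGenerator.Create())
        {
            rng.GetBytes(password);
            rng.GetBytes(salt);
        }

        // 2. AES-encrypt the payload with a key/IV derived from the password and salt.
        byte[] cipherText;
        using (var derive = new Rfc2898DeriveBytes(password, salt, 10000))
        using (var aes = Aes.Create())
        {
            aes.Key = derive.GetBytes(32);
            aes.IV = derive.GetBytes(16);
            using (var encryptor = aes.CreateEncryptor())
            {
                byte[] data = Encoding.UTF8.GetBytes(plainText);
                cipherText = encryptor.TransformFinalBlock(data, 0, data.Length);
            }
        }

        // 3. RSA-encrypt the small secrets with the public key from the .cer file,
        //    so only the holder of the matching .pfx can recover them.
        var publicCert = new X509Certificate2(@"C:\demo\demo.cer"); // placeholder path
        using (RSA rsa = publicCert.GetRSAPublicKey())
        {
            byte[] encryptedPassword = rsa.Encrypt(password, RSAEncryptionPadding.OaepSHA1);
            byte[] encryptedSalt = rsa.Encrypt(salt, RSAEncryptionPadding.OaepSHA1);

            // These three Base64 strings become the JSON message shown in the steps below.
            Console.WriteLine(Convert.ToBase64String(cipherText));
            Console.WriteLine(Convert.ToBase64String(encryptedPassword));
            Console.WriteLine(Convert.ToBase64String(encryptedSalt));
        }
    }
}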

Scenario

In our scenario, a producer console application picks up an HL7 eligibility EDI file stored in a local folder, encrypts the data and posts it to a Logic App hosted in Azure.

The Logic App then stores the encrypted data in Blob storage for archiving and sends it to a Service Bus queue.

The consumer is an on-premises web application that picks the data up from the Service Bus queue, decrypts it on premises and displays it on the screen.

clip_image001

We created a self-signed certificate and generated a .cer file that holds the public key used for encryption, and a .pfx file that stores both the private and public keys and is used for decryption.

The producer console application uses the public key to encrypt, and the consumer web application uses the private key to decrypt.

Implementation Steps

1. We used the AES cryptosystem to encrypt the data with a randomly generated password and salt.

2. Then we encrypted the randomly generated password and the salt using the RSA cryptosystem with the public key stored in a self-signed public certificate (.cer file).

3. With all three pieces of information encrypted, a JSON message is created containing the encrypted content, the encrypted password and the encrypted salt to send to the Logic App. The data is Base64-encoded before being put into the JSON message.

Example:

{

"content":"dz921W/64+ArEyrlE/2OI5CdAVvKXFlnIF43Y+zkrgftKiffLpy0a9Ei2YG4JGRqcH4I3wSI7H1vuC91sLHCgbyM0bIpR7G….JzL",

"password":"OwQQfWcCsuNNYo…==",

"salt":"GnmIBuy1rdK9cILpUc…1w=="

}

4. Once the message is received by the Logic App, it is stored in Blob storage and sent to a Service Bus queue.

clip_image003

5. The consumer web app gets the encrypted message from the Service Bus queue and decrypts the encrypted password and salt using the private key stored in the .pfx certificate (which carries the private key along with the public key). Then, using the decrypted password and salt, it decrypts the AES-encrypted data and displays it on the screen.
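A matching consumer-side sketch is below; the certificate path, .pfx password, padding and key-derivation settings are again assumptions and must mirror whatever the producer actually used.

using System;
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;
using System.Text;

class HybridDecryptionSketch
{
    static string Decrypt(string contentB64, string passwordB64, string saltB64)
    {
        // 1. Recover the password and salt with the private key from the .pfx file.
        var pfx = new X509Certificate2(@"C:\demo\demo.pfx", "<pfx-password>"); // placeholders
        byte[] password, salt;
        using (RSA rsa = pfx.GetRSAPrivateKey())
        {
            password = rsa.Decrypt(Convert.FromBase64String(passwordB64), RSAEncryptionPadding.OaepSHA1);
            salt = rsa.Decrypt(Convert.FromBase64String(saltB64), RSAEncryptionPadding.OaepSHA1);
        }

        // 2. Re-derive the AES key/IV and decrypt the payload.
        using (var derive = new Rfc2898DeriveBytes(password, salt, 10000))
        using (var aes = Aes.Create())
        {
            aes.Key = derive.GetBytes(32);
            aes.IV = derive.GetBytes(16);
            using (var decryptor = aes.CreateDecryptor())
            {
                byte[] cipher = Convert.FromBase64String(contentB64);
                return Encoding.UTF8.GetString(decryptor.TransformFinalBlock(cipher, 0, cipher.Length));
            }
        }
    }
}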

clip_image005

How to Generate a Self-Signed Certificate

There are multiple ways to generate a self-signed certificate; we used IIS Manager to generate one for our demo.

1. Open IIS Manager

clip_image007

2. Scroll down to the IIS pane and double-click the 'Server Certificates' icon

clip_image009

3. In the window that opens, click Create Self-Signed Certificate on the right, under the 'Actions' pane

clip_image011

4. Give the new certificate any name and create it

clip_image013

5. Select the newly created certificate and click Export in the right pane

clip_image015

6. Set a password and press OK. This password is required when you load the certificate in code or import it into the certificate store.


Secure Azure Service Bus Relays with SAS Token

This article explains how to secure Service Bus relays using a Shared Access Signature (SAS) to prevent unauthorized or anonymous access.

Shared Access Signature

Shared Access Signatures are based on SHA-256 secure hashes computed over the resource URI. In Azure, all Service Bus services provide this authentication mechanism to control access to a resource, whether that is a Service Bus relay, a messaging queue or a topic. A SAS is composed of two components:

1. A Shared Access Policy.

2. A Shared Access Signature which is also called a token.

To secure and access a Service Bus relay endpoint, we first need to create a Service Bus relay namespace in the Azure portal. After the namespace has been created, create a new policy under it. We created a new Service Bus namespace and a new policy named RelayPolicy, as shown in the picture below.

clip_image002

Note: we will use the policy name and the primary key to generate a SAS token (Shared Access Signature) in the console application which we are going to create shortly.

Create a console application in C# to generate SAS token

Now we will write the C# code to generate a SAS token.

Create a console application in Visual Studio and name it whatever you like.

Replace the code in Program.cs with the following. Note that the policy name and primary key will differ for your namespace; put your own values here.

using System;
using System.Globalization;
using System.Security.Cryptography;
using System.Text;
using System.Web; // add a reference to System.Web for HttpUtility

class Program
{
    static void Main(string[] args)
    {
        var strAuthorizationHeader = GenerateToken("https://Dev-Relays.servicebus.windows.net/",
            "RelayPolicy", "*********************=");
    }

    public static string GenerateToken(string resourceUri, string sasKeyName, string sasKey)
    {
        // set the token lifespan
        TimeSpan sinceEpoch = DateTime.UtcNow - new DateTime(1970, 1, 1);
        var expiry = Convert.ToString((int)sinceEpoch.TotalSeconds + 3600); // 1 hour

        string stringToSign = HttpUtility.UrlEncode(resourceUri) + "\n" + expiry;
        HMACSHA256 hmac = new HMACSHA256(Encoding.UTF8.GetBytes(sasKey));
        var signature = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));

        // format the SAS token
        var sasToken = String.Format(CultureInfo.InvariantCulture,
            "SharedAccessSignature sr={0}&sig={1}&se={2}&skn={3}",
            HttpUtility.UrlEncode(resourceUri), HttpUtility.UrlEncode(signature), expiry, sasKeyName);

        return sasToken;
    }
}

 

We will use this token in the Logic App to send HTTP requests to the relay service endpoint.

Now open the Logic App, go to the HTTP POST action and paste the SAS token string as the value of the Authorization HTTP header.
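If you want to test the relay endpoint outside the Logic App, a small C# client can pass the same token. The endpoint URL below is a placeholder, and the token value stands in for whatever GenerateToken from the console application above returned.

using System;
using System.Net.Http;
using System.Threading.Tasks;

class RelayClientSketch
{
    static void Main()
    {
        CallRelayAsync().GetAwaiter().GetResult();
    }

    static async Task CallRelayAsync()
    {
        // Token produced by the GenerateToken method shown earlier (placeholder value here).
        string sasToken = "SharedAccessSignature sr=...&sig=...&se=...&skn=RelayPolicy";

        using (var client = new HttpClient())
        {
            // The relay expects the raw SAS string in the Authorization header.
            client.DefaultRequestHeaders.TryAddWithoutValidation("Authorization", sasToken);

            var response = await client.PostAsync(
                "https://Dev-Relays.servicebus.windows.net/MyRelayEndpoint", // placeholder endpoint
                new StringContent("{ \"ping\": true }"));

            Console.WriteLine(response.StatusCode);
        }
    }
}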

clip_image004

Configurations for BasicHttp Relay transport properties

In the BizTalk BasicHttpRelay adapter, set the transport properties as follows.

1. Set the Security mode to Transport.

2. Set Relay Client Authentication Type as Relay Access Token.

The following picture shows the configuration settings for the BasicHttpRelay transport properties.

clip_image006

 

 

BizTalk: Start EDI batches through SQL Script

Here is a script to start all EDI batches for a given sender and receiver party. The script inserts PAM control messages into the BizTalkMgmtDb database, which triggers the EDI batching orchestration in BizTalk.

DECLARE @i int = 47 --start batch Id
WHILE @i <= 80 --end batch id
BEGIN
exec edi_PAMBatchingLogDelete @BatchId=@i,@IgnorePendingControlMessages=0
SET @i = @i + 1
END


INSERT INTO [BizTalkMgmtDb].[dbo].[PAM_Control]
([EdiMessageType]
,[ActionType]
,[ActionDateTime]
,[UsedOnce]
,[BatchId]
,[BatchName]
,[SenderPartyName]
,[ReceiverPartyName]
,[AgreementName])
SELECT 0,
'EdiBatchActivate',
GetDate() as 'ActionDateTime',
0 as 'UsedOnce',
bd.Id,
bd.[Name],
[SenderPartyName],
[ReceiverPartyName],
a.Name
FROM [BizTalkMgmtDb].[tpm].[BatchDescription] bd
join [BizTalkMgmtDb].[tpm].Agreement a on bd.OnewayAgreementId = a.ReceiverOnewayAgreementId


Could not load file or assembly ‘Microsoft.BizTalk.Interop.SSOClient, Version=7.0.2300.0

SSO Error

Unexpected exception occurred while configuring [BizTalk EDI/AS2 Runtime].

——————————
ADDITIONAL INFORMATION:

Could not load file or assembly ‘Microsoft.BizTalk.Interop.SSOClient, Version=7.0.2300.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35’ or one of its dependencies. The system cannot find the file specified. (EDIAS2Config)

FIX

Install SSOClient from [BizTalkServer2013 installation media]\BT Server\Platform\SSO\Client


BizTalk EDI: Creating batching configuration in code

For outbound EDI batching, we have to create a batch configuration in the party configuration and set filter criteria that determine which messages make up each batch. But what if the filter criteria are dynamic, or there are too many batches to configure by hand?

Here is a code snippet that creates a batch configuration through the TPM API:

var builder = new SqlConnectionStringBuilder(
    "Data Source=BTSSQLServer;Initial Catalog=BizTalkMgmtDb;Integrated Security=SSPI;");

using (var tmpCtx = TpmContext.Create(builder))
{
    var agreement = (from p in tmpCtx.Agreements where p.Name == AgreementName select p).FirstOrDefault();
    if (agreement != null) // add the new batch configuration to the existing agreement
    {
        string Sender = ConfigurationManager.AppSettings["Sender"] ?? "SenderPartyName";
        string Receiver = ConfigurationManager.AppSettings["Receiver"] ?? "ReceiverPartyName";

        var OneWayAgreement = agreement.GetOnewayAgreement(Sender, Receiver);
        var firstBatch = OneWayAgreement.GetBatches()[0]; // first batch config is used as a template
        var newBatch = OneWayAgreement.CreateBatch(BatchName);

        FilterPredicate predicate = firstBatch.GetFilterPredicate();
        predicate.Groups[0].Statements[0].Value = "FilterRHS";
        newBatch.SetFilterPredicate(predicate);
        newBatch.SetReleaseCriteria(firstBatch.GetReleaseCriteria());

        tmpCtx.AddToBatchDescriptions(newBatch);
        tmpCtx.SaveChanges();

        // this SQL insert is required to activate the batch configuration
        using (var connection = new SqlConnection(builder.ConnectionString))
        using (var cmd = new SqlCommand(@"INSERT INTO [dbo].[PAM_Control]
                   ([EdiMessageType], [ActionType], [ActionDateTime], [UsedOnce],
                    [BatchId], [BatchName], [SenderPartyName], [ReceiverPartyName], [AgreementName])
            SELECT 0 as EdiMessageType
                  ,'EdiBatchActivate' as ActionType
                  ,GetDate() as ActionDateTime
                  ,0 as UsedOnce
                  ," + newBatch.Id + @" as BatchId
                  ,'" + BatchName + @"' as BatchName
                  ,'" + Sender + @"', '" + Receiver + @"', '" + AgreementName + "'", connection))
        {
            connection.Open();
            cmd.ExecuteNonQuery();
        }
    }
}