Tim Murphy's .NET Software Architecture Blog

Azure Functions Visual Studio 2017 Development Aug 10


The development tools and processes for Azure Functions are ever-changing.  We started out only being able to create a function through the portal, which I did a series on.  We then got a template in VS2015, but it really didn't work very well.  Since then it has become possible to create functions as Web Application libraries, and now we are close to the release of a VS2017 template.

This post will walk through the basics of using the VS2017 Preview with the Visual Studio Tools For Azure Functions which you can download here.

Create New Project

To create the initial solution open up the New Project dialog and find the Azure Function project type, name your project and click OK.


Create New Function

To add a function to your project, right-click the project and select New Item.  In the New Item dialog select Azure Function and provide a name for the class and click Add. 


The next dialog to appear is the New Azure Function dialog.  Here you will select the function trigger type and its parameters.  In the example below a timer trigger has been selected and a cron schedule definition is automatically supplied that executes every 5 minutes.

Also in this dialog you can set the name of the function.  When you compile, a folder with that name will be created in your bin directory, which will be used later for deployment.

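For reference, here is a minimal sketch of roughly what the generated timer-triggered function looks like; the class and function names are placeholders and the cron expression matches the every-five-minute schedule mentioned above.

    using System;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Host;

    public static class PocTimerFunction
    {
        // "0 */5 * * * *" is a six-field cron expression that fires every five minutes
        [FunctionName("PocTimerFunction")]
        public static void Run([TimerTrigger("0 */5 * * * *")] TimerInfo myTimer, TraceWriter log)
        {
            log.Info($"PocTimerFunction executed at: {DateTime.Now}");
        }
    }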

Add Bindings

With each generation of Azure Function development the way you initially define bindings changes (even if they stay the same behind the scenes).  Initially you had to use the portal Integrate page.  This had its advantages.  It would visually prompt you for the type of binding and the parameters for that binding.

With the Visual Studio template you have to add attributes to the Run method of your function class.  This requires that you know what the attribute names are and what parameters are available and their proper values.  You can find a list of the main binding attributes here.

At compile time the attributes will be used to generate a function.json file with your trigger and bindings definition.
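As a hedged example (the queue name and blob path below are made up), a queue-triggered function with a blob output binding could be declared like this, and at compile time these attributes produce the corresponding function.json entries.

    using System.IO;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Host;

    public static class ProcessOrder
    {
        [FunctionName("ProcessOrder")]
        public static void Run(
            [QueueTrigger("incoming-orders")] string queueItem,                  // trigger binding
            [Blob("orders/{rand-guid}.json", FileAccess.Write)] out string blob, // output binding
            TraceWriter log)
        {
            log.Info($"Processing order message: {queueItem}");
            blob = queueItem; // write the raw message to blob storage
        }
    }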

Add NuGet Packages

If you are building functions in the portal you have to create a project.json file that defines the packages you want to include.  This requires that you know the format of the file.  Thankfully, with the Visual Studio template you can use the normal NuGet Package Manager.

Deploying

There are a couple of ways to deploy your solution.  In the end a Function App is a specialized App Service.  This means you have the same deployment options: Visual Studio, PowerShell or VSTS continuous deployment.  The main difference is that you don't have a web.config file and have to manage your app settings and connection strings through the portal.  These can be reached by following the Application Settings link under the Configured Features section of the Function App Overview page.


Summary

While creating Azure Functions still isn't a WYSIWYG, turnkey process, the latest incarnation gives us an ALM-capable solution.  I believe this is the development approach that will stabilize for the foreseeable future, and anyone who is creating Functions should invest in learning it.

Query Application Insights REST API To Create Custom Notifications Aug 04


Application Insights is one of those tools that has been around for a number of years now, but is finally getting understood as more companies move to Azure as a cloud solution.  It has become an amazing tool for monitoring the performance of your application, but it can also work as a general logging platform as I have posted before.

Now that you are capturing all this information, how can you leverage it?  Going to the Azure portal whenever you want an answer is time consuming.  It would be great if you could automate this process.  Of course there are a number of metrics that you can create alerts for directly via the portal, but what if you want a non-standard metric or want to do something besides just send an alert?

Fortunately Microsoft has a REST API in beta for Application Insights.  It allows you to check standard metrics as well as run custom queries as you do in the Analytics portal.  Let’s explore how to use this API.

In this post I will show how to create a demo that implements an Azure Function which calls the Application Insights REST API and then sends the results out using SendGrid.  I created it with the VS2017 Preview and the new Azure Functions templates.

Generate Custom Events

First we need some data to work with.  The simplest way is to leverage the TrackEvent and TrackException methods of the Application Insights API.  In order to do this you first need to set up a TelemetryClient.  I have the code below as part of the class-level variables.

        private static string appInsightsKey = System.Environment.GetEnvironmentVariable("AppInsightKey", EnvironmentVariableTarget.Process);
        private static TelemetryClient telemetry = new TelemetryClient();
        private static string key = TelemetryConfiguration.Active.InstrumentationKey = appInsightsKey;

After that it is simple to call the TrackEvent method on the TelemetryClient object to log an activity in your code (be aware it may take 5 minutes for an event to show up in Application Insights).

            telemetry.TrackEvent($"This is a POC event");

Create a VS2017 Function Application

I will have another post on the details in the future, but if you have Visual Studio 2017 Preview 15.3.0 installed you will be able to create an Azure Functions project.


Right-click the project, select the New Item context menu option, and choose Azure Function.


On the New Azure Function dialog select TimerTrigger and leave the remaining options as default.


Call Application Insights REST API

Once there are events in the customEvents collection we can write a query and execute it against the Application Insights REST API.  To accomplish this the example uses a simple HttpClient call.  The API page for Application Insights can be found here and contains the URLs and formats for each call type.  We will be using the Query API scenario, which will be set up with a couple of variables.

        private const string URL = "https://api.applicationinsights.io/beta/apps/{0}/query?query={1}";
        private const string query = "customEvents | where timestamp >= ago(20m) and name contains \"This is a POC event\" | count";

The call to the service is a common pattern using the HttpClient as shown below.  Add this to the Run method of your new function.

            HttpClient client = new HttpClient();
            client.DefaultRequestHeaders.Accept.Add(
                new MediaTypeWithQualityHeaderValue("application/json"));
            client.DefaultRequestHeaders.Add("x-api-key", appInsightsApiKey);
            var req = string.Format(URL, appInsightsId, query);
            HttpResponseMessage response = client.GetAsync(req).Result;

Process Results

After we have a result we can deserialize the JSON using JSON.NET and send it to our support team via SendGrid.  You will have to add the NuGet package Microsoft.Azure.WebJobs.Extensions.SendGrid.

Modify the signature of your function’s Run method to match the code sample shown here.  In this example “message” is defined as an output variable for the Azure Function which is defined as a binding by using the SendGrid attribute. 

        public static void Run([TimerTrigger("0 */15 * * * *")]TimerInfo myTimer, TraceWriter log, [SendGrid(ApiKey = "SendGridApiKey")]out Mail message)

We will also need a structure to deserialize the returned JSON message into. If you look at the message itself it can appear rather daunting but it breaks down into the following class structure.  Create a new class file and replace the default class with this code.

    public class Column
    {
        public string ColumnName { get; set; }
        public string DataType { get; set; }
        public string ColumnType { get; set; }
    }

    public class Table
    {
        public string TableName { get; set; }
        public List<Column> Columns { get; set; }
        public List<List<object>> Rows { get; set; }
    }

    public class RootObject
    {
        public List<Table> Tables { get; set; }
    }

The last code example below performs the deserialization and creates the SendGrid email message.  Insert this into the Run method after the HttpClient call we previously added.

                string result = response.Content.ReadAsStringAsync().Result;
                log.Info(result);

                RootObject aiResult = JsonConvert.DeserializeObject<RootObject>(result);

                string countString = aiResult.Tables[0].Rows[0][0].ToString();

                string recipientEmail = System.Environment.GetEnvironmentVariable($"recipient", EnvironmentVariableTarget.Process);
                string senderEmail = System.Environment.GetEnvironmentVariable($"sender", EnvironmentVariableTarget.Process);

                var messageContent = new Content("text/html", $"There were {countString} POC records found");

                message = new Mail(new Email(senderEmail), "App Insights POC", new Email(recipientEmail), messageContent);

Publish your solution to an Azure Function App by downloading the Function App's publish profile and using the VS2017 project's publish options.  You will also need to define the application settings referred to in the code so that they are appropriate for your environment.  At that point you will be able to observe the results of your efforts.

Summary

This post demonstrates how a small amount of code can give you the ability to leverage Application Insights for more than just out-of-the-box statistics alerts.  This approach is flexible enough to be used for reporting on types of errors and monitoring whether subsystems remain available.  Combining the features within Azure's cloud offerings gives you capabilities that would cost much more in development time and resources if they were done on premises.

My only real problem with this approach is that I would prefer to access values in the result by name rather than by index, since indexes make the code less readable and more brittle to changes.

Try these examples out and see what other scenarios they apply to in your business.

Logging To Application Insights In Azure Functions Feb 16

In my last post I covered logging in Azure Functions using TraceWriter and log4net.  Both of these work, but Application Insights rolls all your monitoring into one solution, from metrics to tracking messages.  I have also heard a rumor that in the near future this will be an integrated part of Azure Functions.  Given these factors it seems wise to start giving it a closer look.

So how do you take advantage of it right now?  If you go to GitHub there is a sample written by Christopher Anderson, but let me boil this down.  First we need to create an Application Insights instance and grab the instrumentation key.

When I created my Application Insights instance I chose the General application type and the same resource group as my function app.


Once the instance has been allocated you will need to go into the properties blade.  There you will find a GUID for the Instrumentation Key.  Save this off so that we can use it later.

You then need to add the Microsoft.ApplicationInsights NuGet package by creating a project.json file in your function.  Insert the following code in the new file and save it.  If you have your log window open you will see the package being loaded.

 {
   "frameworks": {
     "net46": {
       "dependencies": {
         "Microsoft.ApplicationInsights": "2.1.0"
       }
     }
   }
 }

In the sample code readme it says that you need to add a specific app setting; the important part is that your code reads from the setting you actually define.  Take the Instrumentation Key that you saved earlier and place it in the app settings.  In my case I used one called InsightKey.

Next, set up your TelemetryClient object like the code here by creating global static variables that can be used throughout your application.  After that we are ready to start tracking our function.

 private static TelemetryClient telemetry = new TelemetryClient();   
 private static string key = TelemetryConfiguration.Active.InstrumentationKey = System.Environment.GetEnvironmentVariable("InsightKey", EnvironmentVariableTarget.Process);  

To track an event or an exception simply call the appropriate method.  I prefer to encapsulate them in their own methods where I can standardize the usage.  I have added the function name, method name, and context ID from the function execution to make it easier to search and associate entries.

 private static void TrackEvent(string desc, string methodName)
 {
   telemetry.TrackEvent($"{FunctionName} - {methodName} - {contextId}: {desc}");
 }

 private static void TrackException(Exception ex, string desc, string methodName)
 {
   Dictionary<string,string> properties = new Dictionary<string,string>() {{"Function",FunctionName}, {"Method",methodName}, {"Description",desc}, {"ContextId",contextId}};
   telemetry.TrackException(ex, properties);
 }

Analytics

This isn't an instant-answer type of event store.  There is at least a few minutes' delay between your application logging an event or exception and when it is visible in the Analytics board.

Once you are logging and sending metrics to Application Insights you need to read the results.  From your Application Insights main blade click on the Analytics button at the top of the overview.  It will open a new Analytics page.


Click the new tab button at the top next to the Home Page tab.  This will open a query window. The query language has a similar structure to SQL, but that is about as far as it goes.

The table objects are listed on the left navigation with the fields listed as you expand out each table.  Fortunately intellisense works pretty well in this tool.  You have what would normally be considered aggregate functions that make life easier.  As you can see below you can use the contains syntax that acts similar to a SQL like comparison.  There are also date range functions like the ago function used below.  I found that these two features can find most things you are looking for.
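As a rough example (the name filter is only a placeholder), a query combining contains and ago might look like this:

    customEvents
    | where timestamp >= ago(1h) and name contains "POC"
    | count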


Summary

This post didn't cover a lot of the native functionality in Application Insights, but hopefully it gives you a starting point to instrument your Azure Functions.  The flexibility of this tool, along with the probability of it being built into Functions in the future, makes it a very attractive option.  Spend some time experimenting with it and I think you will find it pays dividends.

Implementing Logging In Azure Functions Feb 13


Logging is essential to the support of any piece of code.  In this post I will cover two approaches to logging in Azure Functions: TraceWriter and log4net.

TraceWriter

The TraceWriter that is available out of the box with Azure Functions is a good starting point.  Unfortunately it is short-lived: a maximum of 1000 messages are kept, and at most they are held in file form for two days.  That being said, I would not skip using the TraceWriter.

Your function will have a TraceWriter object passed to it in the parameters of the Run method.  You can use the Debug, Error, Fatal, Info and Warn methods to write different types of messages to the log as shown below.

log.Info($"Queue item received: {myQueueItem}");

Once it is in the log you need to be able to find the messages.  The easiest way to find the log files is through Kudu.  You have to drill down from LogFiles –> Application –> Functions –> Function –> <your_function_name>.  At this location you will find a series of .log files if your function has been triggered recently.


The other way to look at your logs is through Table Storage via the Microsoft Azure Storage Explorer.  After attaching to your account open the storage account associated with your Function App.  Depending on how you organized your resource groups you can find the storage account by looking at the list of resources in the group that the function belongs to.

Once you drill down to that account look for the tables named AzureWebJobsHostLogsyyyymm.


Opening these tables will allow you to see the different types of log entries saved by the TraceWriter.  If you filter to the partition key “I” you will see the entries your functions posted.  You can further filter by name and date range to identify specific log entries.


log4net

If the default TraceWriter isn’t robust enough you can implement logging via a framework like log4net.  Unfortunately because of the architecture of Azure Functions this isn’t as easy as it would be with a normal desktop or web application.  The main stumbling block is the lack of ability to create custom configuration sections which these libraries rely on.  In this section I’ll outline a process for getting log4net to work inside your function.

The first thing that we need is the log4net library.  Add the log4net NuGet package by placing the following code in the project.json file.

{
  "frameworks": {
    "net46":{
      "dependencies": {
        "log4net": "2.0.5"
      }
    }
   }
}

To get around the lack of custom configuration sections we will bind a blob file containing your log4net configuration.  Simply take the log4net section of your configuration and save it to a text file.  Upload that to a storage container and bind it to your function using the full storage path.
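As a rough sketch of what that blob's content could look like, here is a minimal log4net section using a simple TraceAppender in place of the SQL Server appender mentioned below; the logger name matches the one retrieved in the code.

    <log4net>
      <appender name="TraceAppender" type="log4net.Appender.TraceAppender">
        <layout type="log4net.Layout.PatternLayout">
          <conversionPattern value="%date [%thread] %-5level %logger - %message%newline" />
        </layout>
      </appender>
      <logger name="AzureLogger">
        <level value="DEBUG" />
        <appender-ref ref="TraceAppender" />
      </logger>
    </log4net>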


Add the references to the log4net library and configure the logger.  Once you have that simply call the appropriate method on the logger and off you go.  A basic sample of the code for configuring and using the logger is listed below.  In this case I am actually using a SQL Server appender.

using System;
using System.Xml;
using log4net;

public static void Run(string input, TraceWriter log, string inputBlob)
{
    log.Info($"Log4NetPoc manually triggered function called with input: {input}");
    log.Info($"{inputBlob}");

    XmlDocument doc = new XmlDocument();
    doc.LoadXml(inputBlob);
    XmlElement element = doc.DocumentElement;

    log4net.Config.XmlConfigurator.Configure(element);

    ILog logger = LogManager.GetLogger("AzureLogger");

    logger.Debug($"Test log message from Azure Function", new Exception("This is a dummy exception"));
   
}

Summary

By no means does this post cover every aspect of these two logging approaches or all possible logging approaches for Azure Functions.  In future posts I will also cover Application Insights.  In any case it is always important to have logging for your application.  Find the tool that works for your team and implement it.

Building Azure Functions: Part 3 – Coding Concerns Feb 02


In this third part of my series on Azure Function development I will cover a number of development concepts and concerns.  These are just some of the basics.  You can look for more posts coming in the future that will cover specific topics in more detail.

General Development

One of the first things you will have to get used to is developing in a very stateless manner.  Any other .NET application type has a class at its base.  Functions, on the other hand, are just what they say, a method that runs within its own context.  Because of this you don’t have anything resembling a global or class level variable.  This means that if you need something like a logger in every method you have to pass it in.

[Update 2016-02-13] The above information is not completely correct.  You can implement function global variables by defining them as private static.
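For example, a trivial sketch of that approach in Run.csx (the counter is just for illustration):

    // Declared outside the Run method; shared across executions on the same host instance
    private static int executionCount = 0;

    public static void Run(string input, TraceWriter log)
    {
        executionCount++;
        log.Info($"Execution number {executionCount} with input: {input}");
    }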

You may find that it makes sense to create classes within your function, either as DTOs or to make the code more manageable.  Start by adding a .csx file in the files view pane of your function.  The same coding techniques and standards apply as in your Run.csx file; otherwise develop the class as you would any other .NET class.


In the previous post I showed how to create App Settings.  If you took the time to create them you are going to want to be able to retrieve them.  The GetEnvironmentVariable method of the Environment class gives you the same capability as using AppSettings from ConfigurationManager in traditional .NET applications.

System.Environment.GetEnvironmentVariable("YourSettingKey")

A critical coding practice for functions that use perishable resources such as queues is to make sure that if you catch and log an exception you rethrow it so that your function fails.  This will cause the queue message to remain on the queue instead of being dequeued.
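A minimal sketch of that pattern, assuming a queue-triggered function whose message arrives as a string:

    public static void Run(string myQueueItem, TraceWriter log)
    {
        try
        {
            // process the queue message here
        }
        catch (Exception ex)
        {
            log.Error($"Processing failed for message: {myQueueItem}", ex);
            throw; // rethrow so the function fails and the message remains on the queue for retry
        }
    }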

Debugging


It can be hard to read the log when the function is running at full speed, since instances run in parallel but report to the same log.  I would suggest that you add the process ID to your TraceWriter logging messages so that you can correlate them.
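For example, something as simple as the following (my own suggestion, not a template feature) makes it easier to untangle parallel instances:

    log.Info($"[PID {System.Diagnostics.Process.GetCurrentProcess().Id}] Queue item received: {myQueueItem}");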

Even more powerful is the ability to remote debug functions from Visual Studio.  To do this open your Server Explorer and connect to your Azure subscription.  From there you can drill down to the Function App in App Services and then to the run.csx file in the individual function.  Once you have opened the code file and placed your breakpoints, right-click the function and select Attach Debugger.  From there it acts like any other Visual Studio debugging session.


Race Conditions

I wanted to place special attention on this subject.  As with any highly parallel/asynchronous processing environment, you will have to make sure that you take into account any race conditions that may occur.  If at all possible keep the functionality that you create to non-related pieces of data.  If it is critical that items in a queue, blob container or table storage are processed in order, then Azure Functions are probably not the right tool for your solution.

Summary

Azure Functions are one of the most powerful units of code available.  Hopefully this series gives you a starting point for your adventure into serverless applications and you can discover how they can benefit your business.

Building Azure Functions: Part 2–Settings And References Feb 01


This is the second post in a series on building Azure Functions.  In this post I’ll continue by describing how to add settings to your function and reference different assemblies to give you more capabilities.

Settings


Functions do not have configuration files, so you must add app settings and connection strings through the settings page.  The settings are maintained at the Function App level, not for individual functions.  While this allows you to share common configuration values, it means that if your custom assemblies need different configuration values per function, each function will have to live in a separate Function App.

To get to them go to the Function App Settings link at the lower left of your Function App's main page and then click the Configure App Settings button to open the app settings blade.  At that point it works the same as any .NET configuration file.


At some point I would like to see the capability to import and export settings, since maintaining them individually by hand leads to human error and less reliable application lifecycle management.

Another drawback to the Azure Functions development environment is that at the time of this post you don’t have the ability to leverage custom configuration sections.  The main place I have found this to cause heartburn is using logging libraries such as log4net where the most common scenario is to use a custom configuration section to define adapters and loggers.

Referencing Assemblies And NuGet

No .NET application is very useful if you can't reference the rest of the .NET Framework as well as third-party and your own custom assemblies.  There is no Add References menu for Azure Functions, but there are multiple ways to add references.  Let's take a look at each.

There are a number of .NET assemblies that are automatically referenced for your Function application.  There is a second group of assemblies that are available but need to be specifically referenced.  For a partial list consult the Azure Functions documentation here.  You can also load your own custom assemblies or bring in NuGet packages.

In order to load NuGet packages you need to create a project.json file.  Do this by clicking the View Files link in the upper right corner of the editor blade and then the Add link below the file list pane.

project.json files require the same information that is contained in a packages.config file, but it is formatted as JSON as shown in the example below.  Once you save this file and reference the assembly in your Run.csx file, Azure will load the designated packages.
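For example, a project.json that pulls in a couple of NuGet packages (the package names and versions here are only illustrative) looks like this:

    {
      "frameworks": {
        "net46": {
          "dependencies": {
            "Newtonsoft.Json": "9.0.1",
            "Microsoft.WindowsAzure.Storage": "7.2.1"
          }
        }
      }
    }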


If you have custom libraries that you want to leverage you will need to add a bin folder to your function.  The easiest way I have found to do this is to open the App Service Editor from the Function App Settings page.  This will open up what is essentially Visual Studio Code in a browser.  Navigate the file tree to your function under wwwroot.  Right click your function name and select New Folder.  The folder must be named “bin”.  You can then right click the bin folder and upload your custom assemblies.

Once you have an assembly available you need to reference it using the “#r” directive as shown below.  You will notice that native assemblies and NuGet package loaded libraries do not need the .dll extension specified, but it must be added for custom assemblies.

#r "System.Xml"
#r "System.Xml.Linq"
#r "System.Data.Entity"
#r "My.Custom.Data.dll"
#r "My.Custom.Domain.dll"
#r "Newtonsoft.Json"
#r "Microsoft.Azure.Documents.Client"
#r "Microsoft.WindowsAzure.Storage"

Now we are ready to declare our normal using statements and get down to the real business of functions.

Summary

After this post we have our trigger, bindings, settings and dependent assemblies.  This still isn’t enough for a useful function.  In the next post I will cover coding and debugging concerns to complete the story.

Building Azure Functions: Part 1–Creating and Binding Jan 31


The latest buzzword is serverless applications.  Azure Functions are Microsoft's offering in this space.  As with most products that are new to the cloud, Azure Functions are still evolving and can therefore be challenging to develop with.  Documentation is still being worked on at the time I am writing this, so here are some things that I have learned while implementing them.

There is a lot to cover here so I am going to break this topic into a few posts:

  1. Creating and Binding
  2. Settings and References
  3. Coding Concerns

Creating A New Function

The first thing you are going to need to do is create a Function App.  This is an App Services product that serves as a container for your individual functions.  The easiest way I've found to start is to go to the main add (+) button on the Azure Portal and then search for Function App.


Click on Function App and then the Create button when the Function App blade comes up.  Fill in your app name, remembering that this is a container and not your actual function.  As with other Azure features you need to supply a subscription, resource group and location.  Additionally, for a Function App you need to supply a hosting plan and storage account.  If you want to take full benefit of Function App scaling and pricing, leave the default Consumption Plan.  This way you only pay for what you use.  If you choose an App Service Plan you will pay for it whether your function is actually processing or not.


Once you click Create the Function App will start to deploy.  At this point you can start to create your first function in the Function App.  Once you find your Function App in the list of App Services, opening it brings up a quick start page, but I quickly found that it didn't give me the options I needed beyond a simple “Hello World” function.  Instead press the New Function link at the left.  You will be offered a list of trigger-based templates, which I will cover in the next section.


Triggers


Triggers define the event source that will cause your function to be executed.  While there are many different triggers and there are more being added every day, the most common ones are included under the core scenarios.  In my experience the most useful are timer, queue, and blob triggered functions.

Queues and blobs require a connection to a storage account be defined.  Fortunately this is created with a couple of clicks and can be shared between triggers and bindings as well as between functions.  Once you have that you simply enter the name of the queue or blob container and you are off to the races.

When it comes to timer-dependent functions, the main topic you will have to become familiar with is cron scheduling definitions.  If you come from a Unix background or have been working with more recent timer-based WebJobs this won't be anything new.  Otherwise, the simplest way to remember is that each time increment is defined by a division statement, as in the examples below.
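A few illustrative six-field cron expressions ({second} {minute} {hour} {day} {month} {day-of-week}):

    0 */5 * * * *      every five minutes
    0 0 * * * *        at the top of every hour
    0 30 9 * * 1-5     at 9:30 AM every weekday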


In the case of queue triggers the parameter that is automatically added to the Run method signature will be the contents of the queue message as a string.  Similarly most trigger types have a parameter that passes values from the triggering event.

Input and Output Bindings


Some of the function templates include an output binding.  If none of these fit your needs or you just prefer to have full control you can add a binding via the Integration tab.  The input and output binding definitions end up in the same function.json file as the trigger bindings. 
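As a rough illustration (the names, queue and path below are placeholders), a function.json with a queue trigger and a blob input binding might look something like this:

    {
      "bindings": [
        {
          "type": "queueTrigger",
          "direction": "in",
          "name": "myQueueItem",
          "queueName": "incoming-items",
          "connection": "AzureWebJobsStorage"
        },
        {
          "type": "blob",
          "direction": "in",
          "name": "configBlob",
          "path": "config/settings.xml",
          "connection": "AzureWebJobsStorage"
        }
      ],
      "disabled": false
    }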

The one gripe I have with these bindings is that they connect to a specific entity at the beginning of your function.  I would find it preferable to bind to the parent container of whatever source you are binding to and have a set of standard commands available for normal CRUD operations.

Let's say that you want to load an external configuration file from blob storage when your function starts.  The path you specify includes the container and the blob name.  The default format shows a variable “name” as the blob name.  This needs to be a variable that is available and populated when the function starts or an exception will be thrown.  As for your storage account, specify it by clicking the “new” link next to the dropdown and picking the storage account from those that you have available.  If you specified a storage account while defining your trigger and it is the same as your binding, it can be reused.


The convenient thing about blob bindings is that they are bound as strings, so for most scenarios you don't have to do anything else to leverage them in your function.  You will have to add a string parameter to the function's Run method that matches the name in the blob parameter name text box.
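Continuing that illustration, a sketch of the Run method signature with the blob bound as a string parameter named to match the binding:

    public static void Run(string myQueueItem, string configBlob, TraceWriter log)
    {
        log.Info($"Triggered by: {myQueueItem}");
        log.Info($"Configuration blob contents: {configBlob}");
    }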

Summary

That should give you a starting point for getting the shell of your Azure Function created.  In the next two posts I will add settings, assembly references and some tips for coding your function.

Sketchnotes: Microsoft Windows 10 Creator Update Event Oct 26

On October 26, 2016 Microsoft had an event to show off the future of Windows 10 and some new hardware.  I captured the announcements from that event in a set of sketchnotes.


Cloud Battles: Azure vs AWS–The Video Jun 29

Earlier this month Norm Murrin and I gave a talk at the Chicago Coder Conference.  We learned a lot about how the offerings of each company compare during our preparation.  In the end we came to the conclusion that there is no clear winner except those of us who are leveraging the resources.  Check out the video posted by the conference to get the blow-by-blow details.

Application Integration: Azure Functions Vs WebJobs Jun 02


[Updated]

UI development gets all the attention, but application integration is where the real work is done.  When it comes to application integration in the Azure ecosystem, you had better learn how Functions and WebJobs are developed and under what conditions you should use each.  In this post I will try to answer those questions.

For me it is important that a solution is reasonably maintainable, deployable through environments and easily managed under source control.

Both products are built on the same code base and share the same base API.  From that perspective they are closely matched.  Functions do have the advantage of handling web hooks as opposed to simply timer and storage events with WebJobs.

There is another difference that I haven't been able to prove yet, but I've seen mentioned in a couple of places.  It seems that Functions may take time to warm up since they aren't always instantiated.  Since WebJobs are always running they would not incur this startup cost.  If immediate processing is important then WebJobs may be the more appropriate option for you.

When it comes to actual development I prefer to have the resources of Visual Studio to write and manage source code as well as package my deliverables for deployment.  As of this writing I have not been able to find a Visual Studio project type for Functions.  This means you edit the code through a web browser.  This in-portal editor does allow you to integrate with Git or VSTS for source control.  I would expect that at some point in the future we will get a Functions project type.

Both WebJobs and Functions can be written using C#/VB.NET and Node.js.  From the language availability perspective they are even.

Summary

So what is the real separating line between using one or the other?  From what I have experienced so far, if you need the web hooks then Functions are the right choice.  If you don't need the web hooks and maintainability is your priority then WebJobs are the way to go.  I'm sure there are more reasons, but these are the most obvious in the early days of Functions.  As the products evolve I'll post updates.

[Update]

Christopher Anderson (@crandycodes) from the Azure team replied via Twitter with the following:

You hit on some key points like lack of tooling/VS integration. We plan on addressing those before GA.
I think the major point missing is the dynamic scale functionality, pay per use. Functions scale automatically and don't cost a VM.
Also, if you run Functions in dedicated with always on, there is no cold start issues, but you pay per VM at that point.
WebJobs vs Functions is really: "Do I want to manage my own custom service?" Yes: WebJobs, No: Functions. Otherwise, similar power.

A TFS Developer In A GitHub World May 04

Git and GitHub have been around for a few years now.  They are becoming more popular by the day.  I finally got around to looking at them more closely over the last few months and decided to summarize the experiences.

My first experience with GitHub was not the most pleasant.  I was using Visual Studio 2013 which doesn’t seem to have the best integration story (or at least didn’t when I tried it).  The fact that it required that an existing repository be cloned via the GitHub desktop before Visual Studio knew anything about it was the biggest pain.

Visual Studio 2015 on the other hand has a much better usage story.  You are able to log into your repository and get a clone of the repository without breaking out to another tool.  The commit process is pretty similar to that of TFS from that point on.

From my trials I have found that GitHub works well as a source control repository.  It is hard at first getting used to the non-Microsoft verbs that are used.  Retraining yourself that you have to do a commit and a push before something is actually checked in, instead of just doing a check-in, takes some time.

As for working as a team, I think that TFS still has the better features.  This may just be because GitHub isn't as well integrated with Visual Studio.  Having customizable work items in TFS comes in very handy, especially on larger enterprise projects.

The wiki gives a good place to put documentation, but it doesn’t give you a place to manage Word, Excel, PowerPoint, PDF and Visio documents that might have important information about your project. This is where TFS and SharePoint really shine.

Another drawback I see compared to TFS is that GitHub repositories are public and can't be private unless you have a paid account.  I can create a free TFS Online account that gives me private repositories and access for up to five users.  This makes it better for the individual developer or small team.

Of course there is a third option: you can use Git in TFS.  This gives you the source control of Git with the project management features of TFS.  It took a little bit to get my existing code into the new Git-TFS project repository.  Then came the realization that the only source control viewer for Git repositories is in the portal.  The growing pains continue.

Summary

I am sure that the story around GitHub will improve over time, but right now it just seems like people are using it because it is what the cool kids are doing or because they are working on open source projects.  If I have to advise a client I am going to suggest they go with the product that has the most complete story and the best integration with their current toolset.  For now that is TFS, especially if you are a Microsoft development shop.

As for my GitHub experiment, it goes on, but I deleted the repository I had created for security reasons.  Stay tuned and see what else develops.  The next step is probably Git in TFS.

Increase Cloud Application Responsiveness With Azure Queues and WebJobs May 03

This post is based on the presentation I gave at Cloud Saturday Chicago.

In a mobile first world services need to be able to process high volumes of traffic as quickly as possible. With the certification and deployment process which native mobile apps have to go through being long and sometimes uncertain, we find ourselves looking for ways to make improvements without changing the device code. The flexibility of Azure and other cloud platforms gives developers the ability to easily implement changes to solutions using that can be deployed quickly with nearly infinite resources.

The solutions described here can also help the throughput of you web applications as easily as mobile apps. This article will demonstrate the capabilities of Azure WebJobs and Queues improve application responsiveness by deferring processing.

Let’s start with what a normal mobile app infrastructure design may look like. We have a mobile app which is sending data to a web service to be processed. This service may have a whole workflow of operations to perform on that data once it is received. Once the processing is complete the data is stored in an Azure SQL Database.  The problem that we are facing is that you don't want the mobile app to continue to wait while the processing occurs as this may cause timeouts or the delays in the UI while.

clip_image001

To reduce the delays we can make a couple of minor changes using features of Azure. We remove the main workload from the web service so that it simply puts the incoming message on an Azure queue. The work that used to be in the service is now moved to a continuous Azure WebJob along with code to read from the queue. This give the service the ability to return an acknowlegement message to the mobile app almost immediately. The WebJob can pull from the queue at its own speed and since we are in the cloud we can easily add new instances of the WebJob to scale out if needed.

clip_image002

What are the actual performance differences? That will depend greatly on how much work your service was doing to begin with. If it was only a simple table insert there may not be a significant improvement or possibly even a loss due to the serialization to the queue. If you have to reach out to several different resources or perform a strong of operations this will off load the real work.

The Code

The first thing that we need to do is add the code to insert into the queue from the service.

string connectionString = ConfigurationManager.ConnectionStrings["AzureWebJobsStorage"].ToString();

CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    ConfigurationManager.ConnectionStrings["AzureWebJobsStorage"].ToString());

CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();

CloudQueue queue = queueClient.GetQueueReference("yourrequestqueue");

queue.CreateIfNotExists();

var tempMessage = JsonConvert.SerializeObject(request);
CloudQueueMessage message = new CloudQueueMessage(tempMessage);


queue.AddMessage(message);

In order for this code to work we need to setup configuration connection strings to the storage account which will contain the queue.  In order to get the keys for the storage account open the Azure portal and go to [your storage account] –> Keys.

image

Below are the connection string entries that you should use.

<add name="AzureWebJobsDashboard" connectionString="DefaultEndpointsProtocol=https;AccountName=yourstore;AccountKey=your secondary access key" />
<add name="AzureWebJobsStorage" connectionString="DefaultEndpointsProtocol=https;AccountName=yourstore;AccountKey=your secondary access key" />

Now we add a new Azure WebJob project to our solution.  You will find the Azure WebJob template under the Cloud template section.

image

Once we have the project add the following code to a method in the Functions class specifying the name of the queue in the QueueTrigger parameter.  In this instance I have used the JavaScriptSerializer to deserialize the message from the queue.  You can then pass the object on to the processing methods as you originally had in the service.

public static void ProcessQueueMessage([QueueTrigger("your queue")] string message, TextWriter log)
{
    var serializer = new JavaScriptSerializer();
    YourType tempRequest = serializer.Deserialize<YourType>(message);
    // Do your work here
}

Once the storage key is added to the connection strings for the WebJob as we did for the service we will be ready to deploy and test.

Deploying the WebJob is fairly straight forward.  If you zip up the contents of your bin folder including the config file you can then upload it through the App Service –> [Your App Service] –> Settings –> WebJobs blade of the Azure portal.

image

Give the WebJob a name, leave the other settings and select the file for your zipped bin folder and you are all set to go.  I will cover debugging in a future post.

Summary

Creating WebJobs and manipulating Azure queues is not rocket surgery.  If you have kept your code as loosely coupled as possible then moving to a continuous WebJob solution will be fairly quick and painless.  With just a little effort your application will seem more responsive and will be able to handle much larger request loads.

IDEs For Developing Node.js In A Windows Environment Apr 11

Node.js has become one of the most predominant JavaScript based solution frameworks over the last couple of years.  As with most ideas that start in the open source community it has crept its way into the Microsoft based platforms.  For those of us who make our living in the Windows ecosystem it is beneficial to know what tools are available to add Node.js components to our solution landscape.

This post will briefly introduce four of the development environments to build Node.js projects on the Windows platform.  These include DOS command line, VS Code, WebMatrix and Node.js tools for Visual Studio.

Command Line

image

As with most development, the solution with the fewest bells and whistles has the most power.  You can simply download the Node.js base from nodejs.org.  This will supply you with all the basic components including a Node.js command window.  At that point you can use npm to pull down what ever packages you wish to leverage and use your favorite text editor to develop solutions.

VS Code

One step up from command line is Visual Studio Code.  This is a tool that Microsoft built in response to the portion of the development community who prefer a minimalist solution where you pick every feature you want to include in your code explicitly.  Since it is a very file based/folder based tool there is no native support for managing associated resources such as modules that you want to include in your Node.js package.  In many cases you still need to leverage the command line in order to perform operations such as adding modules.

image

You can open an existing Node.js folder and effectively maintain your application.  There are several VS Code Extensions available to help with the out of the box deficiencies and more are being added all the time.

WebMatrix

With WebMatrix you add the ability to start from a project template. You still won’t have the capability out of the box to add new Node packages.  In theory this should be overcome by simply installing the NPM Gallery extension, but when I tried it there were bugs that prevented retrieving any new packages.

image

On another cautionary note, I have found that WebMatrix can have unpredictable problems where sometimes Node.js code that will work in other tools will have random results.  I’m not sure if this is a result of my environment or the WebMatrix hosting model, but buyer beware.

Node.js Tools for Visual Studio

Visual Studio is the envy of all other development platforms so it is to be expected that it is the best development experience for the Windows platform.  You have all your favorite productivity features.  If you are normally a .NET developer and need to do Node.js development this is your choice, hands down.  It gives you intellisense, debugging and profiling on top of NPM integration.  What more could you ask for?

image

Summary

In the end everyone will have their favorite toolset based on how they work and what type of projects they are developing.  There isn’t a right or wrong choice, just your personal preferences.

BUILD 2016 Thoughts: Home Sweet Windows Apr 04

It is that time of year again where Microsoft developers gather in San Francisco to hear the direction that Microsoft is moving and the tools they are offering to get us there.  The big theme this year has been “Windows is home for developers”.  So what did Microsoft have in store for us?

How Far Have We Come

There were a lot of stats that showed up during the keynotes.  A couple that stood out for me were that there are currently over 270 million users on Windows 10 and that Cortana is answering over 1 million voice questions a day.  First, this makes Windows 10 the fastest adopted version of Windows in history which say to me that Microsoft is starting to get a little of its appeal back.  Second, it tells me that even if Windows Phone didn’t take of the way it should have it still brought about assets that for Microsoft that they are leveraging to the benefit of users.

Windows

Now I didn’t hear anyone at BUILD say why, but the next update to Windows 10 will be called the Anniversary Update (I guess 31 years is a nice round number).  Whatever the reason there were a number of new features that will be coming with the release that are worth looking at.  The three big features included security, ink and Linux (believe it or not). 

The one security feature that stood out is biometric logins from Edge.  This allows you to use something like a fingerprint reader to log into a web site.  My understanding is that this is an extension of the Windows Hello interface.

The second big feature is the ink enhancements.  The statement was that since most people still use a pen and paper so often that it should be front and center in your experience. This includes context sensitive handwriting recognition that can realize that if you write a word that implies a date it will be a link that can open Cortana and set a reminder.  This along with new highlighting feature, a new sketch pad and rulers that make it easier to draw lines and shapes the ability to use a pen on your computer becomes infinitely more powerful.

For the third announcement Microsoft introduced a Bash shell for Windows running Ubuntu Linux.  I’m still not sure where this is useful, but the fact that the two can live together and interact bears another look.

Xamarin

Here is where wishes come true.  It was rumored before last year’s BUILD that we would hear that Microsoft would buy Xamarin.  That announcement came earlier this year instead.  This week we really got our wish when Scott Gu announce on stage that Xamarin will come free to every Visual Studio user.  Not only that but we will also get access to the other Xamarin goodies like the iOS emulator, test recorder, and test cloud. 

BUT WAIT!  There’s more!  The Gu also announced that Xamarin will be open sourced through the Open Source Foundation.  Merry Christmas!

HoloLens

While it didn’t feature as prominently as it did last year, HoloLens still got some stage time.  It could be that your next classroom experience is with HoloLens if you happen to be a medical student at Case Western Reserve University.  Or you may find HoloLens at your next visit to the Kennedy Space Center where you can explore Mars with Buzz Aldrin.

Azure

Azure continues to be a home run and savior for Microsoft.  As PC sales decline and the types of devices we use diversify the cloud has become the backing service for everything.  You can see then why Azure is now in what will be mile long data centers and why almost the entire second day keynote of BUILD was dedicated to Azure.

Xbox And Gaming

Developers are finally getting the opportunities on Xbox One that they have been asking about since it was announced.  The combination of Universal Windows Platform (UWP) and DirectX 12 not only means that more games will be shipped for both Xbox One and PC, but developers will be able to able to write Xbox One apps and games more easily.  They will be able to test them by using the new developer mode that will be available on the Xbox One.  They will then be able to release it to a unified store.  What more can you ask for.

Cortana

As we have seen over the last year, Cortana is everywhere.  Since every device, regardless of platform, can run Cortana it has the ability to help you in ways that other personal assistant AIs can’t.  They also showed off how Cortana will be integrated with Skype and Bots to give you access to entertainment, products and services that relate to your chat conversations.

Cognitive Services and Bots

I can’t say that the keynotes really made it clear what cognitive services and bots are.  We say where they play but there was not explicit definition of what they are.  From what I saw the cognitive services are pre-packaged services built on top of machine learning.  They are handy tools that you can incorporate into your applications to make them more human friendly like the ability to detect emotions from a person’s expressions.

Bots on the other hand are built on a new framework that is available to devs which integrates services with natural language syntax understanding.  It seems to be the next step in allowing developers to create components that will plug in as part of a greater AI type experience for users.

Summary

That is a lot to take in and a ton of new things for devs to learn.  Microsoft is making it easier for developers of all backgrounds to use Windows as their main development environment and a jumping off point to develop solutions that will live on other platforms.  At the same time they are building cloud and device features that have the potential to move computing light years forward.  Even though this year wasn’t about the hardware it is still an exciting time to be developing on the Microsoft platform.  Check out the content from BUILD on Channel9 and find out which new features you can apply to your solutions.

Do You Know How Much Your Azure VM Costs? Feb 09

You go through the Azure price calculator and figure out everything you think are going to need for your Azure VM and how much each resource will cost you on a monthly basis.  You then pick a template from the marketplace and implement it assuming you know what services are being stood up.  Do you really know?

What you may not realize is that if you didn’t go through the pricing calculator thuroughly you may be in for some surprises on your bill.  Not only do you get charged for the hardware that you have selected, but also the image that you choose to run on it. 

The cost per minute of each image is not posted on the image itself.  This is made up of each component you are using and can be figured by reviewing adding up all the costs listed here.  If you don’t know to look there then you won’t know that there are additional costs until you get the invoice at the end of your billing period, and these can really add up.

As I mentioned above this surprise can also be avoided by knowing server type you should select in the pricing calculator.  You may assume you are just getting a Windows server and then adding SQL Server too it, but if you pick SQL Server as the Type for you vitual machine you will see that the real cost almost doubles for you VM.  This is the server type that is actually used for the SQL Server templates.

In some cases you will find that licensing the software in the template costs much less than running the image.  Since you are renting SQL Server instead of buying it you are paying for it each month. 

If a SQL Server standard license cost you $600 per processor and you are running a 2 processor server you can put SQL server on it yourself for $1200.  The same VM loaded with SQL Server standard will cost you ~$300 a month.  At that rate you would pay for the license in 4 months.  You need to take this into account when making your decisions.

Of course if you are only running the server as a testing server that is shut down most of the time then your cost for a year may actually come in under the cost of a license.

The point is be educated about what your Azure VM is really going to cost you and what your return on investment is.