Tim Murphy's .NET Software Architecture Blog

Starting An Umbraco Project Nov 10

As I have been documenting Umbraco development I realized that people need a starting point.  This post will cover how to start an Umbraco project using an approach suitable for ALM development processes.

The criteria I feel make a solution maintainable include a customizable development project that can easily be kept in source control, along with a robust, replicable database.  Of course this has to fall within the options available with Umbraco.  For me this means an ASP.NET web application and a SQL Server database.  Let’s take a look at the steps required to get started with this architecture.

Create The Database

I prefer a standard SQL Server database instance over SQL Server Express due to its manageability.  For each Umbraco instance we need to create an empty database, a SQL Server login, and a user with permissions to alter the database structure.  You will need the login credentials when you first start your site.

Create The Solution

This is the easiest part of an Umbraco project.  The base of each Umbraco solution I create starts with an empty ASP.NET Web Application.  Once that is created, open the NuGet package manager and install the UmbracoCms package.  After that it is simply a matter of building and executing the application.

Finish Installation

As the ASP.NET application starts it will present the installation settings.  The first prompt is to create your admin credentials.  Fill these fields in but don’t press any buttons yet.

The key is to be sure to click the Customize button before the Install button, because the installer doesn’t ask whether you want to use an existing database before running; it will simply create a SQL Server Express instance on its own.  Pressing the Customize button shows the database configuration screen.  Fill in your SQL Server connection information and click Continue.

Conclusion

Once you start the install sit back and relax.  In a few minutes you will have an environment that is ready for your Umbraco development.  This will be the starting point for other future posts.  Stay tuned.

Relating Umbraco Content With the Content Picker Nov 08

After addressing Umbraco team development in my previous post I want to explore maintaining relationships between pieces of content in Umbraco and accessing them programmatically here.

For those of us who have a natural tendency to think in terms of data entities and their relationships, working within a CMS hierarchy can be challenging.  Add to that the fact that users don’t only want to query within that hierarchy and things get even more challenging.  Fortunately, as we will see here, by adding the Content Picker to your document type definition and a little bit of LINQ to your template you can deliver on all of these scenarios.

Content Picker

Adding the Content Picker to your document type definition is the easiest part of the process, but make sure that you use the new version and not the one that is marked as obsolete.  You will then be presented with a content tree that allows you to navigate to and select any node in your site.

Querying Associated Content

The field in your content will return the ID of the content instance you associated using the Content Picker.  Unfortunately it actually returns it as an HtmlString, so you need to call the ToString method before using it or you will get unexpected results and compile errors.

In the example below I am looking for the single piece of content selected in the Content Picker.  The LINQ query shows a more complicated approach, but it also gives you an idea of how you could get a list of all nodes of a certain content type and use a lambda expression to filter it.  It requires that you first back up to the ancestors of the content you are displaying and find the root.

The easier way is to use the Content or TypedContent methods of the UmbracoHelper.  In future posts I will show alternate methods for finding the root node as well.

    var contentId = Umbraco.Field("contentField");

    var associatedContent = Model.Content.Ancestors()
        .FirstOrDefault()
        .Children<ContentModels.MyContentType>()
        .FirstOrDefault(x => x.Id == int.Parse(contentId.ToString()));
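The easier approach mentioned above, reusing the contentId value from the previous snippet, looks something like the following sketch.  The UmbracoHelper’s TypedContent method resolves the node directly and returns an IPublishedContent, so there is no need to walk the content tree first:

    var pickedContent = Umbraco.TypedContent(int.Parse(contentId.ToString()));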

Conclusion

While the Umbraco team needs to create some better documentation for this feature it is extremely useful for building and using relationships between content in your Umbraco site.

How Did I Become An IT Consultant Curmudgeon? Nov 02

I have been accused of being a curmudgeon by more than one co-worker.  The short, pithy answer to the question of how I got to this point would be “experience” or “it comes with age”.  But what is the real reason and does it have any benefit?

Firstly, I was raised in an Irish-German family which by default makes me surly and sarcastic.  At almost half a century of this habit I don’t see any change coming there.  I also find that most developers have similar traits along with a dry sense of humor.

The main thing that gained me the label of curmudgeon is the knack for identifying issues that could adversely affect a project.  Things like scope creep and knowledge voids that could push a project beyond its budget and deadlines.  This tendency has come from 20 years of being a technical lead and architect.  It is an attribute that has served me well.

The place where this becomes a thorn in some team members’ sides is that over the years I think I have become more blunt with my assessments.  I am still capable of tact, but probably need to employ it a little more liberally.

Ultimately, I think being a self-aware curmudgeon is a good thing.  As long as we continue to learn and strive to work with people, a little surliness is just the spice of life.

Umbraco Team Development Oct 27

The Umbraco CMS platform gives you the ability to create a content managed site with the familiar development process of ASP.NET MVC.  If you are the only developer things don’t get too complicated, but the moment you are sharing your solution with a team you gain a few wrinkles that have to be addressed.

Syncing Content and Document Types

Umbraco saves its content partially to the file system and partially to the database.  This complicates sharing document types, templates and content between developers.  While Courier allows you to sync these elements between your local machine and your stage and production servers, it doesn’t do well between two localhost instances.

We addressed this problem using the uSync package which is available in the Umbraco package page under the developer section of your site.  It allows you to automatically record changes every time you make a save in the back office.  They are saved to files that can then be transferred to another system and will automatically be imported when the application pool restarts.

Other Special Cases

Another area that you will have to address is the content indexing files and the location of the auto-generated classes.  Files like “all.dll.path” will need to be ignored in your source control since they contain a fully qualified directory location, which will cause problems as you pull source code to multiple machines.

Of course you will also need to manage your web.config files as you would with any ASP.NET based solution to make sure that you don’t step on each developer’s local settings.

Summary

If you follow these couple of guidelines you will overcome some of the more annoying aspects of developing an Umbraco solution.  Ultimately it will make the life of your developers much easier.

Azure Functions Visual Studio 2017 Development Aug 10

The development tools and processes for Azure Functions are ever changing.  We started out only being able to create a function through the portal, which I did a series on.  We then got a template in VS2015, but it really didn’t work very well.  We have since been able to create functions as Web Application libraries, and now we are close to the release of a VS2017 template.

This post will walk through the basics of using the VS2017 Preview with the Visual Studio Tools For Azure Functions which you can download here.

Create New Project

To create the initial solution open up the New Project dialog and find the Azure Function project type, name your project and click OK.

Create New Function

To add a function to your project, right-click the project and select New Item.  In the New Item dialog select Azure Function and provide a name for the class and click Add. 

The next dialog to appear is the New Azure Function dialog.  Here you will select the function trigger type and its parameters.  In the example below a timer trigger has been selected and a CRON schedule definition is automatically defined to execute every 5 minutes.

Also in this dialog you can set the name of the function.  When you compile, a folder will be created with that name in your bin directory which will be used later for deployment.
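As a rough sketch (the namespace, class and function names here are illustrative), the generated code looks something like the following, with the function name carried in the FunctionName attribute and the CRON schedule on the timer trigger:

    using System;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Host;

    namespace FunctionPoc
    {
        public static class TimerPoc
        {
            [FunctionName("TimerPoc")]
            public static void Run([TimerTrigger("0 */5 * * * *")]TimerInfo myTimer, TraceWriter log)
            {
                // Runs every 5 minutes per the CRON expression above
                log.Info($"Timer function executed at: {DateTime.Now}");
            }
        }
    }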

Add Bindings

With each generation of Azure Function development the way you initially define bindings changes (even if they stay the same behind the scenes).  Initially you had to use the portal Integrate page.  This had its advantages.  It would visually prompt you for the type of binding and the parameters for that binding.

With the Visual Studio template you have to add attributes to the Run method of your function class.  This requires that you know what the attribute names are and what parameters are available and their proper values.  You can find a list of the main binding attributes here.

At compile time the attributes will be used to generate a function.json file with your trigger and bindings definition.
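For example, a hedged sketch of adding a storage queue output binding as an attribute might look like the following (the queue name and the StorageConnection setting are assumptions for illustration).  At compile time both the timer trigger and the queue binding end up in the generated function.json:

    [FunctionName("TimerToQueuePoc")]
    public static void Run(
        [TimerTrigger("0 */5 * * * *")]TimerInfo myTimer,
        [Queue("poc-items", Connection = "StorageConnection")]out string outputItem,
        TraceWriter log)
    {
        // The out parameter is written to the bound queue when the function completes
        outputItem = $"Generated at {DateTime.UtcNow}";
        log.Info("Queued a new item");
    }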

Add NuGet Packages

If you are building functions in the portal you have to create a project.json file that defines the packages you want to include.  This requires that you know the format of the file.  Thankfully, with the Visual Studio template you can use the normal NuGet Package Manager.

Deploying

There are a couple of ways to deploy your solution.  In the end a Function App is a specialized App Service.  This means you have the same deployment options: Visual Studio, PowerShell or VSTS continuous deployment.  The main difference is that you don’t have a web.config file and have to manage your app settings and connection strings through the portal.  These can be reached by following the Application Settings link under the Configured Features section of the Function App Overview page.

Summary

While creating Azure Functions still isn’t a WYSIWYG, turn-key process, the latest incarnation gives us an ALM-capable solution.  I believe this is the development approach that will stabilize for the foreseeable future, and anyone who is creating Functions should invest in learning it.

Query Application Insights REST API To Create Custom Notifications Aug 04

Application Insights is one of those tools that has been around for a number of years now, but is finally getting understood as more companies move to Azure as a cloud solution.  It has become an amazing tool for monitoring the performance of your application, but it can also work as a general logging platform as I have posted before.

Now that you are capturing all this information how can you leverage it?  Going to the Azure portal whenever you want an answer is time consuming.  It would be great if you could automate this process.  Of course there are a number of metrics that you can create alerts for directly via the portal, but what if you want a non-standard metric or want to do something besides just send an alert?

Fortunately Microsoft has a REST API in beta for Application Insights.  It allows you to check standard metrics as well as run custom queries as you do in the Analytics portal.  Let’s explore how to use this API.

In this post I will show how to create a demo that implements an Azure Function which calls the Application Insights REST API and then sends the results out using SendGrid.  I created it with the VS2017 Preview and the new Azure Functions templates.

Generate Custom Events

First we need some data to work with.  The simplest way is to leverage the TrackEvent and TrackException methods of the Application Insights API.  In order to do this you first need to set up a TelemetryClient.  I have the code below as part of the class-level variables.

        private static string appInsightsKey = System.Environment.GetEnvironmentVariable("AppInsightKey", EnvironmentVariableTarget.Process);
        private static TelemetryClient telemetry = new TelemetryClient();
        private static string key = TelemetryConfiguration.Active.InstrumentationKey = appInsightsKey; //System.Environment.GetEnvironmentVariable("AN:InsightKey", EnvironmentVariableTarget.Process);

After that it is simple to call the TrackEvent method on the TelemetryClient object to log an activity in your code (be aware it may take 5 minutes for an event to show up in Application Insights).

            telemetry.TrackEvent($"This is a POC event");

Create a VS2017 Function Application

I will have another post on the details in the future, but if you have Visual Studio 2017 Preview 15.3.0 installed you will be able to create an Azure Functions project.

Right-click the project, select the New Item context menu option, and choose Azure Function.

On the New Azure Function dialog select TimerTrigger and leave the remaining options as default.

Call Application Insights REST API

Once there are events in the customEvents collection we can write a query and execute it against the Application Insights REST API.  To accomplish this the example uses a simple HttpClient call.  The API page for Application Insights can be found here and contains the URLs and formats for each call type.  We will be using the Query API scenario, which will be set up with a couple of variables.

        private const string URL = "https://api.applicationinsights.io/beta/apps/{0}/query?query={1}";
        private const string query = "customEvents | where timestamp >= ago(20m) and name contains \"This is a POC event\" | count";

The call to the service is a common pattern using the HttpClient as shown below.  Add this to the Run method of your new function.

            HttpClient client = new HttpClient();
            client.DefaultRequestHeaders.Accept.Add(
                new MediaTypeWithQualityHeaderValue("application/json"));
            client.DefaultRequestHeaders.Add("x-api-key", appInsightsApiKey);
            var req = string.Format(URL, appInsightsId, query);
            HttpResponseMessage response = client.GetAsync(req).Result;

Process Results

After we have a result we can deserialize the JSON using JSON.NET and send it to our support team via SendGrid.  You will have to add the NuGet package Microsoft.Azure.WebJobs.Extensions.SendGrid.

Modify the signature of your function’s Run method to match the code sample shown here.  In this example “message” is an output parameter for the Azure Function, bound to SendGrid by using the SendGrid attribute.

        public static void Run([TimerTrigger("0 */15 * * * *")]TimerInfo myTimer, TraceWriter log, [SendGrid(ApiKey = "SendGridApiKey")]out Mail message)

We will also need a structure to deserialize the returned JSON message into. If you look at the message itself it can appear rather daunting but it breaks down into the following class structure.  Create a new class file and replace the default class with this code.

    public class Column
    {
        public string ColumnName { get; set; }
        public string DataType { get; set; }
        public string ColumnType { get; set; }
    }

    public class Table
    {
        public string TableName { get; set; }
        public List<Column> Columns { get; set; }
        public List<List<object>> Rows { get; set; }
    }

    public class RootObject
    {
        public List<Table> Tables { get; set; }
    }

The last code example below performs the deserialization and creates the SendGrid email message.  Insert this into the Run method after the HttpClient call we previously added.

                string result = response.Content.ReadAsStringAsync().Result;
                log.Info(result);

                RootObject aiResult = JsonConvert.DeserializeObject<RootObject>(result);

                string countString = aiResult.Tables[0].Rows[0][0].ToString();

                string recipientEmail = System.Environment.GetEnvironmentVariable($"recipient", EnvironmentVariableTarget.Process);
                string senderEmail = System.Environment.GetEnvironmentVariable($"sender", EnvironmentVariableTarget.Process);

                var messageContent = new Content("text/html", $"There were {countString} POC records found");

                message = new Mail(new Email(senderEmail), "App Insights POC", new Email(recipientEmail), messageContent);

Publish your solution to an Azure Function App by downloading the Function App’s profile and using the VS2017 project’s publish options.  You will also need to define the application settings referred to in the code so that they are appropriate for your environment.  At that point you will be able to observe the results of your efforts.

Summary

This post demonstrates how a small amount of code can give you the ability to leverage Application Insights for more than just out-of-the-box statistics alerts.  This approach is flexible enough to be used for reporting on types of errors and monitoring whether subsystems remain available.  Combining the features within Azure’s cloud offerings gives you capabilities that would cost much more in development time and resources if they were done on premises.

My only real problem with this approach is that I would prefer to be accessing values in the result by name rather than indexes because this makes the code less readable and more brittle to changes.
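One hedged workaround, assuming the aggregate column comes back named “Count”, is to resolve the index from the Columns metadata in the classes defined above rather than hard-coding the position:

    // Look up the column position by name instead of assuming it is the first column
    Table resultTable = aiResult.Tables[0];
    int countIndex = resultTable.Columns.FindIndex(c => c.ColumnName == "Count");
    string countString = resultTable.Rows[0][countIndex].ToString();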

Try these examples out and see what other scenarios they apply to in your business.

Logging To Application Insights In Azure Functions Feb 16

In my last post I covered logging in Azure Functions using TraceWriter and log4net.  Both of these work, but Application Insights rolls all your monitoring into one solution, from metrics to tracking messages.  I have also heard a rumor that in the near future this will be an integrated part of Azure Functions.  Given these factors it seems wise to start giving it a closer look.

So how do you take advantage of it right now?  If you go to GitHub there is a sample written by Christopher Anderson, but let me boil this down.  First we need to create an Application Insights instance and grab the instrumentation key.

When I created my Application Insight instance I chose the General application type and the same resource group as my function app.

Once the instance has been allocated you will need to go into the properties blade.  There you will find a GUID for the Instrumentation Key.  Save this off so that we can use it later.

You then need to add the Microsoft.ApplicationInsights NuGet package by creating a project.json file in your function.  Insert the following code in the new file and save it.  If you have your log window open you will see the package being loaded.

{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Microsoft.ApplicationInsights": "2.1.0"
      }
    }
  }
}

In the sample code readme it says that you need to add a specific app setting; the important part is that your code reads from whatever setting you actually create.  Take the Instrumentation Key that you saved earlier and place it in the app settings.  In my case I used one called InsightKey.

Next, set up your TelemetryClient object like the code here by creating global static variables that can be used throughout your application.  After that we are ready to start tracking our function.

 private static TelemetryClient telemetry = new TelemetryClient();   
 private static string key = TelemetryConfiguration.Active.InstrumentationKey = System.Environment.GetEnvironmentVariable("InsightKey", EnvironmentVariableTarget.Process);  

To track an event or an exception simply call the appropriate method.  I prefer to encapsulate them in their own methods where I can standardize the usage.  I have added the function name, method name, and context ID from the function execution to make it easier to search and associate entries.

 private static void TrackEvent(string desc, string methodName)
 {
     telemetry.TrackEvent($"{FunctionName} - {methodName} - {contextId}: {desc}");
 }

 private static void TrackException(Exception ex, string desc, string methodName)
 {
     Dictionary<string, string> properties = new Dictionary<string, string>() { { "Function", FunctionName }, { "Method", methodName }, { "Description", desc }, { "ContextId", contextId } };
     telemetry.TrackException(ex, properties);
 }

Analytics

This isn’t an instant-answer type of event store.  At the very least there is a few minute delay between your application logging an event or exception and when it is visible in the Analytics board.

Once you are logging and sending metrics to Application Insights you need to read the results.  From your Application Insights main blade click on the Analytics button at the top of the overview.  It will open a new Analytics page.

Click the new tab button at the top next to the Home Page tab.  This will open a query window. The query language has a similar structure to SQL, but that is about as far as it goes.

The table objects are listed on the left navigation with the fields listed as you expand out each table.  Fortunately IntelliSense works pretty well in this tool.  You have what would normally be considered aggregate functions that make life easier.  You can use the contains syntax, which acts similarly to a SQL LIKE comparison, and there are also date range functions like the ago function.  I found that these two features can find most things you are looking for.
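For example, a query along the lines of the one used in the REST API post above combines both features (the event name is just the sample one from that post):

    customEvents
    | where timestamp >= ago(20m) and name contains "This is a POC event"
    | count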

Summary

This post didn’t cover a lot of the native functionality in Application Insights, but hopefully it gives you a starting point to instrument your Azure Functions.  The flexibility of this tool, along with the probability of it being built into Functions in the future, makes it a very attractive option.  Spend some time experimenting with it and I think you will find it pays dividends.

Implementing Logging In Azure Functions Feb 13

Logging is essential to the support of any piece of code.  In this post I will cover two approaches to logging in Azure Functions: TraceWriter and log4net.

TraceWriter

The TraceWriter that is available out of the box with Azure Functions is a good starting point.  Unfortunately it is short lived: at most 1000 messages are kept, and they are held in file form for only two days.  That being said, I would not skip using the TraceWriter.

Your function will have a TraceWriter object passed to it in the parameters of the Run method.  You can use the Debug, Error, Fatal, Info and Warn methods to write different types of messages to the log as shown below.

log.Info($"Queue item received: {myQueueItem}");

Once it is in the log you need to be able to find the messages.  The easiest way to find the log files is through Kudu.  You have to drill down from LogFiles –> Application –> Functions –> Function –> <your_function_name>.  At this location you will find a series of .log files if your function has been triggered recently.

The other way to look at your logs is through Table Storage via the Microsoft Azure Storage Explorer.  After attaching to your account open the storage account associated with your Function App.  Depending on how you organized your resource groups you can find the storage account by looking at the list of resources in the group that the function belongs to.

Once you drill down to that account look for the tables named AzureWebJobsHostLogsyyyymm.

Opening these tables will allow you to see the different types of log entries saved by the TraceWriter.  If you filter to the partition key “I” you will see the entries your functions posted.  You can further filter by name and date range to identify specific log entries.

log4net

If the default TraceWriter isn’t robust enough you can implement logging via a framework like log4net.  Unfortunately because of the architecture of Azure Functions this isn’t as easy as it would be with a normal desktop or web application.  The main stumbling block is the lack of ability to create custom configuration sections which these libraries rely on.  In this section I’ll outline a process for getting log4net to work inside your function.

The first thing that we need is the log4net library.  Add the log4net NuGet package by placing the following code in the project.json file.

{
  "frameworks": {
    "net46":{
      "dependencies": {
        "log4net": "2.0.5"
      }
    }
   }
}

To get around the lack of custom configuration sections we will bind a blob file containing your log4net configuration.  Simply take the log4net section of your configuration and save it to a text file.  Upload that to a storage container and bind it to your function using the full storage path.

Add the references to the log4net library and configure the logger.  Once you have that simply call the appropriate method on the logger and off you go.  A basic sample of the code for configuring and using the logger is listed below.  In this case I am actually using a SQL Server appender.

using System;
using System.Xml;
using log4net;

public static void Run(string input, TraceWriter log, string inputBlob)
{
    log.Info($"Log4NetPoc manually triggered function called with input: {input}");
    log.Info($"{inputBlob}");

    XmlDocument doc = new XmlDocument();
    doc.LoadXml(inputBlob);
    XmlElement element = doc.DocumentElement;

    log4net.Config.XmlConfigurator.Configure(element);

    ILog logger = LogManager.GetLogger("AzureLogger");

    logger.Debug($"Test log message from Azure Function", new Exception("This is a dummy exception"));
   
}

Summary

By no means does this post cover every aspect of these two logging approaches or all possible logging approaches for Azure Functions.  In future posts I will also cover Application Insights.  In any case it is always important to have logging for your application.  Find the tool that works for your team and implement it.

Building Azure Functions: Part 3 – Coding Concerns Feb 02

In this third part of my series on Azure Function development I will cover a number of development concepts and concerns.  These are just some of the basics.  You can look for more posts coming in the future that will cover specific topics in more detail.

General Development

One of the first things you will have to get used to is developing in a very stateless manner.  Any other .NET application type has a class at its base.  Functions, on the other hand, are just what they say, a method that runs within its own context.  Because of this you don’t have anything resembling a global or class level variable.  This means that if you need something like a logger in every method you have to pass it in.

[Update 2016-02-13] The above information is not completely correct.  You can implement function global variables by defining them as private static.

You may find that it makes sense to create classes within your function, either as DTOs or to make the code more manageable.  Start by adding a .csx file in the files view pane of your function.  The same coding techniques and standards apply as in your Run.csx file; otherwise develop the class as you would any other .NET class.
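As a minimal sketch (the file and class names are illustrative), a DTO defined in its own .csx file can be pulled into Run.csx with the #load directive:

    // OrderDto.csx
    public class OrderDto
    {
        public string Id { get; set; }
        public decimal Total { get; set; }
    }

    // Run.csx
    #load "OrderDto.csx"

    public static void Run(string myQueueItem, TraceWriter log)
    {
        var order = new OrderDto { Id = myQueueItem, Total = 0m };
        log.Info($"Processing order {order.Id}");
    }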

In the previous post I showed how to create App Settings.  If you took the time to create them you are going to want to be able to retrieve them.  The GetEnvironmentVariable method of the Environment class gives you the same capability as using AppSettings from ConfigurationManager in traditional .NET applications.

System.Environment.GetEnvironmentVariable("YourSettingKey")

A critical coding practice for functions that use perishable resources such as queues is to make sure that if you catch and log an exception you rethrow it so that your function fails.  This will cause the queue message to remain on the queue instead of being dequeued.
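A minimal sketch of that pattern (ProcessItem is a placeholder for your own logic) looks like this:

    public static void Run(string myQueueItem, TraceWriter log)
    {
        try
        {
            ProcessItem(myQueueItem);
        }
        catch (Exception ex)
        {
            log.Error($"Failed to process: {myQueueItem}", ex);
            throw; // rethrow so the function fails and the message stays on the queue for retry
        }
    }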

Debugging

It can be hard to read the log when the function is running full speed since instances run in parallel but report to the same log.  I would suggest that you add the process ID to your TraceWriter logging messages so that you can correlate them.

Even more powerful is the ability to remote debug functions from Visual Studio.  To do this open your Server Explorer and connect to your Azure subscription.  From there you can drill down to the Function App in App Services and then to the run.csx file in the individual function.  Once you have opened the code file and placed your breakpoints, right-click the function and select Attach Debugger.  From there it acts like any other Visual Studio debugging session.

Race Conditions

I wanted to place special attention on this subject.  As with any highly parallel/asynchronous processing environment you will have to make sure that you take into account any race conditions that may occur.  If at all possible keep the functionality that you create to non-related pieces of data.  If it is critical that items in a queue, blob container or table storage are processed in order then Azure Functions are probably not the right tool for your solution.

Summary

Azure Functions are one of the most powerful units of code available.  Hopefully this series gives you a starting point for your adventure into serverless applications and you can discover how they can benefit your business.

Building Azure Functions: Part 2–Settings And References Feb 01

This is the second post in a series on building Azure Functions.  In this post I’ll continue by describing how to add settings to your function and reference different assemblies to give you more capabilities.

Settings

Functions do not have configuration files, so you must add app settings and connection strings through the settings page.  The settings are maintained at the Function App level and not per individual function.  While this allows you to share common configuration values, it means that if your custom assemblies need different configuration values per function, each function will have to live in a separate Function App.

To get to them go to the Function App Settings link at the lower left of your Function App’s main page and then click the Configure App Settings button, which will bring you to the settings blade.  At that point it works the same as any .NET configuration file.

At some point I would like to see the capability of importing and exporting settings, since maintaining them individually by hand leads to human error and less reliable application lifecycle management.

Another drawback to the Azure Functions development environment is that at the time of this post you don’t have the ability to leverage custom configuration sections.  The main place I have found this to cause heartburn is using logging libraries such as log4net where the most common scenario is to use a custom configuration section to define adapters and loggers.

Referencing Assemblies And NuGet

No .NET application is very useful if you can’t reference all of the .NET Framework as well as third party and your own custom assemblies.  There is no Add References menu for Azure Functions, but there are multiple ways to add references.  Let’s take a look at each.

There are a number of .NET assemblies that are automatically referenced for your Function application.  There is a second group of assemblies that are available but need to be specifically referenced.  For a partial list consult the Azure Functions documentation here.  You can also load your own custom assemblies or bring in NuGet packages.

In order to load NuGet packages you need to create a project.json file.  Do this by clicking the View Files link in the upper right corner of the editor blade and then the Add link below the file list pane.

project.json files require the same information that is contained in a packages.config file, but formatted as JSON, as shown in the example below.  Once you save this file and reference the assembly in your Run.csx file, Azure will load the designated packages.
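A minimal example, using the same format as the log4net package file shown in the logging post above, looks like this:

    {
      "frameworks": {
        "net46": {
          "dependencies": {
            "log4net": "2.0.5"
          }
        }
      }
    }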

If you have custom libraries that you want to leverage you will need to add a bin folder to your function.  The easiest way I have found to do this is to open the App Service Editor from the Function App Settings page.  This will open up what is essentially Visual Studio Code in a browser.  Navigate the file tree to your function under wwwroot.  Right click your function name and select New Folder.  The folder must be named “bin”.  You can then right click the bin folder and upload your custom assemblies.

Once you have an assembly available you need to reference it using the “#r” directive as shown below.  You will notice that native assemblies and NuGet-loaded libraries do not need the dll extension specified, but it must be added for custom assemblies.

#r "System.Xml"
#r "System.Xml.Linq"
#r "System.Data.Entity"
#r "My.Custom.Data.dll"
#r "My.Custom.Domain.dll"
#r "Newtonsoft.Json"
#r "Microsoft.Azure.Documents.Client"
#r "Microsoft.WindowsAzure.Storage"

Now we are ready to declare our normal using statements and get down to the real business of functions.

Summary

After this post we have our trigger, bindings, settings and dependent assemblies.  This still isn’t enough for a useful function.  In the next post I will cover coding and debugging concerns to complete the story.

Building Azure Functions: Part 1–Creating and Binding Jan 31

The latest buzz word is serverless applications.  Azure Functions are Microsoft’s offering in this space.  As with most products that are new on the cloud Azure Functions are still evolving and therefore can be challenging to develop.  Documentation is still being worked on at the time I am writing this so here are some things that I have learned while implementing them.

There is a lot to cover here so I am going to break this topic into a few posts:

  1. Creating and Binding
  2. Settings and References
  3. Coding Concerns

Creating A New Function

The first thing you are going to need to do is create a Function App.  This is an App Services product that serves as a container for your individual functions.  The easiest way I’ve found to start is to go to the main add (+) button on the Azure Portal and then do a search for Function App.

Click on Function App and then the Create button when the Function App blade comes up.  Fill in your app name, remembering that this is a container and not your actual function.  As with other Azure features you need to supply a subscription, resource group and location.  Additionally, for a Function App you need to supply a hosting plan and storage account.  If you want to take full benefit of Function App scaling and pricing, leave the default Consumption Plan.  This way you only pay for what you use.  If you choose App Service Plan, you will pay for it whether your function is actually processing or not.

Once you click Create the Function App will start to deploy.  At this point you will start to create your first function in the Function App.  Once you find your Function App in the list of App Services, open it to get to the main blade.  It offers a quick start page, but I quickly found that it didn’t give me the options I needed beyond a simple “Hello World” function.  Instead press the New Function link at the left.  You will be offered a list of trigger-based templates, which I will cover in the next section.

Triggers

Triggers define the event source that will cause your function to be executed.  While there are many different triggers and there are more being added every day, the most common ones are included under the core scenarios.  In my experience the most useful are timer, queue, and blob triggered functions.

Queues and blobs require a connection to a storage account be defined.  Fortunately this is created with a couple of clicks and can be shared between triggers and bindings as well as between functions.  Once you have that you simply enter the name of the queue or blob container and you are off to the races.

When it comes to timer-dependent functions, the main topic you will have to become familiar with is CRON scheduling definitions.  If you come from a Unix background or have been working with more recent timer-based WebJobs this won’t be anything new.  Otherwise the simplest way to remember is that each time increment is defined by a division statement.
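As a quick illustration (these schedules are just examples), an Azure Functions CRON expression has six fields, and the */n form expresses “every n” of that increment:

    // {second} {minute} {hour} {day} {month} {day-of-week}
    // "0 */5 * * * *"   -> every 5 minutes
    // "0 30 9 * * *"    -> every day at 9:30 AM
    // "0 0 */2 * * 1-5" -> every 2 hours on weekdays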

In the case of queue triggers the parameter that is automatically added to the Run method signature will be the contents of the queue message as a string.  Similarly most trigger types have a parameter that passes values from the triggering event.
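A minimal queue-triggered Run.csx therefore looks something like this sketch, where the parameter name matches whatever name you give the trigger binding:

    public static void Run(string myQueueItem, TraceWriter log)
    {
        // myQueueItem holds the text of the queue message that triggered this execution
        log.Info($"Queue item received: {myQueueItem}");
    }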

Input and Output Bindings

Some of the function templates include an output binding.  If none of these fit your needs or you just prefer to have full control you can add a binding via the Integration tab.  The input and output binding definitions end up in the same function.json file as the trigger bindings. 

The one gripe I have with these bindings is that they connect to a specific entity at the beginning of your function.  I would find it preferable to bind to the parent container of whatever source you are binding to and have a set of standard commands available for normal CRUD operations.

Let’s say that you want to load an external configuration file from blob storage when your function starts.  The binding path specifies the container and the blob name.  The default format shows a variable “name” as the blob name.  This needs to be a variable that is available and populated when the function starts or an exception will be thrown.  As for your storage account, specify it by clicking the “new” link next to the dropdown and picking the storage account from those that you have available.  If you specified a storage account while defining your trigger and it is the same as your binding, it can be reused.

The convenient thing about blob bindings is that they are bound as strings and so for most scenarios you don’t have to do anything else to leverage them in your function.  You will have to add a string parameter to the function’s Run method that matches the name in the blob parameter name text box.
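As a hedged sketch (the parameter names are illustrative), a queue-triggered function with a blob input binding named configBlob would use a signature like this, with the blob contents arriving as a plain string:

    public static void Run(string myQueueItem, string configBlob, TraceWriter log)
    {
        log.Info($"Triggering message: {myQueueItem}");
        log.Info($"Configuration blob contents: {configBlob}");
    }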

Summary

That should give you a starting point for getting the shell of your Azure Function created.  In the next two posts I will add settings, assembly references and some tips for coding your function.

Sketchnotes: Microsoft Windows 10 Creator Update Event Oct 26

On October 26, 2016 Microsoft had an event to show off the future of Windows 10 and some new hardware.  I captured the announcements from that event in a set of sketchnotes.

Cloud Battles: Azure vs AWS–The Video Jun 29

Earlier this month Norm Murrin and I gave a talk at the Chicago Coder Conference.  We learned a lot about how the offerings of each company compare during our preparation.  In the end we came to the conclusion that there is no clear winner except those of us who are leveraging the resources.  Check out this video posted by the conference to get the blow-by-blow details.

Application Integration: Azure Functions Vs WebJobs Jun 02

[Updated]

UI development gets all the attention, but application integration is where the real work is done.  When it comes to application integration in the Azure ecosystem you had better learn how Functions and WebJobs are developed and under what conditions you should use each.  In this post I will try to answer those questions.

For me it is important that a solution is reasonably maintainable, deployable through environments and can be easily managed under source control.

Both products are built on the same code base and share the same base API.  From that perspective they are closely matched.  Functions do have the advantage of handling web hooks as opposed to simply timer and storage events with WebJobs.

There is another difference that I haven’t been able to prove yet, but I’ve seen mentioned in a couple of places.  It seems like Functions may take time to warm up since they aren’t always instantiated.  Since WebJobs are always running they would not incur this startup cost.  If immediate processing is important then WebJobs may be the more appropriate option for you.

When it comes to actual development I prefer to have the resources of Visual Studio to write and manage source code as well as package my deliverables for deployment.  As of this writing I have not been able to find a Visual Studio project type, which means you edit the code through a web browser.  This in-portal editor does allow you to integrate with Git or VSTS for source control.  I would expect at some point in the future we will get a Functions project type.

Both WebJobs and Functions can be written using C#/VB.NET and Node.js.  From the language availability perspective they are even.

Summary

So what is the real separating line between using one or the other?  From what I have experienced so far, if you need the web hooks then Functions are the right choice.  If you don’t need the web hooks and maintainability is your priority then WebJobs are the way to go.  I’m sure there are more reasons, but these are the most obvious in the early days of Functions.  As the products evolve I’ll post updates.

[Update]

Christopher Anderson (@crandycodes) from the Azure team replied via Twitter with the following:

You hit on some key points like lack of tooling/VS integration. We plan on addressing those before GA.
I think the major point missing is the dynamic scale functionality, pay per use. Functions scale automatically and don't cost a VM.
Also, if you run Functions in dedicated with always on, there is no cold start issues, but you pay per VM at that point.
WebJobs vs Functions is really: "Do I want to manage my own custom service?" Yes: WebJobs, No: Functions. Otherwise, similar power.

A TFS Developer In A GitHub World May 04

Git and GitHub have been around for a few years now.  They are becoming more popular by the day.  I finally got around to looking at them more closely over the last few months and decided to summarize the experiences.

My first experience with GitHub was not the most pleasant.  I was using Visual Studio 2013 which doesn’t seem to have the best integration story (or at least didn’t when I tried it).  The fact that it required that an existing repository be cloned via the GitHub desktop before Visual Studio knew anything about it was the biggest pain.

Visual Studio 2015, on the other hand, has a much better usage story.  You are able to log into your repository and get a clone of the repository without breaking out to another tool.  The commit process is pretty similar to that of TFS from that point on.

From my trials I have found that GitHub works well as a source control repository.  It is hard at first getting used to the non-Microsoft verbs that are used.  Retraining yourself that you have to do a commit and push before something is actually checked in, instead of just doing a check-in, takes some time.

As for working as a team I think that TFS still has the better features.  This may just be because GitHub isn’t as well integrated with Visual Studio.  Having customizable work items in TFS comes in very handy, especially on larger enterprise projects.

The wiki gives a good place to put documentation, but it doesn’t give you a place to manage Word, Excel, PowerPoint, PDF and Visio documents that might have important information about your project. This is where TFS and SharePoint really shine.

Another drawback I see compared to TFS is that GitHub repositories are public and can’t be private unless you have a paid account.  I can create a free TFS Online account that gives me private repositories and access for up to five users.  This makes it better for the individual developer or the small team.

Of course there is a third option: you can use Git in TFS.  This gives you the source control of Git with the project management features of TFS.  It took a little bit to get my existing code into the new Git-TFS project repository.  Then came the realization that the only source control viewer for Git repositories is in the portal.  The growing pains continue.

Summary

I am sure that the story around GitHub will improve over time, but right now it just seems like people are using it because it is what the cool kids are doing or they are working on open source projects.  If I have to advise a client I am going to suggest they go with the product with the most complete story and the best integration with their current toolset.  For now that is TFS, especially if you are a Microsoft development shop.

As for my GitHub experiment, it goes on, but I deleted the repository I had created for security reasons.  Stay tuned and see what else develops.  The next step is probably Git in TFS.