A few years back I created Skype Voice Changer Pro which I sold online, using Paddle as my payment provider. Whenever I make a sale (which isn’t too often these days thanks to an issue with recent versions of Skype), I get notified via a webhook. On receiving that webhook, I need to generate a license file and email it to the customer.

Azure Functions are perfect for this scenario. I can quickly create a secure webhook to handle the callback from Paddle, post a message onto a queue to trigger license generation, and then onto another queue to trigger sending an email.

Let’s see how we can set this up.

First of all, I need to create a new Azure Function, which I’ll create as a generic C# webhook:


The first thing I needed to do for my webhook was to edit the function.json file and remove the “webHookType” setting from the input httpTrigger binding. By default this is set to “genericJson”, which means the function can only accept webhooks with JSON in their body. Paddle’s webhook arrives as x-www-form-urlencoded content, so removing the webHookType setting allows us to receive the HTTP request.
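
After the edit, the httpTrigger binding in function.json ends up looking something like this (a sketch rather than the exact file; the point is simply that the webHookType line is gone, and the other settings stay as the template created them):

{
  "type": "httpTrigger",
  "direction": "in",
  "name": "req",
  "authLevel": "function"
}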


Now in our run.csx file we can use ReadAsFormDataAsync to get access to the form parameters.

Next, we need to validate the order. Azure Functions has built-in webhook validation for GitHub and Slack, but not for Paddle, so we must do this ourselves. It’s done using a shared secret, which we can set in the App Service configuration and access through ConfigurationManager in the same way we would for a regular web app.

If the order is valid, for now let’s just respond saying thank you. Paddle will include this text in their confirmation email to the customer.

public static async Task<object> Run(HttpRequestMessage req, TraceWriter log)
{
    var formData = await req.Content.ReadAsFormDataAsync();

    var orderId = formData["p_order_id"];
    var customerEmail = formData["customer_email"];
    var messageId = formData["message_id"];
    var customerName = formData["customer_name"];

    log.Info($"Received {orderId}");

    var sharedSecret = ConfigurationManager.ConnectionStrings["PaddleSharedSecret"].ConnectionString;
    if (!ValidateOrder(sharedSecret, customerEmail, messageId, log))
    {
        log.Info("Failed to Validate!");
        return req.CreateResponse(HttpStatusCode.Forbidden, new {
            error = "Invalid message id"
        });
    }

    return req.CreateResponse(HttpStatusCode.OK, new {
        greeting = $"Thank you for order {orderId}!"
    });
}

And here’s the C# code to validate a Paddle webhook:

public static bool ValidateOrder(string sharedSecret, string customerEmail, string messageId, TraceWriter log)
{
    if (customerEmail == null || messageId == null)
    {
        log.Warning("Missing email or message id");
        return false;
    }

    var input = HttpUtility.UrlEncode(customerEmail + sharedSecret);

    var md5 = System.Security.Cryptography.MD5.Create();
    byte[] inputBytes = Encoding.ASCII.GetBytes(input);
    byte[] hash = md5.ComputeHash(inputBytes);

    var sb = new StringBuilder();
    foreach (byte t in hash)
        sb.Append(t.ToString("x2")); // hex encode each byte of the hash
    var expectedId = sb.ToString();
    var success = (expectedId == messageId);
    if (!success)
        log.Warning($"Expected {expectedId}, got {messageId}");
    return success;
}

Now the only thing left to do is to trigger the license generation and email, which we’ll do by posting a message to a queue. This is preferable to doing everything there and then in the webhook, as queues allow our webhook to respond quickly and give us retries if the email service is temporarily down. Breaking the process down into three small, loosely coupled pieces also gives us maintainability and testability benefits.

We set up sending a message to a queue by going to the “Integrate” section in the portal and adding a new output binding of type Azure Storage Queue. This is the easiest option to set up, as there’s already a storage account connection associated with your function app called “AzureWebJobsStorage” that you can use (although arguably you should create your own storage account to keep your application data separate from the Azure Functions runtime’s data, which resides in that one).


I’ll call my queue “orders”, and Azure Functions will automatically create it for me.
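
For reference, the queue output binding that ends up in function.json looks something like this (“outputQueueItem” is the binding parameter name we’ll use in the code below):

{
  "type": "queue",
  "direction": "out",
  "name": "outputQueueItem",
  "queueName": "orders",
  "connection": "AzureWebJobsStorage"
}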


To send the message to the queue there are a number of options, but I chose to create a strongly typed class “OrderInfo” and use an IAsyncCollector<T> parameter for the binding. This has the advantage of working with async functions (which mine is), and it also supports sending zero or more messages to the queue, which is handy since we won’t be generating a license if the webhook is invalid.

Here are the key bits of the updated function:

public class OrderInfo 
{
    public string OrderId { get; set; }
    public string CustomerEmail { get; set; }
    public string CustomerName { get; set; }
    public string LicenseDownloadCode { get; set; }
}

public static async Task<object> Run(HttpRequestMessage req, IAsyncCollector<OrderInfo> outputQueueItem, TraceWriter log)
{
    // ... form parsing and order validation here, as shown above ...

    // send on to the queue to trigger license generation
    var orderInfo = new OrderInfo {
        OrderId = orderId,
        CustomerEmail = customerEmail,
        CustomerName = customerName,
        LicenseDownloadCode = licenceDownloadCode
    };
    await outputQueueItem.AddAsync(orderInfo);

    return req.CreateResponse(HttpStatusCode.OK, new {
        greeting = $"Thank you for order {orderId}!"
    });
}

As you can see, it’s super easy to send the message: just call AddAsync on the collector.

Finally, we need to handle messages in the queue. There’s a super feature in the portal where if you go to the Integrate tab for your function and select the queue output binding, there’s a button to set up a new function that is triggered by messages on that queue:


Clicking this auto-fills the bindings for my new function, giving me a function all set up to read off the queue and log each message received:
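
The generated function itself is only a few lines; it looks roughly like this (the parameter name matches whatever you chose in the trigger binding):

public static void Run(string myQueueItem, TraceWriter log)
{
    log.Info($"C# Queue trigger function processed: {myQueueItem}");
}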



Now it’s just a case of putting my license generation code into this function, as well as posting to another queue to trigger a third function which sends out the license email. Azure Functions includes a built-in SendGrid binding which makes sending emails very easy (although I’m currently using a different service).
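
As a rough sketch of where that’s heading (binding the incoming queue message straight to OrderInfo works because the message is JSON, but the “emailQueue” output binding name and the elided license generation step are purely illustrative here):

public static void Run(OrderInfo order, ICollector<OrderInfo> emailQueue, TraceWriter log)
{
    log.Info($"Generating license for order {order.OrderId}");
    // ... license generation would happen here ...
    emailQueue.Add(order); // hand off to the function that sends the email
}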

We can easily test our function using Postman (we can’t use the portal in this case as it only sends JSON). Sure enough, the webhook function succeeds, and we can see in the logs for the license generation function that a message was indeed posted to the queue.



Using Azure Functions to handle webhooks is a big improvement over the quick and dirty code I originally created, which simply did everything synchronously in a hidden API sitting on my website. That approach meant my order webhook code was coupled to the web server, which got in the way of things like switching the website over to WordPress. With Azure Functions I can move this webhook (and several others, for things like letting users report errors from the app) out of my website and into small, loosely coupled functions.


The great thing about “serverless” code is that you don’t need to worry about servers at all. If my function gets invoked 10 times, all 10 invocations might run on the same server, or they might run on 10 different servers. I don’t need to know or care.

But suppose every time my function runs I need to look something up in a database. I might decide that it would be nice to temporarily cache the response in memory so that subsequent runs of my function can run a bit faster (assuming they run on the same server as the previous invocation).

Is that possible in Azure Functions? I did a bit of experimenting to see how it could be done.

To keep things simple, I decided to make a C# webhook function that counted how many times it had been called. And I counted in four ways. First, using a static int variable. Second, using the default MemoryCache. Third, using a text file in the home directory. Fourth, using a per-machine text file in the home directory. Let’s see what happens with each of these methods.

1. Static Integer

If you declare a static variable in your run.csx file, then the contents of that variable are available to all invocations of your function running on the same server. So if our function looks like this:

static int invocationCount = 0;

public static async Task<object> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info($"Webhook triggered {++invocationCount}");
    return ...; // rest of the response elided
}

If we call it a few times, we’ll see the invocation count steadily rising. Obviously this code is not thread-safe, but it shows that the memory persists between invocations on the same server.

Unsurprisingly, every time you edit your function, the count will reset. But you’ll notice it resets at other times too. There’s no guarantee that what you store in a static variable will be present on the next invocation. But it’s absolutely fine for temporarily caching something to speed up function execution.

2. MemoryCache

The next thing I wanted to try was sharing memory between two different functions in the same function app. This would allow you to share a cache between functions. To try this out I decided to use MemoryCache.Default.

static MemoryCache memoryCache = MemoryCache.Default;

public static async Task<object> Run(HttpRequestMessage req, TraceWriter log)
{
    var cacheObject = memoryCache["cachedCount"];
    var cachedCount = (cacheObject == null) ? 0 : (int)cacheObject;
    memoryCache.Set("cachedCount", ++cachedCount, DateTimeOffset.Now.AddMinutes(5));

    log.Info($"Webhook triggered memory count {cachedCount}");
    return ...; // rest of the response elided
}

Here we try to find the count in the cache, increment it, and save it with a five minute expiry. If we copy this same code to two functions within the same Azure Function App, then sure enough they each can see the count set by the other one.

Again, this cache will lose its contents every time you edit your code, but it’s nice to know you can share in-memory data between two functions running on the same server.

3. On Disk Shared Across All Servers

Azure function apps have a %HOME% directory on disk which is actually a network share. If we write something into that folder, then all instances of our functions, whatever server they are running on, can access it. Let’s put a text file in there containing the invocation count. Here’s a simple helper method I made to do that:

private static int IncrementInvocationCountFile(string fileName)
{
    var folder = Environment.ExpandEnvironmentVariables(@"%HOME%\data\MyFunctionAppData");
    var fullPath = Path.Combine(folder, fileName);
    Directory.CreateDirectory(folder); // no-op if it already exists
    var persistedCount = 0;
    if (File.Exists(fullPath))
        persistedCount = int.Parse(File.ReadAllText(fullPath));
    File.WriteAllText(fullPath, (++persistedCount).ToString());
    return persistedCount;
}

We can call it like this:

public static async Task<object> Run(HttpRequestMessage req, TraceWriter log)
{
    var persistedCount = IncrementInvocationCountFile("invocations.txt");
    log.Info($"Webhook triggered {persistedCount}");
    return ...; // rest of the response elided
}

Obviously this too isn’t thread-safe, since multiple instances of our function could end up reading and writing the same file at once, but the key point is that anything in this folder is visible to all instances of our function, even across different servers (although it was several days before I saw my test function actually run on a different server). And unlike the in-memory counter, this won’t be lost if your function restarts for any reason.

4. Per Machine File

What if you want to use disk storage for temporary caching, but only per machine? Well, each server does have a local disk, and you can write data there by writing to the %TEMP% folder. This gives you temporary storage that persists on the same server between invocations of functions in the same function app. But unlike things you put in %HOME%, which the Azure Functions framework won’t delete, things you put in %TEMP% should be thought of as transient. The temp folder is probably best used for storing data needed during a single function invocation.
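
If you did want to try this, it’s just a case of resolving the temp folder and writing to it; a minimal sketch (assuming System.IO is in scope, as in the earlier examples):

// the local, per-machine, transient temp folder on whichever server we're running on
var tempFile = Path.Combine(Path.GetTempPath(), "scratch.txt");
File.WriteAllText(tempFile, DateTime.UtcNow.ToString("o"));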

For my experiment I decided to use System.Environment.MachineName as part of the filename, so each server would maintain its own invocation count file in the %HOME% folder.

public static async Task<object> Run(HttpRequestMessage req, TraceWriter log)
{
    var machineCount = IncrementInvocationCountFile(System.Environment.MachineName + ".txt");
    log.Info($"Webhook triggered {machineCount}");
    return ...; // rest of the response elided
}

And so now I can use Kudu to look in my data folder and see how many different machines my function has run on.

Should I do this?

So you can use disk or memory to share state between Azure Functions. But does that mean you should?

Well, first of all you must consider thread safety. Multiple instances of your function could be running at the same time, so if you used the techniques above you could get file access exceptions, and you’d need to protect the static int variable from concurrent access (the MemoryCache example is already thread-safe).
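
For example, a minimal thread-safe variant of the static counter from earlier (just a sketch; the original experiment didn’t bother) would use Interlocked:

static int invocationCount = 0;

public static async Task<object> Run(HttpRequestMessage req, TraceWriter log)
{
    // Interlocked.Increment is atomic, so concurrent invocations can't lose an update
    var count = Interlocked.Increment(ref invocationCount);
    log.Info($"Webhook triggered {count}");
    return req.CreateResponse(HttpStatusCode.OK, new { count });
}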

And secondly, be aware of the limitations. Anything stored in memory can be lost at any time. So only use it for temporarily caching things to improve performance. By contrast anything stored in the %HOME% folder will persist across invocations, restarts and different servers. But it’s a network share. You’re not really storing it “locally”, so it’s not all that different from just putting the data you want to share in blob storage or a database.

One of my favourite things about the new Azure Functions service is how easy it is to quickly prototype application ideas. Without needing to provision or pay for a web server, I can quickly set up a webhook, a message queue processor or a scheduled task.

But what if I’m prototyping a web page and want a simple REST-like CRUD API? I decided to see if I could build a very simple in-memory database using Node.js and Azure Functions.

Step 0 – Create a function app

It’s really easy to get started with Azure Functions. If you’ve not yet used them, sign in to functions.azure.com and you’ll be offered the opportunity to open an existing function app or create a new one:


It’s just a matter of choosing which region you want to host it in. By default this will create a function app on the Dynamic pricing tier, which is what you want since you only pay for what you use, and there’s a generous free monthly grant, so for most prototyping purposes it’s likely to be completely free.

Step 1 – Create a new NodeJS Function

From within the portal, select “New Function”, and choose the “Generic Webhook – Node” option.


Give your function a name relating to the resource you’ll be managing, as it will be part of the URL:


Step 2 – Enable more HTTP Verbs

In the portal, under your new function, select the “Integrate” tab, and then choose “Advanced Editor” to let you edit the function.json file directly.


You’ll want to add a new “methods” property to the “httpTrigger” binding, containing all the verbs you want to support. I’ve added get, post, put, patch and delete.
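
The binding ends up looking something like this (any other properties the template created on it can stay as they are):

{
  "type": "httpTrigger",
  "direction": "in",
  "name": "req",
  "methods": [ "get", "post", "put", "patch", "delete" ]
}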


Step 3 – Add the In-Memory CRUD JavaScript File to your Function

I’ve made a simple in-memory database supporting CRUD in node.js. I’m not really much of a JavaScript programmer or a REST expert, so I’m sure there’s a lot of scope for improvement (let me know in the comments), but here’s what I came up with:


To get this into your function you have a few options. The easiest is to go to the “Develop” tab, hit “View Files”, and drag and drop my inMemoryCrud.js file straight in. Or you can just create a new file and paste the contents in.


If you look through it, you’ll see it supports GET of all items or a single item by id, inserting with POST, deleting with DELETE, replacing by id with PUT and even partial updates with PATCH. It optionally lets you specify required fields for POST and PUT, and there’s a seedData method to give your API some initial data if needed.

Obviously, since this is an in-memory database there are a few caveats. It will reset every time your function restarts, which will happen whenever you edit the code for your function, but can happen at other times too. Also, if there are two servers running your function, each will have its own in-memory database, but Azure Functions is unlikely to scale your function app out to two instances unless you are putting it under heavy load.

Step 4 – Update index.js

Our index.js file contains the entry point for the function. All we need to do is import our in-memory CRUD JavaScript file, seed it with any starting data we want, and then, when our function is called, pass the request off to handleRequest, optionally specifying any required fields.

Here’s an example index.js:
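
(This is only a minimal sketch of its shape; the real file is in the gist, and the exact seedData and handleRequest signatures below are assumed from the description above, so check them against inMemoryCrud.js.)

// index.js - a minimal sketch
var crud = require('./inMemoryCrud');

// optionally give the API some starting data
crud.seedData([
    { title: 'Learn Azure Functions', completed: false }
]);

module.exports = function (context, req) {
    // hand the request off to the in-memory CRUD handler,
    // listing any fields that must be present on POST/PUT
    crud.handleRequest(context, req, ['title']);
};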

Step 5 – Enabling CORS

If you’re calling the function from a web-page then you’ll need to enable CORS. Thankfully this is very easy to do, although it is configured at the function app level rather than the function level. In the portal, select function app settings and then choose Configure CORS.



In there, you’ll see that there are some allowed origins by default, but you can easily add your own here, as I’ve done for localhost:


Step 6 – Calling the API

The Azure function portal tells you the URL of your function which is based on your function app name and function name. It looks like this:


The code query string parameter is Azure Functions’ way of securing the function, but for a prototype you may not want to bother with it. You can turn it off by setting authLevel to anonymous on the httpTrigger binding in the function.json file.


The other slight annoyance is that Azure Functions doesn’t support us using nice URLs like https://myfunctionapp.azurewebsites.net/api/Tasks/123 to GET or PUT a specific id. Instead we must supply the id in the query string: https://myfunctionapp.azurewebsites.net/api/Tasks?id=123

Is there a quicker way to set this up?

If that seems like quite a bit of work, remember that all an Azure function is, is a folder with a function.json file and some source files. So you can simply copy and paste the three files from my GitHub gist into as many folders as you want and hey presto, you have in-memory web APIs for as many resources as you need – just by copying and renaming folders – e.g. orders, products, users etc. If you’re deploying via git (which you should be), this is very easy to do.

What if I want persistence?

Obviously in-memory is great for simple tests, but at some point you’ll want persistence. There are several options here, including Azure DocumentDb and Table Storage, but this post has got long enough, so I’ll save that for a future blog post.


The Azure C# SDK contains the NamespaceManager.GetQueues method to retrieve a list of all the queues in your Azure Service Bus namespace. This is great if you’ve just got a few queues, but if you have hundreds or even thousands, you might want to filter them.

The good news is that there’s an overload of GetQueues that takes a filter string. The bad news is that the MSDN documentation gives you no clues whatsoever as to what you’re supposed to put in that string.

So after a lot of guesswork, googling and experimentation, here’s what I’ve worked out.

Filter by Queue Name

To get all the queues that start with a particular prefix, you can use the startswith method, and check that it returns true:

var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);
var myQueues = namespaceManager.GetQueues("startswith(path, 'mheath') eq true");

I did experiment to see whether methods like endswith, contains or substringof were also available, but couldn’t get anything to work.

Filter by Message Count

You can also query for all queues based on their message count (which includes the number of dead-lettered messages). Here we’re looking for queues with over 1000 messages. The queue objects themselves are instances of QueueDescription, which has a MessageCountDetails property that lets you see the breakdown of messages by type (active, dead-letter, scheduled etc.).

var backedUpQueues = namespaceManager.GetQueues("messageCount Gt 1000");
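
Each result is a QueueDescription, so you can dig into that breakdown; for example, something along these lines:

foreach (var queue in backedUpQueues)
{
    var details = queue.MessageCountDetails;
    Console.WriteLine($"{queue.Path}: {details.ActiveMessageCount} active, {details.DeadLetterMessageCount} dead-lettered");
}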

Filter by Date

Finally you can query for all queues by their created date, or last accessed or updated date. Here’s a query for inactive queues that haven’t been accessed in the past few months:

namespaceManager.GetQueues("AccessedAt Lt '2016-06-01T00:00:00Z'")

What about Topics and Subscriptions?

NamespaceManager has GetTopics and GetSubscriptions methods which also take filter strings, so you should be able to apply the same operators to get filtered topics and subscriptions.
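
For example, something like this should work (I’ve only verified the queue case myself; the topic name here is just a placeholder):

var myTopics = namespaceManager.GetTopics("startswith(path, 'mheath') eq true");
var busySubscriptions = namespaceManager.GetSubscriptions("mytopic", "messageCount Gt 1000");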


Azure Service Bus Topics and Subscriptions offer you a powerful way to configure multiple subscribers for a single message. So you could post an OrderCreated message to a topic, and one subscriber might initiate payment processing, another might send a confirmation email, and another might write to an audit log. So using topics and subscriptions instead of queues offers you a lot of flexibility.

But sometimes in a topics and subscriptions setup you might find your subscriber is only interested in receiving a subset of the messages posted to that topic. One way to handle this is for your subscribers to simply ignore the messages they don’t care about, but that is wasteful since you pay for every receive operation.

A better approach is to create a filtered subscription which will only receive the messages you are interested in. A filter can be based on any property of the BrokeredMessage, except the body, since that would require every message to be deserialized in order to be handed on to the correct subscriptions.

Let’s see how easy it is to get started with filtered subscriptions.

Step 1 – Create Your Topic

Subscriptions are based off of topics, so we need to ensure we have a topic. Here’s some simple code to create a topic if it doesn’t already exist:

string connectionString = "..."; // your Service Bus connection string here

// the names of topics and subscriptions we'll be working with
const string topicName = "MyTestTopic";
const string allMessagesSubName = "AllMessages";
const string filteredSubName1 = "Filtered1";
const string filteredSubName2 = "Filtered2";

// let's create the topic if it doesn't already exist...
var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);
if (!namespaceManager.TopicExists(topicName))
{
    var td = new TopicDescription(topicName);
    namespaceManager.CreateTopic(td);
}

Step 2 – Create the Filtered Subscriptions

You can create subscriptions at any time, but one important thing to remember with topics is that, unlike queues, if there are no subscriptions then messages sent to the topic are simply lost. So make sure you create your subscriptions before sending messages to the topic.

Let’s create three subscriptions on our topic. The first one will have no filter, and so will receive all messages.

The second is going to filter on a user-defined message property called “From” that we will add to each message we send. It uses a SqlFilter, which lets us use a SQL-like syntax to say that we want all messages whose ‘From’ property ends in ‘Smith’.

Our third subscription will use a built-in property of the BrokeredMessage. We want to only receive messages whose Label property is set to “important”. We have to prefix Label with sys. in order to indicate this is a built-in property rather than one of the custom user defined properties.

if (!namespaceManager.SubscriptionExists(topicName, allMessagesSubName))
    namespaceManager.CreateSubscription(topicName, allMessagesSubName);

if (!namespaceManager.SubscriptionExists(topicName, filteredSubName1))
    namespaceManager.CreateSubscription(topicName, filteredSubName1, new SqlFilter("From LIKE '%Smith'"));

if (!namespaceManager.SubscriptionExists(topicName, filteredSubName2))
    namespaceManager.CreateSubscription(topicName, filteredSubName2, new SqlFilter("sys.Label='important'"));

Step 3 – Send Some Messages

You don’t send your messages directly to a subscription; you send them to the topic, and it forwards them to all the relevant subscriptions based on their filters.

Here we send three messages, setting up the “From” user defined property for each one, and also setting the built-in Label property for two of them.

var message1 = new BrokeredMessage("Hello World");
message1.Properties["From"] = "Ian Wright";

var message2 = new BrokeredMessage("Second message");
message2.Properties["From"] = "Alan Smith";
message2.Label = "important";

var message3 = new BrokeredMessage("Third message");
message3.Properties["From"] = "Kelly Smith";
message3.Label = "information";

var client = TopicClient.CreateFromConnectionString(connectionString, topicName);
client.Send(message1);
client.Send(message2);
client.Send(message3);

Step 4 – Receive Messages

Now we need to listen on each of those three subscriptions. I’m just going to use the SubscriptionClient’s ReceiveBatch method to pull off a batch of messages from each subscription. Here’s how to perform ReceiveBatch for a single subscription:

var subClient = SubscriptionClient.CreateFromConnectionString(connectionString, topicName, subscriptionName);

var received = subClient.ReceiveBatch(10, TimeSpan.FromSeconds(5));
foreach (var message in received)
{
    Console.WriteLine("{0} '{1}' Label: '{2}' From: '{3}'",
        subscriptionName,
        message.GetBody<string>(),
        message.Label,
        message.Properties["From"]);
}


If this works correctly, our unfiltered subscription will receive all three messages, the "From LIKE '%Smith'" filter will get two, and the "sys.Label='important'" filter will get one:

AllMessages 'Hello World' Label: '' From: 'Ian Wright'
AllMessages 'Second message' Label: 'important' From: 'Alan Smith'
AllMessages 'Third message' Label: 'information' From: 'Kelly Smith'
Filtered1 'Second message' Label: 'important' From: 'Alan Smith'
Filtered1 'Third message' Label: 'information' From: 'Kelly Smith'
Filtered2 'Second message' Label: 'important' From: 'Alan Smith'


Bonus Step – Modifying Your Filters

You might get into a situation where you’ve already created a subscription but now you want it to be filtered. It is possible to change the filter by deleting the default “rule” that was created when you initially created the subscription (it will be called “$Default”), and adding your own new rule with the new filter. For safety, let’s add the new rule before we delete the old one, to avoid a window with no rules during which we could miss a message we wanted.

var subClient = SubscriptionClient.CreateFromConnectionString(connectionString, topicName, subscriptionName);

// add the new rule first...
var newRule = new RuleDescription("FilteredRule", new SqlFilter("From LIKE '%Smith'"));
subClient.AddRule(newRule);

// ...then remove the default catch-all rule that was created with the subscription
subClient.RemoveRule("$Default");
And now your subscription has the new filter applied. Note that any messages already forwarded to this subscription before this filter was applied will still be in the subscription even if they don’t match the new filter.