I'm very excited to announce that in October I'll be speaking on LINQ at Techorama Netherlands, sharing some of my best practices for becoming more effective with LINQ.

This means it's time for another LINQ Challenge! These are short programming challenges that can be solved with a single LINQ expression. Of course, that might not always make for the most readable code, so feel free to solve these with or without the help of LINQ, and solutions in other languages are welcome too. If you're using LINQ, then the MoreLINQ library often has extension methods that simplify the task.

This challenge is a little trickier than the previous ones, so if you'd like a gentler introduction, try some of the earlier challenges first (I've linked to the answers, but I recommend trying to solve them yourself before looking at the solutions).

Problem 1 - Longest Sequence

The following string contains the number of sales made per day over a month:

"1,2,1,1,0,3,1,0,0,2,4,1,0,0,0,0,2,1,0,3,1,0,0,0,6,1,3,0,0,0"

How long is the longest sequence of days without a sale? (in this example it's 4)
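Spoiler alert: if you want to check your answer, here's a sketch of one possible single-expression approach (not necessarily the solution I'll post later), folding over the parsed numbers while tracking the current and longest runs of zero-sale days:

```csharp
using System;
using System.Linq;

var sales = "1,2,1,1,0,3,1,0,0,2,4,1,0,0,0,0,2,1,0,3,1,0,0,0,6,1,3,0,0,0";

// fold over the daily figures, tracking the longest and current runs of zero days
var longest = sales.Split(',')
    .Select(int.Parse)
    .Aggregate(
        (max: 0, current: 0),
        (acc, n) => n == 0
            ? (max: Math.Max(acc.max, acc.current + 1), current: acc.current + 1)
            : (max: acc.max, current: 0),
        acc => acc.max);

Console.WriteLine(longest); // 4
```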

Problem 2 - Full House

In poker a hand is a "full house" if it contains three cards of one value and two of another value. The following string defines five poker hands, separated by a semicolon:

"4♣ 5♦ 6♦ 7♠ 10♥;10♣ Q♥ 10♠ Q♠ 10♦;6♣ 6♥ 6♠ A♠ 6♦;2♣ 3♥ 3♠ 2♠ 2♦;2♣ 3♣ 4♣ 5♠ 6♠".

Write a LINQ expression that returns a sequence containing only the "full house" hands.
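Spoiler alert: one possible approach (a sketch, not necessarily my official solution) is to group each hand's cards by value and check whether the group sizes are exactly 2 and 3:

```csharp
using System;
using System.Linq;

var hands = "4♣ 5♦ 6♦ 7♠ 10♥;10♣ Q♥ 10♠ Q♠ 10♦;6♣ 6♥ 6♠ A♠ 6♦;2♣ 3♥ 3♠ 2♠ 2♦;2♣ 3♣ 4♣ 5♠ 6♠";

// a full house groups into exactly a pair and a three-of-a-kind
var fullHouses = hands.Split(';')
    .Where(hand => hand.Split(' ')
        .GroupBy(card => card.Substring(0, card.Length - 1)) // strip the trailing suit symbol
        .Select(g => g.Count())
        .OrderBy(count => count)
        .SequenceEqual(new[] { 2, 3 }));

Console.WriteLine(string.Join(";", fullHouses));
// 10♣ Q♥ 10♠ Q♠ 10♦;2♣ 3♥ 3♠ 2♠ 2♦
```

Note that this correctly rejects the four-of-a-kind hand (group sizes 1 and 4).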

Problem 3 - Christmas Days

What day of the week is Christmas day (25th December) on for the next 10 years (starting with 2018)? The answer should be a string (or sequence of strings) starting: Tuesday,Wednesday,Friday,...
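Spoiler alert: here's a sketch of one possible approach (not necessarily the solution I'll post later), relying on DateTime.DayOfWeek to name the day for each year:

```csharp
using System;
using System.Linq;

// DateTime.DayOfWeek gives us the day name for Christmas day in each year
var days = string.Join(",",
    Enumerable.Range(2018, 10)
        .Select(year => new DateTime(year, 12, 25).DayOfWeek));

Console.WriteLine(days);
// Tuesday,Wednesday,Friday,Saturday,Sunday,Monday,Wednesday,Thursday,Friday,Saturday
```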

Problem 4 - Anagrams

From the following dictionary of words,

"parts,traps,arts,rats,starts,tarts,rat,art,tar,tars,stars,stray"

return all words that are an anagram of "star" (no leftover letters allowed).
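Spoiler alert: one possible approach (just a sketch) is the classic anagram test of comparing sorted letters, which also rules out words with leftover letters:

```csharp
using System;
using System.Linq;

var words = "parts,traps,arts,rats,starts,tarts,rat,art,tar,tars,stars,stray";

// two words are anagrams if their sorted letters are identical;
// SequenceEqual also rejects words of a different length
var anagrams = words.Split(',')
    .Where(word => word.OrderBy(c => c).SequenceEqual("star".OrderBy(c => c)));

Console.WriteLine(string.Join(",", anagrams)); // arts,rats,tars
```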

Problem 5 - Initial Letters

From the following list of names

"Santi Cazorla, Per Mertesacker, Alan Smith, Thierry Henry, Alex Song, Paul Merson, Alexis Sánchez, Robert Pires, Dennis Bergkamp, Sol Campbell"

find any groups of people who share the same initials as each other.
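Spoiler alert: a sketch of one possible approach (not necessarily the one I'll post later) is to key each name by its initials and keep any group with more than one member:

```csharp
using System;
using System.Linq;

var names = "Santi Cazorla, Per Mertesacker, Alan Smith, Thierry Henry, Alex Song, " +
            "Paul Merson, Alexis Sánchez, Robert Pires, Dennis Bergkamp, Sol Campbell";

// group by the initial letter of each part of the name,
// then keep only the groups with more than one member
var groups = names.Split(',')
    .Select(name => name.Trim())
    .GroupBy(name => string.Concat(name.Split(' ').Select(part => part[0])))
    .Where(g => g.Count() > 1)
    .Select(g => string.Join(", ", g));

Console.WriteLine(string.Join("; ", groups));
// Santi Cazorla, Sol Campbell; Per Mertesacker, Paul Merson; Alan Smith, Alex Song, Alexis Sánchez
```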

Problem 6 - Video Editing

A video is two hours long exactly, and we want to make some edits, cutting out the following time ranges (expressed in H:MM:SS):

"0:00:00-0:00:05;0:55:12-1:05:02;1:37:47-1:37:51".

(You can assume that the input ranges are in order and contain no overlapping portions)

We would like to turn this into a sequence of time-ranges to keep. So in this example, the output should be:

"0:00:05-0:55:12;1:05:02-1:37:47;1:37:51-2:00:00"

Share your answers

I hope you have fun solving these. Why not post your solutions in GitHub Gists and share your approach with the rest of us in the comments below? Although I've already created LINQ solutions to each of these puzzles, which I'll post later, every time I do this I find I learn something from the way other people have approached the problems.

Want to learn more about LINQ? Be sure to check out my Pluralsight course More Effective LINQ.

Rapid API Development with Azure Functions

If you need to rapidly create a simple REST-style CRUD (Create, Read, Update, Delete) API, then Azure Functions makes it really easy. You can create multiple functions, one for each operation, and then map each of the HTTP verbs (GET, POST, PUT, DELETE) to the appropriate function.

In this post, I'll show how we can create a REST API to manage TODO list items. It will support five methods:

  • Get all TODO items (GET)
  • Get a TODO item by id (GET)
  • Create a new TODO item (POST)
  • Update a TODO item (PUT)
  • Delete a TODO Item (DELETE)

Yes, I know, it's not exactly the most imaginative demo scenario, so to make things a bit more interesting we'll support four different backing stores:

  • In-memory
  • Azure Table Storage
  • Azure Blob Storage (a JSON file per TODO item)
  • Cosmos DB

The Azure Function app will use C# precompiled-functions with the attribute-based binding syntax. All the code is available on GitHub so do feel free to download and experiment.

In-Memory Implementation

For our first implementation, we'll just store the data in memory. This approach is great for very simple experiments, but comes with obvious limitations. When the Azure Functions runtime de-allocates the server instance running our function app, all items are lost. And if we loaded it heavily enough that there were several instances running our function app, they'd each present a different view of the world.

However, this gives us the opportunity to focus on how the routing is set up for our functions.

First of all, here's the code for our Todo entity, and we've created a static List<Todo> list that our functions will store the items in.

    public class Todo
    {
        public string Id { get; set; } = Guid.NewGuid().ToString("n");
        public DateTime CreatedTime { get; set; } = DateTime.UtcNow;
        public string TaskDescription { get; set; }
        public bool IsCompleted { get; set; }
    }

    public static class TodoApiInMemory
    {
        static List<Todo> items = new List<Todo>();

        //...
    }

Now let's create our five functions.

First, getting all the Todo items is nice and easy. We specify the Route for our HttpTrigger to be todo and that we support the HTTP GET verb. Then we can simply return the entire contents of our in-memory list with an OkObjectResult:

[FunctionName("InMemory_GetTodos")]
public static IActionResult GetTodos(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "todo")]
    HttpRequest req, TraceWriter log)
{
    log.Info("Getting todo list items");
    return new OkObjectResult(items);
}

To get a specific Todo item by id, we adjust our Route to include an id with the syntax todo/{id}. This allows us to bind the value of the route that triggered this function to another parameter with the name id, that we can use to look up in our list. If the item doesn't exist, we'll return a NotFoundResult:

[FunctionName("InMemory_GetTodoById")]
public static IActionResult GetTodoById(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "todo/{id}")]HttpRequest req, 
    TraceWriter log, string id)
{
    var todo = items.FirstOrDefault(t => t.Id == id);
    if (todo == null)
    {
        return new NotFoundResult();
    }
    return new OkObjectResult(todo);
}

To create our Todo items, users POST to the todo route, and we deserialize the body of the request into an object defining the new item.

I've chosen to deserialize to a type I've called TodoCreateModel - so that the caller of this function can only specify the properties I allow on creation of a new item - in this case just the task description.

Then when we've added the new Todo item to our list, we return the entire new object so the caller can discover the id of the new item:

public class TodoCreateModel
{
    public string TaskDescription { get; set; }
}

[FunctionName("InMemory_CreateTodo")]
public static async Task<IActionResult> CreateTodo(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "todo")]HttpRequest req, TraceWriter log)
{
    log.Info("Creating a new todo list item");
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    var input = JsonConvert.DeserializeObject<TodoCreateModel>(requestBody);

    var todo = new Todo() { TaskDescription = input.TaskDescription };
    items.Add(todo);
    return new OkObjectResult(todo);
}

Next up is updating an existing Todo item.

Again, I've defined a custom model called TodoUpdateModel as I'm allowing the TaskDescription and IsCompleted flag to be updated, but nothing else. The verb in this case will be PUT, and the id is specified in the route as we did with the get by id function.

I've made supplying an updated description optional, and I return the new state of the entire Todo item in the OK response:
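As an aside, the TodoUpdateModel class itself isn't shown in this post, but based on the properties the update function reads, a minimal definition would presumably look something like this:

```csharp
// minimal sketch of TodoUpdateModel, inferred from the properties the
// update function uses - only these two fields can be modified
public class TodoUpdateModel
{
    public string TaskDescription { get; set; }
    public bool IsCompleted { get; set; }
}
```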

[FunctionName("InMemory_UpdateTodo")]
public static async Task<IActionResult> UpdateTodo(
    [HttpTrigger(AuthorizationLevel.Anonymous, "put", Route = "todo/{id}")]HttpRequest req, 
    TraceWriter log, string id)
{
    var todo = items.FirstOrDefault(t => t.Id == id);
    if (todo == null)
    {
        return new NotFoundResult();
    }

    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    var updated = JsonConvert.DeserializeObject<TodoUpdateModel>(requestBody);

    todo.IsCompleted = updated.IsCompleted;
    if (!string.IsNullOrEmpty(updated.TaskDescription))
    {
        todo.TaskDescription = updated.TaskDescription;
    }

    return new OkObjectResult(todo);
}

Our final method is deleting a Todo item, and things are very familiar. We use the same todo/{id} syntax for the route, and this time it's the HTTP DELETE verb we want. The implementation just removes the item from the in-memory list and returns an OK result.

[FunctionName("InMemory_DeleteTodo")]
public static IActionResult DeleteTodo(
    [HttpTrigger(AuthorizationLevel.Anonymous, "delete", Route = "todo/{id}")]HttpRequest req, 
    TraceWriter log, string id)
{
    var todo = items.FirstOrDefault(t => t.Id == id);
    if (todo == null)
    {
        return new NotFoundResult();
    }
    items.Remove(todo);
    return new OkResult();
}

So that was relatively straightforward. The code for the in-memory implementation is available on GitHub here.

But obviously an in-memory implementation is going to be of limited use. Let's look at introducing persistent stores next.

Table Storage Implementation

The next implementation we'll look at is Azure Table Storage. As I said in my article on choosing a database for a serverless application, Table Storage has the advantage of being extremely cheap, with a "consumption-based pricing" model. It's very cost effective and performs well assuming that you have no need for advanced queries.

Another benefit of Table Storage is that if you're using Azure Functions, you've already got a Storage Account that the Azure Functions runtime is making use of, so it's nice and convenient to use that for your tables, especially in the prototyping phase. And when testing locally, the Storage Emulator works just fine.

Before we can get started with Table Storage, we do need to adjust our entity definitions slightly.

It will help us to inherit from TableEntity so that we get the RowKey and PartitionKey properties (which Table Storage requires, as it uses them as a combined key), as well as an ETag, which is needed when we update rows.

I've created a TodoTableEntity plus two mapping extension methods, to convert between Todo and TodoTableEntity. For our purposes, it's sufficient to hard-code the PartitionKey to a constant value, as we're not anticipating the need to support many thousands of items in the table.

public class TodoTableEntity : TableEntity
{
    public DateTime CreatedTime { get; set; }
    public string TaskDescription { get; set; }
    public bool IsCompleted { get; set; }
}

public static class Mappings
{
    public static TodoTableEntity ToTableEntity(this Todo todo)
    {
        return new TodoTableEntity()
        {
            PartitionKey = "TODO",
            RowKey = todo.Id,
            CreatedTime = todo.CreatedTime,
            IsCompleted = todo.IsCompleted,
            TaskDescription = todo.TaskDescription
        };
    }

    public static Todo ToTodo(this TodoTableEntity todo)
    {
        return new Todo()
        {
            Id = todo.RowKey,
            CreatedTime = todo.CreatedTime,
            IsCompleted = todo.IsCompleted,
            TaskDescription = todo.TaskDescription
        };
    }
}

With these entities in place, let's look at the code for each of our five functions.

First, to get all Todo items, we can use the Table binding. I wanted to bind this to an IEnumerable<TodoTableEntity> parameter, but this does not appear to be supported in Azure Functions v2. So instead, I bind to a CloudTable, and use ExecuteQuerySegmentedAsync to query for all rows in the table.

The Table binding attribute has already specified the table name (todos) and the name of the connection string (AzureWebJobsStorage). And for simplicity I assume that the first segment returned by the query contains all the Todo items.

You may also notice that I'm using a route of todo2. That's simply so that this table-storage implementation of the API doesn't clash with the in-memory one.

[FunctionName("Table_GetTodos")]
public static async Task<IActionResult> GetTodos(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "todo2")]HttpRequest req,
    [Table("todos", Connection = "AzureWebJobsStorage")] CloudTable todoTable, 
    TraceWriter log)
{
    log.Info("Getting todo list items");
    var query = new TableQuery<TodoTableEntity>();
    var segment = await todoTable.ExecuteQuerySegmentedAsync(query, null);
    return new OkObjectResult(segment.Select(Mappings.ToTodo));
}

Next, we want to get a Todo item by id, and the Table binding helps us out a bit more here. By specifying the PartitionKey (hard-coded to TODO) and the row key (dynamically extracted from the route with the {id} syntax) we can simply bind directly to an instance of TodoTableEntity. This will be set to null if there was no matching row.

[FunctionName("Table_GetTodoById")]
public static IActionResult GetTodoById(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "todo2/{id}")]HttpRequest req,
    [Table("todos", "TODO", "{id}", Connection = "AzureWebJobsStorage")] TodoTableEntity todo,
    TraceWriter log, string id)
{
    log.Info("Getting todo item by id");
    if (todo == null)
    {
        log.Info($"Item {id} not found");
        return new NotFoundResult();
    }
    return new OkObjectResult(todo.ToTodo());
}

Creating a new Todo item is also simplified by using a binding. Since our function is asynchronous, we need to bind to an instance of IAsyncCollector<TodoTableEntity> and call AddAsync to add a row to the table. But otherwise the technique is exactly the same as for the in-memory implementation - deserialize the request body into a TodoCreateModel and turn that into a TableEntity we can store in our table.

[FunctionName("Table_CreateTodo")]
public static async Task<IActionResult> CreateTodo(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "todo2")]HttpRequest req, 
    [Table("todos", Connection="AzureWebJobsStorage")] IAsyncCollector<TodoTableEntity> todoTable,
    TraceWriter log)
{
    log.Info("Creating a new todo list item");
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    var input = JsonConvert.DeserializeObject<TodoCreateModel>(requestBody);

    var todo = new Todo() { TaskDescription = input.TaskDescription };
    await todoTable.AddAsync(todo.ToTableEntity());
    return new OkObjectResult(todo);
}

Updating a Todo item is where things get a bit painful, as the binding syntax doesn't give us an easy way to do this.

We need to bind directly to CloudTable, and perform two operations. First, a Retrieve to get hold of the item to be updated by its PartitionKey (hard-coded to TODO) and RowKey (the id). Then, assuming we found the item, we update its properties and perform a Replace operation to replace the row in Table Storage.

[FunctionName("Table_UpdateTodo")]
public static async Task<IActionResult> UpdateTodo(
    [HttpTrigger(AuthorizationLevel.Anonymous, "put", Route = "todo2/{id}")]HttpRequest req,
    [Table("todos", Connection = "AzureWebJobsStorage")] CloudTable todoTable,
    TraceWriter log, string id)
{

    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    var updated = JsonConvert.DeserializeObject<TodoUpdateModel>(requestBody);
    var findOperation = TableOperation.Retrieve<TodoTableEntity>("TODO", id);
    var findResult = await todoTable.ExecuteAsync(findOperation);
    if (findResult.Result == null)
    {
        return new NotFoundResult();
    }
    var existingRow = (TodoTableEntity)findResult.Result;

    existingRow.IsCompleted = updated.IsCompleted;
    if (!string.IsNullOrEmpty(updated.TaskDescription))
    {
        existingRow.TaskDescription = updated.TaskDescription;
    }

    var replaceOperation = TableOperation.Replace(existingRow);
    await todoTable.ExecuteAsync(replaceOperation);

    return new OkObjectResult(existingRow.ToTodo());
}

Finally, how can we delete a row in Table Storage? Again, we bind to CloudTable, and perform a Delete operation on the table to delete the row with the specified PartitionKey and RowKey.

Notice that we do need to specify a wildcard ETag in order to successfully delete the row. I also detect if the item doesn't exist by catching a specific exception.

[FunctionName("Table_DeleteTodo")]
public static async Task<IActionResult> DeleteTodo(
    [HttpTrigger(AuthorizationLevel.Anonymous, "delete", Route = "todo2/{id}")]HttpRequest req,
    [Table("todos", Connection = "AzureWebJobsStorage")] CloudTable todoTable,
    TraceWriter log, string id)
{
    var deleteOperation = TableOperation.Delete(
        new TableEntity() { PartitionKey = "TODO", RowKey = id, ETag = "*" });
    try
    {
        var deleteResult = await todoTable.ExecuteAsync(deleteOperation);
    }
    catch (StorageException e) when (e.RequestInformation.HttpStatusCode == 404)
    {
        return new NotFoundResult();
    }
    return new OkResult();
}

So that's how to get a basic CRUD REST-style API backed by Table Storage up and running. It's a shame that the bindings don't help us out quite as much as I'd like for all the operations. Where you'd really start running into limitations with Table Storage is if you wanted to support some kind of complex querying, as it's really optimized for lookups by partition and row key only.

The full code for my table storage implementation is available here.

Before we move onto a real database, let's see how we can do the same thing, but using Blob Storage as the backing store.

Blob Storage Implementation

Of course, Blob Storage isn't a database, so this is not a backing store you're likely to want to use in a real-world application. I'm simply using a Blob Storage container and storing each Todo item as a JSON file. This might be handy if you like the idea of quickly backing up and restoring your "database" just by moving JSON files in and out of a container with Azure Storage Explorer.

Let's see how the Blob storage bindings work to let us implement this backing store:

To get all Todo items, we need to bind with the Blob binding to a CloudBlobContainer. Then we use ListBlobsSegmentedAsync to list all the blobs in the container, and for each one, we download the contents of that blob as text and deserialize it into a Todo item, which we then put in a list to be returned.

You can also see that I'm creating the container if it doesn't exist to avoid us needing to do this manually in advance. The container name is todos and once again I'm using the storage account in the AzureWebJobsStorage connection string.

[FunctionName("Blob_GetTodos")]
public static async Task<IActionResult> GetTodos(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "todo3")]HttpRequest req,
    [Blob("todos", Connection = "AzureWebJobsStorage")] CloudBlobContainer todoContainer, 
    TraceWriter log)
{
    log.Info("Getting todo list items");
    await todoContainer.CreateIfNotExistsAsync();
    var segment = await todoContainer.ListBlobsSegmentedAsync(null);

    var todos = new List<Todo>();
    foreach(var result in segment.Results)
    {
        var blob = todoContainer.GetBlockBlobReference(result.Uri.Segments.Last());
        var json  = await blob.DownloadTextAsync();
        todos.Add(JsonConvert.DeserializeObject<Todo>(json));
    }
    return new OkObjectResult(todos);
}

To get a specific Todo item by id, we can make use of a nice feature of the Blob binding: the {id} parameter from the function route can be used in the binding path to specify which blob we are interested in. In our case the path will be todos/{id}.json.

We bind to a string parameter, so the contents of the blob (if it exists) will get automatically read into a string for us. All that remains is for us to deserialize it into a Todo (ironically so that it can be immediately serialized back into JSON).

[FunctionName("Blob_GetTodoById")]
public static IActionResult GetTodoById(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "todo3/{id}")]HttpRequest req,
    [Blob("todos/{id}.json", Connection = "AzureWebJobsStorage")] string json,
    TraceWriter log, string id)
{
    log.Info("Getting todo item by id");
    if (json == null)
    {
        log.Info($"Item {id} not found");
        return new NotFoundResult();
    }
    return new OkObjectResult(JsonConvert.DeserializeObject<Todo>(json));
}

For creating a Todo item in blob storage, I guess it might have been possible to use IAsyncCollector, but I found it easier to bind to a CloudBlobContainer, and then use UploadTextAsync to upload the newly created Todo item serialized as JSON into a text file in the blob container:

[FunctionName("Blob_CreateTodo")]
public static async Task<IActionResult> CreateTodo(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "todo3")]HttpRequest req, 
    [Blob("todos", Connection="AzureWebJobsStorage")] CloudBlobContainer todoContainer,
    TraceWriter log)
{
    log.Info("Creating a new todo list item");
    await todoContainer.CreateIfNotExistsAsync();
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    var input = JsonConvert.DeserializeObject<TodoCreateModel>(requestBody);
    var todo = new Todo() { TaskDescription = input.TaskDescription };

    var blob = todoContainer.GetBlockBlobReference($"{todo.Id}.json");
    await blob.UploadTextAsync(JsonConvert.SerializeObject(todo));

    return new OkObjectResult(todo);
}

As with Table Storage, the update operation is the most complex. We can however bind to a specific blob by using the same todos/{id}.json syntax we saw when we got an item by id. And we bind to an instance of CloudBlockBlob.

If the blob doesn't exist, we return a NotFoundResult, but otherwise, we download the blob contents with DownloadTextAsync, update the deserialized Todo item, and then re-serialize it and replace it with UploadTextAsync.

[FunctionName("Blob_UpdateTodo")]
public static async Task<IActionResult> UpdateTodo(
    [HttpTrigger(AuthorizationLevel.Anonymous, "put", Route = "todo3/{id}")]HttpRequest req,
    [Blob("todos/{id}.json", Connection = "AzureWebJobsStorage")] CloudBlockBlob blob,
    TraceWriter log, string id)
{

    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    var updated = JsonConvert.DeserializeObject<TodoUpdateModel>(requestBody);

    if (!await blob.ExistsAsync())
    {
        return new NotFoundResult();
    }
    var existingText = await blob.DownloadTextAsync();
    var existingTodo = JsonConvert.DeserializeObject<Todo>(existingText);

    existingTodo.IsCompleted = updated.IsCompleted;
    if (!string.IsNullOrEmpty(updated.TaskDescription))
    {
        existingTodo.TaskDescription = updated.TaskDescription;
    }

    await blob.UploadTextAsync(JsonConvert.SerializeObject(existingTodo));

    return new OkObjectResult(existingTodo);
}

Finally, to delete a Todo item from Blob Storage, we can use the same binding technique as for updating and bind to CloudBlockBlob; this time we simply need to check whether the blob exists and delete it with DeleteAsync.

[FunctionName("Blob_DeleteTodo")] 
public static async Task<IActionResult> DeleteTodo(
    [HttpTrigger(AuthorizationLevel.Anonymous, "delete", Route = "todo3/{id}")]HttpRequest req,
    [Blob("todos/{id}.json", Connection = "AzureWebJobsStorage")] CloudBlockBlob blob,
    TraceWriter log, string id)
{
    if(!await blob.ExistsAsync())
    {
        return new NotFoundResult();
    }
    await blob.DeleteAsync();
    return new OkResult();
}

So that's how you can implement a CRUD REST-style API backed by Blob Storage. Not something you'd likely do very often, but it does at least show off a few of the ways of binding to Blob Storage in Azure Functions. The code for this implementation is available here.

Cosmos DB implementation

Finally, let's use a real database as the backing store. Although I've heard lots of good things about Cosmos DB, I've not played with it very much, so this was my first attempt at using it in Azure Functions.

I downloaded and installed the Cosmos DB emulator which is a great option to have at your disposal for experiments, as running even a basic Cosmos DB instance in the cloud isn't particularly cheap. Once you have the Cosmos DB emulator running, you will need to add a connection string pointing at it to your local.settings.json file.

My local.settings.json file looks like this:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AzureWebJobsDashboard": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "CosmosDBConnection": "AccountEndpoint=https://localhost:8081/;AccountKey=C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw=="
  }
}

Also, since I'm using Azure Functions v2, the Cosmos DB bindings don't come out of the box, and so I needed to add a PackageReference in my csproj file to install the necessary extension. It's currently in pre-release:

<PackageReference Include="Microsoft.Azure.WebJobs.Extensions.CosmosDB" Version="3.0.0-beta7" />

To get all Todo items from Cosmos DB, I needed to use the CosmosDB binding, specifying the database name (in my case tododb) and collection name (tasks). I'd already created these in the emulator UI, and decided that I would use the "SQL API" for this database.

We also need to specify the name of the connection string setting, and a SqlQuery, which already shows one of the benefits of using a real database: I can make use of an ORDER BY clause to order the Todo items by their timestamp.

We bind to an IEnumerable<Todo>, and since Cosmos DB stores schemaless documents, there is no problem with deserializing directly into our own custom Todo entity.

[FunctionName("CosmosDb_GetTodos")]
public static IActionResult GetTodos(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "todo4")]HttpRequest req,
    [CosmosDB(
        databaseName: "tododb",
        collectionName: "tasks",
        ConnectionStringSetting = "CosmosDBConnection",
        SqlQuery = "SELECT * FROM c order by c._ts desc")]
        IEnumerable<Todo> todos,
    TraceWriter log)
{
    log.Info("Getting todo list items");
    return new OkObjectResult(todos);
}

Retrieving an item by id is something that the CosmosDb binding has good support for. We can set the Id property of the CosmosDB binding to use the {id} from the route. We can bind directly to our Todo entity, making it really easy to return the requested item if it exists.

[FunctionName("CosmosDb_GetTodoById")]
public static IActionResult GetTodoById(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "todo4/{id}")]HttpRequest req,
    [CosmosDB(databaseName: "tododb", collectionName: "tasks", ConnectionStringSetting = "CosmosDBConnection",
        Id = "{id}")] Todo todo,
    TraceWriter log, string id)
{
    log.Info("Getting todo item by id");

    if (todo == null)
    {
        log.Info($"Item {id} not found");
        return new NotFoundResult();
    }
    return new OkObjectResult(todo);
}

Creating our Todo item in theory ought to have been simple, but there was a nasty gotcha.

We bind to an IAsyncCollector which we've seen before, but if I bound to an IAsyncCollector<Todo> I ended up creating documents that had two id properties. One was called Id (capital I) which was my own id, and one called id (all lower case) which was the one auto-generated by CosmosDb.

The workaround I show here is a bit ugly. I simply create an anonymous object with all the same properties, but with a lower-case id property, so that my own generated Todo item id is the one that gets used. I'm sure there are other ways to work around this, so do let me know what you prefer to do in the comments.

[FunctionName("CosmosDb_CreateTodo")]
public static async Task<IActionResult> CreateTodo(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "todo4")]HttpRequest req,
    [CosmosDB(
        databaseName: "tododb",
        collectionName: "tasks",
        ConnectionStringSetting = "CosmosDBConnection")]
    IAsyncCollector<object> todos, TraceWriter log)
{
    log.Info("Creating a new todo list item");
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    var input = JsonConvert.DeserializeObject<TodoCreateModel>(requestBody);

    var todo = new Todo() { TaskDescription = input.TaskDescription };
    //the object we need to add has to have a lower case id property or we'll
    // end up with a cosmosdb document with two properties - id (autogenerated) and Id
    await todos.AddAsync(new { id = todo.Id, todo.CreatedTime, todo.IsCompleted, todo.TaskDescription });
    return new OkObjectResult(todo);
}

As with the other backing stores, updating an existing item proves the most complicated. We need to bind to a DocumentClient to allow us to perform a query to find the item to update. We do this with CreateDocumentQuery, and can use a LINQ Where clause to find the document with our id.

Although CreateDocumentQuery returns an IQueryable, it doesn't support all the LINQ extensions you might expect, so trying to pick out the document we want directly with FirstOrDefault doesn't work - hence the need for AsEnumerable().FirstOrDefault().

Once we've found the document, we need to update it, but the syntax to change properties is a little cumbersome. We call SetPropertyValue on each field in the document we want to change.

Next, we can update a Cosmos DB document with ReplaceDocumentAsync, but we also want our update method to return the new Todo item. There's a handy trick you can use to convert a Document to a strongly typed class: cast it to dynamic and then assign it to a variable of the desired type. This is easier than calling GetPropertyValue for each property, which I've shown in the commented-out code:

[FunctionName("CosmosDb_UpdateTodo")]
public static async Task<IActionResult> UpdateTodo(
    [HttpTrigger(AuthorizationLevel.Anonymous, "put", Route = "todo4/{id}")]HttpRequest req,
    [CosmosDB(ConnectionStringSetting = "CosmosDBConnection")]
        DocumentClient client,
    TraceWriter log, string id)
{
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    var updated = JsonConvert.DeserializeObject<TodoUpdateModel>(requestBody);
    Uri collectionUri = UriFactory.CreateDocumentCollectionUri("tododb", "tasks");
    var document = client.CreateDocumentQuery(collectionUri).Where(t => t.Id == id)
                    .AsEnumerable().FirstOrDefault();
    if (document == null)
    {
        return new NotFoundResult();
    }
    

    document.SetPropertyValue("IsCompleted", updated.IsCompleted);
    if (!string.IsNullOrEmpty(updated.TaskDescription))
    {
        document.SetPropertyValue("TaskDescription", updated.TaskDescription);
    }

    await client.ReplaceDocumentAsync(document);

    /* var todo = new Todo()
    {
        Id = document.GetPropertyValue<string>("id"),
        CreatedTime = document.GetPropertyValue<DateTime>("CreatedTime"),
        TaskDescription = document.GetPropertyValue<string>("TaskDescription"),
        IsCompleted = document.GetPropertyValue<bool>("IsCompleted")
    };*/

    // an easier way to deserialize a Document
    Todo todo2 = (dynamic)document;

    return new OkObjectResult(todo2);
}

Finally, to delete items from our Cosmos DB database, we again bind to a DocumentClient, and in order to call DeleteDocumentAsync we need to get hold of the item to delete so we can access its "self link". We can use exactly the same technique as we did in the update function to call CreateDocumentQuery to find the document with the specified id.

[FunctionName("CosmosDb_DeleteTodo")]
public static async Task<IActionResult> DeleteTodo(
    [HttpTrigger(AuthorizationLevel.Anonymous, "delete", Route = "todo4/{id}")]HttpRequest req,
    [CosmosDB(ConnectionStringSetting = "CosmosDBConnection")] DocumentClient client,
    TraceWriter log, string id)
{
    Uri collectionUri = UriFactory.CreateDocumentCollectionUri("tododb", "tasks");
    var document = client.CreateDocumentQuery(collectionUri).Where(t => t.Id == id)
            .AsEnumerable().FirstOrDefault();
    if (document == null)
    {
        return new NotFoundResult();
    }
    await client.DeleteDocumentAsync(document.SelfLink);
    return new OkResult();
}

As you can see, the Azure Functions bindings for Cosmos DB offer us some help, but the experience could do with some improvements. Some people have built Cosmos DB helper libraries to simplify these operations, so let me know in the comments if you have a favourite one you can recommend. The code for my Cosmos DB implementation is available here.

Summary

With Azure Functions it's really quick and easy to create a basic CRUD REST-style API, and the built-in bindings can save you time working with Blob Storage, Table Storage and Cosmos DB. However, operations like updating existing items are currently not well supported by the bindings, so you have to be prepared to do a little bit of additional work yourself.

Also, my implementations here all ignore the concerns of paging and filtering when retrieving all Todo items, which you'd likely want to do in a real-world application. In that scenario, having a proper database like Cosmos DB would start paying dividends, while you'd quickly run into performance issues with Table or Blob Storage.
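To give a feel for what paging could look like, here's a hedged sketch using the same DocumentDB SDK as the examples above. It relies on FeedOptions and continuation tokens; the Todo class and collectionUri come from earlier in the post, while continuationToken and the page size of 20 are illustrative assumptions:

```csharp
// sketch: server-side paging with the DocumentDB SDK
// requires "using Microsoft.Azure.Documents.Client;" and
// "using Microsoft.Azure.Documents.Linq;"
// continuationToken is null for the first page (example value)
var options = new FeedOptions { MaxItemCount = 20, RequestContinuation = continuationToken };
var query = client.CreateDocumentQuery<Todo>(collectionUri, options).AsDocumentQuery();
var page = await query.ExecuteNextAsync<Todo>();
// page.ResponseContinuation holds the token to pass back in for the next page
```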

Want to learn more about how easy it is to get up and running with Azure Functions? Be sure to check out my Pluralsight course Azure Functions Fundamentals.

Azure Functions proxies and the v2 runtime

One of the many great features of Azure Functions is the ability to define proxies. I've blogged before about how to use proxies for static website hosting, and I also used proxies in my Building Serverless Applications in Azure Pluralsight course, but until recently, proxies were not available in the newer v2 Azure Functions runtime.

However, proxies support for Azure Functions v2 was recently announced. This was perfect timing for me as I'm currently building a sample Durable Functions project for my upcoming talk at ProgNET London, and I wanted to create a simple webpage that called through to some HTTP Azure Functions to trigger durable workflows, check on their status, and send events to them.

Configuring CORS

In my case the webpage is just a static HTML page, so it can be hosted in a blob storage container. But if you've ever tried calling an Azure Function from JavaScript on a webpage, then you'll know that your request is going to get blocked because of CORS.

There are two ways to get around CORS issues.

The first is simply to configure your Function App to allow CORS requests from the domain hosting the webpage. For cloud-deployed Azure Function apps, you can do this in the Platform Features section of the Function App settings in the Azure Portal, and update the CORS settings to whitelist your domain. If you're testing your Azure Functions app locally, then you can make use of a command line parameter to configure CORS for the local tooling.
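For local testing, the Azure Functions Core Tools let you pass the allowed origins when starting the host (the origin shown below is just an example):

```shell
# allow CORS requests from a site running on localhost:8080 (example origin)
func host start --cors http://localhost:8080

# or, for local testing only, allow any origin
func host start --cors "*"
```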

Using proxies

However, the second way around CORS issues is to make your webpages appear to be on the same domain as the Azure Functions they are calling by using proxies. It's very simple to set up, and a handy trick to have up your sleeve as proxies are useful in all kinds of situations, not just for avoiding CORS issues.

To get started with proxies, all you need is to create a proxies.json file in the root of your Function App, alongside the host.json file. If you're working in Visual Studio, don't forget to set the Copy to Output Directory setting for proxies.json to Copy if Newer.
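In an SDK-style csproj, that Copy to Output Directory setting corresponds to something like the following fragment (a sketch; PreserveNewest is the MSBuild equivalent of "Copy if Newer"):

```xml
<ItemGroup>
  <None Update="proxies.json">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </None>
</ItemGroup>
```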

My website simply has an index.html page, and a few images in an images/ folder. I'll likely add a css and scripts folder soon as well. I usually set up one proxy for the root path / pointing to my index.html, and one for each folder containing web assets using a wildcard to proxy calls to any route inside that folder. You might be wondering whether we could do everything with a single proxy definition. The trouble with doing that is that we don't want to proxy calls to the /api route as they are heading to our functions.

Obviously, there are a few different ways you could tackle setting up the proxies, but here's how I set up my proxies.json file.

{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "proxyHomePage": {
      "matchCondition": {
        "methods": [ "GET" ],
        "route": "/"
      },
      "backendUri": "%WEB_HOST%/index.html"
    },
    "proxyImages": {
      "matchCondition": {
        "methods": [ "GET" ],
        "route": "/images/{*restOfPath}"
      },
      "backendUri": "%WEB_HOST%/images/{restOfPath}"
    }
  }
}

There are a few things worth highlighting here.

First, for the backendUri setting, I'm using an application setting called WEB_HOST. This is set to http://localhost:54045 in my local.settings.json file. This allows me to proxy through to a simple ASP.NET Core webapp running my webpage while I'm testing locally. Then, when I deploy my application to Azure, I can set up an App Setting containing the URL of where the static web assets are stored in blob storage. For example: https://mystatichosting.blob.core.windows.net/container.

Secondly, you'll notice the special {*restOfPath} wildcard syntax on the route which gives me a {restOfPath} variable that can be used in the backendUri setting (notice you don't need the * character here).

Finally, a gotcha that had me stumped for a while: to use proxies that point to another port on localhost when you're testing locally, you need to create an app setting called AZURE_FUNCTION_PROXY_DISABLE_LOCAL_CALL with the value of true. This should only be set in your local.settings.json file though. I guess this is needed because localhost has a special meaning in Azure Functions proxies, allowing you to reference a function within the same function app without a roundtrip proxy request.
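Putting the two local-only settings together, my local.settings.json ends up looking something like this (the storage and WEB_HOST values are examples for local development):

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "WEB_HOST": "http://localhost:54045",
    "AZURE_FUNCTION_PROXY_DISABLE_LOCAL_CALL": "true"
  }
}
```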

There's plenty more that can be achieved with Azure Functions proxies. They provide "override" settings to manipulate the request before it is proxied on, and the response before it is returned, although it's not a feature I've needed to make use of yet. But it's great that proxies are now available in Azure Functions v2.

Want to learn more about how easy it is to get up and running with Azure Functions? Be sure to check out my Pluralsight course Azure Functions Fundamentals.