
The dotnet new command

One of my favourite things about .NET Core is the dotnet command line tool. With dotnet new, you can quickly scaffold a new project of various types. For example, dotnet new webapp creates an ASP.NET Core web app. And if you simply type dotnet new, you can see a list of all the available templates.

Templates               Short Name         Language        Tags
-------------------------------------------------------------------------
Console Application     console            [C#], F#, VB    Common/Console
Class library           classlib           [C#], F#, VB    Common/Library
Unit Test Project       mstest             [C#], F#, VB    Test/MSTest
NUnit 3 Test Project    nunit              [C#], F#, VB    Test/NUnit
NUnit 3 Test Item       nunit-test         [C#], F#, VB    Test/NUnit
xUnit Test Project      xunit              [C#], F#, VB    Test/xUnit
Razor Component         razorcomponent     [C#]            Web/ASP.NET
Razor Page              page               [C#]            Web/ASP.NET
MVC ViewImports         viewimports        [C#]            Web/ASP.NET

Installing templates

Of course, there may not be templates available out of the box that meet your needs, but it's very easy to install additional templates. For example, if you want to create a Vue.js project, you can install a new template pack with dotnet new --install "Microsoft.AspNetCore.SpaTemplates" and then create a new project with dotnet new vue.

Questions

Now as cool as this feature is, it left me with a bunch of questions. Where are all these templates coming from? If I install a template pack, how do I keep it up to date? How do I find out what other template packs are available? If I wanted to make my own template, how would I do that? So I did a bit of digging, and here's what I found.

Templates are stored in NuGet packages

Templates are distributed as NuGet packages (.nupkg), typically hosted on NuGet.org, but you can install them from any NuGet server. The Vue.js template pack I mentioned earlier can be found here. Knowing this is very handy as it enables you to see whether the package is still being actively maintained, and whether there have been recent updates. Looks like this particular template pack hasn't been updated in a while.

How do I know what's available?

How can you find out what template packs are available? There are two main ways I know of.

First, there's this list maintained on GitHub containing many packages.

Second, there's a great searchable website at dotnetnew.azurewebsites.net. So if I'm looking for more up-to-date Vue.js templates, I can see that there is a very wide choice available.

How do I know what template versions I have?

This one took me a while to find, but I discovered that if you type dotnet new -u (the uninstall command) it gives you a really nice summary of each package installed, in this kind of format.

  Microsoft.AspNetCore.Blazor.Templates
    Details:
      NuGetPackageId: Microsoft.AspNetCore.Blazor.Templates
      Version: 0.7.0
      Author: Microsoft
    Templates:
      Blazor (hosted in ASP.NET server) (blazorhosted) C#
      Blazor Library (blazorlib) C#
      Blazor (Server-side in ASP.NET Core) (blazorserverside) C#
      Blazor (standalone) (blazor) C#
    Uninstall Command:
      dotnet new -u Microsoft.AspNetCore.Blazor.Templates

This command also conveniently shows the syntax to uninstall a template package.

How do I know when updates are available?

Of course, you don't want to have to constantly visit NuGet.org to check for new versions of template packs, so how do you know when something updated is available? Well, the good news is that there are a couple of helpful commands here for you.

First, to update a package to its latest version, you can always simply install it again. So if I say dotnet new -i Microsoft.AspNetCore.Blazor.Templates, then I'll either install the Blazor templates, or update to the latest (non-prerelease) version if they are already installed.

There are also a couple of new commands that perform an update check for you. dotnet new --update-check will check to see if there are new versions of any installed templates, and dotnet new --update-apply also updates them for you.

Note: I attempted to use this feature by deliberately installing an older version of a template, and then running the update check, but it reported that no updates were available. I don't know if that was because by explicitly specifying a version I had perhaps "pinned" to that version, or whether it was just a temporary glitch with the tool.

How do I install a specific template version?

Because templates are stored in NuGet packages, you might want to install a specific version (maybe a pre-release). For example, at the moment, to play with the new WebAssembly Blazor features, you need to install a pre-release of the Microsoft.AspNetCore.Blazor.Templates package. That can easily be done by appending :: and the version number after the package name:

dotnet new -i Microsoft.AspNetCore.Blazor.Templates::3.0.0-preview9.19465.2

How can I create my own templates?

Finally, you might be wondering what it takes to create your own template. I'm not going to go into detail here, as there's a helpful tutorial on the Microsoft docs site. But it's relatively straightforward. You create the source files for the template, and a template.json that contains template metadata.
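To give a feel for the shape of it, a minimal template pack is just a folder of content files plus a .template.config/template.json alongside them. Here's a rough sketch of what that metadata file might look like (all the names here are illustrative, not a real published template; see the Microsoft docs tutorial for the full schema):

```json
{
  "$schema": "http://json.schemastore.org/template",
  "author": "Your Name",
  "classifications": [ "Common", "Console" ],
  "identity": "MyCompany.MyTemplate.CSharp",
  "name": "My custom console template",
  "shortName": "mytemplate",
  "sourceName": "MyTemplate",
  "tags": {
    "language": "C#",
    "type": "project"
  }
}
```

The sourceName value is what gets replaced by the project name the user supplies, and shortName is what you'd type after dotnet new once the pack is installed.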

A great way to get a feel for what's possible is to use the excellent NuGet Package Explorer utility to take a look inside the contents of existing NuGet package templates.

Summary

dotnet new is a great productivity tool and once you know a little bit more about what's going on behind the scenes, you can confidently install and update additional template packs to greatly speed up your daily development tasks.



If you work on a commercially successful project, the chances are you've experienced the pain of technical debt. As a codebase grows larger and larger, inevitably you find that some of the choices made in the past are resulting in a slowdown in productivity.

It's a problem I've given a lot of thought to over the years. I created a Pluralsight course on "Understanding and Eliminating Technical Debt" and often speak about it. In fact, I'll be doing so again next month at Technorama Netherlands (it would be great to see you there if you can make it!).

Tracking technical debt

One of my recommendations is that technical debt should be tracked. In my Pluralsight course, I suggested creating a "technical debt document". In this document, you list specific issues that need addressing, explain what problems they are currently causing, and describe the proposed solution. Other useful information includes an estimate of how long a fix would take, and any upcoming features in the roadmap that will become easier to implement once the technical debt item has been resolved.

Technical debt can come in many forms. I often break it down into categories like "Code debt", "Architectural debt", "Technological debt", "Test debt" etc, but the common theme is that there is something less than ideal about the code or tooling that needs to be addressed. By tracking these issues somewhere, you can prioritize and plan to address them.

Book recommendation: "Managing Technical Debt"

I recently discovered a great new book, Managing Technical Debt, written by Philippe Kruchten, Robert Nord and Ipek Ozkaya. The authors have researched technical debt at an academic level (there is even a conference about technical debt now!), so I was very eager to read what they had to say and pick up any new ideas.

Tech debt book

It's a great read, written at a level that you could share with developers and managers alike. It's practical and pragmatic, and it was nice to see that their understanding of and approach to technical debt is very similar to the one I take in my Pluralsight course.

But there were several new insights, and the one I want to highlight in this post is their advice on the best way to store "technical debt items" (as they call them).

Technical debt register

In the book, the authors recommend having a "technical debt register". It's similar to my document idea, but they recommend using your regular work tracking tools to store the "technical debt items". In other words, wherever you record your defects and backlog of new features (e.g. GitHub issues, Jira, Azure DevOps) should also be where you store technical debt items.

"I recommend including debt actions (stories) in the same tracking tool as all other work and as part of a single program or team backlog. Our agile-inspired phrasing is 'all work is work; all work goes on the backlog'." (Kruchten, Nord & Ozkaya, Managing Technical Debt)

The reasoning for this is simple. Technical debt, just like defects and features, represents work that needs to be done. So it needs to be visible and able to be planned in. You could either create a new entity to represent a technical debt item, or just tag items with "TechDebt" to differentiate them.

This idea is one I had initially considered but rejected because I feared that it could be abused. What if the technical debt register filled up with thousands of "leftover" tasks that were in reality never going to get actioned? A kind of way to assuage our guilt that we didn't really finish our work. "Didn't get round to writing any unit tests? Don't worry! Just add it to the technical debt register!"

However, having read the book, I think I'm coming round to their way of thinking. I can see a number of benefits over using a document for tracking technical debt items:

  • they can be planned and estimated with the same tooling used for defects and features
  • you can associate commits to technical debt items
  • technical debt items can also be linked to features and defects
  • they support discussion (not everyone will agree on the best way to address these issues)
  • they are easily added by anyone on the team (don't need to know where to find the document, or wait to check it out)

Two levels of technical debt items

The authors suggest that there are two levels of technical debt item: (1) simple, localized code-level debt and (2) wide-ranging, structural, architectural debt.

The simple items relate to a specific area of code and might take a day or so to address. A team could allocate a certain percentage of time each iteration to resolving these technical debt items.

Examples of wide-ranging items might be wanting to move from a monolith to microservices, or switching from server side rendering to a SPA framework. These are not necessarily achievable in one step, and need to be broken down into smaller chunks. In some cases, dedicating an entire iteration to working on one of these technical debt items might be warranted.

Automating creation of technical debt items

Another interesting idea is whether static code analysis tools could generate technical debt items. Personally, my fear here is that there would simply be too many of these, and many static code analysis tools generate high numbers of "false positives".

So although I find a lot of value in static code analysis tools, I wouldn't want to automatically convert each item discovered into a technical debt item. My preference is to use code analysis tools that give you immediate feedback as you are coding (this is one of the great strengths of ReSharper), as the most efficient time to address problems with code is while you are still working on it.

If a static analysis tool does highlight some particular areas of concern in the codebase, then you could always manually create an entry in the technical debt register that groups related instances together into a single item, rather than creating one entry per offending line of code.

Another good suggestion was to standardize on a way of marking technical debt issues in code. Many developers include comments like "TODO" or "FIXME", as a way to highlight improvements that are desirable in the future, but for whatever reason could not be done at the time. By adopting a standard, it's easy to find those items and generate technical debt items as necessary in the technical debt register.

Prioritizing technical debt items

Let's suppose we follow this guideline and create a technical debt register. Now we've got hundreds of items, big and small, each detailing a way in which our code should be improved to make our lives easier going forward. But how do we prioritise them?

Well, I would not recommend simply randomly picking off technical debt items to solve. Technical debt items are only potential problems - they're not actually causing any harm unless you are working on a specific part of your system.

So it's really important that your technical debt items are associated with a specific area of code. That way, when you are about to embark on new work in that area of code, you can review the technical debt register to see what known issues might get in your way. This means you can strategically address the ones that will most benefit upcoming work.

"Before starting a new feature or story, check the backlog to identify any known debt items that should be considered during implementation because they impact the same area of the code or would otherwise impede its development" (Kruchten, Nord & Ozkaya, Managing Technical Debt)

Numbers don't matter

It's also important to point out that it really doesn't matter if the number of technical debt items grows very large. Remember they are only potential issues, not actual problems, and it's totally fine if many of them are never addressed.

Of course, you might want to eventually prune some that have sat dormant for several years, or that have been obsoleted by other advances in the code. But they're not like defects. With defects, we typically want to get to a count of zero. Technical debt items are more akin to cool ideas for future features: they're not all going to get done - only the ones that bring real value.

Summary

I highly recommend having at least some way to track outstanding technical debt items for your projects, and reading the Managing Technical Debt book has convinced me to give tracking them in the standard project management tools a go. So I'm planning to migrate the issues listed in my existing technical debt document into our regular working tool and see if that helps us more effectively plan and prioritise which technical debt items should be addressed next.

Want to learn more about the problem of technical debt and how you can reduce it? Be sure to check out my Pluralsight course Understanding and Eliminating Technical Debt.


Have you ever been burned by installing a beta or preview version of some developer tools that destabilised your development environment? It's certainly happened to me a few too many times over the years.

But what if you want to try out some of the new cool stuff in the pipeline such as C# 8 and .NET Core 3 (which is still in preview at the time of writing)? Is there any way of trying them out without installing the tools?

(Sidenote: in recent years, preview versions of .NET Core and Visual Studio have been very well behaved, installing side by side without interfering with the non-preview versions. But I still tend to err on the side of caution and avoid installing preview tooling unless I absolutely need it.)

Well, thanks to the awesome "Visual Studio Code Remote - Containers" extension, there is a very low-risk way to try out the latest tooling for your language of choice, without needing anything more than Docker and VS Code installed.

In this post, we'll see how easy it is to set up a .NET Core 3 development environment in a container, and then develop in it using Visual Studio Code.

1. Pre-requisites

To follow along, you need Docker installed (I'm running Docker Desktop on Windows 10), and Visual Studio Code, with the "Visual Studio Code Remote - Containers" extension enabled. I also needed to go into my Docker Desktop settings dialog and enable "Folder Sharing" for my C drive. This is needed as your source code will be mounted as a volume in your container.

2. Create a project folder

Next create an empty folder (I called it core3container) and open Visual Studio Code in that folder (e.g. with code .).

3. Setup container configuration

Next we need to run the "Remote Containers: Add development container configuration files" command in VS Code. You can find this either by pressing F1 and searching for it, or by clicking the green Remote Window icon in the bottom left of VS Code.

This gives us a whole host of pre-defined container images in a variety of languages. I didn't see one for the .NET Core 3 preview in the list, so I just picked the "C# (.NET Core Latest)" option.

4. Open the folder in a container

This prompted me to reopen Visual Studio Code in a container, which I did. You can also do this on demand with the "Remote-Containers: Reopen folder in container" VS Code command.

The first time we do this, it builds the container image for us, which might take a little while, as it may need to download base images you don't already have.

We can use regular Docker commands such as docker image ls and docker ps to see what's going on behind the scenes. On my machine I can see that there is a new container with the name vsc-core3container-dfa84ec1259930dde9355646f1b8c6d2 running.

5. Examine the .devcontainer folder

Enabling remote container support for VS Code essentially means that a new folder called .devcontainer is created for you. This contains two files - devcontainer.json and Dockerfile.

devcontainer.json holds various configuration settings, such as the location of the Dockerfile, but also any VS Code extensions we want to be enabled when we're working in this container. This is an awesome feature. When you are connected to a container, you can have additional VS Code extensions enabled that just apply to development in that container. In our example, the C# VS Code extension is listed.

{
  "name": "C# (.NET Core Latest)",
  "dockerFile": "Dockerfile",
  "extensions": [
    "ms-vscode.csharp"
  ]
}

This configuration file can also be used for things like publishing ports which is useful if you're doing web development in a container. You can find the full reference documentation for devcontainer.json here.
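For instance, sticking with the same file, publishing the default Kestrel ports for an ASP.NET Core app might look something like this (the port numbers are ASP.NET Core's defaults, and you should check the reference docs for the property names supported by your version of the extension):

```json
{
  "name": "C# (.NET Core Latest)",
  "dockerFile": "Dockerfile",
  "appPort": [5000, 5001],
  "extensions": [
    "ms-vscode.csharp"
  ]
}
```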

The Dockerfile that got generated for us began with FROM mcr.microsoft.com/dotnet/core/sdk:latest but then also had some apt-get commands to install a few additional bits of software into the container, such as Git. Here's the Dockerfile that got created:

FROM mcr.microsoft.com/dotnet/core/sdk:latest

# Avoid warnings by switching to noninteractive
ENV DEBIAN_FRONTEND=noninteractive

# Or your actual UID, GID on Linux if not the default 1000
ARG USERNAME=vscode
ARG USER_UID=1000
ARG USER_GID=$USER_UID

# Configure apt and install packages
RUN apt-get update \
    && apt-get -y install --no-install-recommends apt-utils dialog 2>&1 \
    #
    # Verify git, process tools, lsb-release (common in install instructions for CLIs) installed
    && apt-get -y install git procps lsb-release \
    #
    # Create a non-root user to use if preferred - see https://aka.ms/vscode-remote/containers/non-root-user.
    && groupadd --gid $USER_GID $USERNAME \
    && useradd -s /bin/bash --uid $USER_UID --gid $USER_GID -m $USERNAME \
    # [Optional] Uncomment the next three lines to add sudo support
    # && apt-get install -y sudo \
    # && echo $USERNAME ALL=\(root\) NOPASSWD:ALL > /etc/sudoers.d/$USERNAME \
    # && chmod 0440 /etc/sudoers.d/$USERNAME \
    #
    # Clean up
    && apt-get autoremove -y \
    && apt-get clean -y \
    && rm -rf /var/lib/apt/lists/*

# Switch back to dialog for any ad-hoc use of apt-get
ENV DEBIAN_FRONTEND=

If you want, you can update this Dockerfile to install any additional tooling that your development process needs. We'll see how to update the Dockerfile shortly.
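As a rough sketch of the kind of customization you might make, here are a couple of lines that could be appended to the generated Dockerfile (the specific packages chosen here are just illustrative examples, not requirements):

```dockerfile
# Example: install an extra apt package the team relies on (jq is illustrative)
RUN apt-get update \
    && apt-get -y install --no-install-recommends jq \
    && rm -rf /var/lib/apt/lists/*

# Example: bake a .NET global tool into the image and put it on the PATH
RUN dotnet tool install -g dotnet-format
ENV PATH="${PATH}:/root/.dotnet/tools"
```

Because the Dockerfile lives in the repository, every team member who reopens the folder in a container gets the same tooling.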

The great thing about the .devcontainer folder is that it can be checked into source control, so that everyone who clones your repository can use VS Code to develop against it in a container.

6. Run some commands in the container

With VS Code connected to the container, we can run commands directly against the container in the terminal window (accessible with CTRL + `). Let's see what version of .NET Core we have installed, with dotnet --version:

root@68cec1b9578c:/workspaces/core3container# dotnet --version
2.2.401

That's not actually what I wanted. I want the preview of .NET Core 3, so I need to point at the correctly tagged version of the .NET SDK.

I can fix this quite easily though. I first run the "Remote-Containers: Reopen Folder Locally" command. Then I edit the Dockerfile to point to the 3.0 SDK base image:

FROM mcr.microsoft.com/dotnet/core/sdk:3.0

Then I run the "Remote-Containers: Rebuild Container" command. This rebuilds the container, and VS Code relaunches inside the new container. If we run dotnet --version again we can see that we now are running the preview version of .NET Core we needed.

root@68cec1b9578c:/workspaces/core3container# dotnet --version
3.0.100-preview8-013656

7. Try out C# 8 IAsyncEnumerable

Now in the VS Code terminal, we can run dotnet new console to create a new console app, which will generate a .csproj and Program.cs file for us.

And we'll edit Program.cs to make use of the cool new IAsyncEnumerable capabilities of C# 8. You can read more about it in this article by Christian Nagel.

I've updated my Program.cs with a very simple example of how we can use await and yield return to generate an IAsyncEnumerable<string>, and iterate through it with the new await foreach construct.

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

namespace core3container
{
    class Program
    {
        static async Task Main(string[] args)
        {
            Console.WriteLine("Starting");
            await foreach(var message in GetMessagesAsync())
            {
                Console.WriteLine(message);
            }
            Console.WriteLine("Finished");
        }

        static async IAsyncEnumerable<string> GetMessagesAsync()
        {
            for (int n = 0; n < 10; n++)
            {
                await Task.Delay(TimeSpan.FromSeconds(1));
                yield return $"Async message #{n+1}";
            }
        }
    }
}

We can easily check this is working with dotnet run in the terminal in VS Code, which will show a new message appearing every second.

root@68cec1b9578c:/workspaces/core3container# dotnet run
Starting
Async message #1
Async message #2
Async message #3
...
Async message #10
Finished

8. Cleaning up

When you exit Visual Studio Code, it will automatically stop the running container, but it does not delete the container (you can still see it with docker ps -a) or any Docker images. This means that it will be faster to start up next time you do some remote container development on this project.

But you will need to manually remove the containers and images when you're done - there doesn't seem to be any built-in support for cleaning up.

9. Try other stuff

Of course, this feature isn't only for .NET Core development, or just for when you want to use preview SDKs. You can use it to develop in any language and get an extremely consistent development experience across all members of your team without having to tell everyone to install specific versions of development tools.

There's also a really nice collection of quick starts: GitHub repos you can clone that already have the .devcontainer folder set up for various languages, including Node, Python, Rust, Go, Java and more. I used one to create my first ever Rust app, which I had up and running in just a couple of minutes, without needing to install any new tools on my development PC, thanks to VS Code remote containers.