
Building .NET Core NuGet Packages

My last blog post was on building and publishing npm packages, specifically for TypeScript projects. In my opinion, packages are a fundamental unit of software development. They sit at the confluence of technical competence and collaboration, and represent something the software community should be proud of.

Most of the time, you'll find yourself creating software packages once you're comfortable with some essential pillars:
  • Coding
  • Project structure
  • Software architecture
  • Building
  • Delivery
  • Community
  • Licensing

After I got my npm package up and running, my next task was to do the same thing with my C# libraries. Similar to protoculture, I have made a library called praxis. Publishing it means leveraging the services and tooling known in the .NET ecosystem as NuGet.

In this case, praxis abstracts many of the concepts and functionality I require when producing server projects. It builds on top of ASP.NET Core, so in that sense you can almost think of it as a framework-framework. The intention is to opinionate the powerful, largely unopinionated abstractions Microsoft has provided. Again, nothing about this blog post is coupled specifically to praxis, but it is something I gladly maintain in my personal time, and it is available under the Apache 2.0 license.

The process I outline in this post may expand over time, as I don't have the luxury of something like semantic-release in .NET -- yet? The biggest differences between what I did in .NET versus what I did in npm are that version information is still being committed to the repository, and that semantic-release has some fairly polished interactions with GitHub.
Obviously versioning is something that can take on any number of forms, but for now I'm looking to keep things simple, especially as I feel like I'm still learning the finer points of publishing .NET Core packages.

One last bit of meta before getting started: rather than put code inline, I've been linking to GitHub. I promise I'm not hiding any of the juicy specifics from you, as I know having examples is important!
So far I've liked blogging this way as it keeps my posts conversational, but I know that from time to time it can be helpful to see something inline. Do leave a comment or tweet at me if you want anything spelled out a bit more; I see you all visiting from around the world and I'm very eager to act on any feedback!

Building

Up front, you'll want to understand the fundamentals of building your .NET Core application. Most people have some kind of handle on this by the time they're reading a post like this. If you haven't done much beyond authoring your code, however, this is where your journey begins. Set aside a little time to understand the dotnet build command and everything it can do. You should also make sure you're comfortable doing things at the command line.

The --configuration and --version-suffix flags will probably be the most relevant takeaways. In the case of my build setup, however, I've elected not to use version suffixes. Some variance is to be expected here, as there will always be some component of customization when doing builds.
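
For a rough feel of how those flags look in use, here's a hedged sketch (the suffix value is purely illustrative, and remember that my own setup skips the suffix entirely):

    dotnet build --configuration Release --version-suffix beta1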

Spend time thinking about what your deliverable is and how you're going to automate building it. Your only task before tackling the next section is to make sure building works and that you're happy with the output!

NuGet

In the past, creating a NuGet package was a matter of generating a nuspec file by hand, from other sources and/or from your assembly. That nuspec file would then be used to wrap everything up into a nupkg, and then you'd finally have your package artifact. If, while browsing around for advice on building NuGet packages, any of your sources mention a nuspec file, you may want to verify that you're not reading an outdated article.

Today, the way you package your project into a nupkg file is with the dotnet command. It will conveniently take the standard project files and condense them into the necessary information and structure, outputting a nupkg file.
I'm glad to see producing a nuspec file taken off the table for most scenarios, as it has been known to get quite involved.

The other side of NuGet is the actual delivery, hosting and distribution of packages. If you're using anything before Preview 3 of the .NET Core tools, you're still going to need nuget.exe to push your packages to nuget.org. This can be a bit of a pain, as you're now managing another build tool, but you'll see that I've managed to smooth this out…

So how do you do it all? Good question! If you just want it straight, skip to the next heading. I won't be offended, promise! For the rest of you, though, the next few paragraphs explain some things I wish had been a shoulder-tap away while I was bumbling through it all.

At the time of this post's writing, project.json is in the process of being replaced with a massively updated MSBuild. Going forward, the advice you see here will still be useful once project.json is fully phased out, but a few specific items will have to be adjusted.
That said, regardless of which you happen to be using, the rules for publishing a NuGet package are the same: at a bare minimum, you're going to need to provide some kind of version and a project name.

This information has to come from somewhere, and by extension that can also mean it has to live somewhere or be generated. Where it ends up going before you package things will always be the same, however: inside project.json.
For the time being, the approach I'm taking has the version committed to source control. This isn't totally my preference, but the main reason is that achieving anything else requires setup that I - or the community - haven't made pretty yet. It's also largely dependent on how you or your team chooses to handle versioning and distribution.

Be sure to understand that when you're building your package, whatever your project.json or MSBuild files say is going to apply to the dll and nupkg that get packed and ultimately sent to and read by nuget.org.
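
To make that concrete, here's a hedged sketch of the kind of metadata a project.json can carry (the values and URLs are placeholders, not praxis's actual ones):

    {
      "version": "1.0.4",
      "description": "An example library",
      "authors": [ "Your Name" ],
      "packOptions": {
        "licenseUrl": "https://www.apache.org/licenses/LICENSE-2.0",
        "projectUrl": "https://github.com/your-name/your-library",
        "tags": [ "example", "library" ]
      }
    }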

Documentation of the NuGet schema, behaviours and ecosystem can be a bit scattered, so here are a few things that, since picking all of this up, I've likely come to take for granted:
  • All .nupkg files must have a unique version. Two uploaded packages cannot share a version within the same project.
  • NuGet packages follow the semver spec (MAJOR.MINOR.PATCH, as in 1.0.4). Study this one, because all your tooling is going to be banking on this standard: from dotnet, to Visual Studio, to NuGet, right the way through to TeamCity and Octopus.
  • Make sure your csproj name and your package name are identical, to the last letter. This is important because if anyone wants to debug using your sources, Visual Studio is able to overlay packages with project files sideloaded into the solution. As a former PHP developer, I cannot even begin to explain how important this is to me!
  • MyGet is an alternative package repository to nuget.org. You're almost never going to need it, and if you are led to believe that you do, give it a second thought. It may be that you're looking at outdated documentation or libraries.
  • I'm not sure why, but nuget.exe is basically just an executable file that you download. I wish this were prettier, but I suppose that would require Windows to have proper package management like Debian's. Worth mentioning that I am aware of Chocolatey, but it still pales in comparison to what the Debian package system is capable of.
  • You can run nuget.exe on Linux because it's actually written in .NET. Sadly, this doesn't make obtaining or even running nuget on Linux any easier. To do it, you have to have Mono installed and download the nuget binary, similar to what you do on Windows.
  • If you're providing them inline when calling nuget.exe, the current version requires that the ApiKey and Source parameters be supplied explicitly via switches. Current documentation and even the help text of the command would lead you to believe otherwise. It lies.
  • It's okay to pick a license! This is an important topic that you as a citizen of the software development community have a stake in. Microsoft has been using Apache-2.0, as have countless other open source projects including my own.

Packing

Okay, enough delay. You can create your nupkg quite easily by going into your project directory and running dotnet pack. As a convenience, this command will also build your project if necessary, and thus also accepts build-related parameters. Take a look at the command that praxis uses for a working example.
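
If you just want the shape of it without clicking through, a hedged sketch (the output path is illustrative, not necessarily what praxis uses):

    dotnet pack --configuration Release --output ./artifacts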

Because you've done all the up-front work of setting up your project, version and other metadata from the last section, dotnet handles all the details for you. Consider dotnet pack a victory for the ecosystem.

Pushing

You have the bits, nuget has the bandwidth. Let's make lots of downloads!

Head over to nuget.org and create yourself an account. Once that's done, go to your settings and get your API key. This is what you'll use to have tools act on your behalf, specifically nuget.exe. It probably doesn't need to be said, but I'll say it anyway: Do not commit your API key to source control!

To get your package up and indexed on nuget.org, you're going to run nuget push from inside of your project. The parameters to this one are yet again straightforward. Again, you can always reference what praxis does if you need something concrete.
Once run, nuget grabs whatever package was output from the pack step and pushes it using your API key and the hostname of the package server you're uploading to.
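
Again as a hedged sketch rather than praxis's actual script (the file name, environment variable and source URL are all illustrative):

    nuget push ./artifacts/Praxis.1.0.4.nupkg -ApiKey $NUGET_API_KEY -Source https://www.nuget.org/api/v2/package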


Output is going to look something like this when you run pack and then push. Some of the output is part of my build script, but this is literally what it looked like to publish version 1.0.4 of Praxis!

In the future, this will likely still be the general gist of things if you're using Preview 3 or greater of the .NET Core tooling. But if you manage all of the above, there's no reason you won't see your package listed on nuget.org.

Was that Docker & Travis CI?

Yup, caught red-handed again I suppose. But obviously all for a good reason. What's important to me is being able to take my code and run it anywhere. That doesn't just mean execute the output anywhere, it also means:
  • If every piece of technology I own breaks down, any other machine I can reach can be set up for development within an hour. Whether it runs Windows, OS X or Linux - I need to be productive on it without having to observe the conflicts between Apple, Microsoft and the community.
  • If Travis CI closes up shop tomorrow, I need to be able to quickly move my builds elsewhere without having my whole process codified in a language only they understand.
  • If someone wants to take my project and work with it, is it more helpful for them to have to trial and error and bug me until it's working? Or is there some way that I can author not just the functionality, but its substrate as well?
Docker by way of Docker for Windows and Docker for Mac has been a major shift in potential that I think is still clicking for some people. It's facilitated not just a technical convergence, but a convergence of attitudes.

So, starting with Travis CI, it's actually quite simple. There is a file in praxis called .travis.yml that serves to coordinate what happens during the standard lifecycle prescribed by Travis CI. Inside this file I tell it what functionality I plan on using, which branches to trigger on and what commands to run for each step I've hooked. The most enjoyable part of this file as I have it is that it contains no abstractions. It merely acts as a passthrough to my containerized build process, Travis CI doesn't even need to have dotnet or anything else installed. So long as I have docker and docker-compose, my build runs, just the same as I prepared it on my own machine.
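
To give you the flavour without leaving the page, here's a hedged reconstruction of the shape such a file takes - not the actual file from praxis:

    sudo: required
    services:
      - docker
    branches:
      only:
        - master
    script:
      - docker build -t praxis-cli .
      - docker-compose run praxis-cli bash build.sh
      - docker-compose run -e BRANCH="$TRAVIS_BRANCH" praxis-cli bash publish.sh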

The first command tells Docker to build my container image. This reaches out to the Dockerfile I've specified in my project. It's only around 14 lines long.
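
In spirit - and this is my reconstruction under assumptions, not the actual file - it says something like:

    # start from an image that already has the .NET Core SDK installed
    FROM microsoft/dotnet:1.1-sdk

    # nuget.exe is a .NET executable, so on linux it needs mono to run
    RUN apt-get update \
        && apt-get install -y mono-complete curl \
        && curl -L -o /usr/local/bin/nuget.exe https://dist.nuget.org/win-x86-commandline/latest/nuget.exe

    # the project gets mounted here by docker-compose at run time
    WORKDIR /app
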
The second command instructs Travis CI to take the container image I've prepared and run an instance of it. Remember, with Docker, you don't run the image. The image only serves as the origin of a new, short-lived system. Any changes made are not saved to the image, so when the instance goes away, so too does everything it's done that hasn't been written outside of it.

That might seem frightening, but as I'm about to point out, that's only because if this is the first time you're hearing it, you've still only got half the story. The way you handle Docker's transience is by granting containers access to filesystems and services that live longer than they do.
If you're using Docker on its own, this is done by passing additional flags when running the container. That can get a bit verbose and isn't always the nicest to edit, so Docker has a tool called docker-compose which allows you to define the state of multiple containers as interconnected services. This is why you see a docker-compose.yml file in praxis.
There's only one service defined in there, named praxis-cli. This service contains instructions on what image to use when it's requested, what environment variables to bind to, and what filesystem paths to make available to it.
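
Once more as a hedged sketch rather than the real file, that service definition might look something along these lines:

    version: "2"
    services:
      praxis-cli:
        # the image built by the first command in .travis.yml
        image: praxis-cli
        # bare names pass the host's values straight through to the container
        environment:
          - BRANCH
          - NUGET_API_KEY
        # mount the project directory so build output outlives the container
        volumes:
          - .:/app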

A more detailed look at docker-compose is easily outside the scope of anything but its own documentation. But it's a very satisfying way to get familiar with the attitudes of the Docker universe.

So, bringing that all back to the .travis.yml file, the second command calls docker-compose, tells it to create an instance of the praxis-cli service and run the build.sh script using bash in it.

Care to guess what the third command does? It sure is boring, but it's easy to digest by design! It calls publish.sh to pack and push the package up to nuget.org, as we covered in the previous sections. The one special mention here is that you can see I'm passing an environment variable to docker-compose as part of the command. This is consistent with my philosophy of not allowing the build system to influence my builds. While my build requires a branch name, I don't want it to be available as "TRAVIS_BRANCH", so I simply map the same value over to "BRANCH" and allow docker-compose to pass that value to the script it invokes.

Summary

I really hope that by this point you have a clear idea of where you stand in your understanding of what it takes to produce a package for the .NET ecosystem. More importantly though, I hope you have a custom-tailored idea of what the next steps to take are. As I preached at the start of this post, producing a package is something to be proud of and involves lots of moving parts.

If you're still not quite there yet and want to get a feel for things, one thing that you should see by now is that there's nothing about building and publishing a package that requires external infrastructure. My Docker container setup is portable and repeatable no matter where it is run. So long as it's provided with an API key, it can produce a package and push it anywhere.

Really, if it suited you, you could even adopt a process where you manually perform every package update by lovingly running it yourself. Obviously that's not a solution anyone is going to fall in love with for the long term. But this should pave the way for experimentation and learning. Feel free to clone Praxis, rename the project and publish it to a private feed on MyGet even!

The last thing I hope is that you've enjoyed reading this. I love talking software development, so if there's anything I've missed, feel free to comment here or reach out to me on Twitter!
