Last week I went to the NDC conference in London, along with some of my fellow endjineers. NDC is a great event—it always seems to attract good speakers. I’ve been out of the conference circuit for a few years because I went from zero to three children in the space of slightly under 4 years, so I wanted to reduce my travel for a while, but now that I’m able to spend a bit more time being active in the developer community, NDC is a brilliant place to reconnect, and to meet new people. And the talks are pretty good too!
My colleagues have already posted a few reports of what they’ve been up to here:
Here are the talks I managed to get to on day 1.
OpenID Connect & OAuth 2.0 – Security Best Practices
Speaker: Dominic Baier
Dom is the reason IdentityServer exists, so there are few people who understand OpenID Connect (OIDC) and OAuth 2.0 as well as he does—there’s nothing quite like building your own implementation of a specification if you want to understand how it works. If you’re not familiar with IdentityServer, it’s a free, open source OpenID Connect and OAuth 2.0 framework for ASP.NET Core.
In most of my recent experience with OIDC and OAuth 2.0, I’ve used Microsoft’s implementations—either the middleware built into ASP.NET Core, or the Microsoft Authentication Library (MSAL). (These are good when they meet your needs, but MSAL is designed specifically for Azure AD, and with the ASP.NET Core middleware it’s certainly possible to run into scenarios that the specifications support but these implementations do not. That’s where IdentityServer can fit in.) So it was valuable to get a perspective on these standards that wasn’t tied to the particular implementations I was familiar with.
Dom has very generously made his slides available here: https://speakerdeck.com/leastprivilege/oauth-and-openid-connect-security-best-practices.
Here are some of the most valuable points for me. Dom talked about the OAuth 2.0 Security Best Current Practice document, a vital resource that captures what we currently know about how best to apply OAuth 2.0. (OAuth 2.0 is a flexible specification, and there are good and bad ways to use it. Moreover, the industry is always learning more about what works and what doesn’t, so it’s important to have a current best practice guide to capture our ever-evolving understanding, even as the spec itself remains carved in stone.) He talked about how, although OAuth 2.0 offers many different ways to do things, the industry has converged on two main approaches: one for machine-to-machine communication, and one for interactive applications. He showed some fascinating (and alarming) attacks that are possible against fully-spec-compliant-but-naive OAuth 2.0 or OIDC implementations (which is why you should never roll your own), and how to avoid them. He also described an idea that was new to me: pushed authorization requests, in which the authorization server creates a single-use URL for each user logon event, avoiding a number of security pitfalls that surround traditional authorization endpoints.
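The recommended interactive approach is the authorization code flow with PKCE (RFC 7636): the client generates a random code verifier, sends a SHA-256-derived challenge on the authorization request, and proves possession of the verifier when redeeming the code. As a rough, framework-neutral sketch of just the PKCE derivation (the function name here is mine, not from any particular library):

```python
import base64
import hashlib
import secrets


def make_pkce_pair() -> tuple[str, str]:
    """Generate a PKCE code_verifier and its S256 code_challenge (RFC 7636)."""
    # 32 random bytes -> a 43-character base64url verifier (within the 43-128 limit)
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode("ascii")
    # code_challenge = BASE64URL(SHA256(code_verifier)), with padding stripped
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")
    return verifier, challenge


verifier, challenge = make_pkce_pair()
print(len(verifier), challenge)
```

In a real client you would send the challenge (with `code_challenge_method=S256`) on the authorize request, and the verifier on the token request; libraries such as IdentityServer’s client tooling or MSAL do this for you.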
Application Diagnostics in .NET Core 3.1
Speakers: David Fowler and Damian Edwards
This talk described the enhancements in .NET Core 3.1 to support diagnostics. When .NET first appeared, it relied on the diagnostic mechanisms built into Windows. That was a sensible choice for the first 15 years or so, but since .NET went cross-platform with .NET Core, it has no longer been a viable solution for anyone wanting to run on, say, Linux. .NET Core 3.1 is the first of the cross-platform versions of .NET to offer a truly comprehensive diagnostic solution that works on all supported platforms.
If you follow the developments of .NET Core you will already be familiar with David and Damian. And if you’re not, you should follow them on Twitter immediately: https://twitter.com/davidfowl and https://twitter.com/DamianEdwards – David is a Partner Software Architect on the ASP.NET team, and Damian is a Program Manager for all things .NET at Microsoft.
One of the most useful aspects of this talk was that it provided a great overview of all the different ways in which .NET code can produce diagnostic data. There are many .NET library features available, and if you’re new to .NET it can be a bit baffling—why do we need ILogger, DiagnosticSource, Activities and EventSource? They explained the various scenarios these are meant for and how to choose which to use. They also walked through the new runtime and tooling features .NET Core 3.1 introduces to get access to diagnostic data—you can use these to collect either custom information produced by your application’s own instrumentation, or a wide range of information about the behaviour of the .NET runtime and its libraries. (For example, .NET offers detailed information about the behaviour of its garbage collector.)
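On the tooling side, the cross-platform diagnostic CLI tools ship as .NET global tools rather than with the SDK itself. As an illustration (the `<pid>` is a placeholder for your target process; check the official docs for current options):

```sh
# Install the diagnostic global tools (distributed separately from the SDK)
dotnet tool install --global dotnet-counters
dotnet tool install --global dotnet-trace

# Live-monitor runtime counters (GC, thread pool, exceptions) for a running process
dotnet-counters monitor --process-id <pid>

# Capture a trace of runtime and custom EventSource events for later analysis
dotnet-trace collect --process-id <pid>
```

Because these talk to the runtime over a local IPC channel rather than Windows-specific ETW, the same workflow applies on Linux and macOS.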
Modernizing the enterprise desktop application
Speaker: Oren Novotny
This talk’s theme was close to my heart: building desktop applications. You know, real applications, the kind of thing that could be distributed on a pile of floppy disks—none of this transient web-based nonsense that flakes out the moment your train goes into a tunnel. I like the kind of software that could take a two-year stint working at the South Pole, with little to no connectivity, in its stride. My love for this kind of application is reflected in my work: my first book (co-authored with endjin founder Matthew Adams) was on Windows Forms, and my third was on WPF (two editions), and although Pluralsight has seen fit to hide them from search results, you can still find my WPF courses here:
It’s old code but it still checks out—those are around a decade old, but the fundamental concepts of WPF haven’t changed, so it’s all still relevant today.
Sadly, WPF was moribund for quite some time. For a while Microsoft diverted its client-side efforts to Silverlight, fighting for territory that was about to become irrelevant. (Both sides, Silverlight and Adobe Flash, lost that particular fight.) But the fact remains that for all the web’s many advantages, these old-school pure desktop technologies are still really useful, and for the right scenario, Windows Forms or WPF can be by far the best choice of technology.
This is why WPF and Windows Forms have had a renaissance. Having apparently been on the long and lonely path to end-of-life for years, these technologies have proven irresistible: developers have continued to use and love them, to the point that Microsoft ultimately decided to support them in .NET Core 3. Naturally, this is not a cross-platform feature—these have always been technologies for building Windows desktop applications, so they’re only ever going to work on Windows. But much like practical nuclear fusion, the year of Linux on the desktop seems to be stubbornly stuck in the future, and not everyone prefers Macs, you know. Part of the appeal of these technologies is precisely that they aren’t attempting to be a one-size-fits-all solution.
Oren brought us up to date on what’s new in this world: .NET Core 3 support, the open sourcing of these frameworks, how to apply modern CI/CD practices to apps built this way, how to instrument desktop apps with telemetry, and all the latest developments in deployment mechanisms on Windows.
How to Steal an Election
Speaker: Gary Short
This, the last talk of the day, was entertaining, but one I never ever want to apply in practice. Having started my day with a talk that helped me to protect myself from people who want to break into my application and steal either my data or my money, I ended with one which, to paraphrase Gary, was about people who want to break into my country and steal my democracy.
If that seems slightly off-topic for a tech conference, this was ostensibly a data science talk. (Gary is a freelance data scientist.) But it also covered the analytical and technical techniques that can be used to undermine democracy, along with the broader armoury—blackmail, extortion, and psychological manipulation—used by those who know their political agenda would never be voted in on its merits alone. Many of the attack vectors he described have been enabled recently by technology, and it is incumbent on us as technologists to take responsibility for the ways in which the tools we build might be used.
This was a fitting end to my first day at NDC because, as m’colleague Carmel Eve describes in her post, NDC London – A dive into responsible and inclusive technology, the message of responsibility and consideration for the possible implications of our actions and decisions has been an overriding theme of the conference as a whole. This is a vitally important theme for our time.