
In this post we show how a combination of Kubernetes, Azure Durable Functions and Azure API Management can be used to make legacy batch processing code available as a RESTful API. This is a great example of how serverless technologies can be used to expose legacy software to the public internet in a controlled way, allowing you to reap some of the benefits of a cloud-first approach without fully rewriting and migrating existing software.


Integrating Azure Analysis Services into custom applications means more than just querying the data. By surfacing the metadata in your models, you can build dynamic and customisable UIs and APIs, tailored to the needs of the client application. This post explains how easy it is to query model metadata from .NET, so you can create deeper integrations between your data insights and your custom applications.


One of the first steps in integrating Azure Analysis Services into your applications is creating and opening a connection to the server – just like any other database technology. This post explains the ins and outs of creating Azure Analysis Services connections, including code samples for each of the key scenarios. 
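As a minimal sketch of what that looks like (assuming the ADOMD.NET client library, and with placeholder server, model and credential values that you would substitute for your own), opening a connection and running a trivial DAX query is only a few lines:

```csharp
using System;
using Microsoft.AnalysisServices.AdomdClient;

// Placeholder server URI, model name and service principal credentials -
// substitute your own values.
var connectionString =
    "Data Source=asazure://westeurope.asazure.windows.net/myserver;" +
    "Initial Catalog=MyModel;" +
    "User ID=app:<clientId>@<tenantId>;Password=<clientSecret>";

using (var connection = new AdomdConnection(connectionString))
{
    connection.Open();

    // Once open, the connection can execute DAX or MDX against the model.
    using (var command = new AdomdCommand("EVALUATE ROW(\"One\", 1)", connection))
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            Console.WriteLine(reader.GetValue(0));
        }
    }
}
```

The connection string shown uses service principal authentication; interactive Azure AD and managed identity flows follow the same shape with a different `User ID`/`Password` form.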


NDC London 2020 – My highlights

by Ed Freeman

A couple of weeks back, along with a rabble of other endjineers, I was fortunate enough to attend NDC London. This wasn’t my first time at an NDC conference – in fact, my previous outing was to Oslo to experience the “original” flavour of NDC back in 2018. That was extremely fun and packed with […]


So, my time at NDC 2020 has come to an end. But before I make any more general observations, here are my thoughts on the sessions I saw on day 3. "Crash, bang, wallop: miscellaneous lessons from exploring a drum kit": On Friday morning, technical interest won out over practical use, and I found myself at […]


So, another packed day at NDC has completed. Following on from my day 1 retrospective, here's a rundown of my day. "The State of Vue.js in 2020": I had intended to start the day with Troy Hunt's "The Internet of pwned things" talk, but changed my mind at the last minute. At endjin, we're […]


Along with several of my endjin colleagues, I'm attending NDC London this week. Today was day 1, and here's a run-through of the sessions I attended and my thoughts. Keynote: The day started with the keynote from Tess Ferrandez-Norlander, titled "We are […]


With a variety of integration support through client SDKs, PowerShell cmdlets and REST APIs, it can be hard to know where to start with integrating Azure Analysis Services into your custom applications. This post walks through the options and lays out a simple guide to choosing the right framework.


We’ve done a lot of work at endjin with Azure Analysis Services over the last couple of years – but none of it has been what you’d call “traditional BI”. We’ve pulled, twisted and bent it in all sorts of directions, using its raw analytical processing power to underpin bespoke analysis products and processes. This post explains some of the common (and not-so-common) reasons why you might want to do similar things, and how Azure Analysis Services might be the key to unlocking your data insights.


AI for Good Hackathon

by Ian Griffiths

Towards the end of last year, Microsoft invited endjin along to a hackathon session they hosted at the IET in London as part of their AI for Good initiative. I’ve been thinking about the event and the broader work Microsoft is doing here a lot lately, because it gets to the heart of what I love about working in this industry: computers can magnify our power to do good.


In this blog from the Azure Advent Calendar 2019 we discuss building a secure data solution using Azure Data Lake. Data Lake has many features which enable fine-grained security and data separation. It is also built on Azure Storage, which enables us to take advantage of all of those features and means that ADLS is still a cost-effective storage option!

This post runs through some of the great features of ADLS and walks through an example of how we build our solutions using this technology!


Very excited to be speaking at NDC in London in January! The talk, “Combatting illegal fishing with Machine Learning and Azure”, focuses on the recent work we did with OceanMind. OceanMind are a not-for-profit working on cleaning up the world’s oceans with the help of Microsoft’s cloud technologies. […]


We recently ran into quite an obscure error whilst trying to integrate a VNet with our app service using the Regional VNet integration (which is currently in preview). There were not many details about this error, other than “NotImplemented” and “Access is denied”. What’s more, we were only seeing this […]


C#, Span and async

by Ian Griffiths

The addition of ref struct types, most notably Span&lt;T&gt;, opened C# to a range of high-performance scenarios that were impractical to tackle with earlier versions of the language. However, they introduce some challenges. For example, they do not mix very well with async methods. This article shows some techniques for mitigating this.
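A minimal sketch of the core mitigation (the method and delegate names here are illustrative, not from the article): because Span&lt;T&gt; is a ref struct, it cannot appear in an async method where it might need to survive an await, so the buffer is held as Memory&lt;byte&gt; across the awaits and only converted to a span inside a synchronous helper:

```csharp
using System;
using System.Buffers.Binary;
using System.Threading.Tasks;

public static class SpanAsyncExample
{
    // Memory<byte> is an ordinary struct, so it can safely cross an await
    // boundary; the Span<byte> work is confined to a synchronous method.
    public static async Task<int> ReadValueAsync(Func<Memory<byte>, Task> fillBuffer)
    {
        var buffer = new byte[4];
        await fillBuffer(buffer);   // Memory<byte> is fine across await
        return ParseValue(buffer);  // span-based work happens synchronously
    }

    private static int ParseValue(ReadOnlySpan<byte> data) =>
        BinaryPrimitives.ReadInt32LittleEndian(data);
}
```

The compiler enforces this split: declaring a Span&lt;T&gt; local in the async method body would be a build error, which is precisely the friction the article addresses.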


GitHub Actions is GitHub’s new CI/CD platform. It is comparable with Azure Pipelines, which forms part of the Azure DevOps suite. In this post, Mike Larah looks at the similarities and differences in the high-level concepts and terminology between the two platforms.


Long Running Functions in Azure Data Factory

by Jess Panni

Azure Functions are powerful and convenient extension points for your Azure Data Factory pipelines. Put your custom processing logic behind an HTTP-triggered Azure Function and you are good to go. Unfortunately, many people read the Azure documentation and assume they can merrily run a Function for up to 10 minutes on a consumption plan […]
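A hedged sketch of the usual workaround (assuming the Durable Functions 2.x programming model; the function and input names are placeholders): a durable orchestration lets Data Factory poll a status URL instead of holding one HTTP request open for the whole run, so the HTTP trigger's timeout no longer bounds the work:

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using System.Threading.Tasks;

public static class LongRunningPipeline
{
    // The orchestrator returns quickly; Data Factory polls the orchestration
    // status endpoint until the activity below has finished.
    [FunctionName("ProcessOrchestrator")]
    public static async Task<string> RunOrchestrator(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        var input = context.GetInput<string>();
        return await context.CallActivityAsync<string>("ProcessBatch", input);
    }

    [FunctionName("ProcessBatch")]
    public static string ProcessBatch([ActivityTrigger] string input)
    {
        // Long-running custom processing logic goes here.
        return $"Processed: {input}";
    }
}
```

On the Data Factory side this pairs naturally with a Web activity that starts the orchestration followed by an Until loop polling the returned `statusQueryGetUri`.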


How Azure DevTestLabs is helping me climb Everest

by Carmel Eve

Remote working allows us to work from anywhere we want. This brings a huge amount of flexibility and freedom; however, we do need the help of a working laptop! When Carmel’s laptop gave in just before a trip, she used Azure DevTestLabs to continue working from a 10-year-old Mac that probably wouldn’t have been up to the task alone…


We worked on a project recently which required us to build a highly performant system for processing vast quantities of messages in real time. We had made the decision to run this processing using Azure Functions with C#. This post runs through some of the techniques we used for writing highly performant, low allocation code, including data streaming, list preallocation and the relatively new C# feature: Span.
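Two of those techniques, list preallocation and span-based parsing, can be sketched together (this is an illustrative example, not code from the project; the method names are assumptions): sizing the list up front avoids repeated internal array resizes, and slicing a ReadOnlySpan&lt;char&gt; avoids allocating a substring per field:

```csharp
using System;
using System.Collections.Generic;

public static class LowAllocationExample
{
    // Parses a line of comma-separated integers with minimal allocation:
    // the only heap allocation is the result list itself.
    public static List<int> ParseCsvLine(ReadOnlySpan<char> line, int expectedFields)
    {
        var values = new List<int>(expectedFields); // preallocated capacity
        while (!line.IsEmpty)
        {
            int comma = line.IndexOf(',');
            ReadOnlySpan<char> field = comma < 0 ? line : line.Slice(0, comma);
            values.Add(int.Parse(field));           // span overload: no string created
            line = comma < 0 ? ReadOnlySpan<char>.Empty : line.Slice(comma + 1);
        }
        return values;
    }
}
```

The `int.Parse(ReadOnlySpan<char>)` overload used here has been available since .NET Core 2.1, which is what makes allocation-free field parsing like this practical.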


Running Azure functions in Docker on a Raspberry Pi 4

by Jonathan George

At our endjin team meet-up this week, we were all presented with Raspberry Pi 4Bs and told to go away and think of something good to do with them. I first bought a Raspberry Pi back in 2012 and have to admit, beyond installing XBMC and playing around with it, I haven’t done a […]


Import and export notebooks in Databricks

by Ed Freeman

Sometimes we need to import and export notebooks from a Databricks workspace. This might be because you have a bunch of generic notebooks that can be useful across numerous workspaces, or it could be that you’re having to delete your current workspace for some reason and therefore need to transfer content over to a new […]

