Azure

If you use Azure Functions on a regular basis, you’ll likely have grappled with the challenge of testing them. Even now, several years after their introduction, the testing story for Functions is not hugely well defined. In the fourth of this series of posts, we look at how configuration can be supplied from your tests to the functions apps being tested.
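To give a flavour of the approach, here’s a minimal, hypothetical sketch (not the Corvus.SpecFlow.Extensions API itself): the functions runtime surfaces app settings as environment variables, so one way a test can supply configuration is to set those variables on the host process it launches.

```csharp
using System.Diagnostics;

// Hypothetical sketch only: setting names, values and paths are placeholders.
// The functions runtime reads app settings from environment variables, so a
// test can pass configuration by setting them on the host process it starts.
var startInfo = new ProcessStartInfo("func", "host start --port 7071")
{
    WorkingDirectory = @"..\..\..\MyFunctionsApp",
    UseShellExecute = false, // required in order to set environment variables
};
startInfo.EnvironmentVariables["MyService__BaseUrl"] = "http://localhost:8080";
startInfo.EnvironmentVariables["MyService__ApiKey"] = "test-key";

using Process host = Process.Start(startInfo)!;
```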


If you use Azure Functions on a regular basis, you’ll likely have grappled with the challenge of testing them. Even now, several years after their introduction, the testing story for Functions is not hugely well defined. In the third of a series of posts, we look at using classes in the Corvus.SpecFlow.Extensions library to run functions apps via scenario and feature hooks.
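Stripped right back, the underlying idea looks something like this sketch (the library’s real API is richer than this, handling ports, logging and cleanup; the names and paths here are placeholders):

```csharp
using System.Diagnostics;
using TechTalk.SpecFlow;

[Binding]
public static class FunctionsHostHooks
{
    private static Process? functionsHost;

    // Start the functions host once, before any scenario in the feature runs.
    [BeforeFeature]
    public static void StartFunctionsApp()
    {
        functionsHost = Process.Start(new ProcessStartInfo("func", "host start --port 7071")
        {
            WorkingDirectory = @"..\..\..\MyFunctionsApp", // path to the app under test
            UseShellExecute = false,
        });
    }

    // Tear the host down again once the feature has finished.
    [AfterFeature]
    public static void StopFunctionsApp()
    {
        functionsHost?.Kill();
        functionsHost?.Dispose();
    }
}
```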


If you use Azure Functions on a regular basis, you’ll likely have grappled with the challenge of testing them. Even now, several years after their introduction, the testing story for Functions is not hugely well defined. In the second of a series of posts, we look at using step bindings provided by the Corvus.SpecFlow.Extensions library to run functions apps as part of your SpecFlow scenarios.


If you use Azure Functions on a regular basis, you’ll likely have grappled with the challenge of testing them. Even now, several years after their introduction, the testing story for Functions is not hugely well defined. In the first of a series of posts, we look at some different approaches to testing your functions apps, and introduce the Corvus.SpecFlow.Extensions library.


Integrating Azure Analysis Services into custom applications doesn’t just mean read-only data querying. But if your application changes the underlying model, it will need to be re-processed before the changes take effect. This post describes how to use the REST API for Azure Analysis Services inside a custom .NET application to perform asynchronous model refreshes, meaning your applications can reliably and efficiently deal with model updates.
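As a taste of what’s involved, here’s a minimal sketch of starting an asynchronous refresh, assuming you already have an AAD access token for the service; the region, server and model names are placeholders:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static class ModelRefresh
{
    public static async Task<Uri?> StartRefreshAsync(HttpClient client, string accessToken)
    {
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);

        var body = new StringContent(
            "{ \"Type\": \"Full\", \"CommitMode\": \"transactional\" }",
            Encoding.UTF8,
            "application/json");

        // POST /refreshes starts the refresh; the service replies 202 Accepted.
        HttpResponseMessage response = await client.PostAsync(
            "https://westeurope.asazure.windows.net/servers/myserver/models/MyModel/refreshes",
            body);

        response.EnsureSuccessStatusCode();

        // The Location header identifies the refresh operation, which can be
        // polled with GET until its status is no longer "inProgress".
        return response.Headers.Location;
    }
}
```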


Being able to construct DAX queries dynamically in C# means the possibilities are endless in terms of integrating Azure Analysis Services queries into your custom applications, and with the code samples in this post, you have everything you need to get started.
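For example, a dynamically composed query can be executed with ADOMD.NET along these lines (the connection string, table and measure names are placeholders):

```csharp
using System;
using Microsoft.AnalysisServices.AdomdClient;

// Build the DAX dynamically – here the measure is chosen at runtime.
string measure = "[Total Sales]";
string query = $"EVALUATE SUMMARIZECOLUMNS('Date'[Year], \"Sales\", {measure})";

using var connection = new AdomdConnection(
    "Data Source=asazure://westeurope.asazure.windows.net/myserver;Catalog=MyModel;");
connection.Open();

using var command = new AdomdCommand(query, connection);
using AdomdDataReader reader = command.ExecuteReader();

while (reader.Read())
{
    Console.WriteLine($"{reader.GetValue(0)}: {reader.GetValue(1)}");
}
```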


The Azure CNAB Quickstart Templates we’ve created are only half the story. Much of the work we’ve done over the last few months involved the authoring, contribution and DevOps pipelines required to support an open source project. The project is inspired by the original Azure Quickstart Templates – which over the last 5 years has grown to over 850 templates. In this post we’re going to explain how you can author CNAB templates and contribute them.


An Overview of the Azure CNAB Quickstarts Library

by Howard van Rooijen

The Azure CNAB Quickstarts Library helps you get up and running with CNAB and Porter. We’ve built quickstarts covering solutions like WordPress, Ghost, Mattermost, and data platforms like Apache Airflow, SQL Server AlwaysOn clusters and Kubernetes features like an nginx ingress controller and an Azure AD enabled OAuth2 Proxy. We’ve condensed all our learnings from the past 9 months of working on the project and turned them into a 10 minute video which explains all the key concepts. We hope this video helps accelerate your own CNAB & Porter epiphanies!


Integrating Azure Analysis Services into custom applications means more than just querying the data. By surfacing the metadata in your models, you can build dynamic and customisable UIs and APIs, tailored to the needs of the client application. This post explains how easy it is to query model metadata from .NET, so you can create deeper integrations between your data insights and your custom applications.
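As a sketch of the idea, the Tabular Object Model (TOM) lets you walk a model’s tables, columns and measures; the server and database names here are placeholders:

```csharp
using System;
using Microsoft.AnalysisServices.Tabular;

var server = new Server();
server.Connect("asazure://westeurope.asazure.windows.net/myserver");

Model model = server.Databases.GetByName("MyModel").Model;

// Enumerate the metadata a client application might use to build its UI.
foreach (Table table in model.Tables)
{
    Console.WriteLine(table.Name);

    foreach (Column column in table.Columns)
    {
        Console.WriteLine($"  {column.Name} ({column.DataType})");
    }

    foreach (Measure measure in table.Measures)
    {
        Console.WriteLine($"  {measure.Name} = {measure.Expression}");
    }
}

server.Disconnect();
```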


One of the first steps in integrating Azure Analysis Services into your applications is creating and opening a connection to the server – just like any other database technology. This post explains the ins and outs of creating Azure Analysis Services connections, including code samples for each of the key scenarios. 
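To give a flavour, here are two of those scenarios sketched with ADOMD.NET; the server, model and credential values are all placeholders:

```csharp
using Microsoft.AnalysisServices.AdomdClient;

// 1. Interactive – the current user is prompted to sign in with Azure AD.
var interactive = new AdomdConnection(
    "Data Source=asazure://westeurope.asazure.windows.net/myserver;Catalog=MyModel;");

// 2. Unattended – a service principal, via the app:<clientId>@<tenantId> convention.
var unattended = new AdomdConnection(
    "Data Source=asazure://westeurope.asazure.windows.net/myserver;" +
    "Catalog=MyModel;" +
    "User ID=app:00000000-0000-0000-0000-000000000000@contoso.onmicrosoft.com;" +
    "Password=<client-secret>;");

unattended.Open();
// ... query as normal, then:
unattended.Close();
```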


NDC London 2020 – My highlights

by Ed Freeman

A couple of weeks back, along with a rabble of other endjineers, I was fortunate enough to attend NDC London. This wasn’t my first time at an NDC conference – in fact, my previous outing was to Oslo to experience the “original” flavour of NDC back in 2018. That was extremely fun and packed with […]


Along with several of my endjin colleagues, I attended NDC London in January this year – here’s a run-through of the sessions I attended on Day 3 and my thoughts. This final day was a mixed bag, taking in talks on drumming and Akka.NET, as well as something a bit closer to home – a session from endjin’s own Jess Panni and Carmel Eve on our recent project for OceanMind.


With a variety of integration support through client SDKs, PowerShell cmdlets and REST APIs, it can be hard to know where to start with integrating Azure Analysis Services into your custom applications. This post walks through the options, and lays out a simple guide to choosing the right framework.


We’ve done a lot of work at endjin with Azure Analysis Services over the last couple of years – but none of it has been what you’d call “traditional BI”. We’ve pulled, twisted and bent it in all sorts of directions, using its raw analytical processing power to underpin bespoke analysis products and processes. This post explains some of the common (and not-so-common) reasons why you might want to do similar things, and how Azure Analysis Services might be the key to unlocking your data insights.


In this blog from the Azure Advent Calendar 2019, we discuss building a secure data solution using Azure Data Lake. Data Lake has many features which enable fine-grained security and data separation. It is also built on Azure Storage, which enables us to take advantage of all of those features and means that ADLS is still a cost-effective storage option!

This post runs through some of the great features of ADLS and walks through an example of how we build our solutions using this technology!
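For instance, one of those building blocks is ADLS Gen2’s POSIX-style ACLs, which can be applied per directory. A sketch using the Azure.Storage.Files.DataLake SDK (the account, filesystem, directory and AAD object ID are placeholders):

```csharp
using System;
using Azure.Identity;
using Azure.Storage.Files.DataLake;
using Azure.Storage.Files.DataLake.Models;

var serviceClient = new DataLakeServiceClient(
    new Uri("https://myaccount.dfs.core.windows.net"),
    new DefaultAzureCredential());

DataLakeDirectoryClient directory = serviceClient
    .GetFileSystemClient("raw")
    .GetDirectoryClient("sales");

// Grant a single AAD principal read/execute on this directory only.
var acl = PathAccessControlExtensions.ParseAccessControlList(
    "user::rwx,group::r-x,mask::r-x,other::---," +
    "user:00000000-0000-0000-0000-000000000000:r-x");
directory.SetAccessControlList(acl);
```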


Long Running Functions in Azure Data Factory

by Jess Panni

Azure Functions are powerful and convenient extension points for your Azure Data Factory pipelines. Put your custom processing logic behind an HTTP triggered Azure Function and you are good to go. Unfortunately many people read the Azure documentation and assume they can merrily run a Function for up to 10 minutes on a consumption plan […]
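One well-known way around such limits – a sketch of the general pattern, not necessarily the post’s exact solution – is the asynchronous HTTP API that Durable Functions provides: return 202 Accepted straight away and let the caller poll a status endpoint until the work completes (the orchestration name here is a placeholder):

```csharp
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class LongRunningStarter
{
    [FunctionName("Start")]
    public static async Task<HttpResponseMessage> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequestMessage req,
        [DurableClient] IDurableOrchestrationClient starter)
    {
        // Kick off the long-running work as an orchestration...
        string instanceId = await starter.StartNewAsync("ProcessingOrchestration", null);

        // ...and return 202 Accepted immediately, with status-polling URLs the
        // caller (for example a Data Factory pipeline) can check until done.
        return starter.CreateCheckStatusResponse(req, instanceId);
    }
}
```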


How Azure DevTestLabs is helping me climb Everest

by Carmel Eve

Remote working allows us to work from anywhere we want. This brings a huge amount of flexibility and freedom; however, we do need the help of a working laptop! When Carmel’s laptop gave in just before a trip, she used Azure DevTestLabs to allow her to continue to work using a 10 year old Mac that probably wouldn’t have been up to the task alone…


Running Azure Functions in Docker on a Raspberry Pi 4

by Jonathan George

For one of my first experiments with the Raspberry Pi 4, I decided to get an Azure Function running in a Docker container. This post gives a step-by-step guide on how to do it, as well as providing code you can use as a starting point for your own experiments.


Quite often it’s beneficial to work with pre-built CLIs/SDKs to interact with your favourite tools, instead of making requests to the underlying REST API. Much of the complexity around constructing requests has been abstracted, and authentication is often easier. The Databricks CLI makes it easier to interact with your Databricks instance, but sometimes you can run into strange errors when constructing the values passed in as arguments. In this blog, we take a look at a JsonDecodeError that can occur when speaking to the Clusters CLI, and look at a way we can avoid this error.


Building a secure solution on Azure can be a daunting task. Using Azure Functions and Managed Identities, we have built up a pattern for giving services access to one another, without the need to store credentials. These managed identities can be given access to necessary resources. For example, they can be granted roles and added to access control lists in ADLS Gen2 accounts, or given the ability to access keys in Key Vault. This means that data can be securely accessed without needing to store connection strings or app passwords.
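A minimal sketch of the pattern, with placeholder resource URIs: DefaultAzureCredential resolves to the function app’s managed identity when running in Azure (and to your developer credentials locally), so no secrets appear in code or configuration:

```csharp
using System;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;
using Azure.Storage.Files.DataLake;

// One credential, no connection strings: in Azure this is the managed identity.
var credential = new DefaultAzureCredential();

// Data access flows through the identity's RBAC roles and ADLS Gen2 ACLs...
var dataLake = new DataLakeServiceClient(
    new Uri("https://myaccount.dfs.core.windows.net"), credential);

// ...and secrets come from Key Vault, with access granted to the same identity.
var secrets = new SecretClient(
    new Uri("https://my-vault.vault.azure.net"), credential);
KeyVaultSecret secret = secrets.GetSecret("third-party-api-key");
```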

