Cloud

Did you know that Azure Synapse has great support for .NET and #csharp? Learning new languages is often a barrier to digital transformation; being able to use existing people, skills, tools and engineering disciplines can be a massive advantage.


Have you ever needed an automated process to use alternative credentials for a subset of tasks? This post demonstrates a technique that allows you to set up multiple, concurrent authenticated sessions when using the azure-cli, and to switch freely between them.
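The full details are in the post; as a minimal sketch of the underlying idea, the azure-cli keeps its session state (including login tokens) in a directory that can be overridden with the AZURE_CONFIG_DIR environment variable, so giving each session its own config directory yields independent, concurrent logins. Driving that from C# might look like this (the paths are illustrative):

```csharp
// A minimal sketch, assuming the azure-cli's documented AZURE_CONFIG_DIR
// environment variable: pointing different invocations at different config
// directories gives each one an independent authenticated session.
using System.Diagnostics;

static void RunAz(string configDir, string arguments)
{
    var startInfo = new ProcessStartInfo("az", arguments)
    {
        UseShellExecute = false,
    };
    startInfo.EnvironmentVariables["AZURE_CONFIG_DIR"] = configDir;

    using var process = Process.Start(startInfo)!;
    process.WaitForExit();
}

// After "az login" has been run once against each config directory, commands
// can target either identity without re-authenticating.
RunAz(@"C:\az-sessions\interactive", "account show");
RunAz(@"C:\az-sessions\automation", "account show");
```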


Have you invested, or are you about to invest, in Azure Databricks? If so, the new Spark offering in Azure Synapse Analytics is likely to have grabbed your attention, and rightly so. Why is Microsoft putting yet another Spark offering on the table, and what does it mean for you?


For years we have been building modern cloud data solutions on Azure and helping our customers transform their use of data to drive outcomes. Here are 5 reasons why Azure Synapse Analytics might just be the service that we have been crying out for.


Whilst “read/write XMLA endpoint” might seem like a technical mouthful, its addition to Power BI is a significant milestone in the strategy of bringing Power BI and Analysis Services closer together. As well as closing the gap between IT-managed workloads and self-service BI, it presents a number of new opportunities for Power BI developers in terms of tooling, process and integrations. This post highlights some of the key advantages of this new capability and what they mean for the Power BI developer.


Azure Key Vault is used to protect encryption keys and secrets, which can in turn be used to access encrypted data and protected services. Individual Key Vaults can be used to isolate keys and secrets from one another. The keys stored can be either hardware or software protected, and access to keys and secrets is controlled using Azure Active Directory, RBAC and access policies.
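As a minimal sketch of what consuming a secret looks like from .NET application code (the vault and secret names are placeholders), the Azure SDK authenticates via Azure Active Directory and reads the secret:

```csharp
// A minimal sketch of reading a secret with the Azure SDK for .NET. The AAD
// identity is resolved by DefaultAzureCredential (managed identity, Azure CLI,
// Visual Studio, etc.); vault and secret names are placeholders.
using System;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

var client = new SecretClient(
    new Uri("https://my-vault.vault.azure.net/"),
    new DefaultAzureCredential());

// Access is granted (or denied) according to the vault's access policies/RBAC.
KeyVaultSecret secret = await client.GetSecretAsync("storage-connection-string");
Console.WriteLine($"Retrieved '{secret.Name}', version {secret.Properties.Version}.");
```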


Although Power BI reports are inherently difficult to test, validating the data modelling, business rules and security boundaries within them is important, as is ensuring that quality doesn't regress over time as the insights evolve. This post explains that, by connecting to the underlying tabular model, it is possible to execute scenario-based specifications, adding quality gates and building confidence in Power BI reports just as in any other software project.
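The post covers the approach in depth; as an illustrative sketch (workspace, model, measure and expected value are all placeholders, and the workspace's XMLA endpoint must be enabled), a scenario-based specification can evaluate a measure through ADOMD.NET and assert against a known expectation:

```csharp
// An illustrative scenario test against a tabular model via its XMLA endpoint,
// using ADOMD.NET (Microsoft.AnalysisServices.AdomdClient) and xUnit. All
// names and values are placeholders; credentials are omitted from the
// connection string for brevity.
using System;
using Microsoft.AnalysisServices.AdomdClient;
using Xunit;

public class SalesModelScenarios
{
    [Fact]
    public void TotalSales_For2019_MatchesExpectedValue()
    {
        using var connection = new AdomdConnection(
            "Data Source=powerbi://api.powerbi.com/v1.0/myorg/Finance;Initial Catalog=SalesModel");
        connection.Open();

        using var command = new AdomdCommand(
            "EVALUATE ROW(\"Total\", CALCULATE([Total Sales], 'Date'[Year] = 2019))",
            connection);

        using var reader = command.ExecuteReader();
        Assert.True(reader.Read());
        Assert.Equal(1_250_000m, Convert.ToDecimal(reader.GetValue(0)));
    }
}
```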


Jess and Carmel recently gave a talk at Azure Oxford on “Combatting illegal fishing with Machine Learning and Azure – for less than £10 / month”. The recording of that talk is now available for viewing!

The talk focuses on the recent work we completed with OceanMind. They run through how to construct a cloud-first architecture based on serverless and data analytics technologies, and explore the important principles and challenges in designing this kind of solution. Finally, we see how the architecture designed through this process not only provides all the benefits of the cloud (reliability, scalability, security) but, because of the pay-as-you-go compute model, also has a compute cost that we could barely believe!


Testing Power BI Dataflows isn't something that many people think about, but it's critical that business rules and associated data preparation steps are validated, to ensure the right insights are available to the right people across the organisation. Data insights are useless, even dangerous, if they can't be trusted, so despite the lack of “official support” or recommended approaches from Microsoft, endjin treat Power BI solutions just like any other software project with respect to testing, building automated quality gates into the end-to-end development process. This post outlines an approach that endjin has used to test Power BI Dataflows, adding quality gates and building confidence in large and complex Power BI solutions.
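The specifics are in the post itself; as one hedged illustration of what an automated quality gate can look like, a test run might trigger a dataflow refresh through the Power BI REST API before asserting over the refreshed entities (the group and dataflow IDs are placeholders, and the identity is assumed to have Power BI API permissions):

```csharp
// A hedged illustration, not endjin's actual harness: kick off a dataflow
// refresh via the Power BI REST API as the first step of a quality gate, then
// poll for completion and validate the dataflow's output in later stages.
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using Azure.Core;
using Azure.Identity;

var credential = new DefaultAzureCredential();
AccessToken token = await credential.GetTokenAsync(
    new TokenRequestContext(new[] { "https://analysis.windows.net/powerbi/api/.default" }));

using var http = new HttpClient();
http.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("Bearer", token.Token);

var response = await http.PostAsync(
    "https://api.powerbi.com/v1.0/myorg/groups/<groupId>/dataflows/<dataflowId>/refreshes",
    new StringContent("{\"notifyOption\": \"NoNotification\"}", Encoding.UTF8, "application/json"));
response.EnsureSuccessStatusCode();
```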


Azure Analysis Services provides an enterprise-grade analytical platform with massive scale and flexibility. But, as one of the more expensive services in the Azure platform, consideration should be given to cost management, especially in multi-environment ALM scenarios. This post explains how to massively reduce running costs through automation using PowerShell and orchestration tools like Azure DevOps.
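The post's automation is PowerShell-based (the Az module's Suspend-AzAnalysisServicesServer and Resume-AzAnalysisServicesServer cmdlets do the heavy lifting); the same suspend operation sketched in C# against the Azure Resource Manager REST API looks like this (subscription, resource group and server names are placeholders):

```csharp
// A sketch of the suspend call against the Azure Resource Manager REST API;
// the matching ".../resume" endpoint brings the server back. Schedule these
// calls from an orchestrator such as Azure DevOps to pause non-production
// environments outside working hours.
using System.Net.Http;
using System.Net.Http.Headers;
using Azure.Core;
using Azure.Identity;

var credential = new DefaultAzureCredential();
AccessToken token = await credential.GetTokenAsync(
    new TokenRequestContext(new[] { "https://management.azure.com/.default" }));

using var http = new HttpClient();
http.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("Bearer", token.Token);

var response = await http.PostAsync(
    "https://management.azure.com/subscriptions/<subscriptionId>/resourceGroups/<resourceGroup>" +
    "/providers/Microsoft.AnalysisServices/servers/<serverName>/suspend?api-version=2017-08-01",
    content: null);
response.EnsureSuccessStatusCode();
```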


Building a proximity detection pipeline

by Carmel Eve

At endjin, our approach focuses on using the scientific experimental method to support fully proved and tested decision making, and on using scientific research to support our work. This post runs through how we applied that process to the creation of a pipeline to detect vessel proximity.
This example is based on the project we recently worked on with OceanMind, in which we helped them build a #serverless architecture that could detect vessel proximity in close to real time. The vessel proximity events we detected were then fed into machine learning algorithms in order to detect illegal fishing!
Carmel also runs through some of the actual calculations we used to detect proximity, how we used #data projections to efficiently process large quantities of incoming data, and the use of #durablefunctions to orchestrate the processing.
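The post has the full details; as a sketch of the kind of calculation involved, the haversine formula is the standard way to compute great-circle distance between two lat/long positions, which can then be compared against a proximity threshold (the threshold below is illustrative):

```csharp
// A sketch of a proximity check: haversine great-circle distance between two
// positions on a sphere, compared against an illustrative threshold.
using System;

static double HaversineDistanceKm(double lat1, double lon1, double lat2, double lon2)
{
    const double EarthRadiusKm = 6371.0;
    double ToRadians(double degrees) => degrees * Math.PI / 180.0;

    double dLat = ToRadians(lat2 - lat1);
    double dLon = ToRadians(lon2 - lon1);
    double a = Math.Sin(dLat / 2) * Math.Sin(dLat / 2)
             + Math.Cos(ToRadians(lat1)) * Math.Cos(ToRadians(lat2))
             * Math.Sin(dLon / 2) * Math.Sin(dLon / 2);

    return 2 * EarthRadiusKm * Math.Asin(Math.Sqrt(a));
}

// Two vessels are "in proximity" if they are within, say, 0.5 km of each other.
bool inProximity = HaversineDistanceKm(51.5074, -0.1278, 51.5080, -0.1290) < 0.5;
Console.WriteLine(inProximity);
```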


This post explains how to update Azure Analysis Services model schemas from inside custom .NET applications. Whilst not a common scenario for most, it shows that this is easy to do using the AMO SDK. So, there’s nothing stopping you from developing complex and rich end-user functionality over the top of your data analysis solutions – providing run-time, user-driven schema changes like “what if” analysis.
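As a minimal sketch of what that looks like with the AMO/TOM SDK (Microsoft.AnalysisServices.Tabular), where the server, model and column names are placeholders:

```csharp
// A minimal sketch of a run-time, user-driven schema change via the tabular
// object model in the AMO SDK. Names are placeholders; a real application
// would drive these from user input.
using Microsoft.AnalysisServices.Tabular;

var server = new Server();
server.Connect("Data Source=asazure://westeurope.asazure.windows.net/myserver");

Database database = server.Databases.GetByName("SalesModel");
Table sales = database.Model.Tables["Sales"];

// Add a calculated column for a hypothetical "what if" discount scenario...
sales.Columns.Add(new CalculatedColumn
{
    Name = "Discounted Amount",
    Expression = "Sales[Amount] * 0.9",
});

// ...then push the schema change to the server.
database.Model.SaveChanges();
server.Disconnect();
```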


Wardley Maps are a fantastic tool to help provide situational awareness, in order to help you make better decisions. We use Wardley Maps to help our customers think about the various benefits and trade-offs that can be made when migrating to the Cloud. In this blog post, Jess Panni demonstrates how we used Wardley Maps to plan the migration of OceanMind to Microsoft Azure, and how the maps highlighted where the core value of their platform was, and how PaaS and Serverless services offered the most value for money for the organisation.


Optimising C# for a serverless environment

by Carmel Eve

In our recent project with OceanMind we used #AzureFunctions to process marine vessel telemetry from around the world. This involved processing huge quantities of data in close to real time. We optimised our processing for a #serverless environment, with the outcome that the compute costs less than £10 / month!

This post summarises some of the techniques we used, including some concrete examples of optimisations we made.

#bigdata #dataprocessing #dataanalysis #bigcompute
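One hedged example of the style of optimisation involved (the message format below is made up for illustration): replacing string.Split with ReadOnlySpan<char> slicing removes per-message allocations, which adds up when millions of telemetry records flow through a consumption-billed function:

```csharp
// Parsing a field out of a delimited message without allocating substrings,
// using ReadOnlySpan<char> slicing instead of string.Split.
using System;

static double ParseThirdField(ReadOnlySpan<char> message)
{
    // Skip the first two comma-delimited fields without allocating.
    for (int i = 0; i < 2; i++)
    {
        message = message[(message.IndexOf(',') + 1)..];
    }

    int end = message.IndexOf(',');
    return double.Parse(end < 0 ? message : message[..end]);
}

Console.WriteLine(ParseThirdField("430123456,1,51.5074,-0.1278")); // 51.5074
```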


Integrating Azure Analysis Services into custom applications doesn’t just mean read-only data querying. But if your application changes the underlying model, it will need to be re-processed before the changes take effect. This post describes how to use the REST API for Azure Analysis Services inside a custom .NET application to perform asynchronous model refreshes, meaning your applications can reliably and efficiently deal with model updates.
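As a sketch of the core call (region, server and model names are placeholders, and the token must be issued for the *.asazure.windows.net resource):

```csharp
// Triggering an asynchronous model refresh through the Azure Analysis
// Services REST API. The POST returns 202 Accepted straight away; the
// Location header identifies the operation so its status can be polled later.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using Azure.Core;
using Azure.Identity;

var credential = new DefaultAzureCredential();
AccessToken token = await credential.GetTokenAsync(
    new TokenRequestContext(new[] { "https://*.asazure.windows.net/.default" }));

using var http = new HttpClient();
http.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("Bearer", token.Token);

var response = await http.PostAsync(
    "https://westeurope.asazure.windows.net/servers/myserver/models/SalesModel/refreshes",
    new StringContent("{\"Type\": \"Full\"}", Encoding.UTF8, "application/json"));

Console.WriteLine($"{(int)response.StatusCode}: {response.Headers.Location}");
```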


As part of our work with OceanMind, endjin wrote a high-performance .NET AIS parser. AIS (Automatic Identification System) is how commercial ships report location information. This post describes the parser and the performance techniques it uses.
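As a hedged sketch of the first step any AIS parser performs: AIS payloads arrive in NMEA sentences whose characters each carry six bits of data ("six-bit ASCII"), decoded by subtracting 48 from the ASCII value and a further 8 if the result is greater than 40. endjin's parser layers many performance techniques on top of this basic scheme.

```csharp
// Decoding one character of an AIS payload's six-bit ASCII armouring.
using System;

static byte DecodeSixBit(char c)
{
    int value = c - 48;
    if (value > 40)
    {
        value -= 8;
    }

    return (byte)value;
}

// The message type lives in the first six bits of the payload; a payload
// starting '1' is a Class A position report.
Console.WriteLine(DecodeSixBit('1')); // prints 1
```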


Being able to construct DAX queries dynamically in C# means the possibilities are endless in terms of integrating Azure Analysis Services queries into your custom applications, and with the code samples in this post, you have everything you need to get started.
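As a flavour of what that enables (server, catalog, measure and filter values are placeholders), DAX text can be composed from run-time inputs and executed through ADOMD.NET:

```csharp
// A sketch of building DAX dynamically from application state and executing
// it with ADOMD.NET. In real code, user-supplied input should be validated
// before interpolation into the query text.
using System;
using Microsoft.AnalysisServices.AdomdClient;

static object? EvaluateMeasure(string measureName, int year)
{
    // Compose the DAX text at run time.
    string dax = $"EVALUATE ROW(\"Result\", CALCULATE([{measureName}], 'Date'[Year] = {year}))";

    using var connection = new AdomdConnection(
        "Data Source=asazure://westeurope.asazure.windows.net/myserver;Initial Catalog=SalesModel");
    connection.Open();

    using var command = new AdomdCommand(dax, connection);
    using var reader = command.ExecuteReader();
    return reader.Read() ? reader.GetValue(0) : null;
}

Console.WriteLine(EvaluateMeasure("Total Sales", 2019));
```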


The theme of this year’s British Science Week (6 – 15 March 2020) is “Our Diverse Planet”. We’ll be getting involved by speaking to school children about the work we’ve been doing with Oxfordshire-based OceanMind (part of the Microsoft AI for Good programme) to help them combat illegal fishing, hopefully inspiring some of the next generation of data scientists!


The Azure CNAB Quickstart Templates we’ve created are only half the story. Much of the work we’ve done over the last few months involved the authoring, contribution and DevOps pipelines required to support an open source project. The project is inspired by the original Azure Quickstart Templates – which over the last 5 years has grown to over 850 templates. In this post we’re going to explain how you can author CNAB templates and contribute them.


Setting up Porter on Windows

by Mike Larah

Porter is a tool based on the CNAB (Cloud Native Application Bundle) spec. It can be used for building, managing, and installing application bundles. This guide will walk you through how to get set up with Porter on Windows.

