CHunky Universe of Vigourous Astonishing SHarepoint :)

Tag Archives: azure

Using secrets in Logic Apps in a secure way

This is a guide for how to handle secrets in a logic app in a secure way. It combines three resources:

First, enable a Managed Identity for your Logic App:

In the Key Vault, add a new Access Policy for the new Managed Identity (from the previous step). Apply least privilege; in my case, GET for secrets is enough.

Next, add an HTTP action that fetches the secret from the Key Vault.

The values should be:
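The values amount to a GET against the Key Vault secrets REST endpoint, authenticated with the managed identity. In code view, the HTTP action might look roughly like this (the vault and secret names are placeholders):

```json
{
  "type": "Http",
  "inputs": {
    "method": "GET",
    "uri": "https://<your-vault>.vault.azure.net/secrets/<your-secret>?api-version=7.0",
    "authentication": {
      "type": "ManagedServiceIdentity",
      "audience": "https://vault.azure.net"
    }
  }
}
```

The important detail is the audience, which must be https://vault.azure.net for Key Vault.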

Next, open the Settings of the “Get Client Secret” action and tick the “Secure Outputs (Preview)”

To get the secret, we need to parse the HTTP response; only the value property is needed.
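In code terms, the parsing step just picks the value property out of the Key Vault response; a minimal sketch (the payload below is a shortened example, not a real response):

```python
import json

# Example Key Vault secret response body (shortened); a real response
# also carries an "attributes" object with timestamps and flags.
response_body = '{"value": "s3cr3t-client-secret", "id": "https://myvault.vault.azure.net/secrets/ClientSecret/abc123"}'

# Only the "value" property is needed by the following actions.
secret = json.loads(response_body)["value"]
print(secret)
```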

Now let’s call the Graph API and authenticate using this secret:
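Under the hood, authenticating with a client secret is a client credentials token request against Azure AD. A hedged Python sketch of the form body such a request carries (tenant and app IDs are placeholders, not values from the post):

```python
from urllib.parse import urlencode

# Placeholders - replace with your tenant and app registration values.
tenant_id = "<tenant-id>"
token_url = "https://login.microsoftonline.com/%s/oauth2/v2.0/token" % tenant_id

# Form body for the client credentials grant; client_secret is the
# value fetched from Key Vault in the previous step.
body = urlencode({
    "grant_type": "client_credentials",
    "client_id": "<app-client-id>",
    "client_secret": "<secret-from-key-vault>",
    "scope": "https://graph.microsoft.com/.default",
})
print(body)
```

In the Logic App itself, the HTTP action's Active Directory OAuth authentication fills in this request for you.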

In the run history we can now see that the password is not shown anymore.

Nor is it visible in the next HTTP call:

Note that the run history is kept for a while; if you have used secrets in plain text before, it is good practice to change them.

Azure Key Vault vs. Pipeline Variables

Using Azure Key Vault in a pipeline is cool, but it is less secure than pipeline secret variables.

The Key Vault setup

Have you tried the Key Vault Step in an Azure DevOps Pipeline? If you haven’t, please follow these awesome guides:

The steps described in these guides are easy, but that effort made me think about the first pair of pros and cons.

A pipeline variable is faster to configure

A variable in a pipeline takes almost no time to set up. Also, a secret variable remains a secret, since no one can read it in plain text. Configuring the Key Vault way of getting secrets requires admin time. Unless you have admin rights in your Azure Active Directory and your Azure subscription, you might need to request and argue for one or more of the following:

  • A service principal (an App registration) with a secret.
  • An Azure Key Vault (and maybe a resource group) with an Access Policy for the service principal
  • A service connection in your Azure DevOps Project

Of course, most of it is a one-time job. But still, in many organizations it will require good preparation. The pros of Azure Key Vault secrets in a pipeline are that:

  • Admins can manage the secrets centrally from Azure
  • It is easier to audit the Key Vault Access.
  • Set it up once and let Azure DevOps people use and reuse it in many pipelines. (You still need to set up a new Service Connection in every Azure DevOps Project, though.)

The fact that it is easier to reuse brings me to my second pair of pros and cons.

A pipeline secret variable is more secure

Let’s say you need a password to a service account that will upload something important, e.g. an account that will upload a new SPFx package to the SharePoint App Catalog.

Doing it the pipeline variable way means that it remains as a secret on that particular release pipeline. Only release administrators of that project can alter the pipeline steps. No one else.

Doing it the Key Vault way means that you must watch every part of the chain:

  • Users who have access to the Key Vault in Azure.
  • The service principal that can read the secrets through an access policy. Who has access to its secret?
  • The service connection in an Azure DevOps Project. Who can use this service connection to add and modify release pipelines? By default, all Release Administrators can. To make this more secure, you need to limit the number of Release Administrators, but that means less flexibility in a team and more admin effort for the remaining Release Administrators.

Also, the service principal used for getting the secrets and in the service connection should not be reused across projects in Azure DevOps. Dedicated service principals make this more secure, because misuse can be discovered and stopped more easily, and at the level of a single project rather than all service connections.


In small, flat organizations, using a Key Vault for secrets in Azure DevOps Pipelines is great and saves you time. But it is less secure, and appropriate security requires additional time and effort.

Trust gulp-connect certificate from Visual Studio Online on Mac OS

I have read and followed this awesome post:

Getting SPFx working in Visual Studio Online by SPDavid.

I followed that guide and tried it out. It worked well, though I spent some time googling (binging) around to get rid of the SSL warnings for the remote “localhost” on my Mac.

I would like to share this simple instruction on how to trust a self-signed certificate from gulp-connect on Mac OS. The twist is that the certificate is on the remote Linux machine (in the Visual Studio Online environment) that you are connected to through the Visual Studio Code extension.

The first step (after you have connected and set up a project) is to download the certificate. It can be found in the following directory:


Choose a folder (like Desktop or whatever) to save it to. Then double click server.crt to open it in the Keychain Access.

In the Keychain Access, locate the certificate, it will have the name “gulp-connect”. Open it and enter the “Trust” section. Under “When using this certificate” select “Always Trust”.

Keychain Access – certificate – Trust – Always Trust

After that you might need to restart the browser. But then it should stop warning you.

This certificate is trusted for this account

Tips and Tricks for Azure Functions

These are my favourite tips and tricks. I have only included those that my colleagues and I have tried out ourselves.

Architecture tips

Keep it slim

Functions should do one thing, and they should do it well. When you develop in C# and Visual Studio, it is tempting to build a “microservice” in a good way: you add interfaces, implement good patterns, and all of a sudden you get a monolith packaged as a microservice. If your function grows: stop, rethink. See whether input and output bindings can be used instead. Orchestration with Logic Apps or Durable Functions can also help.

Automated Deployment

It might be an obvious one, but it is super easy to set up CI/CD for Azure Functions in Azure DevOps. Set it up as early as possible. Don’t rely on publishing from Visual Studio.


Different environments like Production and Staging (Test, UAT, QAT, verification) and DEV are not as straightforward anymore when everything is reactive and micro. But it is good to have at least two setups: one for Production and one for Staging. Separating the storage accounts, in particular, has proven to be a success story. You can have the same queue name but different connections, which makes deploying to Staging and Production easier. The functions in the different “environments” will read and write a queue with the same name, but in different storage accounts.
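Concretely, a binding can keep the same queue name while its connection points to an app setting whose value differs per environment. A sketch of a queue-triggered function.json (the names here are my own placeholders):

```json
{
  "bindings": [
    {
      "type": "queueTrigger",
      "direction": "in",
      "name": "shipmentItem",
      "queueName": "shipments",
      "connection": "ShipmentsStorage"
    }
  ]
}
```

The ShipmentsStorage app setting then holds the Staging storage connection string in the Staging function app and the Production one in Production.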

I also find it convenient to have a suffix in the Azure Function names, like collect-shipments-staging and collect-shipments-production.

If it is possible, use separate resource groups for the “environments”.

Tips for performance

One instance at a time

Use host.json to prevent parallelization

  {
    "queues": {
      "batchSize": 1,
      "newBatchThreshold": 0
    }
  }

Add messages to a queue as output

Instead of adding queue messages in code, define a queue as an output binding. You can even add multiple messages. This saves you from instantiating CloudStorageAccount, which is good for performance.
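As a sketch, in a function.json-based function the output could be declared like this (names are my own placeholders); in C#, a Queue attribute on the return value or an ICollector parameter achieves the same:

```json
{
  "bindings": [
    {
      "type": "queue",
      "direction": "out",
      "name": "outputQueueItem",
      "queueName": "shipments",
      "connection": "ShipmentsStorage"
    }
  ]
}
```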

Take Last Run into account

Just check the timer parameter: timer.ScheduleStatus.Last holds the time when your Azure Function last ran.

Reuse HttpClient

This tip is from CloudBurst in Malmö in September 2019. Even though your function runs on a consumption plan, chances are good that your code will run on the same server, which means that you can reuse some resources, like HttpClient.
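The C# advice is a static HttpClient; as a Python analog (my own sketch, not from the original post), you can cache one connection object per host at module level so that warm invocations on the same instance reuse it:

```python
import functools
import http.client

# Cache one connection per host at module scope, so warm invocations
# reuse it instead of recreating it on every run.
@functools.lru_cache(maxsize=None)
def get_connection(host: str) -> http.client.HTTPSConnection:
    # The constructor does not open a socket yet; the connection is
    # established lazily on the first request.
    return http.client.HTTPSConnection(host, timeout=10)
```

Calling get_connection twice with the same host returns the same object, which is the whole point of the pattern.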

Simple Build for dotnet new react

I created a sample ASP.NET Core app with React. 

dotnet new react



Then it took a couple of hours to get the build to work. Here is my working azure-pipelines.yml:
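As a hedged sketch (not the original file), a minimal azure-pipelines.yml for a dotnet new react app could look roughly like this; the task versions and Node version are assumptions:

```yaml
trigger:
  - master

pool:
  vmImage: 'ubuntu-latest'

steps:
  # The React client is built by the SPA targets during publish,
  # so Node.js must be available on the agent.
  - task: NodeTool@0
    inputs:
      versionSpec: '10.x'

  - script: dotnet publish -c Release -o $(Build.ArtifactStagingDirectory)
    displayName: dotnet publish

  - task: PublishBuildArtifacts@1
    inputs:
      pathToPublish: $(Build.ArtifactStagingDirectory)
```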



S01E01 IoT: Posting Temperature from Raspberry Pi to Azure

Recently I have been looking more at IoT and Raspberry Pi in my spare time, and I want to share my experience in a series of posts. This post is about measuring temperature, humidity and pressure with a Raspberry Pi 2 Model B and a Sense HAT, and posting this data to Azure Table Storage.

I followed this tutorial for connecting to Azure with Python and these instructions for reading data from the Sense HAT.

The Python script is on GitHub. Along the way I learned that only Python 2.x could be used with the Azure SDK, and that table names cannot contain underscores (I got a Bad Request error when I tried to create a table with the name “climate_data”). But overall, the process was straightforward. The temperature reading is not correct, maybe because the sensor sits in between the Raspberry Pi and the Sense HAT, where it gets warm. But it is just a proof of concept.
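The naming rule I ran into is documented: table names must be alphanumeric, start with a letter, and be 3 to 63 characters long, so an underscore is rejected. A quick check (this helper is my own, not part of the original script):

```python
import re

# Azure Table names: alphanumeric only, must start with a letter,
# 3-63 characters long - so "climate_data" triggers a Bad Request.
TABLE_NAME_RE = re.compile(r"^[A-Za-z][A-Za-z0-9]{2,62}$")

def is_valid_table_name(name: str) -> bool:
    return TABLE_NAME_RE.match(name) is not None
```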

I have used Visual Studio 2015 to see the data in Azure Table Storage. For that I needed to install Azure SDK 2.7. There are many other “explorers” for Azure Storage.


Other resources

Accessing Azure from Linux and Mac

Improvement #1 Corrected Temperature

I found a formula for calculating a more correct temperature on the Raspberry Pi forum.

Ta = 0.0071*Tm*Tm+0.86*Tm-10.0
Tm = measured with the temp+humidity sensor
Ta = ambient temperature
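The formula is easy to drop into the script as a helper (the function name is mine):

```python
def ambient_temperature(tm: float) -> float:
    """Return the ambient temperature Ta from the Sense HAT
    reading Tm, using the correction formula from the forum."""
    return 0.0071 * tm * tm + 0.86 * tm - 10.0
```

For example, a measured 30 degrees corresponds to roughly 22.2 degrees ambient.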

I also added a notification when data is sent by showing an “S” on the Sense HAT.
