CHunky Universe of Vigourous Astonishing SHarepoint :)

Category Archives: Office 365

DIY: Integrating Trådfri lights with Teams presence

It seems that Work from Home (WFH) is here to stay, and that’s okay. I’d say Work from a Smart Home is even more okay. To me, Home Automation (HA) and Work from Home (WFH) are really two peas in a pod.

Today’s “guest” is a tiny application that I’ve set up on my Raspberry Pi to listen to my presence (status) in Teams and show it with the colors of my smart RGB light (IKEA Trådfri).

The code

You can find the application on github:

I’ll try to provide the needed documentation on the github repo and focus more on the story part in this blog post.

The story

I stumbled upon Elio Struyf’s blog post and I was amazed:

Wow! I thought immediately: that would be a cool challenge, I wanted to set up this, too. Although, with some adjustments for my smart home:

  • I wanted to run the whole application on one Raspberry Pi only, because I don’t have a second one, nor do I have a HomeBridge installation (maybe something for future projects, though).
  • I wanted to have as little code as possible; maintenance should be kept to a minimum.
  • I wanted to use Python, in order to learn more Python and because Python seems to be the most supported language on the Pi.
  • I wanted to use the IKEA Trådfri lights (with a gateway and a remote) that I have already invested in.

I omit the configuration steps for the Trådfri lights and the Raspberry Pi; you can see them in my previous blog post:

Why show Teams presence with a smart light

Elio wrote his blog post in April this year – in the times of the lockdown in Belgium. In Sweden, we didn’t have a real lockdown, but times may come when my children need to be at home more while I work. In that case, a super clear system that shows when I have important meetings is just awesome. Maybe with that I am prepared for such times.

But to be really honest, the main driving factor is the fact that it is very satisfying to tinker around with this DIY stuff 😜😎

Lessons learned

There is a Python wrapper for MS Graph, which is awesome, but it needs more contributors:

In your Azure AD App Registration you can specify the auth flow type as public; with that, you don’t need to store a client secret for delegated access. That was kind of a new thing to me.

The Presence endpoint in MS Graph is in beta, so make sure you call the beta endpoint. The scope is ‘’ and you need admin consent for that permission grant.
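For illustration, here is a minimal sketch of what that could look like in Python: a public client with a device code flow (no client secret), then a call to the beta presence endpoint. This is my own sketch, not the repo’s code; the client id is a placeholder, and I am assuming the Presence.Read scope (the exact scope got lost in the quote above) and the msal and requests packages.

# a minimal sketch, not the actual repo code
# assumptions: pip install msal requests; CLIENT_ID is a placeholder
import msal
import requests

CLIENT_ID = "00000000-0000-0000-0000-000000000000"  # your public client app registration
app = msal.PublicClientApplication(CLIENT_ID, authority="https://login.microsoftonline.com/common")

# device code flow: no client secret needed for a public client
flow = app.initiate_device_flow(scopes=["Presence.Read"])  # assuming the Presence.Read scope
print(flow["message"])  # prints the sign-in instruction with the device code
token = app.acquire_token_by_device_flow(flow)  # blocks until you sign in

# presence is in beta, so call the beta endpoint
response = requests.get(
    "https://graph.microsoft.com/beta/me/presence",
    headers={"Authorization": "Bearer " + token["access_token"]})
print(response.json())  # e.g. availability: Busy, activity: InACall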

Flashing Trådfri lights on Azure Alerts

What if you put together Work From Home and Home Automation? Well, removing the common denominator (HOME) would mean Work Automation (sic!). I want to tell you about a tiny hobby project I have had at home, still related to work of mine: Whenever an Azure alert is triggered, my Trådfri smart light from IKEA flashes for a couple of seconds.

Summary (if you want to skip the long story below): The solution is a tiny web application. The publicly accessible url, exposed using ngrok, is registered as a webhook in an Azure Alert. It’s on Github, you’re welcome to use it as you please 😎:

How I did this (the long story)

The github repo (linked above) is self-explaining, but here comes the story. I used the same setup for Azure Alerts as described in my previous blog post:

When I was done setting up an alert, I thought: besides a notification in a Teams channel, what if I could show the alert visually, using some LED or similar? Then I thought about Home Automation and the Trådfri RGB bulb I’ve got. That’s the beauty of the mentioned equation: Work From Home and Home Automation. We can pick the best parts and combine them into something unique.

Since I have a kit from IKEA containing a gateway, a remote, and an RGB lamp, I wanted to do something with that. Unfortunately I didn’t find any routines (Google Home), applets (IFTTT) or automations (Home app in iOS) that could do it.

Luckily, there is a way of controlling the Trådfri lights, best described in this tutorial:

As in that tutorial, I also used a Raspberry Pi Zero W, and it went very well, except for one thing: the Trådfri team introduced a change to the security code, so I needed an additional step that was missing from the tutorial; more on that later.

The flow from an Azure Alert to the flashing light.

The tutorial says: the world is your lobster. My “lobster” is a webhook that makes lights flash on an alert, so I needed a simple web server (http.server) and a tunnel to my network (ngrok). It was best to take one step at a time.

Step 1. Connect

First, I wanted to make sure I could have a simple web server that could host my webhook. I followed the advice from that tutorial and used the http.server Python module:

I didn’t need to install any additional modules; this is already included in Raspberry Pi OS. Just create a simple file like this:

from http.server import BaseHTTPRequestHandler, HTTPServer

host_name = ''  # an empty host name binds to all interfaces
host_port = 8000

class MyServer(BaseHTTPRequestHandler):
    def do_HEAD(self):
        self.send_response(200)
        self.send_header('Content-Type', 'text/html')
        self.end_headers()

    def do_GET(self):
        self.do_HEAD()
        self.wfile.write('hej'.encode('utf-8'))  # the greeting shown in the browser

if __name__ == '__main__':
    http_server = HTTPServer((host_name, host_port), MyServer)
    print('Server Starts - %s:%s' % (host_name, host_port))
    try:
        http_server.serve_forever()
    except KeyboardInterrupt:
        http_server.server_close()

Start it:
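Assuming the file was saved as server.py (the filename is my placeholder, the original was lost in publishing):

python3 server.py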


I opened that page in the browser and I could see “hej”. Time to go further.

Step 2. Connect World

Next step was to open up this “web app” for the world, to make it accessible from outside my local network. ngrok is the best solution for that. I followed that guide to install ngrok on my Raspberry Pi Zero W.

The installation process was pretty straightforward. For the record, I tried to install ngrok as a snap first; it did not work.

cd ~
tar -xvzf ngrok-stable-linux-arm.tgz

I also fetched the authtoken and registered it locally.
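If memory serves, that is done with the authtoken command (the token value below is a placeholder):

./ngrok authtoken <your-authtoken>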

Then I started the ngrok tunnel:
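Presumably pointing it at the web server from step 1, which listens on port 8000:

./ngrok http 8000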

And my web app went online:

Step 3. Harness the lights

Now to the core of this hobby solution: controlling Trådfri lights.

I installed, configured and built the libcoap client, as described in the blog post I already mentioned:

But I also installed git, because my Raspberry Pi OS installation didn’t have it.

sudo apt-get install build-essential autoconf automake libtool git -y
git clone --recursive
cd libcoap
git checkout dtls
git submodule update --init --recursive
./configure --disable-documentation --disable-shared
sudo make install

Next, I found the IP Address and the security code of the IKEA Trådfri Gateway, using my router:

Then I created a new preshared key (that’s the news I mentioned above). With just the security code, you will get 4.01 “Unauthorized” when you try to control the lights, as described:

# -k = Security Code, that you can find on the back of the gateway
# 9090: xxx, your new client identity, you decide, in my case TOLLERASP0
# coaps: the ip address is the one of your gateway
coap-client -m post -u "Client_identity" -k "OHsfKxV0UaJu81" -e '{"9090":"TOLLERASP0"}' "coaps://"

I got the pre-shared key that I saved for later use:
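The response contains the new key in the 9091 property, roughly like this (my illustration, the value here is fake):

{"9091":"xxxxxxxxxxxxxxxx"}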

With this information you can harness the IKEA lights:

# off
coap-client -m put -u "TOLLERASP0" -k "{presharedkey}" -e '{ "3311": [{ "5850": 0 }] }' "coaps://"
# on
coap-client -m put -u "TOLLERASP0" -k "{presharedkey}" -e '{ "3311": [{ "5850": 1 }] }' "coaps://"

5850:0 is off, 5850:1 is on. Easy-peasy, right?

If you want to know how to control the brightness, the colors, etc., just check this documentation (already mentioned):

Step 4. Put everything together

Once I knew I could have a simple webhook service, locally (step 1) and on the WWW (step 2), and that I could control the smart light from IKEA using code running on my Raspberry Pi, connecting everything was easy. I created a repo for that, and you can see that it is a very simple one:

The main part is the webhook handler. When it gets invoked, it calls the flash function, which uses os.system to call the libcoap client and time.sleep for the delays needed in the flash action. The configuration is parsed using configparser and the server is a simple http.server.
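As an illustration, the flash logic could look roughly like the sketch below. This is my reconstruction, not the repo’s exact code; the gateway address, bulb id, identity and key are placeholders.

# a rough sketch of the flash logic, not the exact repo code
import os
import time

# placeholders: your gateway ip, bulb id, client identity and pre-shared key
BULB = "coaps://192.168.1.2:5684/15001/65537"
IDENTITY = "TOLLERASP0"
KEY = "<presharedkey>"

def coap_put(payload):
    # call the libcoap client via os.system, as described above
    os.system('coap-client -m put -u "%s" -k "%s" -e \'%s\' "%s"' % (IDENTITY, KEY, payload, BULB))

def flash(times=3, delay=0.7):
    # toggle the bulb on and off a few times, with time.sleep for the delays
    for _ in range(times):
        coap_put('{ "3311": [{ "5850": 1 }] }')  # on
        time.sleep(delay)
        coap_put('{ "3311": [{ "5850": 0 }] }')  # off
        time.sleep(delay)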

In the end I registered the ngrok endpoint in my Azure Alert Rule Action Group:

Then I triggered my test logic app that failed reliably 🙂

After 1-2 minutes my smart light started flashing:

Success 🎯🎯🎯🎯

Words of caution and Tips


http.server does not provide the right level of security, it’s mostly for prototyping. For this tiny hobby project of mine, it’s exactly what I need. Don’t use it as-is for production.

Treat the security code and your preshared key appropriately, you don’t want to be hacked.

Flashing lights reacting to alerts is cool, but think about the work-life balance. Don’t have it in your bedroom 😎.

Inspect ngrok from another computer

By default the ngrok web inspection interface is only available from localhost. Make it available across your network by configuring ngrok:

# ip address of your raspberry pi (placeholder below), 4040 is the inspection interface
echo "web_addr: <raspberry-pi-ip>:4040" >> ~/.ngrok2/ngrok.yml

Reserve your local ip addresses

The router can assign new ip addresses to your devices. Reserve the ip addresses of your raspberry pi and your IKEA Trådfri Gateway. It will make your life easier.

Start ngrok closer to you and in the background

The EU region is closer to me, and running in the background is nice when you only have one terminal:

# -region eu
# >/dev/null & for running in the background
~/ngrok http 8000 -region eu > /dev/null &

Replay ngrok calls

This is a game changer: rather than waiting for an alert to be triggered, you can just Replay a request over and over again while you mickle-muckle your python code locally.

Keep running your server after logout

You just need to have “nohup” when you start your server; ngrok already has what’s needed. Start the server with nohup python3, and it will keep running even when you log out or your ssh connection disappears.
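With my placeholder filename from step 1, that could look like:

nohup python3 server.py &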

Next steps

I’d like to end this post by saying once again: the world is your lobster. Try out the flashing lights on Azure Alerts, or why not replace Azure Alerts with Exoprise Alarms, or some triggers in Power Automate, perhaps when a new site has popped up 🙂 Or maybe you want to elaborate on the flashing behaviour: why not use Morse code to send a message? Or color-code the different types of alarms/alerts. Once again, the world is your lobster 🦞 (or oyster 🦪, well, whatever).

Automatically detect new sites in SharePoint Online

Original image by William Warby.

Sites in SharePoint are created all the time, not only for SharePoint, but also as storage for Yammer, Teams, Planner and other services in Microsoft 365. There are ways to keep track of them, but the ability to automatically detect a new site creation is quite appealing. Automatic detection means a trigger of a Power Automate (Flow) or a Logic App.

There are a few blog posts that exactly describe how you can detect when a new site is created in SharePoint Online:

The provided blog posts are great how-tos. I am not giving you a new how-to for that; rather, I’d like to reason about the solution.

The solution for automatic detection of new sites

Power Automate and Logic Apps can listen to new items in SharePoint. There is a list in the admin site that has SharePoint sites as list items; its name is DO_NOT_DELETE_SPLIST_TENANTADMIN_ALL_SITES_AGGREGATED_SITECOLLECTIONS.

That’s it, in essence: it’s just setting up a new flow with “When an item is created in SharePoint” as a trigger, and you have thousands of business scenarios you could implement. But let’s dig a little bit deeper.

One List to rule them all

Honestly, I was not aware of that list before I started looking at this. What is that list, and why is it called DO_NOT_DELETE_SPLIST_TENANTADMIN_ALL_SITES_AGGREGATED_SITECOLLECTIONS?

The name is hilarious. Why name something “DO_NOT_DELETE…”, and in all caps 🤣. But I suppose there were support cases.

Beware: that list is not documented, which means you’re on your own when Microsoft changes the name or moves the list somewhere else. So don’t build business-critical solutions with it.

From what I can see, that list keeps information about all sites (site collections) in SharePoint Online, even those that are deleted and permanently deleted (?). This might be a source for deeper troubleshooting in some scenarios. It is like an old card index in a library that you might have seen a long time ago: it is hidden nowadays, but it is still there.

Image by LisaJasminAdams from Pixabay

First, that list is in the SharePoint Admin site collection, so you need to be at least a SharePoint Administrator to access it. Okay, I’d like to know what more is in its Site Contents (_layouts/15/viewlsts.aspx):

Well, the UI of that page has not been a priority, but never mind, the lists are there. You cannot, however, navigate to that list in the browser directly:

It doesn’t matter, since we can use it not only as a trigger but also with the SharePoint REST API to get the items, e.g.: /_api/web/lists/GetByTitle('DO_NOT_DELETE_SPLIST_TENANTADMIN_AGGREGATED_SITECOLLECTIONS')/Items

You can see more examples of listing the sites in the linked posts. Unfortunately, I bumped into an issue when trying to filter the results. If that list contains more than 5000 items (and it will, sooner or later), you’ll have to deal with the List View Threshold.

If you filter on Modified, you won’t be able to get anything because of the List View Threshold, but filtering on Created will work.
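For illustration, a REST query that filters on Created could look like this (the date is made up):

/_api/web/lists/GetByTitle('DO_NOT_DELETE_SPLIST_TENANTADMIN_ALL_SITES_AGGREGATED_SITECOLLECTIONS')/items?$filter=Created ge datetime'2020-11-01T00:00:00'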

But this is a side note; this post is about automatically detecting new sites, not listing them.

Alternative solutions

Using a hidden list like the one mentioned above is a bit of a hack. I’d say it’s okay as long as it works and it serves a complementary function, e.g. notifying IT about new sites, and the work is backed up by documented and reliable alternatives:

SharePoint Online Admin

Visiting “Active Sites” in the SharePoint Online Admin center gives you all the sites; you can sort by Created and see all the new sites. You cannot set up an alert or a flow directly from there, but maybe there will be some built-in functionality for that.

Office 365 Usage Reports

You can get all the sites in a Usage Report, with their creation date, size, last activity, etc. It’s not real time, but if you’re fine with a 1-2 day delay, you can get this report, extract the new sites and do whatever you wanted to do in your original scenario/need.

SharePoint PowerShell Module

It’s worth mentioning, too, although it’s “heavy”. In a tenant with many sites, the scripts for getting all the sites and connected groups may take hours. I am referring to those scripts that start with Connect-SPOService.

Permissions, Licenses, and Security

The SharePoint connection that listens to the DO_NOT_DELETE_SPLIST_TENANTADMIN_ALL_SITES_AGGREGATED_SITECOLLECTIONS list in the Admin site collection needs to be set up with a SharePoint Administrator role account. Beware of who has access to that solution (Power Automate or Logic App); this SPO Admin connection in the wrong hands can be disastrous. Especially in Azure, pay attention to who has access to the resource, but also to the resource group and the Azure subscription.

The account that sets up a Power Automate flow obviously needs an appropriate license and Power Automate activated. In my scenario, I don’t need any premium connectors, but depending on your solution, you might need to license your account appropriately.


In “my” scenario, I want to be notified of all new sites in my business unit within a shared tenant, so that we can contact the site owners, provide guidance and also provision important parts (initial folder structure, some spfx solutions etc).

What is your scenario?

The code

When I am done developing my proof-of-concept, I’ll try to share more details on the actual implementation. It might be an idea to submit the template to the Microsoft Power Community, but I am not sure it will be accepted, given the fact that it uses undocumented and hidden parts of SharePoint Online that sooner or later will be subject to change.

Is an M365 Group a Yammer Community

Nowadays a Yammer Community gets a corresponding Microsoft 365 Group (Office 365 Group, Unified Group). In your work as an SPO Admin, you might need to differentiate “ordinary” Modern Team Sites from those that were created for a Yammer Community.

They both have GROUP#0 as the template. On the actual SPO site object, there is nothing you can use to differentiate them. Nor can you use the Office 365 Group information. But there is a way: if you connect to Exchange Online and get the group from there, there is something useful.

I’ll share a piece of code with you. Like the rest of my posts and code snippets, it is “evergreen”: it changes all the time, and maybe when you read this in the future there will be a better way, but today I am using this code:

# Prerequisites
# AllowBasic as Admin, perhaps in a separate window
# Set-ItemProperty -Path 'HKLM:\SOFTWARE\Policies\Microsoft\Windows\WinRM\Client' -Name AllowBasic -Value 1
# Connect to Exchange Online
Import-Module ExchangeOnlineManagement
Connect-ExchangeOnline
# you can get $groupId from the SPO object
$exogroup = Get-UnifiedGroup -Identity $groupId
$isYammer = $exogroup.GroupSKU -eq "Yammer"
# Bonus: determine if a Team is connected (if $isYammer is $false)
$hasTeam = "Team" -in $exogroup.ResourceProvisioningOptions

Estimated Completion in Write-Progress in PowerShell

Have you also got many sites in your tenant? Write-Progress is the bare minimum in a script that goes through all sites. But there is also another nice way to make it easier to see the progress: estimated completion time.

Although the idea comes from another blog post (My life is a message), I thought it could be worth sharing it again, especially in the cloud context.

Here is a somewhat simplified scenario: getting information for every site. The status message in Write-Progress also contains the estimated completion time.

# This is just an example of time estimation in Write-Progress,
# though in a simplified scenario
$sitesBareMinimum = Get-SPOSite -Limit All
$starttime = Get-Date
$count = 0 # kind of an index, counter
$total = $sitesBareMinimum.Count
$sites = $sitesBareMinimum | ForEach-Object {
    $site = $_
    $estimation = ""
    $now = Get-Date
    if ($count -gt 0) { # no one wants a DividedByZeroException 🙂
        $elapsed = $now - $starttime # how much time has been spent
        $average = $elapsed.TotalSeconds / $count # how many seconds per site
        $totalSecondsToGo = ($total - $count) * $average # seconds left
        $span = New-TimeSpan -Seconds $totalSecondsToGo # time left
        $estimatedCompletion = $now + $span # when it will be complete
        $estimation = $estimatedCompletion.ToString() # readable estimation
    }
    $percent = 100 * $count / $total # percentage complete
    $status = "#{0:d5} of $total. Est: $estimation. $($site.Url)" -f $count # aggregated status message
    Write-Progress -Activity "Getting information for every site" -Status $status -PercentComplete $percent
    $count++
    $siteWithMoreInfo = Get-SPOSite -Identity $site.Url # the actual time-consuming operation
    $siteWithMoreInfo # return the site with more information
}

I included the comments, so it should be straightforward to follow the logic in the script. Every iteration tries to estimate the remaining time by calculating the average time per site, multiplying it by the number of remaining sites and adding the result to the current time. The more sites are processed, the more accurate the estimation becomes.

Optimizing lookups in PowerShell

Have you had a PowerShell script that contains two big arrays and you wanted to merge the information? It can become quite slow if you need to search through all items in array B for every item from array A. The solution is called a hashtable! It might not be an advanced tip for some, but I was really glad to see a huge improvement, so I decided to share it as a post.

My array A ($sites) is a list of SharePoint sites (over 10K of them). For every site I need to get information about the owner (such as UsageLocation). In order to minimize calls to the server I want to reuse the information – in my array B: $users. This array of users also has thousands of entries.

Here is my main (simplified) setup:

$users = @() # populated elsewhere, code omitted for brevity
$sites = @() # populated elsewhere, code omitted for brevity
$sitesAndOwners = $sites | ForEach-Object {
    [PSCustomObject]@{
        Site  = $_
        Owner = GetUserInfo $_.Owner
    }
}

Traversing array B for the right item for every entry in array A is slow with Where-Object:

function GetUserInfoSlow($upn) {
    $user = $users | Where-Object { $_.UserPrincipalName -eq $upn }
    if ($user.Count -eq 0) {
        $user = Get-AzureADUser -SearchString $upn
        $users = $users + $user
    }
    return $user
}

Using a hashtable is much faster:

$usersHash = @{}
function GetUserInfoFast($upn) {
    # we check if there is an entry even if the value is null
    if ($usersHash.Contains($upn)) {
        $user = $usersHash[$upn]
    }
    else {
        $user = Get-AzureADUser -SearchString $upn
        $usersHash.Add($upn, $user)
    }
    return $user
}

In my example it took hours first. Now it takes seconds. A bonus: here is how you can convert an array to a hash table:

# bonus: convert an array to a hash table
$usersHash = @{}
$users | ForEach-Object {
    $usersHash.Add($_.UserPrincipalName, $_)
}

That’s all I’ve got today. Sharing is caring… And of course, big thanks to my colleague Anton H. for his advice.

Page Diagnostics for SharePoint

While trying to set up a new Home Site, I discovered that there is a tool (a browser extension) called Page Diagnostics for SharePoint.

After running this, I tried that command again and it was smart enough to detect the problem the tool discovered.

Also Network Trace is available.

Network trace

The Page Diagnostics Tool is definitely a tool to have in the troubleshooting toolbelt for SharePoint.

Setting up a Home Site

Here is the script:

# Sets up a SharePoint Home Site at Skanska
$tenant = "takana17"
Connect-SPOService -Url "https://$tenant-admin.sharepoint.com"
$baseUrl = "https://$tenant.sharepoint.com"
# site swap takes 1-2 minutes, be patient
Invoke-SPOSiteSwap -SourceUrl "$baseUrl/sites/futurehomesite" -TargetUrl "$baseUrl" -ArchiveUrl "$baseUrl/sites/oldroot-deleteit"
# Home Site. It may take some time
Set-SPOHomeSite -HomeSiteUrl $baseUrl

Deploying SPFx using Office 365 cli, custom AAD App and Azure Pipelines

In this post I would like to share some findings from setting up a deployment of SPFx. In my work:

  • I need to deploy SPFx solutions using Azure Pipelines
  • I need to use the least privileges/permissions
  • I cannot use Legacy Authentication

First of all, big thanks to @waldekm and the whole community of @office365cli and @m365pnp for the quick help, even outside of working hours.

Let’s take a look at the setup, piece by piece.

Least Privileges

I followed this guide to set up a custom App Registration for Office 365 CLI in order to use the least privileges:

Custom Azure AD App

For uploading and deploying SPFx packages I found these permissions to be the bare minimum:

  • Delegated Microsoft Graph User.Read
  • Delegated SharePoint AllSites.FullControl

Service Account

The second part is the service account that only has access to one site collection – the Tenant App Catalog. That, plus the delegated AllSites.FullControl of the app registration, narrows the access down to just that site. To install apps, the uploader account needs to be a Site Collection Administrator, as sketched below.

Least privileges for SPFx Upload & Deploy
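For completeness, here is a minimal sketch of granting that access with the SharePoint Online Management Shell. The admin URL, app catalog URL and account name are my placeholders, not values from the post:

# placeholders: replace with your tenant admin url, app catalog url and service account
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"
# make the uploader account a site collection admin on the app catalog site only
Set-SPOUser -Site "https://contoso.sharepoint.com/sites/appcatalog" -LoginName "spfx-uploader@contoso.com" -IsSiteCollectionAdmin $true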

Azure Pipelines

In our project we use Azure Pipelines, where we also define the release using .yml. The deployment consists of a series of inline bash scripts.

I am not going to describe all the steps for setting up node, npm and installing the Office 365 CLI. If you have already used Office 365 CLI with the default AAD app, it might look like this:

- task: Bash@3 # login
  displayName: "Login to O365 spAppCatalogSiteUrl with user $(username)"
  inputs:
    targetType: "inline"
    script: 'o365 login "${{ parameters.spAppCatalogSiteUrl }}" -t password -u $(username) -p $(password)'
- task: Bash@3 # upload
  displayName: "Upload web part ${{ parameters.spfxPackageName }} to catalog"
  inputs:
    targetType: "inline"
    script: 'o365 spo app add -p "$(Pipeline.Workspace)/${{ parameters.environment }}/${{ parameters.spfxPackageName }}" --overwrite'
- task: Bash@3 # deploy
  displayName: "Deploy ${{ parameters.spfxPackageName }} web part"
  inputs:
    targetType: "inline"
    script: 'o365 spo app deploy --name "${{ parameters.spfxPackageName }}" --appCatalogUrl "${{ parameters.spAppCatalogSiteUrl }}"'

Now comes the tricky part! If you followed the guide mentioned above, you must have noticed the two environment variables that you need to have:

export OFFICE365CLI_AADAPPID=506af689-32aa-46c8-afb5-972ebf9d218a
export OFFICE365CLI_TENANT=e8954f17-a373-4b61-b54d-45c038fe3188

That’s straightforward when you run the CLI in your own console. But the fact is (or at least from what I can see), you cannot “export” variables to other pipeline tasks.

Instead of setting the variables in the inline script, we can take advantage of the Bash task parameter called env:.

Some other findings:

  • Office 365 CLI needs them in all three commands: login, spo app add, and spo app deploy
  • If you create and export a variable in a pipeline task, it won’t persist, because every task starts a new shell session.

That means that we need to provide the environment variables in every task in the pipeline that uses Office 365 CLI with a custom Azure AD app. Or is there a better way? Anyway, the version below (the same tasks plus env:) will work:

- task: Bash@3 # login
  displayName: "Login to O365 spAppCatalogSiteUrl with user $(username)"
  inputs:
    targetType: "inline"
    script: 'o365 login "${{ parameters.spAppCatalogSiteUrl }}" -t password -u $(username) -p $(password)'
  env:
    OFFICE365CLI_AADAPPID: "${{ parameters.o365cliAppId }}"
    OFFICE365CLI_TENANT: "${{ parameters.tenantId }}"
- task: Bash@3 # upload
  displayName: "Upload web part ${{ parameters.spfxPackageName }} to catalog"
  inputs:
    targetType: "inline"
    script: 'o365 spo app add -p "$(Pipeline.Workspace)/${{ parameters.environment }}/${{ parameters.spfxPackageName }}" --overwrite'
  env:
    OFFICE365CLI_AADAPPID: "${{ parameters.o365cliAppId }}"
    OFFICE365CLI_TENANT: "${{ parameters.tenantId }}"
- task: Bash@3 # deploy
  displayName: "Deploy ${{ parameters.spfxPackageName }} web part"
  inputs:
    targetType: "inline"
    script: 'o365 spo app deploy --name "${{ parameters.spfxPackageName }}" --appCatalogUrl "${{ parameters.spAppCatalogSiteUrl }}"'
  env:
    OFFICE365CLI_AADAPPID: "${{ parameters.o365cliAppId }}"
    OFFICE365CLI_TENANT: "${{ parameters.tenantId }}"

Eliminating Legacy Authentication

My goal is to remove the need for Legacy Authentication. Previously we installed SPFx packages using PnP PowerShell. PnP PowerShell in pipelines relies on Legacy Authentication; it can be solved, though:

Using Office 365 CLI rather than PnP PowerShell with a certificate has some significant benefits:

  • Office 365 CLI is multi-platform, so you can reuse the scripts. PnP PowerShell requires Windows (for now, but still).
  • Setting up certificates and using them in the deployment process is a bigger initial task.

Release Pipelines

Just for completeness, in a classic release pipeline, you can use a bash script to upload and deploy an app:

# runs in an Ubuntu 20.04 Bash task
sudo npm install -g @pnp/office365-cli
o365 login --authType password --userName $(AppCatalogUsername) --password "$(AppCatalogPassword)"
export filePath="$(System.DefaultWorkingDirectory)/dist/$(env)/$(fileName)"
o365 spo app add -p "$filePath" --overwrite
o365 spo app deploy --name "$(fileName)" --appCatalogUrl "$(AppCatalogSiteUrl)"

In our example we also send data to Azure CDN using Azure CLI:

az storage blob upload-batch \
  --source $(sourceFolder)/bundledFiles \
  --destination $(storageContainer)/$(toolPath) \
  --account-name $(storageAccount)

Power Automate for a one-time operations

Honestly, Power Automate is great for automating repetitive stuff. But I think there is room for one-time flows as well. I’ll give you an example.

I’ve got an Excel file with quite a few rows, and I need to convert it to a SharePoint list. I know there are a couple of options, such as Quick Edit in the classic view, or importing an Excel file as a list (it also requires the classic view), and there will be Excel import in Modern as well. But I also need to change the column names and maybe adjust something on the go.

If you had asked me to do the same thing one year ago, I would have created a script (PowerShell or JavaScript), loaded the rows and created all the list items.

But today, I find it much faster to set up a Power Automate flow (no worries, there is still a need for “real” scripts and applications).

So my spreadsheet has two columns.

I create a new SharePoint List and adjust the columns to my needs.

After that I set up a very simple flow.

I could have loaded that Excel file, but I just pasted the rows directly into the flow. Hey, I will only run this once!

A positive side effect is that I also get a verification of the user accounts (my second column).

Since it runs in an “Apply to each”, it keeps working even if specific rows fail.
