CHunky Universe of Vigourous Astonishing SHarepoint :)

A PowerShell script for activating an eligible role assignment in Azure AD

Recently my role assignments in Azure AD were switched from permanent to eligible ones. This is part of PIM (Privileged Identity Management); you can read more about it on MS Docs:

To activate your eligible assignment you can use the Azure Portal, the Graph API, or PowerShell. Activation in the portal and through the Graph API is described on MS Docs:

My roles within Privileged Identity Management in Azure Portal

I created a simple PowerShell script for activating my eligible roles more quickly when I need to. There are two variants of this script:

  • a generic one, that can be run by anyone
  • a “shortcut” version that can be created for a specific account, a specific role, to make it even quicker.

A generic version

This version fetches the assignments you have, the tenant id (resourceId) and your account id (objectId, subjectId), and then it activates your desired role. Some parts can be made even more generic, but the key thing is that you can adjust it and run it for any account.

# I use SPO Admin a lot, change it to your desired role
$roleToActivate = "SharePoint Administrator"
# default 2 hours, update it to your needs
$hours = 2
$reason = Read-Host "Justify your elevation"
$connection = Connect-AzureAD
$account = $connection.Account
$tenantId = $connection.TenantId
$user = Get-AzureADUser -SearchString $account
$objectId = $user.ObjectId
$roleDefs = Get-AzureADMSPrivilegedRoleDefinition -ProviderId aadRoles -ResourceId $tenantId
$roleDefinition = $roleDefs | Where-Object { $_.DisplayName -eq $roleToActivate }
$roleDefinitionId = $roleDefinition.Id
$filter = "(subjectId eq '$objectId') and (roleDefinitionId eq '$roleDefinitionId')"
$assignment = Get-AzureADMSPrivilegedRoleAssignment -ProviderId "aadRoles" -ResourceId $tenantId -Filter $filter
if (!$assignment) {
    Write-Error "There is no assignment for you as $roleToActivate"
} elseif ($assignment.AssignmentState -eq "Active") {
    "Your role assignment as a $roleToActivate is already Active"
} else {
    $schedule = New-Object Microsoft.Open.MSGraph.Model.AzureADMSPrivilegedSchedule
    $schedule.Type = "Once"
    $now = (Get-Date).ToUniversalTime()
    $schedule.StartDateTime = $now.ToString("yyyy-MM-ddTHH:mm:ss.fffZ")
    $schedule.EndDateTime = $now.AddHours($hours).ToString("yyyy-MM-ddTHH:mm:ss.fffZ")
    Open-AzureADMSPrivilegedRoleAssignmentRequest `
        -ProviderId 'aadRoles' `
        -ResourceId $tenantId `
        -RoleDefinitionId $roleDefinitionId `
        -SubjectId $objectId `
        -Type 'UserAdd' `
        -AssignmentState 'Active' `
        -Schedule $schedule -Reason $reason
    "Your assignment as $roleToActivate is now active"
}

Shortcut version

This version assumes that you already know all the ids, by running the generic version or by looking it up in Azure. When you know those ids, you can skip many calls to Azure AD, which makes activation quicker and you can start working on your task rather than surfing around to activate your role in Azure.

# find your guids once and fill in the values
$values = [PSCustomObject]@{
    Reason           = "Support"
    Hours            = 2
    ResourceId       = "f7aa13e9-c03a-49f9-8fd4-c943d2612301"
    SubjectId        = "cafc35f9-bf31-489a-b468-76580f780506"
    RoleDefinitionId = "9039a352-599b-4e09-8693-4a17eb83a73e"
}
$schedule = New-Object Microsoft.Open.MSGraph.Model.AzureADMSPrivilegedSchedule
$schedule.Type = "Once"
$now = (Get-Date).ToUniversalTime()
$schedule.StartDateTime = $now.ToString("yyyy-MM-ddTHH:mm:ss.fffZ")
$schedule.EndDateTime = $now.AddHours($values.Hours).ToString("yyyy-MM-ddTHH:mm:ss.fffZ")
Open-AzureADMSPrivilegedRoleAssignmentRequest `
    -ProviderId 'aadRoles' `
    -ResourceId $values.ResourceId `
    -RoleDefinitionId $values.RoleDefinitionId `
    -SubjectId $values.SubjectId `
    -Type 'UserAdd' `
    -AssignmentState 'Active' `
    -Schedule $schedule `
    -Reason $values.Reason


Save it as a script and run it when you need it. Much quicker. One important note, though: be aware that it can still take time to fully activate (propagate) your role, especially SharePoint Administrator, often a couple of minutes. But instead of clicking around, run the script and go grab a cup of coffee; when you're back, you are good to go.

Security Note. Automating role activations is not less secure. You still have to log in to Azure AD using MFA (I hope you have it) even when you run the script.

Sites.Selected and Governance

The new permission in Graph API, Sites.Selected, is a step in the right direction. For a long time we have been looking for ways to scope access in line with the least-privilege principle. It was either nothing or everything. I have tried out the new Sites.Selected permission, and here are my findings.

First of all, if you haven’t heard about Sites.Selected, please visit these pages to find out more. I am skipping the introduction, since there are already good resources on that out there.

List of resources

How to grant permissions

Once you have your Azure AD App and the admin consent for Graph Sites.Selected, all you need is the Azure AD Application Id and Site Collection Administrator rights on a particular site. The simplest way is to use PnP.PowerShell:

Grant-PnPAzureADAppSitePermission -AppId $appId -DisplayName 'MyTest' -Site $url -Permissions Write

How to see the granted permissions

The only way to see the application permissions is PowerShell or Graph; there is no indication on the site itself.
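If you prefer the Graph route, here is a minimal Python sketch, assuming you already have an access token with sufficient privileges to read site permissions; the `site_id`, the token, and the helper names are placeholders of mine:

```python
import json
import urllib.request

GRAPH = "https://graph.microsoft.com/v1.0"

def permissions_url(site_id):
    # Graph endpoint for listing the permissions granted on a site
    return f"{GRAPH}/sites/{site_id}/permissions"

def list_site_permissions(site_id, access_token):
    # Note: reading these grants requires a privileged token;
    # an app that only has Sites.Selected cannot enumerate its own grants.
    req = urllib.request.Request(
        permissions_url(site_id),
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]
```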


What about governance

A site collection administrator can grant Read or Write permissions on a site. It gives the desired granularity for application access. On the other hand, there is no way (as of writing) to get all the sites that an Azure AD Application has permissions to.

Which leads me to the biggest weakness of today's implementation. Of course, we can traverse all the sites using PowerShell and get a summary of all application permissions. The problem is that this can be time consuming in a bigger tenant where you have plenty of sites. Also, it requires that the account running the script is a Site Collection Administrator on every site, which is the complete opposite of the granularity goal that the Sites.Selected permission tries to achieve.

With that you might end up with several applications that have Write permissions to many sites, without any clue whether they are used or not, who has access to those applications, and whether they need it.

My wish is that:

  • There will be an API (Graph) or Azure CLI (or similar) that can list all the sites that an application with Sites.Selected has access to, without me being a Site Collection Admin on every site.
  • There will be transparency in the user interface, so that users and site owners can see which applications can read and write content on their sites, the same way as we can see the members of a site.

Smarter way of loading SharePoint CSOM dll in PowerShell

Have you also got a legacy PowerShell script that loads SharePoint dlls and runs CSOM code directly? It's quite easy to convert to PnP PowerShell. But if you are out of time and just need to execute the script, I have a quick tip for you.

First of all, a CSOM script can be recognized by Add-Type commands (or Import-Module) plus the SharePoint dll paths.

Loading the dll the old way.

The odds are high that you don’t have those directories and files, unless you run it on a SharePoint Server (who would do that at all?) or you have installed the SharePoint SDK.

The SharePoint SDK can be downloaded and installed (as suggested here), but why would you want to do that? An easier way is to locate the files that are distributed with the PnP.PowerShell module. Let me show you how.

All the dlls are available from the PnP.PowerShell module directory:

So the only thing you need to do is to re-point the path from the original (the “GAC”) folder to the PnP.PowerShell folder. You don’t need to guess the folder. It’s easy.

Thanks to the PowerTip: Find the Path to a PowerShell Module (Scripting Guy), I found a way to read this information dynamically, so it doesn't matter where your folder actually is. The module's version number, the OS you run on, and whether you installed it for your user account only or for all users on your computer: all of that affects the folder location. So we need to read the right path and then use it in the Add-Type command.

$pnpModule = Get-Module -ListAvailable PnP.PowerShell
$base = $pnpModule.ModuleBase
Connect-PnPOnline $url -Interactive
Add-Type -Path "$base\Framework\Microsoft.SharePoint.Client.dll"
Add-Type -Path "$base\Framework\Microsoft.SharePoint.Client.Runtime.dll"

Other notes

PnP.PowerShell is built on top of .NET Core and it works cross-platform; that's better.

Loading dlls on a Mac.

If your legacy script does not work with the newer PnP.PowerShell, you might need to install the older PnP PowerShell and adjust the module name in the script above accordingly.

The SharePoint SDK is built on top of .NET Framework (as far as I understand) and can only be installed on a Windows machine.

The SharePoint SDK requires local administrator rights to be installed. PnP.PowerShell can be installed for a user without being an administrator by adding -Scope CurrentUser to Install-Module, which makes the work much smoother.

If you have two or more versions of the PnP.PowerShell module installed, you have to adjust the script a little by loading only the latest version of the module:

$pnpModule = Get-Module PnP.PowerShell -ListAvailable | Sort-Object Version -Descending | Select-Object -First 1

That was a quick tip on how you can use the types from the original CSOM libraries when you don't have time to convert a script to PnP code, or if there is some functionality that is not covered in PnP yet (I'm not quite sure there is anything you can do with CSOM that you cannot do with PnP).

The good sides of that approach:

  • it can be a step towards rewriting a legacy script to a newer PnP.PowerShell
  • the dlls are up-to-date thanks to an easy way to update the PowerShell Module (Update-Module)
  • it is cross-platform, meaning you can execute your legacy script on Linux or on a Mac as well, good for automation!

A cost effective way of running legacy scripts in the cloud

Have you also got some huge old scripts that run on a local server? Have you also considered moving them to the cloud? Here is an idea for how to do it quickly and easily.

In my case I have some older powershell scripts that are harder to convert to serverless applications:

  • They use the MSOnline module in PowerShell, hence they require rewriting to AzureAD before they can be used in an Azure Function
  • They take around 15 minutes to complete; the Azure Functions Consumption Plan is limited to 10 minutes. Of course I could split them into several parts, but I am looking for an easy way right now; I have to postpone refactoring because I am not sure there is a real need for this script solution.
  • They process a lot of data and consume more than 400 MB of memory, which makes them crash when I put them in an Azure Automation Runbook.

Well, maybe a Windows Server VM in Azure is the only way? While recently setting up a Minecraft server, following a blog post that proposes auto-shutdown and logic apps to start the server, I came up with the idea of using exactly the same approach to make it as cost effective as possible.

The script solution I've got needs 15 minutes to complete. It runs every night. For the remaining 23 hours and 45 minutes a day, the VM is not needed at all, so I can stop it. Here is what I've tried and got working:

  1. A logic app that starts once a day
  2. It turns on the Windows Server VM
  3. A PowerShell script runs as a scheduled job registered to run at startup
  4. Once done, the PowerShell script makes an HTTP call to the Stop logic app
  5. The logic app stops and deallocates the Windows server VM

Job at startup (3)

PowerShell has a native way to register jobs that run at startup. I just followed this digestible guide:

I created a new folder: C:\Scripts

I copied my legacy script to that folder, let’s call it ‘Legacy.ps1’ for the sake of simplicity, then I created a startup job by running these two lines:

$trigger = New-JobTrigger -AtStartup -RandomDelay 00:00:30
Register-ScheduledJob -Trigger $trigger -FilePath C:\Scripts\Legacy.ps1 -Name Legacy

The Windows Server VM (3)

I created a Windows Server 2019 Datacenter Server Core VM to make it as lightweight as possible. I put it in a separate resource group; I didn't reserve any IP addresses or DNS names. I even disabled all ports, including RDP, for the highest security.

Other specs:

  • CPU: 1vCPU
  • Size: Standard B1ms
  • RAM: 2GB
  • HDD: HDD (no redundancy)
My choice of the vm image.

The start logic app (1, 2)

In the same resource group I created a logic app that turns on the VM daily at 00:13 UTC.

Easy, isn't it?

Once started, the VM runs the scheduled job, the Legacy.ps1 script.

At the end of the script there is an HTTP call to my 'Stop' logic app:

Invoke-WebRequest -Uri ""

The stop logic app (4,5)

Obviously, the trigger I use in my next logic app is an http request.

Whenever it is triggered, it stops and deallocates the VM.

What does it cost?

I'll save this discussion for later. It costs more than serverless applications, for sure, but less than a VM that is on and idle for hours every day. What I propose is a workaround for running huge legacy Windows scripts in Azure, in case you don't have time to refactor them.

The cost of the last three days.

Monitoring Microsoft 365 using Raspberry Pi and M365 CLI

I would like to show you my recent hobby project with a raspberry pi, a unicorn phat and the powerful cli-microsoft365: a simple monitoring solution for Microsoft 365 services.

Status of some important services in Microsoft 365

In essence, I put the unicorn phat onto the raspberry pi zero w and wrote this python script:

The python script checks the service status every five minutes and shows it with colors on the unicorn phat.

Color coding

Since the unicorn phat is just a grid of 8×4 RGB LEDs, I needed to color code the different service statuses (more on the statuses later in this post). I came up with these color combinations. It doesn't matter what the combinations are, as long as they mean something to you (or as long as you can decode them).

  1. 🟩 🟩 🟩 🟩 ServiceOperational
  2. 🟩 🟩 🟩 🟨 ServiceRestored
  3. 🟪 🟪 🟪 🟪 Investigating
  4. 🟩 🟩 🟩 🟪 FalsePositive
  5. ⬜️ ⬜️ ⬜️ ⬜️ InformationUnavailable
  6. 🟥 🟥 🟥 🟥 ServiceInterruption
  7. 🟥 🟥 🟥 🟨 ExtendedRecovery
  8. 🟥 🟥 🟨 🟩 ServiceDegradation
  9. 🟩 🟩 🟩 🟦 PIRPublished
  10. 🟥 🟨 🟨 🟩 RestoringService
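As a sketch, this mapping can be expressed as a simple lookup table in Python; the RGB values and the fallback behavior are my assumptions, and the unicornhat calls that actually light the LEDs are left out:

```python
# Color coding for Microsoft 365 service statuses, as listed above.
# Each status maps to a pattern of four colored LEDs (one row on the unicorn phat).
GREEN, YELLOW, PURPLE, WHITE, RED, BLUE = (
    (0, 255, 0), (255, 255, 0), (128, 0, 128),
    (255, 255, 255), (255, 0, 0), (0, 0, 255),
)

STATUS_COLORS = {
    "ServiceOperational":     [GREEN, GREEN, GREEN, GREEN],
    "ServiceRestored":        [GREEN, GREEN, GREEN, YELLOW],
    "Investigating":          [PURPLE, PURPLE, PURPLE, PURPLE],
    "FalsePositive":          [GREEN, GREEN, GREEN, PURPLE],
    "InformationUnavailable": [WHITE, WHITE, WHITE, WHITE],
    "ServiceInterruption":    [RED, RED, RED, RED],
    "ExtendedRecovery":       [RED, RED, RED, YELLOW],
    "ServiceDegradation":     [RED, RED, YELLOW, GREEN],
    "PIRPublished":           [GREEN, GREEN, GREEN, BLUE],
    "RestoringService":       [RED, YELLOW, YELLOW, GREEN],
}

def colors_for(status):
    # Unknown statuses fall back to "InformationUnavailable" (all white)
    return STATUS_COLORS.get(status, STATUS_COLORS["InformationUnavailable"])
```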


There is a list of all possible statuses you can get for Microsoft 365 Services, and it is here:


Install the cli-microsoft365 npm package globally.

sudo npm i -g @pnp/cli-microsoft365

You have to log in and give admin consent (if you run it for the first time), and then you can get the status of the Microsoft 365 services by running:

m365 tenant status list


There are many services in Microsoft 365. I chose the 8 most important ones (from my point of view), because there are only 8 rows on the unicorn phat. You can choose your services and order them as you prefer, of course. Beware of the spelling and the casing:

  1. SharePoint
  2. microsoftteams
  3. Exchange
  4. OneDriveForBusiness
  5. yammer
  6. Forms
  7. PowerBIcom
  8. Intune

Assembling the hardware

I had my raspberry pi zero w with raspberry pi os already installed. I attached the unicorn phat using solderless pogo pins. I found a little white cardboard box, cut out a rectangular hole for the unicorn phat, and glued the raspberry pi with the unicorn inside the box. On the front side I put a sticker with the actual service names next to every led row. I connected it to the power and ran the script.

Only three pins are needed.
I tested it first without a box.
I glued the hardware on the inside of the cardboard with a glue gun.
Exploring the @pnp/cli-microsoft365.

Other tips and tricks

The pogo pins were too loose and the leds did not work. I had to shorten the plastic holders a little to tighten the pins.

Login to cli-microsoft365 as sudo

When I explored the m365 commands, everything worked perfectly; my login was cached. Then I needed to run my scripts as sudo, since communicating with the GPIO pins and the unicorn phat requires it. It didn't work. The login cache is in a different place when you run as sudo. Obvious when I look at it afterwards, but it took some time to realize. So, if you are going to do the same, just make sure you log in to m365 as sudo as well, before running the script:

sudo m365 login


This web resource is gold; it shows the pinout and connections for many hats etc.:

You only need three pins:

  • 5V Power (Pin 2)
  • Ground (Pin 6)
  • GPIO 18, Data (Pin 12)

subprocess in python

m365 is a command that you can run in the terminal; from a Python script I use subprocess to call it and get the results.
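A minimal sketch of that pattern, assuming the CLI is installed and you are logged in. `--output json` gives machine-readable output; the `Workload`/`Status` property names are an assumption based on the underlying Office 365 service communications API:

```python
import json
import subprocess

def parse_statuses(output):
    # Turn the CLI's JSON array into a {workload: status} dict.
    # The "Workload" and "Status" property names are assumptions.
    return {item["Workload"]: item["Status"] for item in json.loads(output)}

def get_service_statuses():
    # Run the m365 CLI and return the current status per service.
    result = subprocess.run(
        ["m365", "tenant", "status", "list", "--output", "json"],
        capture_output=True, text=True, check=True,
    )
    return parse_statuses(result.stdout)
```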

Running the script even when you log out

start the script with nohup:

nohup python3 &

Git Merge develop to main in an Azure DevOps Release

This post is a techy one. It’s about running git commands in Azure DevOps Releases in order to finalize a deployment job to production.

Let me first describe our scenario:

We use Azure DevOps for code and for deployment. Our branch strategy is a simplified Gitflow model, where all current work is merged to the "develop" branch. The code from the "develop" branch is then built and released to the staging environments and production. After a release to production and regression tests, the develop branch needs to be merged into the "main" branch (or "master"). So, simply put, the git merge into main is what we mean by finalizing a production release.

The "Finalize" stage in a release definition consists of one step: a bash script. But before you can run git commands, you need to configure a couple of things. Let's go through them:

Step 1: Permissions

Step 2. New Stage

Next, create a new stage and call it "Finalize Production Deployment" (or another name of your choice). On the Build Agent step, enable "Allow scripts to access the OAuth token".

Add a step: Bash, and call it "Git – merge changes to main". Paste this git code:

git config --global user.email ""
git config --global user.name "Azure DevOps"
EXTRAHEADER="Authorization: Bearer $(System.AccessToken)"
git -c http.extraheader="$EXTRAHEADER" clone $REPO
cd $(Build.Repository.Name)
MAINBRANCHNAME=$(git rev-parse --abbrev-ref HEAD)
git config http.$REPO.extraHeader "$EXTRAHEADER"
echo "Merge $(Build.SourceBranchName) to $MAINBRANCHNAME"
git fetch origin $(Build.SourceBranchName) --prune
git merge origin/$(Build.SourceBranchName) -m "merge $(Build.SourceBranchName) to $MAINBRANCHNAME" --no-ff
echo "Create tag named $TAGNAME"
git tag -a -m "$(Build.SourceBranchName) installed to production" $TAGNAME
git push origin $MAINBRANCHNAME
git push origin --tags

That’s it, the code is pretty universal. Let me know if something does not work.

You can stop reading unless you want more details 🙂

More details

A neat list of all available variables

When I started working with this, I found the built-in "Initialize job" step very useful. Click on it:

There you can find all the built-in and your variables in a nice list. Pretty useful for building a bash script:

Here is how I construct the repo URL. Neat, isn't it?

Pipelines vs. Releases

If you run this code in a classic Release Definition, you won't get the repository automatically; you need to clone it first. Why a Release and not a Pipeline? Well, due to the reasoning described in my other blog post (Azure Key Vault vs. Pipeline Variables), we still run Releases. But Pipelines should work, too.


While tinkering with the finalize step, I found it useful to disable all the other steps and comment out the actual push to origin. That way I could run it fast and focus on the steps I needed to fix first.


Before you can configure the authorization header, you need to clone the repository and cd into that directory. But in order to clone it, you need the extra header. Tough luck? No, not at all: you just need to add it in two places, once when cloning and then in the git repository config for all the following commands.

User Identity

Using git config you can define any user identity. Use something that makes sense and is easy to recognize.

main vs. master

If some of your repos have main as the default branch and others have master, no worries, you don't need to guess or create a variable. All you need to do is check the current branch right after cloning. NB: it's different if you use a Pipeline.

Further reading and links

1TB=1024GB in SPO Storage

You want to calculate your storage capacity in SharePoint Online? Here is how:

  • Every 1 TB is 1024 GB (it might be confusing, see my previous post, but that's how it is calculated)
  • A tenant gets 1024GB by default
  • For every user license of a product that includes the service plan called “SHAREPOINTSTANDARD”/SharePoint Online (Plan 1) you get 10 GB extra
  • For every user license of a product that includes the service plan called “SHAREPOINTENTERPRISE”/SharePoint Online (Plan 2) you get 10 GB extra
  • For every user license of a product that includes the service called “ONEDRIVEBASIC”/SharePoint Online OneDrive Basic you get 0.5 GB extra

Products vs. Service Plans

A product (a.k.a. SKU) consists of service plans. E.g. Office 365 E3 (a product) includes SharePoint Online Enterprise, among others. It is a service plan that gives you additional storage, not a product. The information on the "SharePoint Limits" page is (over-)simplified. Simplified for a good reason, of course: to give a rule of thumb for calculating your storage.

But if you want to calculate the exact storage capacity, like I do, and even break it down into different departments etc. based on licenses, then you need to be aware that it is a service plan that makes you eligible for more space. A service plan, such as SharePoint Online (Plan 1), can be part of one or more products.

Service plans eligible for additional storage, and the corresponding SKUs

  • SharePoint Online (Plan 1) – “SHAREPOINTSTANDARD” – 10 GB per user license
    • Project Online Plan 1 – PROJECT_P1
    • Office 365 Enterprise E1 – STANDARDPACK
  • SharePoint Online (Plan 2) – “SHAREPOINTENTERPRISE” – 10 GB per user license
    • MICROSOFT 365 E3 – SPE_E3
    • Dynamics 365 Customer Service Professional – DYN365_CUSTOMER_SERVICE_PRO
  • “ONEDRIVE_BASIC” – 0.5 GB per user license
    • VISIO Online Plan 2 – VISIOCLIENT

ActiveUnits vs. WarningUnits vs. ConsumedUnits

You can ignore the ConsumedUnits, because they are not used in the storage calculation. The ActiveUnits are the ones that are purchased. The WarningUnits are the licenses that have not been renewed and will be removed after 30 days.

So you need to count both the ActiveUnits and WarningUnits. Licenses = ActiveUnits + WarningUnits.
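Putting the rules above together, here is a sketch of the calculation; the per-plan numbers come from the list above, while the license counts in the example are hypothetical:

```python
# Extra GB granted per user license, keyed by service plan name (from the list above)
EXTRA_GB_PER_PLAN = {
    "SHAREPOINTSTANDARD": 10,    # SharePoint Online (Plan 1)
    "SHAREPOINTENTERPRISE": 10,  # SharePoint Online (Plan 2)
    "ONEDRIVEBASIC": 0.5,        # OneDrive Basic
}
TENANT_BASE_GB = 1024  # every tenant starts with 1 TB = 1024 GB

def tenant_storage_gb(license_counts):
    """license_counts: {plan_name: (active_units, warning_units)}"""
    total = TENANT_BASE_GB
    for plan, (active, warning) in license_counts.items():
        licenses = active + warning  # count both, as explained above
        total += licenses * EXTRA_GB_PER_PLAN.get(plan, 0)
    return total

# Hypothetical tenant: 100 Plan 2 licenses (90 active, 10 in warning)
print(tenant_storage_gb({"SHAREPOINTENTERPRISE": (90, 10)}))  # 2024
```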

Further reading

Print2SPO – simple printing to SharePoint

This blog post is a (slightly bigger) user (or business developer) tip on how to set up smart printing to SharePoint Online, without any extra apps or solutions.

First of all, a big thank you to my colleague Shahram, who presented the idea to me. Imagine the following scenario:

You have a template in Word that you fill in and print on paper. Let's say it is a picking order. You want to digitize the process by sending the PDF to a shared document library in SharePoint or Teams.

Technically it is simple, as long as your team can agree to do it that way. Then there are many possibilities, both for saving forests and for collaborating smarter.


In this scenario I use a library in SharePoint Online, but you can just as well connect it to Teams or your personal OneDrive. Let's call it "Plockordrar" (picking orders).


The next step is to add a shortcut to my personal OneDrive. You can of course sync the library directly, but in this case I choose a OneDrive shortcut.

The folder shows up nicely on my computer:


I click "Print" and choose "Microsoft Print to PDF".

Then I select my OneDrive and "Plockordrar" and type in a file name.

It shows up in the document library.

Smart features

Now I am no longer limited to the analog. I can use all the magic available in SharePoint Online to set up smart collaboration with my colleagues, for example:

  • Add a "Responsible" column and a view "My picking orders"
  • Add a Status column to distinguish active from completed picking orders
  • Add a Date column to keep track of a possible deadline
  • Use comments to collaborate with my colleagues
  • Set up alerts and reminders
  • Set up approval flows etc.
  • Format the list with different colors to see the current work better

The next time you print from Word to PDF, the system remembers your last choice, so it can go really fast.

The analog part

Nor does this process mean an abrupt transition to all-digital. It works fine to combine, especially if you prefer having things on paper! More than that: you can print it several times if needed. You reduce the risk of something falling between the cracks (literally) and can reduce the stress for you and your colleagues.

Other apps

There are dedicated apps for printing to SharePoint; they can be more precise in some cases, but even with existing means and smart processes you can have smart collaboration in Office 365.

1 TB = 1024 GB in SPO?

There is confusion around how storage is calculated in SharePoint Online. I believe that in SharePoint Online 1 TB is 1024 GB (based on powers of two), although the SI prefix is for numbers based on powers of 10 (1 TB = 1000 GB, Wikipedia). In this post I would like to summarize the results of my investigations, and I hope Microsoft or the community can confirm or refute this.

First, let me explain why we care about it. The storage in SharePoint is limited and we need to keep an eye on it. Especially in our case, where we need to track storage utilization across different parts of the organization/our tenant. The storage in SharePoint is calculated like so:

1 TB + 10GB * E-licensed users

The tricky part, though, is how to convert it into TB correctly.

Why I believe Microsoft treats 1 TB as 1024 GB

First of all, I can see it clearly in my dev tenant with exactly 25 licenses.

That would give 1 TB + 10 GB × 25 = 1.25 TB if it were based on powers of 10. But it isn't, because the storage I get is 1.24 TB, or 1.244 to be precise.

That means, for every E-license you get 10 GB or 10/1024 TB.

That also means you need more licenses to get the desired storage. E.g. 10 TB of extra storage requires 1024 licenses, not 1000: 10 TB = 10240 GB, and 10240 GB / 10 GB = 1024 E-licenses.
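The arithmetic above can be double-checked in a couple of lines:

```python
licenses = 25
# If 1 TB were 1000 GB: 1000 GB base + 25 * 10 GB = 1250 GB = 1.25 TB
decimal_tb = (1000 + licenses * 10) / 1000
# If 1 TB is 1024 GB: 1024 GB base + 25 * 10 GB = 1274 GB ≈ 1.244 TB
binary_tb = (1024 + licenses * 10) / 1024
print(decimal_tb)           # 1.25
print(round(binary_tb, 3))  # 1.244, which matches what the admin center shows
```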

Also, in OneDrive the initial space I get is 1024 GB (or 1 TB). If 1 TB = 1024 GB in OneDrive, why should SPO be different?

Further, the MS Docs page reveals that the 25 TB are 25600 GB (which is exactly the product of 25 and 1024):

One contradictory page, though, is the news about the storage increase:

The calculations there are based on the decimal system:

Calculation of MB and GB

Just to verify how the storage is calculated in KB, MB and GB, I looked at the Storage of a SharePoint site. Luckily, I can get the storage used in Bytes, MB and GB (from different sources) and compare them to each other.

When I calculate back and forth I can definitely see that it is multiplied/divided by 1024, hence powers of 2:

The values in blue are the reported values. The other values are calculated.

The values in GB are exactly the same; the Bytes, KB and MB differ a bit due to rounding.

Vår robot Stefan

Roboten Stefan

Här är historian om Stefan, en robot jag och barnen har jobbat på under den senaste månaden. I en sann DIY-anda vill jag skriva om vårt projekt på bloggen och förhoppningsvis inspirera andra att utforska det. I och med projektet innebär en hel del pyssel, så är det lämpat i princip för alla åldrar.

Även om allting togs fram parallellt och stegvis (“i iterationer”), kommer projektet presenteras det i förenklad ordning

  • Idé, hur allting uppstod
  • Hårdvara, material som vi använt
  • Pyssel, utklippning, måleri etc
  • Mjukvara (och elektroniken)
  • Lärdomar, diverse insikter vi har fått på vägen


Vi har en microbit v2 som vi har letat användning för. Microbit v2 har en mikrofon och en liten högtalare, definitivt en enorm fördel om man vill få barn med på pysslandet med elektronik. Jag hade sett någonstans på nätet om figurer man kunde göra av mjölkkartonger. “Stefan” då, varför just det namnet? Min fyraåriga dotter brukade skylla på Stefan om det var “någon” som stökade till, råkade trampa på någon annans fötter och sånt. “Det var inte jag, det var Stefan” brukade hon säga ett tag. Så vi kände att vi behövde ge Stefan en kropp, ett ansikte för att göra en rolig grej av det. Under arbetet med roboten har betydelsen av Stefan minskat dock. Roboten behöll sitt fina namn – Stefan – i alla fall.

en enklare gif


Det här var vad vi använde. Det finns miljontals andra material-kombinationer som kan funka. Tipset är att se vad man har hemma.


  • Mjölkkartong, 1,5 L
  • Microbit v2
  • 1 Servo-motor (180 grader)
  • 2 Krokodilklämmor
  • 1 batterihållare för 2xAAA-batterier för att driva microbiten
  • 1 batterihållare för 4xAAA-batterier för att driva servon
  • Diverse sladdar
  • Hobbyfärg
  • Elastisk tråd (pärlarmbandstråd)
  • Batterier 6xAAA (vi har laddningsbara och kan ladda dem då och då när roboten blir “trött”)


  • Sax
  • Limpistol
  • Pensel
  • Microsoft MakeCode för Microbit (gratis som webbapplikation eller Windows-app). Windows-app har fördelen att man kan skriva till microbiten direkt, utan att ladda ner det först, plus att det kommer ihåg dina senaste projekt


Roboten Stefan skulle säga “Hej” och han måste kunna öppna munnen. Det första vi gjorde var just att klippa ut munnen. Vi klistrade fast servon (efter långa undersökningar i vilken vinkel och hur mycket den rörde sig). Vi använde limpistol för att sätta fast servon. För att munnen skulle stängas, följa med servon tillbaka, knöt vi läppen till servon med en rosa elastisk tråd (den har man för att göra pärlarmband bland annat).

Vi satte fast servon med limpistol.

Roboten Stefan skulle också kunna visa med sin “näsa”. Vi gjorde ett fyrkantigt hål för microbitens ledlampor. För knapparna och A och B gjorde vi små hål. På det sättet kunde man jacka in microbiten in i de hålen så att det höll sig fast. Knapparna blev irisar i Stefans ögon senare.

På baksidan gjorde vi ett runt hål för att Stefan kunde gå på toa 💩. (Det var inte min idé 😜).

Stefan behövde också ett större hål på baksidan för att stoppa in elektroniken.

Målningen var väldigt kul och det gjorde vi i många omgångar (ibland med en hel veckas mellanrum, så länge tålamodet räckte och när lusten att måla var på topp). Prototypen på ansiketet skissade vi först på papper.

Prototypen på Stefans ansikte

Stefan skulle ha stora tänder, röda läppar och blå ögon. 🤩

Frost-fantasten målar Stefan.

Elektroniken kopplade vi med några krokodilklämmor och sladdar. Vi försökte följa samma färgkodning. PIN 1 använde vi för att styra. GND delades av microbiten, servon och servons batteripack.

Här kan man se hur sladdarna är kopplade.


Vi använde MakeCode for micro:bit, en app från Microsoft Store.

MakeCode i Microsoft Store

Projektet döpte vi till Stefan, kort och gott.

Projektet Stefan

Vi använde blockprogrammering. Det är kul även för mig som “skriv”-programmerar på jobbet. What-You-See-Is-What-You-Get (nästan i alla fall) är såklart toppen för barn.

Blockprogrammering i MakeCode

Nackdelen med blockprogrammering är att det inte är så lätt att återanvända koden eller spara koden. Som tur är kan man bara öppna den i python eller javascript. Vi (jag) satsade på python, för att det är vackert och koncist. Den aktuella koden är sparad på github, den kan dock ändras och justeras i framtiden:

Let us walk through the blocks a little. The program consists of three main components:

  • On start (what happens when Stefan is switched on): it shows “STEFAN” in scrolling letters and plays the sound Mysterious.
  • Forever (what Stefan does while awake – the main program): it plays “Hello” six times, opening the mouth each time. If the surrounding sound is quiet or low, Stefan goes quiet too. If Stefan hears a louder sound (someone enters the room and talks, or someone shouts “hello”), he starts saying “Hello” from the beginning again, opening his mouth, another six times. And so it goes on until Stefan is switched off.
  • On button press (a way to control Stefan with the buttons, in case nothing else works): pressing the A button makes Stefan start greeting and talking; pressing the B button makes him stop.
Stefan wakes up – “on start”
The main part – “forever”

The buttons were mostly used for troubleshooting and resetting in the beginning. Even though they are not used today, we have kept the functionality just in case.

The A and B buttons turn Stefan on and off.
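Put together, the three parts can be sketched in plain Python. This is a hypothetical simulation, not the actual MakeCode API: the `show` and `play` callables stand in for the real display and sound blocks.

```python
# Hypothetical sketch of Stefan's three program parts. The `show` and
# `play` callables are stand-ins for the real MakeCode blocks.

class Stefan:
    def __init__(self, show, play):
        self.show = show
        self.play = play
        self.running = True   # is Stefan allowed to greet?
        self.cycles = 0       # greetings said in the current round

    def on_start(self):
        """What happens when Stefan is switched on."""
        self.show("STEFAN")       # scrolling letters on the LED grid
        self.play("Mysterious")   # start-up jingle

    def on_button_a(self):
        """Start greeting (also handy for resetting while debugging)."""
        self.running = True
        self.cycles = 0

    def on_button_b(self):
        """Silence Stefan."""
        self.running = False
```

The forever loop then checks `running` and the greeting counter on every pass.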


The last picture, with the button presses, can serve as the first lesson learned. We had several variables: paus, running, cycles. Eventually we were left with only cycles – cycles of greetings. One cycle is one mouth opening plus one “Hello” phrase. It starts at 1 and stops after 6 rounds. If a higher sound level is detected, it starts over from the beginning. We could have counted down instead, but it ended up counting up.
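As a minimal sketch of that counter (plain Python; the threshold is an assumption on the micro:bit’s 0–255 sound-level scale, not a value from the project), one pass of the forever loop looks roughly like this:

```python
MAX_CYCLES = 6   # one round = six greetings
LOUD = 128       # assumed threshold on the micro:bit's 0-255 scale

def forever_step(cycles, sound_level):
    """One pass of the 'forever' loop. Returns (new_cycles, greeted):
    a loud sound restarts the round; otherwise Stefan greets until
    the six cycles are used up, then stays silent."""
    if sound_level >= LOUD:
        cycles = 0                 # someone spoke: start over
    greeted = cycles < MAX_CYCLES
    if greeted:
        cycles += 1                # one mouth opening + "Hello"
    return cycles, greeted
```

In a quiet room this greets exactly six times and then stays silent until the next loud sound resets the count.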

At first we just folded the chin, but it was stiff, so we cut long slits along the fold to make the mouth open a bit more smoothly.

The servo was first powered by the micro:bit, and there was noise even when the mouth was closed – the servo “trembled” and buzzed. We connected an extra battery pack just for the servo, and then it went quiet. GND is shared by the micro:bit and the servo’s battery pack.

We had to experiment quite a bit with the servo angles, and finally found good ones: a span of a bit more than 90 degrees, between 180 and 60. This is mainly down to the placement of the servo and the elastic thread – the servo had to travel a bit further to tension the thread and close the mouth.
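For reference, a servo position is really just a pulse width. Assuming the common hobby-servo convention of roughly 1–2 ms over 0–180° at 50 Hz (the exact range varies per servo – this is an illustration, not the project’s actual calibration), the two mouth positions map like this:

```python
# Hypothetical helper mapping a servo angle to a pulse width, using
# the common 1.0-2.0 ms over 0-180 deg convention (varies per servo).
MIN_PULSE_MS, MAX_PULSE_MS = 1.0, 2.0

def pulse_ms(angle):
    angle = max(0, min(180, angle))   # clamp to the servo's range
    return MIN_PULSE_MS + (MAX_PULSE_MS - MIN_PULSE_MS) * angle / 180

MOUTH_OPEN = pulse_ms(180)    # 2.0 ms
MOUTH_CLOSED = pulse_ms(60)   # about 1.33 ms - the extra travel
                              # tensions the elastic thread
```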

Early in the project we had a power bank driving the whole thing. Unfortunately it switched itself off when idle. A good feature for phone charging, but bad for Stefan.

The power bank fit inside, but was not up to the task.

The milk carton felt a bit odd to paint with hobby paint, as if the paint did not really want to stick. But we figured Stefan would not thrive as a perfectionist, so in some places it shows through that this is not milk but a lactose-free milk drink.

“On loud sound” does not work. It was never triggered, or it was out of sync, so we had to rewrite the program to measure the sound level on every pass of the main part, that is in “forever”.

There are “dead zones” in the system. The sound level is measured, and then there are pauses during which it is not. The occasional loud sound has slipped past unnoticed. An event – “on loud sound” – would be better, but unfortunately it did not work. Even with the “dead zones” (seconds long), it is surprisingly reliable. We simply address Stefan with a louder voice, and do not treat Stefan as a supermachine capable of detecting every rise in sound level. Our tolerance for Stefan’s shortcomings is simply much higher.
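The dead zones can be illustrated with a toy simulation (plain Python, made-up timings, nothing to do with the MakeCode API): a loud spike is only noticed if it overlaps one of the discrete sample points, so anything that falls entirely inside a pause is lost.

```python
# Toy simulation of the "dead zones": the forever loop samples the
# sound level only at discrete points, so a loud spike that falls
# entirely between two samples is never noticed.

def heard_spikes(spikes, sample_times, window=0.05):
    """spikes: times (s) of loud sounds; sample_times: times when the
    loop reads the sound level. A spike counts as heard only if some
    sample falls within `window` seconds of it."""
    return [t for t in spikes
            if any(abs(t - s) <= window for s in sample_times)]

# Sampling every 2 s (pauses between greetings); spikes at 1.0 s
# and 2.03 s - only the second one lands close enough to a sample:
samples = [0.0, 2.0, 4.0, 6.0]
print(heard_spikes([1.0, 2.03], samples))   # [2.03]
```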

This is what our robot Stefan looks like, and this is how he greets