Have you also got a legacy PowerShell script that loads SharePoint DLLs and runs CSOM code directly? It is quite easy to convert to PnP PowerShell. But if you are out of time and just need to execute the script, then I have a quick tip for you.
First of all, a CSOM script can be recognized by its Add-Type (or Import-Module) commands combined with SharePoint DLL paths.
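A typical legacy script begins with something like this (the paths below point to the usual SDK/server install location and are only an illustration):

# Classic CSOM loading: these paths exist only on a SharePoint server
# or on a machine where the SharePoint SDK (client components) is installed
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"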
The odds are high that you don't have those directories and files, unless you run the script on a SharePoint server (who would do that at all?) or you have installed the SharePoint SDK.
The SharePoint SDK can be downloaded and installed (as suggested here), but why would you want to do that? An easier way is simply to locate the files that are distributed with the PnP.PowerShell module. Let me show you how to do that.
All the DLLs are available in the PnP.PowerShell module directory.
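You can verify that yourself by listing the assemblies inside the module folder (the exact subfolder layout may vary between module versions):

# List the CSOM assemblies that ship with the PnP.PowerShell module
Get-Module PnP.PowerShell -ListAvailable | ForEach-Object {
    Get-ChildItem -Path $_.ModuleBase -Recurse -Filter "Microsoft.SharePoint.Client*.dll"
}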
So the only thing you need to do is re-point the path from the original ("GAC") folder to the PnP.PowerShell folder. You don't even need to guess where that folder is.
Thanks to the PowerTip: Find the Path to a PowerShell Module (Scripting Guy), I found a way to read this information dynamically, so it doesn't matter where your folder actually is. The module version, the operating system you run on, and whether you installed the module for your user account only or for all users on the computer all affect the folder location. So we need to read the correct path at runtime and then use it in the Add-Type command.
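Put together, a minimal sketch looks like this (I assume here that the DLLs sit in a Core subfolder of the module directory; verify the layout of your module version):

# Find the PnP.PowerShell module folder dynamically
$pnpModule = Get-Module PnP.PowerShell -ListAvailable
$dllFolder = Join-Path $pnpModule.ModuleBase "Core"   # assumed subfolder, check your version

# Load the CSOM assemblies from the module instead of the GAC
Add-Type -Path (Join-Path $dllFolder "Microsoft.SharePoint.Client.dll")
Add-Type -Path (Join-Path $dllFolder "Microsoft.SharePoint.Client.Runtime.dll")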
PnP.PowerShell is built on top of .NET Core and works cross-platform, which is even better.
If your legacy script does not work with the newer PnP.PowerShell, you might need to install the older PnP PowerShell and adjust the module name in the script above accordingly.
The SharePoint SDK is built on top of the .NET Framework (as I understand it) and can only be installed on a Windows machine.
The SharePoint SDK requires local administrator rights to install. PnP.PowerShell can be installed without being an administrator by adding -Scope CurrentUser to Install-Module, which makes the work much smoother.
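For reference, that installation is a one-liner:

# Install PnP.PowerShell for the current user only, no admin rights needed
Install-Module PnP.PowerShell -Scope CurrentUser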
If you have two or more versions of the PnP.PowerShell module installed, you have to adjust the script a little by loading only the latest version of the module:
$pnpModule = Get-Module PnP.PowerShell -ListAvailable | Sort-Object Version -Descending | Select-Object -First 1
That was a quick tip on how to use the types from the original CSOM libraries when you don't have time to convert a script to PnP code, or when some functionality is not covered by PnP yet (although I am not sure there is anything you can do with CSOM that you cannot do with PnP).
The good sides of that approach:
it can be a step towards rewriting a legacy script to a newer PnP.PowerShell
the dlls are up-to-date thanks to an easy way to update the PowerShell Module (Update-Module)
it is cross-platform, meaning you can execute your legacy script on Linux or on a Mac as well. Good for automation!
My scripts take around 15 minutes to complete, while the Azure Functions Consumption Plan is limited to 10 minutes. Of course I could split them into several parts, but I am looking for an easy way right now; I have to postpone refactoring because I am not sure there is a lasting need for this script solution.
I created a Windows Server 2019 Datacenter Server Core VM to make it as lightweight as possible. I put it in a separate resource group and didn't reserve any IP addresses or DNS names. I even disabled all ports, including RDP, for the highest security.
Size: Standard B1ms
Disk: HDD (no redundancy)
The start logic app (1, 2)
In the same resource group I created a logic app that turns on the VM daily at 00:13 UTC.
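In essence, the logic app runs on a daily recurrence trigger and performs the same operation as this Az PowerShell call (the resource group and VM names are examples):

# What the 'Start' logic app effectively does every day at 00:13 UTC
Start-AzVM -ResourceGroupName "rg-legacy-scripts" -Name "vm-legacy"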
Easy, isn't it?
Once started, the VM runs a scheduled task that kicks off the Legacy.ps1 script.
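If you want to reproduce it, such a task can be registered like this (the paths and names are just examples):

# Hypothetical example: run Legacy.ps1 at every system startup
$action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument "-File C:\Scripts\Legacy.ps1"
$trigger = New-ScheduledTaskTrigger -AtStartup
Register-ScheduledTask -TaskName "Legacy" -Action $action -Trigger $trigger -User "SYSTEM"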
At the end of the script there is an HTTP call to my 'Stop' logic app.
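It looks something like this (the URL with its SAS signature comes from the logic app's HTTP trigger; the one below is a placeholder):

# Tell the 'Stop' logic app that the script is done
$stopUrl = "https://prod-00.westeurope.logic.azure.com/workflows/<id>/triggers/manual/paths/invoke?api-version=2016-10-01&sig=<signature>"
Invoke-RestMethod -Method Post -Uri $stopUrl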
Obviously, the trigger I use in my next logic app is an HTTP request. Whenever it is triggered, it stops and deallocates the VM.
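Again, this is equivalent to the following call (Stop-AzVM deallocates the VM by default):

# What the 'Stop' logic app effectively does when triggered
Stop-AzVM -ResourceGroupName "rg-legacy-scripts" -Name "vm-legacy" -Force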
What does it cost?
I'll save the cost discussion for later. It certainly costs more than a serverless application, but less than a VM that is on and idle for hours every day. What I propose is a workaround for running big legacy Windows scripts on Azure, in case you don't have time to refactor them.
Allowing custom script has security implications. But what exactly does that mean? Is it dangerous? My colleague Daniel and I have done a little experiment. There are two implications stated on MS Docs:
Scripts have access to everything the user has access to.
Scripts can access content across several Office 365 services and even beyond with Microsoft Graph integration.
To summarize, we can look at that picture:
So the risk is that User 1 (the Blue User) intentionally or unintentionally places a script and gets User 2 (the Red User) to run it by linking to the page that contains the script. The page must be in a "common" place that both users can access.
Let’s try that. Let’s have a little hackathon.
Experiment A. Very Simple.
Scenario: the Red User (my colleague Daniel) has a site with sensitive information (he has a list called Secrets). The Blue User (me) does not have access to that site. The Red User accidentally shares the link to the secret list. Now, to prove the security risks of custom script, the Blue User has to get hold of the secret information.
To make it more visual and fun, we have found a box where we have put candies and locked it with a combination lock. The code (the combination) is stored in the Secrets list.
The combination consists of three digits from 0 to 9, so there are 10 × 10 × 10 = 1000 possible combinations.
Since the Red User knows that he is going to be hacked, he acts as normally as possible rather than being extra vigilant. And since Experiment A is a simple one, we let the Red User accidentally reveal the link to the Secrets list. That could be:
A screenshot that shows something else but contains the confidential site url
An email that is sent to a wrong recipient
A pasted link in a wrong Teams chat
So the Blue User finds out that there is a Secrets list that has the following address:
"Fortunately", the Blue User has access to a classic site (where custom script is allowed). He shares that site with the Red User, so now both the Blue User and the Red User have a site in common.
Then, a new Web Part Page is added with a tiny script.
That script makes a REST call to the Secrets list and uploads the result to Shared Documents. For the sake of simplicity, a built-in document library is used; an attacker would probably use a hidden list instead.
Now, to get the Red User to that page, the Blue User adds some "important" information so that the Red User will not become suspicious.
Now it is just a matter of sending a link to the Red User. The Red User navigates to the page, and the script executes while he is reading the information on it.
The Blue User gets the Secret that was generated by the Red User:
With that information the Blue User has hacked the combination lock and he can now open the red box with candies!
Experiment B. Harder
For now, I think Experiment A, the simple one, is enough. We have proved that security is indeed compromised when custom script is allowed: a user can get confidential information from another user. Maybe some of you want to set up a more sophisticated experiment; make sure the "hacked" person is warned and agrees to take part in an ethical hacking act. We may try more experiments to show the importance of keeping custom script disabled and of not using classic sites, which have it enabled by default.
Custom script (or DenyAddAndCustomizePages = false) is dangerous, although it is of course not black and white. It provides the ability to have small applications directly on sites created by users. But as with all other aspects of IT, you should know what you are doing. Every site with custom script is a potential trojan horse. Minimizing the number of classic sites and sites with custom script lowers the risk. All collaboration sites should be modern team sites without custom script.
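If you administer a tenant, disabling custom script on a site is a single cmdlet (the site URL is an example):

# Disable custom script on a site collection (SharePoint Online Management Shell)
Set-SPOSite -Identity "https://contoso.sharepoint.com/sites/classicteam" -DenyAddAndCustomizePages $true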
The question is whether this .resx file can be read by the scriptResx.ashx handler. Only a file marked with scriptResx:true is returned to the client.
This results in:
So what we have to do for our custom resx file is to add the classFullName resheader.
A while ago I needed to copy all files from a document library within a SharePoint 2007 site to the hard drive. Since I didn't need to copy files from SharePoint to SharePoint, I couldn't use the stsadm -o export command or Chris O'Brien's nice SharePoint Content Deployment Wizard. I came across the SPIEFolder application, which should work with SharePoint 2007 and 2010. It has a site on CodePlex: spiefolder.codeplex.com, but neither the binary nor the source code can be downloaded from there. After some searching I found the binary in the author's SkyDrive. The fact that the source code was not available seemed like a disadvantage, because I could not know what code was being run. Nevertheless I tried it out, and it didn't work:
The Web application at http://dev/Documents could not be found. Verify that you have typed the URL correctly. If the URL should be serving existing content, the system administrator may need to add a new request URL mapping to the intended application.
So I wrote my own code to copy the documents. Writing a console application feels so yesterdayish, so it is written in PowerShell. Even though there are no PowerShell snap-ins for SharePoint 2007, you have access to the entire server object model; the only thing you have to do is load the SharePoint assembly.
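Here is a minimal sketch of the idea (the site URL, library name, and target folder are examples; error handling is omitted, and it must run on the SharePoint server itself):

# Load the SharePoint 2007 server object model
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

$site = New-Object Microsoft.SharePoint.SPSite("http://dev")
$web  = $site.OpenWeb()
$list = $web.Lists["Documents"]
$target = "C:\Backup"

# Save every file in the document library to the local disk
foreach ($item in $list.Items) {
    $file = $item.File
    [System.IO.File]::WriteAllBytes((Join-Path $target $file.Name), $file.OpenBinary())
}

$web.Dispose()
$site.Dispose()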
PowerShell is a way to manipulate data and structure in the SharePoint portal. The same tasks can be done with console applications, but the advantage of PowerShell is that you can create flexible scripts that can be run with different parameters. PowerShell is strongly reminiscent of shell scripting in Linux, which suits me, since I actually come from the bash world. Microsoft has clearly drawn a lot of inspiration from there :).