PowerShell - Best Practice - creating better scripts
PowerShell allows a great deal of freedom in how scripts are designed. To write scripts in a readable and understandable way, it is advantageous to follow certain rules. In PowerShell, comments can be inserted to explain the code, information about the script and its help can be stored in the header of a cmdlet, and the possible parameters can be documented. The goal should be to write code that is as self-explanatory as possible, which makes some comments unnecessary and simplifies later adjustments by oneself or by others. Additional code for error handling can increase stability and speed up debugging. The use of a suitable editor supports development, see PowerShell editors in comparison: ISE, Visual Studio Code.
write self-explanatory code
- When called, the whole cmdlet name should be used (no aliases).
As an example, the cmdlet "Get-ChildItem" can also be called with the alias "gci". When reading the source code, the name "Get-ChildItem" already conveys the purpose of the call; according to the help (get-help get-childitem):
"Gets the items and child items in one or more specified locations."
- So-called "named parameters" should be used in scripts:
As an example, all files in a given folder can be displayed with the cmdlet "Get-ChildItem":
Not recommended: "get-childitem c:\temp" or "gci c:\temp".
The call should be used as follows:
"get-childitem -Path c:\temp"
- Documentation within the code
- In PowerShell ISE, Ctrl+J can be used to insert a template for the PowerShell header via the snippet "Cmdlet (advanced function)", see: Cmdlet (advanced function)
- For commands that are not self-explanatory: Add comments:
- A single line comment can be created with "#"
#Description of the following call
- Multiline comments start with "<#" and end with "#>"
<# multiline comment #>
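The comment syntax is also used for comment-based help, which Get-Help evaluates. A minimal sketch (function and parameter names are only examples; the template inserted via Ctrl+J typically contains further sections):
<#
.SYNOPSIS
Short description of the function
.PARAMETER Path
Description of the -Path parameter
.EXAMPLE
Get-Example -Path "c:\temp"
#>
function Get-Example {
    Param($Path)
    Get-ChildItem -Path $Path
}
# Get-Help Get-Example -Full now displays the comment-based help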
PowerShell function names follow the Verb-Noun pattern with a set of predefined verbs. Own functions should use one of these approved verbs as the prefix:
Get-Verb - permissible verbs for own functions
The command Get-Verb displays all approved verbs for own commands:
PS C:\Windows\system32> get-verb
Verb Group
---- -----
Add Common
Clear Common
Close Common
Copy Common
Enter Common
Exit Common
Find Common
Format Common
Get Common
Hide Common
Join Common
Lock Common
Move Common
New Common
Open Common
Optimize Common
Pop Common
Push Common
Redo Common
Remove Common
Rename Common
Reset Common
Resize Common
Search Common
Select Common
Set Common
Show Common
Skip Common
Split Common
Step Common
Switch Common
Undo Common
Unlock Common
Watch Common
Use Other
Backup Data
Checkpoint Data
Compare Data
Compress Data
Convert Data
ConvertFrom Data
ConvertTo Data
Dismount Data
Edit Data
Expand Data
Export Data
Group Data
Import Data
Initialize Data
Limit Data
Merge Data
Mount Data
Out Data
Publish Data
Restore Data
Save Data
Sync Data
Unpublish Data
Update Data
Approve Lifecycle
Assert Lifecycle
Complete Lifecycle
Confirm Lifecycle
Deny Lifecycle
Disable Lifecycle
Enable Lifecycle
Install Lifecycle
Invoke Lifecycle
Register Lifecycle
Request Lifecycle
Restart Lifecycle
Resume Lifecycle
Start Lifecycle
Stop Lifecycle
Submit Lifecycle
Suspend Lifecycle
Uninstall Lifecycle
Unregister Lifecycle
Wait Lifecycle
Debug Diagnostic
Measure Diagnostic
Ping Diagnostic
Repair Diagnostic
Resolve Diagnostic
Test Diagnostic
Trace Diagnostic
Connect Communications
Disconnect Communications
Read Communications
Receive Communications
Send Communications
Write Communications
Block Security
Grant Security
Protect Security
Revoke Security
Unblock Security
Unprotect Security
No plural should be used for the verb; the verbs listed by Get-Verb are all singular.
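As an illustration of the naming, a small sketch of an own function (function, noun and parameter names are only examples):
# "List" is not an approved verb - when such a function is exported from a module,
# PowerShell issues a warning about unapproved verbs
# function List-LogFiles { ... }

# "Get" is an approved verb (group Common)
function Get-LogFile {
    Param($Path)
    Get-ChildItem -Path $Path -Filter "*.log"
}
Get-LogFile -Path "c:\temp"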
no long one-liners
With multiple parameters, commands often become a bit more difficult to read:
Get-ChildItem -Path "c:\temp" -Recurse -Depth 2 -Include "*.txt" -Force -Exclude "*temp*" -WarningAction Continue -ErrorAction Stop
It becomes somewhat clearer if the parameters are written on separate lines:
Line breaks
with a "`" at the end of the line the command can be split to several lines:
Get-ChildItem `
    -Path "c:\temp" `
    -Recurse `
    -Depth 2 `
    -Include "*.txt" `
    -Force `
    -Exclude "*temp*" `
    -WarningAction Continue `
    -ErrorAction Stop
Alternatively, the parameters can also be passed via a hashtable:
Splatting
Moving the parameters into a hashtable makes the call clearer:
$HashArguments = @{
    Path = "c:\temp"
    Recurse = $true
    Depth = 2
    Include = "*.txt"
    Force = $true
    Exclude = "*temp*"
    WarningAction = "Continue"
    ErrorAction = "Stop"
}
Get-ChildItem @HashArguments
one purpose per function
Functions should, as in other scripting languages, have a specific purpose and not combine several tasks in one function. As an example, the cmdlet "Get-ChildItem" lists all files or folders of a directory. To perform a specific action on the files, for example deleting all files (items), another cmdlet is used: "Remove-Item". The two cmdlets can be combined in one call:
Get-ChildItem -path "c:\temp" | Remove-Item
Get-ChildItem is used to return a list of files, Remove-Item to remove (delete) a specific item. The example is meant for interactive use in PowerShell and is only intended to illustrate the well-defined task of each cmdlet.
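If such a combination ends up in a script, the selection should be restricted as far as possible in Get-ChildItem; with the -WhatIf parameter of Remove-Item the result can first be checked without deleting anything (path and filter are only placeholders):
# Get-ChildItem only selects, Remove-Item only deletes
Get-ChildItem -Path "c:\temp" -Filter "*.tmp" -File | Remove-Item -WhatIf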
Do not terminate functions with an exit
In case of an error, functions should be terminated with a throw, not with an exit. The reason is that exit terminates the entire script. If throw is used instead, an error in the function can be handled by the caller with try/catch. Without try/catch, the script still terminates with an error at this point.
function test {
    Param($x)
    if ($x) {
        Write-Output $x
    } else {
        Throw 'no $X passed'
    }
}
try {
    Test
} catch {
    Write-Output "Parameter x missing"
}
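For comparison, a sketch of the same logic with exit instead of throw (the function name is only an example); exit ends the entire script immediately, so the catch block is never reached:
function Test-WithExit {
    Param($x)
    if ($x) {
        Write-Output $x
    } else {
        exit 1    # ends the whole script, not just the function
    }
}
try {
    Test-WithExit
} catch {
    # never reached - exit cannot be caught with try/catch
    Write-Output "Parameter x missing"
}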
Abstraction into own functions when it makes sense
Functions make sense when they add value:
- When a certain block of code is to be used multiple times
- For certain logic that can only reasonably be implemented with functions.
- When it makes the script more reliable.
- If it makes the script more understandable and easier to use
As an example, the Get-ChildItem cmdlet could be wrapped in a new function and then called through it:
<#
.Synopsis
Get-MyChildItem
.DESCRIPTION
Wrapper for Get-ChildItem
.EXAMPLE
Get-MyChildItem
#>
function Get-MyChildItem
{
    Param
    (
        $Path
    )
    Get-ChildItem -Path $Path
}
# Calling the function:
Get-MyChildItem -Path "c:\temp"
The call to the custom function in this example is just as readable as the wrapped cmdlet itself. The example obviously makes no sense in this form, but it is representative of a function whose only task is to call another function. I have already seen examples where a separate function makes the script neither more readable nor simpler. It gets really complicated when one function calls another function, which calls another function, and so on. The author may still have an overview at the time of writing the script, but after some time finds himself having to reverse-engineer the script before being able to make any adjustments to it, and it is even more difficult for another person. At this point, some compassion for the people who will later read or adapt the script makes sense. In addition, it should not be forgotten that a function, as already mentioned, should have only one task if possible: less is often more, and the simpler the better.