Azure Automation provides seamless integration with Azure DevOps and other version control systems. Having change tracking and backup & restore capabilities is a blast!
This first blog covers workstation preparation, the creation and initial setup of Azure Automation, and the connection to Azure DevOps.
Permissions to create resources in Azure, administrative permissions on the Azure DevOps organization, and permissions to create Enterprise Applications in Entra ID are needed to perform the next steps.
Programs on your computer
The following free programs should be installed on your computer. I strongly recommend following the installation order as listed.
Lastly, install the following VSCode extensions, as they simplify working with Azure: PowerShell, Azure Account, Azure Automation, Azure Resource Manager, Azure Resources, Git History, GitLens
Create an Azure DevOps Project
To keep things organized and to avoid conflicts, I suggest creating a dedicated project in Azure DevOps. Ensure that you select a private repository.
If desired, create a new repository named “Runbooks”. Click “Initialize”.
Next, Git on your workstation needs to be initialized. Open a Git CMD and type the following commands:
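A sketch of those commands, assuming the “Runbooks” repository created above (the organization and project names are placeholders for your own):

```powershell
# identify yourself to Git (once per workstation)
git config --global user.name "Your Name"
git config --global user.email "you@example.com"
# clone the initialized Runbooks repository from Azure DevOps
git clone https://dev.azure.com/YourOrganization/YourProject/_git/Runbooks
cd .\Runbooks\
```

Cloning the initialized repository is the simplest way to get a local working copy that is already connected to Azure DevOps.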
Azure Automation and Function Apps offer serverless script execution.
This post explains key differences between Azure Automation and Function Apps and concludes with an opinion on the preferred choice as a Scheduled Task replacement.
Comparing side by side
Usage / Use cases
Both allow the execution of scheduled jobs which themselves are based on scripts. PowerShell is supported on both options.
Azure Automation runbooks are triggered by schedule, via webhook, through Event Grid integration, or by “watcher tasks”. The “result” is similar to working with Scheduled Tasks: from file operations to changes in the Azure infrastructure or any other deterministic job.
Function Apps also allow time triggers, but additionally offer Event Grid integration and some native events, for instance “on new file in storage account”. They can also expose web endpoints. That means you could pass in parameters via HTTP POST or the query string and get the result as JSON, for example. There is also the concept of “durable functions”, which more or less means that they are always ready to deliver results.
Audience
Azure Automation targets IT pros, formerly known as system administrators.
Function Apps, on the other hand, are more developer focused. SREs and DevOps engineers are also included here.
Manageability
Azure Automation uses “Accounts” as container objects for runbooks. Settings that apply to all runbooks, such as environment variables, secrets, or shared PowerShell modules, are defined at this level.
Activating “Managed Identity” allows granting permissions on other Azure services like for instance allowing starting and stopping of VMs.
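For illustration only (the resource group and VM names are placeholders, and the identity needs e.g. the Virtual Machine Contributor role on the target), a runbook using the managed identity could start a VM like this:

```powershell
# sign in as the Automation account's system-assigned managed identity
Connect-AzAccount -Identity | Out-Null
# start a VM - works once the identity has sufficient RBAC permissions
Start-AzVM -ResourceGroupName 'rg-demo' -Name 'vm-demo01'
```

No credentials appear in the runbook; the authentication is handled entirely by the managed identity.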
A runbook itself is a dedicated Azure object that contains the script, which may be bound to a schedule and an execution option.
Function Apps require an App Service Plan as foundation. The plan defines the supported frameworks and dictates the costs. On the other hand, it allows vertical and horizontal scaling!
Dependencies and other settings are defined in *.json files and require some research.
Typically, only one script is stored in a Function App. Multiple scripts are possible but require attention when maintaining the different folder structures.
Complexity Level
Azure Automation is in general “simpler” to set up and maintain, mostly because multiple jobs can be managed from one console. Another important aspect is that many settings are configurable via the graphical user interface.
Once you have figured out how they work, using them is convenient.
Function Apps require more consideration when setting up and while maintaining. Most aspects must be handled via configuration files. Setting up a schedule, for instance, requires knowledge of Cron syntax. In my experience, PowerShell module dependencies are regularly problematic. It is advisable to only import those which are essential for the script to work.
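For example, in a PowerShell Function App the module dependencies live in requirements.psd1; keeping it minimal avoids the dependency problems mentioned above (the versions below are illustrative, not recommendations):

```powershell
# requirements.psd1 - import only what the script needs
@{
    # 'Az' = '10.*'        # the full Az meta-module slows cold starts considerably
    'Az.Accounts' = '2.*'  # targeted sub-modules keep startup fast
    'Az.Compute'  = '5.*'
}
```

The timer schedule itself goes into function.json and uses NCRONTAB syntax, e.g. "0 */5 * * * *" for every five minutes.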
Mastering them involves a steeper learning curve.
Various Differences
Azure Automation knows the concept of “Hybrid Workers”. Behind that term are Arc-enabled Windows Servers which allow the execution of scripts on them. This gives access to resources within the datacenter, the Active Directory, or the private cloud while benefiting from all the advantages that Azure technologies bring.
Git integration with bidirectional sync. – The deployment pipeline is managed automatically.
Function Apps offer a rich set of integrations with Azure PaaS services. As they run within the context of an App Service Plan, horizontal and vertical scaling options are available to handle even very demanding jobs.
Supported languages: PowerShell, C#, NodeJS, Python and more.
Git integration with bidirectional sync and support for complex pipelines within Azure DevOps also allows sophisticated scenarios.
Conclusion
In my opinion, Azure Automation is the best fit as the better alternative to traditional Scheduled Tasks. Version control, change tracking, security, monitoring, redundancy options, and much more are the reasons for it.
Scheduled Tasks have been around since the early days of Windows Server. The Task Scheduler is the inbuilt component used to plan and execute jobs. Typically, system administrators use them to automate recurring activities which are defined in scripts.
While Scheduled Tasks are robust and easy to use, they have a couple of shortcomings.
This blog explains the advantages of Azure Automation and Function Apps over Scheduled Tasks.
Looking at Scheduled Tasks and their counterparts in Azure
Storage / Location
Scheduled Tasks are stored on an individual computer. – If the computer requires replacement or decommissioning, the task needs to be migrated.
Azure Services live in the cloud and have no fixed relation to a computer.
Execution
Scheduled Tasks are executed on an individual computer. – In case the computer is unavailable, the task will not run.
Azure services typically run in the cloud OR on designated computers.
Redundancy
Scheduled Tasks have no redundancy options out of the box. – They are typically stored only on 1 computer.
Azure services provide different redundancy options, even when execution happens on computers.
Change Tracking / Versioning
Scheduled Tasks are changed on demand without direct support for managing the change properly.
Azure services can be connected to a version control system, e.g. Azure DevOps, and allow change tracking and versioning via the underlying Git protocol.
Monitoring
Scheduled Tasks can be monitored on the individual computer. For centralized monitoring, the use of SCOM or an alternative server-monitoring system is required.
Azure services can send diagnostic and performance data with a single click to a central Log Analytics Workspace. Kusto queries allow monitoring and alerting.
Security
Scheduled Tasks typically run under the context of “System”, which has the highest level of permissions on the individual server. By using the computer object, permissions on remote locations like file shares can be granted.
In cases where a dedicated user account is created, it has to be granted sufficient permissions on the computer to run the task. Typically, the password is stored and never changed.
In worst cases, the credentials are stored in plain text within the script.
Azure services can leverage a “managed identity”, whose password is neither exposed nor requires attention. Furthermore, if credentials for different operations within the script are needed, the managed identity can be granted access to a Key Vault to obtain secret information securely.
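As a sketch (the vault and secret names are placeholders; the identity needs read access to secrets in the vault), retrieving a credential could look like this:

```powershell
# authenticate as the managed identity - no password stored anywhere
Connect-AzAccount -Identity | Out-Null
# fetch the secret; -AsPlainText returns the value as a string
$password = Get-AzKeyVaultSecret -VaultName 'kv-demo' -Name 'ServiceUser' -AsPlainText
```

The script never contains the secret itself, only the permission chain that leads to it.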
Costs
Scheduled Tasks are part of the Windows operating system and do not come with extra costs or licenses.
Azure services offer free tiers or are charged at minimal rates.
Round Up
To conclude, Azure Runbooks and Function Apps outperform the traditional Scheduled Tasks in many aspects.
Aruba Wireless technology is one of the market leaders. Squared Up can bring in visibility and reduce the MTTR (mean time to repair).
This post explains by example how to extract, transform, and visualize Aruba data using PowerShell and Squared Up.
Introduction
Aruba (part of HP Enterprise) provides wireless LAN solutions from small offices to large enterprise networks. Mobility controllers are used to manage access points centrally. By using a master controller, a management hierarchy can be set up while keeping one point of administration.
Network specialists either use the command line or the web interface. Automation and programmatic access are available via a common REST API.
Squared Up in version 5.3 (Jan 2022) offers many possibilities to retrieve data and visualize it in no time.
The most versatile capability is the native PowerShell integration. It allows any kind of data transformation or aggregation before passing it to the dashboard engine.
Prerequisites
At least Aruba OS 8.5 on the controller and PowerShell 5.1 on the Squared Up server are suggested.
A local user account “aruba_monitor” with password needs to be created on the controller.
Dry Run
Before coming to the details, verify that retrieving Aruba information from the Squared Up server works.
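A minimal dry-run is to call the login endpoint and inspect the result (the URL and password are placeholders; self-signed certificates may additionally require the validation bypass shown in the full script below):

```powershell
$API_BASE_URI = 'https://your-aruba-master-url'
# a status code of 0 in _global_result indicates a successful login
$session = Invoke-RestMethod -Uri "${API_BASE_URI}/v1/api/login" -Method Post `
    -Body 'username=aruba_monitor&password=your-password' -SessionVariable api_session
$session._global_result
```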
Getting all switches:
The script below retrieves all switches (controllers) and adds the state information that is needed for Squared Up. The information is stored in a CSV file, which allows instant display of the information. Use Scheduled Tasks to run the script regularly (e.g. every 5 minutes).
#region PREWORK Disabling the certificate validations
Add-Type -TypeDefinition @"
using System.Net;
using System.Security.Cryptography.X509Certificates;
public class TrustAllCertsPolicy : ICertificatePolicy {
    public bool CheckValidationResult(
        ServicePoint srvPoint, X509Certificate certificate,
        WebRequest request, int certificateProblem) {
        return true;
    }
}
"@
[Net.ServicePointManager]::CertificatePolicy = New-Object -TypeName TrustAllCertsPolicy
#endregion PREWORK
$API_BASE_URI = 'https://your-aruba-master-url'
$DeviceUsername = 'aruba_monitor'
$DevicePassword = 'your-password'
$session = Invoke-RestMethod -Uri "${API_BASE_URI}/v1/api/login" -Method Post -Body "username=$DeviceUsername&password=$DevicePassword" -SessionVariable api_session
$sessionID = $session._global_result.UIDARUBA
$allSwitches = Invoke-RestMethod -Uri "${API_BASE_URI}/v1/configuration/showcommand?command=show+switches+all&UIDARUBA=${sessionID}" -WebSession $api_session
$switches = $allSwitches.'All Switches'
$switchList = New-Object -TypeName System.Collections.ArrayList
foreach ($sw in $switches) {
    $stateTmp = ($sw.Status -split ' ')[0]
    $state = 'Healthy'
    if ($stateTmp -ieq 'up') {
        $state = 'Healthy'
    } elseif ($stateTmp -ieq 'down') {
        $state = 'Critical'
    } else {
        $state = 'Warning'
    }
    $swObj = [pscustomobject]@{
        ConfigID           = $sw.'Config ID'
        ConfigSyncTimeSec  = $sw.'Config Sync Time (sec)'
        ConfigurationState = $sw.'Configuration State'
        name               = $sw.Name
        IP                 = $sw.'IP Address'
        Status             = $stateTmp
        Location           = $sw.Location
        Model              = $sw.Model
        SiteCode           = $sw.Name.Substring(0,5).ToUpper()
        Type               = $sw.Type
        State              = $state
    }
    $null = $switchList.Add($swObj)
} #end foreach ($sw in $switches)
$switchList | Export-Csv -NoTypeInformation -Path C:\temp\aruba_switchlist.csv -Force
# log off - important, as otherwise all sessions will be blocked!
Invoke-RestMethod -Uri "${API_BASE_URI}/v1/api/logout" -Method Post -WebSession $api_session
Getting all access points:
For convenience, simply extend the script above (just before the log-off section) with the following lines, which export all access points. – Some meta information is appended, too:
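A sketch of such an extension, assuming the show ap database command and an 'AP Database' result collection (field names may differ between ArubaOS versions):

```powershell
# retrieve all access points via the existing session (insert before the log-off section)
$allAPs = Invoke-RestMethod -Uri "${API_BASE_URI}/v1/configuration/showcommand?command=show+ap+database&UIDARUBA=${sessionID}" -WebSession $api_session
$apList = New-Object -TypeName System.Collections.ArrayList
foreach ($ap in $allAPs.'AP Database') {
    $apObj = [pscustomobject]@{
        Name     = $ap.Name
        Group    = $ap.Group
        IP       = $ap.'IP Address'
        Status   = $ap.Status
        # meta information appended for the dashboard, mirroring the switch export
        SiteCode = $ap.Name.Substring(0,5).ToUpper()
        State    = if ($ap.Status -imatch '^up') { 'Healthy' } else { 'Critical' }
    }
    $null = $apList.Add($apObj)
}
$apList | Export-Csv -NoTypeInformation -Path C:\temp\aruba_aplist.csv -Force
```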
Another great use case is the PowerShell (Status Icon) tile in combination with your building layout. – See your access points' health and know exactly where they are:
Squared Up’s Web-API tile allows integrating information from any web-service that returns JSON data. With Polaris, a free and open-source framework, it is possible to build web-services in PowerShell alone.
This example explains the steps to create a web-service in Polaris which returns locked-out user information, and how to integrate it nicely into Squared Up.
Requirements
A Windows server that will host your Polaris web-service
On that server, PowerShell version 5.1
Active Directory Users & Computers and its module installed (part of the RSAT)
Administrative permissions to install Polaris (Install-Module -Name Polaris)
Create an empty directory adsvc inside C:\Program Files\WindowsPowerShell\Modules\Polaris
Open the Windows firewall to allow incoming connections to the port you specify, here 8082.
Limit this port to only accept requests from your Squared Up server.
NSSM – the Non-Sucking Service Manager – to run your web-service script as a service. https://nssm.cc/
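The firewall requirement above can be sketched with one PowerShell command (the RemoteAddress is a placeholder for your Squared Up server's IP):

```powershell
# allow inbound TCP 8082 only from the Squared Up server
New-NetFirewallRule -DisplayName 'Polaris adsvc (TCP 8082)' -Direction Inbound `
    -Protocol TCP -LocalPort 8082 -RemoteAddress '10.0.0.42' -Action Allow
```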
Realization
The solution consists of two PowerShell scripts. The first one exports locked user information into a JSON file. It needs to be scheduled via Task Scheduler to provide up-to-date information for the dashboard. It would also be possible to extract locked user information on each dashboard load, but that would be very slow.
Export Script
Create a directory C:\ScheduledTasks and create an empty text file in it. Name it Export-ADLockedAndExpiredUsers.ps1 and place the following content into it:
Import-module ActiveDirectory
$jsonFilePath = 'C:\ScheduledTasks\SquaredUpExports\ADLockedAndExpiredUsers.json'
# storing raw active directory information in ArrayList
$rawLockedUersList = New-Object -TypeName System.Collections.ArrayList
Search-ADAccount -LockedOut | Select-Object -Property Name,SamAccountName,Enabled,PasswordNeverExpires,LockedOut,`
LastLogonDate,PasswordExpired,DistinguishedName | ForEach-Object {
if ($_.Enabled) {
$null = $rawLockedUersList.Add($_)
}
}
# helper function to get account lock out time
Function Get-ADUserLockedOutTime {
param(
[Parameter(Mandatory=$true)]
[string]$userID
)
$time = Get-ADUser -Identity $userID -Properties AccountLockoutTime `
| Select-Object @{Name = 'AccountLockoutTime'; Expression = {$_.AccountLockoutTime | Get-Date -Format "yyyy-MM-dd HH:mm"}}
$rtnValue = $time | Select-Object -ExpandProperty AccountLockoutTime
$rtnValue
} #End Function Get-ADUserLockedOutTime
# main function that sorts and formats the output to fit better in the dashboard
Function Get-ADUsersRecentLocked {
param(
[Parameter(Mandatory=$true)]
[System.Collections.ArrayList]$userList
)
$tmpList = New-Object -TypeName System.Collections.ArrayList
$tmpList = $userList | Sort-Object -Property LastLogonDate -Descending
$tmpList = $tmpList | Select-Object -Property Name,`
@{Name = 'UserId' ; Expression = { $_.SamAccountName }}, `
@{Name = 'OrgaUnit' ; Expression = { ($_.DistinguishedName -replace('(?i),DC=\w{1,}|CN=|\\','')) -replace(',OU=',' / ')} }, `
Enabled,PasswordExpired,PasswordNeverExpires, `
@{Name = 'LastLogonDate'; Expression = { $_.LastLogonDate | Get-Date -Format "yyyy-MM-dd HH:mm" }}, `
@{Name = 'AccountLockoutTime'; Expression = { (Get-ADUserLockedOutTime -userID $_.SamAccountName) }}
$tmpList = $tmpList | Sort-Object -Property AccountLockoutTime -Descending
# adding a flag character for improved visualization (alternating)
$rtnList = New-Object -TypeName System.Collections.ArrayList
$itmNumber = $tmpList.Count
for ($counter = 0; $counter -lt $itmNumber; $counter++) {
    $flag = ''
    if ($counter % 2) {
        $flag = ''
    } else {
        $flag = '--'
    }
    $userProps = @{
        UserId               = $($flag + $tmpList[$counter].UserId)
        OrgaUnit             = $($flag + $tmpList[$counter].OrgaUnit)
        Enabled              = $($flag + $tmpList[$counter].Enabled)
        PasswordExpired      = $($flag + $tmpList[$counter].PasswordExpired)
        PasswordNeverExpires = $($flag + $tmpList[$counter].PasswordNeverExpires)
        LastLogonDate        = $($flag + $tmpList[$counter].LastLogonDate)
        AccountLockoutTime   = $($flag + $tmpList[$counter].AccountLockoutTime)
    }
    $userObject = New-Object -TypeName psobject -Property $userProps
    $null = $rtnList.Add($userObject)
    Write-Host $userObject
} #end for ()
$rtnList
} #End Function Get-ADUsersRecentLocked
if (Test-Path -Path $jsonFilePath) {
Remove-Item -Path $jsonFilePath -Force
}
# exporting result to a JSON file and storing it on $jsonFilePath
Get-ADUsersRecentLocked -userList $rawLockedUersList | ConvertTo-Json | Out-File $jsonFilePath -Encoding utf8
Publish Script
Create a directory C:\WebSrv and create an empty text file in it. Rename the file to Publish-ADData.ps1 and place the following content into it. This directory contains your web-service.
Import-Module -Name Polaris
$polarisPath = 'C:\Program Files\WindowsPowerShell\Modules\Polaris'
# runs on every request and ensures valid JSON output
$middleWare = @"
`$PolarisPath = '$polarisPath\adsvc'
if (-not (Test-path `$PolarisPath)) {
[void](New-Item `$PolarisPath -ItemType Directory)
}
if (`$Request.BodyString -ne `$null) {
`$Request.Body = `$Request.BodyString | ConvertFrom-Json
}
`$Request | Add-Member -Name PolarisPath -Value `$PolarisPath -MemberType Noteproperty
"@
New-PolarisRouteMiddleware -Name JsonBodyParser -ScriptBlock ([scriptblock]::Create($middleWare)) -Force
# the Get route is launched every time the web-service is called
New-PolarisGetRoute -Path "/adsvc" -ScriptBlock {
$rawLockedUersList = New-Object -TypeName System.Collections.ArrayList
$rawData = Get-Content -Path 'C:\ScheduledTasks\SquaredUpExports\ADLockedAndExpiredUsers.json'
$jsonData = $rawData | ConvertFrom-Json
if ($jsonData.Count -ne 0) {
$jsonData | ForEach-Object {
$null = $rawLockedUersList.Add($_)
}
}
$reportTime = Get-item -Path C:\ScheduledTasks\SquaredUpExports\ADLockedAndExpiredUsers.json `
| Select-Object -ExpandProperty LastWriteTime | Get-Date -Format "yyyy-MM-dd HH:mm"
$maxNoOfUsers = $null
$maxNoOfUsers = $request.Query['maxNoOfUsers']
$getReportTime = 'no'
$getReportTime = $request.Query['getReportTime']
$getLockedUserCount = 'no'
$getLockedUserCount = $request.Query['getLockedUserCount']
#if getLockedUserCount is yes then return the number of locked users
if ($getLockedUserCount -eq 'yes') {
$noProps = @{ 'number' = $rawLockedUersList.Count }
$noObj = New-Object psobject -Property $noProps
$response.Send(($noObj | ConvertTo-Json))
}
#if maxNoOfUsers is a number then return locked user information
if ($maxNoOfUsers -match '\d') {
$rawLockedUersList = $rawLockedUersList | Select-Object -First $maxNoOfUsers
$response.Send(($rawLockedUersList | ConvertTo-Json))
}
#if getReportTime is yes then the time of the export will be returned
if ($getReportTime -eq 'yes') {
$tmProps = @{
'Time' = $reportTime
'DisplayName' = [System.TimezoneInfo]::Local | Select-Object -ExpandProperty DisplayName
}
$tmObj = New-Object psobject -Property $tmProps
$response.Send(($tmObj | ConvertTo-Json))
}
} -Force
Start-Polaris -Port 8082
#Keep Polaris running
while($true) {
Start-Sleep -Milliseconds 10
}
Configure your web-service to run as a service
Download NSSM and store nssm.exe in C:\WebSrv. Run the following PowerShell lines to register Publish-ADData.ps1 as a service. – Use ISE or VSCode:
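A sketch of that registration, assuming nssm.exe in C:\WebSrv and a service name of ADDataSvc (the name is a placeholder):

```powershell
$nssm       = 'C:\WebSrv\nssm.exe'
$psExe      = (Get-Command powershell.exe).Source
$scriptPath = 'C:\WebSrv\Publish-ADData.ps1'
# register the script as a Windows service and start it
& $nssm install ADDataSvc $psExe "-ExecutionPolicy Bypass -NoProfile -File `"$scriptPath`""
& $nssm set ADDataSvc Description 'Polaris web-service publishing AD locked-out user data'
& $nssm start ADDataSvc
```

NSSM keeps the script running and restarts it if the process dies, which a plain Scheduled Task would not do for a long-running listener.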