MSP PowerShell & Automation Articles - Altaro DOJO | MSP (https://www.altaro.com/msp-dojo) - Managed Service Provider guides, how-tos, tips, and expert advice

How to Use Azure Automation Runbooks for MSP Customers
https://www.altaro.com/msp-dojo/azure-automation-runbooks/ - Thu, 04 Jul 2019
Azure Automation is a great tool for Managed Service Providers, with log monitoring, encrypted credentials, and hybrid cloud versatility. Here's how to use it!


Microsoft has made great strides in the hybrid cloud automation space with Azure Automation. For Managed Service Providers, this is a great tool to take advantage of when managing multiple clients. We can now run our "scheduled tasks" on-premises with Azure Automation and get the following benefits:

  • Azure Log Monitoring – We can configure alerts for scheduled tasks with Azure Monitor Logs, giving us more visibility into scheduled scripts that fail.
  • Encrypted Credentials – Azure Automation provides the ability to securely store credentials and call them from scripts. This one is huge, as we can easily call credentials without having to mess around with encrypted password files and certificates.
  • Hybrid Cloud Versatility with Scripts – We can choose whether to run a script on-premises, in Azure, or both, giving us the versatility to run our scripts anywhere.

Looking for general PowerShell Credential Encryption Guidance? See our guide on encrypting passwords with PowerShell!

Looking for ways an MSP can get started with Azure? We have more resources to help MSPs get started with Azure!

Setting Up an Automation Account

To get started using Azure Automation Runbooks, we need an Automation Account set up. If you don't have an Azure account already, sign up for the free trial. Then log in to the portal and search for Automation Accounts. Select the Add button to create a new account:

Automation accounts Azure

Fill out the required fields to create the Automation Account. Select Yes to create the Azure Run As account. We could create it manually if needed, but the easiest route is to have Azure create it when setting up the Azure Automation account:

add automation account

 

How to Create an Azure Log Analytics Workspace

In order to use Azure Automation Runbooks on-premises, we will need to set up a Log Analytics Workspace. This is a service in Azure that provides monitoring and logging for the various Azure services. In the Azure Portal, search for Log Analytics Workspaces and select Add:

log analytics workspace azure

Fill out the required fields. Pricing for Log Analytics is based on storage, so you're only paying for the storage used to hold your logs:

Now we need to link our Azure Automation account with our Log Analytics Workspace. As of right now, this has to be done through PowerShell, so open an administrative PowerShell window and run the following command to install the Az module:

Install-Module Az -Force

Then run Connect-AzAccount to connect to your Azure subscription:

Connect-AzAccount

Now we need to get the resource ID for our Automation Account. We'll use the Get-AzResource cmdlet and filter by resource type and our automation account name; in my example, it's LukeLabAA. We'll save the resource ID to a variable so we can use it shortly:

$AAResourceID = (Get-AzResource -ResourceType "Microsoft.Automation/automationAccounts" -Name "lukelabaa").resourceid

We will do the same for the workspace resource ID using the same cmdlet with the workspace resource type and the name of the workspace we just set up:

$WSResourceID = (Get-AzResource -ResourceType "Microsoft.OperationalInsights/workspaces" -Name "lukelabaa-LA").resourceid

To link the account with the workspace, we will use the Set-AzDiagnosticSetting cmdlet and reference both resource IDs:

Set-AzDiagnosticSetting -ResourceId $AAResourceID -WorkspaceId $WSResourceID -Enabled 1

To verify that the account is linked, we can see in the output that JobLogs and JobStreams are enabled:
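The same check can also be scripted. A hedged sketch using Get-AzDiagnosticSetting from the Az.Monitor module (the account name is the one used earlier in this example):

```powershell
# Re-fetch the automation account's resource ID and list its diagnostic
# log categories; JobLogs and JobStreams should show Enabled = True
$AAResourceID = (Get-AzResource -ResourceType "Microsoft.Automation/automationAccounts" `
                                -Name "lukelabaa").ResourceId

(Get-AzDiagnosticSetting -ResourceId $AAResourceID).Logs |
    Select-Object Category, Enabled
```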

Setting Up the Hybrid Worker Node

The Hybrid Worker Node is an agent installed on an on-premises server running either Linux or Windows. This agent executes runbook commands in the on-premises environment. The image below from Microsoft's documentation gives a good depiction of the high-level topology for communication between the Hybrid Runbook Worker and Azure Automation. You can group Runbook Workers together to create a redundant solution for your runbooks. Also, note the port 443 connectivity, which provides a secure way of transferring data back and forth between on-premises and cloud:

Setting Up the Hybrid Worker Node

In this example, we'll configure a Windows Server 2016 Core node with the Hybrid Worker agent. As this article is being published, there are two ways to set this up. There is a PowerShell script that can be downloaded from the PowerShell Gallery and run; however, it uses the AzureRM cmdlets, and running Connect-AzureRMAccount on Server Core produces an "Unable to load DLL 'IEFRAME.dll'" error. So we'll go over how to add the Hybrid Worker Node on Server Core using the manual process. First, we will need to run the following command with the resource group and name of the Log Analytics Workspace that we set up. This tells the workspace to push the worker components to the agent computer when we add it to the workspace in the next steps:

Set-AzureRmOperationalInsightsIntelligencePack -ResourceGroupName LukeLabAA-RG -WorkspaceName LukeLabAA-LA -IntelligencePackName "AzureAutomation" -Enabled $true

Next, we will download the agent from our workspace. Navigate to the Log Analytics Workspace and select Advanced Settings on the left-hand side. Select Connected Sources, and since we are setting up a Windows node, choose Windows Servers. Select Download Windows Agent (64 bit) and transfer it to the Hybrid Worker. Also, make note of the Workspace ID and Primary Key; these are needed to point the agent installation at the Azure environment:

When we run the executable, click Next through the wizard. Select Connect the agent to Azure Log Analytics (OMS) and click Next:

Paste in the Workspace ID and Primary Key from the previous step, choose Azure Commercial, and click Next, then Install:

Wait a few minutes for the agent to show up in the workspace.

When the agent installs, the Hybrid Registration PowerShell module is copied down to the Hybrid Worker. So, on the Hybrid Worker node, navigate to "C:\Program Files\Microsoft Monitoring Agent\Agent\AzureAutomation\<version>\HybridRegistration" and import the module:

Import-Module .\HybridRegistration.psd1

Then run the following command. The URL and token are obtained from the Azure Automation account: select Keys on the left-hand side, where the Primary Key serves as the token and the URL is displayed. Also include the Hybrid Worker Group name that you would like to use; if the group specified doesn't exist, it will be created automatically:

Add-HybridRunbookWorker –GroupName LukeLabOnPrem -EndPoint "https://eus2-agentservice-prod-1.azure-automation.net/accounts/d3d71ed2-e761-4333-b333-fce7b97e3333" -Token "0B/RNjlieKGSk2QjXmsuGoQtSQW0QVb6vfjqIY2342KJOiYOmedVP/vY+vpP8sfwdomliECn/GTasWmViJg=="

Now when we go to our Automation Account and select Hybrid Worker Groups we can see our new hybrid worker under the LukeLabOnPrem group we specified:
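The portal view can also be reproduced from PowerShell. A sketch, assuming the Az.Automation module and the resource group and account names used in this example:

```powershell
# List the hybrid worker groups (and their registered workers) for the account
Get-AzAutomationHybridWorkerGroup -ResourceGroupName "LukeLabAA-RG" `
                                  -AutomationAccountName "LukeLabAA"
```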

How to Use an Azure Automation Runbook

Let's test out running a runbook through our new Hybrid Worker. I have installed the VMware PowerCLI module on the node. We will run a simple script that connects to our ESXi host and displays a list of all the VMs. First, let's add some credentials. This is one of the coolest features of Azure Automation. Go to the Automation Account and select Credentials on the left-hand side. Choose the Add a Credential option and input some credentials; in this example, I'm inputting the credentials to my VMware environment so we can use them in our runbook:

credentials

Now, let’s create a Runbook. In the Automation Account select Runbooks on the left-hand side and choose Create a Runbook:

Runbooks

Fill out the required fields. I am going to create a runbook called VMwareVMs; note that there can't be spaces in the name:

How to Use an Azure Automation Runbook

Another slick feature to point out: while we are creating our scripts in the runbook editor, we can select Assets on the left-hand side, choose the credentials that we saved, and select Add to canvas. This pastes in the exact command we need to retrieve those credentials:

Now that I have my quick script to retrieve VM info, I’ll Save and Publish the runbook:
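For reference, a minimal sketch of what such a runbook body might look like, using PowerCLI's Connect-VIServer and Get-VM. The credential asset name "VMwareCreds" and the host address are placeholders, not values from the original walkthrough:

```powershell
# Pull the credential asset stored in the automation account
$cred = Get-AutomationPSCredential -Name "VMwareCreds"

# Connect to the ESXi host and list the VMs it is running
Connect-VIServer -Server "esxi01.lukelab.local" -Credential $cred | Out-Null
Get-VM | Select-Object Name, PowerState, NumCpu, MemoryGB
```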

When we go to Start the runbook, we have the option to have it run from our Hybrid Worker. I also selected the LukeLabOnPrem worker group:

The runbook kicks off on-premises and retrieves the VM information with the Get-VM PowerCLI cmdlet, proving that our runbook is executing and connecting to on-premises infrastructure:

Wrap-Up

Managed Service Providers should definitely take advantage of the hybrid worker option with Azure Automation Runbooks. It can be a great tool to have in your back pocket, not only for clients that have hybrid cloud solutions, but also for MSP cloud solutions that require an on-premises presence in client environments. Instead of setting up scheduled tasks natively in Windows Server, where there is no centralized reporting or visibility into the status of a failed task, consider using Azure Runbooks with the power of Log Analytics alerting.

Let me know in the comments below of ways you’ve been able to utilize Azure Runbooks with Hybrid Workers in a hybrid cloud scenario.

Thanks for reading!

Managing Applications with COM Objects
https://www.altaro.com/msp-dojo/com-objects/ - Thu, 21 Feb 2019
PowerShell is awesome. There are tons of cmdlets readily available, but sometimes you need a custom solution. Enter COM objects. Here's how to use them.


For Managed Service Providers, automation is essential for customer satisfaction. Each customer has their own needs that must be attended to, and they all want them taken care of as quickly as possible. Luckily, the clever use of PowerShell allows us to live up to the expectations and SLAs we've established with our customers. There are many PowerShell cmdlets available that allow us to manage various devices and software in IT. However, sometimes we need to provide a custom solution where there isn't a PowerShell cmdlet available. That's where COM objects come into play: they can be used to manage applications when no cmdlet exists.

What is COM?

COM stands for Component Object Model, a Microsoft platform standard designed to allow code to be reused between multiple applications, hosts, and platforms. PowerShell can use COM objects to manipulate applications and perform specific tasks that would otherwise be impossible due to a lack of cmdlets. I'm going to demonstrate how to use COM objects in PowerShell to input data directly into an Excel spreadsheet and save it.

Creating the Excel COM Object With PowerShell

First, let's create our COM object instance in PowerShell by using the New-Object cmdlet with the -ComObject parameter, specifying the Excel.Application class. You can find the various classes that can be turned into COM objects by browsing the HKEY_CLASSES_ROOT registry hive (for example, HKEY_CLASSES_ROOT\Excel.Application):

You can also use this PowerShell one-liner from PowerShellMagazine.com to list all available COM objects:

Get-ChildItem HKLM:\Software\Classes -ErrorAction SilentlyContinue | Where-Object {
   $_.PSChildName -match '^\w+\.\w+$' -and (Test-Path -Path "$($_.PSPath)\CLSID")
} | Select-Object -ExpandProperty PSChildName

COM Object in PowerShell

We’ll also set it as a variable so that we can use the COM object to interface with Excel and allow us to manipulate the application.

$Excel = New-Object -ComObject excel.application

When we pipe our $Excel variable to Get-Member, we can see all the various properties that can be set as well as the various Methods that we can use to manage Excel:

$Excel | Get-Member

Creating A New Excel WorkSheet

Now that we have our COM object for Excel set up, we can start performing administrative tasks such as creating a new worksheet and naming it. I want to get the running processes on my computer, input them into an Excel worksheet, and finally save it. We'll start by creating a new workbook using the Add() method of the Workbooks property, and we'll capture it as a variable so we can then rename the worksheet. The syntax looks like this:

$WorkBook = $Excel.workbooks.add()

Now, to rename the worksheet from "Sheet1" to "CPU Information", we will capture the default Sheet1 as a worksheet object and then modify its Name property:

$Sheet = $workbook.sheets("sheet1")
$sheet.name = "CPU Information"

To see our work, we can make the Excel window visible with the following syntax:

$excel.visible = $true

Importing Data from PowerShell into Excel

Now let's turn up the complexity. I want to import some of the data from Get-Process into my Excel spreadsheet and save it. We can input data into each cell in Excel by using the Range() method of the Sheet object that we created previously. Since I want headers for the information I'll be inputting, I'll fill row 1 with the header information. The syntax will look like the following:

$Sheet.range("A1:A1").cells="Process Name"
$Sheet.range("B1:B1").cells="CPU"
$Sheet.range("C1:C1").cells="Memory(MB)"
$Sheet.range("D1:D1").cells="Description"

Now when we look at the Excel Spreadsheet, we can see our new headers:

The next step is to create the logic for filling each column with its respective data from Get-Process. We'll capture the output of Get-Process in a variable, then use a ForEach loop to add a row to Excel for each process we collect. I've created a counter $n that represents the row number for each process; it increments by 1 for each process. Because row 1 already contains headers and we don't want to overwrite it, the counter starts at 1 before the ForEach loop, so the first process lands in row 2. The syntax looks like this:

$processes = Get-Process -Name * | Where-Object {$_.CPU -gt 0} | Sort-Object CPU -Descending

# Start at 1 so the first increment lands on row 2, below the headers
$n = 1

Foreach ($process in $processes)
{
    $n += 1

    $Sheet.range("A$n" + ":" + "A$n").cells = $process.name
    $Sheet.range("B$n" + ":" + "B$n").cells = ($process.cpu).tostring()
    $Sheet.range("C$n" + ":" + "C$n").cells = ($process.ws/1mb).tostring()
    $Sheet.range("D$n" + ":" + "D$n").cells = $process.description
}

When we take a peek at our Excel spreadsheet we can see the data has been populated in each column:

Now let's save our data using the Sheet object's SaveAs method:

$sheet.saveas("c:\temp\cpu.xlsx")
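One caveat worth adding: Excel keeps running in the background after the script ends unless the COM object is released explicitly. A cleanup sketch using the variables created above:

```powershell
# Close Excel and release the COM references so EXCEL.EXE doesn't linger
$Excel.Quit()
[void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($Sheet)
[void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($WorkBook)
[void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($Excel)
[GC]::Collect()
```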

 

Conclusion

By creating an Excel COM object in PowerShell, we were able to create an Excel workbook, collect and import data into our spreadsheet, and finally save it. You can see now how powerful manipulating COM objects can be. We could use this process to create Excel scripts that clients can use to increase their efficiency. Outlook and Word can be manipulated in the same fashion. So think about the use cases where clients need to have their Outlook profile recreated to fix an email issue, or for doing a migration to Office 365. Instead of sending someone onsite to run from workstation to workstation setting up users' email, a script can be created to set up the new Office 365 Outlook profile and sent out to each user to run on their own time. What are some other ways that you've used or plan on using COM objects with PowerShell? Let me know in the comments below!

Want to learn more about how PowerShell can help you? Read on

Building PowerShell Tools for MSPs series

Further PowerShell reading

 

7 Ways You Can Use PowerShell To Supercharge Your MSP Operations
https://www.altaro.com/msp-dojo/powershell-msp-operations/ - Fri, 08 Feb 2019
What can PowerShell do for your MSP? LOTS. In fact, the possibilities can be daunting, so we've put together this guide to the most important applications!


I've seen a lot of discussions online recently from IT pros who work for managed service providers and are finding it difficult to use PowerShell in their environment. Let's be honest, there are many hurdles to overcome in the MSP workspace. You are managing multiple customers, and typically they all have their own domains and networks. Each customer environment is its own unique IT ecosystem, and most of the time you can't just run Invoke-Command from the laptop at your desk to manage them. One of the best ways to deal with this is automation, paired with a better understanding of what PowerShell is capable of. It's much easier to automate tasks when you know what you can (and cannot) do with PowerShell. To give you an idea, I'm going to go over 7 PowerShell techniques in this post that you can use to supercharge your MSP operations.

This article is a great introduction for newcomers to using PowerShell in an MSP setting, but current users may also learn a thing or two about optimal applications. Let's get started!

1. Using SFTP For Transferring Files

This one is HUGE. Being able to transfer files back and forth with a customer location opens up so many doors for automation. Many situations can take advantage of this, such as pushing installation files to client workstations, but one unique way that I've used it in the past was when troubleshooting network issues. We had a customer that was dropping internet connectivity intermittently every day; they were getting upset at us and wanted us to fix it. Their ISP could see that they were oversubscribing their connection, which made it appear as if it was cutting in and out. Now the witch hunt started: who was using all the bandwidth?

They didn't have expensive network equipment that allowed them to audit bandwidth consumption per device, so options were limited. I ended up creating a script that would run for hours at a time and create a log file of network traffic on the NIC on each workstation. It would then upload the log hourly to our SFTP server. We used our RMM agent to deploy this script to all the customer's workstations, and when they were having bandwidth issues we were able to analyze each workstation's log files on the SFTP server and determine the culprit. As you can see, there are many, many situations where SFTP can be used with PowerShell.
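To make this concrete, here is a hedged sketch of an SFTP upload using the community Posh-SSH module (cmdlet names follow Posh-SSH 3.x; the server name, credential, and paths are placeholders):

```powershell
Install-Module Posh-SSH -Scope CurrentUser -Force

# Open an SFTP session to the MSP's server and push the local log up
$cred = Get-Credential
$session = New-SFTPSession -ComputerName "sftp.example-msp.com" -Credential $cred

Set-SFTPItem -SessionId $session.SessionId -Path "C:\Logs\NetTraffic.log" -Destination "/uploads"

Remove-SFTPSession -SessionId $session.SessionId | Out-Null
```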

2. Encrypting Passwords to Run Scripts

One of the hurdles when creating scripts is securing access so only a specific user can run a certain script. Maybe you have a customer that wants their managers to be able to run a specific automated task for HR. The ability to encrypt passwords lets you create service accounts that can be used in combination with encrypted password files to allow end-users to run their automated tasks securely. Not to mention this is useful for your own scripts as well!
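As a quick sketch, the built-in DPAPI route looks like this. The file is only decryptable by the same user on the same machine, and the account name here is a placeholder:

```powershell
# One-time step: prompt for the password and store it encrypted
Read-Host "Enter service account password" -AsSecureString |
    ConvertFrom-SecureString | Out-File "C:\Scripts\svc.pw"

# In the scheduled script: rebuild the credential from the encrypted file
$secure = Get-Content "C:\Scripts\svc.pw" | ConvertTo-SecureString
$cred = New-Object System.Management.Automation.PSCredential("DOMAIN\svc-task", $secure)
```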

Learn how to encrypt passwords using PowerShell

3. HTML Reports

In my five years of working for an MSP, I've run into countless situations where a customer asked for a specific report that our current systems were unable to create. Luckily, PowerShell can be used to create custom HTML reports that look professional and can be emailed out on a schedule. One example from my past: a customer was working on a big application rollout and was going to be creating multiple VMs on their hypervisor cluster. He wanted a daily report on the redundancy state of his cluster, meaning whether it was able to host all the VM workloads in the event of a host failure. We created an HTML report that would calculate the host resources and provide an email with an HTML table specifying the host resources and whether or not the cluster was "N+1". The client was very grateful that we were able to accommodate his request.
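A common pattern for this kind of report is ConvertTo-Html plus Send-MailMessage. A sketch, where the SMTP details and addresses are placeholders and the data source is just local disk capacity rather than the cluster math described above:

```powershell
# Basic styling so the emailed table doesn't render as bare HTML
$style = "<style>table{border-collapse:collapse} td,th{border:1px solid #ccc;padding:4px}</style>"

$report = Get-Volume |
    Select-Object DriveLetter, FileSystemLabel,
        @{ n = 'FreeGB'; e = { [math]::Round($_.SizeRemaining / 1GB, 1) } } |
    ConvertTo-Html -Head $style -PreContent "<h2>Daily Capacity Report</h2>" |
    Out-String

Send-MailMessage -From "reports@msp.example.com" -To "client@example.com" `
    -Subject "Daily Capacity Report" -Body $report -BodyAsHtml -SmtpServer "smtp.example.com"
```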

Learn how to create custom HTML reports using PowerShell

4. Ticketing systems and APIs

Ticketing systems are the bread and butter of an MSP. Being able to automate processes in your ticketing system can produce game-changing efficiency in the company. There are many ticketing systems with PowerShell modules already available on the PowerShell Gallery. These modules can provide the ability to create and search through tickets. Also, using PowerShell with RESTful APIs is another must-have technique that allows PowerShell to interact with other applications and services (like your ticketing system). This can open so many doors for automation in the MSP workspace, and the sky really is the limit with this option. I've found it's simply best to start tinkering with this option and see what it can do for you in your own individual case.
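Most ticketing APIs follow the same REST pattern, so here is a hedged sketch with Invoke-RestMethod. The URL, auth header, and body fields below are hypothetical; check your vendor's API documentation for the real ones:

```powershell
$headers = @{ Authorization = "Bearer $apiToken" }   # token issued by the ticketing vendor

# Build the ticket payload as JSON
$body = @{
    summary  = "Backup failed on CLIENT-DC01"
    priority = "High"
} | ConvertTo-Json

Invoke-RestMethod -Uri "https://ticketing.example.com/api/v1/tickets" `
    -Method Post -Headers $headers -Body $body -ContentType "application/json"
```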

Learn how to use PowerShell to optimize your ticketing system

5. Deploying Software

PowerShell can be used in combination with your RMM tooling to quickly deploy software to workstations. This has saved me many hours and trips to customer sites. One time I had a customer call into the helpdesk in a panic because they had upgraded their main business software and the current version of Java on their workstations was no longer compatible. In a matter of minutes, we were able to use PowerShell and our RMM agent to push out the Java upgrade to all the workstations.
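A minimal sketch of that kind of push, assuming PowerShell Remoting is enabled and the installer is staged on a share (all names and paths are placeholders):

```powershell
$computers = Get-Content "C:\Scripts\Workstations.txt"

Invoke-Command -ComputerName $computers -ScriptBlock {
    # Runs on each workstation: silent MSI install, no reboot
    Start-Process msiexec.exe `
        -ArgumentList '/i "\\fileserver\installs\app.msi" /qn /norestart' -Wait
}
```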

6. Windows Updates

There are many useful PowerShell modules on the PowerShell Gallery; I recommend you take a peek at it if you haven't already. One of the most useful modules I've found is PSWindowsUpdate. This module allows IT pros to manage Windows Updates with PowerShell. It can be used as a step in larger scripts or just whenever you need it! I can't count how many times I've built a server and had to manually kick off Windows Updates for a few hours to get the OS up to date. It can also be used to remotely kick off Windows Updates on a large list of workstations across multiple customers, making a mass rollout of a zero-day patch, for example, much easier and less time-consuming.
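A sketch of the basic PSWindowsUpdate workflow, run from an elevated session:

```powershell
Install-Module PSWindowsUpdate -Force

# See what's pending, then install everything and reboot if required
Get-WindowsUpdate
Install-WindowsUpdate -AcceptAll -AutoReboot
```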

Learn how to manage Windows Updates using PowerShell

7. Dashboards

The PowerShell Universal Dashboard module is one of the greatest up-and-coming modules in the PowerShell community. It allows you to easily create your own customized dashboards, and they look amazing! For MSPs, this is a great tool that can be used to display and analyze data not only for customers but for internal use as well. I recommend checking this out and playing around with it. Just like the API section above, this is one you have to tinker with to see how it can fit into your own MSP's workflow. It will; you just need to figure out exactly what your needs are first.

PowerShell dashboard
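A minimal sketch of a dashboard using the UniversalDashboard.Community module's 2.x syntax (the module and cmdlet names changed in the later "PowerShell Universal" product):

```powershell
Install-Module UniversalDashboard.Community -Force

# A one-card dashboard served on a local port
$dashboard = New-UDDashboard -Title "MSP Overview" -Content {
    New-UDCard -Title "Open Tickets" -Text "42"
}
Start-UDDashboard -Dashboard $dashboard -Port 10000
```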

Learn how to be a pro with the PowerShell Universal Dashboard

Wrap-Up

PowerShell isn't always going to fit into every solution (though it will come close!), and with the complexity of a Managed Service Provider business, it can be rather difficult at times. However, in knowledgeable hands, it can be used to overcome many challenges in the MSP workspace. Let me know in the comments below about other ways you've used PowerShell in your MSP. We'd love to hear about it, and I'm sure our other readers would be grateful for the mind share!

Thanks for reading!

Creating Web Scraping Tools for MSPs with PowerShell
https://www.altaro.com/msp-dojo/web-scraping-tool-for-msps/ - Thu, 31 Jan 2019
Scraping web pages is a huge time saver for MSPs and can be used for many tasks. Here's how to create your own web scraping tool using PowerShell!


Building a web scraping tool can be incredibly useful for MSPs. There isn't always an API or PowerShell cmdlet available for interfacing with a web page. However, there are other tricks we can use with PowerShell to automate the collection and processing of a web page's contents. This can be a huge time saver in cases where collecting and reporting on data from a web page can save employees or clients hundreds of hours. Today I'm going to show you how to build your own web scraping tool using PowerShell. Let's get started!

We are going to scrape the BuildAPCSales subreddit. This is an extremely useful web page, as many users contribute the latest deals on PC parts. As an avid gamer, I find it extremely useful to check routinely and report back on any deals for the PC parts I'm looking for. Also, because of the limited stock for some of these sales, it is extremely beneficial to know about these deals as soon as they are posted. I know there is a Reddit API available that we could use, but for the purpose of demonstrating how to make a web scraping tool, we are not going to use it.

Web Scraping with Invoke-WebRequest

First, we need to look at how the website is structured. Web scraping is an art: since many websites are structured differently, we need to look at the way the HTML is laid out and use PowerShell to parse through it to gather the info we are looking for. Let's take a look at the structure of BuildAPCSales. We can see that each sale is displayed with a big header containing all the info we want to know, the item and the price:

scraping a web page

Now, let's use the Web Developer tool in our browser to further inspect the HTML of these posts. I am using Firefox in this example. I can see that each post is wrapped in an HTML "h2" tag:

Web Developer tool

Let’s try scraping all of our “h2” tags and see what we come up with. We will use the Invoke-WebRequest PowerShell cmdlet and the URL to the Reddit webpage and save it as a variable in order to collect the HTML information for parsing:

$data = invoke-webrequest -uri "https://www.reddit.com/r/buildapcsales/new/"

Now we take our new variable and parse through the HTML data looking for any elements tagged "h2". Then we run through each object and display the "innertext" property, which holds the text content of the tag we are searching for:

$data.ParsedHtml.all.tags("h2") | ForEach-Object -MemberName innertext

Yay, it worked! We are able to collect all the deals posted:

I like what we have so far, but I want not only the post headings but also the links for each sale. Let's go back to the webpage formatting and see what else we can scrape to get the links. Using the inspection tool in Firefox (CTRL + SHIFT + C) and clicking on one of the sale links, I can see the HTML snippet for that post:

It looks like these are tagged as "a", which defines a hyperlink in HTML. So we want to run a search for all HTML objects tagged "a", and we'll want to output the "href" for these instead of the "innertext" as in the example above. But that would give us all hyperlinks on the page, so we need to narrow down our search to pull only the links that are for sales. Inspecting the web page further, I can see that each sale hyperlink has the class name "b5szba-0 fbxLDD". So we'll use this to narrow our search:

$data.ParsedHtml.all.tags("a") | Where-Object { $_.className -eq 'b5szba-0 fbxLDD' } | ForEach-Object -MemberName href

Now we have the links to the items for each post. We now have all the information we are looking for:

Processing Our Web Information

Now that we have the information we want, we need to process it. I would like to create a table row for each sale and its respective link. We can do this with the following syntax:

$data = Invoke-WebRequest -Uri "https://www.reddit.com/r/buildapcsales/new/"

$Sales = $data.ParsedHtml.all.tags("h2") | ForEach-Object -MemberName innertext
$Links = $data.ParsedHtml.all.tags("a") | Where-Object { $_.className -eq 'b5szba-0 fbxLDD' } | ForEach-Object -MemberName href

# Initialize the table so re-runs don't append to a stale array
$table = @()
Foreach ($Sale in $Sales)
    {
    $index = $Sales.IndexOf($Sale)
    $row = New-Object -TypeName psobject
    $row | Add-Member -MemberType NoteProperty -Name Sale -Value $Sale
    $row | Add-Member -MemberType NoteProperty -Name Link -Value $Links[$index]
    [array]$table += $row
    }


When we go to look at our $table, we can see the correct info:

Taking It Further

Now, let's take it a step further and make this web scraping script useful. I want to be notified by text if there is a sale on a specific PC component that I'm looking for. Currently, I'm searching for a good 144Hz monitor. So, to get notified of the best deals, I created a script that runs as a scheduled task every 15 minutes on my computer. It scrapes the Reddit web page for any monitor deals, notifies me of the deal via text, and then makes note of the deals that have been sent to me in a text file to ensure that I don't get spammed repeatedly with the same deal.

Also, since I don't have an SMTP server at my house, I've set up a Gmail account to send email messages via PowerShell. Since I want to receive these alerts via text and not email, I am sending the email alerts to my phone number, which can be done with every popular carrier. I'm using Google Fi, so I simply put in my phone number with @msg.fi.google.com and the email goes right to my phone as a text. I've also encrypted my Gmail account password into a file using the process outlined in our blog post about encrypted passwords in PowerShell. After everything's done, the syntax looks like this:

#Edit this to change the string to web scrape for
$PCPart =  "Monitor]"
#Edit this to change the email address to send alerts to
$EmailAddress = "1234567890@msg.fi.google.com"

#Collect information from web page
$data = invoke-webrequest -uri "https://www.reddit.com/r/buildapcsales/new/"

#filter out headers and links
$Sales = $data.ParsedHtml.all.tags("h2") | ForEach-Object -MemberName innertext
$Links = $data.ParsedHtml.all.tags("a") | Where-Object { $_.className -eq 'b5szba-0 fbxLDD' } | ForEach-Object -MemberName href

#create table including the headers and links
$table = @()
Foreach ($Sale in $Sales)
    {
    $index = $Sales.IndexOf($Sale)
    $row = New-Object -TypeName psobject
    $row | Add-Member -MemberType NoteProperty -Name Sale -Value $Sale
    $row | Add-Member -MemberType NoteProperty -Name Link -Value $Links[$index]
    [array]$table += $row
    }

#analyze table for any deals that include the PC Part string we are looking for
If ($table.Sale -match $PCPart)
    {
    $SaletoCheck = $table | where-object {$_.sale -match $PCPart}
    ForEach($sale in $SaletoCheck)
        {
            if ((Get-Content C:\scripts\SaleDb.txt) -notcontains $sale.link)
            {
                #Save link to text file so we don't send the same deal twice
                $sale.link | out-file C:\scripts\SaleDb.txt -Append

                #obtain password for gmail account from encrypted text file
                $password = Get-Content "C:\Scripts\aespw.txt" | ConvertTo-SecureString 
                $credential = New-Object System.Management.Automation.PsCredential("lukeautoscript@gmail.com",$password)

                $props = @{
                    From = "lukeautoscript@gmail.com" 
                    To = $EmailAddress
                    Subject = $sale.sale
                    Body = $sale.link
                    SMTPServer = "smtp.gmail.com"
                    Port = "587"
                    Credential = $credential
                    }
                Send-MailMessage @props -UseSsl
              }
        }
    } 

We wait for a sale for a good monitor to pop up and see our end result:

Wrap-Up

As you can see, web scraping tools can be incredibly powerful for parsing useful web pages. They open up so many possibilities to create useful scripts that one might have thought were not possible. Like I said previously, it is an art; much of the difficulty depends on how the website is formatted and what information you are looking for. Feel free to use my script in the demo if you want to configure your own notifications for PC part deals. If you’re curious, I ended up getting a good deal on an Acer XFA240, and the picture looks amazing at 144Hz! Let me know in the comments below if you’ve created or plan on creating a web scraping tool.

The post Creating Web Scraping Tools for MSPs with PowerShell appeared first on Altaro DOJO | MSP.

]]>
https://www.altaro.com/msp-dojo/web-scraping-tool-for-msps/feed/ 2
How to Teach PowerShell to MSP Staff with PSKoans https://www.altaro.com/msp-dojo/teach-powershell-pskoans/ https://www.altaro.com/msp-dojo/teach-powershell-pskoans/#comments Fri, 21 Dec 2018 13:11:02 +0000 https://www.altaro.com/msp-dojo/?p=1189 Automation is vital for MSPs who want to grow and your MSP staff need to get onboard with it. PSKoans is an easy way to learn automation with PowerShell

The post How to Teach PowerShell to MSP Staff with PSKoans appeared first on Altaro DOJO | MSP.

]]>

If you don’t use automation in your MSP then, quite frankly, you’re doing it wrong. It’s pretty much the standard way of doing things these days, and if you’re not doing it, your competitor is and will probably be providing a better service because of it.

For MSPs, an employee who is proficient with PowerShell is far more valuable than one who isn’t. In fact, Jeffrey Snover, the inventor of PowerShell, predicts that in the future those who refuse to learn some sort of automation language like PowerShell will gradually be replaced by those who do. The skill set is that vital to a company.

Imagine a scenario where we need to know if a specific event has occurred in the event logs on all client servers. If we had no one with PowerShell skills, we would have to manually dig through the event logs of each server and spend hours filtering through each one. On the other hand, if we have an employee who is adept at PowerShell they can write a script that will automatically dig through the event logs of each server and report back if the event exists.
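As a rough illustration, a script like that can be only a handful of lines. The sketch below is hedged: the server names and the event ID are placeholders, not real infrastructure.

```powershell
# Placeholder list of client servers and the event ID to hunt for
$Servers = "CLIENT-DC01", "CLIENT-APP01", "CLIENT-SQL01"
$EventId = 4625  # example: failed logon events

ForEach ($Server in $Servers) {
    # Query only the most recent matches instead of pulling the whole log
    $Events = Get-WinEvent -ComputerName $Server -FilterHashtable @{LogName = "Security"; Id = $EventId} -MaxEvents 5 -ErrorAction SilentlyContinue
    If ($Events) {
        Write-Output "$Server - event $EventId found, most recent at $($Events[0].TimeCreated)"
    }
}
```

A loop like this finishes in minutes against dozens of servers, versus hours of clicking through Event Viewer by hand.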

The benefit is just incomparable, which is why it is important for employees to be trained up on how to use PowerShell. Luckily, there are free community-driven tools like PSKoans that allow IT Pros to interactively learn and practice PowerShell to improve their skills and their worth to the company.

What is PSKoans?

PSKoans is a project on GitHub started by Joel Sallow. It allows users to learn and practice PowerShell right through the console by providing “challenges” or “questions” that must be completed before progressing on to more challenging questions. The questions start off simple and eventually progress to intermediate and advanced PowerShell concepts. The module makes clever use of Pester, which is an automated unit test framework for PowerShell. It provides an interactive way to learn PowerShell by using PowerShell!
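For readers who haven’t seen Pester before, a unit test looks roughly like this (the function here is a made-up example for illustration, not part of PSKoans):

```powershell
# A trivial function to test
function Add-Numbers ($a, $b) { $a + $b }

# Describe groups related tests, It defines a single test case,
# and Should performs the assertion
Describe "Add-Numbers" {
    It "adds two integers" {
        Add-Numbers 2 3 | Should -Be 5
    }
}
```

PSKoans turns this structure around: it hands you failing assertions and asks you to make them pass.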

How to Install

You can literally start using PSKoans within 5 minutes. All you need to get started is a machine with PowerShell 5.1 or higher (PowerShell Core included). Just open up an administrative PowerShell console of the required version and run the following command to install the Pester module (if you don’t already have it installed):

Install-Module -Name Pester

After the Pester module is installed, we can now install the PSKoans module by using the same Administrative PowerShell console:

Install-Module -Name PSKoans

Now we are set to start using PSKoans. Simple, right? The ease of installing PSKoans really makes it a “must have” tool for employees to use for sharpening their PowerShell skills.

How It Works

Now that we just installed PSKoans in under 5 minutes, let’s get started using it. To start using PSKoans, simply open up a PowerShell console and type:

Measure-Karma

Now, the journey begins! Because this is all done through the PowerShell console, it may be a little confusing at first, but let’s go over what we are looking at. The goal is to reach “Enlightenment” by completing each challenge. The first challenge is in the red text. The script-block shown is incorrect and we need to input the correct answer where the “_” symbol is. For the first question, the expected outcome is $true but we are currently getting null with this comparison script block. We need to edit the “AboutAssertions.Koans.ps1” file with the correct syntax and run a pester test against the koan to check if we pass the question:

PSKoans PowerShell

Let’s answer the first problem by running the following command to open up the .ps1 and edit line 27 to contain the proper code:

Measure-Karma -Meditate

The command checks if Visual Studio Code is in $env:Path and will open up VSCode to use for editing the .ps1 files:

PSKoans Visual Studio Code

Otherwise, it will simply open up the Koans folder directory in Explorer, which is housed in UserFolder\PSKoans. In this case, we will just want to open up the .ps1 file with PowerShell ISE (or, even better, install Visual Studio Code; it’s free!):

To answer the question, we replace the “_”  on line 27 with “$true” and save the file:
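Paraphrased from the AboutAssertions koan (the exact wording in your copy of the file may differ), the edit amounts to filling in the blank so the assertion passes:

```powershell
# Before: the blank placeholder makes the assertion fail
__ | Should -Be $true

# After: piping $true into the assertion makes it pass
$true | Should -Be $true
```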

To check our answer, we re-run Measure-Karma. Since we entered the correct answer, we are now on to a new “Koan”, and we can see in the Path Thus Far section that we now have 1 out of 246 questions correct:

Now, on to the next Koan. Check out the GIF below which demonstrates me answering the next Koan:

The Journey to Enlightenment

The problems start off very simple and get harder with each level. If you get stuck on any Koan along the way, be sure to reach out to us in the automation section of the Dojo Forums by Altaro; these forums were recently released for helpful hints and discussion. This is definitely an amazing tool for studying PowerShell, since it can be installed so quickly and is community-driven, which means new Koans will continuously come out. The biggest reason why PSKoans is so awesome, though, is that it’s 100% FREE. I highly recommend that MSPs INVEST in their engineers and utilize PSKoans as a training tool for staff. Offering some sort of monetary incentive for completing PSKoans would be extremely beneficial to the company, as it encourages more and more employees to use automation to improve company processes and customer experiences. Like I mentioned earlier, an engineer’s worth starts to multiply once they begin to utilize PowerShell, so in the spirit of PSKoans, reach Enlightenment!

If you’re looking for specific help with PowerShell why not open up your query to the IT community here at Altaro? Head on over to the Altaro Dojo Forums and ask any question you like about PowerShell. I’m active on the forums so I’ll be happy to answer your question myself!

Want to learn more about how PowerShell can help you? Read on

Building PowerShell Tools for MSPs series

Further PowerShell reading

The post How to Teach PowerShell to MSP Staff with PSKoans appeared first on Altaro DOJO | MSP.

]]>
https://www.altaro.com/msp-dojo/teach-powershell-pskoans/feed/ 2
These 3 PowerShell Modules Will Supercharge Your Ticketing System https://www.altaro.com/msp-dojo/powershell-ticketing-system/ https://www.altaro.com/msp-dojo/powershell-ticketing-system/#respond Thu, 20 Dec 2018 18:23:28 +0000 https://www.altaro.com/msp-dojo/?p=1168 Ticketing systems are essential for swift and efficient customer service however these PowerShell modules will push your ticketing onto the next level

The post These 3 PowerShell Modules Will Supercharge Your Ticketing System appeared first on Altaro DOJO | MSP.

]]>

We all know that automation for Managed Service Providers can provide huge benefits. In the past, I’ve written about automating many components of IT. However, I believe that one of the most efficient gains an MSP can obtain from automation is through automating their ticketing system. The ticketing system is the core of any MSP’s service delivery. It provides the tracking of incidents as well as documented historical data on client resources. An MSP that can find a way to bring automation to their ticketing system and integrate it with other internal processes and technologies puts itself leaps and bounds ahead of the competition and takes a step towards the future of how MSPs operate.

Having the ability to perform automated processes based on ticket creation opens up so much potential for automated workflows for clients.

For example, consider tickets that get automatically created for servers with low disk space: an automated “disk cleanup” process can be run against the servers specified in these tickets for automated self-healing, which provides faster reaction times and reduces operational strain on staff resources!

Not only do we get the ability to provide automated self-healing on tickets, but we can also use PowerShell to perform additional queries on ticket data. This can be extremely useful in cases where the native reports in the ticketing system don’t measure up. If we can use PowerShell to quickly sort through ticket data for clients, we can manipulate the data and create useful reports for information that we wouldn’t normally be able to gather.
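As a purely hypothetical sketch (the $Tickets collection and its Client and Status properties are made-up placeholders; a real PSA module’s output will differ), a quick per-client report could look like this:

```powershell
# Assume $Tickets was returned by your PSA module's "get tickets" cmdlet
# and that each object exposes Client and Status properties (hypothetical)
$Tickets |
    Where-Object { $_.Status -eq "Open" } |
    Group-Object -Property Client |
    Sort-Object -Property Count -Descending |
    Select-Object -Property Name, Count |
    Export-Csv -Path "C:\Reports\OpenTicketsByClient.csv" -NoTypeInformation
```

A handful of pipeline stages turns raw ticket objects into a report the PSA itself may not offer.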

Before we get started, there is one other thing I wanted to mention. If you want a more holistic overview of MSP monitoring and ticket generation before you go any further on this post, I just recently finished up a new eBook specifically on MSP Monitoring and Reporting. The eBook can be downloaded here.

How To Find PowerShell Cmdlets for My PSA

The automation movement in IT is growing larger each year. Some vendors that are aware of this are publishing their own PowerShell cmdlets for their products, and at the very least most have some sort of REST API that can be driven with PowerShell’s Invoke-RestMethod cmdlet. The “I want it done now” theme is becoming extremely common in IT, and we are now in an era where vendor- and community-made PowerShell cmdlets for 3rd-party applications are starting to become the norm.
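Even without a dedicated module, talking to a PSA’s REST API from PowerShell is usually only a few lines. This is a generic, hedged sketch: the base URL, endpoint, and authentication header are placeholders, and every PSA documents its own scheme.

```powershell
# Placeholder values - consult your PSA's API documentation for the real ones
$BaseUri = "https://psa.example.com/api/v1"
$Headers = @{ Authorization = "Bearer YOUR-API-TOKEN" }

# Retrieve open tickets; Invoke-RestMethod converts the JSON response
# into PowerShell objects automatically
$Tickets = Invoke-RestMethod -Uri "$BaseUri/tickets?status=open" -Headers $Headers -Method Get
$Tickets | Select-Object -First 5
```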

One of the most promising platforms to find PowerShell modules is the PowerShell Gallery. This is an online repository of PowerShell Modules that anyone can download and start using in an instant. Also, many of these modules start out as projects on GitHub. GitHub is a public software development platform that allows developers to upload their code and collaborate with others around the world on projects. This movement has allowed for many interesting projects to come about and PowerShell modules for certain applications is one of them.

I recommend checking out both the PowerShell Gallery and GitHub for any PowerShell-related projects available for your current ticketing system. Just as IT Pros use Google to find information about an issue they are trying to resolve, it is now a necessary skill to be able to search for available PowerShell cmdlets and use them to handle certain scenarios.

One scenario I ran into a year or so ago was when I was working with a file system archiving application. I was tasked with running configurations on hundreds of folders, which would have taken a few hours to perform manually. However, I did some digging around on the PowerShell Gallery and found that someone had published a module for this application, and within 10 minutes I had a script that was automatically performing my tasks for me.

3 Popular MSP Ticketing Systems and PowerShell Modules for Automation

Below I have outlined GitHub projects for three of the most commonly used ticketing systems today. For the sake of time, I won’t go into how to use each one; I am just providing the links to the current projects and directions on how to install the modules. If your ticketing system is not one of them, I recommend searching for it on the PowerShell Gallery first to see if there is a module published for it yet. If nothing turns up, do some digging on GitHub to find out if someone has created a project for your PSA application.

ServiceNow

Install the modules by running the following syntax in an administrative PowerShell Console:

Install-Module -Name ServiceNow

Check out the Github project documentation here.

ConnectWise

Install the module by downloading the .zip file from the GitHub project. Extract the contents of the .zip and import the module by using the following syntax:

Import-Module "PathToModuleFolder\PSConnectWise.psm1" -Force

Check out the GitHub project documentation here.

Autotask

Install the module by downloading the .zip file from the GitHub project. Extract the contents of the .zip and import the module by using the following syntax:

Import-Module "PathToModuleFolder\Autotaskcli.psm1" -Force

Check out the GitHub project documentation here.

Wrap Up

For an MSP, having the ability to integrate their ticketing system with other applications and scripts can open up many doors for efficiency and standardization. There are many projects on GitHub where developers can come together and create a tool for many to use. Again, this is starting to become a norm and MSPs need to learn to take advantage of this in order to become more efficient with the services that they are selling. Let me know in the comments below of other ticketing systems you’ve found PowerShell modules for and your experience with using them.

Thanks for reading!

Want to learn more about how PowerShell can help you? Read on

Building PowerShell Tools for MSPs series

Further PowerShell reading


The post These 3 PowerShell Modules Will Supercharge Your Ticketing System appeared first on Altaro DOJO | MSP.

]]>
https://www.altaro.com/msp-dojo/powershell-ticketing-system/feed/ 0
4 Methods for Reducing Excessive Customer Monitoring Email Alerts https://www.altaro.com/msp-dojo/monitoring-email-alerts/ https://www.altaro.com/msp-dojo/monitoring-email-alerts/#comments Thu, 20 Dec 2018 16:39:27 +0000 https://www.altaro.com/msp-dojo/?p=1167 Comprehensive monitoring doesn't have to mean being constantly bugged by email alerts. These 4 methods enable you to monitor everything without the mess

The post 4 Methods for Reducing Excessive Customer Monitoring Email Alerts appeared first on Altaro DOJO | MSP.

]]>

Monitoring and alert notifications are the bread and butter of Managed Service Providers. Without a proper monitoring solution in place, a Managed Service Provider would have an extremely difficult, almost impossible, time managing their clients’ IT infrastructure. Naturally, with monitoring come email alerts. And the more you monitor, the more email alerts you will have to put up with, right?

Wrong.

I recently penned a new Altaro eBook on the subject of monitoring, and while the book was about MSP monitoring more holistically, I wanted to pair it with a blog post that specifically talked about handling excessive alerts as well. So if you want to know more about boosting your MSP through monitoring read the free eBook Best Practices for Mastering MSP Customer Monitoring.

Moving on, there are many monitoring solutions out there that allow MSPs, as well as their clients, to receive notifications via automated ticket creation, email, phone, or text. These options are all great and critical to providing outstanding service to clients. However, there is a finite limit to how many different alerts and notifications can occur at one time before they start to lose their benefit and just become “noisy alerts”. The human brain can only focus on so much in one day, and spamming employee inboxes with multiple daily alerts for every little issue for every client becomes overwhelming, so the truly important alerts are easily missed. MSPs need to limit the amount of email spam to ultimately be successful.

How to Reduce Excessive Customer Monitoring Email Alerts

Out of all the places I’ve worked in IT, the number one thing they all have in common is a massive amount of email alerts. One of the very first tasks everyone does when starting a new IT job is to create filters for their email alert spam. With the number of email alerts that typically flow through each day, it’s just too much to have them go to the main inbox, and other non-alert emails are easily missed. The unfortunate trade-off is that once alerts go to a separate email folder, they can be easily missed unless that folder is checked continuously. MSPs can help reduce email alerts by using the following 4 guidelines:

Method 1 – Use Dashboards Where Possible

Not every alert needs to be an email. There are many cases where a dashboard should be used to convey alerts to personnel instead. For example, not everyone needs a client “disk space” report emailed to them every day. Instead, make a dashboard that the operations team can review daily and solve the issues from. This makes much more sense for this type of alert and prevents the IT staff from creating inbox filters or deleting the email every day.

For the helpdesk, dashboards are amazing and great for providing ticketing metrics and client outage visibility. There are many monitoring software solutions out there that provide dashboard capabilities. I also highly recommend looking into the PowerShell Universal Dashboard, as it can allow MSPs to create their own dashboards and integrate them with other applications’ APIs to essentially provide a single pane of glass for multiple applications. Be sure to check out one of our previous posts that explains how to get started with the PowerShell Universal Dashboard. A simple example with code and a screenshot from our how-to post on PUD is shown below:

Dashboard
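To give a flavor of the syntax, a minimal disk-space dashboard might look like the sketch below. This assumes the UniversalDashboard.Community module (v2-era cmdlets); the title, grid contents, and port are placeholders, so check the module documentation for your installed version:

```powershell
# Install-Module UniversalDashboard.Community is assumed to have been run
$Dashboard = New-UDDashboard -Title "Client Disk Space" -Content {
    New-UDGrid -Title "Local Fixed Disks (Free GB)" -Endpoint {
        Get-CimInstance Win32_LogicalDisk -Filter "DriveType=3" |
            Select-Object DeviceID,
                          @{ n = "FreeGB"; e = { [math]::Round($_.FreeSpace / 1GB, 1) } } |
            Out-UDGridData
    }
}

# Host the dashboard on an arbitrary local port
Start-UDDashboard -Dashboard $Dashboard -Port 10000
```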

Method 2 – Only Send Alerts to Those Who Need Them

Email alerts should only be sent to the teams that are responsible for dealing with them. Don’t send email alerts about low disk space to the Network Engineers that only deal with telcos, routers, firewalls, and switches. Rarely should an email alert go to the entire engineering department. Email can be hard enough to keep up with as it is; reduce the load by keeping the email alerts that IT staff receive relevant to their role and responsibilities.

Properly labeled and maintained distribution groups play a key role in this step.

Method 3 – Reduce E-Mail Reports

On the operational side, it’s very easy to go report crazy. We want our daily reports on metrics like low disk space on servers, VMs showing high CPU utilization, hypervisors that are over-utilized, and so on. These are all very important metrics to monitor and keep track of; however, the number of “daily reports” emailed out to teams can build up quickly. Instead of just adding another daily report to email out, try to combine reports where they fit. For example, make a single “Server Health” report that covers all of the important metrics mentioned previously in one email. Not only should MSPs look for ways to combine their email reports into a single email, but they should also send email reports only when there is an issue that needs to be visible. We don’t need a daily email report showing that all of our servers are in good health; that’s what a dashboard is for. Alert the team only when they need to be alerted.

As an example of this take a look at one of my previous posts on using PowerShell with HTML to mail out server storage information. You could easily add other metrics into a report like this with similar syntax, and it can prove to be quite valuable.
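The “only email when something is wrong” pattern boils down to a simple guard clause. Here is a hedged sketch: the 10 GB threshold, the addresses, and the SMTP server are all placeholders.

```powershell
# Find local fixed disks under a 10 GB free-space threshold (placeholder value)
$LowDisks = Get-CimInstance Win32_LogicalDisk -Filter "DriveType=3" |
    Where-Object { ($_.FreeSpace / 1GB) -lt 10 }

# Only send the report if there is actually something to report
If ($LowDisks) {
    $Body = $LowDisks |
        Select-Object DeviceID, @{ n = "FreeGB"; e = { [math]::Round($_.FreeSpace / 1GB, 1) } } |
        ConvertTo-Html | Out-String
    Send-MailMessage -From "alerts@msp.example.com" -To "ops@msp.example.com" `
        -Subject "Low disk space detected" -Body $Body -BodyAsHtml -SmtpServer "smtp.example.com"
}
```

On healthy days the script exits silently, and the dashboard remains the place to confirm all is well.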

Method 4 – Mute Alerts During Maintenance

This one is a big one. I’ve seen it many times: engineers are doing maintenance and forget to “mute” alerts, and the entire department gets a flurry of email alerts due to monitored components being down. Some people start looking into the outage alerts and end up wasting their time, only to find out that it’s just a scheduled maintenance event where the alerts were not muted. No matter what monitoring system is in place, this always happens. MSPs need to have processes in place that prevent these situations. Have a monitoring system that allows engineers to mute their alerts ahead of time on a one-time schedule, and ensure that all staff are proficiently trained on how to properly mute their alerts. Believe it or not, I’ve seen instances where these steps were forgotten and needless alerts were created.

Wrap-Up

Email alerts are an absolute must for MSPs to be successful; however, they need to be careful that they are not causing “email alert numbness” for their IT staff. There is a real risk of having so many email alerts daily that critical issues start to slip through the cracks. Let me know in the comments below about your experience with too many alerts in your organization and what you did to solve it.

Also, once again, if you’re interested in more about monitoring for Managed Service Providers, be sure to check out my recently released eBook:

Best Practices for Mastering MSP Customer Monitoring – free MSP eBook

Best Practices for Mastering MSP Customer Monitoring is authored by long-time MSP monitoring and automation ninja, Luke Orellana. In this eBook, Luke draws on his extensive experience to explain what makes a best-in-class customer monitoring service by answering the following questions:

  • What should be monitored?
  • What is the most effective way to provide monitoring?
  • Who should be alerted?
  • How should alerts be handled?
  • What automated actions should be taken (if any)?
  • And more!

customer monitoring eBook

The post 4 Methods for Reducing Excessive Customer Monitoring Email Alerts appeared first on Altaro DOJO | MSP.

]]>
https://www.altaro.com/msp-dojo/monitoring-email-alerts/feed/ 2
Building Powershell Tools for MSPs: Using Try and Catch https://www.altaro.com/msp-dojo/powershell-try-catch/ https://www.altaro.com/msp-dojo/powershell-try-catch/#respond Thu, 06 Dec 2018 15:59:44 +0000 https://www.altaro.com/msp-dojo/?p=1159 How to use Try/Catch blocks in PowerShell to add an extra layer of stability to your custom built tools. This has been a massive time-saver for me!

The post Building Powershell Tools for MSPs: Using Try and Catch appeared first on Altaro DOJO | MSP.

]]>

As a PowerShell tool creator, planning for script errors is one of the biggest challenges I’ve faced when creating a tool that will be used by multiple people. For example, maybe the tool I just created is supposed to move some files and the user running it doesn’t have the proper permissions to move those files; or maybe the tool requires password input and the user “fat fingered” their password, so the rest of the commands fail. For MSPs that are creating their own PowerShell scripts, either for their team or for clients, planning for these scenarios and incorporating Try and Catch statements makes for a far more stable experience for the tool users. Unfortunately for PowerShell newcomers, Try and Catch statements tend to be looked at as a more advanced scripting method and are usually overlooked. Let’s go through Try and Catch so you can get started using it with your custom tools!

What is a Try/Catch Block in PowerShell?

You should be using a Try block around any section of your script that requires monitoring for errors that could terminate the script. If a terminating error occurs inside a Try block, PowerShell looks for a Catch block to run. The code contained in the Catch block should attempt to remediate the failing commands inside the Try block.

For a basic example of this use case, let’s say we want to run a command against multiple servers using an account that has local admin access to the servers in our list. Easy enough: we could just make a ForEach loop that cycles through the servers in the list and executes the command remotely with Invoke-Command. Something like this:

ForEach ($server in $serverlist){

    invoke-command -ComputerName $server -Credential $credential -ScriptBlock { $script}

}

Now we will throw a hurdle into our scenario. There are some servers in our list that the first set of credentials doesn’t have access to, so those servers will error out on the attempt to invoke a command due to improper credentials. However, if we incorporate a Try/Catch statement, we can place our first Invoke-Command attempt inside a Try block and then include our second set of credentials in a Catch block immediately after. This directs PowerShell to try the first set of credentials and, if the command fails, try the second set that we know will be able to connect. The syntax looks like the following:

ForEach ($server in $serverlist){

    Try{
        #-ErrorAction Stop makes any failure terminating so the Catch block will fire
        invoke-command -ComputerName $server -Credential $credential -ScriptBlock { $script} -ErrorAction Stop
    }Catch{
        
        invoke-command -ComputerName $server -Credential $credential2 -ScriptBlock { $script}
    }
}

This is a very basic example; however, we can use Catch blocks for more than just providing a workaround in certain situations. They can also be used for tracking errors. If we wanted to keep a log of servers that couldn’t connect, we could simply output the server name to a log file. We could also use the Catch block to notify the script user of the error by writing a verbose message about the issue. Using Try/Catch blocks, we went from a basic script that would just error out on servers where Invoke-Command failed to a much more efficient and stable script.
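Sketching that logging idea onto the same loop (the log path is arbitrary; note that Catch only fires on terminating errors, which is why -ErrorAction Stop is specified):

```powershell
ForEach ($server in $serverlist) {
    Try {
        Invoke-Command -ComputerName $server -Credential $credential -ScriptBlock { $script } -ErrorAction Stop
    }
    Catch {
        # Record the failed server and surface the error to the script user
        $server | Out-File -FilePath "C:\scripts\FailedServers.txt" -Append
        Write-Verbose "Could not run the command on ${server}: $($_.Exception.Message)"
    }
}
```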

What is a Finally Block in PowerShell?

Now that we know how to use Try and Catch blocks, we can’t forget about Finally blocks. Finally blocks are typically used for the “cleanup” of the Try block, usually to output info to a log or to clean up background jobs. Once the Try block (and, if an error occurred, the Catch block) has completed, the Finally block is run.

Note: you can also have a Try/Finally block that doesn’t contain any Catch block at all.

What benefit do I get from using a Finally block? Why not just use a Try/Catch block and then clean up at the end of the script? This is why a Finally block is so awesome: even if you terminate a script using CTRL + C, the contents of the Finally block still run before the entire script is terminated.

Check out this example. We create a simple counter loop with a Do Until statement, let it run in ISE, and then terminate the script by clicking the Stop button:

Do { $number++ } Until ($number -eq 0)

Write-Host "The counter got to $number"

We can see in the picture below that once the script is terminated, the write-host line is never executed:

PowerShell

Now, let’s run through the same scenario with a Try/Finally block. We’ll put the counter loop in the Try block and the Write-Host output in the Finally block. We let the counter run and then terminate the script in ISE:

Try {
    Do { $number++} Until ($number -eq 0)
}Finally {
    Write-host "The counter got to $number"
}

We can see in the example that the contents of the Finally block run immediately after we terminate the script; our Write-Host statement appears:

PowerShell write-host statement

Finally blocks are extremely useful for cleaning up complex scripts that kick off background jobs or perform some sort of detailed process that must be cleaned up or “reset” if it doesn’t complete successfully. This is all the more reason why MSPs need to use Try/Catch/Finally blocks in their PowerShell scripts.
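As a sketch of that job-cleanup pattern (the job body is a placeholder workload):

```powershell
Try {
    # Kick off a background job (placeholder workload)
    $Job = Start-Job -ScriptBlock { Start-Sleep -Seconds 300 }
    Wait-Job -Job $Job | Out-Null
}
Finally {
    # Runs even if the script is stopped with CTRL + C:
    # make sure no orphaned background jobs are left behind
    Get-Job | Stop-Job
    Get-Job | Remove-Job -Force
}
```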

This creates a ton of options as you continue building your collection of PowerShell scripts!

Wrap-Up

As an MSP, having PowerShell tools for the team to use is an extremely powerful benefit. But in order to ensure that a script cleans up successfully upon either termination or completion, PowerShell tool creators need to build their tools with error scenarios in mind and plan for what remediation and cleanup steps need to be done. Once you’ve done this, you take your PowerShell game to an entirely new level, and you and your customers will reap the benefits!

What about you? Have you tried using these types of blocks in your own scripts? Have they worked well for you? Let us know in the comments section below!


Want to learn more about how PowerShell can help you? Read on

Building PowerShell Tools for MSPs series

Further PowerShell reading


The post Building Powershell Tools for MSPs: Using Try and Catch appeared first on Altaro DOJO | MSP.

]]>
https://www.altaro.com/msp-dojo/powershell-try-catch/feed/ 0
How to Create Intelligent PowerShell Scripts with Azure AI https://www.altaro.com/msp-dojo/azure-ai/ https://www.altaro.com/msp-dojo/azure-ai/#respond Thu, 29 Nov 2018 12:01:38 +0000 https://www.altaro.com/msp-dojo/?p=1127 Microsoft's Azure Cognitive Services use artificial intelligence which can streamline MSP tasks. How to use Microsoft's machine learning algorithms here

The post How to Create Intelligent PowerShell Scripts with Azure AI appeared first on Altaro DOJO | MSP.

]]>

Worried artificial intelligence will take over the world? Well, it doesn’t look like that’s happening any time soon, but it can help take your MSP to new levels with Azure AI.

Artificial Intelligence has become a big buzzword in the IT world. Companies like Microsoft have been investing heavily in AI projects and now allow companies to take advantage of this through their Azure Cognitive Services. One of the recently released cognitive services is the Custom Vision service, which allows companies to use Microsoft’s machine learning algorithms to analyze images and classify them however they desire. Managed Service Providers can harness this power in their own PowerShell scripts to provide even more automation for their clients. For example, let’s make up a simple scenario. Say that as an MSP we want to start offering clients a “rack organization” service, where we go onsite to organize their racks and clean up the cabling. We need to determine which clients could benefit from this service so we can target them. That would normally require going through each client’s rack picture in our client documentation to determine which ones require some rack cleanup, then reaching out to each of those clients to offer additional services for organizing their server racks. With Azure’s Custom Vision service, however, we can automate this process.

Getting Started

To get started creating our own Custom Vision API, we need to go to Azure. If you don’t have an account, you can sign up for one. Once logged in, go to Create a Resource on the left-hand side and type in Custom Vision. This service is currently in preview, so there will be a “Preview” label next to it. Go ahead and select it:

custom vision - Azure AI

We will name our new resource LukeLab-CustomVision. Fill out the information for pricing, location, and resource group. Make sure you read up on how the pricing works; it is calculated based on how much you use the API. Click Create when done:

Create - Azure AI

Wait a little bit for the resource to finish deploying. Once it’s done, we can start training our API to classify images of our server racks. Navigate to the resource and select the Custom Vision Portal option under Quick Start:

Custom Vision Portal Quick Start - Azure AI

Now click the Sign In button and we will be navigated to the custom vision web page:

Cognitive Services

Let’s get set up with a new project by clicking New Project:

New Project - Azure AI

We will call our project “Server Rack Health” and set the project type to Classification, as we want to look at a picture and classify it as something. Since we just want to classify each image as either good (organized and neat rack) or bad (messy cabling that needs cleanup), we will choose the Multiclass setting. Now we can select Create Project:

Create Project - Azure AI

Training Our API

Now that we are all set up, we can begin uploading pictures and training our classifier to detect a messy server rack. The more pictures uploaded, the more accurate the prediction will be. I’m not going to go into detail on the classifier because that would need an entire post in itself, but I recommend checking out this guide for more information on how to train your classifier. What I have done is upload 10 pictures and classify them as either good or bad, then train the classifier to determine which pictures contain a good rack or a bad rack:

 

 

Server Rack Health

Using the Custom Vision API in PowerShell

Now that our classifier is trained, we can test it from PowerShell. We have a folder set up with three client server rack pictures:

PowerShell clients

Now we can use PowerShell to upload each photo to our Custom Vision API, which will analyze the image and classify it as “good” or “bad”. To do this we need a few keys from our Custom Vision resource, and Azure makes it extremely easy to get this information. Select the Performance tab of the classification project we made on the Custom Vision web page, then click on the Prediction URL to get our keys:

Prediction URL

There are two different sets of URLs: one is for analyzing images that are already on the web (which could be amazing for analyzing pictures across the internet), and the other, which we are using, is for uploading an image from a file on our computer to the API. So we will save the info from the second group:

 

How to Use Prediction API

 
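For reference, the URL-based variant works much the same way, except the request body is a small JSON payload pointing at the image instead of the raw file bytes. Here is a minimal sketch; the endpoint path, project ID, and key are placeholders you would copy from your own Prediction URL page:

```powershell
# Hypothetical sketch: classify an image that is already hosted on the web.
# The endpoint and Prediction-Key below are placeholders from your own resource.
$header = @{ 'Prediction-Key' = '<your-prediction-key>' }
$url    = 'https://southcentralus.api.cognitive.microsoft.com/customvision/v2.0/Prediction/<project-id>/url'

# For the URL variant the body is JSON, not an octet stream
$body = @{ Url = 'https://example.com/rack-photo.jpg' } | ConvertTo-Json

$result = Invoke-RestMethod -Uri $url -Headers $header -Method Post -ContentType 'application/json' -Body $body
$result.predictions
```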

Now comes the magic. We first need to establish our keys in our code in order to properly communicate with the Custom Vision REST API, so we will set the following variables:

$header = @{'Prediction-Key'= '1765f185ce924daf80e3f60661c03b0c'}
$url = "https://southcentralus.api.cognitive.microsoft.com/customvision/v2.0/Prediction/faa8a0a3-2a36-46d0-b3d0-3a661712a59f/image?iterationId=4489e653-85bd-4da8-bc92-a647efa83eb0"
$Method = "POST"
$ContentType = "application/octet-stream"

Since all these variables will be used as parameters, let’s clean things up a bit and splat them:

$properties = @{
    Uri         = $url
    Headers     = $header
    ContentType = $ContentType
    Method      = $Method
}

Now we will collect all of the image files in our “clients” folder. In our case, the server rack pictures are all .jpg files, so I’m using the Where-Object cmdlet to grab any .jpg files in the folder:

$photos = Get-ChildItem "C:\AITest\Clients" -Recurse | Where-Object { $_.Name -like "*.jpg" }

Lastly, we can create a ForEach loop that runs through each picture in our $photos variable and uploads each image to the Custom Vision API, which then analyzes the image and classifies it according to the classifier we built in the previous steps. We would also like the script to return the full name of each image as well as the prediction result:

Foreach ($picture in $photos) {
    $info = (Invoke-RestMethod @properties -InFile $picture.FullName).predictions
    $picture.FullName
    $info | Sort-Object -Property probability | Select-Object -ExpandProperty TagName -Last 1
}

When we run the ForEach Loop, we get our file name and prediction:

ForEach Loop
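If you want more than console output, the same loop can be reshaped to emit objects, which makes it easy to build a per-client report. A small sketch building on the variables above (the directory-as-client-name convention and the CSV path are just example assumptions):

```powershell
# Build a report object per picture instead of writing raw strings
$report = foreach ($picture in $photos) {
    $info = (Invoke-RestMethod @properties -InFile $picture.FullName).predictions

    # Highest-probability tag wins
    $top = $info | Sort-Object -Property probability -Descending | Select-Object -First 1

    [pscustomobject]@{
        Client     = $picture.Directory.Name
        File       = $picture.FullName
        Prediction = $top.TagName
        Confidence = [math]::Round($top.probability, 2)
    }
}

# Example only: export the results for follow-up
$report | Export-Csv -Path 'C:\AITest\RackHealthReport.csv' -NoTypeInformation
```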

As we can see, the Custom Vision API did a great job classifying each rack:

Custom API

If we wanted to, we could take our script a step further and automate the process of emailing each client with the picture of their server rack and an explanation of the new MSP “Server Rack Cleanup” offering. This is just a simple example of what can be done when combining artificial intelligence with our PowerShell scripts. As AI services like these get more and more advanced, we will see AI make more of an appearance in the IT workspace. Managed Service Providers that can find these “niches” for AI will be able to assist their clients and provide services on another level compared to their competitors.
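As a rough illustration of that follow-up step, here is a hedged sketch using Send-MailMessage. The SMTP server, addresses, and the $report variable holding per-client results are assumptions for the example, and Send-MailMessage is deprecated in newer PowerShell versions, so you may prefer a Graph API or dedicated SMTP library in production:

```powershell
# Hypothetical: email clients whose racks were classified as "bad".
# $report is assumed to hold objects with Client, File, and Prediction properties.
foreach ($entry in ($report | Where-Object { $_.Prediction -eq 'bad' })) {
    Send-MailMessage -SmtpServer 'smtp.example.com' `
        -From 'support@example-msp.com' `
        -To "$($entry.Client)@example.com" `
        -Subject 'New Offering: Server Rack Cleanup' `
        -Body 'We noticed your server rack could use some cabling cleanup - ask us about our new Server Rack Cleanup service!' `
        -Attachments $entry.File
}
```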

Want to learn more about how PowerShell can help you? Read on

Building PowerShell Tools for MSPs series

Further PowerShell reading

The post How to Create Intelligent PowerShell Scripts with Azure AI appeared first on Altaro DOJO | MSP.

]]>
https://www.altaro.com/msp-dojo/azure-ai/feed/ 0
Building PowerShell Tools for MSPs: Automating KeePass https://www.altaro.com/msp-dojo/keepass/ https://www.altaro.com/msp-dojo/keepass/#comments Thu, 27 Sep 2018 14:32:49 +0000 https://www.altaro.com/msp-dojo/?p=946 Automate the deployment of an ESXi host, generate a new password using KeePass, then automatically store it with this PowerShell module. Step-by-step guide

The post Building PowerShell Tools for MSPs: Automating KeePass appeared first on Altaro DOJO | MSP.

]]>

KeePass is one of the most widely used password management tools in IT today. It’s simple to use and secure, which meets the needs of most businesses. The fact that it’s open source also goes a long way toward the community trusting its code and verifying it’s not stealing customer data. It is one of the top free tools an MSP can use for password management. To top it all off, there is a PowerShell module available for automating and managing KeePass databases. This is incredibly useful for MSPs, as they usually have a tough time tracking all the passwords for their clients and ensuring each engineer saves the passwords to KeePass for each project. Being able to automate the deployment of an ESXi host, generate a new password, then automatically store it without any human intervention is a godsend. I can’t count how many times I’ve run into an issue where a password was nowhere to be found in KeePass or was typed incorrectly into KeePass. Quite simply, if you’re an MSP managing a large number of systems, you need KeePass, and you need to automate it. Here’s how.

Getting Started Automating with KeePass

If you don’t have KeePass, you can download it here. You can get the module from the PowerShell Gallery here, or on a Windows 10 machine open an administrative PowerShell prompt and type:

Install-Module -Name PoShKeePass

KeePass

Now when we run a quick Get-Command search, we can see our new KeePass functions, and we’re ready to go:

get-command -Name *kee*

setting up KeePass

Connecting With KeePass

First, we need to establish our connection to our KeePass database. I have a KeePass database created in a folder; notice the .kdbx file extension, which is the database extension used by KeePass:

Now we will use the New-KeePassDatabaseConfiguration cmdlet to set up a profile for the connection to the .kdbx file. We will also use the -UseMasterKey parameter to specify that this KeePass database is set up to use a Master Key. There are several different ways of configuring authentication to a KeePass database, but for the purpose of this demo we are going to keep it simple and use a Master Key, which is just a password used to access the database:

New-KeePassDatabaseConfiguration -DatabaseProfileName 'LukeLab' -DatabasePath "C:\Temp\KeePass\Database.kdbx" -UseMasterKey

If I run Get-KeePassDatabaseConfiguration, I can see my database profile is set up and points to the .kdbx file:

 

Generating Passwords with KeePass

One really cool feature of the KeePass PowerShell module is that we can use KeePass to generate a password for us. This is very useful when setting unique passwords for each server deployment. It could be done with regular PowerShell coding, but the KeePass module lets us generate a password in one line, on the fly, without any extra code. We will create our password variable and use the New-KeePassPassword cmdlet to generate a password. The parameters specify the password complexity that we want:

$password = New-KeePassPassword -UpperCase -LowerCase -Digits -SpecialCharacters -Length 10

 

Now when we inspect our $password variable, we can see that it’s a secure KeePass object:

Now let’s upload a new entry to our KeePass database. Let’s say we are deploying an ESXi host and want to generate a random password and save it to our KeePass database. We will use the New-KeePassEntry cmdlet and specify the “LukeLab” profile we set up earlier, passing our $password variable as the entry’s password. We then get prompted for our Master Key password:

New-KeePassEntry -DatabaseProfileName LukeLab -Title 'ESXi1' -UserName root -KeePassPassword $password -KeePassEntryGroupPath Database/ESXi

Now, when we open up our password database we can see our new entry and a randomly generated password:

Updating A KeePass Password

Password rotation is now becoming a standard practice within IT. Security is a bigger concern than ever, and the need to change passwords periodically has become a necessity. Luckily, with KeePass and PowerShell, we can create scripts that automate the process of changing our ESXi host password and then updating the new password in KeePass. We start by collecting the current KeePass entry into a variable using the Get-KeePassEntry cmdlet:

$KeePassEntry = Get-KeePassEntry -KeePassEntryGroupPath Database/ESXi -Title "ESXi1" -DatabaseProfileName "LukeLab"

Next, we use the Update-KeePassEntry cmdlet to update the entry with a new password:

Update-KeePassEntry -Title 'ESXi1' -UserName root -KeePassPassword (New-KeePassPassword -UpperCase -LowerCase -Digits -SpecialCharacters -Length 10) -KeePassEntryGroupPath Database/ESXi -KeePassEntry $KeePassEntry -DatabaseProfileName 'LukeLab'

Now when we look at our password entry for ESXi1 we can see it has been updated with a new password:

Now let’s update our ESXi system by obtaining the secure string from the new entry and changing the password on the ESXi host.

We save our entry to a variable again:

$KeePassEntry = Get-KeePassEntry -KeePassEntryGroupPath Database/ESXi -Title "ESXi1" -DatabaseProfileName "LukeLab"

Now we have our password as a secure string. If we look at the properties using $KeePassEntry, we can see the secure string object is there:

So now we can use this variable to create a credential object and pass it along to a script to change the ESXi password to this new password:

$newcreds = New-Object System.Management.Automation.PSCredential("Root",$keepassentry.password)
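To show what that last step could look like, here is a rough sketch using VMware PowerCLI. The host name and the $oldCreds variable (holding the current root credential) are assumptions for the example, and you should double-check the Set-VMHostAccount parameters against your installed PowerCLI version before using anything like this:

```powershell
# Sketch: rotate the root password on the host using the new KeePass credential.
# Assumes VMware PowerCLI is installed and $oldCreds holds the current root credential.
Connect-VIServer -Server 'esxi1.lukelab.local' -Credential $oldCreds

# Apply the freshly generated password to the root account
Get-VMHostAccount -Id 'root' | Set-VMHostAccount -Password $newcreds.GetNetworkCredential().Password

Disconnect-VIServer -Server 'esxi1.lukelab.local' -Confirm:$false
```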

 

And, just to prove it, I can use the GetNetworkCredential method to show the decoded password is the same as the one in KeePass:
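That verification is a one-liner; note that the plain-text password is echoed here only to prove the round trip, so avoid doing this in real scripts:

```powershell
# Decode the secure string back to plain text - for comparison only
$newcreds.GetNetworkCredential().Password
```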

Wrap-Up

This is a very powerful module, as it allows IT pros to automate the way they manage and enter passwords. The module offers many more capabilities than we’ve covered here, and the module author keeps expanding it with more advanced features. This is an exciting time to be in IT, where we can use an assortment of PowerShell modules to accomplish feats that were near impossible only a few years ago.

What about you? Do you see this being useful for your engineers? Have you used KeePass for password tracking before? Let us know in the comments section below!

Thanks for reading!

The post Building PowerShell Tools for MSPs: Automating KeePass appeared first on Altaro DOJO | MSP.

]]>
https://www.altaro.com/msp-dojo/keepass/feed/ 2