Tag Archives: PowerShell

Compare installed vs available Microsoft Azure PowerShell versions

When running Microsoft Azure PowerShell, certain cmdlets and functions are only available in the latest version of Azure PowerShell. So how do you know if you have the latest version? Well, this snippet checks your currently installed version and then asks the Web Platform Installer for the available version. It then displays both version numbers, letting you know whether you're current or not.

Just paste the entire code snippet into your PowerShell prompt, or embed it and call the function.

— Begin snippet —

function Get-WindowsAzurePowerShellVersion
{
    [CmdletBinding()]
    Param ()

    ## - CHECK INSTALLED VERSION
    Write-Host "`r`nInstalled version: " -ForegroundColor 'Yellow';
    Get-Module -Name "Azure" `
        | Where-Object { $_.Name -eq 'Azure' } `
        | Select-Object Version, Name, Author | Format-List;

    ## - CHECK WEB PI FOR AVAILABLE VERSION
    Write-Host "Available version: " -ForegroundColor 'Green';
    [reflection.assembly]::LoadWithPartialName("Microsoft.Web.PlatformInstaller") | Out-Null;
    $ProductManager = New-Object Microsoft.Web.PlatformInstaller.ProductManager;
    $ProductManager.Load();
    $ProductManager.Products `
        | Where-Object {
            ($_.Title -like "Microsoft Azure PowerShell*") `
            -and ($_.Author -eq 'Microsoft Corporation')
        } `
        | Select-Object Version, Title, Published, Author | Format-List;
};
Get-WindowsAzurePowerShellVersion

— End of snippet —

Azure PowerShell

Uploading your RemoteApp image directory from Azure to RemoteApp

If you've been working with RemoteApp for a while, you've most likely gotten tired of downloading and uploading that image by now. Most of us have probably set up a VM in Azure and added a disk to it, bouncing the VHD off that one; staying in Azure saves a lot of time. But downloading it IS still time consuming, so to get around that I've written a script. Before you download it, there are some pointers:

There is NO error checking, meaning you must remember to disable EFS, install all the roles/features and run sysprep manually. If you forget something, you'll only notice when you try to start your image, and that's VERY late in the process.

The script requires the Azure Storage SDK to be installed. Likewise, if the path to the DLL has changed, the script will fail. If my calendar decides to clear out I'll give it some time and clean it up, but for now it's a quick and dirty fix. Copy the code below, save it as a .ps1 and off you go!

# Load assembly - without this file, it'll all fail...
Add-Type -Path "C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\Network\Microsoft.WindowsAzure.Storage.dll"

# Source information
# Information from your storage account
$sourceStorageAccount = "storageaccountname" # <- Storage account name
$sourceStorageKey     = "yourstoragekey" # <- The key to your storage account
$sourceContainer      = "vhd" # <- Container name
$sourceFilename       = "RemoteAppTemplate.vhd" # <- VHD name, can be seen in your container
$sourceContainerUri   = [String]::Format("https://{0}.blob.core.windows.net/{1}", $sourceStorageAccount, $sourceContainer)

# Destination information
# Information from the RemoteApp upload script command line
$destStorageAccount = "cdvne195334804rdcm"    # <- Destination storage account name
$destStorageSAS     = "?sv=2012-02-12&sr=b&si=f6939bb2-a99d-43b6-823a-fe8ad44f5c20&sig=6q%2Bk8t7xzzC7DeICrWvb39rh4lUEijg93UFL7631V6s%3D" # <- SAS key
$destContainer      = "goldimages" # <- Container name
$destFilename       = "f6939bb2-a88d-43b6-811a-fe8ad41f5c20.vhd" # <- VHD name, can be seen in the command line from RemoteApp
$destUri            = [String]::Format("https://{0}.blob.core.windows.net/{1}/{2}", $destStorageAccount, $destContainer, $destFilename)

# This is where the magic happens

Write-Host "Uploading your image..."
$sourceCredentials  = New-Object Microsoft.WindowsAzure.Storage.Auth.StorageCredentials($sourceStorageAccount, $sourceStorageKey)
$sourceContainerRef = New-Object Microsoft.WindowsAzure.Storage.Blob.CloudBlobContainer($sourceContainerUri, $sourceCredentials)
$sourceBlob         = $sourceContainerRef.GetBlobReferenceFromServer($sourceFilename)
$sourceStream       = $sourceBlob.OpenRead()

$destCredentials = New-Object Microsoft.WindowsAzure.Storage.Auth.StorageCredentials($destStorageSAS)
$destBlob        = New-Object Microsoft.WindowsAzure.Storage.Blob.CloudPageBlob($destUri, $destCredentials)
$destBlob.UploadFromStream($sourceStream)

$sourceStream.Close()
$destBlob.Metadata["Status"] = "UploadComplete"
$destBlob.SetMetadata()

A good idea is to run this script from a VM in Azure too; that speeds up the process considerably. AzCopy would be able to do the same thing if it supported SAS usage across subscriptions.
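
As a side note, the storage client library can also do an asynchronous server-side copy with StartCopyFromBlob, so the bytes never pass through your VM at all. A rough sketch, assuming the $sourceBlob and $destBlob objects from the script above are in scope and that the source blob is readable by the destination service (public, or with a read SAS appended to the URI):

```powershell
# Ask the storage service to copy the blob itself (server-side, asynchronous).
# Note: the source URI must be readable by the service; with a key-authenticated
# source like in the script above you'd append a read SAS to $sourceBlob.Uri first.
$copyId = $destBlob.StartCopyFromBlob($sourceBlob.Uri)

# Poll the copy state until the service reports the copy as finished
do {
    Start-Sleep -Seconds 10
    $destBlob.FetchAttributes()
} while ($destBlob.CopyState.Status -eq [Microsoft.WindowsAzure.Storage.Blob.CopyStatus]::Pending)

Write-Host ("Server-side copy ended with status: " + $destBlob.CopyState.Status)
```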

Creating and uploading your Azure RemoteApp template image

Creating and uploading the image for RemoteApp turned out to be a challenge for some odd reasons. For the script Upload-AzureRemoteAppTemplateImage.ps1 to work, you need to make sure you fulfill its prerequisites, which you can find by reading the script 😉

If Upload-AzureRemoteAppTemplateImage.ps1 fails for "odd" reasons, make sure that you're running PowerShell as administrator and that you've started "Windows Azure PowerShell" or have the Azure module loaded.

Here’s my short list of what you need to do:

  • Create a new Hyper-V VM with a 40 GB dynamic size disk (dynamically expanding disks are now supported; a fixed size disk used to be required)
  • Install Windows Server 2012 R2 (only OS supported)
  • Install RDSH role and Desktop Experience feature (both needed)
  • Reboot (needed to make sure application installations are aware of RDS)
  • Login
  • Install the applications you want to publish to your users
  • From an elevated CMD, run “fsutil behavior set disableencryption 1” (disables EFS encryption of file system)
  • Reboot (makes sure EFS disable is written to registry)
  • Login
  • Run “sysprep /oobe /generalize /shutdown”
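
The role/feature, EFS and sysprep steps above can be sketched like this (a rough outline to run inside the template VM; assumes Windows Server 2012 R2 and an elevated prompt):

```powershell
# Install the RDSH role and the Desktop Experience feature, then reboot
Import-Module ServerManager
Install-WindowsFeature RDS-RD-Server, Desktop-Experience -Restart

# After logging back in and installing your applications:
# disable EFS on the file system, then reboot once more
fsutil behavior set disableencryption 1
Restart-Computer

# After the final login, generalize and shut down the image
& "$env:windir\System32\Sysprep\sysprep.exe" /oobe /generalize /shutdown
```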

Once the machine has shut down, start PowerShell as administrator with the Azure cmdlets loaded.

Run the script provided by the portal, find your VHD-file and you should be on your way!

Uploading the file

[Screenshot: the upload in progress]

The portal states the template status as "uploading"

[Screenshot: the template status in the portal]

Azure Automation – Using the assets

After yesterday's post about getting started, I've gotten some questions about the assets library. I thought I'd explain how to use some of the assets (or at least how I've figured them out; I might be totally off, but at least it works)...

Looking at the assets library, we have a "Connection" object containing our subscription ID. This could also be the ID of another subscription, which might be useful for IT to deploy services to a developer's subscription or something like that.

We also have a "Certificate" object, and we've uploaded the corresponding certificate to our collection of management certificates in Azure. This needs to be done on the right subscription if you're managing multiple ones, so keep that in mind...

[Screenshot: Automation assets]

<#
    .DESCRIPTION
    .NOTES
    Author: Joachim Nässlander, TSP Datacenter, Microsoft
#>

workflow Start_Azure_Demo_VM
{
    param()

    $MyConnection = "Internal Subscription Connection" # <- The name of your Connection object in assets
    $MyCert = "InternalSubscriptionCertificate" # <- The name of your Certificate object in assets

    # Get the Azure Automation connection
    $Con = Get-AutomationConnection -Name $MyConnection # <- Connect to your subscription
    $SubscriptionID = $Con.SubscriptionID
    $ManagementCertificate = $Con.AutomationCertificateName
    $Cert = Get-AutomationCertificate -Name $Con.AutomationCertificateName # <- Get the certificate from assets

    Write-Output "Subscription ID: $SubscriptionID"
    Write-Output "Certificate Name: $($Con.AutomationCertificateName)"
}

And since this is the internet: how are you using Azure Automation and the assets library? Feel free to comment!

Have a nice weekend, and don’t miss The Fratellis to keep you company over a beer!

Getting started with Azure Automation

Azure Automation is currently in preview, so you might not see it in your portal if you haven't enrolled already. You can enroll for Azure Automation at http://azure.microsoft.com/en-us/services/preview/ where you'll also find all other services currently in preview. It's a good place to check frequently; fun things emerge here!

So what is Azure Automation? Well, it’s the ability to run PowerShell workflow scripts from Azure, targeted at your Azure resources.

Once you're enrolled in the preview program you can create your first automation account. An account can be seen as a container that you fill with runbooks and the assets those runbooks need. An asset can, for example, be a certificate allowing you to connect to your (or another) Azure subscription.

[Screenshot: Automation dashboard]

The overview of your runbooks looks like this. It shows the number of runbooks, the number of activities, the number of minutes your scripts have run, and a whole lot more.

Now that you're enrolled you might want to quickly test it out; personally I love getting a feel for things before diving into documentation. You can find example scripts and a how-to at http://azure.microsoft.com/en-us/documentation/articles/automation-create-runbook-from-samples/. One thing to note when creating your runbook is that the script's name in the portal needs to correspond to your workflow name. That is, if your workflow is named "Join-Servers-Domain" your runbook must be named the same.
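
In other words, a runbook named Join-Servers-Domain in the portal has to contain a workflow declared with exactly that name:

```powershell
# The runbook must be named "Join-Servers-Domain" in the portal,
# matching the workflow declared inside it
workflow Join-Servers-Domain
{
    param()
    Write-Output "Runbook and workflow names match"
}
```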

 

[Screenshot: Automation runbooks overview]

Looking more at the portal, if you click "runbooks" up top you can see your runbooks listed with their latest run time and status. This gives you a quick overview without having to look at each runbook individually.

[Screenshot: detailed view of a runbook]

Selecting one specific runbook gives you a chart of how it has run over various periods of time. Here you can also drill down into each script run and view the script output and any input parameters.

[Screenshot: published runbook]

Clicking "author" while in the detailed view takes you to the published version of the script. Here you can view your script and start it manually if you want to.

[Screenshot: runbook draft]

If you opt for "draft" instead, you'll be able to edit your script and insert things from your assets library or from other runbooks, allowing runbooks to interact with each other. Here you can also test your runbook before publishing it.

[Screenshot: Automation assets]

The assets library contains building blocks needed for your scripts to function properly. It also makes it easier for you to develop scripts for multiple subscriptions.

In my example we have:

  • Connection to a subscription
  • A certificate which allows us to connect to this subscription (find a guide for that here)
  • PowerShell credentials so we don’t have to enter username/password each time
  • A module containing PowerShell cmdlets

You can read more about getting started with PowerShell workflows at http://technet.microsoft.com/en-us/library/jj134242.aspx.
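
If PowerShell workflows are new to you, the two features you'll bump into first are parallel execution and checkpoints. A tiny syntax illustration (not Azure-specific):

```powershell
workflow Test-WorkflowBasics
{
    # The activities inside a parallel block run at the same time
    parallel
    {
        Get-Date
        Get-Random
    }

    # Persist state so a suspended or interrupted workflow can resume here
    Checkpoint-Workflow

    Write-Output "Done"
}

Test-WorkflowBasics
```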

PDT user creator in, hold it… PowerShell!

Well, I've read about it. I've tried some. I've never written one myself. But it finally happened! Using the PDT (PowerShell Deployment Toolkit), I've come to realise that creating the users and groups in my lab environment takes some time. And what better way to create a new script than PowerShell? Don't wanna be seen doing old VB scripts 🙂

If you haven't tested PDT yet, go do it instantly! It's written by Rob Willis from Microsoft, and he has saved me at least 200 hours already. Check it out at http://blogs.technet.com/b/privatecloud/archive/2013/02/08/deployment-introducing-powershell-deployment-toolkit.aspx

Copy the code below and save it as PDTUserCreator.ps1


# Script creates users, OUs and groups for PDT
# Created by Joachim Nässlander, Microsoft
# joachim.nasslander@microsoft.com
#
# Script provided as-is
#

# Import module and check for write permissions
cls
Import-Module ActiveDirectory
try {
    New-ADUser -Name TemporaryUser -SamAccountName TemporaryUser
    Remove-ADUser TemporaryUser -Confirm:$false
}
catch {
    Write-Host "No write permissions in Active Directory"
    Exit
}

# Create arrays, passwords, get domains and stuff
$PDTUsers = "!installer","!vmm","!or","!ac","!om_saa","!om_das","!om_dra","!om_dwa","!sm_s","!sm_w","!sm_r","!sm_a","!sql","!jd"
$PDTUserPassword = "P@ssw0rd"
$SecurePDTUserPassword = $PDTUserPassword | ConvertTo-SecureString -AsPlainText -Force
$PDTOUs = "Services","Servers","Groups","Users"
$PDTGroups = "AC Admins","OM Admins","CM Admins","SM Admins","Orchestrator Admins","VMM Admins","DPM Admins","SQL Admins"
$Domain = Get-ADDomain
$DistName = $Domain.DistinguishedName
$DNSRoot = $Domain.DNSRoot

# Check / create OUs
if (-not (dsquery ou domainroot -name HQ)) {
    New-ADOrganizationalUnit -Name "HQ" -Path $DistName -ErrorAction SilentlyContinue
}
foreach ($ou in $PDTOUs) {
    if (-not (dsquery ou domainroot -name $ou)) {
        New-ADOrganizationalUnit -Name "$ou" -Path "OU=HQ,$DistName" -ErrorAction SilentlyContinue
    }
}

# Check / create groups
foreach ($group in $PDTGroups) {
    if (-not (dsquery group -samid $group)) {
        New-ADGroup -Name $group -GroupScope Global -Path "OU=Groups,OU=HQ,$DistName" -ErrorAction SilentlyContinue
    }
}

# Check / create users
foreach ($user in $PDTUsers) {
    if (-not (dsquery user -samid $user)) {
        New-ADUser -Name "$user" -SamAccountName "$user" -ChangePasswordAtLogon $false -AccountPassword $SecurePDTUserPassword -Description "PDT created user" -Enabled $true -Path "OU=Users,OU=HQ,$DistName"
    }
}

Add-ADGroupMember -Identity "SQL Admins" -Members "!sql" -ErrorAction SilentlyContinue
Write-Host "PDT users, groups and OUs created"

Creating a cluster with PowerShell

I'm obviously so late with this post that others (Jan Egil Ring) have posted about this already. Considering he runs a blog about PowerShell, his script most likely looks a lot better than mine too...

Below is the script I used at TechEd in Berlin to create my 2-node cluster. You can watch the video from TechEd if you're interested in knowing more about clustering. It's Symon Perriman, PM at Microsoft, who does most of the talking. I'm the demo guy 🙂

Note that in this script a lot of the information is hardcoded, such as node names, iSCSI information and IP addresses. I'm working on a version that will ask you for the needed information, but I haven't finished it yet; this ought to get you started though.


cls
write-host -foreground red "CREATING A 2-NODE CLUSTER WITH POWERSHELL`n`n"
sleep 3

#Start iSCSI
write-host "STARTING iSCSI SERVICE ON NODE FS01"
sleep 1
icm fs01 -scriptblock {set-service msiscsi -startuptype automatic}
icm fs01 -scriptblock {start-service msiscsi}
write-host "STARTING iSCSI SERVICE ON NODE FS02"
sleep 1
icm fs02 -scriptblock {set-service msiscsi -startuptype automatic}
icm fs02 -scriptblock {start-service msiscsi}

#Connect disks
write-host -foreground red "`nCONFIGURING iSCSI CONNECTIONS`n`n"
icm fs01,fs02 -scriptblock {iscsicli qaddtargetportal 10.10.10.20}
icm fs01,fs02 -scriptblock {iscsicli persistentlogintarget iqn.clu02-quorum T * * * * * * * * * * * * * * * 0}
icm fs01,fs02 -scriptblock {iscsicli persistentlogintarget iqn.clu02-storage T * * * * * * * * * * * * * * * 0}
icm fs01,fs02 -scriptblock {iscsicli qlogintarget iqn.clu02-quorum}
icm fs01,fs02 -scriptblock {iscsicli qlogintarget iqn.clu02-storage}

#Partition and format
write-host -foreground red "`nPARTITIONING AND FORMATTING DISKS`n`n"
icm fs01 -scriptblock {sc $env:temp\diskpart.txt @"
list disk
select disk 1
online disk
att disk clear readonly
clean
create partition primary
format fs=ntfs quick
assign letter=q
select disk 2
online disk
att disk clear readonly
clean
create partition primary
format fs=ntfs quick
assign letter=s
"@ -encoding ascii}
icm fs01 -scriptblock {diskpart /s $env:temp\diskpart.txt}

#Install feature
write-host -foreground red "`nINSTALLING FAILOVER-CLUSTER FEATURE`n`n"
icm fs01,fs02 -scriptblock {Import-Module ServerManager; Add-WindowsFeature Failover-Clustering}

#Create cluster
write-host -foreground red "`nCREATING CLUSTER`n`n"
New-Cluster FileCluster01 -Node fs01,fs02 -staticaddress 192.168.0.111

#Add disks
write-host -foreground red "`nADDING DISKS TO CLUSTER`n`n"
Get-ClusterAvailableDisk | Add-ClusterDisk

PowerShell + WMI = null

Having spent a bunch of hours programming PowerShell with WMI to manage Hyper-V hosts, I've come to realize that running pre-beta software can be a pain when it comes to development. The usual PowerShell / WMI combination didn't return a single machine. When testing on a standard Windows Server 2008 it did, of course, work immediately. I spent at least two hours troubleshooting my script with WMI Explorer, the WMI CIM Browser and all the other usual WMI tools before going to the server room to check on one of our Hyper-V hosts.

The PowerShell / WMI combination looks like this:

$Server = 'localhost'
$VM_Service = Get-WmiObject -ComputerName $Server -Namespace root\virtualization Msvm_VirtualSystemManagementService
$ListofVMs  = Get-WmiObject -ComputerName $Server -Namespace root\virtualization Msvm_ComputerSystem | Where-Object { $_.ElementName -like '*' }
foreach ($VM in $ListofVMs)
{
    $VM.ElementName
}

Saving this with a .ps1 extension and running it from PowerShell on a Windows Server 2008 Hyper-V host will return the virtual machines hosted on that machine.

If you run this on a Windows Server 2008 R2 Hyper-V host, however, nothing gets returned. A bug is filed; now I'm just waiting to see if it ends up with a fix or if I'll have to recode the scripts that are working now.