PowerShell Summit 2017 thoughts

2017 saw our largest Summit to date. 250 PowerShell fanatics (I use the word advisedly) descended on Bellevue, Washington. The conversations had already started when I arrived at the hotel on the Friday night before the Summit!

We had a planning meeting for Summit 2018 on the Saturday. Some good ideas came from the meeting that we'll share in a little while. If you thought this year was good, next year will amaze you.

The Summit opened on Sunday with 3-hour Deep Dives in the morning and afternoon. During the morning I discovered that the issues Delta Airlines was having were causing problems for speakers travelling to the Summit. In the end we only had one speaker unable to reach the Summit. We have a plan to mitigate any future issues with speaker drop-outs that we'll be implementing for Summit 2018 and later (yes, we do plan more than a year in advance).

Monday saw the PowerShell team presenting on PowerShell now and in the future with team members showing what they’re working on now!

Tuesday and Wednesday morning saw some amazing standard-length sessions and the first outing for the Three Furies of PowerShell – who knows, they may appear again sometime.

Wednesday afternoon saw the Community Lightning Demos – everything I've heard says they were amazing (I was moderating 2 panel discussions at the time). We also had some longer technical sessions.

I saw a large number of people I recognised from previous Summits. I asked why they came back and consistently received two reasons:

– the high level of the technical content

– the ability to talk to speakers, MVPs, team members and other attendees about their PowerShell problems and get answers to those problems

Be assured we’ve taken those 2 things on board and are committed to preserving those aspects of the Summit.

A huge thanks to the speakers, the PowerShell team and the attendees for making a fantastic Summit. A little while to reflect and catch my breath, and then it's time to dive into the work for next year.


Abandoned technologies

Why do some technologies become widely adopted while others are seemingly abandoned – often without any real testing? What do I mean by abandoned technologies? Things like Server Core, for instance. And I suspect that Nano Server and even containers on Windows will follow and become abandoned technologies.

Server Core first appeared in Windows Server 2008! In nearly 10 years of existence, how many organisations are utilising Server Core to its full potential? Very few, in my experience. I suspect many, if not most, organisations don't use it at all.

Nano Server was introduced with Server 2016. It's totally headless and has a very small footprint – you can pack hundreds of them onto a 64GB host. Nano Server supports a limited number of roles, but if you need a small-footprint server to host a web site, host VMs or containers, or act as a file server, for instance, it's ideal.

The last thing I suspect may join my list of abandoned technologies is Windows containers. Again introduced with Server 2016, containers offer a lightweight route to running your applications. With the ability to easily move containers between machines, deployments from development to testing and production become much simpler.

So, why do I think these are, or will become, abandoned technologies?

The reason is that the majority of Windows administrators don't want to adopt these technologies. They either actively block them or passively ignore them.

Why does this happen? Look at the three technologies again – none of them has a GUI! Until Windows administrators fully embrace remote, automated administration techniques, these will remain abandoned technologies.

The day of administrators who can't, or won't, automate is ending – slowly but surely the pressures to move to a more automated environment are growing. Maybe it'll happen soon enough that Server Core, Nano Server and Windows containers will stop being abandoned technologies.


PowerShell Direct failure

PowerShell Direct was introduced with Server 2016/Windows 10. It enables you to create a remoting session from the Hyper-V host to a VM using the VM name or ID. I recently discovered a PowerShell Direct failure that I couldn't explain until now.

Normally you do this:

PS> New-PSSession -VMName w16cn01 -Credential (Get-Credential w16cn01\administrator)

Id  Name          ComputerName    ComputerType    State         ConfigurationName     Availability
 -- ----          ------------    ------------    -----         -----------------     ------------
  1 Session1      W16CN01         VirtualMachine  Opened                                 Available
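
Once the session is up you use it like any other remoting session. As a minimal sketch (reusing the VM name from the example above; the script block content is just an illustration):

PS> $s = New-PSSession -VMName w16cn01 -Credential (Get-Credential w16cn01\administrator)
PS> Invoke-Command -Session $s -ScriptBlock { Get-Service -Name WinRM }
PS> Enter-PSSession -Session $s    # or work interactively inside the VM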

But on one particular machine I was getting this:

PS> New-PSSession -VMName w16as01 -Credential (Get-Credential w16as01\administrator)
New-PSSession : [W16AS01] An error has occurred which Windows PowerShell cannot handle. A remote session might have ended.
At line:1 char:1
+ New-PSSession -VMName w16as01 -Credential (Get-Credential w16as01\adm ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : OpenError: (System.Manageme....RemoteRunspace:RemoteRunspace) [New-PSSession], PSRemotin
   gDataStructureException
    + FullyQualifiedErrorId : PSSessionOpenFailed

I couldn't find an explanation for this particular PowerShell Direct failure.

I’ve been working with PowerShell v6 and OpenSSH the last few days and I noticed that the PowerShell directory had been removed from the system path by the installation of one of these pieces of software.

W16AS01 had been the first machine I experimented with PowerShell v6/OpenSSH on, and it was the first to experience this PowerShell Direct failure.

I checked W16AS01 and, sure enough, the PowerShell folder was missing from the system path. Adding the PowerShell folder back onto the path (and restarting the machine for luck) and then retrying PowerShell Direct gives:

PS> New-PSSession -VMName W16AS01 -Credential (Get-Credential W16AS01\Administrator)

Id Name            ComputerName    ComputerType    State         ConfigurationName     Availability
 -- ----            ------------    ------------    -----         -----------------     ------------
  1 Session1        W16AS01         VirtualMachine  Opened                                 Available

Looks like I've found a solution for this particular PowerShell Direct failure.
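
For reference, this is roughly how you can check and repair the machine-level path from PowerShell. It's a sketch only, assuming the standard Windows PowerShell folder location and an elevated session:

# read the machine-level PATH and check for the Windows PowerShell folder
$path = [Environment]::GetEnvironmentVariable('Path', 'Machine')
$psFolder = "$env:SystemRoot\System32\WindowsPowerShell\v1.0\"
if ($path -notlike '*WindowsPowerShell*') {
  # append the folder and write the value back, then restart the machine
  [Environment]::SetEnvironmentVariable('Path', "$path;$psFolder", 'Machine')
}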


Maximum number of Acronyms

IT is littered with acronyms – many of which you can't remember the meaning of. This has been explained by recent research that shows there is a maximum number of acronyms that it is possible for one person to remember.

Once you've reached your maximum number of acronyms, as soon as you try to remember a new one, one of the existing ones will be erased from your memory. The acronym to be erased seems to be selected at random, though recent computer models lead researchers to suspect that the choice is inversely proportional to your age and directly proportional to the number of different technologies you're currently working with.

Your maximum number of acronyms seems to be hardwired on an individual basis and no amount of training seems to be able to modify this number.

If you suspect that you're reaching your maximum number of acronyms, all you can do is make sure that you have a cheat sheet available to look up their meanings.

Many IT vendors are actively aiding research in this area as their ability to generate incomprehensible documentation littered with acronyms is severely hindered by this new discovery.


PowerShell v6

Tried PowerShell v6 yet?

It's the latest, open-sourced version of PowerShell – it runs on Windows, Linux (various flavours) and macOS.

It's available from https://github.com/PowerShell/PowerShell

Before you get too excited, there are a few things you need to remember:

– it's ALPHA code. That means it's still under development and subject to change

– it's not production ready

– it only provides the core parts of PowerShell – for instance, none of the CDXML modules are available

– it uses .NET Core rather than the full .NET Framework – it doesn't have the GUI libraries, for instance

– it does install side-by-side with the out-of-the-box PowerShell (on later versions of Windows only)

– it does enable remoting between Windows and Linux using SSH
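
As a rough sketch of that last point, assuming OpenSSH is installed and the PowerShell SSH subsystem is configured on the target machine (the host name and user name here are made up), remoting over SSH looks something like this:

# create a session to a Linux box over SSH and run a command in it
$session = New-PSSession -HostName lnx01 -UserName richard -SSHTransport
Invoke-Command -Session $session -ScriptBlock { $PSVersionTable.PSVersion }
Remove-PSSession $session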

It's worth testing for two reasons. First, you can see what's happening with PowerShell and the changes that are coming. Second, you can feed back directly to the project and directly influence the future of PowerShell.


Summit 2017 – one week to go

With one week to go before travelling to Seattle for the 2017 PowerShell & DevOps Summit I’m putting the finishing touches to my presentations and the Summit organisation.

The agenda was published last October but we’ve had to make a few changes recently to cover for speakers that have dropped out. Check out the final version before deciding on the sessions you’ll attend.

We’ve already started planning the 2018 Summit – more details will come later – but we think you’ll like what we’re planning. All your favourite Summit features and more.

If you’re going to the Summit don’t forget to sign up for the Lightning Demos – your chance to speak and share your discoveries.


Name mismatch

Ever wondered why you can’t do this:

Get-ADComputer -Filter * -SearchBase 'OU=Servers,DC=Manticore,DC=org' |
Get-CimInstance -ClassName Win32_OperatingSystem

The -ComputerName parameter on Get-CimInstance accepts pipeline input BUT it's by property name.

PS> Get-Help Get-CimInstance -Parameter ComputerName

-ComputerName [<String[]>]
    Specifies computer on which you want to run the CIM operation. You can specify a fully qualified domain name
    (FQDN), a NetBIOS name, or an IP address.

    If you do not specify this parameter, the cmdlet performs the operation on the local computer using Component
    Object Model (COM).

    If you specify this parameter, the cmdlet creates a temporary session to the specified computer using the WsMan
    protocol.

    If multiple operations are being performed on the same computer, using a CIM session gives better performance.

    Required?                    false
    Position?                    named
    Default value                none
    Accept pipeline input?       True (ByPropertyName)
    Accept wildcard characters?  false

If you look at the output of Get-ADComputer, you'll see it has a Name property.

PS>  Get-ADComputer -Filter * -SearchBase 'OU=Servers,DC=Manticore,DC=org'

DistinguishedName : CN=W16PWA01,OU=Servers,DC=Manticore,DC=org
DNSHostName       : W16PWA01.Manticore.org
Enabled           : True
Name              : W16PWA01
ObjectClass       : computer
ObjectGUID        : 8d137004-1ced-4ff1-bcf4-f0671652fc8c
SamAccountName    : W16PWA01$
SID               : S-1-5-21-759617655-3516038109-1479587680-1322
UserPrincipalName :

So you have a Name mismatch between the property and the parameter.

There are a number of ways to deal with this.

First, use foreach:

Get-ADComputer -Filter * -SearchBase 'OU=Servers,DC=Manticore,DC=org' |
foreach {
  Get-CimInstance -ClassName Win32_OperatingSystem -ComputerName $psitem.Name | 
  select CSName, Caption
}

Use $psitem.Name (or $_.Name) as the input to -ComputerName. Simple coding that works very nicely.

If you have a lot of computers, you may want to use a foreach loop instead:

$computers = Get-ADComputer -Filter * -SearchBase 'OU=Servers,DC=Manticore,DC=org' | select -ExpandProperty Name
foreach ($computer in $computers) {
  Get-CimInstance -ClassName Win32_OperatingSystem -ComputerName $computer|
  select CSName, Caption
}

Create an array of computer names and iterate through them.

Second, use select:

Get-ADComputer -Filter * -SearchBase 'OU=Servers,DC=Manticore,DC=org' |
select @{N='ComputerName';E={$_.Name}} |
Get-CimInstance -ClassName Win32_OperatingSystem |
select CSName, Caption

In this case you use Select-Object to create a property with the name ComputerName (case DOESN'T matter) and pipe that into Get-CimInstance.

This option is a bit more advanced as you have to understand how Select-Object works and how to create extra properties on the object you're passing down the pipeline. It looks cooler and should get you a few extra "ace PowerShell coder" points.
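
If you want to see the object that's actually going down the pipeline, a quick check with Get-Member shows the new property (a sketch using the same OU):

Get-ADComputer -Filter * -SearchBase 'OU=Servers,DC=Manticore,DC=org' |
select @{N='ComputerName';E={$_.Name}} |
Get-Member -MemberType NoteProperty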

The third option takes advantage of the fact that -ComputerName accepts an array of computer names:

Get-CimInstance -ClassName Win32_OperatingSystem -ComputerName (Get-ADComputer -Filter * -SearchBase 'OU=Servers,DC=Manticore,DC=org' | select -ExpandProperty Name) |
select CSName, Caption

You run Get-ADComputer and use select -ExpandProperty to return only the VALUE of the Name property (a string). This gives you an array of computer names. Because it's in (), it's evaluated first and treated as the input to the parameter.

Very clever and gets you maximum points.
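
One final note: the help text above mentions that a CIM session gives better performance when you run multiple operations against the same computers. A minimal sketch of that approach, using the same OU, looks like this:

# create one reusable session per computer, query them all, then clean up
$computers = Get-ADComputer -Filter * -SearchBase 'OU=Servers,DC=Manticore,DC=org' |
select -ExpandProperty Name
$sessions = New-CimSession -ComputerName $computers
Get-CimInstance -ClassName Win32_OperatingSystem -CimSession $sessions |
select CSName, Caption
Remove-CimSession -CimSession $sessions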
