Increase Your Productivity by Using the PowerShell Pipeline

Summary: Microsoft Scripting Guy, Ed Wilson, teaches you how to use the Windows PowerShell pipeline to increase your productivity and to avoid writing scripts.

Hey, Scripting Guy! I keep hearing about the pipeline in Windows PowerShell, but I do not get it. I see commands, but I am not sure what they are actually doing. In addition, why do I need to use a pipeline in the first place? Surely, I can store stuff in variables, and use the ForEach command to walk through collections. I just don’t get it. I guess you can tell that I am an old VBScripter, and I guess old habits are hard to break. But really, what’s up with the pipeline?

—DG

Hello DG,

Microsoft Scripting Guy, Ed Wilson, is here. Well, this new year is already shaping up to be an exciting one. The Scripting Wife and I are hard at work on a new series of Scripting Wife blogs in preparation for the 2012 Scripting Games. This year’s games will be the biggest and the best ever. I have been talking to various presidents of Windows PowerShell Users Groups from around the world, and I plan to work with them to help them get their groups up to speed for this year’s games. If you are not a member of a Windows PowerShell user group, check the Windows PowerShell Group website to see if there is one near you. If there is not one, and if you would like to start one where you live, contact me via the scripter@microsoft.com email address, and I will help point you in the right direction. 

Anyway, DG, you are not the first person to ask me about pipelines in Windows PowerShell. For people who approach Windows PowerShell from a strictly VBScript background, or even from a strict Windows background, the idea of a pipeline is somewhat foreign. At times, the process seems to work seamlessly, and at other times, it does not work at all. In some cases, the command appears to make sense, but in other cases, it does not. And often there seems to be neither rhyme nor reason to the syntax and the commands themselves.

A basic example of a pipeline

A good way to see a pipeline in action is to create an instance of the Notepad process, retrieve the newly created process, and stop that process. To start a new instance of the Notepad process, I use the Start-Process cmdlet. I then use Get-Process to retrieve the process object, and I pipe the object to the Stop-Process cmdlet.

Start-Process notepad

Get-Process notepad | Stop-Process

What happens under the covers is that the process object retrieved by the Get-Process cmdlet is sent to the InputObject parameter of the Stop-Process cmdlet. The Stop-Process cmdlet then stops each process contained in the object that comes across the pipeline. If Windows PowerShell did not have a pipeline, I would perform this operation exactly as I did in the VBScript days: I would store the objects in a variable, and pass the variable to the InputObject parameter of the Stop-Process cmdlet. This technique is shown here:

Start-Process notepad

$notepad = Get-Process notepad

Stop-Process -InputObject $notepad
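A quick way to see why the piped object binds to the InputObject parameter in the first place is to ask Get-Help about that parameter. The "Accept pipeline input?" field for InputObject reads True (ByValue), which is why a Process object piped to Stop-Process binds to it automatically:

```powershell
# Display the help entry for the InputObject parameter of Stop-Process.
# The "Accept pipeline input?" field shows True (ByValue), meaning a
# Process object piped to Stop-Process binds directly to InputObject.
Get-Help Stop-Process -Parameter InputObject
```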

Examining the pipeline mechanics

I can confirm that the process object passes to the InputObject parameter by using the Trace-Command cmdlet.

Note: Microsoft MVP and Honorary Scripting Guy, Don Jones, reminded me about using this cmdlet in his great column, PowerShell with a Purpose. (I had not played with it much since I wrote my book, Windows PowerShell 2.0 Best Practices).

To use the Trace-Command cmdlet to display parameter binding information, I supply ParameterBinding to the Name parameter; that is the name of the trace source that I want to monitor. I use the PSHost switched parameter to tell the cmdlet to send the trace output to the Windows PowerShell host, and I then specify the expression I want to trace. The expression goes into a script block (delimited by a pair of curly brackets). The complete command is shown here.

Trace-Command -Name parameterbinding -PSHost -Expression {get-process notepad | stop-process}

The command to trace the parameter binding of the Get-Process command as it is piped to the Stop-Process cmdlet, along with its associated output, is shown in the image that follows.

 Image of command output

Using different cmdlets in the pipeline

A number of cmdlets typically combine with other cmdlets to assist in working with objects in the pipeline. These cmdlets accept input via the pipeline, perform services such as grouping, sorting, or filtering, and then pass the results along to other cmdlets. These cmdlets are:

  • Get-Unique
  • Group-Object
  • Select-Object
  • Sort-Object
  • Tee-Object
  • Where-Object
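As a quick illustration of how these cmdlets combine (my own example, not part of the trace above), a single pipeline can sort the running processes by working set, keep the five largest, and group the survivors by company name:

```powershell
# Sort all processes by working set (descending), keep the top five,
# and group those five by the Company property of the process object.
Get-Process |
    Sort-Object -Property WS -Descending |
    Select-Object -First 5 |
    Group-Object -Property Company
```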

One of the most important of these pipeline cmdlets is the Where-Object cmdlet. The reason the Where-Object cmdlet is so important is that it filters the input it receives. It is quite common for a cmdlet to produce a lot of information. For example, the Get-Process cmdlet returns an awful lot of data, only a small portion of which appears in the default display. If I am interested in obtaining information about only processes named svchost on my computer, I can use the Where-Object cmdlet to filter out all of the process objects that do not have a name of svchost.

In the command that follows, I first retrieve a collection of process objects by using the Get-Process cmdlet (one for each process that runs on the computer). I then send these process objects one at a time across the pipeline. When the process arrives on the other side of the pipeline, the Where-Object cmdlet examines each process.

While the Where-Object cmdlet looks at each process that comes across the pipeline, it uses the $_ automatic variable to represent the current object in the pipeline. Therefore, I use $_ to examine the Name property of each object. If the name of the object is equal to svchost, the object passes the filter. The code that filters the svchost processes is shown here.

Get-Process | Where-Object { $_.name -eq 'svchost'}

There is nothing wrong with using the Where-Object cmdlet to limit the return of process information to the svchost process. But in this exact example, the Get-Process cmdlet happens to have a Name parameter that filters processes by name. The difference is that when using the Where-Object cmdlet, Get-Process returns information about every process on the system. Then all of that process information enters the pipeline, and the Where-Object cmdlet performs the filtering. This is inefficient, and when working against a remote system, it can cause a significant performance hit on the command. The command to return process objects that are named svchost is shown here.

Get-Process -Name svchost

If I am curious about the difference in performance between using the Where-Object cmdlet and using the Name parameter, I can use the Measure-Command cmdlet to determine which command is faster. The syntax to measure the two commands is shown here.

measure-command { Get-Process | Where-Object { $_.name -eq 'svchost'} }

measure-command { Get-Process -Name svchost }

The image that follows illustrates using the Measure-Command cmdlet and the associated output from those commands.

 Image of command output

On my system, the command using the Where-Object cmdlet takes 24 milliseconds, and the command that does not use the pipeline takes 1 millisecond. The Measure-Command cmdlet is not accurate with subsecond measurements, and in practice there is essentially no difference between 1 millisecond and 24 milliseconds. If using one command instead of the other causes you to pause and consider the syntax, any performance gain is lost. However, if you are querying for svchost information from 1000 servers distributed across the network, the difference between the two commands is likely to become pronounced.
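To make that remote scenario concrete, here is a sketch (the server names are hypothetical, and this example is mine rather than part of the original commands). Because the filtering happens inside the script block that Invoke-Command runs on each remote machine, only the matching svchost objects are serialized and sent back across the network, rather than every process object from every server:

```powershell
# Hypothetical server names; replace these with real computer names.
$servers = 'SERVER01', 'SERVER02', 'SERVER03'

# The Name filter runs on each remote machine, so only svchost process
# objects cross the network back to the local session.
Invoke-Command -ComputerName $servers -ScriptBlock {
    Get-Process -Name svchost
}
```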

DG, that is all there is to using the pipeline and filtering out data. Pipeline Week will continue tomorrow when I will talk about sorting objects.

I invite you to follow me on Twitter and Facebook. If you have any questions, send email to me at scripter@microsoft.com, or post your questions on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.

Ed Wilson, Microsoft Scripting Guy 

  • Hi Ed,

    pipeline week is something very important!

    It should be a periodically returning topic, to remind us of the advantages of the pipeline.

    Filtering remotely is usually the best way to get the requested information while avoiding the transport of unwanted information from one system to another.

    Klaus (Schulte)