Windows Server 2012, PowerShell 3.0 and DevOps, Part 1…


In the first of a two-part series, I provide some background about PowerShell and DevOps. In the second post, I'll give you a bunch of specifics. PowerShell 3.0, like Windows Server 2012, has a ton of new features and enhancements, so I'll only scratch the surface.

--Jeffrey


The first time I heard the term DevOps was in a podcast describing the 2009 Velocity conference.  While most of the industry was struggling to deploy releases a few times a year, John Allspaw and Paul Hammond rocked the house with the talk “10 Deploys Per Day: Dev And Ops Cooperation at Flickr”.  They made the case for delivering business results through changes in culture and tools, and gave birth to a new term: DevOps.  The problem is that developers think they are responsible for delivering features while operators think they are responsible for keeping the site running.  The gap between developers and operators leads to finger-pointing when things go wrong.  A successful business requires an IT culture of joint accountability and mutual respect: developers thinking about the needs and concerns of operators, and operators thinking about the needs and concerns of developers.

Their talk described how businesses require rapid change, yet change is the root cause of most site-down events. Shunning the traditional “avoid change” approach, they advocated minimizing risk by making change safe through automation.  This is the job of DevOps: safe change.  It was the Taguchi quality approach applied to IT operations.  Taguchi observed that the root cause of poor quality is variation.  The solution is to first figure out how to do something repeatably.  Once you can do that, you can make small modifications to the process and see whether they make things better or worse.  Back out the changes that make things worse; keep the ones that make things better.  The key is repeatability.  Repeatability allows experimentation, which drives improvement.  We get repeatability in IT operations through automation.

We started PowerShell by publishing the Monad Manifesto, which articulated the problems we saw, our approach to solving them, and the components we would deliver.  We envisioned a distributed automation engine with a scripting language that would be used by beginner operators and sophisticated developers alike.  PowerShell’s design was driven by the same thinking and values that drove the birth of DevOps:

  1. Focus on the business
  2. Make change safe through automation
  3. Bridge the gap between developers and operators


Focus on the business
PowerShell has always focused on people using computers in a business context.  PowerShell needed to be consistent, safe, and productive.  Much has been made of the similarities between PowerShell and UNIX, but in this regard our ties are much closer to VMS/DCL and AS400/CL.

Consistent:  Operators and developers don’t have a lot of time to learn new things.  A consistent experience lets them invest once in a set of skills and then use those skills over and over again.  PowerShell uses a single common parser for all commands and performs common parameter validation, delivering absolute consistency in command-line syntax.  PowerShell cmdlets are designed so that ubiquitous parameters provide consistent functions to all commands (e.g. -ErrorAction, -ErrorVariable, -OutVariable, etc.).
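
As a minimal sketch of that consistency (the path and service name below are placeholders), the same ubiquitous parameters behave identically no matter which cmdlet they are attached to:

# The same common parameters work on every cmdlet, because a single parser
# and a common parameter-binding engine sit behind all of them.
Get-ChildItem -Path 'C:\Logs' -ErrorAction SilentlyContinue -ErrorVariable lsErrors
Get-Service -Name 'Spooler' -ErrorAction SilentlyContinue -ErrorVariable svcErrors -OutVariable spooler

# Errors and output are captured the same way, regardless of which command produced them.
$lsErrors
$svcErrors
$spooler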

Safe:  An operator once told me that occasionally he was about to do something and realized that if he got it wrong, he would be fired.  In PowerShell, if you are about to execute a cmdlet that has a side effect on the system, you can always add -WhatIf to see what would happen if you went through with the operation.  We also support -Confirm, -Verbose and -Debug.  Despite these safeguards, things can still go wrong, and when they do, PowerShell puts a lot of effort into speeding up the process of diagnosing and resolving the error.
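
For example (the process name here is just a placeholder), a destructive command can be rehearsed before it is run for real:

# Rehearse the operation; nothing is actually stopped.
Stop-Process -Name 'notepad' -WhatIf

# Prompt for confirmation before each process is stopped.
Stop-Process -Name 'notepad' -Confirm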

Productive:  Every aspect of PowerShell’s design maximizes the power of users (ergo the name).  PowerShell makes it easy to perform bulk operations across a large number of machines.  PowerShell also makes it easy to have productive engagements between your operators and developers because it allows them to speak a common language and to help each other with their scripts.
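
As a minimal sketch of such a bulk operation (assuming PowerShell remoting is enabled; the computer and service names are placeholders):

# Run the same query on many machines at once and work with the results locally.
$servers = 'web01', 'web02', 'web03'
Invoke-Command -ComputerName $servers -ScriptBlock {
    Get-Service -Name 'W3SVC'    # check the web service on each box
} | Sort-Object PSComputerName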

Make change safe through automation
There has been a lot of discussion about whether PowerShell is a .NET language, a scripting language, or an interactive shell.  PowerShell is a distributed automation engine with a scripting language and interactive shell(s).   Interactive shells and a scripting language are critical components, but the focus has always been on automation through scripting.  Automation is the process of reducing and/or eliminating operations performed by a human.  A script documents what is going to happen.  People can review a script, and you can modify it based upon their feedback.  You can test the script, observe the outcome, and modify the script; if the modification is good, keep it, and if it is bad, back it out.  In other words, scripting provides the repeatability required to apply the Taguchi method to IT operations.  Once you have an automated process, you can safely apply it over and over again.  These processes can now be performed reliably by lower-skilled admins.  These steps aren’t possible when you use traditional GUI admin tools.
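
As a sketch of what “a script documents what is going to happen” can look like in practice (the function name, path, and retention period below are hypothetical), an advanced function that supports ShouldProcess gets the same rehearsal behavior as built-in cmdlets:

function Clear-OldLogs {
    # Supports -WhatIf/-Confirm so the change can be reviewed, rehearsed, and repeated.
    [CmdletBinding(SupportsShouldProcess = $true)]
    param(
        [string] $Path = 'C:\Logs',
        [int] $DaysToKeep = 30
    )

    Get-ChildItem -Path $Path -Filter '*.log' |
        Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-$DaysToKeep) } |
        ForEach-Object {
            if ($PSCmdlet.ShouldProcess($_.FullName, 'Delete')) {
                Remove-Item -Path $_.FullName
            }
        }
}

# Rehearse first; run for real once the output matches expectations.
Clear-OldLogs -WhatIf
Clear-OldLogs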

Bridge the gap between developers and operators
Our goal has always been to deliver a single tool that could span the needs of operators doing ad hoc operations, simple scripting, formal scripting, and advanced scripting, and of developers doing systems-level programming.
PowerShell spends a ton of effort projecting the world in terms of high-level, task-oriented abstractions with uniform syntax and semantics.  We call these cmdlets, and they are what operators want in order to manage systems efficiently and effectively.  To see why, consider something as simple as copying a file.
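
As a compressed sketch of the contrast (the paths are placeholders, and the API call is shown from within PowerShell via .NET rather than from a compiled program):

# API-oriented: explicit source and destination, no wildcards, no pipeline,
# and errors surface as raw exceptions.
[System.IO.File]::Copy('C:\Logs\app.log', 'D:\Backup\app.log', $true)

# Task-oriented cmdlet: consistent parameters, wildcards, -WhatIf/-Confirm safety,
# and pipeline-friendly output.
Copy-Item -Path 'C:\Logs\*.log' -Destination 'D:\Backup' -WhatIf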



Have you ever wondered why PowerShell uses curly braces {} (and other C constructs) instead of BEGIN/END as other scripting languages do?  We did that because we wanted to make it easier to adopt for developers coming from other C-based programming languages: C++, Objective-C, Java, JavaScript, Perl, PHP, etc.  We did some testing and determined that operators were able to readily adapt to this syntax.  We also wanted to provide a smooth glide path between PowerShell and C#.  This provides career mobility for operators who might want to transition to being developers.
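As a rough illustration (the loop itself is trivial and purely about syntax), here is a PowerShell snippet whose braces, foreach, and if/else read almost line for line like the equivalent C#:

# C-style constructs: braces, foreach, if/else. A C# developer can read this immediately.
$total = 0
foreach ($file in Get-ChildItem -Path 'C:\Logs' -Filter '*.log') {
    if ($file.Length -gt 1MB) {
        $total += $file.Length
    }
    else {
        Write-Verbose "Skipping small file $($file.Name)"
    }
}
"Total size of large logs: $total bytes"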

Most importantly, we wanted to develop a tool which could be used by BOTH operators and developers to bridge the gap between the groups and allow them to create common scripts, learn from each other and work together.

Windows Server 2012 and PowerShell 3.0 are excellent DevOps tools
DevOps is a new term and there is some disagreement about what it entails, but at its heart it is all about making change safe through automation and bridging the gap between operators and developers.  There is a lot to do in this area, but Windows Server 2012 and PowerShell 3.0 make excellent progress toward accomplishing those goals.  PowerShell won’t be the only tool in your DevOps toolbox, but it should be in every DevOps toolbox.  Download the beta today and find out for yourself.

Cheers!


Jeffrey Snover
Distinguished Engineer and Lead Architect for Windows Server

Comments
  • Great work on PowerShell! I've been using it for years to do exactly that: bridge the gap between development and administration! :)

    Cheers,

    Trevor Sullivan

    @pcgeek86

  • Hi Jeffrey - great post.

    Of course DevOps is the hot item and everyone is looking to solve this problem - but the one major issue I see with this is that it is only Microsoft-based.

    Other orchestration platforms (like Chef, Puppet) will allow you to work across all platforms - Linux, Windows makes no difference.

    Of course there are great advantages, but this will always be a problem with a Windows-centric platform and tool.

  • It's a spot-on article, Jeffrey! Simplicity is the key.

    I'm working on several modules for software testers (they are often newcomers to PowerShell and newcomers to the libraries I use under the hood), and I see that they absorb PowerShell fast and with visible pleasure.

    But what about people who write cmdlets and modules? Will the PowerShell guidelines be updated for 3.0? There is a lack of rich information on the market about creating cmdlets.

    Even naming is the question:

    cmdlets should be in the form ApprovedVerb-CodeNameNoun and their parameters should be from the approved list.

    Here is the sample:

    the Get-UIAWindow cmdlet gets a window by means of FindFirst/FindAll in MS UI Automation, the standard way

    How should I name a cmdlet that gets a window by its handle or uses Win32 API (FindWindow)?

    Get-UIAWindowFromHandle vs Get-UIAWindow -Handle ?

    Get-UIAWindowByWin32 vs Get-UIAWindow -Win32 ?

    To put it simply, let's take your (MS) Get-ChildItem:

    Get-ChildItemByHardLink vs Get-ChildItem -HardLinkNumber ?

    Which is best: a cmdlet with a rich but non-standard name, or a new non-standard parameter on the well-known cmdlet?

  • @maish -

    I concur that managing all the elements of a datacenter is important.  That is different from saying that your orchestration platform needs to run on all the elements of the datacenter - I don't think those other platforms run in network load balancers, switches, or BMCs, right?

    What is important is managing all the elements of the datacenter.  That is why we made such a large investment in Standards Based Management.  Wojtek Kazaczynski did a good post on that: blogs.technet.com/.../standards-based-management-in-windows-server-8.aspx .  Standards-based management is a better strategy than proprietary management mechanisms.  It is better for the customers, the products being managed, and the management products.  Having watched this movie a number of times in the last 32 years, I can tell you that standards endure a lot longer than most management companies do.

    Thanks for the comment.

    Cheers

    Jeffrey Snover [MSFT]

  • Man, I LOVE PowerShell! I've been its advocate since I found it (PS v2.0) ... but I have to say that despite the uniform verb-noun semantics it has a somewhat steep learning curve, and the CLI has the same weaknesses as its predecessors (lack of Ctrl+C & Ctrl+V support, syntax highlighting, command spell-check, etc.), things that one would expect of a MODERN automation engine/scripting language/interactive shell.

    I just read the blog post about rocking the admin experience, and I must say the additions to PowerShell ISE are great... (Show Command, IntelliSense and Snippets).

    Why can't that be the default environment for PowerShell (or give powershell.exe some of these capabilities)? And make it the default environment for core servers (instead of starting up with cmd.exe)?

  • @apetrovskiy

    "Simplicity is the key" -  Hey- how did you get access to part 2 of the post?  More on this topic tomorrow!

    Regarding your naming comment/question - you are really getting at the question of cmdlet granularity.  As a general rule we are working towards a THINK, TYPE, GET world, so if you imagine what people will want to think first - that is always your best guide.  In your examples, you should use the parameter instead of extending the noun name.

    Get-ChildItem -HardlinkNumber

    DOES get the childitem.  Specifying the -HardLinkNumber just tells the command how to accomplish that task - it isn't a different task.

    Cheers!

    Jeffrey Snover [MSFT]

  • @alvaro

    We don't make PowerShell the default shell in WS2012 Server Core because, while it and .NET are installed by default, we made them optional components, which means organizations that want the smallest possible footprint can remove them.

    Regarding your other comments - Read tomorrow's post.   :-)

    Cheers!

  • Will there be any integration between PowerShell and Visual Studio? In particular, it would be nice if there were a way to easily start a PowerShell session "in the context of" a C# project (assembly references, etc.). I remember wanting this in a previous life in a DevOps-ish role.

  • When will the PowerShell ISE support code signing, like Sapien?  Thanks!

  • @Jeffrey

    Hello Jeffrey.

    I'm again with a question about naming. I prepared a document, small enough to be readable, that contains a description of what I want to expose to the user in cmdlet names and the problem that seems to be a show-stopper if I want to follow the guidelines. It's here: http://sdrv.ms/JNEEkM and should be editable, so as not to fill the blog with text only loosely connected to the topic. I'd be glad if you could read it some day and indicate what the right names are for those cmdlets…

    Alexander

  • Hello Jeffrey,

    Thank you for sharing these thoughts. I hope DevOps will be recognized by more and more people. I also hope Windows PowerShell and Windows Server 2012 will start a new era of DevOps.

    As an IT pro, I know IT pros who master Windows PowerShell will be "the man in the middle" (see the WIKI DevOps illustration), not for attacks, but for communication and collaboration. ;)

    Best Regards

    Huajun Gu