In the first of a two-part series, I provide some background information about PowerShell and DevOps. In the second post, I'll give you a bunch of specifics. PowerShell 3.0, like Windows Server 2012, has a ton of new features and enhancements, so I'll only scratch the surface.
The first time I heard the term DevOps was on a podcast describing the 2009 Velocity conference. While most of the industry was struggling to deploy releases a few times a year, John Allspaw and Paul Hammond rocked the house with the talk "10 Deploys Per Day: Dev And Ops Cooperation at Flickr". They made the case for delivering business results through changes in culture and tools, and gave birth to a new term: DevOps.

The problem is that developers think they are responsible for delivering features while operators think they are responsible for keeping the site running. The gap between developers and operators leads to finger-pointing when things go wrong. A successful business requires an IT culture of joint accountability and mutual respect: developers thinking about the needs and concerns of operators, and operators thinking about the needs and concerns of developers. Their talk described how businesses require rapid change, yet change is the root cause of most site-down events. Shunning the traditional "avoid change" approach, they advocated minimizing risk by making change safe through automation. This is the job of DevOps: safe change.

This was the Taguchi quality approach applied to IT operations. Taguchi observed that the root cause of poor quality was variation. The solution was to first figure out how to do something repeatably. Once you can do that, you can make small modifications to the process and see whether they make things better or worse. Back out the changes that make things worse; keep the ones that make things better. The key is repeatability. Repeatability allows experimentation, which drives improvement. We get repeatability in IT operations through automation.

We started PowerShell by publishing the Monad Manifesto, which articulated the problems we saw, our approach to solving them, and the components we would deliver.
We envisioned a distributed automation engine with a scripting language which would be used by beginner operators and sophisticated developers. PowerShell’s design was driven by the same thinking and values that drove the birth of DevOps:
Focus on the business

PowerShell has always focused on people using computers in a business context. PowerShell needed to be consistent, safe, and productive. Much has been made of the similarities between PowerShell and UNIX, but in this regard our ties are much closer to VMS/DCL and AS400/CL.

Consistent: Operators and developers don't have a lot of time to learn new things. A consistent experience lets them invest once in a set of skills and then use those skills over and over again. PowerShell uses a single common parser for all commands and performs common parameter validation, delivering absolute consistency in command-line syntax. PowerShell cmdlets are designed so that ubiquitous parameters can provide consistent functions to all commands (e.g. -ErrorAction, -ErrorVariable, -OutVariable, etc.).

Safe: An operator once told me that occasionally he was about to do something and realized that if he got it wrong, he would be fired. In PowerShell, whenever you execute a cmdlet that has a side effect on the system, you can always add -WhatIf to test what would happen if you went through with the operation. We also support -Confirm, -Verbose, and -Debug. Despite these safeguards, things can go wrong, and when they do, PowerShell spends a lot of effort speeding up the process of diagnosing and resolving the error.

Productive: Every aspect of PowerShell's design maximizes the power of users (ergo the name). PowerShell makes it easy to perform bulk operations across a large number of machines. PowerShell also makes it easy to have productive engagements between your operators and developers because it allows them to speak a common language and to help each other with their scripts.

Make change safe through automation

There has been a lot of discussion about whether PowerShell is a .NET language, a scripting language, or an interactive shell. PowerShell is a distributed automation engine with a scripting language and interactive shell(s).
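As a small illustration of the safety and ubiquitous parameters described above (the Spooler service is just a hypothetical target, and these commands assume a Windows machine):

```powershell
# -WhatIf previews the effect of a destructive operation without performing it;
# the cmdlet prints a "What if:" message describing what it would have done.
Stop-Service -Name Spooler -WhatIf

# -Confirm prompts before making the change; -Verbose narrates each step.
Stop-Service -Name Spooler -Confirm -Verbose

# The ubiquitous error parameters work the same way on every cmdlet:
# continue past a failure but capture it in a variable for later inspection.
Get-Service -Name NoSuchService -ErrorAction SilentlyContinue -ErrorVariable svcErr
$svcErr | ForEach-Object { $_.ToString() }
```

The same -WhatIf and -Confirm switches work on any cmdlet that supports them, which is what makes the safety model a property of the shell rather than of each individual command.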
Interactive shells and a scripting language are critical components, but the focus has always been on automation through scripting. Automation is the process of reducing or eliminating operations performed by a human. A script documents what is going to happen. People can review a script, and you can modify it based upon their feedback. You can test the script, observe the outcome, and modify it; if the modification is good, keep it, and if it is bad, back it out. In other words, scripting provides the repeatability required to apply the Taguchi method to IT operations. Once you have an automated process, you can safely apply it over and over again. These processes can now be performed reliably by lower-skilled admins. These steps aren't possible when you use traditional GUI admin tools.

Bridge the gap between developers and operators

Our goal has always been to deliver a single tool which could span the needs of operators doing ad hoc operations, simple scripting, formal scripting, and advanced scripting, and of developers doing systems-level programming. PowerShell spends a ton of effort projecting the world in terms of high-level, task-oriented abstractions with uniform syntax and semantics. We call these cmdlets, and they are what operators want in order to efficiently and effectively manage systems. In order to copy a file using APIs, you would do this:
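The code sample that originally followed this sentence is not preserved here. As a hedged reconstruction of the contrast being drawn (the paths are illustrative, not from the original post), a raw API call versus the task-oriented cmdlet looks like this:

```powershell
# API-style: call the .NET file API directly. It works, but it is low level
# and bypasses PowerShell's common machinery (-WhatIf, -Verbose, error records).
[System.IO.File]::Copy('C:\temp\source.txt', 'C:\temp\dest.txt', $true)

# Cmdlet-style: a single task-oriented command with uniform syntax that
# honors the ubiquitous parameters shared by all cmdlets.
Copy-Item -Path 'C:\temp\source.txt' -Destination 'C:\temp\dest.txt' -Force
```

The cmdlet form is what an operator can read, review, and safely rehearse with -WhatIf, while the API form is what a developer reaches for; the point of cmdlets is that both groups end up speaking the same language.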
Jeffrey Snover
Distinguished Engineer and Lead Architect for Windows Server