Anthony Bartolo
This week I’m in Seattle at an internal Microsoft conference called TechReady. If I hadn’t already felt like I was drinking from a fire hose, I certainly do now! It’s a mecca of technical presentations, hands-on labs and webcasts on everything from architecting IT solutions to developing applications to deploying and monitoring systems, plus everything in between. One of the best things about this week, though, has been the opportunity to network with and get to know some of my colleagues.
In fact, I was chatting with one of my colleagues when he mentioned something called DSI and the concept of modeling your IT environment. I had no idea what DSI was – there are far too many acronyms used around here – and I wasn’t sure how practical what he was describing about modeling really was. But he caught my interest.
The idea sounds good in theory: you create a description of your infrastructure so that applications, as they are developed, can be checked against the model before deployment. The intended state of the system is defined, the actual state is discovered, and the difference between the two is measured against acceptable tolerances. Any inconsistencies – such as a dependency on something not installed on the target machines, or a requirement that giant holes be punched in the firewall – can be sorted out during development instead of being discovered during the pilot phase.
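To make the idea concrete, here is a minimal sketch of that pre-deployment check: an application declares what it needs, a model describes what the target environment provides, and the two are compared before anything ships. Everything here – the function name, the data shapes, the sample server – is a hypothetical illustration of the concept, not part of SDM or any Microsoft tooling.

```python
def check_deployment(app_requirements, environment_model):
    """Return a list of inconsistencies between what the application
    declares it needs and what the modeled environment provides."""
    problems = []

    # Dependency check: is everything the app relies on actually installed?
    for dependency in app_requirements["dependencies"]:
        if dependency not in environment_model["installed"]:
            problems.append(f"missing dependency: {dependency}")

    # Firewall check: does the app need ports the target doesn't allow?
    for port in app_requirements["ports"]:
        if port not in environment_model["open_ports"]:
            problems.append(f"blocked port: {port}")

    return problems


# A toy model of a target server, and one application's declared needs.
model = {"installed": {"IIS", ".NET Framework 2.0"}, "open_ports": {80, 443}}
app = {"dependencies": ["IIS", "SQL Server"], "ports": [80, 1433]}

for issue in check_deployment(app, model):
    print(issue)  # flags SQL Server and port 1433 before deployment
```

The point is that the mismatch surfaces as a list of actionable problems during development, rather than as a mysterious failure during the pilot.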
But realistically, who has time to build and maintain such a thing? Many, if not most, IT departments have difficulty keeping their network and systems documentation current – there just aren’t enough hours in the day, and there are too many higher priorities.
After our conversation, I went and did a little research. DSI, or the Dynamic Systems Initiative, is a Microsoft-led effort to automate how businesses design, deploy and operate distributed systems. Basically, it’s a way of capturing knowledge about systems so that well-defined management tasks can be automated, reducing support costs (because systems are well understood and issues are discovered before applications are deployed) and lessening the human mistakes and omissions inherent in a manual process.
I didn’t realize until I looked it up that this technology is available today in Visual Studio Team System and is being built into the System Center family of products through a modeling language called the System Definition Model (SDM). If you’re interested in knowing more about DSI and SDM, there are whitepapers here.