Insufficient data from Andrew Fryer

The place where I page to when my brain is full up of stuff about the Microsoft platform

January, 2012

  • ITIL and System Center 2012

    If ITIL is the Why, the What and the When of IT operations, then System Center provides the How.


    I have been quite harsh with the tick boxes here; for example, you could argue that part of Service Improvement could be to redesign a process, and this could be done in System Center (Orchestrator), and validation and testing management is in yellow as this requires Visual Studio Team Foundation Suite with its Lab Manager extensions to Virtual Machine Manager. However I can’t see System Center being used for evaluation management, particularly as one of the choices being evaluated could be System Center itself!

    ITIL is vendor agnostic, and so it isn’t dependent on anything from Microsoft; because of this it is more focused on the management structure than on how things get done.  To turn those broad concepts into practical reality in a Microsoft world, there is the Microsoft Operations Framework (MOF).  This is not a product, it’s a free solution accelerator designed to help you get the most out of your Microsoft infrastructure using ITIL-like best practice, and included in it is a cross reference between MOF 4.0 and ITIL v3.

    MOF is constantly updated, not just to keep up with best practice for deploying the latest versions of SQL Server, Exchange etc. but also to stay abreast of current infrastructure thinking like Private Cloud.

    No one seems to have time in a very lean economy to change how they do IT and adopt these processes; however, we do all seem to have time to rework, patch and fix the less than efficient methods we have in place today.  Therefore all I am suggesting is to free up a little time for planning and getting to grips with ITIL (and MOF if you are a Microsoft based outfit) to save lots of time fixing later, which I believe is more rewarding for you and for your business.


    Data Transformation Services is the bit of SQL Server that helped pay for my house, car and some really nice holidays.  I first got to use it some 12 years ago and, after working with BCP (bulk copy program), it was just so easy; I was sucking data out of AS400/RPG, Oracle and Sybase to create data warehouses in weeks rather than months. However, to be fair, it did have its faults, but for a free utility bundled with SQL Server (7 & later) there was no comparison.

    However those faults became more important as DTS was used more widely, and it lacked many of the capabilities of the standalone tools that were also around at the time:

    • It didn’t really have many good transformation tools, so you either wrote your own in ActiveX or you did the work in views/procedures on your target database.  This generally resulted in lots of temporary tables in your data warehouse and lots of disk activity.
    • Flow control was pretty basic; for example, if you wanted to pick up and load all the Excel files in a folder, you would have to do that in code yourself.
    • Parameter passing was pretty basic, if memory serves.
    • Moving from development to test to production was tricky because data sources were fiddly to manage.  In fact I remember embarrassing myself by issuing a truncate table statement on the source instead of the target at one client site; fortunately they had backups!

    DTS Packages like this can be really hard to understand (thanks to Neeraj Nagpal for the screenshot)

    There was no easy way to modify DTS to add these capabilities, and so SQL Server Integration Services (SSIS) came out with SQL Server 2005 late in 2005.  However you could still run those old DTS packages inside SSIS and even edit them if you needed to (details are here for running DTS in SQL Server 2008 R2).  This side by side capability continued in SQL Server 2008 & R2, but DTS was specified as a deprecated feature in SQL Server 2008, which was advance warning that running DTS packages wouldn’t be supported in the next version.  That next version, SQL Server 2012, is now out in beta and, as stated, there is no support for DTS in it.

    I first wrote a post about this problem nearly four years ago, and my advice at the time was to do a gradual migration from DTS to SSIS: whenever a significant change was needed to a DTS package, you would reengineer it in SSIS.  Another option is to use DTS xChange from Pragmatic Works, which does cost money but makes a very professional job of automating the conversion into a well designed SSIS package with proper logging. Finally you could just get in some data warehousing experts and they’ll do the work for you.

    Whatever you decide, DTS is pretty nearly dead, and while I do have a soft spot for it, once I learnt SSIS I realised how much was missing in DTS.

  • How to Plan an ITIL Infrastructure Implementation

    In the last guest blog post, Erin Palmer looked at how to manage IT chaos with the adoption of an ITIL infrastructure.  In this second guest post he takes a more detailed look at key points from the TWDC (Walt Disney Company) case study that can help you plan a successful ITIL implementation of your own.

    1. Generate the Buzz

    ITIL adoption takes skilful planning, a strategic implementation schedule, and the participation of key players who will be in full support of the transition. Follow the success of TWDC’s strategy and help your constituents see how the adoption of an ITIL infrastructure can address current IT concerns and will help them use IT more efficiently. Implement a top down educational plan and select key players for advanced training, or bring on talented ITIL leaders with the experience to help make the transition as seamless as possible. Once the people in the organization grasp the positive potential of ITIL for the company’s growth, the process takes on a life of its own. This process cannot take place without the commitment to the funding, time, and human resource development necessary to achieve success.

    2. Assemble Powerful Teams

    Taking the time necessary to build talented teams with leaders that have both the technical experience and the strong communication skills to articulate the overall vision and goals for the team is essential for success. The TWDC case study underscores the fact that successful ITIL adoption takes time. Looking at the current role of IT in your organization and being clear about what you would like to see with regard to data management, reporting, efficiency, delivery, maintenance, etc. is vital to building a plan with clear goals and measurable outcomes. Selecting a team that will help you implement the changes necessary to reach your new profit and efficiency goals is easier if you are clear about what you want to truly achieve with the adoption of an ITIL infrastructure. Putting the time in to train and assemble strong leaders for the project will build overall trust in the process and will help safeguard against breaks in service and diminish other challenges as the project gets underway.

    3. Keep Clear and Regular Communication a Top Priority

    From the moment you start to generate the buzz, until the process is complete and running smoothly, communication is vital to keep all constituents informed and connected to the momentum of the project. Every leader needs to be fully fluent in best practice methods for communicating technical and non-technical aspects of the process to a wide variety of users. The message needs to be adapted to the recipient, not the other way around. Skilled ITIL leaders are aware that the CFO, the help desk worker, and the marketing manager have differing IT related roles and will need to hear about the ITIL infrastructure engagement process in a language that makes sense with examples that are relevant. Throughout the process regular updates and the celebration of milestones builds confidence in the process and builds a more cohesive team.

    4. Strategically Build an ITIL that Serves Your Needs

    The best part about the ITIL infrastructure is that it is flexible. You can integrate what works without “reinventing the wheel.” Taking time to collect insightful research gained from involving all teams who use IT at the beginning of the project will lead to a stronger ITIL implementation with less adjusting later. Strong ITIL leadership can ensure that the ITIL infrastructure will grow with you. A strategic ITIL infrastructure bolsters revenue by streamlining processes like ordering and delivering products. Redundancy in storage is decreased, thus speeding up servers. Everyday processes like scheduling, stocking inventory, maintaining client communication, performing system maintenance, and generating specific data to track progress in key areas are all made more efficient, timely, and profitable with ITIL adoption. Data security also increases with ITIL, which can bolster client confidence. In order for ITIL to serve you well, you need to be clear about the realistic goals within your budget and then proceed if the resources are there to support the project.

    A strong ITIL infrastructure generates the data that is needed to make maintenance and long term adjustments efficient, leaving more time for growing your business and leading your organization into new markets with confidence. Engaging in industry-wide best ITIL practice methods will help you build the strategy you need to assure you have a successful and integrated ITIL infrastructure with all teams in your organization driving the momentum forward to new goals and profitability. As e-commerce continues to expand and competition in the global market place increases, ITIL data can be a powerful tool to guide your growth. Case studies show that implementing an ITIL infrastructure can increase profitability; these case studies also reveal that the plan takes time, company-wide support, and team cooperation to succeed: three important factors to weigh heavily when considering an ITIL plan for your organization.


    As I mentioned last time Erin works at Villanova University and this article comes from their new ITIL training course. This course is part of the overall IT Service Management training program.

  • How to Manage Chaos with an ITIL Framework

    I have to confess I don’t know too much about the detailed mechanics of ITIL, but a couple of months ago I got chatting to an expert, Erin Palmer from Villanova University in the US, and over the Christmas holidays he’s written up a really good post by way of an introduction.


    So you think you have IT chaos to manage? Imagine the IT services needed to tame the chaos of a multi-billion dollar conglomerate of 11 large-scale theme parks, two water parks, over 40 resorts, and a pair of cruise ships – and over 118 million customers annually. Did I mention these services also operate in all time zones across many languages and international borders every day all year long? Can you imagine an IT department of 1000?

    These are the statistics from the Walt Disney Company (TWDC) case study as they took on adopting ITIL best practices in the mid-2000s. Used since the 1980s in the United Kingdom to manage the IT services of large governmental entities, ITIL has proven its value time and time again in a variety of large, medium, and small business settings worldwide. The IRS, which processed over 236 million tax returns and collected more than $2.3 trillion in revenue in fiscal year 2009, uses ITIL; so does NASA, along with a host of other business entities that aren’t nearly this big or complex. One of the reasons why the ITIL structure continues to grow in popularity is that it is adaptable and flexible in nearly any business setting that uses IT to conduct its commerce.

    Looking at recent ITIL case studies, several important points emerge when thinking about how ITIL might help you manage the IT chaos in your business setting, no matter how large or small.

    1. ITIL processes are flexible and help manage the services IT provides as the business grows:

    · Working with existing IT structures, ITIL grows the business efficiently

    · Reducing redundant data storage, ITIL saves room on servers

    · Centralizing data storage means increased security of company data

    · Generating reports for monitoring progress is easier and more efficient with ITIL

    · Increasing consistency and dependability is an ITIL goal; business is streamlined

    · Maintaining the ITIL is efficient, resulting in less down time and “work-arounds”

    2. ITIL best practices generate data to address IT and departmental problems pre-emptively

    · Self-monitoring applications reveal areas that need attention before they become a problem

    · Generating reports to track how various IT factors work together is easier and you can assess for glitches in the system before they cause a problem

    · Measuring data and comparing goals across departments increases productivity and accountability

    · Implementing company-wide IT standards and guidelines means less is overlooked or repeated, and that there is increased communication and cooperation among departments regarding company goals

    3. Client Trust increases

    · Fewer outages and less down-time due to IT issues mean increased client satisfaction

    · Increased data security is a strong selling point

    · ITIL provides strong back-office support which positively impacts the user’s experience

    · ITIL means streamlined services that lead to faster response times when a client needs assistance

    4. The ITIL structure helps team building

    · When company goals are ubiquitous across all sectors progress is easier to monitor

    · Each team plays a unique role in reaching company goals and can keep track of contributions toward the goals

    · Teams know that company-wide standards are in place so everyone is on a level playing field when striving for a goal

    · Reports are easy to generate and share regarding company-wide progress

    · Communication between teams and with team leaders is more streamlined so response time can be faster when team members request changes or observe a situation where an IT adjustment would streamline services even more

    · Help desk requests are processed more efficiently and data tracks trends that need more careful attention

    As a business grows, the chaos of IT grows as well. By implementing an ITIL framework, not only can you harness some of that chaos, but you can make it work for you. By analysing the huge amount of data available to you in the ITIL framework, more specific reports are possible that indicate progress toward goals. Team building within the company is increased through the implementation of standard language, goals, and processes from one department to the next. Increased efficiency can mean increased profits, a more enjoyable IT experience, increased client satisfaction, and more “free” time for managers to promote company growth and increased profitability.

    Even though your company goals may not involve IT that provides a literal thrill ride for your clients, through examining case studies of companies that have successfully adopted ITIL best practices, you may just find that the lucky star you were wishing for is closer than you think. With its internationally recognized standards of best practice, ITIL offers the possibility for unlocking greater potential and momentum for increased growth, efficiency, security, and profitability in your company no matter where you are and what service or product you provide. Chaos in the global marketplace will continue to increase; ITIL offers a proven method for harnessing the chaos and riding it all the way to the bank for those willing to put the time in to prepare, invest in, and utilize an ITIL framework to its fullest potential.

    This article was submitted by Villanova University’s new ITIL training course. This course is part of the overall IT Service Management training program.

  • Some SQL Server 2012 upgrade advice for ISVs

    I have spent a lot of time recently briefing Independent Software Vendors (ISVs) on SQL Server 2012, so I thought a consolidated post on the subject might be useful for those planning to develop solutions on top of SQL Server 2012.

    New Features that will just work

    By this I mean there are some new things in SQL Server which you can take advantage of without changing your application.

    AlwaysOn allows you to make an application highly available by combining the best parts of mirroring and clustering without the need for a SAN or other shared storage. Note that this is in Enterprise edition.

    Report Alerting allows end users to set up conditions in a simple interface on any report and get an email when those conditions are met.  This needs SQL Server Standard & SharePoint Foundation (the free one) or higher.

    New Features that you can take advantage of in your application

    Development.  SQL Server now has SQL Server Data Tools, which you can deploy to Visual Studio 2010 to make application lifecycle management easier, for example simple tools to edit and compare schemas, and data-tier applications to make deployment of your application easier.  There is also Distributed Replay, which allows captured Profiler traces to be replayed on another environment, which might be a later version of SQL Server or simply a test server. The tools can either be installed as part of installing SQL Server or via the Web Platform Installer.

    Security.  The key security feature in SQL Server 2012 for ISVs will be contained database security, which will allow you to have all the security credentials built into the database you are using.
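    As a quick sketch of what that looks like (the database, user and password names here are just examples), a contained user authenticates at the database rather than via a server login, so the credentials travel with the database when it is moved or restored:

    ```sql
    -- Allow contained databases on the instance
    EXEC sp_configure 'contained database authentication', 1;
    RECONFIGURE;
    GO

    -- Create a partially contained database (name is illustrative)
    CREATE DATABASE MyAppDB CONTAINMENT = PARTIAL;
    GO

    USE MyAppDB;
    -- A contained user has a password of its own and no server login
    CREATE USER AppUser WITH PASSWORD = 'Str0ng!Passw0rd';
    GO
    ```

    The application then connects with that user, specifying MyAppDB as the initial catalog in its connection string.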

    File Table.  This builds on FILESTREAM to expose a new type of table, the FileTable, as a folder that can be used like any normal file folder, except that each file and subfolder is stored as a row in the FileTable.  This might be useful for storing unstructured data as part of your application.  Note that full text search and the new semantic search work well with FileTables. I have a post here on setting that up too.
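    A minimal FileTable setup might look something like this (the database, file paths and directory names are only examples, and FILESTREAM has to be enabled on the instance first):

    ```sql
    -- Assumes FILESTREAM is already enabled on the instance
    CREATE DATABASE DocsDB
    ON PRIMARY (NAME = DocsData, FILENAME = 'C:\Data\DocsDB.mdf'),
    FILEGROUP FSGroup CONTAINS FILESTREAM (NAME = DocsFS, FILENAME = 'C:\Data\DocsFS')
    LOG ON (NAME = DocsLog, FILENAME = 'C:\Data\DocsDB.ldf')
    WITH FILESTREAM (NON_TRANSACTED_ACCESS = FULL, DIRECTORY_NAME = N'DocsDB');
    GO

    USE DocsDB;
    -- Every file or subfolder dropped into the exposed share becomes a row here
    CREATE TABLE Documents AS FILETABLE
    WITH (FILETABLE_DIRECTORY = 'Documents');
    GO
    ```

    Once created, the table appears as a Windows share (\\server\instance-share\DocsDB\Documents) that any application can write files into.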

    T-SQL. There are a few new functions in T-SQL that might be relevant.
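    For example, a quick sketch of a few of the 2012 additions:

    ```sql
    -- TRY_CONVERT returns NULL instead of raising an error on a bad cast
    SELECT TRY_CONVERT(int, 'abc');                   -- NULL
    -- Inline conditional and concatenation shorthand
    SELECT IIF(2 > 1, 'yes', 'no');                   -- yes
    SELECT CONCAT('SQL', ' Server ', 2012);           -- NULLs become empty strings
    -- Date helpers
    SELECT EOMONTH('2012-01-15');                     -- end of the month
    SELECT DATEFROMPARTS(2012, 1, 15);
    -- Paging is now built into ORDER BY
    SELECT name FROM sys.objects
    ORDER BY name
    OFFSET 10 ROWS FETCH NEXT 5 ROWS ONLY;
    ```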

    Self Service BI

    In many situations the end user will want to combine data from your application with other sources.  The new self service BI capabilities in SQL Server 2012 can make it easier for users to do this in Excel and for this work to be scaled up and deployed to the rest of the business. To get the best out of this in your application you might consider:

    • Creating a suite of reports designed to expose the dimension-type information (product lists, chart of accounts etc.) that can then be consumed by the user as OData feeds in PowerPivot for Excel (OData is built into Reporting Services).
    • Creating a BI Semantic Model to map how your data is structured and add extra business logic (calculations and aggregations) so that business users can quickly build their own analytics and reports in tools such as the new Power View.  Note: this requires SQL Server BI edition and SharePoint Enterprise.

    What won’t work

    There are only a few things in SQL Server 2008 R2 that won’t work in SQL Server 2012. Microsoft has a process for announcing which features will go: in any given release there is a list of deprecated features, those that won’t be supported in a future release. This means there is plenty of advance warning, both to stop using a feature if you already are, and not to use a deprecated feature in any new design work.

    In SQL Server 2012 the list of features that are no longer supported is very short; i.e. almost everything that works in SQL Server 2008 / SQL Server 2008 R2 will also work in SQL Server 2012:

    • System stored procedures:
      • sp_ActiveDirectory_Obj
      • sp_ActiveDirectory_SCP
      • sp_ActiveDirectory_Start
    • the Surface area configuration (SAC) tool
    • and various command-line switches for installing SQL Server, so if you are deploying SQL Server as part of an application you’ll need to change the install script. I mention this because, while tools like the Upgrade Assistant will pick up what code is in your database, and Profiler and the SQL Server Deprecated Features performance object will track your usage of features that are going to be obsolete, there aren’t really any tools to check your installation process.
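    The deprecated features counters mentioned above can be queried directly, for example:

    ```sql
    -- Instance-wide usage counts, since the last restart, of features flagged
    -- as deprecated; a non-zero value means something is still calling them
    SELECT instance_name AS deprecated_feature,
           cntr_value    AS usage_count
    FROM sys.dm_os_performance_counters
    WHERE object_name LIKE '%Deprecated Features%'
      AND cntr_value > 0
    ORDER BY cntr_value DESC;
    ```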

    Upgrade Advisor and Upgrade Assistant

    Two confusingly named tools exist to put some science into your upgrade planning: the SQL Server Upgrade Advisor is a Microsoft tool, and the Upgrade Assistant is also free, provided by a top gold partner, Scalability Experts.  The Upgrade Advisor does a high level check of compatibility issues in moving from one version to another, while the Upgrade Assistant is a detailed tool for preparing and making trace replays to confirm that the code actually executing in an application works in the new version, so it can be used to check installed code executing in multi-tier applications as well as the objects inside any given database.  If you’re an ISV you’ll probably want to use both in your testing.


    DTS won't be supported in SQL Server 2012; for more on this check TechNet and my post on the subject.

    Anyway I hope that’s useful. Full details on SQL Server 2012 Editions & Licensing are here, and for more information on those new features visit the main SQL Server 2012 Resource Centre.

  • Anywhere Working actually works

    It’s Work Anywhere week, but working anywhere isn’t working everywhere. By that I mean that not all organisations allow their staff to work away from the office, even where this is possible and even sensible. So here’s my bluffer’s guide, as I do this quite a lot. At first sight working away from the office might seem all good for the employee with no obvious benefit to the employer; however, the reality is that there are downsides for employees and loads of benefits for employers.

    Let’s look at the employer first and the downsides of Anywhere Working.  I guess the obvious concern is loss of control: not knowing if your staff are doing what they ought to, and vague concerns about productivity.

    However there is a big price to be paid for having all your staff on site, not least the cost of that site. If a business implements anywhere working then it might only need desks for 60% of the workforce and not 80-90%.  My counter to the productivity argument is threefold:

    1. Your workforce is dependent on personal and public transport to get to work, and failures in these also lead to a loss in productivity.  If I cast my mind back to the snow and ash clouds last year, our anywhere working and unified communications meant that very few customer meetings and events were cancelled.

    2. Work Anywhere doesn’t just mean working at home.  If your workforce can get unified communications and some sort of VPN access to internal resources from a remote location, they can work on a client site, at public events like trade shows, and in coffee shops, hotels etc.  In my own case I worked at my mum’s house while she got over a cancer op at the start of the year, so I could care for her and get stuff done, and then when I was at a big show at Olympia the following week I could access internal SharePoint sites to get answers to the many questions I was being asked.

    3. The other major loss of productivity is sickness, and if staff can work at home in a reduced capacity they won’t bring their germs to work and affect the rest of their team.  This happens all the time at Microsoft, and it’s not just colds; it’s post-trip jetlag, sports injuries and, in my case, working at home after an emergency appendectomy.  Of course this requires two-way trust between managers and staff, but that should be there anyway. Migraines are my problem here and I can just work round them with the trust in place between me and my manager.

    The upsides for us employees are all pretty obvious: in my case, looking after mum after her cancer surgery, burning the midnight oil in the week to make a swift exit on Friday lunchtime, and working in New Zealand for a day while on holiday, as you can’t really have 5 weeks off back to back without doing a check-in and some e-mail triage.  Another great benefit is having stuff delivered at home rather than making endless trips to the post office to track down your latest Amazon & eBay purchases.  However there are some downsides:

    • It’s all too easy to check e-mail whenever so you need to set aside time to unplug from the office
    • If you are unable to work you need to say so and take time out, no one is indispensable! 
    • You’ll save money and a lot of time by not commuting to work but season tickets on public transport aren’t economical for a once or twice a week trip to the office.
    • Another cost is that your energy bills at home will rise as you’ll need to heat & power a home office.
    • Most employers will cover your broadband costs, but the cost of having your own home office will be down to you.

    So I think Anywhere Working is a good thing, and while I am more than happy to work for Microsoft even if they didn’t support this, I am more effective and productive because they do, in principle and in practice.  Check out Anywhere Working for more on this and for encouraging your organisation to think about it; for example, I punched in my daily commute and assumed I would work at home 3 days a week.


  • Educated desktops

    Simon and I spent most of last week on stand duty at BETT, one of the largest education events in the world. We were there to field questions from teachers and some of the hardest working IT professionals, those supporting the IT in schools.  Agility is essential to cope with the new influx of students every year, as is the need to deploy ever more applications to keep up with the latest standards for the curriculum and the way each subject is taught.  Some of these questions are relevant to all of us, so I thought I would post some of the discussions.

    Teaching the next generation of IT Professionals

    There was a lot of coverage in the press last week about teaching coding and development as part of ICT; however, I had two separate requests from ICT teachers about teaching how to maintain and fix problems on PCs, because that’s what their students had asked for.  We discussed setting up virtual machines on Hyper-V and using snapshots to allow a damaged desktop to be fixed and then reset with the problem for the next lesson.  I also think some of the information on clustering and virtual machines on the Microsoft Virtual Academy could be reused in classrooms.

    Remote Desktop Services & App-V.

    One way to deal with the problem of matching up students and teachers to the applications they need, irrespective of where they are working, is to use App-V (application virtualisation), as this deploys a virtual copy of an application to a desktop based on the groups a user belongs to; it won’t show up in Programs in Control Panel, and it can run side by side alongside earlier versions of the same application with which it would normally conflict.

    Another approach is to use Remote Desktop Services (RDS), and it was no surprise at BETT to see all the hardware vendors sporting their latest thin client devices; personally I like the LG and Samsung offerings where the thin client is just part of the LCD panel.  However not every application likes running as a remote desktop, and you can end up creating a lot of remote desktops for each type of user.  The trick here is to use App-V with RDS so that the applications run virtually inside the remote desktop session and a given user only gets the applications they need, even though you only have one or two standard desktops in RDS (the guidance on how to do this is here).

    Another good thing about RDS is that it reduces heat in the classroom if thin client devices are used and also reduces the background noise, although the noise from pupils will still be the same!  It is possible to implement RDS without also deploying Citrix or Quest technologies on top, however both of these partners’ offerings add ease of use and manageability to what the raw RDS experience delivers.

    Digital Inclusion

    RDS can be set up so that these personalised remote desktops are available to staff & students working at home or in other locations, and this means they can use their own devices to interact with a school.  Of course laptops are expensive and can be difficult to justify on a limited budget, so to level the playing field there is Get On Line @ Home, which provides affordable reconditioned hardware with Windows 7 + Office 2010, with telephone technical support included.

    ..and Finally

    One of my colleagues was asked for a whitepaper so he naturally wanted to know on what topic as we have loads of them,  the answer came back “no I just want some whitepaper” and the delegate grabbed some blank A4 sheets off the stand!