Insufficient data from Andrew Fryer

The place where I page to when my brain is full up of stuff about the Microsoft platform

July, 2010

  • .VHD, one of my favourite file formats

    Virtual Hard Disks, or VHDs, are not just the format used to hold a hard disk for use with a Hyper-V virtual machine; they also have at least three other uses I know of:

    • They are the format used by Windows backup, so making a virtual machine from a physical one (aka P2V) is simply a matter of making a backup, creating a new Hyper-V virtual machine and pointing it at this file.
    • You can also boot directly from a VHD if the operating system on the VHD is Windows 7 or Windows Server 2008 R2.  This may sound like virtualisation but it isn't, because no hypervisor technology is needed; in fact you can set up Hyper-V inside that VHD-booted OS and run virtual machines from it.  (I have more on this here, and there's a sketch of the boot entries involved just after this list.)
    • For me the most useful feature is that they can be mounted into Windows 7 or Windows Server 2008 R2 using Disk Management in either OS.
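
    To give a flavour of the boot-from-VHD option above, here is a minimal sketch of adding a boot entry with bcdedit from an elevated PowerShell prompt. The description, path and GUID below are only examples; the /copy command prints the real GUID to paste into the later commands.

        # Copy the current boot entry and point the copy at the VHD
        bcdedit /copy '{current}' /d "Windows 7 demo (VHD boot)"     # note the GUID this prints
        bcdedit /set '{guid-from-previous-step}' device vhd=[C:]\VHDs\win7demo.vhd
        bcdedit /set '{guid-from-previous-step}' osdevice vhd=[C:]\VHDs\win7demo.vhd
        bcdedit /set '{guid-from-previous-step}' detecthal on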

    I use this a lot in building demo environments because I can mount a VHD on a running environment (physical or virtual) and copy stuff to and from it.  

    First of all I need to create a VHD, which can be done either under the Hyper-V role or under Disk Management in Server Manager (which I find is faster). I simply go to the Actions pane and select Create VHD:

    a create VHD

    In the wizard I specify its size, location and what kind of disk it is (dynamically expanding or fixed). In the demo world dynamically expanding is fine, and even though I have specified 10GB the VHD will start at 4KB until I start to do anything with it.

    b create VHD size

    Click OK to finish and the VHD now shows up in Disk Management (with a pale blue icon to show it's a VHD)..

    c create VHD done

    Now you can initialise it, create partition(s), format it and assign it a drive letter as you would any physical disk (in this case I now have Disk Management open in Server Manager)..

    k vhd mounted and ready to use
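
    If you would rather script those steps than click through the wizard, a rough equivalent is to feed diskpart a script from an elevated PowerShell prompt. The path, size and drive letter here are just examples.

        # Create a 10GB dynamically expanding VHD, attach it, then partition, format and assign a letter
        $dp = 'create vdisk file="D:\VHDs\resources.vhd" maximum=10240 type=expandable',
              'select vdisk file="D:\VHDs\resources.vhd"',
              'attach vdisk',
              'create partition primary',
              'format fs=ntfs label="Resources" quick',
              'assign letter=R'
        $dp | Out-File "$env:TEMP\make-vhd.txt" -Encoding ascii
        diskpart /s "$env:TEMP\make-vhd.txt"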

    So now I can copy whatever I want into it, such as installation files, databases, scripts etc.

    When I have all the stuff I need on the VHD, I just come back to Disk Management and detach it, by selecting it, right-clicking and choosing Detach VHD

    l detach vhd from physical OS

    as I can't use the same VHD in more than one OS at the same time.

    Now it's detached I can attach it to a virtual machine.  The virtual machine must be off to do this, so this is unlike an .iso file, which you can mount at any time but which is of course read-only once you've made it.

    Anyway I can now go to the settings of a non-running virtual machine in Server Manager, in this case my Windows 7 client virtual machine. I select IDE Controller 0 and the option to add a hard disk

    m add new hard drive to vm

    I specify the path of the resources VHD I created above…

    n specify vhd  to add to vm

    ..and click OK to attach it to the VM. 

    Now I can start the VM and see my resource disk with all the stuff I put on it..

    o VHD in guest VM

    At this point I have usually forgotten to add something to this VHD that I needed. All I need to do now is stop the VM, attach the VHD on the physical OS again, copy the extra stuff, detach and start the VM again.

    It gets better in that VHDs can be reused in this way on any VM, and I could even do all this attach/detach business in PowerShell.  BTW, if you want to use PowerShell then refer to this excellent post from Taylor Brown, a member of the Windows Server core team.
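
    For the host side of that attach/detach dance, a minimal sketch (again a diskpart script, with an example path) is below; attaching the VHD to the virtual machine itself goes through the Hyper-V WMI provider, which is what Taylor's post walks through.

        # Detach the resource VHD from the host so a VM can use it;
        # swap 'detach vdisk' for 'attach vdisk' to remount it on the host later
        $dp = 'select vdisk file="D:\VHDs\resources.vhd"',
              'detach vdisk'
        $dp | Out-File "$env:TEMP\detach-vhd.txt" -Encoding ascii
        diskpart /s "$env:TEMP\detach-vhd.txt"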

    You can do all of this in Virtual PC as well and move VHDs between the two.

  • Spare a thought for your hard working helpdesk

    It’s certainly no fun being on the helpdesk, and I can’t see this job getting any easier as a result of any kind of cloud implementation. If anything it will initially be a bit harder to support across all the moving parts, as identifying the cause of a problem will be hard and some of the infrastructure will be beyond the control of the help desk.

    All helpdesks are under huge pressure as the economy wobbles again, and the only certainty is ever-closer scrutiny of overheads. Ignoring the obvious question of why anyone would want to do this thankless task, how do you keep on top of the issues? Processes need to be tracked, and you need to demonstrate you are doing a great job without burning extra time filling out forms on what you have been doing.

    The answer is integration. The business has this with SharePoint, CRM, Outlook etc., and so they only have to use one experience to get their work done.  In the world of the help desk, there is System Center in all its guises for the IT Pro to spot faults and control patching and fixing. However, what's been missing until recently is an integrated help desk solution.  Enter Service Manager 2010 (SCSM), a cunning piece of CRM-style software that integrates with the other bits of System Center: Operations Manager (SCOM), Configuration Manager (SCCM) and Virtual Machine Manager (SCVMM).


    As you can see this is a familiar world for anyone who has used Windows Server or Outlook. There is a lot of great stuff hidden in here:

    • Processes are designed around best practice to support the current standards of IT management
    • SCCM will automatically create incidents where PCs are out of compliance
    • A knowledge base where expertise can be shared across the IT team
    • A self-service portal for users to see what's going on (such as services being down) and help themselves with the simpler problems

    As a BI buff I also noticed the options at the bottom for Reporting and Data Warehouse.  This data warehouse is going to round up all of those statistics you need to report on. To present this raw data to good effect there is an additional (and free) solution accelerator here.

    This allows you to quickly create web parts for Windows SharePoint Services (WSS 3) from the data warehouse..

    image


    As with the rest of the System Center line-up there is also a healthy ecosystem of third-party add-ons so that specific hardware and other platforms can be supported centrally. For example, there's an add-in for Intel vPro technology here to enable better remote access to desktops, and this add-in from Provance Technologies delivers IT asset life-cycle management by providing a new process management pack for SCSM.

    There’s a trial edition here and you’ll also find the System Center Product team’s blogs on SCSM useful should you decide that this is something your helpdesk would benefit from.

  • Simon May

    Contrary to popular belief, Microsoft staff are humans who have often led varied and interesting lives before being absorbed into what Slashdot describes as the collective. A lot of people actually want to work for Microsoft, and so we have recently seen a new batch of interns and graduates who have had to fight off fierce competition to be accepted.

    In a similar manner we have a shiny new technical evangelist on our team, Simon May.  Some would say he married a Microsoftie, to improve his chances of landing the job.  All I know is that he’s going to fit right in anyway and that he’s not alone in having a partner who also works for Microsoft, but enough of that.

    In Simon’s words..

    “I’ve just joined the evangelism team helping Andrew look after IT Pros in the UK. I’ll be focusing on our client technology including deployment, management and use – obviously that can’t be done without exploring our server platforms too. Helping keep UK IT Pros in touch with what’s going on in the cloud, how it helps them and their business and how they can adopt a more cloudy outlook is an exciting challenge too. My other main focus is working out how Microsoft works being the newbie.”

    You can find Simon's blog here, and he tweets as @simonster.

    So I'll continue to widen my technical skills across all the stuff that happens in the data centre, not just my first love of BI and SQL Server.

  • Building a Demo sandbox using Hyper-V part 4

    As part of a random series of posts on my demos, I have stopped using my Shuttle and am now using an upgraded laptop (Lenovo T61p):

    • 8GB RAM
    • 160GB SSD in the primary drive bay
    • 500GB 7,200rpm hard disk in the secondary bay in place of the on-board DVD drive

    The thing I have lost in the process is that I now only have two cores on the laptop compared with four on the Shuttle. However, I just use the new SSD to run virtual machines and this more than compensates for the loss of computing power.  Not only do the virtual machines spin up much more quickly, they also seem to actually run faster, especially my BI box running SQL Server 2008 R2 and SharePoint 2010.

    Using a laptop for hosting virtual machines has two advantages which make it ideal for demos (excluding the obvious benefits of size and weight, that is):

    • It has a battery, so I can spin stuff up a couple of hours before I need it.
    • It has on-board wireless networking.  A word on this: in Windows Server 2008 R2 you need to turn on the wireless networking feature to use any wireless adapter, and Hyper-V doesn't allow you to create a virtual network based on a wireless adapter, so what you can do is simply create an internal network and then bridge that internal connection to the wireless adapter (there's a sketch of enabling the feature just after this list).
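
    A quick sketch of turning that feature on from PowerShell on Server 2008 R2; the feature name below is the one I would expect, so confirm it with Get-WindowsFeature first.

        Import-Module ServerManager
        # List anything wireless-related to confirm the feature name on your build
        Get-WindowsFeature *wireless*
        # Install the wireless LAN service (assumed feature name on 2008 R2)
        Add-WindowsFeature Wireless-Networking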

    I have also broken a few virtualisation golden rules - I have PowerPoint installed and the Desktop Experience feature enabled for video streaming on the host operating system (Windows Server 2008 R2), so I can present from this machine if I need to.  BTW, if you want to fool your public at the expense of a little performance, then add the Desktop Experience feature and set the Themes service to automatic to make Windows Server 2008 R2 into a convincing Windows 7 client experience, complete with Aero Glass (if you use that theme)..
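
    As a rough sketch of doing that from an elevated PowerShell prompt, with the feature and service names as I would expect them on a default Server 2008 R2 install:

        Import-Module ServerManager
        # Desktop Experience brings the Windows 7 themes, Media Player and friends to Server 2008 R2 (reboot required)
        Add-WindowsFeature Desktop-Experience
        # After the reboot, let the Themes service start with Windows and start it now
        Set-Service Themes -StartupType Automatic
        Start-Service Themes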

    win2k8r2 bing theme

    Don’t try this in production:

    Windows Server 2008 R2 with Aero Glass plus the Hyper-V role and Office 2010 installed.

    One final note: I needed to use the Windows 7 driver for the graphics adapter to get Aero Glass working, and more importantly I needed to do this anyway to duplicate the display to a projector or connect a second screen.  I mention this because you might see some posts suggesting that you stick with the generic display adapter for performance, but what I am after here is good demo performance, so I need to show stuff on screen and take a possible performance hit.

  • Woodland Trust and Green IT

    I was lucky enough to meet Lionel Wilson, head of IS at the Woodland Trust, at the Windows 7 launch last year.  He is now starting to implement what he saw of Hyper-V and Remote Desktop Services in Windows Server 2008 R2 as the Trust prepares to move into its (literally) cool new HQ building..

    DEEP4156 

    Lionel Wilson outside the new Woodland Trust HQ

    To understand why this should be of interest to you, it is important to understand that charities like the Woodland Trust are often under the same constraints as many businesses; they are run on sound commercial principles to ensure that as much as possible of the funding they receive goes toward the cause they are supporting. They are run by professionals who may well have come from big business (Lionel being a case in point) who have decided to use their talents for good causes.  All this affects IT in two ways: budgets for infrastructure are very tight, and they simply cannot afford to pay the competitive IT salaries you might see in the City. Green charities also face the additional requirement of being seen as leaders in adopting sustainable technologies, to show what is possible.

    So Lionel thought it would be good to do a series of interviews on how he and his team are implementing all of this, as it's one thing to read a case study but another to understand how sustainable IT and data centre optimisation work in practice, what the problems were and so on.

    The first interview introduces what the Woodland Trust does and why they decided to replace VMware with Hyper-V. It's on TechNet Edge here, or you can stream it by clicking on the image below

    image

  • Stop listening to your remote users

    If you have a mobile workforce, you're most likely going to be listening to them complaining about remote access/VPN/RAS or whatever they call it.  My wife is a case in point: she is running XP plus a load of third-party software to lock it down, to the point where it barely works and often won't connect to the corporate VPN. All of this results in her calling the helpdesk at least once a week, which isn't good for her or good for them.

    By contrast I also work at home, but the IT department (MSIT) are so proud of the solution they have come up with for remote workers that they have a case study on how they do this.  Their answer is a combination of DirectAccess (a built-in role in Windows Server 2008 R2) and Forefront Unified Access Gateway (UAG).  You can do this without UAG, but if you want down-level support for XP clients or Windows Server 2003 then UAG is needed.

    If you are using Windows 7 clients, UAG adds extra security (for legacy apps etc.) and load balancing.  For the IT Pro setting this up there is no client software involved, and apart from a Group Policy update there is nothing to configure on the client, as Windows 7 has DirectAccess functionality built in.

    UAG and DirectAccess

    But our IT department is not stopping there, and I am now on an internal beta (we call it dogfood) which uses the Trusted Platform Module (TPM) built into many modern PCs instead of a PIN and smartcard.  You may have heard of the TPM as a prerequisite of BitLocker, the disk encryption system built into Vista and Windows 7. BitLocker is mandatory for Microsoft client machines, and this latest incarnation of our DirectAccess deployment relies on that being implemented.
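
    If you want to check where a Windows 7 or Server 2008 R2 machine stands on those prerequisites, here is a quick sketch from an elevated PowerShell prompt:

        # BitLocker status of the system drive (manage-bde.exe ships with Windows 7 / Server 2008 R2)
        manage-bde -status C:
        # Is a TPM present? (returns nothing if the machine has no TPM)
        Get-WmiObject -Namespace root\cimv2\Security\MicrosoftTpm -Class Win32_Tpm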

    What does this look like for me as an end user?

    The only indication I have that I even have DirectAccess is this notification in the system tray..

    image

    which kicks into life as soon as I have any kind of internet access, without any intervention from me, so as I am typing this at home I could hit any of the many corporate web sites to book leave, download some dogfood or remote desktop into one of the machines I have left at the office.

    Another annoying remote-worker niggle that DirectAccess removes is that it works over the standard SSL port (443), where third-party solutions are normally blocked, and so it works on virtually any public wi-fi network, e.g. hotels, cafes, client sites etc.

    It's also intelligent enough to route traffic efficiently - if I hit an internet site while I'm connected, that traffic goes through my ISP as normal and doesn't get rerouted through the VPN and out onto the internet from the corporate network, so I still have the same internet access performance as I had before and I haven't slowed down the office network either.

    So if you want to stop listening to all of those users with VPN support calls, then you could try this out on a few VMs using this deployment guide to get you started. Then you can call your users to have a discussion about the feasibility of using DirectAccess in your organisation.

  • Is the Tablet PC dead?

    I have the luxury of two laptops: one for infrastructure demos and one for the day-to-day stuff most of us need to do.  I have added luxury on top of that because my day-to-day machine is a multi-touch tablet (Dell XT2).  Before I get the iPad comparison comments at the end of this post about multi-touch screens, I would agree the iPad wins hands down on feel and precision.

    However I can use the optional stylus for real precision, plus I need to run random stuff on my laptop like Adobe Photoshop to edit my sketches for print & online use..

     

    SDC11397

    and Microsoft Expression Encoder / Live Movie Maker for videos, on top of all the usual Office applications any IT professional needs to use.

      I also need some of the other hardware that this tablet has:

    • A keyboard for entering lots of text (like this post).
    • An SD Card reader to pull in pictures (like the one above)
    • USB ports to connect my MP3 player, memory sticks, and a webcam (which should have been built in IMO) 
    • A smartcard reader, until we fully transition away from using them for authentication
    • A Trusted Platform Module (TPM) to encrypt the data that's on here.

    Sure it is heavier, more expensive and has less battery life than the cool gadget of the moment, but I only need one device and one mains adapter to be able to do most of my work and keep up to date with the world. 

  • Woodland Trust part 2

    In my second interview with Lionel Wilson, Head of IS at the Woodland Trust, I wanted to explore the challenges of moving from this..

    DEEP4154

    into a new sustainable HQ that wouldn't look out of place on Grand Designs, albeit on a larger scale. It's packed full of passive and active cooling tech, to make it as sustainable as possible without costing any more than a traditional new build.  You can't get too much of a sense of it at the moment, but here's the new data centre, with a specially strengthened base in the middle for the servers and UPS

    DEEP4159

    and there are a few stills in the video..

    image

    Lionel is joined by his ops manager Richard Otter as they explain that their approach to IT follows that of the rest of the Trust, by being innovative but not expensive.  I was also keen to find out how you go about doing a project like this and what the risks are.

    My plan is to continue to chat with Lionel and his team as they prepare to move in September, to see how it all goes and share some of their experiences to help you with your migration projects.

  • Integration, Integration, Integration

    Microsoft is all about choice, so instead of having one way of shifting data between two different platforms we have three.  I get asked about two of these a lot, but I also wanted to discuss the latest tool in this space, to make sense of all of them and understand when to use what:

    • Integration Services, included in Standard edition and up of SQL Server since SQL Server 2005
    • BizTalk Server, a product in its own right
    • StreamInsight, in the Enterprise (single-threaded) and Datacenter editions of SQL Server 2008 R2

    So what’s different about these three?

    BizTalk is a workflow service that enables integration of disparate systems, but also allows human intervention and business rules to be included.  It works at the transaction level rather than being used to bulk-move lots of data.  If you're a developer reading this you'll be familiar with Windows Workflow Foundation, and BizTalk is where you can scale and run the workflows you've written.  So BizTalk is for things like enterprise application integration (EAI), for example synchronising or exchanging data between a company's warehouse and enterprise resource planning (ERP) systems.

    Integration Services is about lifting and shifting bulk data between two systems. Although it is part of SQL Server, neither the source nor the destination needs to be a SQL Server database. It also has complex transformation rules and IMO can pretty much do everything Informatica can do, as fast as it can do it.  I know this because I used Informatica in production before joining Microsoft. Typically Integration Services is used to populate data warehouses and run associated tasks (building cubes, running reports, sending alerts etc.).
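
    Those packages are usually scheduled through SQL Server Agent, but as a quick sketch you can also kick one off from PowerShell with dtexec; the package path here is made up.

        # Run a file-based Integration Services package
        # (packages stored in msdb are run with the /SQL option instead)
        dtexec /F "D:\ETL\LoadWarehouse.dtsx"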

    StreamInsight is the new shiny thing and is about working with data as it appears in a feed, in near real time.  I have already done a few posts on that and the StreamInsight blog has some good samples, but to précis those, it's a service that you write rules for using LINQ (Language Integrated Query) in Visual Studio.  It is being used for telemetry monitoring in manufacturing, and trading and fraud analysis in the financial sector.

    So three different integration tools for three very different use cases. 

  • Consolidation 101

    I spent yesterday at the Hampshire ICT conference presenting on virtualisation on behalf of one of our partners, Medhurst IT. What made my session here unusual and interesting was the audience; there were quite a few teachers who wanted to make sense of virtualisation, both for themselves and to explain it to their pupils studying ICT.  I thought it would be good to share this, as there are still many people who don't know too much about the topic.

    The top priority for IT departments in this uncertain world is consolidation, i.e. getting more done with less. Virtualisation is seen as synonymous with this, but it isn't necessarily so unless it's done right.  Before I get into that, what is virtualisation anyway?  It's the business of detaching what we are doing on a physical computer (be it client or server) so that we can move that work to other computers at will.  We can virtualise a number of things:

    • We can virtualise what our servers are doing
    • We can virtualise what the client does, and allied to but separate from this ..
    • We can virtualise the desktop our users are using, and at a lower level of detail ..
    • We can virtualise an individual application, be it something simple like Adobe Reader or a full-on application like Visual Studio.

    Server virtualisation enables lots of virtual servers to exist on one real, physical server; it has been around for years on mainframes and is now mainstream for Windows and Linux servers.
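
    To make that concrete, you can ask a Hyper-V host what virtual servers it is carrying; a rough sketch against the 2008 R2 WMI provider:

        # List the virtual machines registered on a Hyper-V (2008 R2) host and whether they are running
        Get-WmiObject -Namespace root\virtualization -Class Msvm_ComputerSystem |
            Where-Object { $_.Caption -eq 'Virtual Machine' } |
            Select-Object ElementName, EnabledState    # EnabledState 2 = running, 3 = off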

    Terminal Services, or Remote Desktop Services as it is now called, has long been the traditional method of virtualising a desktop for a client to use. This can also be used with thin client devices in place of full-fat PCs.

    That leaves the two newer forms of virtualisation: Virtual Desktop Infrastructure (VDI) and application virtualisation.

    VDI is simply the business of using server virtualisation technology to provision a virtual machine running a client operating system for each of your users.  I can see a few niche use cases where this might offer more flexibility and capability to certain types of users, but in reality Remote Desktop Services can cover a lot of the same use cases and needs less computing power to provide the same services as VDI.

    Application virtualisation allows an IT Pro to identify a bunch of users or computers that need a piece of software and then stream it down from a server when required.  The clever bit is that it doesn't actually install anything, so multiple versions of things like Excel can happily co-exist.  Central control also allows best use and control of limited licences of expensive software, e.g. Visual Studio Team System or Adobe CS5.  However, this technology means that the software is running on the client machine. The other way to provide an application to a client without installing it is to use remote desktop technology to deliver just the application. However, you need to be permanently connected to the network to use any remote desktop technology, whereas with application virtualisation you only need to be connected to download the application.

    However, none of these approaches to virtualisation will achieve any real savings without really good management, any more than an eco-friendly car will save carbon unless driven carefully.  While running six virtual servers on one physical server will save some power and rack space, each of those virtual servers still has to be managed (patched, updated, audited etc.), so you can't downsize the IT team just because you have done this.

    To really get the benefit of virtualisation you need solid management tools to manage both the physical and virtual servers from one screen.  These tools aren't just about patching and updates (though that's essential); they're also about performance monitoring, to ensure you are getting the most from the data centre and to have the information to prevent problems rather than suffering unplanned downtime.

    Microsoft's approach to virtualisation is odd in that you don't actually license any of it; rather, it's built into the operating system or, in the case of application virtualisation, you get it as a free extra when you have Software Assurance.  However, Microsoft does have paid-for products for managing your virtual infrastructure: System Center.  One other free thing you will need to get you started is the Microsoft Assessment and Planning Toolkit, which will identify what Microsoft products (versions, service packs etc.) you have, so you have a baseline to plan your virtualisation strategy.

    Further Reading.

    A quick scan of the web using your favourite search provider will no doubt give you a bunch of resources on virtualisation from Microsoft's partners and competitors, but if you want to know more about Microsoft's approach to virtualisation then follow these links:

  • Things you would never hear a DBA say

    Twitter is great fun, and switched-on DBAs use it for two reasons: they have their servers under control so they have time for a bit of fun during the day, and they occasionally need to phone a friend to get a tech tip or share some insight.

    I caught this hashtag, #thingsYouShouldNeverHearADbaSay, in a quiet moment 10 minutes ago, and it's like a SQL version of Mock the Week. I wanted to post a few of them here to have a bit of fun and possibly enlighten a few of you:

     

    • @SQLCraftsman: I don't have to worry about I/O because I use a SAN. 
    • @SQLChicken: RAID 5 is our backup strategy!
    • @jayape: I just made you a sysadmin, now you don't have to keep asking me for permissions
    • @aaronbertrand: I've changed all of the databases in production to use both AutoShrink and AutoClose
    • @paulrandal: Nope - the backups were on that drive too.
    • @simon_sabin: Just use sa, the password is blank
    • @chrissie1: Excel can do this and much more
    • @mike_walsh: Wow. It really IS the database! Sorry, guys... I've been blaming the developers and storage guys so long!
    • @onpnt: We have more than one sql server installed here?!

     

    No doubt there’ll be more to follow, but you might want to follow a few of the guys mentioned here, they do actually know what they’re talking about. BTW you’ll find me on there as @DeepFat.

  • Managing SQL Azure

    Effective management of the cloud is essential, and SQL Azure is no exception to this.  Until a couple of days ago the only way to do this was to fire up SQL Server Management Studio in SQL Server 2008 R2. However, the Azure team have now released a public beta of Project Houston, a lightweight web-based management console.
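
    If you would rather script against SQL Azure than use either GUI, a minimal sketch with sqlcmd looks like the following; the server, login and database names are made up, and it will prompt for the password.

        # Query a SQL Azure database from the command line (note the login@server format)
        sqlcmd -S tcp:myserver.database.windows.net -U myadmin@myserver -d MyAzureDb -Q "SELECT TOP 10 * FROM dbo.dimDate"

    With Houston, though, all I need is a browser.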

    I simply connect..

    image

    to get a screen full of Silverlight loveliness…

    image

    I can then do all the basic stuff I need with my database, such as view and modify table designs, run and save queries, and create new tables, views and stored procedures. Here I am looking at the data in dimDate..

    image

    from where I can also add and delete rows.

    The more astute among you will realise that despite the gloss, the tool is quite basic at the moment.  I agree entirely, but I would emphasise the 'at the moment' in that statement. One of the game-changing things about cloud-based solutions is that they are regularly refreshed, and because they are cloud based I can use the new version straight away as there's nothing to deploy.  In the case of SQL Azure this rate of change is unique among database platforms. For example, since its launch in January, SQL Azure has added spatial data type support and is now available as a 50GB offering, in addition to this new management tool.

    Apart from that SQL Azure is really really boring as you just use it like any other SQL Server Database, and when it comes to databases boring is good.

  • Social Office

    A lot of people who use Facebook also use Office, so Microsoft FUSE Labs have used Azure to provide Office document sharing between Facebook users, in much the same way as Facebook users can now share pictures and videos.

    I simply went to http://docs.com and signed in with my Facebook account to get started.

    I then went back to Facebook, clicked on the + on the tabs and selected Docs to get a Docs tab..

    image

    and now I can create a document in Facebook, in this case PowerPoint..

    image

    Notice I am now using the Office 2010 Web Apps, although I could elect to open full-fat Office and edit the deck that way if I wanted to. Notice I also have all the usual Facebook social tools to the right, including privacy.  I can also upload one I made earlier, such as my SQL Server consolidation deck featuring some of my cartoons..

    image

    I find this interesting for four reasons:

    • It's useful, e.g. for homework, for organising community work, or simply as another means of getting a message out to customers.
    • It breaks the link between the data and the application and shifts the focus on to what a user wants to do.  This shift is also apparent in Office itself; for example, when I want to prepare a deck for an event I just work in PowerPoint, as it has pretty good tools for embedding video and enhancing images without me needing to switch to a separate tool for each of these.
    • It shows what can be done by 4 developers in a couple of months using Azure as the platform
    • It shows how Office Web Apps has come on since 2003 

    You can try all this yourself now, which might be a welcome change from virtual farming on Facebook!