Insufficient data from Andrew Fryer

The place where I page to when my brain is full up of stuff about the Microsoft platform

February, 2010

  • Windows 7 is the Business for the User

    My post Windows 7 x64 is fun might seem to echo quite a few comments on various forums and posts, but I do have a different view on how useful it is for business, based on about 18 months' usage.

    Rather than regurgitate an endless feature list, let me tell you what is really useful to me. To do that, here is a typical day:

    7am Arrive at the Microsoft campus because I hate traffic. I fire up my machine in seconds after slamming the lid shut the night before, and instantly I am confronted with all the meetings I have come into the office for. Hmm… the battery's looking iffy, I wonder if George will let me order a new one?

    My machine has now seen the wireless network (you have to have a domain joined laptop to get on the wireless network) and the corporate spam floods in!

    8am Check the session I am running today. I right-click on PowerPoint, select the top file and I have my deck. This is because I have PowerPoint pinned to the taskbar and the deck I will present is pinned as well, so it's always in my recent list.

    10am Presenting in Chicago (our main lecture theatre). I plug in the VGA cable, hit logo key + P and I am projecting, ready for my session. Let's hope the demo gods are smiling!

    12 noon Start writing my blog post around what I have just been speaking about. I use the side-by-side feature (logo key + right or left arrow) to snap Live Writer to the right of my screen, leaving Internet Explorer 8 on the other side so I can get some links to resources on TechNet. I also use the snipping tool to grab a few screens.

    2pm push off home.  When I get in and fire up my laptop I get a set of keys in the system tray telling me I need to enter my smart card credentials.  This is DirectAccess and when I have signed in I have VPN functionality that is effortless and makes very good use of the limited network bandwidth I have at home.  With my laptop in this state our IT team can check/enforce policies even while I am at home such as ensuring I have BitLocker encryption enabled. Actually I can’t get in if my machine isn’t up to date thanks to Network Access Protection.

    3pm I need to print my pass for TechEd Dubai and forgot to do this at the office. No problemo: my home machines are running Windows 7 and I have a HomeGroup, which I can see by simply setting my network location to home and entering my secure passphrase to access it.

    Now I can print to my PC-connected printer and pick up some music, but my wife on her PC can't see my laptop, as HomeGroups don't expose domain-joined machines.

    5pm Oh ***, I need to paste some links into the post I'm working on. They're in something I read a month or so ago, but exactly what I can't remember. That's simples - I just type what I'm looking for into search in the Start menu to get e-mails and documents from my drive in one list, along with designated sites such as the corp intranet and Bing.

    I am not sure this could all be done on a heavily modified XP laptop, but it would be slower for me and I would have to invest in third-party tools for the encryption, VPN etc. I absolutely know this because my wife has a shiny new laptop which she uses to connect to her corporate network, and it's painfully slow.

    I am also not sure if anyone who's posting about Windows 7 not being good for business is actually relying on it rather than just playing with it.

    However I am very sure that I wouldn't go back to using XP if you paid me, whether that's for the serious business of playing games or my hobby of being an Evangelist at Microsoft.

  • BI Masters Degree at Dundee University

    Although I am a huge fan of Microsoft BI (which is why I work for Microsoft, not because I work for Microsoft), I realise that many of the factors behind a successful BI project have nothing to do with the technology. Of course the technology decisions can make a BI project more effective, more affordable and more agile, but that's another story.

    So when Mark Whitehorn asked me to do a session for his MSc BI course at Dundee University, we agreed on a title: “Business Intelligence – it's not about the technology”. My talk (which I gave today) was based on my own experiences using solutions from Business Objects, Cognos, Informatica etc. and of course Microsoft, but what I wanted to draw out was what to watch out for in the real world irrespective of the stack used.

    After thinking about this for a while I eventually realised that the core of my talk would essentially prove the validity of Ralph Kimball's readiness test for business intelligence. So for each of these tests I simply gave a positive and a negative example from my own experience:

    image

    Strong Business Sponsorship from the senior partner in a major UK law firm got buy-in from all parts of the business and meant that the project was completed on time (two days before I joined Microsoft) and on budget.

    image

    …however, at one company I worked at, the IT guys could see the need to update a reporting solution but didn't have the senior sponsorship needed, so a very tactical solution was all that was achieved, while some decisions seemed to have been made by the board during a round of golf. I wonder if that still happens there now?

    image

    Urgent Business Need. One successful project I remember was nothing more than a mainframe replacement. It did add some extra ad hoc functionality, but a straight goal of like-for-like replacement, and a very real need for a fixed go-live date when the mainframe was due to be switched off, kept the scope and deadline fixed.

    image

    I have worked on a couple of white elephants, especially when I first started and was too inexperienced and too junior to push back. The most obvious examples were often a board member's pet project, so resources were made available but the project didn't align with the business strategy (on one occasion there wasn't one). Perhaps white elephants are rarer in these uncertain times.

    image

    Good collaboration between IT and the business meant that I was able to train and lead a new BI team in a utility company. As a consultant this might be like eating your own children, but I wouldn't be able to scale to do all their projects or handle the day-to-day changes and support that a good BI system needs if it is to keep up with the business. However they could always come back to me to assist in the bigger projects and to help them pitch ideas to their management.

    image

    On the other hand, I was once brought in by the business users in an attempt to go around what was seen by them (not me) as institutional inertia in the IT department. All went well until it was time to go into production, where regular access was needed to data and production servers, so the project hit the skids (to use a technical term).

    image

    Culture of Analysis. I have written several posts before on flying on instruments, and I worked in women's fashion for a while where the merchandisers would make decisions on discounting a line to strike a balance between discounting too early and too aggressively, which reduces profit, and too late, when you have a warehouse full of unsold stock. Complex spreadsheets were used for this, which were then replaced with a suite of exception reports.

    image

    Despite my best efforts to produce actionable insight at a brewery I worked at (tough work but somebody had to do it), the salesmen were all for using their experience to make decisions rather than believing what the reports were telling them. Later the brewery was acquired, and I guess the sales guys are now working elsewhere.

    image

    Feasibility. Issues I have come across which can block a project have included not being able to get at the data, and especially poor data quality around customer demographics. One particular example was an insurance system where it was impossible to reconcile the data on screen with what was in the database because of the complexity of the application. This can happen in systems where the customer can design and alter the application to the extent that the schema changes.

    This all sounds like doom and gloom, but I thoroughly enjoyed most of the projects I worked on. Of course the issues I have highlighted were frustrating, but being aware of them before they became a problem meant this was kept to a minimum.

    Having given my talk, Mark thanked me for pretty much summarising what he had been telling the students, and that's interesting when there often seems to be so little consensus in IT these days, hence the post.

    I just wish this course had been around when I was starting out, so I didn't have to learn it all the hard way!

  • Using Hyper-V to make a demo sandbox

    I like to make my demos myself and to try to make them as near to the real world as possible. However, I have two problems:

    • There is always something new to show, not just in SQL Server but in Windows Server, SharePoint, Office etc., so I need a modular approach to fire up bits of infrastructure as needed.
    • I can't rely on getting a connection back to the office, so I have to take my demos with me. This can still be a problem, for example when I had a disk crash at SQL Bits which I later realised also had the system partition on it!

    I use a shuttle PC (called BINKY) with a quad-core CPU, 8GB RAM and 3 x physical disks in it. This has Windows Server 2008 R2 installed with just the Hyper-V role on it. From this I run all my virtual machines (VMs)…

    image 

    I typically run three VMs at the same time: the Domain Controller (DC), a Windows 7 client and a SQL/SharePoint server. This setup means that all my service accounts are domain accounts, as are any users who access reports, sites etc.

    I could put the Domain Controller (DC) role on the physical machine, but this can be restrictive and anyway is not ‘real world’. The key thing about this setup is the way I have configured networking and the virtual machine (VM) that is my DC…

    • The physical server BINKY has a static IP of 10.10.10.10, which is easy to remember and connect to if I have to use a fixed IP address to get to it.
    • I have five network connections set up…

    image

    Two of these are controlled by the Hyper-V Virtual Network Manager: the Internal Virtual Network and the External Virtual Network (both highlighted).

    Hyper-V can’t use wireless networks but a lot of my demos need internet access so I have bridged (the bottom entry above) the wireless network to the Internal Virtual Network to achieve this.

    The Local Area Connection is only set to have the Microsoft Virtual Network Switch Protocol enabled. Technically this is controlled by Hyper-V as well, when you elect to use an external adapter and make it visible to the physical server…

    image 

    • The last piece in the puzzle is that the VM running as a DC only connects to the External Virtual Network, and so the DNS and DHCP roles I have also set up on it are only available to a computer connected to BINKY via a hub and a cable. This is how I connect my laptop to BINKY, which enables me to remote desktop to any virtual machine (VM) running on BINKY, and thus means I can run my demos from one screen and use the power of my laptop to run any screen recording software (for which I use TechSmith Camtasia). Remote Desktop in Windows 7 / Server 2008 R2 is very rich, as you can see with a Windows 7 remote desktop inside Windows 7…

    image

     

    Remote Desktop also enables me to share local drives to get data and apps in and out of the VMs.

    With this setup all my VMs have two adapters, both of which have their IPv4 settings set to obtain an address automatically, but they can still be reached by machine name, as would happen in production.
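
    Before a session I sometimes want to check that wiring from the laptop without opening each VM. Here's a minimal sketch in Python that resolves each VM by name and checks the remote desktop port is answering; the machine names are placeholders rather than my real ones, and it assumes Remote Desktop is enabled on each VM:

        # Minimal sketch: confirm each VM resolves via the DC's DNS and answers on
        # the remote desktop port. The names below are placeholders - swap in your own.
        import socket

        VMS = ["contoso-dc", "contoso-sql", "contoso-w7"]  # hypothetical VM names
        RDP_PORT = 3389

        for vm in VMS:
            try:
                ip = socket.gethostbyname(vm)  # resolved by the DC's DNS
                with socket.create_connection((ip, RDP_PORT), timeout=3):
                    print(vm, ip, "- remote desktop reachable")
            except OSError as err:
                print(vm, "- not reachable:", err)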

    My other tips for building demos with Hyper-V are:

    • Back up the VMs to another disk by using the export VM feature in Hyper-V; this picks up any snapshots etc. and allows that VM to be quickly deployed to another Hyper-V server, providing the names of the virtual networks (i.e. Internal Virtual Network etc.) are the same.
    • Use snapshots to roll back to the start of demos, or if you are trying something like an install that you might want to roll back from, and label them accordingly…

    image 

    • Run the VMs on different disks from the physical OS if possible, or better yet get some solid state storage!
    • Some network adapters can run slowly when shared for virtual networking; if this happens, disable the checksum offload features in the adapter properties (there are several of them) – thanks to James for this.

    image

    • Use virtual floppy disks. These can be mounted and unmounted on virtual machines to let you quickly move scripts between them, and edit them along the way.
    • Don't put other roles or applications on a Hyper-V server unless you have to.

    Hopefully this is useful for anyone wanting to make a demo or sandbox environment, but to be clear, best practice is quite different if you're using Hyper-V in production; there are numerous posts on that, including my favourite, Best practices for running SQL Server on Hyper-V.

  • Multi Server Management in SQL Server 2008 R2

    There have been tools to manage multiple instances of SQL Server pretty much since it came out. If you don't like the Microsoft tools, or you have multiple database platforms in your business, then there are quality tools around from the likes of Idera, Quest and RedGate amongst others.

    However I want to concentrate on what is in the box, specifically what's new in SQL Server 2008 R2. The new feature is called Multi-Server Management on some of the slide decks you might have seen, but in the current CTP (November) of R2 it's called the Utility Control Point (UCP), and that's how I will refer to it.

    The UCP performs two functions: it gives you a very simple graphical representation of how your servers are performing, and it can also monitor the data tier applications I mentioned in my last post.

    Here it is set up…

    image

    From this screen you can enrol new instances, and you can see the node for data tier applications, which are automatically identified on a managed instance as they are deployed to it. Here's the one I set up in my last post…

    image 

    which is just idling on my demo rig so not too much to see but you get the idea.

    Rather than clutter up this post with endless screen shots I have a separate article on getting started with multi server management, so if you want to try it out you can refer to that or watch the videos that are available when you first navigate to the utility control point (menu –> view –> utility explorer in SQL Server Management Studio)..

    image 

    I do think this is nice and simple and will be useful for the casual DBA who hasn't invested in any third-party tools, but its usefulness is limited to managing other instances of SQL Server 2008 R2.
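
    If you want a quick scripted check of whether an instance is already hosting a utility control point, looking for the utility management data warehouse is one clue. Here's a minimal sketch in Python with pyodbc (my choice of tooling, not anything in the box); it assumes Windows authentication works and that the warehouse database keeps its default name of sysutility_mdw:

        # Minimal sketch (assumes pyodbc is installed and Windows authentication):
        # look for the utility management data warehouse, whose presence suggests
        # this instance is hosting a utility control point. The database name
        # sysutility_mdw is assumed to be the default; adjust it if yours differs.
        import pyodbc

        conn = pyodbc.connect(
            "DRIVER={SQL Server};SERVER=localhost;DATABASE=master;"
            "Trusted_Connection=yes"
        )
        cur = conn.cursor()
        cur.execute("SELECT name FROM sys.databases WHERE name = 'sysutility_mdw'")
        print("Utility control point database found:", cur.fetchone() is not None)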

    If you want to try this, it’s all in the November CTP of SQL Server 2008 R2 which is the last public beta before it is released in May.

  • Getting Started with Multi-Server Management

    In SQL Server 2008 R2, multi-server management is expanded by the use of a Utility Control Point (UCP). This article shows you how to set it up from a clean install of SQL Server.

    Open SQL Server Management Studio (SSMS) and go to the menu -> view –> utility explorer… 

    image

    From here you can sit back and watch what it does. For those that hate watching videos, I am going to create a UCP and show you the screen grabs, starting with the wizard splash screen…

    image

    First I need to connect to the instance on which I want to create the UCP…

     

    image

    Then I need to specify an account that is going to be used to go and get the telemetry for this instance, which is gathered by SQL Agent running periodically. I am going to use the SQL Agent service account because it's running with a domain account that I use for all my SQL-related services on my demo rig, contoso\sqlservice, as you can see in SQL Server Configuration Manager…

    image

    Now the UCP wizard will run some tests…

    image

    Mine has failed because SQL Agent needs to be running and set to start automatically, so it's back to SQL Server Configuration Manager (above) to do that. Now I can rerun the validation and proceed to the confirmation or summary step…

    image 
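
    Incidentally, if you'd rather script that SQL Agent fix than click around in Configuration Manager, something like the sketch below works from an elevated prompt. It wraps the built-in sc.exe from Python and assumes the default instance, whose agent service is called SQLSERVERAGENT (a named instance uses SQLAgent$InstanceName instead):

        # Minimal sketch: check SQL Server Agent and set it to start automatically.
        # Assumes the default instance (service name SQLSERVERAGENT) and an elevated
        # prompt; a named instance uses SQLAgent$InstanceName instead.
        import subprocess

        SERVICE = "SQLSERVERAGENT"  # assumption: default instance

        def sc(*args):
            """Run sc.exe with the given arguments and return its text output."""
            result = subprocess.run(["sc", *args], capture_output=True, text=True)
            return result.stdout

        print(sc("query", SERVICE))                      # current state
        print(sc("qc", SERVICE))                         # current start type
        print(sc("config", SERVICE, "start=", "auto"))   # set to start automatically
        print(sc("start", SERVICE))                      # start it now if stopped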

    I hit next & the install begins..

    image

     

    and finishes successfully.  If I close this dialog I now get my management dashboard..

    image

    which is empty. So what's behind this? Two key things: some agent jobs and a management data warehouse, based on what there was in SQL Server 2008…

    image
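
    If you're curious, here's a minimal sketch in Python with pyodbc that lists those agent jobs; it assumes Windows authentication, and the LIKE filter on the job names is a guess on my part, so adjust it to whatever your instance actually shows:

        # Minimal sketch: list the SQL Agent jobs created for the utility control
        # point. Assumes pyodbc is installed and Windows authentication works; the
        # name filter is an assumption, so widen it if nothing comes back.
        import pyodbc

        conn = pyodbc.connect(
            "DRIVER={SQL Server};SERVER=localhost;DATABASE=msdb;"
            "Trusted_Connection=yes"
        )
        cur = conn.cursor()
        cur.execute(
            "SELECT name, enabled FROM msdb.dbo.sysjobs "
            "WHERE name LIKE '%utility%' ORDER BY name"
        )
        for name, enabled in cur.fetchall():
            print(name, "(enabled)" if enabled else "(disabled)")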

    Now that it is installed you can do several things…

    Enrol other instances, so you can see and manage the status of all your SQL Server instances in one place. However, be aware that this currently only works against other instances of SQL Server 2008 R2.

    Select Utility Administration to:

    Set up the threshold levels that indicate when servers are over-utilised, which can be done at a global level or set specifically for an individual instance…

    image

    Assign which accounts can access the dashboard, from the security tab above.

    Set how long the database should keep information, from the data warehouse tab…

    image

    Once I have configured the global and specific policies for each managed instance I can just keep an eye on the overall dashboard or drill into the details for each instance by selecting it..

    image

     

    Note the tabs in the middle of the right pane above, where I can choose to look at CPU or storage utilisation, the policies specific to this instance and its overall properties.

  • Data Tier Applications

    I have a confession to make; I have been doing a little bit of development! I wanted to check out one of the new features in Visual Studio 2010 / SQL Server 2008 R2, data tier application projects (confusingly shortened to DAC).

    Essentially this is a simple lift and shift capability that allows a developer to deploy a database to production after which the DBA can manage it.

    In VS2010 I can create a SQL Server Data Tier application ..

    image

    into which I can add database objects such as tables, views, stored procedures etc. from scratch. If I have a database already that I want to start developing against, then I can simply create a new project and then right-click on it to import a data tier application (only from SQL Server 2008 R2, I'm afraid).

    image 

    Now it's in VS2010 I can compare it to another version of the schema, add objects, and deploy it to another server, e.g. the local copy on my machine, complete with all the security logins etc.

    You can also create a DAC from SQL Server Management Studio (SSMS) by selecting the database you want and then Tasks –> Extract Data Tier Application to bring up this wizard. This packages everything up into a file with a DACPAC extension, which is actually a zip file full of XML files describing what you have extracted (you can prove this by changing the extension to zip and opening it).
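
    If you'd rather not rename the file, a couple of lines of Python will peek inside it just as well. A minimal sketch, where the path is just a placeholder for wherever your DACPAC ended up:

        # Minimal sketch: list the XML files inside a DACPAC without renaming it.
        # The path is a placeholder - point it at your own extracted .dacpac file.
        import zipfile

        with zipfile.ZipFile(r"C:\temp\AdventureWorks.dacpac") as pac:
            for entry in pac.infolist():
                print(entry.filename, entry.file_size, "bytes")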

    BTW when I did this against a sample copy of AdventureWorks I got an error that DDL triggers aren't supported in DAC, so watch for that.

    The other interesting thing about DAC is that when it is deployed as such to a server it can also be monitored using the new Control Point feature in SQL Server 2008 R2, which I’ll leave for next time.

    If you want to try any of this then there is the updated Developers Training Kit for SQL Server 2008 R2 which has examples and videos  of this and a lot of the other new stuff for R2 (and the older one for SQL Server 2008 here).

    OK, enough of development; back to the exciting world of infrastructure!

  • It’s official, 64 bit Windows is fun

    Although I have an Xbox, I still find flying and adventure games are often better on the PC, and I am not alone. I am also not alone in using 64-bit Windows 7 at home for gaming, according to this Steam hardware survey…

    image

    What I am seeing here is that Windows 7 is seen as a high-performance platform in its 64-bit variant. This shouldn't be a surprise, because its big brother Windows Server 2008 R2 is only available as 64-bit. So where 64-bit has perhaps been less of a focus in previous Windows versions (certainly for drivers and third-party apps), it is now seen as the future.

    A good example of this is the NVidia drivers. My own experience on Vista running Flight Simulator X was that although everything worked, the x86 version was noticeably slower than x64, and there were issues in other games (which might not be down to NVidia). However in Windows 7 everything works, and in games like Crysis (which also comes in 64-bit form) I can crank up all the effects and details and enjoy all the eye candy it has to offer.

    mozzie

    Flight Simulator X running at 1920 x 1080

    The other thing that this highlights is that Steam is able to run a successful business on the back of Windows games (of whichever version). This is a bit at odds with the stuffy image of a PC user in a suit that some ad campaigns would have you believe. I am also pretty sure that the guys at Steam, Eidos, Electronic Arts etc. are suit-averse too.

    So if you've got 3GB of RAM or more in your machine, you might want to have a look at 64-bit to make use of it with Office 2010 x64, or just to give your games a bit of a boost.

  • Cloudy with a chance of redundancies

    Whether you want your cloud solution from a search provider, a bookstore or a software house, there is no denying that the next wave in the IT revolution is the cloud. Does that mean that IT professionals will go the way of the thatcher, millwright and blacksmith, to be consigned to OU courses on social history? Is it some developer/management-led conspiracy to remove what is seen as an obstacle to change in some organisations, or simply a way of cutting costs?

    My assertion is this is not the case for several reasons.

    • Our roles will be different. When I started in tech support we used Excel to track IP addresses because DHCP wasn't a service we had; indeed we had to install TCP/IP software on Windows because it wasn't included in 3.11! However, despite all the advances in networking we still have network guys. Why? Because that increased functionality means that more is possible, and there is more to go wrong!
    • Another point is that we still have users, and even in a totally connected world they will make mistakes, or have trouble getting their work done, so I don’t see an end to the helpdesk although we could argue about where this will be done in future.
    • Like me, the UK needs more fibre to make it healthy; only in the UK it's not bran that's needed, it's fast broadband: fast enough for unified comms, video streaming and distributed transactions. Until that day we will be running a mixed economy of on-premise and off-premise services, and that means good IT Pros to seamlessly integrate this and keep it all running.
    • Even if we do live in a permanently connected world, I suspect some systems will always be kept in house because of concerns around performance, control and security, or just general inertia.

    So am I thinking about retraining to be an electrician, or ramping up my guitar skills? No, I am staying with SQL Server (including SQL Azure in the cloud), as I believe the demand for DBA skills will continue long after I am in the ground.

  • Zip It Block It Flag It

    Although today is European Safer Internet Day, I did my training yesterday for 650 secondary school pupils, as I have an event today in TVP as part of the day job.

    There was stunned silence after they watched the excellent videos provided by CEOP (the Child Exploitation and Online Protection Centre), which are also available to teachers and parents on the ThinkUKnow website. CEOP are also providing both a Toolbar and Search add-ins for IE8 to make it easier to report abuse and raise awareness of the issues:

    image

    This is useful as not all sites commonly accessed by children have the report abuse icon that CEOP use, Facebook being an obvious example.

    I heard on the way in that the Today team on Radio 4 were stating that there were only 169 cases of grooming brought before the CPS, and were therefore downplaying the initiative a bit. So why do I bother with this?

    1. 169 is 169 too many
    2. The reporting rates for this kind of crime are very low, i.e. only a few people come forward.
    3. Grooming is only one kind of problem on the internet which can adversely affect children, and so I also emphasise the damage that cyber bullying can do and that they need to be really careful with the photos they take of themselves and their friends.  And as you can see there’s quite a few other topics as well.

    OK rant over, back to preparation for today’s event on the Microsoft Application Platform in 2 hours and counting!

  • PowerPivot Installation

    I hope my PowerPivot posts and videos are proving useful, and continuing this theme I want to cover off installation.

    PowerPivot for Excel is a free download for Excel 2010 and simply doesn't need any edition of SQL Server behind it. It comes in 32-bit and 64-bit depending on which variant of Excel you have installed, and you can get the beta of it from the http://www.powerpivot.com site.

    However, to deploy that PowerPivot to SharePoint 2010 you'll need to do quite a bit of work. Rather than rewrite the steps here I will simply direct you to the set of draft setup guides on the PowerPivotGeek blog.

    The basic setup is to..

    • Install but do not configure SharePoint 2010
    • Install SQL Server 2008 R2 Analysis Services using the Analysis Services integrated option…

    image

    • and then complete the post-configuration guides to set up the PowerPivot gallery in SharePoint. The oddest step I found I had to do was to make sure that the profile for the SharePoint service account had the SharePoint site in the intranet zone!

    I have used the first option, a single-server setup, to produce my PowerPivot videos and posts. However my demo rig is a bit more complicated in that I have three virtual machines (VMs) on my shuttle (BINKY), which has 4 cores, 8GB RAM and 2 x NICs, and is set up as follows:

    • a Windows Server 2008 R2 domain controller, so I can use domain accounts for all the services. This has 768MB RAM and 1 x logical processor. I do this because I use the DC for all my demos and because it's closer to the real world.
    • a client virtual machine that is domain-joined. This has 2GB RAM and 1 x logical processor. It is running Excel 2010 with the PowerPivot add-in on Windows 7 x86, as well as SharePoint Designer and the SQL Server client tools.
    • The SharePoint VM is running Windows Server 2008 R2 x64 with 4.5GB RAM and 4 x logical processors, which is way below the recommended production specification of 64GB but performs OK for single-client access against a couple of PowerPivots. This VM also has Reporting Services installed in SharePoint integrated mode so I can show the interoperability of Reporting Services and PowerPivot. BTW there is also an install guide for this on the PowerPivotGeek blog.

    It's a miserable Friday out there, so why not persuade the boss for a bit of R&D time and set this up this afternoon?

  • SQL Server edition sprawl

    One of the confusing things about SQL Server is all the editions it comes in and one of the great things about SQL Server is er… all the editions it comes in.

    Recent additions to this edition sprawl are the SQL Server Datacenter edition, SQL Server Parallel Data Warehouse and SQL Azure. Why is this annoying and why is this good for you?

    The annoying bit first…

    “Which edition should I use?” is a common question, and here I want to use the analogy of pizza. Go into any restaurant and they will have a list of the different pizzas on the menu based on their experience and feedback from their customers. But surely at the end of the day we are talking about cheese on toast here, so how hard can it be? True, but each customer is a little bit different and may have allergies and other dietary constraints and preferences.

    SQL Server works like this: the editions are named for what they are used for, Web edition for the web, Workgroup edition for workgroups, Enterprise edition and so on. These have the mix of features in them that makes them fit for purpose, like compression, Resource Governor and all the BI stuff in Enterprise edition.

    Of course you could adopt the “make your own pizza” approach that some other vendors go for, which is to give you the basics at a given price and then charge for every single add-on that you might want, like compression, clustering, support etc., but this can get quite difficult to manage across a suite of servers, to say nothing of the higher costs this usually entails.

    The good stuff..

    is the ability to run SQL Server on anything from an embedded application on a device, to a small PC, all the way up to data centres and the cloud. This is a good thing for developers because the same application will just work on any of these without a rewrite, and DBAs can use the same set of tools to manage any of these and replicate data across them if need be.

    The point is that new editions have recently appeared in R2 to reflect new hardware and the cloud, but you don’t need to do anything radical to use them except install or subscribe, the rest you’ll be familiar with.

  • What am I doing here?

    I often get asked about what I do at Microsoft, especially if my title of evangelist comes up in conversation. The latest addition to our team, Sara Allison, was also curious about what her colleagues get up to, so she interviewed me…

    BTW Sara is our producer, so she’ll be doing all the cool creative web based stuff and has just started to blog on MSDN.

  • Windows RC sell by date

    I am not sure if Mrs Fryer is trying to cash in on the life insurance or hasn’t quite got the hang of supply chain management.  The thing is I keep seeing stuff in the fridge with “sell by October 2009”, and the odd bit of furry cheese. 

    Of course software doesn’t normally expire in the way that food does, but there are exceptions in the form of trial versions and beta products.

    Windows 7 RC is probably the most downloaded beta to date and it is about to start to expire.  This shouldn’t matter too much as it was only meant for evaluating the new operating system and helping Microsoft to test its reliability.  However the reality is that many users still have it installed and this may well be on their only machine.

    The way the RC will behave is as follows:

    • 15th Feb it will remind you to upgrade after every reboot
    • 1st March it will reboot every 2 hours, giving some access to do an upgrade and get data off the machine
    • 1st June it will go into a ‘notification state’ to tell you your copy of Windows is not genuine, e.g. your wallpaper will be set to black and there will be popups appearing at random intervals.

    I mentioned upgrade above because if these people were unhappy with Windows 7 they would have reverted to what they had before. I suspect the continuing use of the RC is partly a testament to how reliable it was (for a beta) and how good Windows 7 is. However, the RC does have a few quirks, is not supported, and will not receive future updates to secure it, so it should be consigned to the recycle bin as soon as possible, even if it were possible to go on using it forever.

    So can I ask you to do two things:

    • If your organisation is now looking to upgrade to Windows 7 then you can now use an evaluation copy of the RTM here
    • Check around friends, family and colleagues to ensure that they have plans in place to move off the RC before it starts making their lives difficult.

    Full details of this are in the KB article on Windows RC expiry.