Insufficient data from Andrew Fryer

The place where I page to when my brain is full up of stuff about the Microsoft platform

August, 2010

  • SQL Nugget - Dates are illogical

    Whether I am reading Chocky by John Wyndham or The Twelve Caesars by Suetonius, the illogical nature of calendars can be annoying. In Chocky the alien wonders why we don’t have 16 months instead of 12 and why weeks aren’t made up of 8 days, and in The Twelve Caesars successive emperors try to make sure that a year really is a year, renaming months after themselves along the way (July and August).  Months can also catch you out in SQL Server:

    If I run this

    SELECT DATEADD(MONTH,-1,'20100930')

    I will get back 

    2010-08-30

    because SQL Server has simply subtracted one from the month number.  However, what I was probably hoping for was the last day of the previous month, as the 30th September is the last day of the month.  If you want to be fireproof and make no assumptions about the day of the month that’s passed in, then you can:

    • ignore the day of the month that’s passed in by replacing it with the first of the month, i.e. ‘01’
    • subtract a day from this to give you the last day of the previous month

    which in T-SQL looks like this

    SELECT DATEADD(dd,-1,LEFT(CONVERT(VARCHAR(10),'20100930',112),6) + '01')
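
    To see both steps working together, here’s a sketch using a variable (the variable name is just for illustration) so you can try any date:

    DECLARE @AnyDate CHAR(8) = '20100615'
    -- replace the day with '01', then take a day off to land on the last day of the previous month
    SELECT DATEADD(dd,-1,LEFT(CONVERT(VARCHAR(10),@AnyDate,112),6) + '01')
    -- gives 2010-05-31 whichever day in June is passed in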

    For extra credit, can anyone tell me what this should return and why, assuming you are in the UK like I am?

    SELECT DATEADD(MONTH,-1,'17521010')

  • Windows 7 the Presenter’s friend

    I think it takes a lot of effort and courage to stand up in front of an audience and give a presentation.  Adding a demo into the mix only makes this worse, and so the SQLBits organisers set up a speaker training day at our offices yesterday, run by Guy Smith-Ferrier.  Part of this was getting your laptop ready to present, and I wanted to add a couple of tips to the excellent resources Guy has made available to everyone on his site.

    Hopefully most of you know that the Windows logo key (logo) + P will bring up the projector settings..

    image

    but logo + X (initially seen in Vista) is even more useful and not as well known..

    image

    Note Dell have added some goodness to my screen here, as other vendors do, but the top section is part of Windows.

    You might want to check some of these settings well before you start to present:

    • Are you running on mains power at full performance?
    • Turn on presenter mode; this can be further customised by clicking on it:

      image

      Here you can set an optional background for your presentation, as not everyone needs to see your loved one(s), Ferrari, football team etc., and clean backgrounds during demos make it easier to see what’s going on. You can also turn off your screensaver and set a different volume level.

      • Connectivity may be important to you, but it is a risk in any demo, so have a backup plan.

      Guy also emphasised the need to set a nice big font in all the tools used in a demo, whether that’s Visual Studio, PowerShell or the command line. That can take time to do and makes it awkward to use your machine for normal work while it’s set like that. One suggestion from the audience in the training session was to create a demo user account and keep all those settings in that profile, leaving your normal profile for normal work. You’ll want to be clever with permissions here, but I think that’s a great idea.

      Another approach is to use the built-in Windows 7 zoom keys to focus on an area of the screen: logo and + zooms in, logo and – zooms out, and logo and Esc exits.  Don’t do it too often though, as it can make some people feel ill.

    • My additional Windows 7 demo top tips relate to display and remote desktop:

      • While Windows 7 does a pretty good job of picking up the projector capabilities, I have seen it get thrown by some of the newer graphics outputs, such as DVI out and the new DisplayPort. Some of these don’t work properly unless the port is connected, so your only option is to boot with the projector connected if this is the case.  I did have a machine like this and ditched it in favour of one with the trusty old VGA out on the back (although it does have DVI too).
      • Also be aware that Remote Desktop, or any connection to a virtual machine, will get thrown when you resize your actual desktop for the projection size, so I try to find out what size the projector will work at and manually set this up first, both on the physical machine and in Remote Desktop, as per my Remote Desktop post a couple of weeks ago.
      • Be prepared for 1024 x 768 as this is the lowest common denominator for projectors out there.

      I have learnt most of this the hard way, so read this and Guy’s stuff so you don’t have to!

  • SQL Server Redundancy

    When I saw SQL Server redundancy on a recent internal e-mail I wondered if it was about a replacement for SQL Server, DBA job losses, or the rise of the NoSQL movement.  It turned out to be nothing more than a general ask about the pros and cons of clustering and mirroring. If there are internal Microsoft staff who don’t know when to use what, then I imagine there must still be confusion in the real world about these two approaches.

    First some introductions:

    SQL Server clustering is built on Windows Server clustering, which uses a number of servers (nodes) and shared storage to automatically recover a service from a server failure.  The service doesn’t have to be SQL Server; it could be IIS or Hyper-V, for example.  It’s also important to understand that the recovery from failure (failover) is not instantaneous and can take several minutes.

    As far as SQL Server is concerned there are good things about clustering:

    • Clustering operates at the instance level, so all of the databases in the clustered instance are protected in the same way, including the system databases.  This means protecting an application that uses more than one database is straightforward, and all the metadata, logins etc. associated with the application that are in the system databases are also protected.
    • The whole cluster looks like a single SQL Server instance which doesn’t change no matter which server is the active one, so any application connection can be used and doesn’t need to be changed when a server failover occurs.

    And some things to be aware of:

    • There is only one copy of any database in a cluster; it’s just that it is located on shared storage.  This means there is no protection against logical errors, like you or a user accidentally deleting a row or table, so clustering is not a replacement for backups.
    • Recovery and switchover to a new server in the cluster is not instantaneous and can take several minutes.  During this time an application might time out, so you could lose a transaction that wasn’t committed.

    SQL Server mirroring is a process by which a copy of a database is kept up to date on another (mirror) server, and this mirror server can take over as the lead (principal) server at any time should the first server fail.

    The good things about mirroring are:

    • It’s really simple to set up.
    • Failover to the mirror in the event the principal fails is very quick.
    • Mirroring comes in several flavours, synchronous and asynchronous, depending on whether you want to wait for each transaction to be applied to the mirror before it commits on the principal.

    and the things to be aware of are:

    • Mirroring only protects one database at a time, so if you create logins and other system objects you’ll need a separate process (e.g. a script in SQL Agent) to keep these in sync on the mirror server.
    • Although there is a second copy of the database on the mirror server, any logical errors on the principal will be copied to the mirror (immediately so in synchronous mode), so mirroring is not a replacement for a good backup strategy.
    • You can only mirror a database to one other server.
    • Each of the servers in a principal/mirror pair has its own server name, IP address etc., so for the application to know which one to connect to you need to use the SQL Server Native Client (SNAC), as when the application starts it doesn’t ‘know’ which server is the principal.
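
    As a rough sketch of what setting up the mirror side looks like (the server and database names here are made up, and this assumes the mirroring endpoints and the restored copy of the database are already in place on the mirror):

    -- run on the mirror server first, then the equivalent on the principal
    ALTER DATABASE Sales SET PARTNER = 'TCP://principalserver.contoso.com:5022'
    -- SNAC can then be told about both servers in the connection string:
    -- Server=principalserver;Failover Partner=mirrorserver;Database=Sales;Integrated Security=SSPI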

    In summary here are the key comparisons..

    Area                      Database Mirroring       Failover Clustering
    Data Loss                 None                     None
    Failover                  Automatic (in HA mode)   Automatic
    Transparent to Client     Yes (SNAC)               Yes (same IP)
    Downtime                  <3 seconds               >20 seconds + DB recovery
    Applies to                Database                 Instance
    Masking of Disk Failure   Yes                      No (shared disk)

    Hopefully these strengths and weaknesses will guide you on when to use what, but I also want to add two other factors:

    • You might also consider using both technologies together, e.g. you could have a cluster on site with mirroring to a remote location.
    • Virtualisation technologies like Hyper-V allow virtual machines to run in a cluster, but there is only the one copy of the virtual machine shared between the cluster nodes.

    I will defer to the SQL Server Customer Advisory Team (SQLCAT) for a detailed discussion on this and their whitepapers on getting the most out of mirroring and clustering, as this is just a gentle introduction to the world of SQL redundancy, or high availability as it is more commonly referred to.

    Finally, just to be clear, there is no substitute for database backups whatever else you decide to use.

  • Private Cloud?

    Private Cloud is probably the most overused and least understood term in IT at the moment. Gartner’s definition of private cloud is “A style of computing where scalable and elastic IT-enabled capabilities are delivered as a service to customers using Internet technologies.” What does that look like for the IT professional in the Microsoft world?

    “A style of computing..” does that mean we dress up for work? No, it means that the IT professional behaves like a cloud provider by offering services to the business, not dissimilar to the service level agreements that operate between larger businesses and the IT provider they outsource to.  The scalable and elastic capabilities are sort of there now, depending on your precise definition of what this means.  The underpinning of this is virtualisation, which for most people translates to server virtualisation, and using this technology we can spin up more of each type of infrastructure as demand dictates.  There are other types of virtualisation, like desktop virtualisation, that also provide this capability; think of a company taking on 10 new staff: they could be set up with their own environment in a matter of minutes if the business had a PC or thin client equipped hot desk area.  Of course even the magic of software isn’t going to conjure up hardware for these people; what I am saying is that the process to get them going can be fast and largely automatic.

    So isn’t that just the same as an optimised infrastructure? Essentially yes, but what I also see is the breaking up of the traditional IT department and the placing of IT professionals closer to the business.  This new kind of IT professional (or private cloud specialist) will have a broader infrastructure knowledge and a deeper understanding of the business she/he will be providing services for.  This can only happen if the infrastructure is easy to manage and self-healing where possible.

    The key software in this shift will be management tools, rather than the virtualisation platforms themselves.  For the Microsoft stack this means System Center, and the size of your business will dictate which parts of that product line you use:

    • For a small business you might run Small Business Server and manage your clients with the new Intune product (our new evangelist Simon May has more on this).
    • For businesses running up to 50 servers and 500 clients there is System Center Essentials (SCE) 2010.  This combines the key features of the separate System Center products used by larger businesses into one simple interface, including virtual machine management, system health, and update management.
    • For larger businesses there are individual System Center products to manage the infrastructure.

    So how does this relate to the public cloud?  I would argue that some components of most business will still need to run on premise for a number of reasons:

    • The UK is ranked 33rd in the world for broadband download speed and 66th for upload.
    • Compliance procedures in many businesses (both the customers and some of the providers) will need to be tightened before all data can be stored in the cloud.
    • Some applications, like voice in unified communications and business intelligence, aren’t really cloud ready yet.
    • Federation of identity isn’t there for all applications, making it harder to integrate systems.
    • It’s seen as risky as it’s so new.
    • The move from capex to opex might seem attractive, but what stops the cloud providers from putting up the costs at will (much as a landlord might)?
    • Cloud SLAs need to recognise the true cost of business downtime.

    and having a consistent approach to IT whether on premise or not will provide a business with the flexibility it needs. 

    Public clouds are a lot like tornados: if you see one that is not moving to the left or right it means it is coming towards you! So my plan over the coming year is to ensure you are as prepared as possible for this disruptive technology, as it is very likely to be hitting your business in some shape or form, if not this year then next.

    ..added 17 Aug 2011

    Since I wrote this post last year the Microsoft Virtual Academy has been launched, which provides solid introductory training on both public and private cloud.  I have done all of the private cloud courses myself, and I can vouch for them being a good use of your time in getting you up to speed, and giving you a solid foundation for getting certified for the two Microsoft certifications relevant to the private cloud:

    70-659:
    TS: Windows Server 2008 R2, Server Virtualization (Hyper-V+VMM)
    70-400:
    TS: Microsoft System Center Operations Manager 2007, Configuring

  • SQL Server Reporting Services interop

    SQL Server Reporting Services is nearly as ubiquitous as SQL Server itself: it is in nearly all editions of SQL Server, from Express (with Advanced Services) to Datacenter edition, and will soon be available in SQL Azure.   It’s been around for 8 years, and if Microsoft used traditional version numbers it would now be on V5.  This widespread availability and long history does cause some confusion when trying to work out what works with what.

    So here are the FAQs I commonly see and get asked about:

    FAQ 1.  Can Reporting Services report on data stored in [ insert relevant database platform here]?

    Although Reporting Services is part of SQL Server it can consume data from pretty well any structured data source, e.g. Excel, XML, and anything you can get an ADO.NET, OLE DB or ODBC connector for. To design your reports you’ll need these database connectors (drivers) on the machine you’re designing the reports on and on the server they’ll be run from.

    Also don’t forget that if your source data is in a different version of SQL Server that doesn’t matter either, providing again you use the right connectors/drivers.

    FAQ 2. Can I design a report in SQL Server 2XXX and then run it on SQL Server 2YYY?

    A report can generally be run on a later version of SQL Server than it was designed for (unless you have custom code or a rare authentication or security setup). Once you save it to the newer version of Reporting Services it will be automatically updated. However, you can never do this the other way around, i.e. run a report on an older version than the one it was designed in.  This is because the report definition language (RDL) changes in each release in a backwards-compatible way, so that older reports can still run.

    Other resources on upgrading to SQL Server 2008 R2 can be found here

    FAQ 3. Can I use an older version of SQL Server to host the Reporting Services databases?

    Reporting Services uses two SQL Server databases, one to hold the metadata about the reports and the other as a temporary workspace.  You can use a different version of SQL Server to store these databases, and the interop matrix looks like this:

    SQL Server database   Reporting Services version   Compatibility
    2000                  2005                         OK
    2005                  2005                         OK
    2005                  2008                         OK
    2005                  2008 R2                      OK
    2008                  2008                         OK
    2008                  2008 R2                      OK

     

    FAQ 4. Which versions of Reporting Services work with which versions of SharePoint?

    SharePoint version   Reporting Services version   Compatibility
    2007                 2005                         OK
    2007                 2008                         OK
    2007                 2008 R2                      OK
    2010                 2005                         Not possible
    2010                 2008                         OK, but you actually use the 2008 R2 Reporting Services add-in for SharePoint (this is part of the SharePoint installation) and 2008 needs to be at Service Pack 1 Cumulative Update 8
    2010                 2008 R2                      OK

     

    FAQ 5.  Which versions of Visual Studio (VS) work with which version of Reporting Services?

    When SQL Server 2005 came out it introduced the BI Development Studio (BIDS), which is essentially a cut-down version of Visual Studio, in that case VS 2005.  When SQL Server 2008 came out BIDS was built on VS 2008.  BIDS in SQL Server 2008 R2 is also still built on VS 2008, but it’s simple to have another version alongside, e.g. VS 2010.

    FAQ 6. Which versions of Visual Studio do I use to embed my reports into my applications?

    An ancillary question to this relates to the report viewer control that developers can use to embed reports in their projects. 

    • For SQL Server 2005 there is a report viewer tool, described here, that works in Visual Studio 2005.
    • If you followed that link you’ll notice there is an ‘other version’ option that takes you to similar details about the report viewer control that works in VS 2008, but still against SQL Server 2005.
    • The report viewer in VS 2010 only works against SQL Server 2008 and SQL Server 2008 R2, as detailed here.

    In summary, the allowed permutations are:

    Visual Studio version   Reporting Services version   Compatibility
    2005                    2005                         OK
    2008                    2005                         OK
    2010                    2008                         OK
    2010                    2008 R2                      OK

    FAQ 7. How is Reporting Services licensed?

    Probably a post in its own right, but to summarise: if you install a server component of SQL Server (so Reporting Services, Analysis Services, the database engine etc.) on a server, then you must license that server for SQL Server, either by CPU or user CAL, just as you would for the database engine itself.

    This is covered in the SQL Server licensing quick reference guide on the SQL Server 2008 R2 licensing page.

    This is a classic example of why I blog: I will be asked about what this post relates to again and again, and I’ll forget where the answer is unless I page it to a post!

  • The curious case of the Trusted Platform Module

    I am keen on insurance, and I am more interested in the claims process when things do go wrong than in the cost of the premium.  In IT the same thing should apply: the level of protection should be the driving force in any decision on how to look after your data, rather than the cost of the solution.

    What is odd about IT is how we seem to change modes when we get home to our home computers.  My home machine has a lot of personal data, from contacts to sign-on details for various websites to personal and financial information, and while I would be annoyed if someone swiped my desktop I would be in a complete mess if they accessed all that data.  I could individually encrypt the relevant files and use 3rd party tools to do this, but it seems to me that BitLocker just takes the hassle out of the process.  The only other precaution then is to make sure the machine is shut down or hibernated when I am out so that cold boot attacks won’t work.

    My old PC couldn’t easily run BitLocker as it didn’t have a Trusted Platform Module (TPM) on it, so when buying a new one recently I wanted to check that it had one, as it’s part of the motherboard and not something you can add yourself.  The call centre guy I talked to about this replied that they only put the TPM in their business machines.  I do know there has been consumer resistance to TPMs, as it has been suggested it would be used for software protection and digital rights management, but that should only be a problem if you are part of the crew of the Black Pearl.  So #fail to that well known brand, and so I nipped round the back of our office to the local NovaTech store, whose machines do have one, and so I bought a barebones box (based on an Intel i5 with 6GB RAM) and cannibalised my old gaming rig, and now I am all encrypted.

    Before you comment on this post that only the top end editions of Windows have this capability: yes, I know that, so you need Ultimate edition at home for this. However, if you add up the third party bits you need to get the same functionality then it actually isn’t any more expensive.  You might also counter what I have written here by mentioning the various personal cloud offerings that are around to keep your data safe, some of which, like SkyDrive, are free.  However my broadband is pretty appalling, so I need to keep my stuff local until the fibre revolution happens.

  • A quick word on Remote Desktop for demos

    I use Remote Desktop a lot for my demos, and whether this is in person or when making videos it makes a lot of sense:

    • When I am making screencasts, running something like Expression Encoder or Camtasia in a virtual machine (VM) or alongside a hypervisor is never going to be a good thing; you’ll probably get some screen clipping or loss of voice and video synchronisation going on.  Remote Desktop allows me to quickly set up the right screen resolution I need, usually 1280 x 768, which gets squashed to 720p: I need the 768 depth to see some of the standard dialogs used in Microsoft’s server products, and 720p is a good video compromise for HD given broadband bandwidth in the UK.
    • If I am presenting I can control my decks and my demos from one screen with Remote Desktop, and I can lay off to my Windows 7 laptop the extra work the GPU needs to do when projecting onto multiple screens, allowing my virtualisation machine to just run Hyper-V and keep the speed up.

    One tip I did notice is that if you have Remote Desktop on your Windows 7 taskbar you can quickly open your favourite or recent connections with a right click, just as you can with documents in Office..

    image

    and if you hover over it you can of course see what Remote Desktop sessions you have open (another top tip is to change the backgrounds on them so you know which one you’re using)

    image

     

    If you’re still stuck in the past on XP then you can save connections to the desktop for each of the machines you need to connect to by clicking on the down arrow next to the options..

     

    image


    Other top tips are the display tab, where you set the screen resolution you need, and local resources, where you can pick up the local drives on your machine. Oh, and to fire up Remote Desktop, either type mstsc or remote desktop in the Windows search bar to quickly start it up.
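
    If you want to skip the dialogs altogether, mstsc also takes the server and screen size on the command line (the server name here is made up):

    mstsc /v:demoserver /w:1280 /h:768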

    Of course Remote Desktop isn’t just there for connecting to other machines (physical or virtual) for demos; its other two uses, providing remote support to your users and thin client computing to whole organisations, I have posted on before and will do so again.  For now just remind yourself of how good it is for your own research and evaluation work..

    image

    A Remote Desktop session to a Windows 7 VM running the same theme as my physical desktop (the dynamic Bing theme)

  • ROLAP dead but not forgotten

    With a catchy name like Relational On Line Analytical Processing, is it any wonder that this technology hasn’t really entered mainstream use? Firstly, what is it?

    OLAP (On Line Analytical Processing) is a data store organised for ad hoc query and analysis; a good way to think about this is as a very big Excel PivotTable which can be shared by many users.  The Relational refers to the fact that the data resides in a conventional relational database, as opposed to the other kind, MOLAP (Multi-Dimensional OLAP). In MOLAP the data store is structured differently and, as the name suggests, is based on the dimensions the data relates to, using a coordinate system.  There is also HOLAP (Hybrid OLAP), which mixes the first two.

    So why have two ways of doing this?  Traditionally ROLAP has been able to handle much larger volumes of data, while MOLAP is faster because it makes extensive use of RAM to store the dimensions and aggregations, as well as sophisticated indexes. In the case of SQL Server Analysis Services there is also some very clever compression, which uses a bit more CPU but less slow-moving IO.

    HOLAP tries to provide the best of both worlds by storing the most frequently referenced data in MOLAP and the rest in ROLAP. For example you might have your current year’s sales in MOLAP and prior years in ROLAP.

    So in very large BI projects ROLAP has its place, and some vendors only use this technology, while Microsoft offers all three flavours of OLAP in Analysis Services, with a common interface to use whichever one you need.

    When it comes to database and BI scale, Microsoft also has the SQL Customer Advisory Team (SQLCAT), and they have just published a paper on ROLAP best practice.  This whitepaper is pretty deep, but with a bit of work the SQLCAT team managed to get ROLAP to outperform MOLAP on the same data using the same hardware in certain cases.  However, the work involved was considerable, so I am not suggesting for a minute that you should stop using MOLAP, as I find MOLAP solutions far easier to build and maintain.

    If you just want to know what to do for your ROLAP solution, rather than the details of the why, then the recommendations are summarised at the end of the paper.

  • Thanks for the (dynamic) memory – Windows Server 2008 R2 sp1 beta

    Although 16Gb laptops are out there I am making do with 8Gb, so any chance to get a bit more oomph out of it is always welcome, and so I wanted to try the new dynamic memory capabilities in the Windows Server 2008 R2 sp1 beta. I am going out on a limb here, because you shouldn’t use any beta code in production and essentially my demo rig is my production rig. However, I caught up with the Hyper-V tsar Jeff Woolsey at our internal TechReady conference last week; he says it’s rock steady and that’s good enough for me!

    I went for the Windows Update install (from here), which just puts a flag in the Windows Update process to fool it into thinking the service pack is released code, and then forces the sp1 beta to install as an update.

    Now when I look at the settings of my BI demo VM I can start to use dynamic memory.

    dynamic memory bi2010

    So what can you set, and what difference does it make?

    • Startup RAM is essentially what we have now: it’s the minimum RAM the virtual machine (VM) gets, and the sum of the startup RAM for the running VMs can’t exceed what’s on the physical server.  So this is nothing like the overcommitting you will have seen in VMware, but that’s a post for another day.
    • Maximum RAM is hopefully obvious, as is
    • Memory priority.
    • Buffer reserve percentage controls how much free memory there should be on top of the amount of currently committed memory.  So the target memory the hypervisor is trying to assign to a given VM is

    Currently committed memory + Free buffer memory

    where

    Free buffer memory = (1/(1 - Free Buffer %) - 1) x Currently committed memory

    This is all fine when there is enough memory to go around all of the running VMs. When there isn’t, the priority setting is used to decide which VMs suffer.  You’ll need to be careful setting all of this, as it is possible to set up a scenario where a lower priority VM appears to be better served than a high priority one, by setting a high free buffer % on the lower priority VM.  This is to do with the way the memory balancer calculates the pressure each VM is under, the calculation for this being..

    Currently committed memory / Available memory
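
    To make the free buffer formula concrete, take a VM with 4000Mb currently committed and a 20% free buffer; you can check the arithmetic in T-SQL itself (purely as a worked example):

    SELECT (1.0/(1.0 - 0.20) - 1.0) * 4000 AS FreeBufferMb
    -- returns 1000, so the balancer targets 4000 + 1000 = 5000Mb for this VM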

    The other thing to be aware of is setting the page file appropriately.  If you have a small free buffer and a small page file and you suddenly launch a massive program or process in a VM, it might fail because there won’t be sufficient free space.  However, using the same VM with the same settings, the memory could gradually be increased beyond the limit that caused the big program to fail to start, as on each occasion a little bit more memory will be dynamically assigned to preserve the size of the free buffer.

    How does this work?  Basically there are changes to the integration components put into the guest VM, which can dynamically add memory and use ballooning to get it back again.  This ballooning process reserves chunks of memory in the guest VM that have been handed back to the host, as once memory has been added it can’t be removed in the same way without crashing the guest VM or the apps running on it.

    You might also be wondering how you control how much memory is left over for the physical OS and where you set this.  In the current beta this is dynamically set as part of the dynamic memory control in what’s called the memory balancer.  You can override this in the registry but my advice would be to understand what performance you are getting before you start this deeper level of tuning.

    So for my little laptop I normally run three VMs and in the table below is how I have set them up now I have dynamic memory to play with:

    VM                  Static memory (pre-sp1)   Min memory   Max memory   Free buffer   Priority
    Domain Controller   500Mb                     500Mb        1500Mb       20%           Lowest
    Win 7 Client        1500Mb                    1500Mb       65536Mb      20%           Medium
    BI2010              4500Mb                    4000Mb       65536Mb      20%           High

    Over the next few weeks I’ll see how this behaves compared to before and keep you posted.  In the meantime, if you have some spare tin around you might want to conduct your own evaluation, in which case you can get sp1, and a trial copy of Windows Server 2008 R2 if you haven’t got one, all from here

    and finally the small print ..

    When the service pack is actually released you’ll have to uninstall the beta and install the actual service pack, because as usual with any Microsoft beta, CTP or RC code this won’t be a supported upgrade path.

  • Work Life Balance in Office

    For me the office is Microsoft Office, as my litmus test for whether I am working or not is what I am doing, not where I am.    The final bit of Office 2010 is still in beta (Office Communicator 2010), and I have been pestering the development team to get on the internal dogfood to see what the fuss is about.  I have to be honest and say that I was very happy with the old version, as it just worked and was really useful, with presence information showing up in SharePoint and Outlook to make collaboration so easy.

    In my hurry to upgrade I didn’t snapshot the old one so I can’t show you a comparison but it now has pictures of your colleagues..

    image

    Top of the shop is Juliet (Mrs DeepFat) and she is available (well, to me she is!); she is outside Microsoft (the Messenger logo beside the pushpin shows this) and has just gone away..

    image

    This is good as I can use the Communicator to keep in touch with external contacts or for personal use such as when I am away from home. On the flip side I can hide details of my presence as I wish with the possible exception of my manager!

    You can turn off the picture of yourself and what you see of your colleagues, but I think this makes it a better experience. Of course I can go the full monty and have a video call with anyone with Communicator, and despite what BT is pushing in its TV ads for landlines, I would say that a video call is the next best thing to meeting someone, even someone with a face for radio like me!

    As before, Outlook integration is impressive: you can respond to any mail with a Communicator meeting or IM in one click. Also, if you are already in a meeting or Communicator call, you can now literally drag someone into that meeting from Communicator, assuming they are available.

    The client is also available for mobiles, and you can run it in a browser (Communicator Web Access) – this is exactly what Microsoft means by its three screens and a cloud message.

    If you want to try it in your organisation all the resources are here, but only show this to your users if you are serious about deploying it, as they will probably bite your arm off to get their hands on it. 

  • Spatial Referencing Systems in the UK

    I have had a couple of queries about spatial data in SQL Server regarding SRIDs.  I have already posted on some of this here, and while I mentioned SRIDs I didn’t really cover what they mean, so hopefully this explains it...

    First of all, a Spatial Reference Identifier (SRID) is a European Petroleum Survey Group (EPSG) standard that refers to how spatial data is plotted on the earth.  The most common is SRID 4326, which is simply latitude and longitude plotted over a flattened sphere (or oblate spheroid, to give it its technical term) that mimics the surface of the earth.  This equates to the OpenGIS standard WGS84.  This is important for several reasons:

    • It’s supported in a number of databases, including SQL Server (2008 and later), Oracle, DB2, MySQL and PostgreSQL.
    • It’s used to plot data onto Bing Maps and Google Earth (you might want to use Bing Maps if you need to plot onto Ordnance Survey maps, for example).
    • It’s the standard used by all GPS systems.

    So why wouldn’t you use it if it’s so ubiquitous?  The answer might well be that you:

    a. live in the UK, and

    b. want to work with the government data on Data.Gov.UK

    The spatial data at Data.Gov.UK is based on a completely different system: the National Grid – not the one you plug your laptop into, but the one any Scout or Guide knows, which uses eastings and northings.  On this Ordnance Survey view you can see the National Grid numbers over the sea..

    Thames map

    If this map were plotted on the projection used for the National Grid, the blue squares would be straight up and down and would be exact squares (it’s not obvious that they aren’t, so trust me on this).  The Ordnance Survey site explains this in more detail.  The National Grid equates to SRID 27700, and converting data from that to SRID 4326 is not trivial.

    For example, on the National Grid Microsoft UK is located at SU747742: the SU represents a 100 x 100km square, and the numbers mean 74.7km east of the bottom left corner of that square and 74.2km north of it. That equates to longitude 0.9258985519409051 West, latitude 51.46122678211959 North.
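    A grid reference like SU747742 can be unpacked into full National Grid eastings and northings with a few lines of code. This is a minimal sketch (the function name and structure are my own, not from any library) based on the Ordnance Survey lettering scheme, where each letter indexes a square and the letter ‘I’ is skipped:

    ```python
    def grid_to_en(gridref: str) -> tuple[int, int]:
        """Convert an OS grid reference (e.g. 'SU747742') to (easting, northing) in metres."""
        letters, digits = gridref[:2].upper(), gridref[2:]
        idx = []
        for c in letters:
            i = ord(c) - ord('A')
            if i > 7:          # letters after 'I' shift down by one ('I' is never used)
                i -= 1
            idx.append(i)
        l1, l2 = idx
        # First letter picks a 500km square, second a 100km square within it.
        e100 = ((l1 - 2) % 5) * 5 + (l2 % 5)      # easting in 100km units
        n100 = (19 - (l1 // 5) * 5) - (l2 // 5)   # northing in 100km units
        # Split the remaining digits equally between easting and northing,
        # scaling to metres (3 digits each = 100m resolution).
        half = len(digits) // 2
        scale = 10 ** (5 - half)
        e = e100 * 100_000 + int(digits[:half]) * scale
        n = n100 * 100_000 + int(digits[half:]) * scale
        return e, n

    print(grid_to_en("SU747742"))  # → (474700, 174200), as the SU square starts at (400000, 100000)
    ```

    So SU747742 becomes easting 474700, northing 174200, which is the starting point for any conversion to latitude and longitude.
    
    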

    You can actually import this data into SQL Server 2008, but only using the geometry type.  If you run a query in SQL you will see the shapes properly represented, but rotated.  However, you can’t then plot it on Bing Maps, either with your own code or using Reporting Services, as the Bing Maps layer won’t understand the references.

    So you are probably going to need some conversion tools to convert from SRID 27700 to SRID 4326, and there is a company called Safe Software who have a tool, FME, to do exactly that in SQL Server Integration Services. 
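    To give a flavour of why that conversion is non-trivial, here is a rough sketch (my own code, nothing to do with FME) of the two stages involved: an inverse Transverse Mercator projection on the Airy 1830 ellipsoid to recover OSGB36 latitude and longitude, then a Helmert transformation onto WGS84. The constants are the published Ordnance Survey values; this approach is only accurate to a few metres:

    ```python
    import math

    def osgb36_to_wgs84(E: float, N: float) -> tuple[float, float]:
        """Convert National Grid eastings/northings to approximate WGS84 lat/lon in degrees."""
        # Airy 1830 ellipsoid and National Grid projection constants (Ordnance Survey values)
        a, b = 6377563.396, 6356256.909
        F0 = 0.9996012717                          # scale factor on the central meridian
        lat0, lon0 = math.radians(49), math.radians(-2)
        N0, E0 = -100000.0, 400000.0
        e2 = 1 - (b * b) / (a * a)
        n = (a - b) / (a + b)

        # Iteratively solve for latitude from the meridional arc
        lat, M = lat0, 0.0
        while N - N0 - M >= 1e-5:
            lat += (N - N0 - M) / (a * F0)
            dl, sl = lat - lat0, lat + lat0
            M = b * F0 * (
                (1 + n + 1.25 * n**2 + 1.25 * n**3) * dl
                - (3 * n + 3 * n**2 + 2.625 * n**3) * math.sin(dl) * math.cos(sl)
                + 1.875 * (n**2 + n**3) * math.sin(2 * dl) * math.cos(2 * sl)
                - (35 / 24) * n**3 * math.sin(3 * dl) * math.cos(3 * sl))

        # Inverse Transverse Mercator series
        s2 = math.sin(lat) ** 2
        nu = a * F0 / math.sqrt(1 - e2 * s2)
        rho = a * F0 * (1 - e2) * (1 - e2 * s2) ** -1.5
        eta2 = nu / rho - 1
        t, sec = math.tan(lat), 1 / math.cos(lat)
        dE = E - E0
        VII = t / (2 * rho * nu)
        VIII = t / (24 * rho * nu**3) * (5 + 3 * t**2 + eta2 - 9 * t**2 * eta2)
        IX = t / (720 * rho * nu**5) * (61 + 90 * t**2 + 45 * t**4)
        X = sec / nu
        XI = sec / (6 * nu**3) * (nu / rho + 2 * t**2)
        XII = sec / (120 * nu**5) * (5 + 28 * t**2 + 24 * t**4)
        XIIA = sec / (5040 * nu**7) * (61 + 662 * t**2 + 1320 * t**4 + 720 * t**6)
        lat = lat - VII * dE**2 + VIII * dE**4 - IX * dE**6
        lon = lon0 + X * dE - XI * dE**3 + XII * dE**5 - XIIA * dE**7

        # Helmert transformation OSGB36 -> WGS84 via cartesian coordinates
        sl, cl = math.sin(lat), math.cos(lat)
        nu2 = a / math.sqrt(1 - e2 * sl**2)
        x = nu2 * cl * math.cos(lon)
        y = nu2 * cl * math.sin(lon)
        z = nu2 * (1 - e2) * sl
        tx, ty, tz = 446.448, -125.157, 542.060        # shifts in metres
        s = -20.4894e-6                                # scale change
        rx, ry, rz = [math.radians(r / 3600) for r in (0.1502, 0.2470, 0.8421)]
        x2 = tx + (1 + s) * x - rz * y + ry * z
        y2 = ty + rz * x + (1 + s) * y - rx * z
        z2 = tz - ry * x + rx * y + (1 + s) * z

        # Back to latitude/longitude on the WGS84 ellipsoid
        a2, b2 = 6378137.0, 6356752.3141
        e2w = 1 - (b2 * b2) / (a2 * a2)
        p = math.hypot(x2, y2)
        lat = math.atan2(z2, p * (1 - e2w))
        for _ in range(10):                            # a few iterations converge fine
            nu3 = a2 / math.sqrt(1 - e2w * math.sin(lat) ** 2)
            lat = math.atan2(z2 + e2w * nu3 * math.sin(lat), p)
        return math.degrees(lat), math.degrees(math.atan2(y2, x2))

    print(osgb36_to_wgs84(474700, 174200))
    ```

    Run against the Microsoft UK example above (easting 474700, northing 174200), this lands within a few metres of latitude 51.4612 North, longitude 0.9259 West. Tools like FME wrap all of this (and higher-accuracy grid-shift methods) for you.
    
    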

    One more thing you should be aware of: Bing Maps is only free when you are in consumer mode, or when you are a not-for-profit or an educational establishment; it needs to be licensed if you are using it commercially, and details of that are here.  However, this could well be a cheaper option than using dedicated GIS tools, particularly if you don’t need the richer toolset those traditional tools offer.

  • SQL Bits 7, York registration is now open

    SQL Bits 7 – the wonders of SQL – is now open for registration.  It’s at the University of York from 30th September to 2nd October. Why York? Because the organisers try to run some of these events outside the London area (so far Manchester, Birmingham and Cardiff), so this time they are off to spread the SQL love in the North.

    The event runs like this:

    • 30th September is the paid-for training day, featuring the likes of Brent Ozar from SQLskills in the US, Buck Woody (the DBA’s DBA) from Microsoft, and UK MVPs such as analytics guru Chris Webb and reporting rock star Chris Testa-O’Neill.
    • 1st October is the paid-for conference day, featuring some of the speakers from the training day but across multiple tracks in a conference-style format.
    • 2nd October is the free community day, where the content is voted on and put together by the community.

    Note: there are various discounts on the site, and bundle prices for attending all three days.

    I’ll be there on the Friday and Saturday, and I want to hear your worst SQL squealers, i.e. the sad, bad and mad problems you have seen (or possibly created) and how you got around them. I’ll be trading some cool swag for these, I am happy to change names to protect the guilty and innocent, and you’ll get to approve any draft before it’s published.

  • Managing a small private cloud

    In my last post I mentioned that the key to a private cloud is effective management.  If your organisation has fewer than 50 servers and 500 clients, your new best friend could well be System Center Essentials (SCE) 2010.  It’s a subtle blend of the important bits of its bigger brother System Center products, so it can:

    • Manage your virtual machines, like you can in System Center Virtual Machine Manager
    • Do basic patching, like you can in System Center Configuration Manager
    • Keep an up-to-date inventory of your software and hardware assets, and take control of desktops, like you can in System Center Operations Manager

    ..all from one simple-to-use console.  If you have used Windows Server over the last few years, this three-pane interface should look pretty familiar..

    sce monitoring

    It’s built on SQL Server (2008 or later) – as an ex-DBA I like that – and it will install SQL Server Express (the free edition) at install time. However, for larger deployments you will need a licensed (Standard) edition, which can be either on the local server or on another server you already have. SCE 2010 also uses SQL Server Reporting Services for its reports, and again this can be local or remote. I tested it with SQL Server 2008 R2 and that worked just fine.

    One thing that isn’t totally clear in the install is the check for disk size if you elect to install the virtual machine management option.  This extra disk is used to store virtual machine templates; however, it may well be that you will run SCE in a virtual machine, like I am, in which case you’ll probably attach additional virtual hard disks for this later.  Even though I am running SCE in a virtual machine, I can still monitor and manage the physical machine running the hypervisor (called Hotblack on my demo rig)..

    sce computers

    It is this ability to monitor and manage virtual machines, be they servers or clients, that differentiates SCE from the new InTune product you might have seen Simon blogging about: SCE is specifically designed to manage a private cloud, whereas InTune manages the clients that could be connecting to cloud services, be they off-the-shelf applications like BPOS or your own applications running on Azure.

    There’s a trial copy here, or you can get it from TechNet if you have a subscription, or you can wait until later in the week when I will have a couple of videos showing how it works up on TechNet Edge.

  • Sharepoint Saturday

    SharePoint Saturday UK 2-10-2010

    User groups should be run and managed by Microsoft users, so I am really pleased to see SharePoint experts Tony Pounder, Brett Lonsdale and Mark Macrae putting in a lot of time and effort to run the first SharePoint Saturday in the UK. I have seen how successful SQLBits has become and how it complements the traditional beer-and-pizza after-work events the SQL community also runs; hopefully this will be just as successful.  Of course that isn’t so much up to me as it is to you, because this will only work if you are also willing to give up your Saturday and rock up to it. Like SQLBits there will be multiple tracks, in this case IT Pro, Dev, and Information Worker.  If you want to go the extra mile, you might still be able to submit a session, or help with sponsorship if you work for a partner.

    I for one am going to share my knowledge of PowerPivot and Reporting Services (hopefully presenting on this), as well as how to protect SharePoint with Forefront and Data Protection Manager, and to learn more about the rest of it.

    However, because of the vagaries of scheduling, the SQL Bits 7 community day and SharePoint Saturday are both on 2nd October, so I have a bit of a dilemma about which one to go to.  What I’ll probably do is attend SQL Bits 7 on the Friday (in York) and nip down to this on the Saturday. 

    Hopefully there won’t be too many of you in a similar situation, but if you are, and you are free on both days (1-2 October) and want to go to both, then please email me (afryer@microsoft.com) with “Dilemma” in the subject line (I use a lot of Outlook rules) and I’ll see what I can do to get you a free pass to SQL Bits on the Friday.