TechNet UK

Useful tools, tips & resources for IT professionals, including daily news, downloads, how-to info and practical advice from the Microsoft UK TechNet team, partners and MVPs


UK TechNet Flash Newsletter
Featured
  • Convert all your VMWare VMs to Hyper-V – FREE

    You could be forgiven for thinking that the 8th April 2014 is only really going to be remembered for the End of Support for Windows XP after more than 12 years of service. Time will tell, but I would venture that a large number of virtualisation IT Pros are still rubbing their hands with glee at the release of the Microsoft Virtual Machine Converter 2.0 (MVMC), and since the future is PowerShell, they will also be salivating with anticipation at using the Migration Automation Toolkit (MAT) released to support MVMC 2.0. What is it? Why is it such good news? And who will use it?

    MVMC 2.0 and MAT


    First, and fairly importantly, MVMC 2.0 is a completely free toolkit to assist an IT Pro in converting VMware virtual machines into Hyper-V virtual machines and, yes, also into Microsoft Azure virtual machines. The MAT is a PowerShell-based set of scripts and utilities to automate this process across a number of hosts and platforms.

    If you are not a VMware customer, do not use VMware virtual machines, don't need to know how to convert them and don't think you ever will, then you can stop reading:

    ‘this is not the blog post you are looking for.’

    Otherwise – read on….

    Over recent years Hyper-V has, for many good reasons, been eating into the installed base of VMware virtualisation customers. With the advent of this tool, the process of conversion to Hyper-V is dramatically simplified. If you are unsure of the value of such toolkits, you can download the Windows Server 2012 R2 operating system (evaluation), install the Hyper-V role (free) and MVMC 2.0 and MAT (free), and prepare a test migration to prove it will be fast, efficient and trouble-free to migrate all of your virtual machines from VMware to the Azure or Hyper-V platforms.

    Should you be using Linux as the basis for your VM guest estate, then your final solution will require absolutely no licence fees for Microsoft products, especially if you choose to use the free Microsoft Hyper-V Server as your final host.

    Does it all sound too good to be true? Well, it's not; below I explain what MVMC 2.0 and MAT can do for you.

    The installation prerequisites are:

    For MVMC

    Windows Server 2008 R2 SP1 or above (I just installed it on Windows 8.1 Update 1); the full list is below.

    Supported Operating Systems

    Windows Server 2008 R2 SP1, Windows Server 2012, Windows Server 2012 R2

    Before you install Microsoft Virtual Machine Converter (MVMC), you must install the following software on the computer on which you want to run MVMC:

    • Windows Server 2012 R2, Windows Server 2012, or Windows Server 2008 R2 SP1 operating systems
    • Microsoft .NET Framework 3.5 and .NET Framework 4 if you install MVMC on Windows Server 2008 R2 SP1
    • Microsoft .NET Framework 4.5 if you install MVMC on Windows Server 2012 or Windows 8.
      Note: Although MVMC installs on all of these versions, using the Windows PowerShell cmdlets that are released as part of MVMC requires Windows PowerShell Runtime 3.0, as the cmdlets function only on Windows Server 2012 and above or Windows 8.
    • Visual C++® Redistributable for Visual Studio® 2012 Update 1

    For MAT

    1. The Microsoft Virtual Machine Converter (MVMC)
    2. SQL Express or any other SQL Server edition
    3. A Windows account with rights to execute MVMC locally
    4. A Windows account with rights to schedule tasks on remote systems and run MVMC (if using remotes)

    MVMC is a wizard-driven conversion tool, but it also works with the System Center automation engine provided in Orchestrator 2012 R2 and can be invoked from the PowerShell command line.

    The major new features in MVMC 2.0 are listed below:

    • Converts virtual disks that are attached to a VMware virtual machine to virtual hard disks (VHDs) that can be uploaded to Windows Azure.
    • Provides native Windows PowerShell capability that enables scripting and integration into IT automation workflows. (Completely new, previously MVMC had its own command line interface).
    • Supports conversion of Linux-based guest operating systems.
    • Supports conversion of offline virtual machines.
    • Supports the new virtual hard disk format (VHDX).
    • Supports conversion of virtual machines from VMware vSphere 5.5, VMware vSphere 5.1, and VMware vSphere 4.1 hosts to Hyper-V virtual machines.
    • Supports Windows Server® 2012 R2, Windows Server® 2012, and Windows® 8 as guest operating systems that you can select for conversion.

    MVMC 2.0 standard features include:

    • Convert and deploy virtual machines from VMware hosts to Hyper-V hosts on Windows Server® 2012 and 2012 R2 or Windows Server 2008 R2 SP1
    • Convert VMware virtual machines, virtual disks, and configurations for memory, virtual processor, and other virtual computing resources from the source to Hyper-V.
    • Add virtual network interface cards (NICs) to the converted virtual machine on Hyper-V.
    • Support conversion of virtual machines from VMware vSphere 5.5, VMware vSphere 5.0, and VMware vSphere 4.1 hosts to Hyper-V.
    • Wizard-driven GUI, which simplifies performing virtual machine conversions.
    • Uninstalls VMware Tools before an online conversion (online conversions only), providing a clean way to migrate VMware-based virtual machines to Hyper-V.
    • Support Windows Server and Linux guest operating system conversion.
    • PowerShell capability for offline conversions of VMDK Disks to Hyper-V .vhd disks

    Finally, to install MVMC 2.0 the account in use must be a local administrator on the machine.

    The MVMC installation files can be obtained here and consist of an msi setup file that installs the wizard, an admin guide document and a cmdlets document.
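
    As a quick, hedged illustration of the PowerShell side (the cmdlet and parameter names below are as I recall them and the paths are made up, so treat the bundled cmdlets document as the authoritative reference), an offline VMDK-to-VHD conversion looks roughly like this:

    # Load the MVMC module from its default install location, then convert a VMDK
    # to a dynamically expanding VHD. Paths are examples only.
    Import-Module 'C:\Program Files\Microsoft Virtual Machine Converter\MvmcCmdlet.psd1'

    ConvertTo-MvmcVirtualHardDisk -SourceLiteralPath 'D:\VMware\web01\web01.vmdk' `
        -DestinationLiteralPath 'E:\Converted' `
        -VhdType DynamicHardDisk -VhdFormat Vhd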

    MAT is simply a collection of PowerShell scripts that automate conversions using MVMC 2.0, backed by a SQL instance (SQL Express will work). You can use it to convert several machines at once on a single server, or scale it out and execute conversions on many servers at the same time.

     

    Although MVMC 2.0 can convert VMware VMs to Microsoft Azure, this has not yet been implemented in MAT, so that product is scoped to on-premises conversions only for the time being.

    Most of the MAT changes are minor revisions, but it does ship with an example script which demonstrates how a migration can be controlled using a single PowerShell script and PowerShell workflows. In short, this example script can move all running VMs from a VMware host to a Hyper-V host.
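
    To give a flavour of the approach (this is a toy sketch of my own, not the MAT example script; cmdlet parameters and paths are illustrative), a PowerShell workflow can fan the disk conversions out in parallel:

    # A toy workflow that converts a batch of VMDKs in parallel. Not the MAT script -
    # just an illustration of PowerShell workflows applied to MVMC 2.0.
    workflow Convert-VmdkBatch
    {
        param([string[]]$VmdkPaths, [string]$Destination)

        foreach -parallel ($vmdk in $VmdkPaths)
        {
            InlineScript
            {
                Import-Module 'C:\Program Files\Microsoft Virtual Machine Converter\MvmcCmdlet.psd1'
                ConvertTo-MvmcVirtualHardDisk -SourceLiteralPath $using:vmdk `
                    -DestinationLiteralPath $using:Destination `
                    -VhdType DynamicHardDisk -VhdFormat Vhd
            }
        }
    }

    Convert-VmdkBatch -VmdkPaths 'D:\VMware\web01.vmdk', 'D:\VMware\sql01.vmdk' -Destination 'E:\Converted'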

    Since I aim to keep all my posts close to 1000 words – I will cover more detail in another post – especially the use of PowerShell Workflows and the architecture of MAT.

     

    The post Convert all your VMWare VMs to Hyper-V – FREE appeared first on Blogg(Ed).

  • Making BigData Valuable Data


      By Ryan Simpson, Parallel Data Warehouse Technical Solution Professional at Microsoft





    Data without analytics is simply that: it's just data. In this new 'BigData' world, the value of data lies beyond what its individual, predetermined structure and content tells you, and in what you are able to infer by bringing different sources of data together with someone who understands the meaning of the data, in order to bring new insights to the business. Cue the Data Analyst.

    How are businesses getting new insights out of their data beyond the classically structured or categorised information they've had access to before now? In a recent Computer Weekly article, Alan Grogan of RBS explains how he is using payment information to build a map of his customers' own supply chains. This helps them identify potential risks to their business from worldwide events that either directly or indirectly impact their supply chain. Imagine if RBS had highlighted your dependency on a supplier who procured hardware from Thailand ahead of the catastrophic flooding there in 2011, which resulted in a worldwide shortage of hard drives.

    Modelling the supply chain is a useful analogy for understanding the participants involved in ultimately delivering insights from the data. If we consider the delivery of insights as the output of analytics and model that as a supply chain between the data platform and the business, we can map the various participants in this delivery of analytics as follows:

    [Figure: the participants in the analytics 'supply chain', from the data platform to the business]

    The delivery required of each role, as we move from left to right, changes from getting the best out of the platform to accurately exploring the business. The risk to the best possible insight is anything that impacts the final delivery of any analytics; we can assume that with fewer impacts we are getting the most analytics possible.

    To explore this mapping of deliveries from left to right, and what the potential impacts on the 'supply chain' that delivers these insights are, let's take a fairly simple but typical scenario: loading a large flat file to run ad-hoc queries against, looked at from the perspectives of the Dev\DBA and the Analyst and how they might spend their time or 'cycles'. This is a fresh data load: we don't yet know the value of the data, so we're just loading it to develop some analytic queries over it. I've laid out the steps in the order you might have to consider them, rather than the order in which you might execute them. We'll assume the target platform is SQL Server 2014, so clustered columnstore is available to us as a platform choice, and there's some reference or CRM data available to join on.

    Task: Load Data

    Dev\DBA (the file appears large enough to require some optimised loading process)

    • Create a partitioned table
    • Create file groups per partition
      - Allocate enough storage for each file group
      - Ensure the log is big enough
    • Choose a partition key
      - Check the database is in simple recovery (or bulk logged at most)
      - Create a package to load the single flat file in parallel to the target stage tables
    • Bulk load using SSIS and the Balanced Data Distributor
    • Map outputs to the target partitions
      - Run the data load
    • Congratulate self on the blistering speed of the data load

    Analyst

    • Construct a simple SSIS package to load the data.
    • Kick off the data load; go grab a coffee, it's done when it's done.

    Task: Develop analytic queries

    Dev\DBA

    • Develop the required query, working with the Business Analyst.
    • Tune the query, adding appropriate indexes to the CRM tables and building intermediary sets by batching to improve overall performance and reduce the impact on other users.

    Analyst

    • Develop an initial exploratory query.
    • Repeatedly iterate over all the data, examining other paths (both those that exist, and those that should exist but don't) until a conclusion is reached.

    I've deliberately chosen the example activities here to highlight where each role is spending their cycles. The Dev/DBA spends their cycles using the features of the platform to get ready for the analytics. They take into account the physical configuration of the platform and how best to utilise it, and, as this is a shared platform, they also consider the impact on other users. The Analyst focuses their cycles on the analytics and, with limited knowledge of the physical platform, uses the fewest cycles needed to get to the point where they can start analysing. They don't have the same responsibility to the platform; they just have a responsibility to get answers. Now obviously there are analysts and Dev/DBAs who cover both sides of this; the point is that cycles spent on either side of this relationship are not spent on the other, which will reduce the total output or increase the time to get to the same output.

    This highlights potential friction in our 'supply chain' between data and business insights: getting hold of, and processing, lots of data requires physical storage and compute power. In many organisations that responsibility lies with the IT organisation, who have in turn optimised their own supply chain to efficiently purchase, manage and present these resources to the Dev\DBA, who then configures the software as we have above. However, this supply chain must fit in with purchasing and deployment cycles and be shared across the whole organisation. Ultimately it is this friction, combined with tailoring to the specifics of the platform, that has led to the statistic that '50% of an analyst's time is spent loading and preparing data'. In addition, the IT organisation also has to ensure the business is meeting the right levels of governance, by ensuring data is appropriately protected and secure. Alan Grogan articulates his experience of this friction on ZDNet, explaining how he has enabled his analytics team using Parallel Data Warehouse 2012.

    On April 15th Microsoft announced the 'Analytics Platform System' (APS) as the new name for SQL Server 2012 Parallel Data Warehouse. The new name is no coincidence; it is more than just a rebranding of the former 'Parallel Data Warehouse'.

    APS accelerates time to insight by automating key areas of delivery, removing the friction between the platform and the analytics. It is purchased as an appliance, with hardware built and configured by HP or Dell to an exact architecture developed jointly between Microsoft and each of these vendors. The software is developed, installed and configured by Microsoft, with a single line of support for both hardware and software, also led by Microsoft. The Massively Parallel Processing (MPP) engine which runs over SQL Server automatically takes care of the platform optimisation cycles detailed in our task table above. No hardware-specific details are surfaced to the Analyst or Developer, so the following DDL is all that is required to create a columnstore table that is partitioned by date, correctly laid out on its own file groups and aligned to the optimised storage arrays in the appliance:

    [Figure: the APS CREATE TABLE DDL for the partitioned columnstore table described above]
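
    The original figure isn't reproduced here, but as an illustration only (not the exact DDL from the figure), a hash-distributed, date-partitioned clustered columnstore table on APS might look something like the following, submitted from PowerShell with Invoke-Sqlcmd. Table, column and server names are hypothetical and the exact syntax can vary between appliance updates:

    # Illustrative sketch only - not the DDL from the original figure.
    # The appliance name, database, table and partition boundaries are examples.
    $ddl = "
    CREATE TABLE dbo.FactSales
    (
        SaleDate   date  NOT NULL,
        CustomerId int   NOT NULL,
        Amount     money NOT NULL
    )
    WITH
    (
        DISTRIBUTION = HASH(CustomerId),
        CLUSTERED COLUMNSTORE INDEX,
        PARTITION (SaleDate RANGE RIGHT FOR VALUES ('20140101', '20140201', '20140301'))
    );
    "
    Invoke-Sqlcmd -ServerInstance 'aps-appliance,17001' -Database 'Sales' -Query $ddl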

    When your DDL is this concise, any changes to the data model become significantly easier as there is no dependency on the physical configuration. If we need to scale the solution out, APS has an 'add node' function: a scale unit, consisting of additional SQL Servers and storage, is added to the appliance, and the 'add node' command is run to redistribute the data across the new hardware so we can take advantage of the additional power. That's simply it, no other changes to the solution. As an MPP appliance, doubling the size of the appliance will double the performance, and as the hardware and its configuration are managed, the requirements on the IT organisation can be clearly defined and consistent even though the analytic demands will be constantly changing.

    APS has a parallel loading architecture, so our analyst can construct a simple SSIS load package to get the data in, and APS will automatically load the data in an optimised fashion, ready for ad-hoc, analytic-style queries. It will also automatically manage the resources across the appliance to ensure that other queries and data loads can be satisfied whilst this data load is running.

    There is recognition across many vendors that data in Hadoop can be hugely valuable; however, for many organisations the skills required to access the data and manage a cluster present a barrier to adoption. APS makes interaction with data stored in Hadoop seamless to analysts in two ways. The first is via PolyBase. PolyBase allows us to express external tables that map to data in an HDFS cluster so that it can be queried directly from, and joined to, data in APS using T-SQL: no combination of HiveQL and JScript functions, just T-SQL. APS collects statistics on this external data, and will generate a MapReduce job if it decides this provides more efficient access to the data. The second is the choice of options available to an organisation for hosting a Hadoop cluster. APS can have a dedicated, in-appliance HDInsight region, again installed and supported by Microsoft. APS can also be configured to talk to an external Hadoop cluster, and that external cluster could be an HDFS store in Windows Azure. As with the DDL, enabling connectivity is simply a case of running two commands: sp_configure 'hadoop connectivity' and CREATE EXTERNAL DATA SOURCE. Hadoop connectivity is also bi-directional: analysts can query from and push data out to Hadoop with just T-SQL, without having to learn new languages.
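
    As a hedged sketch of those two steps, run here from PowerShell with Invoke-Sqlcmd (the connectivity value, external data source options and names below are examples and vary by Hadoop distribution and appliance update, so check the APS documentation):

    # Illustrative only: point APS at an external Hadoop cluster.
    # 'hadoop connectivity' takes a value that maps to your Hadoop distribution.
    $tsql = "
    EXEC sp_configure 'hadoop connectivity', 5;
    RECONFIGURE;

    CREATE EXTERNAL DATA SOURCE MyHadoopCluster
    WITH (TYPE = HADOOP, LOCATION = 'hdfs://hadoop-head-node:8020');
    "
    Invoke-Sqlcmd -ServerInstance 'aps-appliance,17001' -Query $tsql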

    APS makes BigData valuable data by allowing a business to focus their time on developing insights from that data, without introducing limits to scale or new skills to learn, on a platform that fits into many existing organisational structures. RBS have detailed their use of Parallel Data Warehouse in their Microsoft case study.

    Additional links:


    If you're interested in the Business Intelligence aspects of SQL Server (and let's face it, who isn't), we're holding an event in Reading on 1st May where you can hear about the latest innovations on Microsoft's high performance platform for real-time insights and apps, covering SQL Server 2014, SQL Server Parallel Data Warehouse, Excel 2013, Power BI for Office 365 and Azure. You can register here – hope to see you there.

  • Windows XP vs Windows 8.1 - Demonstrating Tasks

    April 8th 2014 has been and gone: a notable day in the history of Microsoft and indeed the IT industry. Recap the reactions from Twitter and find out the next steps in XP End of Support: The World Reacts by Alex Guy.

    The second in a series of videos from the WindowsUG, this time from Microsoft MVP and WindowsUG member Chris Rhodes, who highlights some of the benefits associated with moving to a secure and modern operating system, with a focus on demonstrating tasks!

    If you're in the market for a new system or are working on your business' migration, we have a whole range of resources, guides and offers to help you make the move. Here's a handy selection to get you started:

    Need additional help for your users? See here for additional resources on how to update to Windows 8.1 for business users, with step-by-step videos.

    The Windows User Group runs nationally in the UK, both online and at F2F events, to provide help and information to business and individuals using Microsoft technologies. 

    Run by three Microsoft MVP (Most Valuable Professional) awardees, Mike Halsey, Chris Rhodes and Andrew Bettany, the group encourages everybody to get involved, be that asking questions, providing advice and support, or attending WindowsUG events and tours. You can find out more at www.WindowsUG.com

    Event: Windows User Group w/ Virtual Machine User Group
    The WUG will be sharing their expertise and delivering talks at Virtual Machine User Group (VMUG) events in April and May focusing on Windows 8.1 – Virtualisation, Security, XP EOS and Q&A.
    Come along if you’re free, registration links are below as well as an idea on what to expect from the user group.

    Agenda

    • Windows 8.1 Virtualisation
      The benefits that virtualising your operating systems can bring to your business span productivity, deployment and ecology.  In this session, Microsoft MVPs Andrew Bettany, Chris Rhodes and Mike Halsey will show you the new inclusion of Hyper-V (Microsoft’s virtualisation service) in Windows 8 and how this and other virtualisation features can be used to create secure environments for security and data sensitive applications, sandbox Windows XP for continued use, make the most of your server hardware, create bootable virtual machines for rapid repurposing of business PCs and much more besides.

    • Windows 8.1 Security for business and enterprise
      Increasing numbers of companies migrating their older PC systems are recognising the security benefits that Windows 8.1 brings, such as BitLocker Pre-Provisioning, BitLocker Automatic Unlock, BitLocker To Go, Windows To Go, VPN Reconnect, Credential Manager and Secure Boot. In this session MVP and Windows certification author Andrew Bettany focuses on the features and enhancements that can make your company or organisation more secure, and the ones you should already be considering for the high impact they can have in the Enterprise space.

    • XP End of Life; What it means for you and your business
      With all support for Windows XP ending on April 8th this year, many businesses and organisations are facing challenges in maintaining and supporting their existing systems. The fact is, Microsoft are not going to extend this deadline, leaving many unprepared for the security and stability problems that lie ahead, and not yet ready for deployment of a more modern and secure operating system. Now 12 years old, Windows XP has had a great run, and we all loved it; but now it's time to move on. In this session we look at what this means in terms of opportunities and threats to your company or organisation, and how to make the leap to an operating system that will be supported to 2020 and beyond. The end of XP support is a watershed moment that could see many companies ending up in the press… or the dock, but it's also an opportunity for our finest IT hour.

    • Windows 8.1: Your Questions Answered
      The Windows User Group invite you to join MVPs Chris Rhodes, Mike Halsey and Andrew Bettany as they talk about new features in Windows 8.1, OS deployment, and MDOP technologies such as DaRT and App-V. Also find out more about how Office 2013 can aid anywhere working and increase productivity when used with Windows 8.1 and cloud integration. These veterans of the IT industry have published books and articles and talked to hundreds of people at user group meetings, TechEd and international events. Learn about Windows 8.1 and Office from these industry leaders, who will be on hand to answer your questions live and in person. Be sure to come along with questions, as well as a desire to meet and network with fellow IT Pros.

       Tues 29th April, Leeds:
      Doubletree Hilton
      Register here

       Tues 27th May, London: 
      Doubletree Hilton
      Register here


    Did you find this article helpful, or is there something it was missing? Let us know in the comments below, or reach out via @TechNetUK.

  • Lab Ops Redux setting up Hyper-V in a hurry

    I am at Manchester United as I write this, and one of the delegates wanted to quickly try out Hyper-V in Windows Server 2012 R2. I thought I had an all-up post on that but it turns out I don't, so Nicholas Agbaji, this is just for you!

    You'll need a laptop/desktop to work on that runs Windows 7/Windows Server 2008 R2 or later, is capable of running Hyper-V, and has virtualization enabled in the BIOS.

    • You’ll need to download a copy of Windows Server 2012R2 and a special PowerShell script Convert-WindowsImage.ps1 from the TechNet Gallery.
    • Run the PowerShell script as follows. 

    .\Convert-WindowsImage.ps1 -SourcePath <Path to your iso> -Size 50GB -VHDFormat VHD -VHD "C:\WS2012R2.VHD" -Edition "ServerStandardEval"

    Note: if you are running on Windows 8 or Windows Server 2012 you can use the newer VHDX format for virtual hard disks.

    • We now have a VHD containing a sysprepped, clean copy of Windows Server 2012 R2, and Windows 7/2008 R2 and later can boot from a VHD just like the one we just made.
    • To boot from VHD we need to mount the VHD. In Windows 8/2012 you can simply click on a VHD to mount it; in Windows 7/2008 R2 we'll need to open an elevated command prompt and do this manually (a PowerShell alternative for the mount and boot-entry steps is sketched after this list):
      • diskpart
      • select vdisk file="<path to your VHD>"
      • attach vdisk
    • We'll now get an additional drive, say drive H:, and we'll need to edit the boot database from an elevated command prompt to add a new entry registering the VHD:
      • bcdboot H:\Windows
    • We also need to edit the BCD to get Hyper-V enabled in our VHD with:
      • bcdedit /set "{default}" hypervisorlaunchtype auto
    • Optionally you can describe your new boot entry with:
      • bcdedit /set "{default}" description "Windows Server 2012 R2 Lab Ops"
    • Reboot your server/laptop and you'll have an extra boot option to boot into Windows Server.
    • The final step is to add the Hyper-V role, either from Server Manager or with PowerShell:
      • Add-WindowsFeature Hyper-V -IncludeManagementTools
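
    If the machine you are working on is already running Windows 8 or Server 2012, here is a hedged PowerShell take on the mount and boot-entry steps above. The drive letter and paths are examples; check which letter the VHD actually receives after mounting, and run everything elevated:

    # Mount the VHD, register it in the boot database and enable the hypervisor.
    # Assumes the mounted VHD appears as H: - substitute the letter you actually get.
    Mount-DiskImage -ImagePath 'C:\WS2012R2.VHD'
    bcdboot H:\Windows
    bcdedit /set '{default}' hypervisorlaunchtype auto
    bcdedit /set '{default}' description 'Windows Server 2012 R2 Lab Ops'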

    Once you have this VHD set up you can boot into your OS, back up the VHD you made and possibly reuse it on another machine. So good luck Nicholas, and thanks for spending the day with us!

  • Licensing Logic: Licensing SQL Server, Everything You Need to Know!

    Part of the Microsoft Licensing Logic series from the Microsoft Licensing team.

    Just when you think Microsoft licensing is straightforward and you’ve got a pretty good grasp on it, along comes SQL Server which has historically been the exception to the licensing rules. However with SQL Server 2012 we did a great deal of simplification so it’s easy to understand the basics. You’re going to approach licensing differently depending on whether you’re deploying SQL Server in a physical or virtual environment.

    SQL Server Licensing in a physical environment.

    SQL Server is available in three main editions: Standard, Business Intelligence and Enterprise. The Enterprise edition is licensed per core (no CALs required), Business Intelligence is licensed per server and client access licence (CAL), and the Standard edition can be licensed using either method. This is summarised below and hasn't changed with the April 1st release of SQL Server 2014.

    [Figure: summary of licensing models by SQL Server edition]

    If you’re interested in the Business Intelligence aspects of SQL Server (and let’s face it, who isn’t), we’re holding an event in Reading on 1st May where you can hear about the latest innovations on Microsoft's high performance platform for real-time insights and apps covering SQL Server 2014, SQL Server Parallel Data Warehouse, Excel 2013, Power BI for Office 365 and Azure. You can register here – hope to see you there.

    Before I present a little flowchart which might make your decision easier, let me clarify a few things about per-core licensing. We are talking per-core here and not per-physical processor, unlike Windows Server 2012. Currently SQL Server 2012, SQL Server 2014 and BizTalk Server 2013 are the only products licensed per-core.

    To find out the appropriate number of cores you need to licence, simply count the number of cores in each physical processor in the physical server. Software partitioning doesn’t reduce the number of cores you need to licence. Once you have that you need to remember three things:

    1. You need a minimum of four core licences per processor. So if you have a dual-core, dual-processor machine you would need to count that server as a dual, four-core processor machine and purchase licences for eight cores, despite only having four cores in total.
    2. SQL Server core licences are sold in packs of two: each SKU covers two cores. So in our example above we would purchase four SQL core licence SKUs to cover eight cores.
    3. Certain AMD processors need to have a multiplication factor of 0.75 applied to the core count. See this link for the processors in question and what to do. (A toy calculator that puts these three rules together is sketched below.)
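
    To make the arithmetic concrete, here is a toy PowerShell calculator that applies those three rules. It is illustrative only, not official licensing guidance, and the function name is mine:

    # Apply the per-processor four-core minimum and optional AMD core factor,
    # then work out how many two-core licence packs are needed.
    function Get-SqlCoreLicencePacks
    {
        param(
            [int]$Processors,
            [int]$CoresPerProcessor,
            [double]$CoreFactor = 1.0   # 0.75 for the AMD processors referenced above
        )
        $coresPerProc   = [math]::Max(4, [math]::Ceiling($CoresPerProcessor * $CoreFactor))
        $coresToLicence = $Processors * $coresPerProc
        [pscustomobject]@{
            CoresToLicence = $coresToLicence
            TwoCorePacks   = [math]::Ceiling($coresToLicence / 2)
        }
    }

    # The dual-processor, dual-core example above: 8 cores to licence, 4 two-core packs.
    Get-SqlCoreLicencePacks -Processors 2 -CoresPerProcessor 2

    The same four-core minimum crops up again, per VOSE, when licensing individual virtual machines, as described below.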

    For server and CAL, SQL Server works in the same way as any other Microsoft server + CAL product. Licence the server(s), determine the number of unique users and/or devices accessing the SQL Server and purchase the appropriate number and type of CALs. SQL 2014 CALs will allow access to all previous versions of SQL Server. Also you don’t require a separate CAL for every SQL Server; a SQL Server 2014 CAL allows access to all the SQL Servers within the organisation.

    A simple way of determining the edition and licensing of SQL Server 2012 and SQL Server 2014 is below.

    [Figure: a simple flowchart for determining the edition and licensing model for SQL Server 2012 and SQL Server 2014]

    SQL Server Licensing in a virtual environment.

    Regular readers of the licensing blog will be saying “I bet this has something to do with Software Assurance (SA)”. Well, you’re partly correct. I’m going to assume you're running Windows Server 2012 Datacenter edition on these boxes just for simplicity and I haven’t included details of the OS running in the Virtual Operating System Environment (VOSE). Licensing Windows Server has been covered in a previous blog.

    For SQL Server Standard and Business Intelligence editions you can licence individual virtual machines (VMs) using the server + CAL model. Simply purchase one server license for each VM running SQL Server software, regardless of the number of virtual processors allocated to the VM. Then purchase the appropriate number of CALs.

    For example, a customer who wants to deploy the Business Intelligence edition running in six VOSEs, each allocated with four virtual cores, would need to assign six SQL Server 2014 Business Intelligence server licences to that server, plus the CALs to allow access.

    For SQL Server Standard and Enterprise editions you can licence individual VMs using the per-core model. Similar to physical OSEs, all virtual cores supporting virtual OSEs that are running instances of SQL Server 2014 must be licensed. Customers must purchase a core license for each virtual core (aka virtual processor, virtual CPU, virtual thread) allocated to the VOSE. Again, you are subject to the four core minimum, this time per VOSE. For licencing purposes, a virtual core maps to a hardware thread. When licensing individual VMs, core factors (i.e. the AMD processor 0.75 factor) do not apply.

    Two examples are shown below (figure 1 and figure 2) for clarification.

    [Figure 1: SQL Server core licences required for a single VOSE on a dual, four-core processor server]

    [Figure 2: SQL Server core licences required for two VOSEs on a dual, four-core processor server]

    With the SQL Server 2014 Enterprise edition (note: not Standard edition), if you licence all the physical cores on the server, you can run an unlimited number of instances of SQL Server, physically or virtually as long as the number of OSEs with SQL doesn’t exceed the number of licensed cores. For example, a four processor server with four cores per processor provides sixteen physical cores. If you licence all sixteen cores, you can run SQL Server in up to sixteen VOSEs (or the physical OS and 15 VOSEs), regardless of the number of virtual cores allocated to each VM. What if you want to run more than 16 VOSEs in this case? Well, you are permitted to assign additional core licenses to the server; this is known as licence stacking.

    Here’s where Software Assurance comes into play. Licence all the physical cores with SQL Server 2014 Enterprise Edition and software assurance and your licence rights are expanded to allow any number of instances of the software to run in any number of OSEs (physical or virtual). This SA benefit enables customers to deploy an unlimited number of VMs to handle dynamic workloads and fully utilize hardware computing capacity. As with most SA benefits, this licence right ends if SA coverage on the SQL core licences expires.

    Licensing for maximum virtualization can be an ideal solution if you're looking to deploy SQL Server in private-cloud scenarios with high VM density, if Hyper-Threading is being used so you're looking at a lot of virtual cores to licence, or if you're using dynamic provisioning and de-provisioning of VM resources and don't want the headache of worrying about adjusting the licence count. As you can see in figure 3 (below) this can be very cost-effective.

    [Figure 3: Options to licence SQL Server Enterprise in a virtual environment. In the top example you would need 8 core licences + SA for unlimited virtualisation, whereas in the bottom example you would need 10 core licences and still be limited in the number of SQL VMs you could run]

    What’s new in Licensing for SQL Server 2014?

    Just two subtle changes: one for high availability scenarios and the other for multiplexing with SQL Server Business Intelligence edition.

    The rights to install and run a passive fail-over SQL Server have now moved to be a Software Assurance benefit. This is a licence right for SQL 2012 and earlier, with the license terms listed as an exception under each SQL edition to which it applies. With SQL 2014 the fail-over server terms move to the Software Assurance Benefits section and thus only apply to SQL covered with SA.

    The second update is for the Business Intelligence edition. We're relaxing the multiplexing policy so that it no longer requires a CAL for users or devices that access the BI server.

    Mobile First, Cloud First, Data Driven: register for the SQL Server 2014 and Power BI for Office 365 launch event on May 1st at our offices in Reading.

    We say this in every blog but as you can imagine, there’s a lot more detail to SQL licensing so please listen into our monthly licensing spotlight calls where we cover this and other topics (you can view archived calls here).

    Did you find this article helpful? Let us know in the comments bar below, or reach out via twitter @TechNetUK.

  • SharePoint and the Emergence of the Data Scientist




      By Geoff Evelyn, SharePoint MVP and owner of www.sharepointgeoff.com



    As the use of content management systems evolves, with users adding more, ahem, "content", the organizations accountable for those systems will need to ensure that they build in people resources who can manage that content, and particularly people who can find insights in that content for the benefit of the organization.

    Business intelligence requirements and implementations are growing faster than ever before, particularly due to the rise of cloud computing and more cloud services. There is now much more pressure on ensuring that customer interactions are tracked as a key aspect of business intelligence data gathering. This is one of the most critically important ways of working out the value that cloud services provide.

    Examples of this are everywhere. Azure Media Services and partners provided cloud-based components for the London Olympics, delivering VOD content to more than 30 countries spanning three continents; this was provided again for the 2014 Winter Olympics using a combination of Microsoft Dynamics and Windows Azure. Platform as a Service (PaaS) is being driven by mobile application developers, which means more push notification and geo-location services, for example. IBM has announced plans to expand its global cloud footprint, committing $1.2bn.

    Due to this and other factors, there is already a requirement for those who can help manage the customer data, usage data, behavioral analysis, etc. The big problem is, who knows how to gather the data, what is the actual skill-set required, and what is going to be the impact on IT services, particularly the roles? 

    Data Scientists are needed. On this front, demand has raced ahead of supply. Indeed, the shortage of data scientists is becoming a serious constraint in some sectors. Greylock Partners, an early-stage venture firm that has backed companies such as Facebook, LinkedIn, Palo Alto Networks, and Workday, is worried enough about the tight labor pool that it has built its own specialized recruiting team to channel talent to businesses in its portfolio.

    Data is becoming so important!
    Much of the current enthusiasm for big data focuses on technologies that make taming it possible, including Hadoop (the most widely used framework for distributed file system processing) and related open-source tools, cloud computing, and data visualization. While those are important breakthroughs, it is worth noting that getting the correct people with the skill set (and the mind-set) to put them to good use is just as important. The emergence of integrated app development will also ensure that data scientists will be required. The emergence of devices with sensors is already upon us, and with it the reality of the 'Internet of Things', where more devices will have Internet connectivity woven into them. This means that adoption and app development will rise at the consumer level, and that sensor-driven apps will be defined by the data they provide.

    Why is data so important? Three reasons:
    1. Data is the center, not the application.
    Just ten years ago, there was no such thing as 'customer support' or 'customer analysis' concerning analytics around metadata or site usage, or extracting value from data. The application (in other words, the product used by the customer to create the content) was deemed far more important in the eyes of those who provided or provisioned the product to the customer; analysis of the data was secondary. Back then, a person in I.T. was known as an individual who did not need to get close to the customer, did not need to have business acumen, and did not need to get close to the data created by the application. Instead, all they needed to worry about was the actual software and hardware. Nowadays, organizations need to understand whether the products they produce, and therefore the data they provide, are deemed valuable, so they can understand whether their services are useful (and continue to be useful) to a broad range of consumers, and thus answer key usage and business questions that help them run the business and make the platform (which surfaces that data) better. Nowadays there are requirements that mean we need to present findings and recommendations to key decision makers at various management levels, to ensure that data provides clarity on ambiguous projects with multiple stakeholders and unclear requirements. By doing that, a 'machine resource' is there to aid quality control.

    2. Job roles are evolving.
    An example of a job role evolving is the business analyst, who is typically responsible for gathering business requirements, defining metrics, aiding dashboard designs and producing executive product reports, as well as ensuring ROI assessment is mapped to an ever-evolving SharePoint platform. I suspect that Data Scientists are there to help the business analyst, by identifying the correct data to help fulfil the requirements. So why doesn't a company simply stick with the Business Analyst? The key reason again is simple. Organizations today are starting to solve the issues concerning data silos by storing large volumes of decision support data in warehouses. This data is becoming more and more complex. All organizations wish to centralize their data, in many cases need to carry out real-time analysis, and will require people with analytic skills. Some organizations take on a business analyst whose role then starts to occupy that of a data scientist, because the value and experience of the business analyst is built up first and they are then able to analyse the data further, if the data does not get too complex!

    3. Data Convergence is a strategic reality.
    The SharePoint objective is to manage and centralize data - that is clear. So this means bringing data together in a central place, instead of relevant users having to visit multiple places, which would impact productivity. So, if you have data in, say, IBM Cognos, then it is probable that you would want to see that data presented as, say, dashboards in a SharePoint site. If you have data in SQL, then again, you may want to see that presented through SharePoint. Without talking about the technical requirements to do this, surely the business requirement is to identify what data has value and should be exposed first? So, you would say 'easy, get someone to do that'. So who? Get a longtime Business Intelligence power user to mine the data? You could do that, except then they are in fact very close to being data scientists... The point here is that the more there is a requirement to understand the various data silos, the harder it will be to say that you do not need a dedicated resource to understand them.

    Let us now put into context why data is so important from a personal perspective. Acronis, the backup / restore company, carried out a survey earlier in February 2014, which polled 818 respondents. This revealed two interesting statistics:

    • 53 percent say their personal pictures are the most important things to them, versus their videos, music, etc.

    • 74 percent say the value of their personal files is more important than the devices themselves.

    What is a Data Scientist?
    Before you say 'Data Scientist is not an IT role' - think again. Data scientists generally have a strong foundation in applications and computer science. They are able to communicate analysis results to IT and the business. Through analysing organizational data they can identify problems and solutions by selecting insights. They look at the data from multiple angles and recommend ways to apply it. Data Scientist is a strategic position, helping to define the data model and examining it for consistency and reliability. It is also a technical position, because data scientists understand and use various business intelligence systems, tools, reports and data sets to generate business insights.

    A Data Scientist is effectively an amalgamation of a number of older roles such as Statistical Analyst, Data Miner, Predictive Modeller or even Analytical Analyst. Add to those some of the newer and now very important areas of customer analysis on sites, such as behavioural analysis and sentiment analysis. To put this into context, Data Scientists are in fact used by Microsoft Office 365 teams to analyse the terabytes of data being collected per day. That data is not simply limited to usage; it covers service up/down status, performance, helpdesk tickets, time to resolution and more.

    A client working with SharePoint is extremely concerned about data analysis. They have systems which are connected to sensors. "The amount of data we have coming off our sensors produces data every ten minutes. This we are counting as Big Data. It is very important to us that we are able to analyse that data from a business perspective, and not simply rely on technical information."

    The SharePoint impact
    Since the early days of SharePoint, the Engineer and Administrator roles have altered and grown over time to include all the roles required to deploy SharePoint to an organization. Generally, they are technical roles. However, they are still deemed non-data-centric in the eyes of the business, which focuses on the value of the data to help meet its business imperative: cash flow. The business-facing critical roles, particularly those of Business Analyst and Content Strategist, are now heading into data analysis territory to extract value from that data, again to focus on the business imperative.

    What has been emerging, and will continue to emerge, is the requirement to analyse the data. This has been going on for some time; just look at the requirements for usage statistics on SharePoint solutions as a sign. Take that, add on the requirement to analyse usage from an app provided, say, in Office 365, and that then raises questions about the other data that can be extracted from the app. Not just the technical log information either: the behaviour of the users working with the app is also data.

    Another aspect of data management is the focus of business intelligence and the connectivity of disparate data sources to provide information. In a small organization it will be prudent to analyse the data to identify the best fit; however, this is simply an aspect of business intelligence gathering which 'ends' when the dashboard is defined. As the data gathering becomes more complex, the analysis requirements become more focused on customer usage and behaviour. An impact on SharePoint is therefore recognising the critical difference between analysing the data which is surfaced and the data which is not. Another impact could be that the focus of SharePoint data analysis still seems to be the tools, so that one can focus on how to present and visualize that data; however, there still needs to be work done to identify the connections between multiple sets of data in the enterprise.

    Conclusion
    The emergence of the Data Scientist is not altogether new, but it is definitely gathering pace. Particularly where there is data trend spotting to be done, Data Scientists will be needed to help bring change to the organization. Irrespective of the platform (and I have mentioned SharePoint only because it provides the means to surface that data), organizations will need the skill-set and the mind-set to be able to make sense of that data as the organization evolves.

    As we working humans harvest more data, data scientists who can determine, using machines, which data will be of critical value to the organization become essential; it is that machine intelligence which will become vitally important as we steer into the wealth of customer-centric cloud services.

    As I researched the items for this article, I was amazed at the mass of information concerning what it means to be a Data Scientist. Most notably ‘What is a data Scientist?’ by Forbes and also an interesting PowerPoint on the roles available out there for a Data Scientist, according to Network World.

    Geoff


    Microsoft Accelerate Your Insights

    Interested in finding out more on Microsoft Business Intelligence? Join us on May 1st for an event that will explore the opportunities for organisations wishing to accelerate their use of data insights. To find out more about this free online event check out Anthony Saxby's (Microsoft Data Platform lead in the UK) 'Can big data be used to help your customers?' article.

    Find out more about Microsoft SharePoint and how it could rapidly respond to your business needs.

    Did you find this article helpful? Let us know by commenting below, or reaching out to us via @TechNetUK.

  • Become an MVA Hero

    Sign up to MVA Hero today and get your free hero costume (t-shirt) and earn some great prizes!

    I am still surprised at how many IT professionals and developers haven't made use of the Microsoft Virtual Academy to learn about Microsoft technologies and stay up to date as new versions are released. Each module has a basic test and decks to accompany the videos focused around a particular subject, and they are graded to indicate how advanced they are.

    One of the great things about MVA is how wide and deep the content is, but there isn't really any sort of path through the courses as they might apply to your role, e.g. which are the right Microsoft Azure courses to do, or how do I get up to speed on Hyper-V with all the improvements in Windows Server 2012 R2?

    So the TechNet UK team have a plan called MVA Hero to fix that and have some fun at the same time!  We wondered what sort of super heroes worked in a datacentre, what skills they would need, and mapped this onto the latest MVA courses.


    Cloud Ninja is all about Azure, and even if you aren't planning to use Azure just yet it's actually a great way to spin up VMs and learn some of the Microsoft core technologies if you haven't got a free server to use for evaluations.

    Captain Code is getting to grips with PowerShell, which should give you time back in your day to do more MVA courses rather than clicking on endless dialog boxes to do repetitive tasks.

    Dr Desktop will be all over Configuration Manager and Intune, as it's no fun to fiddle with profiles and applications on all the devices accessing corporate resources one by one.

    RackMan understands how to turn Just a Bunch of VMs (JBOV) into services that can scale, be costed and be provisioned using self-service, just like a public cloud.

    Hypervisor needs to efficiently allocate compute, network and storage, and provide quality of service to all of those to prevent noisy neighbours hogging resources.

    Solid Server understands that while we have all seen Windows Server, we haven't seen all of Windows Server, and things like IP Address Management, Storage Spaces and VDI are all hidden away waiting to be turned on.

    Taking the courses these heroes have done is a good starter and will help you get on the road to expertise and ultimately certification. Even if you are familiar with older versions, they'll show you what's new and may uncover some useful features that you didn't know were already in the product. For example, I did a talk in Leeds last week where no one knew about the Active Directory Recycle Bin in Windows Server 2008 R2, and at another event no one had seen IP Address Management although everyone was running Windows Server 2012!

    We are having a bit of fun with the MVA - when you sign up to MVA Hero you will receive a free t-shirt and then once you complete a badge, you will be rewarded with a stress ball hero figurine (collect all six!) to show you have earned the skills of the MVA SuperSix!

    So, to quote one of my colleagues on why she loves working at Microsoft: "it's all about learning and laughing".

  • Hybrid Cloud – take the ExpressRoute to extending your VPN into the Microsoft Azure Cloud



      Straight from the blog of Ed Baker, Microsoft Technical Evangelist.



    One of the traditional impediments to businesses adopting public cloud computing is the concern over putting all your eggs in one basket. The Hybrid cloud is the solution to this.

    The Hybrid cloud is a description of utilising a pre-existing on-premises datacentre and a cloud solution such as Microsoft Azure to balance the overall solution.

    The last two days at Enstone with the Lotus F1 team have been an excellent introduction for a packed audience into the way to use System Center to manage your on-premises datacentre (or private cloud) and to start using Microsoft Azure to develop your Hybrid Cloud.

    Michael Taylor, Chief Information Officer of Lotus F1, takes the stage at our #UKITCamp: "We keep business critical in house but push everything else out to the cloud."

    Michael Taylor, CIO of Lotus F1 (above), gave an excellent introduction to both days. He explained that Lotus keep all business-critical and confidential data in their own data centres. Lotus then use Office 365 and Microsoft Azure to host their email and other less critical services in the Cloud, forming a true Hybrid Cloud solution.

    Often networking and connectivity are seen as another impediment to connecting these two discrete elements while maintaining security. The recent Heartbleed OpenSSL issue, although not directly affecting the Azure platform, does highlight the need for vigilance and for secure connectivity in a modern, robust Cloud solution. Essentially, businesses want Azure in their network.

     Yesterday Microsoft announced the ExpressRoute partnership programme and introduced BT and Equinix as the first partners to provide the solution in the UK and EMEA. The BT announcement is here and the Equinix one is here and the Microsoft Azure blog covers it well here.

    So ExpressRoute – what is that?
    First and foremost, ExpressRoute provides a private, dedicated connection between Azure and the customer datacentre, with no reliance on a shared internet infrastructure to reach your apps, services and data.

    Within this you can now choose the network performance you want or need (or can afford); this will allow you to design your apps better and meet QoS and SLA requirements.

    How fast, you say? Well, up to 10 Gbps; is that fast enough? If you have large amounts of data to move between your datacentre and Azure, or vice versa, then this is a great, fast and economically sound option.

    So it is fast but what can I use it for?
    ExpressRoute is designed to cater for mission critical workloads such as

    • Storage (Migration, DR, retention archives)
    • Dev/Test (large VM movements from Dev / Test / Production environments)
    • BI and Big Data (Efficient transfer of large data sets to increase ‘Big Data’ performance)
    • Media (solid and predictable performance for streaming data to or from Azure)
    • Hybrid Apps (the mix of High Bandwidth and Low Latency links create a great environment for Azure to be used as a datacentre extension for multi tier apps  – improved I/O and API response times.)
    • Productivity Apps (Sharepoint as an example requires high bandwidth and low latency to work at scale)

    There are three routes into Azure as shown below.

    ExpressRoute provides a dedicated private route in one of two flavours: an Exchange Provider route or a Network Service Provider route. The former provides a simple point-to-point solution, while the latter exposes Azure as an additional site in the corporate network.

    [Figure: the three routes into Azure]

    Any regular reader of this blog will know that I openly declare that the future is PowerShell, and ExpressRoute is no different: there are specific ExpressRoute cmdlets.

    [Figure: the ExpressRoute PowerShell cmdlets]
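
    For a flavour of what those cmdlets look like (a hedged sketch: the module path, provider, location and bandwidth values are examples and the names may change, so check Get-Command after importing the module):

    # List the available ExpressRoute providers and locations, then ask for a 1 Gbps circuit.
    # The service key that comes back is what you hand to your connectivity provider.
    Import-Module 'C:\Program Files (x86)\Microsoft SDKs\Windows Azure\PowerShell\ExpressRoute\ExpressRoute.psd1'

    Get-AzureDedicatedCircuitServiceProvider

    New-AzureDedicatedCircuit -CircuitName 'ContosoDC-to-Azure' `
        -ServiceProviderName 'Equinix' -Location 'London' -Bandwidth 1000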

    Microsoft provides ExpressRoute pricing for access and bandwidth as well as throughput, and your network provider will add their charges on top of this.

    To keep up to date with ExpressRoute and all the changes and developments within Microsoft Azure bookmark the Azure Page and the Azure Blog

    The only question left for me is: why wouldn't you adopt this economically viable, flexible and fast route to the Hybrid Cloud? Bring Azure into your network.

    Try it free. Connect to the cloud for free with Microsoft Azure.
    Sign up now or download your free training kit.

  • XP End of Support: The World Reacts

    This Tuesday April 8th 2014 was a notable day in the history of Microsoft, and indeed the IT industry. It was the day that Microsoft officially ceased supporting Windows XP, first announced several years ago.

    As a final farewell to the trusty old OS, we take a look at its last day in tweets, including advice, analysis and a ballad. Yes, really.

    If you're in the market for a new system or are working on your business' migration, we have a whole range of resources, guides and offers to help you make the move. Here's a handy selection to get you started:

     So without further ado, over to you, world.

  • In-Memory Computing, Boom or Bane for Storage?

      By Steve Willson, VP Technology Services EMEA at Violin Memory.





    SQL 2014’s new in-memory computing takes several leaps forward in performance and control.
    Application designers can now control which data is permanently staged in memory (thus reducing IO waits) and have the engine natively compile the SQL code (for dramatically faster execution). SQL 2014 also includes a new latchless locking mechanism (for the in-memory tables) that reduces inter-user transaction contention, allowing for increased concurrency. This is especially helpful as the number of users per system continues to grow.
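    As a minimal sketch of what that looks like in practice (names are illustrative, and the database is assumed to already have a memory-optimized filegroup), a durable memory-optimized table is just an ordinary CREATE TABLE plus a WITH clause, submitted here from PowerShell:

    # Create a durable, memory-optimized table on a SQL Server 2014 instance.
    # Instance, database, table and bucket count are examples only.
    $tsql = "
    CREATE TABLE dbo.SessionState
    (
        SessionId int          NOT NULL PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
        UserName  nvarchar(64) NOT NULL,
        LastSeen  datetime2    NOT NULL
    )
    WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
    "
    Invoke-Sqlcmd -ServerInstance '.\SQL2014' -Database 'AppDb' -Query $tsql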
    So, with the data now in-memory, what does this mean for storage?

    If we could hold all SQL Server data in-memory using server DRAM, performance would be amazing, servers would be well utilized, applications would fly – but the reality is businesses have requirements other than just performance.

    • Business data needs to be stored safely on a mature persistent media.
    • Business data needs to be accessed via multiple servers for high availability.
    • Systems need to scale over time.
    • Virtual machines need to migrate between servers.
    • New data is received and created constantly.
    • DRAM is a finite (and relatively expensive) resource.
    • High-DRAM memory models require high processor counts, and can reduce server utilization.
    • Reporting systems can have a very large footprint (well beyond DRAM scaling limits).
    • Loading data into the system needs to be timely (maintenance windows are evaporating).

    These types of challenges demand a persistent storage solution that complements the high performance of in-memory computing. In-memory database processing is enabled when the data can get into the system, be changed by the system and remain intact between events.

    If an application can write 20x faster, then it will need a persistent storage system that can ingest sustained writes 20x faster. In-memory processing allows for the transaction to start immediately (the data is already in DRAM) but it still has to write the change to storage before it can complete (there’s still a transaction log).

    To ensure that the new high-powered performance engine in SQL 2014 is properly enabled, Microsoft has been working on the next generation of transport (SMB 3.0 & SMB Direct). By adding multi-channel, transparent failover and RDMA-type features to SMB, they have created a protocol that runs faster than block storage (Fibre Channel) for less money (IP networking such as 10GbE, InfiniBand, etc.). Via UNC naming and SOFS (Scale-Out File Server) it also allows for the next generation of architecture, where everything is referenced by shares (no more static NTFS LUNs or drive mappings), secured and controlled by Active Directory, scales and migrates live, is monitored and provisioned by System Center and PowerShell, and also includes favourite features like mirroring, compression, encryption and deduplication.
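
    A brief sketch of what that looks like from PowerShell, assuming a Scale-Out File Server role already exists and the share path below sits on a cluster shared volume (share, path and account names are examples):

    # Publish a continuously available share for SQL Server data files, then check
    # from a client that multichannel and RDMA-capable (SMB Direct) NICs are in use.
    New-SmbShare -Name 'SQLData' -Path 'C:\ClusterStorage\Volume1\SQLData' `
        -FullAccess 'CONTOSO\SQLService' -ContinuouslyAvailable $true

    Get-SmbMultichannelConnection
    Get-SmbClientNetworkInterface | Where-Object RdmaCapable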

    IT departments are anything but one-dimensional, so Microsoft has been working with many partners to ensure that all classes of storage will work in the same ecosystem. For branch office kits, archive targets or moderate workloads there is the Cluster-in-a-Box class of products, and for high-end performance systems (SQL, SharePoint, Hyper-V farms, custom applications, etc.) there are all-flash arrays like the new Violin Memory Windows Flash Array (WFA). These all work off the same principle of Microsoft service to storage, operated by Windows Failover Clusters embedded inside the storage appliances. In the case of the WFA, Microsoft spent over a year tuning the kernel and I/O stack, optimizing for the world's fastest storage experience.

    In-memory processing can now be enabled by a natively Windows-embedded storage appliance that allows for extreme write loads, high availability, scalability and reduced administration time, and delivers high-speed data access to large reporting systems sitting next to the high-end transactional systems. Storage is no longer the bottleneck holding applications back; instead it will empower the speeds of the new high-performance engines like the one in SQL 2014.

    Also check out our recent announcements on SQL Server 2014.

    Did you find this article helpful? Let us know by commenting below, or reaching out to us via @TechNetUK.