Kevin Remde's IT Pro Weblog

  • Announcing: Taming Android and iOS with Enterprise Mobility Suite Jumpstart on MVA


    On December 8, 2014 my friend Simon May and I recorded (while presenting live) a Microsoft Virtual Academy “Jumpstart” session all about how to manage iOS and Android devices with Microsoft solutions.  We divided the topic up into 4 modules, to introduce the topic and the fundamentals, as well as to cover iOS and Android device management in greater depth.  We covered how Microsoft products and solutions such as Windows Server, Azure Active Directory, Microsoft Intune and System Center Configuration Manager can be used to grant secured, managed access to corporate resources for those users who “BYOD” their iPhones, iPads, or Android devices.

    Here is the landing page:

    “Kevin, did you have a cold?”

    Ugh.. I was in the 2nd week of a pretty annoying chest cold.  I hope the editors were able to edit out the random hacking and coughing.  I felt fine otherwise.  And <knock on Surface Type Cover> I am healthy now.  Smile

  • TechNet Radio: Success with Enterprise Mobility - Live Webcast Series Preview

    In this episode I am honored to welcome back Microsoft Vice President Brad Anderson to the show.  We discuss his upcoming monthly live webcast series, “Success with Enterprise Mobility”, which kicks off on Tuesday, December 9th and concludes on March 3rd. Tune in as he gives us a preview of what he and his guests will be discussing, and learn how you can successfully support business productivity for your users through secured and controlled mobility.




    • [1:02] It’s been about a year since our last conversation on TechNet Radio. So what are you up to these days?
    • [2:31] Tell us about the “Success with Enterprise Mobility” blog series you’ve been writing. Why is this an important topic?
    • [5:29]  Let’s chat about your upcoming live webcast series. What’s the series title, and what will you be discussing?
    • [13:11] Where can people go to see the full schedule and register for these webcasts?

    Success with Enterprise Mobility Series

  • Newest Azure (IaaS) Cost Estimator Tool is Available

    It isn’t a brand-new tool, but it was updated to version 1.1 the other day, and definitely worth sharing.  The Microsoft Azure (IaaS) Cost Estimator Tool is now available.  It’s an installable tool that allows you to “profile [your] existing on-premises infrastructure and estimate cost of running it on Azure.”

    “Sweet!  So, it installs agents on servers and then..”

    Whoa!  Lemme stop you right there!  No agents.  It’s agentless.  It does require you to supply administrative credentials that will apply to the machines you’re profiling, which makes sense.

    The first time you run it, you’ll see this screen.

    Cost Estimator Tool

    As you can see, the description of what it does and how it can be used are clearly spelled out.

    In my example, I’m running the tool on a PC in my test network.  I’ve selected to profile physical machines, such as my domain controller named, surprisingly enough, “DC”.  I’ve supplied my credentials…

    Machine Selection

    And clicking Add, plus adding a couple of other machines (whose names might give away their purpose) results in this:

    Machine Details

    Clicking Next brings me to the page where I can choose a profiling duration, scanning frequency, and a name for the generated report.

    Profiling Choices

    I’m going to scan only one time, so my results won’t be based on actual traffic or performance of my machines measured over time.  But it’s good enough for a start.

    Profiling Choices

    I click Begin Profiling, and (in my case) after about 10-15 seconds my one-time scan is complete.  I click View Report, and after an informational pop-up describing what was done and what options I have to change values, I see this screen:

    Scan Result

    Notice that I can tweak values and select some or all of the machines before clicking Get Cost.  I’ll just leave the values as determined, select all, and get my cost.  Here is the result:

    End Result

    Notice that I can tweak the pricing model, and change the size of the Compute Instance (the type/size of VM) to play with various values.  And when I’m done, I can export the results to a .CSV file (for use in Excel), or go back and try it all over again.  Pretty nice?

    “Very nice!  But, what does this tool cost?”

    Nothing.  Nada.  Zilch.  Zero-dollar$.

    “Sure.  And I suppose once I run this tool I’m going to be bombarded with e-mails from Microsoft.”

    Nope.  Not even a requirement for a Microsoft Account to download, and no information is ever sent from this app back to Microsoft.

    Seriously, Microsoft hopes that this will be a good way to get an idea of what your existing machines, whether physical or already virtualized, will cost to run over time as VMs hosted in our Azure Infrastructure Services.  It’s all a part of helping you plan for an eventual migration of some of your local resources into Azure, to take advantage of the scale, capacity, security, and cost-benefits of the cloud.

    In case you missed the link earlier, here it is again: Microsoft Azure (IaaS) Cost Estimator Tool

  • Desired State Configuration (DSC)–Modernizing Your Infrastructure With Hybrid Cloud (Part 25)

    DSC Overview

    Welcome to another in our series entitled “Modernizing Your Infrastructure with Hybrid Cloud”.  As you may be aware, this week the theme is “Management and Automation”.  As a part of that theme I’m sharing with you an introduction to Desired State Configuration (DSC); more completely called Windows PowerShell Desired State Configuration.

    DSC is a relatively new (less-than-a-year-old) technology, introduced with PowerShell v4.0, that lets IT define what the configuration of a server will be, apply that configuration, and then verify (and remediate) that the configuration is still in place, as desired.

    “So, it’s like System Center Configuration Manager?”

    No.  It’s built-in as a part of Windows, and is configured and implemented using PowerShell.  Sound interesting?

    “I’m listening.”

    Good.  In the context of one blog article I naturally won’t be able to go into every detail, but I hope that this article, some simple examples, and some additional resources at the end will get you excited about trying this out.  And ultimately that you’ll see the immense value that this will give your IT and, of course, your business.

    A Simple Example

    For our quick example let’s assume a couple of things.  I’ve enabled the Windows PowerShell DSC feature on a server named “Server1”.  Server1 is a member server in my domain.  I’ll be using an administrative account from another server (called Admin) to apply configuration to Server1. 

    I open up the PowerShell ISE and enter the following text.  Can you tell what it’s doing from what the text says?
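    The screenshot of that text didn’t survive here, but based on the discussion around it, the configuration would have looked something like this.  (A reconstruction on my part, not the original screenshot; the feature names are the standard ones for IIS and ASP.NET 4.5.)

```powershell
# Reconstructed sketch of the configuration from the original screenshot
Configuration IISWebsite
{
    Node Server1
    {
        # Ensure the Web Server (IIS) role is installed
        WindowsFeature IIS
        {
            Ensure = "Present"
            Name   = "Web-Server"
        }

        # Ensure ASP.NET 4.5 is installed
        WindowsFeature ASP
        {
            Ensure = "Present"
            Name   = "Web-Asp-Net45"
        }
    }
}
```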


    “It looks like it’s defining something that’s a ‘Configuration’ and calling it ‘IISWebsite’.  And for your server named Server1, it’s laying out what Windows Features should be installed!”

    Exactly!  And in this PowerShell session, when I execute the configuration, I end up with a .MOF file, which is a definition of how Server1 should have the Web Server and ASP.NET 4.5 features installed and running.  All I need to do is run the Start-DscConfiguration PowerShell cmdlet with the proper parameters referring to the .MOF file and pointing to Server1, and DSC configures the features and enforces that they always be there, as desired.  In fact, even if I or another administrator were to manually remove the ASP.NET 4.5 feature from the server, after a period of time the state would be re-evaluated and the configuration would be fixed!

    What if, like those “WindowsFeature” sections, I were to add a “File” section like this:
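    (The original screenshot is missing here as well.  A “File” section along these lines would match the description; the source and destination paths are placeholders of my choosing, not the originals.)

```powershell
# Keep web content in sync with a known-good source folder
File WebContent
{
    Ensure          = "Present"
    Type            = "Directory"
    Recurse         = $true
    MatchSource     = $true
    SourcePath      = "C:\SetupFiles\WebSite"   # hypothetical source of the content
    DestinationPath = "C:\inetpub\wwwroot"      # the web site folder on Server1
}
```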


    Basically what I’m saying is, “Here’s the source folder of content that I want you to make sure is always found under this destination.”  Ah.. and doesn’t the path look like it might be a web site folder?  Yes!  This configuration not only enforces that IIS be installed and running, but that the contents of a web application be always there and that the destination code always matches what is coming from the source!  Someone could go in there and, say, delete some of the web content, but DSC would fix it automatically!

    “Hey Kevin… What’s a .MOF file?”

    Yeah.. this was a very quick, very simple example.  Let me go through and briefly describe the parts that make up DSC…

    The Parts – Configuration

    The configuration is what we built in my earlier example.  It’s a PowerShell definition that, using “Resources” (defined next), specifies how things should be configured; our “desired state” for the configuration of a target server.

    The Parts - Resources

    In our example above, you notice that I’m defining what Windows Features are to be installed.  I can do this because there is a built-in DSC “Resource” called “WindowsFeature”.  From the TechNet Documentation, “Resources are building blocks that you can use to write a Windows PowerShell Desired State Configuration (DSC) script.”  Windows comes with a number of these built-in resources that know how to specifically work with, configure, and enforce various aspects of the operating system.  Resources for working with the registry, the file system, Windows Features, services… and many more, are included in the list of built-in DSC resources.
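    If you’re curious which resources are available on your own machine, PowerShell will tell you directly.  (The output varies by system; this is just the standard way to ask, not a step from the example above.)

```powershell
# List the DSC resources available on this machine
Get-DscResource | Select-Object Name, Properties

# Show the syntax for a single resource
Get-DscResource WindowsFeature -Syntax
```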

    But it gets even better.  These resources are just PowerShell modules.  And just as you have the ability to create your own modules to extend PowerShell, you also have the ability to create your own custom resources!  

    The Parts - The .MOF file

    This is the file that contains the configuration to be applied.  It’s the result of executing the configuration definition in PowerShell, and is in a standard format as defined by the DMTF (Distributed Management Task Force).

    “Hey Kevin - Why do we even really need a .MOF file?  Can’t Microsoft just do what it needs to do directly from PowerShell?”

    I’m sure they could.  But the beauty of using the .MOF is that because it’s a DMTF standard, it is formatted in a way that can be applied to different machine types and for various purposes.  In fact, at TechEd in Houston earlier this year I saw Jeffrey Snover actually use DSC to create a .MOF that then configured a Linux server running an Apache web server.  (Yeah.. we’re “open” like that these days!)

    The Parts - How It’s Deployed

    The full name, “Windows PowerShell Desired State Configuration” is a hint about how you enable the DSC capability.  It is a feature of Windows Server 2012 R2, found here in the Add Roles and Features Wizard:

    Add Roles and Features Wizard

    When you check the box, you’ll notice that it will also install some Web components to your server…

    Additional features being installed

    This is because one of the ways DSC configurations are securely pulled is to use IIS.

    Dr. Doolittle's Push-me-Pull-You

    The Parts - Push-me-Pull-You?

    One important aspect of DSC is that it becomes even more powerful when you can distribute configurations, or maintain consistent configurations among many machines, all from a smaller number of source locations.  DSC allows either a simple “push” distribution, which is more manual, or a “pull” distribution, where you not only apply a configuration to a machine but also tell it where it should look for its configuration (and any changes) going forward.  Pulling can take place over HTTP (not recommended), HTTPS (recommended), or an SMB share (okay, because it’s authenticated access).

    “Why isn’t HTTP recommended?”

    Think about the damage someone could do if they hijacked DNS and then pointed to and automatically applied someone else’s version of a server configuration to your servers.  Scary prospect, indeed.

    The Parts – The Local Configuration Manager

    The Local Configuration Manager is “the Windows PowerShell Desired State Configuration (DSC) engine. It runs on all target nodes, and it is responsible for calling the configuration resources that are included in a DSC configuration script.”  So basically, when you’ve enabled the DSC feature on a server, this is the service that either takes the pushed configuration or pulls the configuration, and then applies it as defined in the most recent .MOF file.
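    The Local Configuration Manager has settings of its own, and (fittingly) you configure them with a configuration.  As a sketch, with interval values that are purely illustrative, telling the LCM on Server1 to re-check and auto-correct its state every 30 minutes might look like this:

```powershell
Configuration LCMSettings
{
    Node Server1
    {
        LocalConfigurationManager
        {
            # Re-apply the configuration automatically when drift is detected
            ConfigurationMode              = "ApplyAndAutoCorrect"
            ConfigurationModeFrequencyMins = 30
            RefreshMode                    = "Push"
        }
    }
}

# Generate the meta-configuration .MOF and apply it to the target node
LCMSettings
Set-DscLocalConfigurationManager -Path .\LCMSettings -Verbose
```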

    For More Information…

    Like many of you, I find that I learn best by looking at other people’s examples.  And thankfully in the case of PowerShell and DSC there is a really big community already formed and willing to share what they have done with the rest of us.  Here are some of the places I recommend you check out and save to your favorites if you’re really going to get serious about using Desired State Configuration:

    If you want to try it out in a virtualized lab environment:

    And finally, don’t forget to check in frequently at our “Modernizing Your Infrastructure” series landing page, to see all the great articles our team has created and resources we’ve shared.

  • Breaking News: System Center 2012 R2 DPM is now supported in an Azure Virtual Machine

    Get the System Center 2012 R2 evaluation and try out DPM!

    This is pretty cool.

    As the title says: System Center 2012 R2 Data Protection Manager is now an application that Microsoft will support when running inside a virtual machine in Microsoft Azure.

    I’m sure they won’t mind me sharing this.. but here is the text from an e-mail I received on the subject that spells it out nicely:

    We are pleased to announce that System Center Data Protection Manager (DPM) is now supported to run in Azure as an IaaS virtual machine. This announcement allows customers to deploy DPM for protection of supported workloads running in Azure IaaS virtual machines. Customers with a System Center license can now protect workloads in Azure. Read more about it on the DPM blog.

    Support for multiple virtual machine sizes

    Choose the size of the virtual machine instance that will run DPM, based on number of workloads and the total data size to be protected. Start with just an A2 size virtual machine, and upgrade to a larger size to scale up and protect more workloads.

    Support for Microsoft Azure Backup

    Protect your data to Microsoft Azure Backup and get longer retention with the flexibility of scaling storage and compute separately. The Microsoft Azure Backup agent works seamlessly with DPM running in an Azure IaaS virtual machine.

    Familiar management using the DPM console

    With DPM running in an Azure IaaS virtual machine, you get the same experiences and capabilities that you are familiar with.

    So, here’s what you should do:

    1. Read the announcement on the DPM Blog
    2. Set up a free 1-month Trial Azure Subscription (if you don’t already have one),
    3. Get the System Center 2012 R2 trial, and
    4. Try out DPM in an Azure Virtual Machine.

  • Cross-Premises Connectivity with Site-to-Site VPN - Modernizing Your Infrastructure With Hybrid Cloud (Part 15)


    Yesterday in our “Modernizing Your Infrastructure with Hybrid Cloud” series, Matt Hester described how to create a virtual network “in the cloud” in Microsoft Azure in order to support cloud-based Virtual Machines and their ability to communicate with each other and with the outside world.  We of course have the ability to connect to our VMs individually using Remote Desktop connections, but if we’re going to treat the location of these cloud based machines as just an extension of our own datacenter, we’re going to want to have a secured connection to them.

    That’s what the VPN Gateway is all about.

    In this article I’m going to show you step-by-step how to connect your Azure virtual network to your on-premises network.  Here are the steps we’ll go through:

    • Collect Some Information
    • Define the “Local Network”
    • Enable Site-to-Site
    • Create the Gateway
    • Configure the Local VPN Device

    And as an added bonus, I might throw in a little something extra.


    I hope you’ll think so.  But for starters, let’s begin where Matt left off.  I have an Azure subscription with a virtual network named AzureNet1, located in the South Central US datacenter region.  In my scenario, I want to connect this Azure network to my Fabrikam office (Fabrikam was recently purchased by Contoso).  Once the connection is established, I will want to join servers in that office to the domain.

    Here’s what the AzureNet1 network dashboard tab currently looks like.  Note the two virtual machines currently in this network; a domain controller and an application server.

    AzureNet1 Dashboard

    As you can see on the configure tab, I’ve set up two subnet ranges (their purposes are obvious based on the names I’ve given them) as part of an 8-bit-masked 10.x.x.x subnet:


    Notice that I’ve also defined a DNS server; my domain controller has that address.


    Collect Some Information

    Before we start adding the site-to-site connection, I need to collect some information so that I can carefully use it to make the correct configurations.  As you probably know first-hand, when doing networking configuration it’s easy to make simple little mistakes that cause everything to NOT work, so let’s make quick note of a couple of important items:

    • Local Network Address Range
    • Gateway Address

    Your local network address range refers to the addressing of your local network.  By that name, though, it’s a little misleading.  “Local” assumes you’re connecting your Azure network to some “Local” office.  But in reality it could be some other branch office or even another virtual network somewhere else in the Azure world.  So, just think of “local network” as being “the network I’m connecting to my Azure network”.  And I’ll keep using “local network” in “quotes” throughout the rest of this article for just that reason.

    In our example, my Fabrikam network uses a 16-bit subnet mask.

    The Gateway Address is the externally accessible IP address of the gateway.  In the Fabrikam network, let’s say that I have a VPN device connected to the Internet with an external, Internet-exposed address.  I will also have a gateway address on the AzureNet1 gateway, but that address will be assigned when I create the gateway for my virtual network.  So, in simple terms, the gateway address is the connection point on either end of the VPN connection.


    Define the “Local Network”

    Before enabling the site-to-site connectivity and creating the gateway, we need to define the “local network” Fabrikam, so that our network knows what addresses it will be routing to over the VPN through the gateways.

    To define my “local network” (which I’ll name “Fabrikam”), I clicked on +New in the bottom-left corner of the Azure portal, and selected Network Services –> Virtual Network –> Add Local Network.

    Add Local Network

    I give my local network a name and, optionally, the gateway address (I can add it later if I don’t know it right now).


    Then on the next screen I add the address spaces that exist at my “local network” at Fabrikam.


    Once created, you’ll see it in the list on the local networks tab.

    Local Networks


    Enable Site-to-Site

    Back on my AzureNet1 network and on the Configure tab, now I can check the box to enable Site-to-Site Connectivity.  Notice that a couple of things change.  I now will choose which “local network” I’m going to connect to (Fabrikam), and it also requires (and defines) a “Gateway Subnet” for me.


    “Hey Kevin.. What’s that ‘ExpressRoute’ option?”

    That’s actually what my friend Keith Mayer is going to cover in tomorrow’s article in the series. I’ll include the link to his article after it’s published.

    UPDATE: Here is Keith’s article - Modernizing Your Infrastructure with Hybrid Cloud - Step-by-Step: Cross-Premises Connectivity with Azure ExpressRoute (Part 16)

    Anyway, after checking Connect to the local network, clicking Save starts the process of updating the network configuration.  After a couple of minutes it completes, and now back on the dashboard tab we see this:


    This means the gateway is defined, but not actually created.  That’s our next step.


    Create the Gateway

    At the bottom of the dashboard screen, click on Create Gateway.


    Notice when you click it that you are given a choice between a static and dynamic routing VPN gateway.

    “What’s the difference?”

    Your choice will be based on a number of factors.  Often the VPN hardware you are using will limit you to one or the other.  A static routing VPN gateway is one that routes traffic based on policy definitions (which is why it’s often referred to as a policy-based VPN); packets are routed through the gateway based on a defined policy, an “access list”.  A dynamic routing VPN gateway, also known as a “route-based VPN”, simply forwards packets between the two networks: if the network doesn’t locally contain the destination for a packet, the gateway is assumed to know where to send it, and if the destination is known by the gateway to exist on the other network, the packet is sent securely through the tunnel.  For more information about these choices, and about various devices and gateway types that support either static or dynamic VPN gateways, check out this excellent documentation.  Even if your device is not on that list, it may still work if your hardware supports the required gateway type.

    In my scenario I’m creating a simple tunnel to a device that supports the other end of dynamically routed VPN, so I’ll choose Dynamic Routing.  Creating the gateway does take a good amount of time (as much as 15 minutes), so be patient.  Eventually our display will go from this:

    Creating Gateway

    …to this:


    …and eventually, this:


    Notice that we’ve been assigned an actual external gateway IP address.  We’re still not actually connected.  (Connected would be GREEN in color.)  We haven’t addressed the configuration of the “local network” side of our connection yet.  At the bottom of the page you see a Connect button:


    But let’s not click that just yet.  We still need to…

    Configure the Local VPN Device

    Other than collecting some information about our Fabrikam network, we’ve only focused on the AzureNet1 side of our VPN tunnel.  We still need to create the gateway on our Fabrikam network.

    On the AzureNet1 dashboard, notice this hyper-link towards the right-side of the page:

    VPN Device Script

    Clicking on Download VPN Device Script brings up a very interesting page that allows us to specify what kind of hardware (or software) we have on the “local network” side of our connection.  The beauty of this is that, based on our selection of hardware (or even Windows Server 2012 or 2012 R2 Routing and Remote Access (RRAS) working as your gateway), you are generating a script that can then be used to automatically configure your gateway on the “local network” side.

    VPN Device Configuration Script

    Once we’ve selected our Vendor, Platform, and Version, and clicked the check mark, we’re immediately sent a text file containing the configuration script for our selected device.

    VPN Device Script

    Use this script to configure your device, establish the connection from the local network, and then come back to the Azure network dashboard and click connect.  And if you’ve done everything correctly, you should see something happy (and GREEN) like this:


    “What kind of hardware do you have on Fabrikam’s network, Kevin?”

    I don’t know.  I’m not actually using a local network.  For this demonstration, I’ve actually connected my AzureNet1 network, which is located in the South Central US datacenter region, to a Fabrikam virtual network that I host in the Central US datacenter region and manage through an entirely different Azure subscription.  So.. I’m doing Site-to-Site between two Azure virtual networks.  That’s my “something extra” that I promised earlier.  Now I’m going to show you what I needed to do to make that connection work.

    Connecting two Azure networks via a Site-to-Site VPN requires two things:

    1. Dynamic Routing Gateway, and
    2. A shared private key. 

    I’ve already showed you where you choose Dynamic Routing when you create the gateway.  And other than the shared key, everything else I did for configuring the Fabrikam network was identical to what I configured in AzureNet1, except that my network in Fabrikam uses the same address range that I defined for the “Fabrikam” “local network” on this side. IMPORTANT: These have to match.  The range and mask have to be correct and consistent on both ends, in both the “local network” definition and the actual network (or Azure virtual network, as in my case), for this to work.

    Also in the definition of the “local network” on either side was the specification of the Gateway IP Address.  Again, ordinarily, your configuration script is populated with the Azure virtual gateway’s IP address.  But in this instance, I need to create the gateway first, and let it fail connecting, just so I can see what the actual assigned gateway IP address on that side of the connection is going to be.  Then I can take that address and configure it into the “local network” definition on the other side.
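    As an aside (my addition here, not a step from the portal walkthrough): rather than waiting for a failed connection attempt, the Azure PowerShell cmdlets can read the assigned gateway address directly once the gateway exists:

```powershell
# Read the virtual network gateway's state and its public (VIP) address
Get-AzureVNetGateway -VNetName "Fabrikam" | Select-Object State, VIPAddress
```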

    As for the shared private key.. Notice at the bottom of the AzureNet1 dashboard that there is a Manage Key button:


    If I click this, I can see (and copy) the generated long key.  I’ll copy it to the clipboard.

    Manage Shared Key

    This key was created when we created the gateway, and is included for you in the configuration script for your “local network” device.  But…

    “We don’t have a local network device!”

    Bingo.  And we also don’t (as of this writing) have a way to use the Azure portal to set the shared key in the configuration of the virtual network!  But we will need to set it on at least one end of the tunnel to make sure they match.  (Or on both ends, if we want to use our own text as the shared key.)

    This is where PowerShell comes in.

    I’ve installed the Azure PowerShell cmdlets onto my local system, and then in PowerShell I connected to my Azure subscription where the Fabrikam virtual network resides.  And now I use the following PowerShell command to set the shared key for the gateway connecting the Azure network Fabrikam to the (from this point of view) “local network” named AzureNet1.

    Set-AzureVNetGatewayKey -VNetName fabrikam -LocalNetworkSiteName AzureNet1 -SharedKey 2kDsdqXnxeXrGjI4r4rLltKKT1g9E9gY

    (For the Windows PowerShell command-line tools, go to the Azure downloads page, and scroll down to “Windows PowerShell” section.  Instructions for setting this up are found there as well.)

    That’s how I was able to get the common shared key into the other side of my connection.  After this command completes, and soon after clicking Connect in the dashboard, I was happily sending data back and forth.
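    And if you’d rather use your own text as the shared key on both ends, the same cmdlet run against each subscription does the trick.  (The subscription names and key here are hypothetical; substitute your own.)

```powershell
# Set the key on the Fabrikam side...
Select-AzureSubscription -SubscriptionName "Contoso-Sub2"
Set-AzureVNetGatewayKey -VNetName Fabrikam -LocalNetworkSiteName AzureNet1 `
    -SharedKey "MyOwnSharedKeyText123"

# ...and the matching key on the AzureNet1 side
Select-AzureSubscription -SubscriptionName "Contoso-Sub1"
Set-AzureVNetGatewayKey -VNetName AzureNet1 -LocalNetworkSiteName Fabrikam `
    -SharedKey "MyOwnSharedKeyText123"
```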


    “But Kevin… Prove to us that you have the connection established!  Finish your domain-joining scenario!”


    In the AzureNet1 network I have two servers.  One is a domain controller, and the other is a member server.  All machines here are assigned the domain controller as their DNS server.  They reside in the South Central US datacenter region.


    On my Fabrikam network (which, you’ll recall, resides in the Central US datacenter region, so not in the same location as the AzureNet1 network and machines) I have one server that I’ve just created:

    Fabrikam Network Machines

    Importantly, I’ve also created a “DNS Server” designation here and assigned it to the Fabrikam network, using my domain controller’s address.  Note the configure tab of the Fabrikam network.


    In this way my machines in this Fabrikam network will be assigned that DNS server, and so will know how to find the DC in the AzureNet1 network.  To verify this I can establish a remote desktop connection to my new karContosoDC2 server and look at the status of the network adapter:

    Network Status

    Trusting that my VPN is happily and dynamically routing traffic between Fabrikam and AzureNet1, and knowing that my new server in Fabrikam is going to look for DNS at the domain controller in AzureNet1, I attempt to join the domain:

    Joining the Domain

    I am asked for domain credentials (a very good sign!)…

    Prompted for Creds


    I'm in!

    I’m in!  That’s proof that I have successfully connected these two virtual networks!


    For more information on configuring secure cross-premises connectivity, check out the official documentation here:

    Here are some more specific configurations and their documents:

    And be sure to keep watching for the full series of articles on modernizing your infrastructure.

  • TechNet Radio: (Part 3) Modernizing Your Infrastructure with Hybrid Cloud - Planning Hybrid Cloud Networking

    Keith Mayer and I continue our series on “Modernizing Your Infrastructure with Hybrid Cloud”.  In today’s episode we discuss various options for networking. Tune in as we go in depth on what options are available for hybrid cloud networking as we explore network connectivity and address concerns about speed, reliability and security.

    Planning Hybrid Cloud Networking


    • [2:46] What components are involved in Hybrid Cloud Networking?
    • [5:30] What are some of the technical capabilities of Hybrid Cloud networking?
    • [9:25]  Which VPN gateways are supported with Microsoft Azure?
    • [11:28]  What are some of the common scenarios that customers are implementing for Hybrid Cloud networking?
    • [15:40]  Besides Site-to-Site IPSec VPNs, are there any other connectivity options for Hybrid Cloud networking?
    • [20:10] DEMO: Can you walk us through the basic steps for setting up a Hybrid Cloud network?

    Shortened URL if you would like to share on Twitter or Facebook, etc.

  • TechNet Radio: (Part 2) Modernizing Your Infrastructure with Hybrid Cloud - Planning a Hybrid Cloud Storage Architecture

    My friend Dan Stolts and yours truly continue our series on “Modernizing Your Infrastructure with Hybrid Cloud” with an overview of how to plan for a hybrid cloud storage solution using Windows Server 2012 R2 and Microsoft Azure. Tune in for our lively discussion on the many storage options available to you, as well as discussions around performance, reliability and security.

    Watch the entire video


    • [1:18] Let’s start with a quick summary of existing storage capabilities using modern infrastructure on-premises as supported by Windows Server 2012 R2
    • [10:16] What is Azure Storage?
    • [11:17]  Can you give us a quick overview of Azure Storage Architecture?
    • [12:30]  In order to connect local systems to Azure Storage accounts, I have to think there is some kind of authentication required to make that happen securely. How is that done?
    • [16:00]  What is Blob Storage?
    • [17:30] What are some common uses of Azure File Storage?
    • [18:18] Is Azure data reliable?
    • [21:10] Since we can access storage from Azure services or from our on-premises services, what kind of performance can we expect?
    • [23:17] I understand we can take snapshots of data in Azure. Can you tell us a bit about snapshots?
    • [24:37] Other than through the Azure portal, how can businesses access Azure data?
    • [28:45] What can you tell us about StorSimple?
    • [32:21] Can I use Azure to host my SQL Server database?
    • [35:19] Are there other storage components that we have not talked about?

    Follow the entire series!

    Shortened URL if you would like to share on Twitter or Facebook, etc.

  • Modernizing your Infrastructure: The FREE Microsoft Azure Virtual Machine Readiness Assessment

    Extend your datacenter

    Welcome to another in our new series of “Modernize your Infrastructure” articles.  Today I’m pleased to share with you the details of yet another free and easy-to-use assessment tool from Microsoft.  The purpose of this tool is to help you answer the following important question:

    “Are my servers and services able to be migrated to Microsoft Azure?”

    And that is a fair question, particularly if we see the value but don’t really know where to begin.  Suppose that, in the process of modernizing my infrastructure, I consider moving some (or all) of my servers – whether physical or virtual machines – off of my local hardware and into “the Cloud” as Microsoft Azure hosted Virtual Machines, extending my datacenter.  It would be good to have a starting-point assessment to help me learn and consider what might be required; and even better if it was based on my current environment and some initial goals and desires.

    And that’s what the Microsoft Azure Virtual Machine Readiness Assessment is all about.

    It’s a free and easy-to-install tool that, when run on a supported OS with the proper credentials, asks you a number of questions about your environment and about your needs and desires (the end goal), and produces a lengthy report based on your answers and, importantly, on what it was able to detect in your infrastructure.

    “Can you show it to me?”

    Showing you the whole process would be overkill here.  But how about I show you some of the highlights?


    Requirements and Installation

    The download page is where you’ll find a good description of how and where the tool can be run.  In basic terms, it will run on any OS newer than Windows Vista and Windows Server 2008.  It does have some .NET framework requirements as well.  The instructions are pretty simple:

    1. Download and run WAVMRA.EXE on the computer you want to run the assessment from
    2. Complete the installation steps
    3. Launch the tool
    4. Select the technology you want to assess and proceed through the wizard experience

    Run it

    On the workstation you’ve installed the tool on, make sure you run it as an administrative account that has rights to administer Active Directory, SharePoint, or your SQL Servers (whatever it is you’re interested in assessing). 

    Naturally, the first question you are asked is “What would you like to assess?”

    What would you like to assess?

    Your answer here will determine some of the remaining questions concerning what kind of connectivity, applications, availability, and performance you’re going to require.

    Let’s say that, in my example, I want to extend my Active Directory domain into the cloud.  Using my single corporate domain, I want to extend authentication to other applications that I’ll host on virtual machines in my Microsoft Azure network.

    Prior to the remainder of the questionnaire, you are reminded of the requirements for this tool to be able to run successfully:


    Answer the Questions

    The rest of the process, prior to scanning your environment and generating the final report, is to answer additional questions.  In my scenario, I’m asked 13 more questions.  “All questions must be answered as part of completing this assessment.”  Here are a few samples:

    Network requirements

    LOB Apps

    Deploy Workloads

    Cloud DR Site

    AD Forest

    Note that each question provides additional detail about what’s being asked, and you are often given the option to basically say “I don’t know yet”.  Trust me – the report will give you excellent detail on, and pointers to additional information about, all of the options available.


    And eventually…

    View your report


    The tool generates a Microsoft Word .docx file that you can save, print, share.. whatever you want to do with it.  Inside you’ll find a detailed report on what you’ve chosen, what’s required of you, and links to additional information and further learning around your next steps.  The report is organized into three parts: “Ready”, “Set”, and “Move”.


    And then shows you “What we checked”, with a quick visual indication of which items are fine, and which ones should probably be looked into further.


    And that’s it! 

    Hopefully you’ll find this a useful first-step into extending your infrastructure into the Microsoft Azure cloud.

    Go Forward “To the Cloud!”

  • Modernize your Infrastructure: The Series

    Modernize your Infrastructure

    The team of US DX IT Pro Technology Evangelists is doing another series of articles and TechNet Radio interviews.  The topic: Modernizing your Infrastructure.  The goal: To give you as many resources as you’ll need to get your infrastructure up to speed, whether you’re simply looking for ways to migrate off of Windows Server 2003, or even move workloads and applications up into Microsoft Azure.

    “Sounds great, Kevin!  Where is the series landing page?”

    Go here:

    Bookmark it and check back often.  There should be new content there every day.

    Speaking of new content, today’s first-article-in-the-series is by Dan Stolts.  He writes about and documents using the Microsoft Assessment and Planning Tools.  CHECK OUT HIS ARTICLE HERE

    “Hey Kevin, are you writing any articles for the series?”

    Absolutely.  I have one going live tomorrow, others scheduled for later.  And you’ll also get to see my pretty (?!) face in a couple of TechNet Radio interviews over the next couple of weeks.

    Stay tuned…

  • A Useful List of Windows 8 or 8.1 Resources for IT Pros

    Thanks to Mary Jo Foley for tweeting about this.  Mary Hutson is maintaining a very useful list of “top Microsoft Support solutions for the most common issues IT Pros experience when using or deploying Windows 8 or 8.1.”  She updates the list every quarter; the most recent being just two days ago (Aug 11, 2014). 

    * HERE IS THE LIST * <—Click that

    Kudos, Mary!  This is a great page to bookmark!

    Mary Hutson

  • Upcoming Microsoft Events: Do you want to stay informed?

    Whether or not you choose to believe me, here are two things that I know to be true:

    1. Microsoft takes your privacy and preferences very seriously, and
    2. Microsoft wants you to stay “in-the-loop” about what’s coming.

    Often our marketing and operations people send out notices about up-coming events, such as our IT Camps or live Microsoft Virtual Academy trainings.  But many of you aren’t getting those e-mails, either because you never set up your information and preferences (who you are, your interests, and how you’d like to be contacted), or because at some point you said “no” to getting more e-mail from Microsoft.  And if you opt out of one thing, you’re opting out of everything.  (See point #1 above)

    “So.. what if I do want to be contacted about up-coming events?”

    You need to edit your information and preferences associated with your Microsoft Account (formerly LiveID).  And that’s what this blog post is all about.

    The Profile Center:

    At the profile center, you can sign in and set up or edit the information about you and your preferences.  It’s broken down into these 5 areas:

    Once you go to any of those pages and sign in, you can add to or edit the information that Microsoft has about you.  Importantly, in order to get notifications from Microsoft about events or resources that are important to you, you need to fill in the correct business and interest areas, and then allow Microsoft to contact you via e-mail.  Here is a quick summary of each of the areas, and I’ll highlight the ones that will help you get our e-mails in the future.

    Personal Information

    Share as much as you’re comfortable with, but know that if you don’t give us accurate information, it’s not going to help us to keep you up-to-date. 
    IMPORTANT: Even if you see the proper e-mail address, make sure you click edit next to it to see whether or not you “would like to hear from Microsoft…”.  If it’s un-checked and you’d like to hear from us via e-mail, make sure you check the box and click save.


    Business Information

    Being accurate with the kind of business you work for (or want to work for) will help us better determine what makes the most sense for how we address you or what we send you.  It also helps us figure out in a more general sense the populations and communities out there, and helps us shape our priorities and where we put our attention (and dollar$).


    Technology Interests

    This is the area that you should revisit on a regular basis.  Seriously.  It had been awhile since I personally looked here, and I found that my profile still said I was interested in Windows Vista, but not at all interested in Server 2012 or Windows 8.  As technologies change, you need to make sure we know what you’re interested in learning more about. 


    Contact Preferences

    It won’t help you if we can’t contact you.  Sure, it’s entirely up to you, but many people are surprised to find that they opted out of contact at some point and never knew it.  If you would like the occasional e-mail about items relating to your business and technology interests, make sure that you at least check the “E-mail Address” box.



    E-mail newsletters on various topics are also available on a regular basis.  Feel free to subscribe to the ones you’re interested in.


    Your Privacy

    Of course, you can go back to the Profile Center anytime and change or update or verify your preferences.  And if you need more details on Microsoft’s Privacy policies, you can find it all here:


    Any questions?  Concerns?  Rants?  Insults?  I can take it.  Put them in the comments.  Smile

  • Kevin’s TechEd Diary – Day 4 (and beyond)

    Dear TechEd Diary,

    Front of the convention center

    This entry is several days overdue, but I hope you’ll understand.  Towards the end of the conference the schedule is tight, and the exhaustion is real.  So rather than taking time out of my day on Thursday or Friday, I thought I’d just take in and enjoy the rest of the week and finish up my diary writings after I’ve had a little rest.

    On Thursday (Day 4) I skipped the first session of the day. 


    Oh.. give me a break.  I needed a little extra sleep after the IT Pro Community Party at the Hughes Hangar on Wednesday night.  (see photos below)  I did get to the conference in time (or so I thought) for “TWC: Malware Hunting with Mark Russinovich and the Sysinternals Tools” (DCIM-B368), but even though I was there 20 minutes before the start, the room was “full” and people were waiting outside to get in.  So instead, I decided to head down to the Hands On Labs, where I re-visited my past life as a Developer and walked through a lab showing me what’s new in the latest version of Visual Studio. 

    After lunch I attended “Case of the Unexplained: Troubleshooting with Mark Russinovich” (WIN-B354).  This time I got there 40 minutes early and managed to get a good seat.  (The room was full soon after.)  This was the 2014 version of Mark’s popular session that he’s done for many years now, where he shows off the Sysinternals tools and how they were used to solve strange workstation and server issues.

    And.. I confess… I skipped the last session of the day and went back to my hotel.  The weather, which through most of the week was very unseasonably cool, was finally warm enough in Houston to enjoy a little time poolside.

    Following a lovely Chinese dinner (we highly recommend the China Garden) with my coworker Jennelle Crothers and MVP and author Ed Horley, the evening party at Minute Maid Park was fun.  Lots of food and drink, and live music stationed around the perimeter.  The Dueling Piano Bar took tips to allow our friend Don Donais to play drums on a few songs.  And the group “The Spazmatics” were a fun 80’s-tribute band.  After the main party was over, several of us went to Pete’s Dueling Piano Bar for more entertainment. 

    Friday was a sleep-in day, followed by leisurely packing, heading to the airport, and sharing a plane home with several of my Minneapolis friends. 

    “So what’s next, Kevin?”

    I’m glad you asked.  I’m going to be planning, facilitating, and participating in a series of blog articles entitled “TechEd 2014 Favorites”, in which I and my coworkers who attended the event will document some of the important announcements and our favorite moments from TechEd 2014.  Watch this blog for more details coming very soon.


    I’ll leave you, dear TechEd diary, with some photos taken at the events described above. 

    IT Pro Community Party

    DJ Joey Snow cranking out the tunes at the IT Pro Community Party

    DJ Joey Snow

    My west coast coworker (and newbie on the team) Jessica DeVita, Simon May, and Ed Lieberman.

    Good Friends

    Mark Russinovich explaining the unexplainable

    Mark Russinovich

    Mark Russinovich

    The drawing at the Adaptiva booth for a Harley Davidson motorcycle.  I didn’t win.  Sad smile 

    Adaptiva Booth

    The little 3’ deep pool shared by the Courtyard and Residence Inn.  It was empty when I got there, but the chairs filled up soon after.  I guess I’m a trend-setter.


    Minute Maid Park on the walk over.

    Minute Maid Park

    Entertainment at the party.

    Minute Maid Park

    Hmm.. Odd that nobody is in the salad bar line.

    Minute Maid Park

    View of the field and main stage


    Minute Maid Park

    Don Donais rocking out with the pianos at the dueling piano bar.

    Don Donais playing with the players

    The Spazmatics

    The Spazmatics

    The Spazmatics

    Pete’s Dueling Piano Bar

    Pete's Dueling Piano Bar

    See you next year!!

  • Kevin’s TechEd Diary – Day 3

    Dear TechEd Diary,

    Well diary, if you ever had any doubts about “the cloud”, they should have been eased a bit if you attended or watched the live stream of Mark Russinovich and Mark Minasi’s “Mark and Mark” cloud discussion this morning. Besides being entertaining, it was very informative.  It’s always great to hear the opinions of someone as important to Microsoft, and to the industry we serve, as Mark Russinovich.  And it’s a good look under the hood, not only at what we’re doing now, but also at hints of what’s coming.  And to me, it’s very exciting.

    Mark and Mark

    The session, if you’re interested, is DCIM-B386 – “Mark Russinovich and Mark Minasi on Cloud Computing”.  At the time of this writing, the recording isn’t yet available.  But it will be soon.

    The talk took the form of an informal interview, with Mark asking Mark the questions. 


    Mark Minasi asked Mark Russinovich questions concerning Microsoft and cloud services, often drilling specifically into Microsoft Azure.  On Azure, they discussed new capabilities, the amazing scale that Microsoft operates at (did you know that last year Microsoft purchased 17% of the world’s servers for its datacenters?), as well as security and data privacy.  It’s well worth a (re)view.

    “What else did you learn about today?”

    I learned more about the fundamentals of some of the new networking capabilities in Microsoft Azure.  Session: DCIM-B305, “What’s New in Microsoft Azure Networking”.  They did a good job of summarizing the many new capabilities, such as site-to-site VPN, multi-site VPN, reserved static public IP addresses, and more.  The demos showed that some of the configuration isn’t quite as straightforward as they’d like it to be; but it’s a great start, and definitely something I’m looking forward to playing with (and blogging about) some more.

    The other highlight of the day was another Mark Russinovich presentation: DCIM-B306, “Public Cloud Security: Surviving in a Hostile Multi-Tenant Environment”.  Mark does a “Top 10” based on the Cloud Security Alliance “Notorious Nine: Cloud Computing Threats”, and gives his impressions on the realities of the list’s items from Microsoft’s perspective.  It’s an eye-opener, for sure. 

    Those two last sessions’ recordings aren’t available yet.  I’ll update this post with links when they’re available.

    “What did you do for fun last night?”

    Ah.. fun?  Who has time for fun?

    “You do.”

    Yep. Smile  Last night’s meal was courtesy of the “Ask the Experts” event.  Later, Veeam threw a very fun party (I am a sucker for good live music.)  And the TechEd Jam Session was also in full-swing at the House of Blues, though I didn’t spend much time there.   ($11 for a tall can of Miller Lite?!  I don’t think so.)

    Tonight I’ll be attending a party thrown at Lucky Strike by Nimble Storage, and then I’m heading over to the Hughes Hangar for the “Windows IT Pro Community Party” (Sponsored by Springboard Series on TechNet).


    Here are some photos from last night and earlier today.

    Food at Ask the Experts

    Veeam Party at the HOB

    Veeam Party at the HOB

    Veeam Party at the HOB

    Veeam Party at the HOB

    Lots of time on escalators

    Device Bar

    Channel 9 area

    Alumni Lounge

    Alumni Lounge


    Now.. off to the parties!

  • Kevin’s TechEd Diary – Day 2

    Dear TechEd Diary,

    My day 2 started much as my day 1 did.

    I'm an IT Hero

    “Too early?”


    Sure, if there is such a thing as “too much fun”, I came close to it.  But even though last night’s Exhibit Hall Reception and the later “MCP Celebration” party at Howl at the Moon were a lot of fun, I managed to get home before midnight.  Plus, there’s not much to do in the area around my hotel.. so that’s another reason to call it a night.

    Today I kicked off the day with a very good session on migration strategies for moving from VMware to Microsoft Hyper-V virtualization.  (Session DCIM-B412, in case you’d like to view the recording).  I have to give these guys credit for doing an excellent job delivering topics that are sometimes confusing in a very easy-to-understand way. 

    Right now, as I type this, I’m waiting for the next session to start.  (“Best Practices Integrating On-Premises Datacenters with Azure IaaS” – DCIM-B330)

    I haven’t decided upon what I want to learn this afternoon.  I’ve got 3 potentially good topics at 1:30pm, 5 at 3:15pm, and 2 at 5pm to choose from.  And no matter what I pick, I bet the words “Mobile” and/or “Cloud” will be used frequently.

    “So, what’s the vibe been this year?”

    A lot of the people I’ve talked to are very impressed and very excited about the transition to a “Mobile First / Cloud First” focus.  Some are still apprehensive, but I think they all understand how important it is for their businesses.  In particular the support we’re adding for management of all of the most popular devices is well-received.  And the improvements to Windows Azure are causing many to consider using it as an extension of their datacenters.

    To close out this blog post, here is some photo-evidence from last night…

    Lots of people gathering lots of swag

    Lots of people gathering lots of swag

    I really want to win this!

    I really want to win this!

    I don’t care so much about winning this.

    I don’t care so much about winning this.

    Howl at the Moon for the MCP party

    Howl at the Moon for the MCP party

    Howl at the Moon for the MCP party

    Other more incriminating photos have been withheld to protect the not-so-innocent. Smile

  • Kevin’s TechEd Diary – Day 1

    Here’s a quick post with some of my photos so far.  Trying out a neat photo album feature by just dragging multiple photos into Windows LiveWriter.

    Included are a couple of shots from the Krewe Meet-n-Greet last night, and random shots of people and food.

    Now I’m off to another session…

  • Kevin’s TechEd Diary – Day 0

    Dear TechEd Diary,

    Smiling happy conference-goers.

    It’s the day before TechEd, which leads into my 3rd favorite week of the year (just behind Christmas/New Year and Easter/Holy week).  My flight appears to be on-time. The day here in Minneapolis will be a good one, weather-wise.  It will be nice in Houston, too.  I haven’t packed much yet, but a quick hour of playing today’s game #1: “Which old conference or funny t-shirts best represent who I am?” is going to be playing out very soon.

    Last night and early this morning I suffered through all the tweets and Facebook messages from friends in #theKrewe.  These are the folks who arrived a day or two earlier than I could and have already had dinners and parties and (I’m assuming) far too many drinks.  Oh, how I wish I were there already!  But not to worry. I’ll catch up with (and to) them this evening.

    Later today I’ll be playing another favorite TechEd travel game: Spot the Geek.  At the gate and on the plane, I’ll notice other funny-or-geeky t-shirts or old conference bags or other tech-logo’d apparel.  And I’ll settle in comfortably, noting, “yes.. these are my people!”  I’d say hello and talk to one of them, but I’m way too introverted for that.  (True!)

    I’ll land in Houston later (defying death again), wait for my t-shirt-carrying luggage, and check social media (or just look around) to see if someone else is around that might want to share a taxi downtown.   I’ll check in to my hotel, drop my bags, and if there’s time I’ll head out to the convention center to register and take a few photos.  (Yes, dear TechEd diary, I’ll share them with you.)  This year I’m not taking my Canon or my Sony; relying instead on my new kick-a** Lumia 1020 for all photos and videos.  I even got the camera-grip / spare battery power thing for extra comfort and extra usage.  Rock-n-Roll.

    “What’s on the agenda for the evening?”

    Tonight is the “Krewe Meet-n-Greet” at the House of Blues.  This leads to playing today’s game #3: “What was your name again?”  I’m horrible with names.  I really do need to work on that.

    But more importantly, which t-shirt should I wear to the party?


    Oh.. PS – Happy Mother’s Day.  Disappointed smile

  • VMware Pro? Gear up on Hyper-V through Microsoft Early Experts

    Are you a VMware Pro?  In an increasingly multi-hypervisor world, more and more IT organizations are using Microsoft Hyper-V and Microsoft Azure to achieve superior performance and workload flexibility at the best possible price. With nearly two-thirds of businesses on more than one virtualization platform, adding Microsoft virtualization and cloud skills to your technical repertoire can improve your career options and prepare you to face new IT demands.

    Join Early Experts! We'll help you get certified!

    What is Early Experts?

    Microsoft Early Experts is a free, online study group for virtualization professionals who want to extend their Microsoft Hyper-V, System Center and Microsoft Azure knowledge with official Microsoft certifications.  We've organized our high-impact learning resources into online Knowledge Quests that include concise videos, prescriptive study materials and hands-on practice with real products.  Complete the weekly quests at your own pace and enjoy the flexibility to stop and review certain topics when you need more time.

    Rewards and Prizes (U.S. Only)

    Complete the online Quests and receive a completion badge suitable for printing or sharing online to showcase your new skills with Microsoft virtualization!  In addition, IT Pros located in the U.S. are eligible to win one of several cool prizes during monthly prize drawings in April, May and June 2014. **

    Rewards, Prizes and Certification: Join Early Experts today!

    If you're located outside of the U.S., you're certainly still welcome to join us and take advantage of the Early Experts study materials to help you prepare for certification.

    The clock is ticking … Join us today!

    Early Experts study groups are forming now for existing VMware / Virtualization professionals targeting the following Microsoft certification:

    Join Us: Become our next Early Expert!

    **NO PURCHASE NECESSARY. Open only to IT Professionals who are legal residents of the 50 U.S. states or D.C., 18+. Sweepstakes ends on June 30, 2014. See Official Rules.

  • Windows 8.1 for Business: Series Wrap-up and Resources

    Windows Blog

    Well.. Now that you’ve had a chance to take in all the great news (//Build) and new updates from Microsoft over the past couple of weeks, we thought it was finally time to wrap up our series for IT Pros entitled “Windows 8.1 for Business”.  For this final article of the series, I’m going to leave you with some useful resources that you should be watching and following regularly (other than our blogs, of course).

    The Build 2014 Site
    Keynote and session recordings are available.  Whether or not you’re a developer, you’ll find some good information there.

    The Windows Client page on TechNet
    This is the starting point for all Windows client information that IT Pros will need to know.  “Presented by the Springboard Series for IT professionals, this site offers technical resources, free tools, and expert guidance to ease the deployment and management of your Windows client infrastructure.”

    The Windows Blog
    ”Scaling Windows Phone, evolving Windows 8.”

    The Windows 8.1 Enterprise Evaluation
    Get the 90-day free trial

    The Springboard Series Blog
    Well worth following, this blog is a great source of information on the latest and greatest.  For example, every IT Pro should read Ben Hunter’s excellent entry – “Windows 8.1 Update: The IT Pro Perspective”

    The Springboard Series Insider Newsletter
    “Get the latest news, resources, tools, and guidance to help you deploy Windows 8.1, migrate from Windows XP or Windows Vista, and manage your existing Windows client infrastructures successfully—and with less effort.“

    And of course, follow @msspringboard on Twitter.


    “Anything else coming up, Kevin?”

    I’ll see you in Houston May 12-15 at TechEd!

  • Doesn’t Windows 8.1 require more hardware than Windows 7? (Series: Windows 8.1 for Business)

    Click here to find the Windows 8.1 Enterprise Evaluation

    Logically, it makes sense.  It seems that whenever a software company comes out with a new version of its product, the added features and functionality require extra hardware.  In the case of operating systems, that seems especially true.  Software vendors are always trying to build their solutions to fit not today’s hardware, but the hardware that will be available tomorrow – forecasting what mass-market hardware specs will look like in order to take advantage of the hardware “state of the art” when their software finally goes on sale.

    So it should follow that Windows 8.1 requires more hardware (memory/cpu/disk) to run than Windows 8.  And Windows 8 required more than Windows 7.  And Windows 7 required more than Windows Vista.  And… you get the idea.

    “And what if that’s what I believe?”

    You’d be wrong.  Sorry.

    For the purpose of this article and our comparison, we’ll just take the basic hardware aspects of a typical PC: Processor, Memory, Available Hard Disk Space, and Graphics Processor.

    | Requirement   | Windows Vista*          | Windows 7                    | Windows 8 & 8.1              |
    |---------------|-------------------------|------------------------------|------------------------------|
    | Processor     | 1 GHz                   | 1 GHz                        | 1 GHz                        |
    | System Memory | 1 GB                    | 1 GB                         | 1 GB                         |
    | Disk Space    | 40GB + 15GB for install | 16GB (32bit) or 20GB (64bit) | 16GB (32bit) or 20GB (64bit) |
    | Graphics      | DirectX-9 & WDDM        | DirectX-9 & WDDM             | DirectX-9 & WDDM             |

    * Windows Vista Home Premium or better.

    There are of course other considerations that may require additional specialized hardware to support specific new functionality or features such as enterprise-class BitLocker or Client Hyper-V.  Click the OS Names in the table above to go to their respective hardware requirements pages.

    But.. do you notice something interesting in that table?  Other than the disk space requirements (which actually went DOWN when going from Windows Vista to Windows 7), the requirements are all exactly the same!  And considering the fact that Windows Vista was released to the world on January 30, 2007, we’re letting you run the latest and greatest PC operating system on hardware that could be 7 years old!

    So as you can see, the hardware that you are running Windows Vista or Windows 7 on very likely will run Windows 8 and Windows 8.1.  Myth = BUSTED
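    To make the table’s point concrete, here’s a minimal Python sketch (purely illustrative, not an official tool; the names `MINIMUMS` and `meets_minimums` are my own) that encodes the Windows 7 and Windows 8/8.1 minimums from the table above (using the 32-bit disk figure) and checks a machine’s specs against them:

    ```python
    # Minimum hardware requirements from the comparison table (32-bit disk figure).
    MINIMUMS = {
        "Windows 7":     {"cpu_ghz": 1.0, "ram_gb": 1, "disk_gb": 16},
        "Windows 8/8.1": {"cpu_ghz": 1.0, "ram_gb": 1, "disk_gb": 16},
    }

    def meets_minimums(os_name, cpu_ghz, ram_gb, free_disk_gb):
        """Return True if the given specs meet the OS's published minimums."""
        req = MINIMUMS[os_name]
        return (cpu_ghz >= req["cpu_ghz"]
                and ram_gb >= req["ram_gb"]
                and free_disk_gb >= req["disk_gb"])

    # A typical 2008-era PC that ran Windows Vista comfortably:
    print(meets_minimums("Windows 8/8.1", cpu_ghz=2.0, ram_gb=2, free_disk_gb=40))  # True
    ```

    Any machine that cleared the Vista bar clears the Windows 8.1 bar too, which is exactly the myth-busting point of the table.
    
    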

    “But wait a second… Doesn’t Windows 8.1 require a touch screen?”

    Nope.  In fact, my friend and coworker Keith Mayer addressed that very question just yesterday in his post for this series.

    “What about screen resolution?  What’s required?”

    That’s a valid point.  Minimum screen resolutions do differ a little.  It’s one of those “if you want to take full advantage of the capabilities of the OS” sorts of issues.  If you have a really old monitor that can’t do anything better than 800x600, then you’re probably already suffering with Windows Vista or Windows 7, and Windows 8 won’t work with anything less than 1024x768.  Something to consider, for sure.

    “What about performance?  Can I expect my PC to be faster or slower with the newer operating system?”

    Ah.. that’s a very subjective topic.  “Your mileage may vary”, mainly because everyone has certain things that they wish would be faster, as well as various things that many of us unwittingly do to our systems that actually slow them down over time. 

    The good news is that a lot of work has actually gone into improving the performance of Windows, particularly the things we typically find ourselves waiting for.  A perfect example is system startup and shutdown, one of the most obvious time-wasters in older operating systems.  Even with technologies such as sleep and hibernate available, the majority of people still prefer to shut their computers down completely at night or when they’re not using them.  So a lot of focus was given to improving this aspect of performance in Windows 8 and beyond. 
    For an exhaustive walk-thru of how this was done in Windows 8, check out this article on the Building Windows 8 blog: Delivering Faster Boot Times in Windows 8

    Bing it!

    For even more well-documented examples of performance improvements beyond Windows 7, check out all the great content via this Bing search: “windows 8 performance improvements

    And for good tips and tricks on how to optimize your performance in general, check out this article: Optimize Windows for Better Performance


    So in conclusion: if you were thinking that you’d need to buy new hardware – replacing what’s already happily running Windows Vista or Windows 7 – in order to get most of the benefits of Windows 8.1, hopefully you can now see that a simple OS upgrade is another option.


    If you’re interested in evaluating Windows 8.1:

    If you’d like to purchase an upgrade to Windows 8 (which includes the free upgrade to Windows 8.1), check out the Windows Store.

    And if you’d like to donate $1000 towards sending your favorite IT Pro Technology Evangelist to Microsoft TechEd , contact me through this blog and I’ll send you my mailing address. 
    (My Mom always said, “It never hurts to ask.”) Smile


    This article is part of our March 2014 series of blog articles entitled “Windows 8.1 for Business” by your Microsoft Technology Evangelists and guests. 
    For the full list of articles in this series please visit the series landing page:

  • Can I use Windows 8.1 without a touchscreen?

    Get the Windows 8.1 Enterprise Evaluation and others by clicking here.

    Even if you’ve never tried Windows 8 or Windows 8.1, you may have seen the advertisements showing off the lovely colorful “touch-first” interface (previously known as the “Metro” UI).  Even some people using a traditional old non-touch laptop will reach out and touch the screen when they first encounter the start screen.  (And by “some people”, I mean my wife while using my old Lenovo laptop on vacation a couple of years ago.)  So, as lovely and colorful as the start screen is, it’s easy to assume that you might need – or at least be most happy with – a device with a screen you can touch in order to get the most out of the latest version of Windows.

    “Isn’t that the case?”

    I’m going to argue that it’s not.  Right this very minute, for example, I’m using my Surface Pro as my work device of choice…

    “Ha! Touchscreen!”

    Yes, a lovely touchscreen which is currently sitting flat and closed on top of my desk, while my Surface is plugged into my KVM switcher connecting me to my big Acer monitor and ultra-durable Microsoft MultiMedia Keyboard 1.0A and Microsoft Wireless IntelliMouse 2.0.  (Gotta love that name.)  Can I touch my screen?  No.. I can’t even reach it from where I’m sitting.

    “And you’re just as happy using just your keyboard and mouse?”

    I’ll be honest.  When I’m using new Windows 8 apps, sometimes I’d rather touch them.  And some of those apps are really better suited for hands-on navigation or gameplay anyway, which is great when I’m sitting in my TV chair or on an airplane.  And when simply browsing the Internet using IE 11 I do prefer to be able to swipe and pinch and zoom and touch with my fingers.  But for most of my work, I’m still in desktop apps, where typing and precision selections with my mouse are still the best way to get stuff done.

    “Okay, Kevin.. I’m almost convinced.  Show me how to work Windows 8.1 without a touchscreen.”

    That’s where my friend Keith comes in!  Keith Mayer is the author of today’s article in our “Windows 8.1 for Business” (or “Why you’re wrong about Windows 8.1”) series.  In his write-up, he goes further, giving real examples, tips, and tricks to show how Windows 8.1 is the better choice for ALL devices, whether or not you put fingerprints on your screen.



    This article is part of our March 2014 series of blog articles entitled “Windows 8.1 for Business” by your Microsoft Technology Evangelists and guests. 
    For the full list of articles in this series please visit the series landing page:

  • Why can’t I just use the desktop in Windows 8.1?

    Well, you can.  In fact, my friend and coworker Jennelle Crothers explains it all for you in today’s article in our “Windows 8.1 for Business” blog series.  She writes…

    Click here to find valuable Microsoft Evaluation Software.  “Ask anyone who uses a computer for everyday work tasks and they might say that they LIVE on the desktop and can’t be bothered with the new modern start menu and interface of Windows 8.1. I’ll tell you that I also live on my desktop.”

    “I use Outlook, Word, OneNote and Excel, Lync, LiveWriter and IE 11 for a crazy number of line-of-business applications for work.  For native apps, I tend to find myself in the PDF Reader or the native mail app to check personal email. For most of the social media I consume, I use apps for Twitter, Facebook and Yammer. I think the default full screens used by native apps are great for viewing and interacting with my friends, watching video and reading news.”

    Make sure you check out her entire article, which includes tips on how to make it easier for you and the users you support to go directly to the ol’ familiar desktop.



    This post is part of our March 2014 series of articles entitled “Windows 8.1 for Business” by your Microsoft US IT Pro Technology Evangelists and guests. 
    For the full list of articles in this series please visit the series landing page:

  • Do you really need a start menu?

    Try the Windows 8.1 Enterprise Evaluation for Free.  One of the reasons some people weren’t comfortable with Windows 8 from the beginning was the lack of their familiar Start menu.  In today’s article in our “Windows 8.1 for Business” series, my friend Matt Hester delivers the run-down on the benefits of the Start Screen vs. the old Start Menu, and provides some useful tips and workarounds to help you adjust to this big change. 

    And as an added bonus, Matt even gives us a “Fun Super Tip” – a built-in way to add something very similar!


  • Blog Series: Windows 8.1 for Business

    Windows 8.1 Powers Business.  Welcome to March!  And not that I mean to alarm you, but welcome to the final month before support ends on Windows XP.  I know that many of you supporting IT and devices for your businesses have known this for a while, and are either already done or continuing to work on migrating to Windows 7 or Windows 8.  But which one, and why?

    What’s interesting to me is that there is a lot of fear, uncertainty, and doubt (FUD) surrounding Windows 8.1 and whether there is any real benefit to providing and supporting it as the default, best choice for business devices.  And while I know that most of you have indeed done proper due diligence in coming to the conclusion that Windows 7 is a better choice for your businesses, it just may be that not all of your information was based on fact, or that it was missing some very important beneficial tidbits which, had you known them, might very well have changed the equation.

    That’s the purpose of this March blog series: “Windows 8.1 for Business”.  We, the nine Microsoft Technology (IT Pro) Evangelists in the US, plus a few special guest authors, want to take this month to help dispel some myths and provide some useful resources for you as you evaluate (and hopefully choose) Windows 8.1 as your business desktop/laptop/tablet/phablet platform of choice.

    Below is our schedule, which will be continually kept up-to-date with links to completed articles as they become available.  Stop back often, because we sincerely want you to benefit from this information.  And if you have any questions or comments, please please please post them in the comments either here, or at the articles themselves.

    UPDATE: Thank you for your patience!  Due to the importance of the topics we are going to cover, we’ve had to delay posting to this series.  We will continue soon (the week of March 17), and I’ll add items to the schedule as soon as we’re sure of their availability.  Keep watching…

    All the best!
    Kevin Remde




    • March 3 – Series Introduction (this article) – Kevin Remde / @KevinRemde
    • March 4 – Oh Start menu, how do I miss thee…or do I? – Matt Hester / @MatthewHester
    • March 5 – Beloved Desktop, Where Art Thou? – Jennelle Crothers / @jkc137
    • March 6 – Windows 8 works great without a touch screen – Keith Mayer / @KeithMayer
    • March 7 – Does Windows 8.1 require more hardware than Windows 7? – Kevin Remde / @KevinRemde
    • March 19 – Getting started with Client Hyper-V – Matt Hester / @MatthewHester
    • March 20 – Is the “Cloud” a really big deal? – Blain Barton / @Blainbar
    • March 21 – Remember Our Good Friend Group Policy – Matt Hester / @MatthewHester
    • March 28 – Build No-code Business Apps with Windows 8.1, Project Siena and Microsoft Azure – Keith Mayer / @KeithMayer
    • March 31 – Build No-Code Business Apps with Windows 8.1, Project Siena and Microsoft Azure (Part 2) – Keith Mayer / @KeithMayer
    • April 1 – Build No-Code Business Apps with Windows 8.1, Project Siena and Microsoft Azure (Part 3) – Keith Mayer / @KeithMayer
    • April 8 – Top 5 Key Security Improvements – Anthony Bartolo / @WirelessLife
    • April 10 – XP EOS: Guidance for Small/Medium Businesses and Individual Consumers – Pierre Roman / @PierreRoman
    • April 14 – Series Wrap-up and Resources – Kevin Remde / @KevinRemde
  • The Case for the Offline Backup

    To evaluate some of the software discussed in this article, click here.  Over the past several weeks, my teammates have all contributed to a very valuable series of blog articles entitled “Disaster Recovery Planning for IT Pros”.  They’ve covered topics such as how to get started in planning, server virtualization and how it applies to disaster recovery, and testing your recovery plans.  And they’ve discussed technologies that can help – the tools in your DR tool belt – such as Hyper-V Replica, Windows Azure, and the newer Windows Azure Hyper-V Recovery Manager.

    For the full list of excellent articles, CLICK HERE.

    “What is an Offline Backup?”

    Before I dive in here, I want to be clear about what I’m covering.  For the purpose of this discussion, I’m not talking about tapes or off-site storage of backed up data.  That’s something more commonly called an archive.  Regular storage and archival for recovery from past history is an important (and big) topic in-and-of-itself; perhaps the topic of another blog series for another day.  For this article, however, I’m talking about having a copy of some important digital asset that was saved in a way that can safely and fully be recovered as a complete unit, in case the original location is unable to house that asset.  (Yeah.. a disaster.)  That digital asset could be a server OS installation, a directory, a database, a virtual machine, a desktop image, file storage, an application; really whatever you consider valuable and worth the effort (and cost) to have protected in a way that can be quickly restored if the worst should happen. 

    “Do I really need an offline copy these days?”

    That’s a fair question.  With all of the excellent (and many now built-in and included) technologies in modern operating systems such as Windows Server 2012 R2, it could be argued that you don’t really need to create backups of some items.  A virtual machine will start running on another cluster node if the hosting node fails, and the storage supporting that machine could be on an always-available file server cluster (Scale-Out File Server), with redundant paths to the storage, and supported by arrays of disks that, if they or their controllers fail, are redundant and easily replaced even while the data continues to be available.  (And I haven’t even touched on application availability within a virtual machine or the benefits of virtualization guest clustering.) 

    But even with all of this great technology, not all data or files or applications are equally important, and not all are worth the same amount of investment to ensure their availability and – important to our DR topic – their recovery in case of a really bad thing (disaster) happening.

    The case for the offline backup will be determined by these factors:

    • The importance of that data (The RPO and RTO)
    • The technologies you’re willing to invest in to support continuous availability

    “How important is your data?”

    As part of the planning process (which Jennelle introduced to you early in our series), you’ll take an inventory of all of your digital assets, and then make a prioritized list of those items.  The priority should run from MOST critical (i.e. “my business can’t function or survive without this”) to LEAST critical (no big deal / can rebuild / etc.).  Now, going on the assumption that at some point your datacenter is turned into a “steaming pile”, you’ll draw a line through your list.  Items above the line are critical to your business overcoming the disaster.  Items below the line.. not worth the investment in time or effort.  (Note: that line will shift up or down as you work through this, as you get into actually figuring out the costs associated with your plan, and importantly as you re-evaluate your disaster preparedness on a regular schedule.)
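    The “draw a line” exercise can even be sketched in a few lines of code.  Here’s a minimal illustration – the asset names, criticality scores, and cutoff value are all hypothetical, purely to show the idea of ranking and splitting the inventory:

    ```python
    # Sketch of the inventory-and-prioritize exercise: rank digital assets
    # by criticality, then "draw the line" above which assets justify the
    # investment in offline backup. All names and scores are made up.
    from dataclasses import dataclass

    @dataclass
    class Asset:
        name: str
        criticality: int  # 1 = can rebuild easily .. 10 = business can't survive without

    inventory = [
        Asset("Customer database VM", 10),
        Asset("File server shares", 7),
        Asset("Build server image", 4),
        Asset("Test lab VMs", 2),
    ]

    CUTOFF = 5  # "the line" -- expect it to shift as you cost out your plan

    ranked = sorted(inventory, key=lambda a: a.criticality, reverse=True)
    protect = [a.name for a in ranked if a.criticality >= CUTOFF]
    below_line = [a.name for a in ranked if a.criticality < CUTOFF]

    print("Protect:", protect)
    print("Below the line:", below_line)
    ```

    The point isn’t the code itself, of course – it’s that the line is a number you choose, and one you’ll revisit as costs and priorities change.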

    For each of your inventoried and prioritized digital assets you’re also going to be defining a couple of objectives – the Recovery Point Objective (RPO) and the Recovery Time Objective (RTO).

    The RPO is the answer to the question: “How much data can I afford to lose if I need to recover from an offline copy of that data?”  In its simplest terms, it determines how often I make a new offline copy of that asset.  An example in Hyper-V Replica would be the setting that determines how frequently a new set of changes is replicated to the virtual machine’s replica copy.  If I’m replicating changes every 5 minutes, then at most I could lose up to 5 minutes of changes should the worst happen, so my RPO is 5 minutes.  Is that good enough?  Maybe.  It depends on what that virtual machine is actually doing for you.
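    That arithmetic can be written down directly.  A tiny sketch (the intervals are hypothetical examples, not any product’s actual API): worst case, you lose one full interval of changes between copies, so a schedule meets the RPO only if the copy interval is no longer than the RPO.

    ```python
    # Sketch of the RPO check described above: worst-case data loss equals
    # the interval between copies, so the interval must not exceed the RPO.
    from datetime import timedelta

    def meets_rpo(copy_interval: timedelta, rpo: timedelta) -> bool:
        """True if a copy made every `copy_interval` satisfies the RPO."""
        return copy_interval <= rpo

    # Hyper-V Replica-style 5-minute replication against a 5-minute RPO:
    print(meets_rpo(timedelta(minutes=5), timedelta(minutes=5)))   # True
    # A nightly backup against a 1-hour RPO:
    print(meets_rpo(timedelta(hours=24), timedelta(hours=1)))      # False
    ```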

    The RTO describes how long you’re willing to wait to bring that digital asset back online.  If I’m still doing all of my backups to tape, and then shipping those tapes offsite, it’s going to take a lot longer to recover at a new location than it would, say, to take advantage of another site and/or Windows Azure to host your stored backups.  Can I afford to wait a day or two?  An hour?  A few minutes?  How critical that asset is to your business continuity will help you set a desired RTO for that item.

    “How much are you willing to spend?”

    Again, there are expensive options and there are cheaper options for meeting your RPO and RTO, and you’ll ultimately base how much you’re willing to invest on the relative importance of each digital asset in bringing your company back quickly (or reasonably quickly) from disaster. 

    “And once I’ve implemented everything, I’m done?”

    Of course not.  You’ll regularly test your recovery.  And less frequently – but still critically – you’ll occasionally re-evaluate your priority list and the methods for meeting your objectives.  Of all people, IT Pros know how quickly technology evolves.  What seemed like a good, solid plan and a decent implementation of tools last year may not fit as well today now that newer/better/faster/cheaper options are available.  And that’s not even considering the shifting nature of your own environment, the servers, the applications, the growth of data.. all need to be re-considered on a regular basis.

    In conclusion…

    There is definitely a case for offline backup.  What and how you do that backup will be defined by you, based on priority, and adjusted by cost.  And making those decisions and implementing your plan isn’t the end of the process.  You must revisit, re-inventory, re-prioritize, adjust and test your plans on a regular schedule. 


    This post is part of a 15-part series on Disaster Recovery and Business Continuity planning by the US-based Microsoft IT Evangelists.  For the full list of articles in this series see the intro post located here: