Kevin Remde's IT Pro Weblog
The following article is part 4 of our many-part series, “Hybrid Cloud for IT Pros”. Click HERE often for the ever-growing full list of articles in this series.
At my IT Camp events, when discussing this topic, I’ll often ask my IT Pro friends in attendance the following questions; usually with the following results:
“How many of you have played with Windows Server Backup, the built-in file backup and recovery utility?” About 50%-75% of the hands go up.
“How many of you are using Windows Server Backup as your main server file-system backup tool?” Maybe one or two hands go up. And we all laugh. I’m not surprised! It’s a very simple tool, and maybe it didn’t do everything we needed for things such as long-term archiving and off-site storage, so we went with other value-add providers. But still, if you simply want to create a backup schedule and save multiple recovery points, Windows Server Backup is still a nice solution.
Wouldn’t it be great if there were a way to take that simple capability and, rather than storing backups on another local storage device, store them directly in the cheap and always-available cloud storage that is Microsoft Azure?
“Yeah! I’d love that!”
Your wish: GRANTED.
Azure Backup is Microsoft’s Windows Server Backup – cloud-ified. At the heart of it, it involves an Azure subscription, a storage account, a credential that allows the service to trust the server (or client), and an agent installed on your local server (or client).
“Kevin.. you keep saying ‘(or client)’. Are you saying that this Azure Backup can also back up files from Windows client operating systems?”
That’s right! A recent update to this capability was to allow the backup not only of all supported server operating systems (Server 2008 R2 and newer), but also of client operating systems from Windows 7 through Windows 10.
“Very cool! But how do I set it up?”
Here is my very own Step-by-Step guide, just for you: Step-by-Step: Windows Azure Backup
This post contains Lab 6 of the 5 labs created for our current set of US DX IT Camps. Yeah.. this is one I just added for good measure.
The complete set of labs is listed here:
1. In the Azure management portal, click NEW.
2. Click COMPUTE, click VIRTUAL MACHINE, and then click FROM GALLERY.
3. In Choose an Image, click UBUNTU, click Ubuntu Server 14.04 LTS, and then click the Next arrow.
4. Create a new virtual machine using the values in the following table, and then click the Next arrow.
5. On the Virtual machine configuration page, in CLOUD SERVICE, select itcservice<ID>.
6. In STORAGE ACCOUNT select itsstore<ID>.
7. In ENDPOINTS, in ENTER OR SELECT A VALUE, select REMOTE DESKTOP, and then click the Next arrow.
8. Click the Complete icon.
a. The virtual machine will take a few minutes to create. Depending on the load this may take between 5 and 25 minutes.
b. Wait for the new virtual machine to finish before proceeding.
1. On your local workstation you’ll need some files from our AzureManagement.zip file.
a. Using Internet Explorer, download https://itcmaster.blob.core.windows.net/fy15q3/AzureManagement.zip and extract it to create an \AzureManagement folder (either at the root of C:\, or on your desktop).
b. NOTE: The above URL is Case Sensitive!
2. In \AzureManagement, double-click PuTTY.exe.
3. In Host Name (or IP address), type ITCService<ID>.cloudapp.net, and then click Open.
a. <ID> is your unique id.
4. In the PuTTY Security Alert dialog box, click Yes.
5. Log on as AzureUser using Passw0rd! as the password.
a. You are now logged on to your new Linux VM using SSH.
6. Type the following commands, pressing ENTER after each one. This set of commands will add a desktop and enable RDP. Confirm each command as needed.
sudo apt-get update
sudo apt-get install ubuntu-desktop
a. Enter Y when prompted.
b. This process will take up to 30 minutes or longer. You can allow this to run in the background and come back later. This VM will not be used again.
sudo apt-get install xrdp
sudo /etc/init.d/xrdp start
c. This last command ensures the xrdp server is started, as it does not always start automatically.
7. Now you should be able to go back to the Azure portal, select your Linux01 virtual machine, and connect to it using RDP.
Azure Backup “is a simple and reliable data protection solution which enables customers to back up their on-premises data to Microsoft Azure.”
Remember Windows Server Backup? Well, like that perennial utility, this one backs up from or restores items to your Windows File System.
“But this one backs up to or restores from Azure?”
Bingo. And as of December, 2014 it also supports backup and restore of files on Windows Client (7, 8, 8.1 and on up) operating systems.
Here’s what we’ll do in this Step-by-Step guide:
Remember: If you just want to try this out without purchasing or using an existing Azure subscription, you can easily set up a free trial.
Note that, to start, you might want to be doing these steps from the server or workstation that you want to configure for backing up files. You’ll be downloading credentials and agent, installing the agent, and registering the machine against your Azure subscription, all from that server or workstation, so you may as well start these steps from that machine.
Open a browser, and go to your Azure subscription (http://manage.windowsazure.com). In the left-hand column, scroll down to find Recovery Services.
Select Recovery Services, and click New (the “+” mark) at the bottom left of your browser. This will contextually place you into the New / Data Services / Recovery Services area.
Select Backup Vault, and then Quick Create.
The only two things to configure here are a useful name for your vault, and where in the world you want it stored. Note: You may notice that not all of our data center regions support hosting backup vaults. The list of regions may change over time.
Click Create Vault, and after about 10-15 seconds you’ll have your new backup vault ready to use.
By clicking on the name, you’ll enter into your vault’s quick start page.
In the first versions of Azure Backup, establishing the trust between the vault and the server to be backed up required generating and installing a certificate, exporting it, and uploading it to the vault. More recently, however, we’ve made it very easy for you. You’ll simply download the vault credentials from within your account, and that file will be used to establish the trusted connection; either for initially registering the server or workstation, or to recover items to a new machine.
On the quick start page, click Download vault credentials. You will be prompted to open or save a file of type .VaultCredentials.
Save it somewhere you’ll remember, on your machine to be backed up. Note: Treat this file with care. It’s a file that you don’t want to let get into the wrong hands.
On the quick start page, click the link to download the backup agent that you require.
Notice that the same agent you can use to natively back up a workstation’s or server’s files is also the one used by System Center Data Protection Manager (SCDPM), which can also back up to or restore from Azure.
Save the MARAgentInstaller.exe and run it later, or simply run it at this point if you’re already on the machine you want to back up files from.
I choose to run it from right here. You will have some very basic configuration choices when you install the agent.
For my needs, I’m going to just take the defaults, and don’t have any special proxy to get me to/from the Internet. Notice that the agent installation will also install any required components (.NET Framework 4.5) or software (Windows PowerShell) that might be missing.
When the installation is done, you can either close the installer or “proceed” to the “registration”. I’m going to proceed.
When you do that (or when you open the agent for the first time), you’ll be asked to provide the vault credentials file.
Browse to and select the file you saved earlier, then click Next.
On the Encryption Setting page, you’ll either generate, or enter your own Passphrase.
I choose to have the tool generate a long passphrase for me, and I’ll just save it to my desktop folder for now.
When I click Finish, the tool registers this “server” (I’m actually running this on my Surface Pro 3 running Windows 10 preview) with my backup vault.
And when it successfully completes, I see this:
Leaving the “Launch Microsoft Recovery Services Agent” checkmark checked and clicking Close will launch me into the recovery agent.
Note the options in the Actions pane on the right. We’ve already registered the server, but as the little alert in the main pane reminds us, we haven’t yet scheduled anything to backup. So let’s do that now.
Schedule your Backups
I have a folder full of very important stuff. For this demonstration, it’s right on my C:\ drive in the “Very Important Stuff” folder, with an important file called An Important File.txt
In the Microsoft Azure Backup tool, in the Actions pane, I will click on Schedule Backup.
Clicking Next on the Getting Started page brings me to the Select Items to Backup page.
This is what I use to add, browse to, and select folders or items to backup. Notice that I could also use this to exclude certain files by file types.
I’ve selected the C:\Very Important Stuff folder, which is all I need to backup for now. Click Next.
On the Specify Backup Schedule page, notice that I can choose to do my backups daily at a certain time, or weekly, being able to select the days and times to perform the backup.
I’ll just do my backup daily at 4:30am. Click Next.
On the Select Retention Policy page, we have some pretty flexible options for retaining our backed-up data for longer periods of time. In my case, not every daily backup needs to be saved for several years, but maybe just the backup that I take the Saturday of the last week of March, which I want to save for 10 years.
On the Choose Initial Backup Type page, I can choose to do my first backup of my files automatically over the Internet, or in an “offline” way, automating the pull of the first set of data from an existing Azure storage location. For our simple sample, we’ll just do our first backup over the Internet. Click Next.
And on the Confirmation page, I verify that all is as it should be. Clicking Finish creates my backup schedule.
Note: We haven’t yet launched any backup! If I left it all now, the next backup would happen based on my schedule. But I’m going to click the “Back Up Now” option in the Actions pane.
On the resulting Confirmation page, I click Back Up. And then I can click Close at any time, because the job has been launched and will run for you in the background – even if you close the Microsoft Azure Backup console.
But I’ll leave the console open and watch the status of my job change…
…and in fairly short order (because this was a pretty small backup), I see this…
And I’m quite relieved that my very important stuff is now safely tucked away in my cloud recovery services backup vault.
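Incidentally, the backup agent also installs a PowerShell module, so an ad hoc backup like the one I just ran from the console can also be kicked off from a script. Here’s a minimal sketch, assuming the agent is already installed and registered:

```powershell
# The Azure Backup agent ships the MSOnlineBackup PowerShell module.
Import-Module MSOnlineBackup

# Get the backup policy (schedule and items) registered on this machine...
$policy = Get-OBPolicy

# ...and kick off an ad hoc backup job using that policy,
# just like clicking "Back Up Now" in the console.
Start-OBBackup -Policy $policy
```

The job runs in the background the same way it does when launched from the console, so the script can exit once the job is started.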
Back in your Azure subscription and in the Recovery Services section, let’s look into my SampleBackupVault and see what we can see there…
On the DASHBOARD tab, I can see that I have one “server” registered, and currently 0 GB protected. (It was a pretty small file, so I’m not surprised that it didn’t register here.)
On the REGISTERED ITEMS tab, I can see my one machine of type “Windows server” currently listed and registered.
Notice that this is also where I could delete any old or no-longer-needed server registrations.
On the PROTECTED ITEMS tab I can see some basic information about what I’ve protected; a file folder currently with only one recovery point available.
It also shows what the most recent recovery point and time are.
Oh no! Someone deleted my file! (Hint: It was me.)
No worries! Go back into the Microsoft Azure Backup console and click Recover Data in the Actions pane.
On the Getting Started page, notice that I can specify if I’m currently on the machine where the backup was originally taken (and therefore is already registered with the Backup Vault), or if I’m on a new machine that doesn’t yet have the vault credentials – in which case I’d be given the opportunity to point to a downloaded .VaultCredentials file.
Since I’m on the machine where the backup was taken, I’ll just click Next.
On the Select Recovery Mode page, I can choose to either Browse for my files, or search for them.
I would pick search if I knew there were a very large list to go through. But in my case, it’s just one file, so I’ll just browse. Click Next.
On the Select Volume and Date page, you use the drop-down to pick the volume from which your backup was taken, and then you’re presented with a calendar with some dates in bold representing points in time when you’ve completed past backups.
In our sample, I’ve only done the one backup, so that’s the only point I can recover to. I’ll select it, and click Next.
On the Select Items to Recover page, I can browse to my Very Important Stuff folder and see the files that were backed up from it.
I’m happy to see that my important file is there, so I’ll select it and click Next.
On the Specify Recovery Options page, I have some choices about whether or not I want to restore to the original location, or how to handle duplicates. I can even choose to restore (or not) the Access Control List (ACL – the security permissions) that were associated with the original file.
I’ll leave these defaults and click Next.
I verify that all looks good on the Confirmation page, and click Recover. The recovery starts, and I can close this window because the recovery job is now running for me.
Back in the Microsoft Azure Backup console I can see that my recovery job has completed successfully…
…and.. Hooray! My file is back!
So.. that’s about it!
What do you think? Go ahead and share your comments / questions / concerns / rants in the blog comments.
The following article is part 3 of our many-part series, “Hybrid Cloud for IT Pros”. Click HERE often for the ever-growing full list of articles in this series.
“Hey Kevin, I’d like to take advantage of putting application servers up in a virtual network in Azure. But I need a domain controller for my application to work. Can I put one in my virtual network?”
Absolutely! There’s no reason you can’t build a server, install AD Domain Services, and have it either as the new domain controller in a new forest, or as another domain controller in an existing forest – provided you can get to the other domain controllers through Site-to-Site VPN Gateway or ExpressRoute.
As a matter of fact, in our current set of content for our US DX IT Camps happening across the country, our Hands-on-Labs have our guests using their own (or a trial) Azure subscription to create a network and then populate it with a Domain Controller (among other machines). If you want to try out just building a Domain Controller on a virtual network in Azure, I suggest you run through at least the first two of our labs:
If you don’t already have an Azure subscription, sign up for a FREE TRIAL HERE and give the labs a try.
Finding our series useful? I hope so! Feel free to share or ask anything you’d like in the comments.
This post contains the appendix information for the hands-on-labs created for our current set of US DX IT Camps.
In this task, you will use Windows PowerShell to install and configure Active Directory on DC01. To perform this task, you will use Windows PowerShell ISE as an Administrator.
To connect an RDP session to your DC01virtual machine:
From within your RDP session to DC01:
Before you can manage virtual machines from PowerShell on your local administration station you need to download the tools.
This post contains Lab 5 of the 5 labs created for our current set of US DX IT Camps.
In this task, you will configure the required public endpoint mappings for WEBFE01.
Perform the following tasks in the Azure management portal.
Next, you must enable WEBFE01 to communicate internally within the service. While general IP connectivity is provided by DHCP, both servers are workgroup members and have the public firewall profile enabled. In this task you will open firewall ports and enable PING traffic on WEBFE01.
Perform the following tasks in an RDP connection to WEBFE01.
In this task, you will use Windows PowerShell remoting to install Internet Information Services on WEBFE01. To perform this task, you will use standard Windows PowerShell remoting and administration commands; however, you must first install the Windows PowerShell remoting self-signed certificate that is installed on your WEBFE01 VM. This is because Windows PowerShell remoting relies on HTTPS connections by default.
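The remoting pattern looks roughly like the sketch below. The host name and port here are illustrative; a common alternative to importing the VM’s self-signed certificate locally is to relax the certificate checks in the session options:

```powershell
# Prompt for the VM's local administrator credentials.
$cred = Get-Credential

# Skip CA/CN validation because the Azure VM presents a
# self-signed certificate for WinRM over HTTPS.
$options = New-PSSessionOption -SkipCACheck -SkipCNCheck

# The Azure endpoint maps a public port to WinRM HTTPS (5986).
Invoke-Command -ComputerName 'itcservice.cloudapp.net' -Port 5986 `
    -UseSSL -Credential $cred -SessionOption $options -ScriptBlock {
        # Install IIS with the management tools on the remote VM.
        Install-WindowsFeature -Name Web-Server -IncludeManagementTools
    }
```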
Establish an RDP session to your SQL01 server:
From within your RDP session on SQL01:
Install the Azure PowerShell Extensions on SQL01:
Open Windows PowerShell ISE as Administrator.
We now need to enable the Azure PowerShell commands. Click in the run pane (bottom), type the “Import-Module Azure” command, and then press ENTER.
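Once the module is loaded, you also need to connect it to your subscription before any of the Azure cmdlets will work. A minimal sketch:

```powershell
Import-Module Azure

# Either authenticate interactively with your organizational
# or Microsoft account...
Add-AzureAccount

# ...or download a publish-settings file once and import it:
# Get-AzurePublishSettingsFile
# Import-AzurePublishSettingsFile 'C:\path\to\file.publishsettings'

# Confirm which subscription the cmdlets will run against.
Get-AzureSubscription | Select-Object SubscriptionName, IsCurrent
```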
In this task, you will deploy a sample site. The sample web site simulates the types of tasks the Contoso production application performs, and will prove that the Azure infrastructure meets the base technical requirements of the production system.
Perform the following tasks in RDP sessions to WEBFE01.
<connectionStrings>
  <add name="AdventureWorksConnection" connectionString="data source=SQL01;initial catalog=test;user id=DataManagementApp;password=Passw0rd!;multipleactiveresultsets=True;application name=EntityFramework" providerName="System.Data.SqlClient" />
</connectionStrings>
Congratulations! Play around with the various portions of the web site, and verify that you have full SQL Server connectivity.
When you’re done with the labs, don’t forget to shut down your virtual machines from within the Azure Portal, so that you’re not using up compute/hour $$’s.
This post contains Lab 3 of the 5 labs created for our current set of US DX IT Camps.
In this section you will create a new virtual machine to host the web application. You can create this VM using quick create; however, that will not enable you to specify the service or storage, and will create separate storage and services for this VM. You will use the gallery option to ensure you can specify the storage and services for the machine.
Perform the following tasks in the Azure management portal:
VIRTUAL MACHINE NAME
NEW USER NAME
NEW PASSWORD and CONFIRM
While the web server is being created, let’s go setup some defaults for SQL Server. You would never want to store SQL Data on the system drive, so the first thing we will do is add an additional disk that will be used for holding the SQL Server Data. We will create a single simple drive but you could create multiple drives and use storage spaces as an alternative. See the Lab Appendix for details.
Click the Check Mark button to create and attach the new virtual hard disk to the virtual machine.
Now let’s connect a remote desktop session to SQL01
Now from the Remote Desktop console of SQL01 we’ll create a new partition on the additional data disk attached above and format this partition as a new F: NTFS volume. After formatting this new volume, you’ll create the following folders:
Once inside Server Manager, go to Tools (upper right corner menu) then select Computer Management.
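If you prefer scripting to the Computer Management GUI, the disk preparation above can be sketched with the Storage module cmdlets (the volume label and folder names here are illustrative):

```powershell
# Bring the new raw data disk online, partition it as F:, and format it.
Get-Disk | Where-Object PartitionStyle -eq 'RAW' |
    Initialize-Disk -PartitionStyle GPT -PassThru |
    New-Partition -DriveLetter F -UseMaximumSize |
    Format-Volume -FileSystem NTFS -NewFileSystemLabel 'SQLData' -Confirm:$false

# Create folders for data, logs, and backups (hypothetical names).
New-Item -ItemType Directory -Path 'F:\Data','F:\Logs','F:\Backups'
```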
Now, you will update the database’s default locations for DATA, LOGS and BACKUP folders.
1. To do this, right-click your SQL Server instance name SQL01 (upper left corner) and select Properties.
2. Select Database Settings from the left side pane.
3. Locate the Database default locations section and update the default values for each path to point to the new folder paths created above. Click OK.
4. Right-click SQL01 and select Restart. In the popup that asks Are you sure, select Yes. If you go back into Properties, you should see that the change took place.
Close SQL Server Management Studio.
In this task, you will import the testing database provided by your development team. You will then create a user account that will be used by the web front end to access the data in the database.
Perform the following tasks from within an RDP connection to SQL01.
In SQL01, open SQL Management Studio.
Next, you must enable WEBFE01 and SQL01 to communicate internally within the service. While general IP connectivity is provided by DHCP, both servers are workgroup members and have the public firewall profile enabled. You will enable SQL Server traffic and PING traffic inbound on SQL01.
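The two firewall rules described above can be sketched in PowerShell on SQL01 (rule names are illustrative; this assumes the default SQL instance on TCP 1433):

```powershell
# Allow inbound SQL Server traffic (default instance, TCP 1433).
New-NetFirewallRule -DisplayName 'SQL Server' -Direction Inbound `
    -Protocol TCP -LocalPort 1433 -Action Allow

# Allow inbound PING (ICMPv4 echo request, type 8).
New-NetFirewallRule -DisplayName 'Allow ICMPv4 Ping' -Direction Inbound `
    -Protocol ICMPv4 -IcmpType 8 -Action Allow
```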
Perform the following tasks in an RDP connection to SQL01.
In your RDP session to SQL01, open Server Manager:
Next, let’s make sure we can successfully connect to SQL01 from our Web Server.
Perform the following tasks from within an RDP connection on WEBFE01.
The output of the script is a small set of system data which indicates you can communicate with the SQL Server instance on SQL01.
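A connectivity test along those lines can be sketched directly in PowerShell on WEBFE01, reusing the credentials from the lab’s connection string:

```powershell
# Build a connection string matching the one used in web.config.
$connectionString = 'data source=SQL01;initial catalog=test;' +
    'user id=DataManagementApp;password=Passw0rd!'

# Open a connection to the SQL Server instance on SQL01.
$connection = New-Object System.Data.SqlClient.SqlConnection $connectionString
$connection.Open()

# Ask the server to identify itself as a simple smoke test.
$command = $connection.CreateCommand()
$command.CommandText = 'SELECT @@SERVERNAME'
$command.ExecuteScalar()

$connection.Close()
```

If the server name comes back, both the firewall rules and the SQL login are working.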
Azure Active Directory is a service that provides identity and access management capabilities in the cloud. In much the same way that Active Directory is a service made available to customers through the Windows Server operating system for on-premises identity management, Azure Active Directory (Azure AD) is a service that is made available through Azure for cloud-based identity management. Azure AD can be used as a standalone cloud directory for your organization, but you can also integrate existing on-premises Active Directory with Azure AD. Some of the features of integration include directory sync, password sync and single sign-on, which further extend the reach of your existing on-premises identities into the cloud for an improved admin and end user experience.
In this task, you will create a new Azure Active Directory tenant.
In this task, you will create a user account to serve as the administrator of your Azure Active Directory service.
In this task, you will perform an initial logon to set the password for the admin account.
Perform the following tasks on your local workstation:
In this task, you will configure Windows Server 2012 R2 and create a new user to test your synchronization when you enable DirSync, and then perform an initial sync to populate your Azure Active Directory service with copies of your local user accounts.
To connect to DC01 using RDP:
You are now logged on to your virtual machine.
Multi-factor or two-factor authentication is a method of authentication that requires the use of more than one verification method and adds a critical second layer of security to user sign-ins and transactions. It works by requiring any two or more of the following verification methods:
The security of multi-factor authentication lies in its layered approach. Compromising multiple authentication factors presents a significant challenge for attackers. Even if an attacker manages to learn the user's password, it is useless without also having possession of the trusted device. Conversely, if the user happens to lose the device, the finder of that device won't be able to use it unless he or she also knows the user's password. Azure Multi-Factor Authentication is the multi-factor authentication service that requires users to also verify sign-ins using a mobile app, phone call or text message. It is available to use with Azure Active Directory, to secure on-premise resources with the Azure Multi-Factor Authentication Server, and with custom applications and directories using the SDK.
In this task, you will configure Multi-Factor Authentication (MFA) with Microsoft Azure. To complete this module fully, you need to have a phone which can send and receive text messages or calls. You will configure this lab to use your phone as a second authentication factor; this is done by replying to a system-generated text or voice message.
We will start by enabling the MFA service:
In this task, you will test multi-factor authentication. Ensure you have the phone readily available as you will have a limited time to receive and reply to the text message generated by Microsoft Azure.
Perform this task on your local machine.
This post contains Lab 2 of the 5 labs created for our current set of US DX IT Camps.
Azure virtual machines give you the flexibility of virtualization without spending the time and money to buy and maintain the hardware that hosts the virtual machine. However, you do need to maintain the virtual machine -- configuring, patching, and maintaining the operating system and any other software that runs on the virtual machine. In this lab you are going to deploy 2 virtual machines into Azure for the two workloads of identity and database. You will create these two virtual machines:
In this task, you will deploy a new virtual machine (VM) to function as a domain controller in the virtual network you created in Lab 1. As you provision the virtual machine, you will leverage a custom script extension which contains PowerShell code to install Active Directory as a part of the provisioning process. Custom Script Extensions can automatically download scripts and files from Azure Storage and launch a PowerShell script on the virtual machine. These scripts can be used to install additional software components, and in this lab one will install Active Directory Domain Services and create the ContosoAzure.com forest. Like any other VM extension, Custom Script Extensions can be added during VM creation or after the VM has been running. During the last portion of the lab you will also configure the AD service as the DNS server for the virtual network you created in Lab 1, and you’ll assign it a static IP address. (Technically speaking this is a DHCP reservation in the subnet, but it will be referred to as a static IP pretty much everywhere in Azure documentation.)
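The provisioning-time pattern looks roughly like this sketch; the image name, script URL, service, and network names are placeholders for this lab’s actual values:

```powershell
# Build the VM configuration, provisioning settings, and a
# Custom Script Extension that runs a script from blob storage.
$vm = New-AzureVMConfig -Name 'DC01' -InstanceSize 'Small' `
        -ImageName '<WindowsServer2012R2ImageName>' |
      Add-AzureProvisioningConfig -Windows -AdminUsername 'AzureUser' `
        -Password 'Passw0rd!' |
      Set-AzureVMCustomScriptExtension `
        -FileUri 'https://<storage>.blob.core.windows.net/scripts/InstallAD.ps1' `
        -Run 'InstallAD.ps1'

# Create the VM in the cloud service, attached to the virtual network.
New-AzureVM -ServiceName '<itcservice>' -VMs $vm -VNetName '<ContosoVNet>'
```

The extension downloads and runs the script during provisioning, so by the time the VM is reachable, AD DS installation is already underway.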
(FYI - This is just assigning the Safe Mode Admin password)
Now that the virtual machine is created, you want to log on and verify that it looks, feels, and behaves just like any server on your network.
Your DC01 is currently assigned to the AD-Production-Static subnet. But that doesn’t actually guarantee it an address that will never change. In this task, you will configure a static IP address using the new Azure Preview Portal.
You could accomplish what we’re about to do in two separate ways – the new Azure Preview Portal, or through PowerShell. For our Lab, we’re going to use the new portal, and then show you how it could be done using PowerShell.
While the new portal offers some great enhancements to managing Azure, it is still in preview, and this task will give you a glimpse into the new portal.
You may now close the new preview portal tab.
NOTE: This is just informational! If you’ve used the new portal to assign the static IP address, you don’t need to do these PowerShell steps!
If you were to do this using PowerShell, you would need to make sure you have installed the Microsoft Azure PowerShell cmdlets and connected (or authenticated) them to your subscription. You can read the Install PowerShell Tools appendix section for more information.
Before proceeding to the next step you may need to wait for the last operation to complete. Assigning a new IP address forces the VM to restart.
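For reference, the PowerShell equivalent of the portal steps is only a few lines. The network name, service name, and IP address below are illustrative:

```powershell
# Check that the address you want is available on the virtual network.
Test-AzureStaticVNetIP -VNetName '<ContosoVNet>' -IPAddress '10.0.0.4'

# Reserve the address for DC01 and apply the change.
# Note: applying the update restarts the VM.
Get-AzureVM -ServiceName '<itcservice>' -Name 'DC01' |
    Set-AzureStaticVNetIP -IPAddress '10.0.0.4' |
    Update-AzureVM
```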
In this task, you will create the database server to run the database portion of our application. This will be a SQL Server Enterprise 2014 virtual machine. You will leverage one of the many virtual machine images that are located in the virtual machine gallery. Images are used in Azure to provide a new virtual machine with an operating system. An image might also have one or more data disks. Images are available from several sources:
Perform the following tasks in the non-preview Azure management portal.
In this task you will create a new DNS server entry. This entry will be assigned to all computers using DHCP on their next restart, since all VMs use DHCP in Azure, even the ones with “static IPs” as these are technically just DHCP reservations on the virtual network. Azure provides automatic routing between subnets on the same virtual network, but automatic name resolution only when machines are in the same Cloud Service. Though we won’t be doing so in these labs, if we were to add new VMs to the domain, they would have entries in DNS, so that it wouldn’t matter what cloud service they were in. They’d have name resolution through DNS on the Domain Controller.
URGENT NOTE: Please confirm that the creation of the domain is complete on DC01 BEFORE changing DNS. You can do this by looking in Server Manager on DC01; AD DS and DNS should both be listed in the left nav. If they are not, name resolution will fail.
This post contains Lab 1 of the 5 labs created for our current set of US DX IT Camps.
In this first lab you will create the core building blocks for your Azure services:
The services mentioned above are the core building blocks that provide a foundation for your applications, virtual machines, and hybrid connectivity in Azure. Having this well thought out provides a great architecture for all of your cloud services.
Perform the following tasks:
First, you will create a Microsoft Azure network object and corresponding subnets. Azure Virtual Network lets you provision and manage networks in Azure and, optionally, link them via secured VPN tunnels with your on-premises IT infrastructure to create hybrid and cross-premises solutions. With virtual networks, IT administrators can control network topology, including configuration of DNS and IP address ranges. You can use a virtual network to:
The virtual network you are creating will provide the IP addresses assigned to the objects and virtual machines you create in other labs that are associated with this virtual network. You will also leverage subnets to help organize your IP addresses.
Microsoft Azure Storage is a massively scalable, highly available, and elastic cloud storage solution that empowers developers and IT professionals to build large-scale modern applications. Azure Storage is accessible from anywhere in the world, from any type of application, whether it’s running in the cloud, on the desktop, on an on-premises server, or on a mobile or tablet device. In this lab, you will create a storage account to contain all objects for your Azure services. Your VHDs, which you will create in lab 2 for your Azure virtual machines, will be stored in this storage account.
By creating a cloud service, you can deploy a multi-tier application in Azure, defining multiple roles to distribute processing and allow flexible scaling of your application. A cloud service consists of one or more web roles and/or worker roles, each with its own application files and configuration. Azure Websites and Virtual Machines also enable web applications on Azure. The main advantage of cloud services is the ability to support more complex multi-tier architectures. In this section you will create a new service to contain your virtual machines. By assigning your new VMs to this service, they will be able to communicate internally.
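Both the storage account and the cloud service can also be created from PowerShell instead of the portal. A minimal sketch, with illustrative names and location:

```powershell
# Create the storage account that will hold the labs' VHDs.
# (Storage account names must be globally unique and lowercase.)
New-AzureStorageAccount -StorageAccountName 'itsstore123' `
    -Location 'East US'

# Create the cloud service that will contain the labs' VMs.
New-AzureService -ServiceName 'itcservice123' -Location 'East US'
```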
End of Lab 1: Building the Foundation
On December 8, 2014 my friend Simon May and I recorded (while presenting live) a Microsoft Virtual Academy “Jumpstart” session all about how to manage iOS and Android devices with Microsoft solutions. We divided the topic up into 4 modules, to introduce the topic and the fundamentals, as well as to cover iOS and Android device management in greater depth. We covered how Microsoft products and solutions such as Windows Server, Azure Active Directory, Microsoft Intune and System Center Configuration Manager can be used to grant secured, managed access to corporate resources for those users who “BYOD”; their iPhones, iPads, or Android devices.
Here is the landing page:
“Kevin, did you have a cold?”
Ugh.. I was in the 2nd week of a pretty annoying chest cold. I hope the editors were able to edit out the random hacking and coughing. I felt fine otherwise. And <knock on Surface Type Cover> I am healthy now.
In this episode I am honored to welcome back Microsoft Vice President Brad Anderson to the show. We discuss his upcoming monthly live webcast series, “Success with Enterprise Mobility”, which kicks off on Tuesday, December 9th and concludes on March 3rd. Tune in as he gives us a preview of what he and his guests will be discussing, and learn how you can successfully support business productivity for your users through secured and controlled mobility.
It isn’t a brand-new tool, but it was updated to version 1.1 the other day, and definitely worth sharing. The Microsoft Azure (IaaS) Cost Estimator Tool is now available. It’s an installable tool that allows you to “profile [your] existing on-premises infrastructure and estimate cost of running it on Azure.”
“Sweet! So, it installs agents on servers and then..”
Whoa! Lemme stop you right there! No agents. It’s agentless. It does require you to supply administrative credentials that will apply to the machines you’re profiling, which makes sense.
The first time you run it, you’ll see this screen.
As you can see, the description of what it does and how it can be used are clearly spelled out.
In my example, I’m running the tool on a PC in my test network. I’ve selected to profile physical machines, such as my domain controller named, surprisingly enough, “DC”. I’ve supplied my credentials…
And clicking Add, plus adding a couple of other machines (whose names might give away their purpose) results in this:
Clicking Next brings me to the page where I can choose a profiling duration, scanning frequency, and a name for the generated report.
I’m going to scan only one time, so my results won’t be based on the actual traffic or performance of my machines over time, which would be more accurate. But it’s good enough for a start.
I click Begin Profiling, and (in my case) after about 10-15 seconds my one-time scan is complete. I click View Report, and after an informational pop-up describing what was done and what options I have to change values, I see this screen:
Notice that I can tweak values and select just some or all of the machines before clicking Get Cost. I’ll just leave the values as determined, select all, and get my cost. Here is the result:
Notice that I can tweak the pricing model, and change the size of the Compute Instance (the type/size of VM) to play with various values. And when I’m done, I can export the results to a .CSV file (for use in Excel), or go back and try it all over again. Pretty nice, right?
“Very nice! But, what does this tool cost?”
Nothing. Nada. Zilch. Zero-dollar$.
“Sure. And I suppose once I run this tool I’m going to be bombarded with e-mails from Microsoft.”
Nope. Not even a requirement for a Microsoft Account to download, and no information is ever sent from this app back to Microsoft.
Seriously, Microsoft hopes that this will be a good way to get an idea of what your existing machines, whether physical or already virtualized, will cost to run over time as VMs hosted in our Azure Infrastructure Services. It’s all a part of helping you plan for an eventual migration of some of your local resources into Azure, to take advantage of the scale, capacity, security, and cost-benefits of the cloud.
In case you missed the link earlier, here it is again: Microsoft Azure (IaaS) Cost Estimator Tool
Welcome to another in our series entitled “Modernizing Your Infrastructure with Hybrid Cloud”. As you may be aware, this week the theme is “Management and Automation”. As a part of that theme I’m sharing with you an introduction to Desired State Configuration (DSC); more completely called Windows PowerShell Desired State Configuration.
DSC is a relatively new (less-than-a-year-old) technology, introduced with PowerShell 4.0, that lets IT define what the configuration of a server should be, apply that configuration, and then verify (and remediate) that the configuration stays in place, as desired.
“So, it’s like System Center Configuration Manager?”
No. It’s built-in as a part of Windows, and is configured and implemented using PowerShell. Sound interesting?
Good. In the context of one blog article I naturally won’t be able to go into every detail, but I hope that this article, some simple examples, and some additional resources at the end will get you excited about trying this out. And ultimately that you’ll see the immense value that this will give your IT and, of course, your business.
A Simple Example
For our quick example let’s assume a couple of things. I’ve enabled the Windows PowerShell DSC feature on a server named “Server1”. Server1 is a member server in my domain. I’ll be using an administrative account from another server (called Admin) to apply configuration to Server1.
I open up the PowerShell ISE and enter the following text. Can you tell what it’s doing from what the text says?
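(The screenshot isn’t reproduced here, but a minimal sketch of that kind of configuration, using the server name from this article and assuming the standard built-in WindowsFeature resource, might look like this:)

```powershell
# A DSC configuration named "IISWebsite" targeting Server1.
# It declares that the Web Server (IIS) and ASP.NET 4.5 features must be present.
Configuration IISWebsite
{
    Node "Server1"
    {
        WindowsFeature IIS
        {
            Ensure = "Present"
            Name   = "Web-Server"
        }

        WindowsFeature ASPNet45
        {
            Ensure = "Present"
            Name   = "Web-Asp-Net45"
        }
    }
}
```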
“It looks like it’s defining something that’s a ‘Configuration’ and calling it ‘IISWebsite’. And for your server named Server1, it’s laying out what Windows Features should be installed!”
Exactly! And in this PowerShell session, when I execute the configuration, I end up with a .MOF file, which is a definition of how Server1 should have the Web Server and ASP.NET 4.5 features installed and running. All I need to do is run the Start-DscConfiguration PowerShell cmdlet with the proper parameters, referring to the .MOF file and pointing to Server1, and DSC configures the features and enforces that they always be there as I desired. In fact, even if I or another administrator were to manually remove the ASP.NET 4.5 feature from the server, after a period of time the state would be re-evaluated and the configuration would be fixed!
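(As a sketch of those two steps, with the output folder name just an example: executing the configuration function generates the .MOF file, and Start-DscConfiguration pushes it to the target server.)

```powershell
# Executing the configuration produces .\IISWebsite\Server1.mof
IISWebsite -OutputPath .\IISWebsite

# Push the configuration to Server1, wait for completion, and show progress
Start-DscConfiguration -Path .\IISWebsite -ComputerName Server1 -Wait -Verbose
```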
What if, like those “WindowsFeature” sections, I were to add a “File” section like this:
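(The screenshot isn’t shown here, but a “File” section along those lines, with illustrative paths of my own choosing, could look like this:)

```powershell
# Keep a web application folder in sync with a source share.
# The share and folder names below are hypothetical examples.
File WebContent
{
    Ensure          = "Present"
    Type            = "Directory"
    Recurse         = $true
    SourcePath      = "\\Admin\WebSource\MyWebApp"
    DestinationPath = "C:\inetpub\wwwroot\MyWebApp"
}
```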
Basically what I’m saying is, “Here’s the source folder of content that I want you to make sure is always found under this destination.” Ah.. and doesn’t the path look like it might be a web site folder? Yes! This configuration not only enforces that IIS be installed and running, but that the contents of a web application be always there and that the destination code always matches what is coming from the source! Someone could go in there and, say, delete some of the web content, but DSC would fix it automatically!
“Hey Kevin… What’s a .MOF file?”
Yeah.. this was a very quick, very simple example. Let me go through and briefly describe the parts that make up DSC…
The Parts – Configuration
The configuration is what we built in my earlier example. It’s a PowerShell definition that, using “Resources” (defined next), specifies how things should be configured; our “desired state” for the configuration of a target server.
The Parts - Resources
In our example above, you’ll notice that I’m defining which Windows Features are to be installed. I can do this because there is a built-in DSC “Resource” called “WindowsFeature”. From the TechNet documentation, “Resources are building blocks that you can use to write a Windows PowerShell Desired State Configuration (DSC) script.” Windows comes with a number of these built-in resources that know how to specifically work with, configure, and enforce various aspects of the operating system. Resources for working with the registry, the file system, Windows Features, services… and many more, are included in the list of built-in DSC resources.
But it gets even better. These resources are just PowerShell modules. And just as you have the ability to create your own modules to extend PowerShell, you also have the ability to create your own custom resources!
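(To see which resources are available on a machine, built-in plus any custom modules you’ve added, PowerShell 4.0 includes the Get-DscResource cmdlet:)

```powershell
# List all DSC resources available on this machine
Get-DscResource | Select-Object Name, Module, Properties

# Show the settable properties (syntax) of a single resource
Get-DscResource -Name WindowsFeature -Syntax
```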
The Parts - The .MOF file
This is the file that contains the configuration to be applied. It’s the result of executing the configuration definition in PowerShell, and is in a standard format as defined by the DMTF.
“Hey Kevin - Why do we even really need a .MOF file? Can’t Microsoft just do what it needs to do directly from PowerShell?”
I’m sure they could. But the beauty of using the .MOF is that, because it’s a DMTF standard, it is formatted in a way that can be applied to different machine types and for various purposes. In fact, at TechEd in Houston earlier this year I saw Jeffrey Snover actually use DSC to create a .MOF that then configured a Linux server running an Apache web server. (Yeah.. we’re “open” like that these days!)
The Parts - How It’s Deployed
The full name, “Windows PowerShell Desired State Configuration” is a hint about how you enable the DSC capability. It is a feature of Windows Server 2012 R2, found here in the Add Roles and Features Wizard:
When you check the box, you’ll notice that it will also install some Web components to your server…
This is because one of the ways DSC configurations are securely pulled is to use IIS.
The Parts - Push-me-Pull-You?
One important aspect of DSC is that it becomes even more powerful when you can distribute configurations, or maintain consistent configurations among many machines, all from a smaller number of source locations. DSC allows either a simple “push” distribution, which is simpler and more manual, or a “pull” distribution, where you not only apply a configuration to a machine but also tell it where it should look for its configuration and any changes going forward. Pulling can take place over HTTP (not recommended), HTTPS (recommended), or an SMB share (acceptable because access is authenticated).
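(Configuring a node to pull is done by setting its Local Configuration Manager metadata. Here’s a hedged sketch of what that looked like in PowerShell 4.0; the GUID and pull-server URL are placeholders, not real values:)

```powershell
# Meta-configuration pointing a node's LCM at an HTTPS pull server (PowerShell 4.0 style)
Configuration PullClient
{
    Node "Server1"
    {
        LocalConfigurationManager
        {
            ConfigurationID     = "11111111-2222-3333-4444-555555555555"  # placeholder GUID
            RefreshMode         = "Pull"
            DownloadManagerName = "WebDownloadManager"
            DownloadManagerCustomData = @{
                ServerUrl = "https://pull.contoso.com/PSDSCPullServer.svc"  # placeholder URL
            }
        }
    }
}

# Generate Server1.meta.mof and apply it to the node's LCM
PullClient -OutputPath .\PullClient
Set-DscLocalConfigurationManager -Path .\PullClient -ComputerName Server1
```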
“Why isn’t HTTP recommended?”
Think about the damage someone could do if they hijacked DNS and then pointed to and automatically applied someone else’s version of a server configuration to your servers. Scary prospect, indeed.
The Parts – The Local Configuration Manager
The Local Configuration Manager is “the Windows PowerShell Desired State Configuration (DSC) engine. It runs on all target nodes, and it is responsible for calling the configuration resources that are included in a DSC configuration script.” So basically when you’ve enabled the DSC feature on a server, this is the service that either takes the pushed configuration, or pulls the configuration, and then applies it as defined in the most recent .MOF file.
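(You can inspect a node’s current LCM settings, such as its refresh mode and refresh frequency, with a single built-in cmdlet:)

```powershell
# Query the Local Configuration Manager settings on the local node
Get-DscLocalConfigurationManager
```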
For More Information…
Like many of you, I find that I learn best by looking at other people’s examples. And thankfully in the case of PowerShell and DSC there is a really big community already formed and willing to share what they have done with the rest of us. Here are some of the places I recommend you check out and save to your favorites if you’re really going to get serious about using Desired State Configuration:
If you want to try it out in a virtualized lab environment:
And finally, don’t forget to check in frequently at our “Modernizing Your Infrastructure” series landing page, to see all the great articles our team has created and resources we’ve shared.
This is pretty cool.
As the title says: System Center 2012 R2 Data Protection Manager is now an application that Microsoft will support when running inside a virtual machine in Microsoft Azure.
I’m sure they won’t mind me sharing this.. but here is the text from an e-mail I received on the subject that spells it out nicely:
We are pleased to announce that System Center Data Protection Manager (DPM) is now supported to run in Azure as an IaaS virtual machine. This announcement allows customers to deploy DPM for protection of supported workloads running in Azure IaaS virtual machines. Customers with a System Center license can now protect workloads in Azure. Read more about it on the DPM blog.
Support for multiple virtual machine sizes
Choose the size of the virtual machine instance that will run DPM, based on number of workloads and the total data size to be protected. Start with just an A2 size virtual machine, and upgrade to a larger size to scale up and protect more workloads.
Support for Microsoft Azure Backup
Protect your data to Microsoft Azure Backup and get longer retention with the flexibility of scaling storage and compute separately. The Microsoft Azure Backup agent works seamlessly with DPM running in an Azure IaaS virtual machine.
Familiar management using the DPM console
With DPM running in an Azure IaaS virtual machine, you get the same experiences and capabilities that you are familiar with.
So, here’s what you should do:
Yesterday in our “Modernizing Your Infrastructure with Hybrid Cloud” series, Matt Hester described how to create a virtual network “in the cloud” in Microsoft Azure in order to support cloud-based Virtual Machines and their ability to communicate with each other and with the outside world. We of course have the ability to connect to our VMs individually using Remote Desktop connections, but if we’re going to treat the location of these cloud based machines as just an extension of our own datacenter, we’re going to want to have a secured connection to them.
That’s what the VPN Gateway is all about.
In this article I’m going to show you step-by-step how to connect your Azure virtual network to your on-premises network. Here are the steps we’ll go through:
And as an added bonus, I might throw in a little something extra.
I hope you’ll think so. But for starters, let’s begin where Matt left off. I have an Azure subscription with a virtual network named AzureNet1, located in the South Central US datacenter region. In my scenario, I want to connect this Azure network to my Fabrikam office (Fabrikam was recently purchased by Contoso). Once the connection is established, I will want to join servers in that office to the contoso.com domain.
Here’s what the AzureNet1 network dashboard tab currently looks like. Note the two virtual machines currently in this network; a domain controller and an application server.
As you can see on the configure tab, I’ve set up two subnet ranges (their purposes are obvious based on the names I’ve given them) as part of an 8-bit-masked 10.x.x.x subnet:
Notice that I’ve also defined my DNS server as 10.0.0.4. My domain controller has that address.
Collect Some Information
Before we start adding the site-to-site connection, I need to collect some information so that I can carefully use it to make the correct configurations. As you probably know first-hand, when doing networking configuration it’s easy to make simple little mistakes that cause everything to NOT work, so let’s make quick note of a couple of important items:
Your local network address range refers to the addressing of your local network. That name, though, is a little misleading. “Local” assumes you’re connecting your Azure network to some local office. But in reality it could be some other branch office, or even another virtual network somewhere else in the Azure world. So, just think of “local network” as “the network I’m connecting to my Azure network”. I’ll keep putting “local network” in “quotes” throughout the rest of this article for just that reason.
In our example, my Fabrikam network is 192.168.0.0 with a 16-bit subnet mask. (192.168.0.0/16)
The Gateway Address is the externally accessible IP address of the gateway. In the Fabrikam network, let’s say that I have a VPN device connected to the Internet with an external Internet-exposed address of 220.127.116.11. I will also have a gateway address on the AzureNet1 gateway, but that address will be assigned when I create the gateway for my virtual network. So, in simple terms, the gateway address is the connection point on either end of the VPN connection.
Define the “Local Network”
Before enabling the site-to-site connectivity and creating the gateway, we need to define the “local network” Fabrikam, so that our network knows what addresses it will be routing to over the VPN through the gateways.
To define my “local network” (which I’ll name “Fabrikam”), I clicked on +New in the bottom-left corner of the Azure portal, and selected Network Services –> Virtual Network –> Add Local Network.
I give my local network a name and, optionally, the gateway address. (I can add it later if I don’t know it right now.)
Then on the next screen I add the address spaces that exist at my “local network” at Fabrikam.
Once created, you’ll see it in the list on the local networks tab.
Back on my AzureNet1 network and on the Configure tab, now I can check the box to enable Site-to-Site Connectivity. Notice that a couple of things change. I now will choose which “local network” I’m going to connect to (Fabrikam), and it also requires (and defines) a “Gateway Subnet” for me.
“Hey Kevin.. What’s that ‘ExpressRoute’ option?”
That’s actually what my friend Keith Mayer is going to cover in tomorrow’s article in the series. I’ll include the link to his article after it’s published.
UPDATE: Here is Keith’s article - Modernizing Your Infrastructure with Hybrid Cloud - Step-by-Step: Cross-Premises Connectivity with Azure ExpressRoute (Part 16)
Anyway, after checking Connect to the local network, clicking Save starts the process of updating the network configuration. After a couple of minutes it completes, and now back on the dashboard tab we see this:
This means the gateway is defined, but not actually created. That’s our next step.
Create the Gateway
At the bottom of the dashboard screen, click on Create Gateway.
Notice when you click it that you are given a choice between a static and dynamic routing VPN gateway.
“What’s the difference?”
Your choice will be based on a number of factors. Often the VPN hardware you are using will limit you to one or the other. A static routing VPN gateway routes traffic based on policy definitions (which is why it’s often referred to as a policy-based VPN); packets are routed through the gateway based on a defined policy, an “access list”. A dynamic routing VPN gateway, also known as a “route-based VPN”, is a simple forwarding of packets between two networks: if the local network doesn’t contain the destination for a packet, the gateway is assumed to know where to send it, and if the destination is known by the gateway as existing on the other network, the packet is sent securely through the tunnel. For more information about these choices, and about various devices and gateway types that support either static or dynamic VPN gateways, check out this excellent documentation. Even if your device is not on that list, it may still work if your hardware supports the required settings.
In my scenario I’m creating a simple tunnel to a device that supports the other end of dynamically routed VPN, so I’ll choose Dynamic Routing. Creating the gateway does take a good amount of time (as much as 15 minutes), so be patient. Eventually our display will go from this:
…and eventually, this:
Notice that we’ve now been assigned an actual external gateway IP address. We’re still not actually connected, though. (Connected would be GREEN in color.) We haven’t addressed the configuration of the “local network” side of our connection yet. At the bottom of the page you see a Connect button:
But let’s not click that just yet. We still need to…
Configure the Local VPN Device
Other than collecting some information about our Fabrikam network, we’ve only focused on the AzureNet1 side of our VPN tunnel. We still need to create the gateway on our Fabrikam network.
On the AzureNet1 dashboard, notice this hyper-link towards the right-side of the page:
Clicking on Download VPN Device Script brings up a very interesting page that allows us to specify what kind of hardware (or software) we have on the “local network” side of our connection. The beauty of this is that, based on our selection of hardware (or even Windows Server 2012 or 2012 R2 Routing and Remote Access (RRAS) acting as your gateway), you generate a script that can then be used to automatically configure your gateway on the “local network” side.
Once we’ve selected our Vendor, Platform, and Version, and clicked the check mark, we’re immediately sent a text file containing the configuration script for our selected device.
Use this script to configure your device, establish the connection from the local network, and then come back to the Azure network dashboard and click connect. And if you’ve done everything correctly, you should see something happy (and GREEN) like this:
“What kind of hardware do you have on Fabrikam’s network, Kevin?”
I don’t know. I’m not actually using a local network. For this demonstration, I’ve actually connected my AzureNet1 network, which is located in the South Central US datacenter region, to a Fabrikam virtual network that I host in the Central US datacenter region and manage through an entirely different Azure subscription. So.. I’m doing site-to-site between two Azure virtual networks. That’s the “something extra” I promised earlier. Now I’m going to show you what I needed to do to make that connection work.
Connecting two Azure networks via a Site-to-Site VPN requires two things:
I’ve already showed you where you choose Dynamic Routing when you create the gateway. And other than the shared key, everything else I did to configure the Fabrikam network was identical to what I configured in AzureNet1, except that my network in Fabrikam is 192.168.0.0/16 – identical to what I defined the “Fabrikam” “local network” to be on this side. IMPORTANT: These have to match. The range and mask have to be correct and consistent on both ends, in both the “local network” definition and the actual network (or Azure virtual network, as in my case), for this to work.
Also in the definition of the “local network” on either side was the specification of the Gateway IP Address. Again, ordinarily, your configuration script is populated with the Azure virtual gateway’s IP address. But in this instance, I need to create the gateway first, and let it fail connecting, just so I can see what the actual assigned gateway IP address on that side of the connection is going to be. Then I can take that address and configure it into the “local network” definition on the other side.
As for the shared private key.. Notice at the bottom of the AzureNet1 dashboard that there is a Manage Key button:
If I click this, I can see (and copy) the generated long key. I’ll copy it to the clipboard.
This key was created when we created the gateway, and is included for you in the configuration script for your “local network” device. But…
“We don’t have a local network device!”
Bingo. And we also don’t (as of this writing) have a way to use the Azure portal to set the shared key in the configuration of the virtual network! But we will need to do that on at least one end of the tunnel to make sure the keys match. (Or on both ends, if we want to use our own text as the shared key.)
This is where PowerShell comes in.
I’ve installed the Azure PowerShell cmdlets onto my local system, and then in PowerShell I connected to my Azure subscription where the Fabrikam virtual network resides. And now I use the following PowerShell command to set the shared key for the gateway connecting the Azure network Fabrikam to the (from this point of view) “local network” named AzureNet1.
Set-AzureVNetGatewayKey -VNetName fabrikam -LocalNetworkSiteName AzureNet1 -SharedKey 2kDsdqXnxeXrGjI4r4rLltKKT1g9E9gY
(For the Windows PowerShell command-line tools, go to the Azure downloads page, and scroll down to “Windows PowerShell” section. Instructions for setting this up are found there as well.)
That’s how I was able to get the common shared key into the other side of my connection. After this command completes, and soon after clicking Connect in the dashboard, I was happily sending data back and forth.
“But Kevin… Prove to us that you have the connection established! Finish your domain-joining scenario!”
In the AzureNet1 network I have two servers. One is a domain controller, and the other is a member server. All machines here are assigned their DNS server as 10.0.0.4. They reside in the South Central US datacenter region.
On my Fabrikam network (which, you’ll remember, resides in the Central US datacenter region, so not in the same location as the AzureNet1 network and machines) I have one server that I’ve just created:
Importantly, I’ve also created a “DNS Server” designation here, and assigned to the Fabrikam network, with the 10.0.0.4 address. Note the configure tab of the Fabrikam network.
In this way my machines in this Fabrikam network will be assigned 10.0.0.4 as their DNS server, and so will know how to find the DC in the AzureNet1 network. To verify this I can establish a remote desktop connection to my new karContosoDC2 server and look at the status of the network adapter:
Trusting that my VPN is happily and dynamically routing traffic between Fabrikam and AzureNet1, and knowing that my new server in Fabrikam is going to look for DNS at the domain controller in AzureNet1, I attempt to join the domain:
I am asked for domain credentials (a very good sign!)…
I’m in! That’s proof that I have successfully connected these two virtual networks!
For more information on configuring secure cross-premises connectivity, check out the official documentation here: http://msdn.microsoft.com/en-us/library/azure/dn133798.aspx
Here are some more specific configurations and their documents:
And be sure to keep watching http://aka.ms/ModernCloud for the full series of articles on modernizing your infrastructure.
Keith Mayer and I continue our series on “Modernizing Your Infrastructure with Hybrid Cloud”. In today’s episode we discuss various options for networking. Tune in as we go in depth on what options are available for hybrid cloud networking as we explore network connectivity and address concerns about speed, reliability and security.
Shortened URL if you would like to share on Twitter or Facebook, etc.
My friend Dan Stolts and yours truly continue our series on “Modernizing Your Infrastructure with Hybrid Cloud” with an overview on how to plan for a hybrid cloud storage solution using Windows Server 2012 R2 and Microsoft Azure. Tune in for our lively discussion on the many storage options available to you as well as discussions around performance, reliability and security.
Follow the entire series! http://aka.ms/ModernCloud
Shortened URL if you would like to share on Twitter or Facebook, etc. http://aka.ms/TR140822
Welcome to another in our new series of “Modernize your Infrastructure” articles. Today I’m pleased to share with you the details of yet another free and easy-to-use assessment tool from Microsoft. The purpose of this tool is to help you answer the following important question:
“Are my servers and services able to be migrated to Microsoft Azure?”
And that is a fair question; particularly if we see the value, but don’t really know where to begin. If, in the process of modernizing my infrastructure, I consider moving some (or all) of my servers – whether they’re physical or virtual machines – off of my local hardware and into “the cloud” as Microsoft Azure hosted virtual machines, as an extension of my datacenter, then it would be good to have a starting-point assessment to help me learn about and consider what might be required. Even better if it were based on my current environment and some initial goals and desires.
And that’s what the Microsoft Azure Virtual Machine Readiness Assessment is all about.
It’s a free and easy-to-install tool that, when run on a supported OS and with the proper credentials, will ask you a number of questions about your environment and about your needs and desires (the end goal), and produce a lengthy report based on your answers and, importantly, on what it was able to detect in your infrastructure.
“Can you show it to me?”
Showing you the whole process would be overkill here. But how about I show you some of the highlights?
Requirements and Installation
The download page is where you’ll find a good description of how and where the tool can be run. In basic terms, it will run on any OS newer than Windows Vista and Windows Server 2008. It does have some .NET framework requirements as well. The instructions are pretty simple:
1. Download and run WAVMRA.EXE on the computer you want to run the assessment from
2. Complete the installation steps
3. Launch the tool
4. Select the technology you want to assess and proceed through the wizard experience
On the workstation you’ve installed the tool on, make sure you run it as an administrative account that has rights to administer Active Directory, SharePoint, or your SQL Servers (whatever it is you’re interested in assessing).
Naturally, the first question you are asked is “What would you like to assess?”
Your answer here will determine some of the remaining questions concerning what kind of connectivity, applications, availability, and performance you’re going to require.
Let’s say that in my example I’m going to want to extend my Active Directory domain into the cloud. Using my single corporate domain, I want to extend authentication to other applications that I want to host on virtual machines in my Microsoft Azure network.
Prior to the remainder of the questionnaire, you are reminded of the requirements for this tool to be able to run successfully:
Answer the Questions
The rest of the process, prior to scanning your environment and generating the final report, is to ask you additional questions. In my scenario, I’m asked 13 more questions. “All questions must be answered as part of completing this assessment.” Here are a few samples:
Note that each question provides additional detail about what’s being asked, and you are often given the option to basically say “I don’t know yet”. Trust me – the report will give you excellent detail on, and pointers to additional information about, all of the options available.
The tool generates a Microsoft Word .docx file that you can save, print, share.. whatever you want to do with it. Inside you’ll find a detailed report on what you’ve chosen, what’s required of you, and links to additional information and further learning around your next steps. The report is organized into three parts: “Ready”, “Set”, and “Move”.
And then shows you “What we checked”, with a quick visual indication of which items are fine, and which ones should probably be looked into further.
And that’s it!
Hopefully you’ll find this a useful first-step into extending your infrastructure into the Microsoft Azure cloud.
Go Forward “To the Cloud!”
The team of US DX IT Pro Technology Evangelists is doing another series of articles and TechNet Radio interviews. The topic: Modernizing your Infrastructure. The goal: To give you as many resources as you’ll need to get your infrastructure up to speed, whether you’re simply looking for ways to migrate off of Windows Server 2003, or even move workloads and applications up into Microsoft Azure.
“Sounds great, Kevin! Where is the series landing page?”
Go here: http://aka.ms/ModernCloud
Bookmark it and check back often. There should be new content there every day.
Speaking of new content, today’s first article in the series is by Dan Stolts. He writes about and documents using the Microsoft Assessment and Planning Toolkit. CHECK OUT HIS ARTICLE HERE
“Hey Kevin, are you writing any articles for the series?”
Absolutely. I have one going live tomorrow, others scheduled for later. And you’ll also get to see my pretty (?!) face in a couple of TechNet Radio interviews over the next couple of weeks.
Thanks to Mary Jo Foley for tweeting about this. Mary Hutson is maintaining a very useful list of “top Microsoft Support solutions for the most common issues IT Pros experience when using or deploying Windows 8 or 8.1.” She updates the list every quarter; the most recent being just two days ago (Aug 11, 2014).
* HERE IS THE LIST * <—Click that
Kudos, Mary! This is a great page to bookmark!
Whether or not you choose to believe me, here are two things that I know to be true:
Often our marketing and operations people send out notices about upcoming events, such as our IT Camps or live Microsoft Virtual Academy trainings. But many of you aren’t getting those e-mails, either because you never knew how to set up your information and preferences (about you, your interests, and how you’d like to be contacted), or because at some point you said “no” to getting more e-mail from Microsoft. And if you opt out of one thing, you’re opting out of everything. (See point #1 above)
“So.. what if I do want to be contacted about upcoming events?”
You need to edit your information and preferences associated with your Microsoft Account (formerly LiveID). And that’s what this blog post is all about.
The Profile Center: https://profile.microsoft.com/RegSysProfileCenter/
At the profile center, you can sign in and set up or edit the information about you and your preferences. It’s broken down into these 5 areas:
Once you go to any of those pages and sign in, you can add to or edit the information that Microsoft has about you. Importantly, in order to get notifications from Microsoft about events or resources that are important to you, you need to fill in the correct business and interest areas, and then allow Microsoft to contact you via e-mail. Here is a quick summary of each of the areas; I’ll highlight the ones that will help you get our e-mails in the future.
Share as much as you’re comfortable with, but know that if you don’t give us accurate information, it’s not going to help us to keep you up-to-date. IMPORTANT: Even if you see the proper e-mail address, make sure you click edit next to it to see whether or not you “would like to hear from Microsoft…”. If it’s un-checked and you’d like to hear from us via e-mail, make sure you check the box and click save.
Being accurate with the kind of business you work for (or want to work for) will help us better determine what makes the most sense for how we address you or what we send you. It also helps us figure out in a more general sense the populations and communities out there, and helps us shape our priorities and where we put our attention (and dollar$).
This is the area that you should revisit on a regular basis. Seriously. It had been a while since I personally looked here, and I found that I was still listed as interested in Windows Vista but not at all interested in Server 2012 or Windows 8. As technologies change, you need to make sure we know which ones you’re interested in learning more about.
It won’t help you if we can’t contact you. Sure, it’s entirely up to you, but many people are surprised to learn that they opted out of contact at some point and never knew it. If you would like the occasional e-mail about items relating to your business and technology interests, make sure that you at least check the “E-mail Address” box.
E-mail newsletters on various topics are also available on a regular basis. Feel free to subscribe to the ones you’re interested in.
Of course, you can go back to the Profile Center anytime and change or update or verify your preferences. And if you need more details on Microsoft’s Privacy policies, you can find them all here: http://www.microsoft.com/privacystatement
Any questions? Concerns? Rants? Insults? I can take it. Put them in the comments.
Dear TechEd Diary,
This entry is several days overdue, but I hope you’ll understand. Towards the end of the conference the schedule is tight, and the exhaustion is real. So I thought that rather than taking time out of my day on Thursday or Friday, I’d just take in and enjoy the rest of the week and finish up my diary writings after I’d had a little rest.
On Thursday (Day 4) I skipped the first session of the day.
Oh.. give me a break. I needed a little extra sleep after the IT Pro Community Party at the Hughes Hangar on Wednesday night. (see photos below) I did get to the conference in time (or so I thought) for “TWC: Malware Hunting with Mark Russinovich and the Sysinternals Tools” (DCIM-B368), but even though I was there 20 minutes before the start, the room was “full” and people were waiting outside to get in. So instead, I decided to head down to the Hands On Labs, where I re-visited my past life as a Developer and walked through a lab showing me what’s new in the latest version of Visual Studio.
After lunch I attended “Case of the Unexplained: Troubleshooting with Mark Russinovich” (WIN-B354). This time I got there 40 minutes early and managed to get a good seat. (The room was full soon after.) This was the 2014 version of Mark’s popular session that he’s done for many years now, in which he shows off the Sysinternals tools and how they were used to solve strange workstation and server issues.
And.. I confess… I skipped the last session of the day and went back to my hotel. The weather, which through most of the week was very unseasonably cool, was finally warm enough in Houston to enjoy a little time poolside.
Following a lovely Chinese dinner (we highly recommend the China Garden) with my coworker Jennelle Crothers and MVP and author Ed Horley, the evening party at Minute Maid Park was fun. Lots of food and drink, and live music stationed around the perimeter. The Dueling Piano Bar took tips to allow our friend Don Donais to play drums on a few songs. And the group “The Spazmatics” were a fun 80’s-tribute band. After the main party was over, several of us went to Pete’s Dueling Piano Bar for more entertainment.
Friday was a sleep-in day, followed by leisurely packing, heading to the airport, and sharing a plane home with several of my Minneapolis friends.
“So what’s next, Kevin?”
I’m glad you asked. I’m going to be planning, facilitating, and participating in a series of blog articles entitled “TechEd 2014 Favorites”, in which I and my coworkers who attended the event will document some of the important announcements and our favorite moments from TechEd 2014. Watch this blog for more details coming very soon.
I’ll leave you, dear TechEd diary, with some photos taken at the events described above.
DJ Joey Snow cranking out the tunes at the IT Pro Community Party
My west coast coworker (and newbie on the team) Jessica DeVita, Simon May, and Ed Lieberman.
Mark Russinovich explaining the unexplainable
The drawing at the Adaptiva booth for a Harley Davidson motorcycle. I didn’t win.
The little 3’ deep pool shared by the Courtyard and Residence Inn. It was empty when I got there, but the chairs filled up soon after. I guess I’m a trend-setter.
Minute Maid Park on the walk over.
Entertainment at the party.
Hmm.. Odd that nobody is in the salad bar line.
View of the field and main stage
Don Donais rocking out with the pianos at the dueling piano bar.
Pete’s Dueling Piano Bar
See you next year!!
Well diary, if you ever had any doubts about “the cloud”, they should have been eased a bit if you attended or watched the live stream of Mark Russinovich and Mark Minasi’s “Mark and Mark” cloud discussion this morning. Besides being entertaining, it was very informative. It’s always great to hear the opinions of someone as important to Microsoft, and to the industry we serve, as Mark Russinovich. And it’s a good look under the hood, not only at what we’re doing now, but with hints of what’s coming. To me, that’s very exciting.
The session, if you’re interested, is DCIM-B386 – “Mark Russinovich and Mark Minasi on Cloud Computing”. At the time of this writing, the recording isn’t yet available. But it will be soon.
The talk took the form of an informal interview, with Mark Minasi asking Mark Russinovich questions about Microsoft and cloud services, often drilling specifically into Microsoft Azure. On Azure, they discussed new capabilities, the amazing scale Microsoft operates at (did you know that last year Microsoft purchased 17% of the world’s servers for its datacenters?), as well as security and data privacy. It’s well worth a (re)view.
“What else did you learn about today?”
I learned more about the fundamentals of some of the new networking capabilities in Microsoft Azure. Session: DCIM-B305, “What’s New in Microsoft Azure Networking”. The presenters did a good job of summarizing the many new capabilities, such as site-to-site VPN, multi-site VPN, reserved static public IP addresses, and more. The demos showed that some of the configuration isn’t quite as straightforward as they’d like it to be; but it’s a great start, and definitely something I’m looking forward to playing with (and blogging about) some more.
The other highlight of the day was another Mark Russinovich presentation: DCIM-B306, “Public Cloud Security: Surviving in a Hostile Multi-Tenant Environment”. Mark does a “Top 10” based on the Cloud Security Alliance “Notorious Nine: Cloud Computing Threats”, and gives his impressions on the realities of the list’s items from Microsoft’s perspective. It’s an eye-opener, for sure.
Those two last sessions’ recordings aren’t available yet. I’ll update this post with links when they’re available.
“What did you do for fun last night?”
Ah.. fun? Who has time for fun?
Yep. Last night’s meal was courtesy of the “Ask the Experts” event. Later, Veeam threw a very fun party (I am a sucker for good live music.) And the TechEd Jam Session was also in full-swing at the House of Blues, though I didn’t spend much time there. ($11 for a tall can of Miller Lite?! I don’t think so.)
Tonight I’ll be attending a party thrown at Lucky Strike by Nimble Storage, and then I’m heading over to the Hughes Hangar for the “Windows IT Pro Community Party” (Sponsored by Springboard Series on TechNet).
Here are some photos from last night and earlier today.
Now.. off to the parties!
My day 2 started much as my day 1 did.
Sure, if there is such a thing as “too much fun”, I came close to it. But even though last night’s Exhibit Hall Reception and the later “MCP Celebration” party at Howl at the Moon was a lot of fun, I managed to get home before midnight. Plus, there’s not much to do in the area around my hotel.. so that’s another reason to call it a night.
Today I kicked off the day with a very good session on migration strategies for moving from VMware to Microsoft Hyper-V virtualization. (Session DCIM-B412, in case you’d like to view the recording.) I have to give these guys credit: they did an excellent job delivering sometimes-confusing topics in a very easy-to-understand way.
Right now, as I type this, I’m waiting for the next session to start. (“Best Practices Integrating On-Premises Datacenters with Azure IaaS” – DCIM-B330)
I haven’t decided what I want to learn this afternoon. I’ve got 3 potentially good topics at 1:30pm, 5 at 3:15pm, and 2 at 5pm to choose from. And no matter what I pick, I bet the words “Mobile” and/or “Cloud” will be used frequently.
“So, what’s the vibe been this year?”
A lot of the people I’ve talked to are very impressed and very excited about the transition to a “Mobile First / Cloud First” focus. Some are still apprehensive, but I think they all understand how important it is for their businesses. In particular the support we’re adding for management of all of the most popular devices is well-received. And the improvements to Windows Azure are causing many to consider using it as an extension of their datacenters.
To close out this blog post, here is some photo-evidence from last night…
Lots of people gathering lots of swag
I really want to win this!
I don’t care so much about winning this.
Howl at the Moon for the MCP party
Other more incriminating photos have been withheld to protect the not-so-innocent.