November, 2014

  • What are the best IT Pro Tools for automation – and why?

Each month here in the @TechnetUk #ITPro office there is a mad scramble (bunfight / race / polite debate) to bid for the best blog topics for the month. Each quarter has a theme; this quarter's is ‘the right tools for the job’. I was lucky enough to see the email early, and our editor Charlotte accepted my bids for last week's post on Windows Server Technical Preview and for this one on automation tools.

As you might imagine, this was a popular one to bid for, especially as IT Pros will always try to find the quickest, most efficient way to carry out their allotted tasks while it is yet day, all so that they can carry on with important things like Wolfenstein (the original, of course) and even a quick toe-dip into the world of Xbox One.

Anyone who has ever read any of my posts would be forgiven for thinking that automation means another exposition on the glories of PowerShell and why we need to learn it or learn golf… and in part you may be correct. But I decided to read the title in full, and this gave me the opportunity to go further than a single tool. PowerShell could be described as a framework for a whole collection of excellent tools (or modules), but in this post I will be treating it as one tool, and I am including it because it is simply the number one productivity and automation tool available in the world of Microsoft server operating systems and business platforms such as Exchange, SharePoint and Office 365.

What else is available to an IT Pro as an automation tool? I had to think long and hard about this, as I haven't really used much else on a day-to-day basis to automate routine tasks for quite some time.

    What does the landscape look like in the world of automation?

    What does an IT Pro want to automate?

Well, the roles of an IT Pro, even though they are changing and being blurred a little by the new DevOps school of thinking, are many and varied: from the network specialist who really doesn't want to do all the IP planning, management and administration manually (or by spreadsheet), to the deployment specialist who absolutely doesn't want to wander round a building installing images, agents and other software on client and server machines when everyone else has gone home.

But let us start with the traditional view of an IT Pro – the server administrator – and, yes, PowerShell. I am not going to offer up the whole of PowerShell, as that is something I do on a regular basis. I am going to talk about DSC, more formally titled PowerShell Desired State Configuration.

As I usually do, I am going to quote the TechNet description of the feature and then dive a little deeper into it.

    “DSC is a new management platform in Windows PowerShell that enables deploying and managing configuration data for software services and managing the environment in which these services run.

    DSC provides a set of Windows PowerShell language extensions, new Windows PowerShell CmdLets, and resources that you can use to declaratively specify how you want your software environment to be configured. It also provides a means to maintain and manage existing configurations.”

Now that sounds all very well, but it doesn't tell me exactly what the feature does in layman's terms, nor does it describe how to use it or really sell me on the impact it can have on my infrastructure, and thereby the amount of time I will free up to carry out other important tasks.

So PowerShell DSC gives us the ability to define exactly what we want our server to look like in terms of roles and features installed and configuration, right down to individual registry settings or environment variables.
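
    As a sketch of what that looks like (the node name, feature and registry key below are my own illustrative assumptions, not from TechNet), a DSC configuration is a PowerShell script block that declares the desired end state rather than the steps to reach it:

    ```powershell
    # A minimal DSC configuration - the node name, feature and registry
    # key are hypothetical examples.
    Configuration WebServerBaseline
    {
        Node 'SERVER01'
        {
            # Ensure the IIS role is installed
            WindowsFeature IIS
            {
                Ensure = 'Present'
                Name   = 'Web-Server'
            }

            # Pin down a single registry value
            Registry ContosoConfigured
            {
                Ensure    = 'Present'
                Key       = 'HKEY_LOCAL_MACHINE\SOFTWARE\Contoso'
                ValueName = 'Configured'
                ValueData = '1'
            }
        }
    }

    # Running the configuration function compiles it into a MOF document,
    # one per node, which the Local Configuration Manager then enacts
    WebServerBaseline -OutputPath 'C:\DSC\WebServerBaseline'
    ```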

The seasoned IT Pro may well say at this point, ‘So what? I can do that with Group Policy if I am in a domain environment (and most IT Pros work in such an environment).’ The answer would be: yes, of course you can. But Group Policy has a default refresh rate of 90 minutes with a randomised offset of 0 to 30 minutes to prevent all machines hogging the network at the same time. The seasoned IT Pro will also tell you that up to 120 minutes is a very long time indeed in the world of server configuration.

DSC uses a set of built-in resources (which are growing all the time) to control a range of features, functions and roles in an entirely automated manner. DSC also allows the IT Pro to create custom resources. At this point I should add that, as with all things PowerShell, the community tends to share, and a large number of custom resources are already available for free.

The descriptions of the original built-in resources can be found here.

In a default install of PowerShell 4.0, the built-in resources are Archive, Environment, File, Group, Log, Package, Registry, Script, Service, User, WindowsFeature and WindowsProcess.
I should also add here that this is not basic-level scripting or PowerShell, and this post is not aimed at teaching you the skills required to script or to understand complex PowerShell commands. I will list out several script blocks to show what is involved. TechNet again provides a great tutorial on custom DSC resources here. In that tutorial the reader is shown how to create a custom resource that will create, configure or delete a website on a particular server.
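
    The shape of a custom resource boils down to three functions that the Local Configuration Manager calls. As a sketch only (the parameter names and bodies here are placeholders, not the tutorial's actual website code):

    ```powershell
    # Skeleton of a DSC custom resource module - every resource exports
    # these three functions; the properties shown are illustrative.
    function Get-TargetResource
    {
        param ([Parameter(Mandatory)] [string] $Name)
        # Return a hashtable describing the item's current state
        @{ Name = $Name; Ensure = 'Absent' }
    }

    function Test-TargetResource
    {
        param (
            [Parameter(Mandatory)] [string] $Name,
            [ValidateSet('Present','Absent')] [string] $Ensure = 'Present'
        )
        # Return $true if the system already matches the desired state
        $false
    }

    function Set-TargetResource
    {
        param (
            [Parameter(Mandatory)] [string] $Name,
            [ValidateSet('Present','Absent')] [string] $Ensure = 'Present'
        )
        # Make the change - create, configure or delete the item here
    }

    Export-ModuleMember -Function *-TargetResource
    ```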

All this can be run on a schedule to ensure that the desired state is maintained across your entire server estate. Configurations can also be pushed or pulled, whichever you prefer. There is also a great deal more DSC goodness coming with PowerShell 5.0 in Windows Server vNext.
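
    In push mode, enacting a configuration is a single cmdlet. Assuming a configuration has already been compiled to a MOF file under C:\DSC\WebServerBaseline (a hypothetical path of my own), the sketch looks like this:

    ```powershell
    # Push the compiled configuration to the target node and watch it apply
    Start-DscConfiguration -Path 'C:\DSC\WebServerBaseline' -Wait -Verbose

    # Afterwards, verify that the node matches its desired state
    Test-DscConfiguration
    ```

    Pull mode instead has each node poll a pull server on a schedule, which is how the desired state gets maintained across a whole estate without you touching each machine.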

There are also some good TechNet Virtual Labs for DSC and other PowerShell features. Check them out here (33 of them, covering PowerShell, DSC, Azure PowerShell, Automation and more).

I shall save some deeper DSC diving for other posts, as this was NOT meant to be a PowerShell love-in.

So what other tools can I use to automate IT Pro tasks?

I have already alluded to the IP planning and deployment / management tasks that need automating and easing. Well, I have posted many times about the super-effective IP Address Management feature in Windows Server 2012 and 2012 R2. Suffice it to say that if you read this blog regularly, you are already sufficiently acquainted with its principles to realise its value. Windows PowerShell 4.0 also added a set of IPAM cmdlets to enable automating your IPAM deployment and management.
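
    As a quick sketch of those cmdlets (the query below is illustrative and would be run on, or remoted to, a Windows Server 2012 R2 IPAM server):

    ```powershell
    # List the IPAM cmdlets that ship with the IPAM server feature
    Get-Command -Module IpamServer

    # Example: query the IPv4 address ranges that IPAM knows about
    Get-IpamRange -AddressFamily IPv4
    ```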

    TechNet Virtual Labs also do a rather good job of highlighting this feature in this LAB. I ought also to mention here that the Microsoft Virtual Academy has a number of courses covering IPAM and PowerShell for Active Directory that includes DSC.

The final set of automation tools I am going to pick today (I wanted to choose Azure Automation using PowerShell, but I promised I wasn't going to make it all PowerShell today) are those that enable the deployment of operating system images. I was spoilt for choice, since I could have picked System Center 2012 R2, Hyper-V or many other useful tools.

    I have chosen some tools that are cost free once you have licensed a server operating system (Server 2012 or 2012 R2).

The mix of tools is Windows Deployment Services (WDS) and the Microsoft Deployment Toolkit (MDT) 2013. But before I discuss those, I would like to mention the Microsoft Assessment and Planning (MAP) Toolkit.

    “The Microsoft Assessment and Planning (MAP) Toolkit is an agentless inventory, assessment, and reporting tool that can securely assess IT environments for various platform migrations—including Windows 8.1, Windows 7, Office 2013, Office 2010, Office 365, Windows Server 2012 and Windows 2012 R2, SQL Server 2014, Hyper-V, Microsoft Private Cloud Fast Track, and Windows Azure.”

This is a must-have tool for anyone planning to do anything to their network, clients or servers, and it is another free tool, available here. All the above-mentioned tools are part of the Microsoft Solution Accelerator programme, which seems to expand every time I look at the page. The MDT team blog also has masses of useful information.

So why have I chosen this set of tools? WDS allows me to deploy operating systems across the network to all my clients in a Lite Touch Installation (LTI) manner, which means I would have to have some interaction with the client. Currently the preferred Zero Touch solution uses System Center 2012 R2, but this can be a costly option.

To assist you in using this free service, Microsoft have provided the MDT and also the Windows Assessment and Deployment Kit (ADK). The kit is a hefty installation and provides a raft of useful tools: if you select everything, as I did, the result is over 6 GB of installation.

There are a number of TechNet Virtual Labs for the MDT, although most are focused on integration with System Center Configuration Manager for larger enterprises. There is one for creating images using the MDT, though.

In short, the tools allow you to create images, or capture them from reference PCs, then store them until required for deployment to new or refreshed PCs in your network. Why am I considering this automation? Well, the use of an image in the new(ish) Windows Imaging Format (WIM) allows you to update, service and add or remove features, drivers and programs in the image at any time. The tools can also be used to deploy VHD and VHDX files, allowing client PCs to boot from VHD too. All this would take a long time to configure at each machine you want to deploy to.
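
    That offline servicing is itself scriptable. As a hedged sketch (the paths and driver folder are hypothetical), the DISM PowerShell module that ships with the ADK and with Windows 8 / Server 2012 lets you mount a WIM, inject a driver and save the change without touching a live machine:

    ```powershell
    # Mount the first image inside the WIM to a local folder
    # (paths here are illustrative)
    Mount-WindowsImage -ImagePath 'C:\Images\install.wim' -Index 1 -Path 'C:\Mount'

    # Inject all drivers found under a folder into the offline image
    Add-WindowsDriver -Path 'C:\Mount' -Driver 'C:\Drivers' -Recurse

    # Commit the changes and unmount
    Dismount-WindowsImage -Path 'C:\Mount' -Save
    ```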

As with most tools that save you time in the long run, the deployment and configuration of this suite is not a small task, and it will involve a degree of learning the principles and processes, which can be confusing: there are capture images, install images, boot images and reference images, as well as thin, thick and hybrid types of image. Enough images for you?

Oh, and I am sure it won't surprise you to find out that MDT uses PowerShell to carry out all its tasks. As I have said ad nauseam, PowerShell is the future.

I don't have enough space this time to do a run-through of MDT and the ADK for developing and deploying images with WDS, but the tools are freely available on the internet and I will record a YouTube walkthrough when I get time. It may flow better that way.

    But all new tools take time, whether they be PowerShell, Azure Automation or any other new feature. That is why learning and certification is still such a good thing to be involved with. All of the products and features I have talked about today appear in Microsoft Official Curriculum Courses and in Microsoft Certification Exams too.

    With the landscape changing so often, it is wise to invest in your career by learning and certifying so that your employer or your prospective employer can have some benchmark to judge you by.

Use the MVA and the other training avenues wisely. For all things training and certification, you can use the many resources available to you at:

    • Microsoft Learning website
    • Born to Learn website
    • Microsoft Virtual Academy

    Watch this space for more on PowerShell DSC, Windows Server Technical Preview top five features and more.

    The post What are the best IT Pro Tools for automation – and why? appeared first on Blogg(Ed).

  • The Data Analysts Toolkit: Why are Excel and R useful together, and how do we connect them?

This article was commissioned by Jen Stirrup, a SQL Server fan and Microsoft MVP with a passion for Business Intelligence and Data Visualisation.

Why is analytics interesting? Well, companies are starting to view it as profitable. For example, McKinsey estimated analytics to be worth $100Bn today, and over $320Bn by 2020.

    When I speak to customers, this is the ‘end goal’ – they want to use their data in order to analyse and predict what their customers are saying to them. However, it seems that folks can be a bit vague on what predictive modelling actually is.

    I think that this is why Power BI and Excel are a good mix together. It makes concepts like Predictive Modelling accessible, after a bit of a learning curve. Excel is accessible and user-friendly, and we can enhance our stats delivery using R as well as Excel.

One area of interest is predictive modelling. This is the process of using a statistical model to predict the value of a target variable. What does this actually mean? Predictive modelling is where we work to predict values in new data, rather than trying to explain an existing data set. To do this, we work with variables. By their nature, these vary; if they didn't, they would be called constants.

    One pioneer was Francis Galton, who was a bit of an Indiana Jones in his day.  Although he wrote in the 19th century, his work is considered good and clear enough to read today. Therefore, this research has a long lineage, although it seems to be a new thing. We will start with the simplest: linear regression.

    Linear regression compares two variables x and y to answer the question, “How does y change with x?” For predictive modelling, we start out with what are known as ‘predictor variables’; in terms of this question, this would be x. The result is called the target variable. In this question, this would be y. Why would we do this?

    • Machine Learning
    • Statistics
    • Programming with Software
    • Programming with Data 
    • Fun!

    Why would businesses work with it at all?

    • to discover new knowledge and patterns in the data
    • to improve business results 
    • to deliver better customised services

If we have only one predictor variable, and the response and the predictor variable have a linear relationship, the data can be analysed with a simple linear model. When there is more than one predictor variable, we use multiple regression. In this case, our question would be: “How does y change with multiple x?”
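
    In standard notation (the symbols here are the conventional ones, not taken from the article), the two models can be written as:

    ```latex
    % Simple linear regression: one predictor x
    y = \beta_0 + \beta_1 x + \varepsilon

    % Multiple linear regression: p predictors
    y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_p x_p + \varepsilon
    ```

    Fitting the model means choosing the beta coefficients that minimise the ‘noise’ term ε across the data set.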

In fitting statistical models in which some variables are used to predict others, we want to find that the x and y variables do not vary independently of each other, but that they tend to vary together. We hope to find that y varies as a straight-line function of x.

    If we were to visualise the data, we would hope to find a pleasing chart which shows y and x relating to each other in a straight line, with a minimal amount of ‘noise’. Visualising the data means that the relationship is very clear; analysing the data means that the data itself is robust and has been checked.

    I think that's why, in practice, Power BI, Excel and R work well together. R has some great visualisations, but people are very comfortable with Excel for visualisations. All that loading-packages stuff you have to do in R… it doesn't work for everyone. So we use R and Excel, at a high level, as follows:

    • We cleanse and prepare data with Excel or Power Query
    • We use RODBC to load data into R
    • We analyse and verify the data in R
    • We build models in R
    • We load the data back into Excel using RODBC
    • We visualise the data for results

Excel is, after all, one of the world's most successful software applications ever, with reputedly over one billion users. Using them both together means that you get the best of both worlds: R for analysis and model building, Excel as the ‘default’ for munging data around and visualising it. I'm sure that one of the most popular buttons in software such as Tableau, QlikView et al. is the ‘Export to Excel’ or ‘Export to CSV’ functionality. I'd be interested to know what people think about that!
    Building linear regression models in R is very simple; in our next session, we will look at how to do that, and then how to visualise it in Excel. Doing all this is easier than you think, and I will show you how.

    Article originally written for Jen's Blog.

  • Five Ways Cloud Could Benefit Your Business

Vadym Fedorov is a Solutions Architect at SoftServe Inc., a leading global software application development and consulting company, and has 12 years' experience in enterprise application development, as well as 2 years' experience in cloud and operations optimisation.

As today's businesses strive for better cost optimisation and faster time-to-market, cloud computing seems to be an ideal solution delivering just that. But its potential is actually much larger. In this brief article, I'll discuss five key immediate benefits that cloud computing could bring to businesses that haven't migrated yet.

    1. Cost Optimisation

The primary benefit of cloud migration lies in the financial gains: the move from capital expenditure (CapEx) to operational expenditure (OpEx) within the "pay as you go" model, meaning you pay for the utilised resources only. Before the cloud-computing epoch, a company had to buy dedicated hardware and software and depreciate them over time, making it hard to respond to arising business demands such as a significant increase in the number of customers. Businesses had to plan their infrastructure ahead and utilise it during the depreciation period. Now you can save time, money and tedious planning effort, paying for what you receive and use instead of paying ahead of time.

    To optimize operational costs, cloud providers like Microsoft Azure additionally offer self-service or “use on-demand” IT services delivery.

    2. Business Agility

    Business agility is the second key benefit of cloud computing. Clouds bring adaptability and simplicity to the implemented solutions by introducing the concept of an elastic IT environment: cloud providers offer access to highly scalable elastic compute environments with the capacity adjustable based on the actual demand.

    3. Easy and Quick Access to Additional Resources

The third benefit the cloud provides is easy access to computing, storage and network resources, which eliminates additional dependencies and accelerates time-to-market. A customer can request computing and storage resources or services within the optimal time to accommodate demand. For example, on Microsoft Azure, customers can request up to 50 virtual machines. This covers the needs of the majority of customers; however, if you need more, it's possible to contact the support team and increase the limits.

    4. Risk Mitigation

Every organisation should have a crisis management plan ready in case a disaster strikes or an unexpected outage occurs. Often organisations are required to keep software operations running even if such a force majeure happens. While the cost of providing high-availability systems and recovery mechanisms in-house can be very high, the cloud is a great solution for ensuring business continuity and timely, effective disaster recovery. The cloud provides the following tactics for disaster recovery:

• Backup and Restore – storing data backups outside your datacenter, in the cloud
    • Pilot Light – keeping a copy of the critical core components of your infrastructure in the cloud and performing regular data synchronisation. In case of a disaster, it's possible to quickly restore a complete system in the cloud
    • Multi-Site Solution deployed in the cloud and on-site – running your infrastructure both on-site and in the cloud in an active-active configuration.

As this pain point is especially important for small businesses with limited budgets, it helps that implementing such a crisis precaution is quite affordable. For example, the cost of backup on Microsoft Azure is: the first 5 GB per month free of charge, then $0.20 per GB per month above 5 GB. For such small amounts of money you receive secure and reliable geo-replicated storage which maintains six copies of your data across two Azure datacenters and ensures 99.9% service availability.

    5. Geo-Distributed Data Centers

The fifth business benefit that cannot be overlooked is the cloud's potential geographical reach. We live in a global world where IT services are delivered to users located worldwide, both within global organisations and as public services like video hosting or web conferencing. The primary challenge here is how to deliver content or a service with stable, low latency to geo-distributed user locations on different continents. Large cloud providers have their data centres geo-distributed and interconnected with high-throughput network pipes to bring data as close to the end users as possible.

The key takeaway is: if your company hasn't migrated to the cloud yet, it's high time to do it now, as the benefits largely outweigh any migration challenges that might arise (read my most recent article on possible cloud migration challenges and how to address them).

  • Cloud, VPS or Shared Hosting – What`s Best for Your Business?

Vadym Fedorov is a Solutions Architect at SoftServe Inc., a leading global software application development and consulting company, and has 12 years' experience in enterprise application development, as well as 2 years' experience in cloud and operations optimisation.

Cloud computing, and the ways cloud technology can address business needs and provide a competitive edge, remains a hot topic in the industry.

The first question many decision makers ask is "Why do we need to go Cloud if there are well-established and reliable VPS and Shared Hosting offerings in abundance?" They might even be less expensive. The question is a fair one, as Cloud, VPS and Shared Hosting offer very similar approaches to application deployment and hosting. However, not all three options are equally good for different businesses:

    • Shared Hosting is the cheapest solution and the most popular for web site deployment. The deployed web sites share server CPU, RAM, bandwidth, and other resources. The customer has no control over server performance and resource utilization. The hosting provider manages the servers.
    • Virtual Private Server (VPS) uses virtualization technology. The customer controls the virtual server and the applications, but not the server hardware. The customer cannot easily provision more virtual servers (scale out).
• With Cloud Hosting, the customer gets virtual server instances with full control over server software configuration and the capability to start and terminate virtual instances, performing scale-up and scale-out to accommodate needs in performance, resource allocation and application availability.

In comparison with shared hosting and (in some cases) VPS hosting, cloud hosting can be more expensive. However, the cloud provides the ability to manage resources based on customers' needs and optimise total costs in the longer run.

    Here are a couple of typical business scenarios where Cloud is the right tool to use:

1. An unpredictable load on your servers is expected. This is often the case for internet advertising companies implementing marketing campaign websites, social network companies, and so on. The scalability provided by the cloud and the "pay as you go" billing model can help scale the site at peak load times with minimal cost.
    2. A new startup business needs to set up an IT infrastructure to operate. The upfront investment in hardware, software and a data center can be a burden for a company working almost without funding. In this case, going with cloud technology can be a better option, as capital expenditure (CapEx) can be significantly reduced and expenses can be moved to operational expenditure (OpEx). Software licenses are often included in the service price. From the technical point of view, cloud technologies offer quick access to unlimited compute and storage resources. Cloud solutions in this case play the role of business accelerators, speeding up your service delivery with predictable costing.
    3. A company wants to optimize the existing IT infrastructure costs. The cost of the IT infrastructure in a traditional data center includes server hardware, network hardware, hardware maintenance, power and cooling, data center space and personnel. Upon switching to a cloud provider, a company pays for the utilized resources only. The virtualization technology that cloud providers use provides the ability to build an elastic environment as well as manage environment capacity and costs based on business demand.

For businesses that strive for higher agility in operations, want full control over their infrastructure capacity, and need to manage infrastructure and operations costs predictably based on current needs, the cloud is the platform of choice.

  • Upcoming Events

    Microsoft official and Community tech events coming your way this November and beyond.
    Which event are you going to? Let us know via @TechNetUK. 

    Featured Event

    1. Azure IaaS for IT Pros

    Join Mark Russinovich, Microsoft Chief Technology Officer, Azure, as he kicks off a week of Azure training for IT Professionals on the 1st - 4th of December. Over the course of the four days, Senior Technical Evangelist Rick Claus and members of Azure Engineering will dive deep into technologies critical for IT Pro Implementers, like you, to help you better understand and build your foundational cloud skills. Experts share their deep technical insights on these topics and help prepare you to take Exam 70-533: Implementing Microsoft Azure Infrastructure Solutions for Microsoft Azure Specialist Certification.

    Register here

    Don't miss out!

Register through Microsoft Virtual Academy to receive reminder emails for this event and to obtain details for receiving a voucher for 50 percent off the exam price if taken by January 31st. Join the conversation on Twitter using #LevelUpAzure.

    2. TechDays Online

Microsoft TechDays Online is back. This is the fourth edition of our three-day online technical conference for IT Pros and Developers who are keen to hear the latest on Microsoft products and development platforms for cloud, mobility, apps, IT infrastructure and much more. Our virtual technology event provides a unique opportunity for IT Pros and software developers to hear about and experience the latest developments across the wide range of Microsoft products and platforms, from our latest devices to our hyper-scale cloud.

    The conference programme will be delivered by Microsoft specialists, technical professionals from the IT Pro and developer communities, customers and partners. All sessions will be fully interactive providing delegates with the opportunity to engage directly with the presenters and the other participants in the conference.

    In addition, the conference programme will include highlights from the Microsoft Future Decoded conference (November 2014) with fresh updates and insights into the latest developments in key Microsoft technologies.

    Day 1 (Tuesday February 3rd) – Client, Devices & Mobility
    Day 2 (Wednesday February 4th) – Server & Cloud
    Day 3 (Thursday February 5th) – Developer & Tools

    Register here

     Upcoming Events 
     1st - 4th December, Online: Azure IaaS for IT Pros - Join Mark Russinovich, Microsoft Chief Technology Officer, Azure, as he kicks off a week of Azure training for IT Professionals. Over the course of four days, Senior Technical Evangelist Rick Claus and members of Azure Engineering dive deep into technologies critical for IT Pro Implementers, like you, to help you better understand and build your foundational cloud skills. Register here
1st – 5th December, London: NDC London - We are now ready to repeat our success with NDC London 2014. This year we have decided to host the Pre-Conference Workshops at the Crowne Plaza Hotel in Docklands, while the conference will be at the ICC Suites at ExCeL. We hope to see you there! Register here
     3rd December, Online: LIDNUG & Scott Guthrie – Open Q&A - Scott Guthrie is back! Yes, that's right - we got a 60 minute open Q&A scheduled with Scott Guthrie, Corporate VP at Microsoft. Register here
4th December, Online: LIDNUG: ASP.NET MVC for Webform Developers with Walt Ritscher - This seminar examines ASP.NET MVC from a WebForm developer perspective, focusing on the key differences and new concepts inherent in the MVC platform. Topics include routing, controllers, action methods, Razor views, model binding, HTML helpers, input validation and view templates. Register here
     9th – 11th December, Paris: LeWeb - Founded in 2004 by French entrepreneurs Loïc and Geraldine Le Meur, LeWeb is an internationally-renowned conference for digital innovation where visionaries, startups, tech companies, brands and leading media converge to explore today’s hottest trends and define the future of internet-driven business. Register here
     10th December, Leeds: Architecture Forum in the North (7) - Our Architecture Forum returns, now in its 7th year, and Black Marble and Microsoft once again invite you to join us for a unique opportunity to learn about the latest technologies and best practices from luminaries in the field of computing, with speakers including: Andrew Fryer (Microsoft), Phil Winstanley (Microsoft), Simon Carter (BAE Systems), Andrew Barrett (Coalfire), Jonathan Woodward (Microsoft), and Gary Short (Data Scientist). Register here
     10th December, London: LEGup Christmas Meetup - Come along for talks, competitions and prizes (and then drinks) at our Christmas LEGup event. We have three great speakers confirmed including Drew Buddie, who is a Learning Technologist. He will be talking about how subject specific games are and can be used in the classroom. Register here
10th December, Webinar: Windows Server 2003 End of Support Webinar - This free event will give you an understanding of the technology available today to migrate your 2003 applications, how to discover your datacenter infrastructure, and what this can mean for your business. You will also be able to create a 2003 EOS project plan. Register here
     17th December, London: Xamarin Re-Evolved and porting from Windows Phone - We're excited to be joined again by Dominique Louis and Michael James from Xamarin. They'll be recapping the important announcements from the Evolve conference and show us how to make the most of taking an app that is currently on Windows Phone and using Xamarin's tools reach other platforms too. Register here
     15th January, Birmingham: SQL Server User Group - We'll be at the Midlands Art Centre. Sessions to be announced closer to the time! Register here
     3rd – 5th February, Online (GMT time): TechDays Online 2015 - Our virtual technology event provides a unique opportunity for IT Pros and software developers to hear about and experience the latest developments across the wide range of Microsoft products and platforms from our latest devices to our hyper-scale cloud. The conference programme will be delivered by Microsoft specialists, technical professionals from the IT Pro and developer communities, customers and partners. Register here

     

    Be sure to keep up to date on TechNet social for more regular event updates. Why not tweet us and let us know which event you’re going to!

  • Lab Ops – Working with Azure VMs

A couple of months ago I had a very interesting chat with Andy, a data centre admin at CP Ltd, about using PowerShell to manage Azure, as Andy is partially sighted and his teammate Lewis is blind (for more on this, please read Andy's post on the TechNet UK blog). I wanted to go into some of the PowerShell in a little more detail so that you can be as good an administrator on Azure as these guys are. For this post I am going to assume you know how to work with Azure and are familiar with concepts like storage, cloud services and networking, though you will get the idea if you follow along!

    Firstly, to get working with PowerShell on Azure we need to get hold of the Azure PowerShell cmdlets, and remember to check back regularly as they change as Azure itself changes.

    Before we can run any PowerShell against our subscriptions we need to set up some sort of trust, otherwise anyone could create services against our subscription.  The simplest way to do this is with

    Get-AzurePublishSettingsFile

    This will launch the Azure management portal and ask us to sign in, then save a publish settings file to our local machine, which we can consume like this:

    Import-AzurePublishSettingsFile -PublishSettingsFile "C:\AzureManagement\some filename.publishsettings"

    However, the problem with this approach is that it grants access to the whole subscription, which is fine for demos and labs. In production you’ll have some sort of Active Directory in place and you'll connect to that with:

    $userName = "your account name"
    $securePassword = ConvertTo-SecureString -String "your account password" -AsPlainText -Force

    $Azurecred = New-Object System.Management.Automation.PSCredential($userName, $securePassword)

    Add-AzureAccount -Credential $Azurecred

    Now we can run any of the PowerShell for Azure commands against our subscription, but before we can do much with Azure VMs we will need a storage account to store them in:

    $StorageAccountName = “lowercase with no spaces storage account name”

    $AzureLocation = “West Europe”

    New-AzureStorageAccount –StorageAccountName $StorageAccountName –Location $AzureLocation 

    where -Location specifies the data centre you want the storage account to reside in (e.g. West Europe), and Get-AzureLocation will list all the data centres you can choose from. Now we have a storage account, we need to declare it as the default location for our VMs:

    $SubscriptionName = (Get-AzureSubscription).SubscriptionName
    Set-AzureSubscription -SubscriptionName $SubscriptionName -CurrentStorageAccountName $StorageAccountName

    If you are familiar with VMs in Azure you’ll know that by default each VM gets its own wrapper or cloud service, but in this demo I want to put three VMs into the same cloud service, which we can create with:

    $AzureServiceName = “This needs to be unique on Azure.cloudapp.net”

    New-AzureService -ServiceName $AzureServiceName –Location $AzureLocation -Description "Lab Ops cloud service"

    Before we can create any VMs we also need a network in place, and it turns out that the PowerShell support for this in Azure is pretty weak: all we can do is set up a network from an XML file of the form

    <NetworkConfiguration xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.microsoft.com/ServiceHosting/2011/07/NetworkConfiguration">
      <VirtualNetworkConfiguration>
        <Dns>
          <DnsServers>
            <DnsServer name="AzureDNS" IPAddress="10.0.0.4" />
          </DnsServers>
        </Dns>
        <VirtualNetworkSites>
          <VirtualNetworkSite name="My-VNet" Location="West Europe">
            <AddressSpace>
              <AddressPrefix>192.168.0.0/16</AddressPrefix>
            </AddressSpace>
            <Subnets>
              <Subnet name="My-Subnet">
                <AddressPrefix>192.168.10.0/24</AddressPrefix>
              </Subnet>
            </Subnets>
          </VirtualNetworkSite>
        </VirtualNetworkSites>
      </VirtualNetworkConfiguration>
    </NetworkConfiguration>

    You can hack this around, save it as something like VNet.xml, and then apply it to your subscription with:

    Set-AzureVNetConfig -ConfigurationPath "path to VNet xml file"
    $AzureVNet = "My-VNet"  # the VirtualNetworkSite name declared in the XML

    For more on how to manipulate this file with PowerShell rather than editing it by hand, have a look at Rik Hepworth’s (Azure MVP) blog.
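    As a quick sketch of the idea (the file path and the new site name here are just placeholders of my own), PowerShell’s [xml] accelerator lets you load the file, tweak it and re-save it before applying it:

    ```powershell
    # Load the network config as XML, rename the virtual network site, save and re-apply (sketch)
    [xml]$net = Get-Content "C:\AzureManagement\VNet.xml"
    $site = $net.NetworkConfiguration.VirtualNetworkConfiguration.VirtualNetworkSites.VirtualNetworkSite
    $site.name = "My-Renamed-VNet"
    $net.Save("C:\AzureManagement\VNet.xml")
    Set-AzureVNetConfig -ConfigurationPath "C:\AzureManagement\VNet.xml"
    ```

    Bear in mind that Set-AzureVNetConfig replaces the whole network configuration for the subscription, so keep everything you still need in the file.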

    Now to create those VMs we have more choices: we could use a template VHD of our own, but for now we will just use the gallery images, just as we can in the Azure management portal.  To do this we need to interrogate the gallery to find the right image with something like this:

    $AzureImage = Get-AzureVMImage | where ImageFamily -eq "Windows Server 2012 R2 Datacenter" | Sort-Object PublishedDate -Descending | Select-Object -First 1

    which will get the most recent gallery image for Windows Server 2012 R2 Datacenter edition. I can then consume this in a script to create a VM:

    $VMName = "LabVM1"  # an example name
    $AdminUser = "deepfat"
    $AdminPassword = "Passw0rd!"

    New-AzureVMConfig -Name $VMName -InstanceSize Medium -ImageName $AzureImage.ImageName | `
            Add-AzureProvisioningConfig -Windows -AdminUsername $AdminUser -Password $AdminPassword | `
            Set-AzureSubnet 'My-Subnet' | `
            New-AzureVM -ServiceName $AzureServiceName -VNetName $AzureVNet

    Note that if you want to embed these snippets in a script you’ll need to get clever and introduce some wait loops to allow the VMs to spin up.
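    As a sketch of such a wait loop (the polling interval is my own choice, and the "ReadyRole" status string is an assumption, so check the InstanceStatus values Get-AzureVM returns on your subscription):

    ```powershell
    # Poll the VM until it reports ready before moving on (sketch)
    do {
        Start-Sleep -Seconds 30
        $status = (Get-AzureVM -ServiceName $AzureServiceName -Name $VMName).InstanceStatus
    } until ($status -eq "ReadyRole")
    ```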

    By default when you create a VM a couple of endpoints will be created, one for RDP and one for PowerShell.  In reality you wouldn’t necessarily want this: you may have a site-to-site VPN, in which case these endpoints are redundant, or you might keep them on just one VM to manage the rest, or use Azure Automation instead.  We need to query for these ports because in a cloud service each VM shares the same DNS name but gets a different random port:

    $VM = Get-AzureVM -ServiceName $AzureServiceName -Name $VMName

    $VMPort = (Get-AzureEndpoint -Name PowerShell -VM $VM).port

    In Andy’s post he published a self-signed certificate to his cloud service, which is needed to enable a secure remote PowerShell session to the VM.  However, if we are just trying this in a lab then we can use the certificate that Azure automatically creates when a cloud service is created, as this is also trusted by the VMs in that cloud service by default.  We can then pull it down and trust it on our local machine with

    (Get-AzureCertificate -ServiceName $AzureServiceName -ThumbprintAlgorithm SHA1).Data | Out-File "${env:PUBLIC}\CloudService.cer"
    Import-Certificate -FilePath  "${env:PUBLIC}\CloudService.cer" -CertStoreLocation Cert:\LocalMachine\AuthRoot

    Now we have all the settings and permissions we need to set up a remote PowerShell session to our VM:

    $VMCred = new-object -typename System.Management.Automation.PSCredential -argumentlist $AdminUser, (ConvertTo-SecureString $adminPassword -AsPlainText -Force  )

    $VMSession = New-PSSession -ComputerName ($AzureServiceName + ".cloudapp.net") -Port $VMPort -Credential $VMCred -UseSSL -Name VMSession

    With this session we can now add roles and features, turn on firewall rules and so on, like this:

    Invoke-Command -Session $VMSession -ScriptBlock {
        Get-WindowsFeature Web-Server | Add-WindowsFeature -IncludeAllSubFeature}

    If we want to work on a SQL Server VM (there’s a bunch of gallery images on Azure with different editions of SQL Server on), it might be useful to enable SQL Server mixed mode authentication, in which case we need to pass parameters into the session. The simplest way to do this is by using the param() setting inside the script block with an -ArgumentList switch at the end (remembering to keep the parameters in the same order):

    Invoke-Command -Session $VMSession -ScriptBlock { param($VMCred, $VMName)
        #set SQL Server to mixed mode and restart the service in the process
        Get-SqlInstance -MachineName $VMName -Credential $VMCred -AutomaticallyAcceptUntrustedCertificates | Set-SqlAuthenticationMode -Mode Mixed -Credential $VMCred -ForceServiceRestart -SqlCredential $VMCred
    } -ArgumentList $VMCred, $VMName

    as this allows us to reuse the parameters we are already working with in the remote session and enhances readability.

    So that’s a quick introduction to some of the stuff that the more enlightened IT Professionals like Andy are using to make their lives easier. A lot of it works in your own data centre too (like the SQL Server setup at the end), so using Azure really is just an extension of what you are used to.

    Be warned: stuff keeps changing on Azure.
    For example, a lot of older examples use affinity groups in Azure to co-locate VMs, but these are on the way out, so I deliberately didn’t reference them here.  My advice is to be wary of older posts and to follow the Azure blog, particularly if what you are trying to do is still in preview.

  • Windows Server Technical Preview – My Favourite Features – Part 1

    Microsoft released the first Technical Preview of Windows 10 to much acclaim back in October. There have been three releases so far and we currently sit on the ‘last release of the calendar year’ – Build 9879.

    The Technical Preview is intended primarily for the enterprise to evaluate the changes and inform the development of new and evolved features of the client operating system. This is a brave and intelligent step. Most followers of Windows in an enterprise will know that Microsoft traditionally release their client and server platforms in pairs: XP/2003, Vista/2008, Win 7/2008 R2, Win 8/2012 and most recently Win 8.1/2012 R2.

    The dramatic changes inside Microsoft have not altered this pattern: a new server platform is being developed alongside Windows 10. This server is as yet un-named but is also in Technical Preview.

    If you have an MSDN subscription you can find it there in both ISO and VHD formats (the new Hyper-V Server is there too). If you do not subscribe then you can find it here. The new Remote Server Administration Tools for Windows 10 Technical Preview have also been released to allow you to remotely manage your new server from your new client. The RSAT can be found here. They are available in 32-bit and 64-bit flavours.

    For anyone interested in the Server Technical Preview, just about everything you could want to know can be accessed from this blog site. This is Jose Barreto’s blog; Jose is a member of the File Server team within Microsoft and has put together this invaluable survival guide. As you might imagine, it is Storage focussed but does cover most other areas too.

    There is one final way you can have a look at and run the Server Technical Preview, and that is as a virtual machine in Microsoft Azure. If you do not have an Azure subscription, again this is part of your MSDN benefit (MSDN is sounding more and more like good value). Otherwise you can sign up for a cost-free trial here.


    Windows Server 2012 was a huge leap in performance and function for the Windows Server family, and despite the familiar look and feel of the new server and most of its tools, there have been significant new features and improvements to old ones. BUT please remember the following when looking at and playing with this new server operating system.

    THIS IS A TECHNICAL PREVIEW: do not use it in production, and do not rely on it for any tasks you cannot afford to lose. Having said that, I have found it stable and reliable, as with the Windows 10 client; the difference being that I use the Windows 10 client on my main work machine and just about every other machine I use (with a couple of exceptions), whereas the server version is very definitely a test rig setup for me at present.

    So, what is new, and of those new things, what are my favourite features and why? This is the first post in a series examining major new functionality in the Technical Preview.

    In Server 2012 one of the big five features for me was Hyper-V Replica. The first new feature of the Technical Preview I want to describe is called Storage Replica.

    To quote the TechNet site, Storage Replica (SR) is a new feature that enables storage-agnostic, block-level, synchronous replication between servers for disaster recovery, as well as stretching of a failover cluster for high availability. Synchronous replication enables mirroring of data in physical sites with crash-consistent volumes ensuring zero data loss at the file system level. Asynchronous replication allows site extension beyond metropolitan ranges with the possibility of data loss.

    OK, that sounds a) like a lot of technical stuff and b) pretty exciting and revolutionary for an out-of-the-box, no-cost inclusion in a server operating system. So what exactly does it do, and how does it do it?

    Well, Server 2012 introduced the next version of SMB (SMB 3.0), which brought a vast number of performance and reliability improvements to file servers and storage, as well as to normal communications using the SMB protocol.

    In short the feature allows an All-Microsoft DR solution for both planned and unplanned outages of your mission-critical tasks. It also allows you to stretch your clusters to a Metropolitan scale.

    What is it NOT?

    • Hyper-V Replica
    • DFSR
    • SQLAlwaysOn
    • Backup

    Many people use DFSR as a disaster recovery solution; it can be used this way but is not well suited to it. Storage Replica is true DR replication, in either synchronous or asynchronous fashion.

    Microsoft have implemented synchronous replication in a different fashion to most other providers: it does not rely on snapshot technology but replicates continuously instead. This leads to a lower RPO (recovery point objective, meaning less data could be lost), but it also means that SR relies on the applications to provide consistency guarantees rather than on snapshots. SR does guarantee consistency in all of its replication modes.

    There is a step-by-step guide available here, but I have included some notes below for those who don’t want to read all 38 pages of it now. (Images are taken from that guide and from live screenshots.)

    The Technical Preview does not currently allow cluster to cluster replication.


    Storage Replica is capable of BOTH synchronous and asynchronous replication, as shown below. And anyone who knows anything about replication knows that to do this there must be some significant hardware and networking requirements.

    (diagrams: synchronous and asynchronous replication)

    So what are the pre-requisites to be able to use Storage Replica in a stretch cluster?

    The diagram below represents such a stretch cluster.

    (diagram: a stretch cluster spanning two sites)

    There must be a Windows Server Active Directory domain (the domain controllers do not need to run the Technical Preview).

    Four servers running the Technical Preview; all must be able to run Hyper-V and have a minimum of 4 cores and 8GB RAM. (Note that physical servers are needed for this scenario; you can use VMs to test server-to-server replication, but not a stretch cluster with Hyper-V.)

    There need to be two sets of shared storage, each available to one pair of servers.

    Each server MUST have at least one 10GbE connection.

    Ports open for ICMP, SMB (445) and WS-Man (5985) in both directions between all 4 Servers

    The test network MUST have at LEAST 8Gbps throughput and, importantly, a round-trip latency of less than or equal to 5ms. (This is tested using 1472-byte ICMP packets for at least 5 minutes; you can measure it with the simple ping command below.)

    (screenshot: ping output)
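    As a sketch, that latency test could be run from the command prompt like this (the server name is a placeholder of my own; -l sets the ICMP payload to 1472 bytes and -n 300 sends roughly five minutes of pings at the default rate of about one per second):

    ```shell
    ping -l 1472 -n 300 sr-srv02.contoso.com
    ```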

    Finally, membership in the built-in Administrators group on all server nodes is required.

    This is no small list of needs.

    The step-by-step guide demonstrates the setup in two ways, and is a total of 38 pages long.

    All scenarios are achievable using PowerShell 5.0 as available in the Technical Preview. Once the cluster is built it requires just a single command to build the stretch cluster.

    (screenshot: the single PowerShell command)

    You could of course choose to do it in stages using the New-SRGroup and New-SRPartnership CmdLets.
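    A hedged sketch of that staged approach is below. The server names, volume letters and group names are placeholders of my own, and the parameter set may well change between preview builds, so check the cmdlet help on your own build before relying on it:

    ```powershell
    # Create a replication group on each node, then join them in a partnership (sketch)
    New-SRGroup -ComputerName "sr-srv01" -Name "RG01" -VolumeName "D:" -LogVolumeName "E:"
    New-SRGroup -ComputerName "sr-srv03" -Name "RG02" -VolumeName "D:" -LogVolumeName "E:"
    New-SRPartnership -SourceComputerName "sr-srv01" -SourceRGName "RG01" `
        -DestinationComputerName "sr-srv03" -DestinationRGName "RG02"
    ```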

    If, like me, you do not have the hardware resources lying around to build such a test rig, you may want to try the server-to-server replica instead.

    This requires:

    Windows Server Active Directory domain (does not need to run Windows Server Technical Preview).

    Two servers with Windows Server Technical Preview installed. Each server should be capable of running Hyper-V, have at least 4 cores, and have at least 4GB of RAM. (Physical or VM is ok for this scenario)

    Two sets of storage. The storage should contain a mix of HDD and SSD media.

    (Note USB and System Drives are not eligible for SR and no disk that contains a Windows page file can be used either)

    At least one 10GbE connection on each file server.

    The test network MUST have at LEAST 8Gbps throughput and, importantly, a round-trip latency of less than or equal to 5ms. (This is tested using 1472-byte ICMP packets for at least 5 minutes; you can measure it with the simple ping command mentioned above.)

    (screenshot: ping output)

    Ports open for ICMP, SMB (445) and WS-Man (5985) in both directions between both Servers.

    Membership in the built-in Administrators group on all server nodes.

    NOTE – the PowerShell CmdLets for the server-to-server scenario work both remotely and locally, but only for creating the replica, not for removing or amending it with the Remove or Set CmdLets (make sure you run those CmdLets locally ON the server you are targeting for the group and partnership tasks).

    I do urge you to go off and read more about this solution and test it if you can, but remember things are not yet fully baked and will change with each release, AND do not use them in production yet. Read the guide for known issues as well; there are a few.

    Finally – why do I love this feature? No one likes to think of a disaster, but if you don’t plan for it, when it does happen it truly will be a disaster in every respect. Storage Replica allows a much cheaper but effective way of maintaining a current, accurate replica of data, either on a separate server or on a separate site within a stretch cluster.

    Still pricey on hardware and networking, BUT much cheaper than a full hot site DR centre with old style full synchronous replication.

    Watch this space for more Server Technical Preview hot features.


  • Licensing Logic: Visual Studio 2013 Licensing

    Part of the  Microsoft Licensing Logic series from the Microsoft Licensing team. 

    Visual Studio 2013 was released in October 2013 and remains Microsoft’s flagship application development suite. As there is so much more to VS 2013 than just a set of development tools, the licensing can be a little daunting. A good guide can be downloaded here and I’ll try to summarise the main points and products in this post.

    What’s the Difference between Visual Studio and MSDN?

    Visual Studio (VS) came into play in the mid-nineties when it brought together development tools such as Visual Basic, Visual C++ and Visual FoxPro (a product team I had the pleasure of working for many years ago). Visual Studio has since grown in capability from its core development capabilities, much like the Office suite, and now covers roles right across the Software Development Lifecycle. The Microsoft Developer Network (MSDN) is a subscription programme that builds on VS and extends it to include access to Microsoft products (current and previous), technical support, training, forums, developer access to Azure and Office 365 and much more.

    How do you buy Visual Studio?

    There are free offerings of Visual Studio components (Express Editions) which generally offer a subset of the higher editions and also the recently announced Visual Studio Community  edition. There are also two MSDN subscriptions that don’t include Visual Studio but are a good fit for individuals involved in the development and test process without needing access to any of the Visual Studio tooling.

    The recommended way for organisations to licence Visual Studio is through an MSDN subscription. The options range from Professional to Premium to Ultimate, each increasing in capability. The Test Professional edition is a specialist IDE designed for software testers and it will not support development. A comparison chart can be found here.

    Each of the four MSDN subscriptions above is available through all the VL programmes (Open, EES, EA, etc.) as well as full packaged product (FPP) and online through Microsoft. Buying Visual Studio Professional as a standalone is available through Select Plus, Open, FPP or online and is an option if the user doesn’t need access to development platforms (Windows, SQL Server, etc.) or the other benefits of MSDN.

    We covered Server and Cloud Enrolment (SCE) in a previous blog but it’s worth having a re-read because it offers discounted licences of Visual Studio for organisations that can commit to a minimum 20 Licences of any combination of VS Ultimate with MSDN and VS Premium with MSDN.

    Renewing MSDN subscriptions is far cheaper than the initial purchase because you are only paying for the software assurance (SA) component and it is possible to step-up or step-down between MSDN subscription levels. Typically, MSDN subscriptions through volume licensing will be coterminous with the existing VL agreement.

    What do I need to licence?

    Both Visual Studio and MSDN are licenced per user. The user can then install, run, design, develop and test their programs on any number of devices. These can be at work, home, clients’ sites or dedicated hosted hardware. The big restriction with the software obtained through MSDN is that it cannot be used for production use. In other words you can’t get Windows Server 2012 through MSDN and use it for the company infrastructure server; only for developing and testing. There are always exceptions and one is that users licenced for MSDN with Visual Studio Premium or Ultimate get a licence of Office Professional Plus 2013 for production use on one device (see below from the PUR).

    There are a couple of gotchas: even if you have a technician who simply installs the MSDN software for the development team, they will require an appropriate MSDN subscription. Despite the fact they’re not doing any development or testing, they are installing the MSDN software (this is counted as using it) and must be licensed. If they are installing production licences as opposed to the MSDN licences, they wouldn’t need a subscription.

    And the second gotcha, just to reinforce the non-production use limitation: with the MSDN subscription you can download the Windows client OS. But if you use it on a machine for anything other than development and testing (e.g. playing games or using Office) then you’re breaking the licence and you should be using a production licence of Windows client instead.

    There are circumstances including demonstration and user-acceptance testing, where the software can be used by non-MSDN subscribers.

    Other components that require licensing are Team Foundation Server, if you have a team of developers collaborating, along with its CALs; and Visual Studio 2013 has a new release management tool which can automate deployments to other servers – these servers need to be licenced with a Visual Studio Deployment licence (either Standard or Datacenter) and are licenced in the same way as System Center server management licences.

    Visual Studio Team Foundation Server

    Team Foundation Server supports the whole lifecycle of the software development process including version control, reporting and project management and is licenced in a server + CAL model. The good news is one server licence and one user CAL are included with Ultimate, Premium, Test Professional and Professional level MSDN subscriptions. Team Foundation Server and CALs can also be purchased standalone through volume licencing or retail channels.

    Visual Studio Online

    The cloud version of Team Foundation Server was named Team Foundation Service and that has now become Visual Studio Online; a complete application lifecycle management tool integrated with Windows Azure and available from any web browser. Sounds good but how is it licenced?

    The clue was ‘integrated with Windows Azure’, because Azure is the billing platform for VS Online. There is a limited amount of free VS Online use with the basic plan (5 users and up to 60 build minutes per month, which is the actual computing time required to run your build), but after that an Azure subscription is required. You can think of VS Online as more akin to Office 365, however, because it’s software as a service (SaaS): you don’t need to be concerned with Azure storage, databases, VMs, etc., since VS Online is a finished service.

    For the MSDN subscribers we’ve been talking about, there’s no charge; they get free access to Visual Studio Online.

    As you can imagine, there’s a lot more detail to licensing Visual Studio 2013 and there are regular updates and changes as new features emerge, so please listen into our monthly licensing spotlight calls where we cover this and other topics (you can view archived calls here).

  • Top tips on how to survive as a contractor

    By Asavin Wattanajantra, Editor, Microsoft UK Developers

    According to experts, 2014 is the year of the IT contractor. Increasing numbers of people are exploring a career path which doesn't follow the traditional full-time or permanent route.

    In the past, the use of contractors was simply put down to companies not wanting to commit to long-term employees. But increasingly, it's actually more to do with what technology professionals themselves are looking for. 

    Many don't want to be tied down to a single company – they like flexibility and the ability to move around. Working as a contractor can also be significantly more lucrative than a permanent position, while it also provides the opportunity to work on fresh new projects and keep skills current.

    But it's not for everybody. The flexibility it offers must also be counterbalanced with the lack of security in these types of roles, and the need to constantly look for new contracts to prevent yourself being stuck in between jobs with no position and no income.

    Finding the opportunities and improving your skills

    According to Computerworld's 2014 Salary Survey, almost half of IT managers hiring are looking for developers, with application developers most in demand. But skills are important: in a recent eWeek study, languages like Java, JavaScript, C/C++, PHP and Python ranked highly.

    As well as the lack of security, having the right skills is something contractors really need to keep in mind when looking for new roles (something they'll have to spend much more time doing than a person in a secure, full-time, permanent position).

    They'll also need to find ways to keep skills up to date, and potentially certified. There are various ways to get 'skilled up' in your spare time if you don’t have the time to train formally – there are free resources which a tech professional can use to teach themselves, from the likes of Microsoft and others.

    Arguably, however, the tougher prospect is finding roles which match the experience and remuneration candidates feel they deserve. This can be easier said than done. To do this you'll need to present the best version of yourself and advertise on all the available channels.

    CVs and branding

    Whatever skills you have aren't any use unless you are put in front of the right people, and this is why branding and advertising is so important. 

    Most obviously, you need to get your CV right. This is a 'living' document which will change constantly as you move between roles. In the end you'll need it to sell your capabilities and achievements in the best way you can, so that you get the interview.

    It's really important: a well-presented and effective CV can present you in a better light than somebody with more experience. And don’t forget the value of a good cover letter. Here are a few tips on getting things right.

    Social media is also a great tool for finding and applying for new roles, in particular LinkedIn. As well as being a great place to find hard-to-find positions, you can turn your profile into an advertisement of your skills and experience. The more you network and connect with people, the wider the pool of hirers you have contact with.

    If you’ve got more time and are willing to put the effort in, you can even make a push with content marketing, showing off expertise through blogs and social media. For example, LinkedIn offers a way to blog with its LinkedIn Influencers programme, which can increase your exposure significantly. You can also put the effort in creating a personal website and blog.

    Done right, managers and recruiters will be coming to you rather than you needing to come to them. And talking of recruiters... 

    Recruiters

    It'll be very surprising if you don't make use of recruiters in your contracting odyssey. They can be extremely useful – they might have knowledge of and access to roles which aren't advertised, and will spend time looking for roles which they think are suitable so that you don't have to – especially valuable if you're looking for a new contract while you are actively working.

    But it's likely that you'll already know some of the issues in working with recruiters. They are working for employers, not you. This means that they won't necessarily understand your CV and immediately fit you into the job roles you're looking for. And they won't necessarily have your best interests at heart.

    However, if you can build a good relationship with a recruiter, then you have a person who will actively do a lot of the hard yards when it comes to searching for the right roles - and it's in their best interests as they'll get rewarded when you sign on the dotted line.

    The Interview

    It is all well and good having the skills and getting them down on paper, but understandably, getting in front of people for the dreaded interview is a completely different matter. And a contractor will have to do this fairly regularly.

    Now there are masses of literature detailing the best ways to approach interviews. At its most basic, the things to think about are doing the preparation and research, trying to look the part, and showing a bit of confidence. Hopefully, if you have the right skills for the role, this won’t be too much of a problem.

    Dealing with money

    If you're at all serious about embarking on a career as a contractor - get an accountant. Although they'll cost you, it'll be money well spent considering a good one will get you through the complicated maze that is the British tax system. 

    There will be various ways in which you can set yourself up, such as becoming a limited company, joining an umbrella company, or working as a virtual 'employee' for the company. More information on this can be found here.

    Whether you decide to pursue the flexibility and rewards that contracting offers, or the security and stability of a permanent role, a career in IT and development should be one that sets you up for years to come – the pace of technological change seems to be continually accelerating. Good luck!

  • Events: Sneak Peek at Windows 10 Enhancements and Azure Training

    Not one to miss! 

    Mark Russinovich and other Azure experts will be running an online event through the MVA in December. Do you want a sneak peek at enhancements in Windows 10? And/or do you want to get some Azure training to help you prepare for Exam 70-533: Implementing Microsoft Azure Infrastructure Solutions? Well, take a look at the events below: 

    What’s new in Windows 10?

    We can’t wait to show you! Join us for “Windows 10 Technical Preview Fundamentals for IT Pros,” on November 20th, and get a sneak peek! There are tons of enhancements designed to make it easier for you to do your job, and your end users will love the familiar UI.

    In this Jump Start training with live Q&A, join leading experts Simon May and Michael Niehaus, plus lead Product Managers, as they roll back the covers on the Windows 10 Technical Preview. Find out how management and deployment are evolving, and hear how new security enhancements in Windows 10 can help your organization respond to the modern threat landscape. Be sure to bring your questions!

    Course outline

    • Windows 10 Technical Overview
    • Windows 10 Management and Deployment
    • Windows 10 Security

    Register now! 

    Azure IaaS for IT Pros Online Event

    Join Mark Russinovich, Microsoft Chief Technology Officer, Azure, as he kicks off a week of Azure training for IT professionals from the 1st to the 4th of December. Over the course of four days, Senior Technical Evangelist Rick Claus and members of Azure Engineering will deep dive into the technologies critical for IT Pro implementers like you, helping you build your foundational cloud skills. Experts will share their deep technical insights on these topics, which will also prepare you to take Exam 70-533: Implementing Microsoft Azure Infrastructure Solutions for Microsoft Azure Specialist Certification.

    Topics include: 

    • Day 1:  Establish the Foundation: Core IaaS Infrastructure Technical Fundamentals

      • View from the CTO: Mark Russinovich, Chief Technology Officer - Azure
      • Azure IaaS Virtual Machines Inside Out
      • Optimize Your Windows Server Workloads on Azure
      • Inside IaaS Architecture Best Practices and Management
    • Day 2:  Dive Deep into Networking, Storage and Disaster Recovery Scenarios

      • Designing Networking and Hybrid Connectivity Infrastructure
      • Deep Dive Into Storage Using Azure Backup, Data Protection Manager, StorSimple, and InMage
      • Planning Disaster Recovery, Migration and More
      • Learn the Ins and Outs of Azure Automation, PowerShell and Desired State Configuration
    • Day 3:  Embrace Open Source Technologies (Chef and Puppet Configurations, Containerization with Docker, and Linux) to Accelerate and Scale Solutions

      • How to Deploy Linux and OSS on Azure
      • Leverage Existing Chef / Puppet toolsets for management 
      • How to Implement Containerization with Docker to Increase Density and Performance of Virtual Machines
      • Lift and Shift Your Linux Solutions to Azure
    • Day 4:  Optimize Windows Workload Architecture and Administration Capabilities Within Azure

      • Identity Solutions: Leveraging Azure Active Directory / Active Directory Premium
      • Azure Websites: Manage Your Websites not Your VMs
      • Leveraging SQL Azure for Your Solutions to Increase Scale
      • Architecting SharePoint for the Cloud

    Register here!
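
    If Desired State Configuration is new to you, it is worth a first look before the event. The sketch below shows the basic shape of a DSC document: you declare the state a node should be in, compile it, and let the Local Configuration Manager enforce it. The configuration name and output path here are illustrative examples, not taken from the event materials.

    ```powershell
    # Minimal DSC sketch: ensure IIS is installed on the local machine.
    # 'EnsureIIS' and C:\DSC are hypothetical names chosen for this example.
    Configuration EnsureIIS
    {
        Import-DscResource -ModuleName PSDesiredStateConfiguration

        Node 'localhost'
        {
            # Declare the desired state; the Local Configuration Manager
            # makes it so and keeps it so.
            WindowsFeature IIS
            {
                Ensure = 'Present'
                Name   = 'Web-Server'
            }
        }
    }

    # Compile the configuration to a MOF file, then apply it.
    EnsureIIS -OutputPath C:\DSC
    Start-DscConfiguration -Path C:\DSC -Wait -Verbose
    ```

    The declarative style is the point: you describe the end state rather than scripting the steps, which is exactly the mindset the Azure Automation session will build on.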

    Don't miss out!

    Register through Microsoft Virtual Academy to receive reminder emails for this event and details of how to obtain a voucher for 50 percent off the exam price if it is taken by January 31st. Join the conversation on Twitter using #LevelUpAzure.