• MBAM Configuration Manager reports data is repetitive

    Consider the following scenario: you have deployed Microsoft BitLocker Administration and Monitoring (MBAM) 2.5 with the integrated topology, meaning MBAM is integrated with Configuration Manager. You have deployed the MBAM Group Policy, all the clients have started to report in, and you are ready to check the compliance status of these machines. You browse the reports via Configuration Manager or via the SSRS Reports URL, and you see the following chart with a legend that doesn't really make sense.

    You see some percentage information but cannot really tell what is what from the figure. Why or how did this happen? If you have modified the MBAM-related RDL files using Report Builder, you will end up with this issue: Report Builder modifies the report schema, causing the report to display erratic information.

    Now that I have explained what the issue is and why it happened, how do you fix it? There is no easy way to undo the schema changes caused by Report Builder. Below are the steps we need to follow to restore the MBAM reports. If you need to edit the RDL files, using Notepad or some other plain text editor is advisable.

    Step 1:

    You first need to delete the MBAM folder from CM Reports.

    Step 2:

    Under the following registry key, modify the value of CMIntegration to 0:


    Step 3:

    Now enable the Configuration Manager integration reports using PowerShell.

    Enable-MbamCMIntegration -BitLockerProtectionBaselineLogicalName <String> -FixedDataDriveConfigurationItemLogicalName <String> -OperatingSystemDriveConfigurationItemLogicalName <String> -ReportsCollectionID <String> -ReportsOnly [-SsrsInstance <String> ] [-SsrsServer <String> ]

    You can obtain the logical name strings by viewing the BitLocker Protection baseline XML definition: under \Assets and Compliance\Overview\Compliance Settings\Configuration Baselines\BitLocker Protection, right-click and choose View XML Definition.

    Step 4:

    Once the reports are enabled in Configuration Manager, verify that the MBAM reports are viewable and display as expected.
    If you need to modify any MBAM Configuration Manager integrated reports, avoid using Report Builder and use Notepad instead. That way, no schema changes are made and the reports will stay intact.

    Good Luck!

    Naziya Shaik
    Support Escalation Engineer
    Microsoft Enterprise Platforms Support

  • Troubleshooting Common Surface Pro 3 Issues Post Deployment

    With the launch of Surface Pro 3, enterprises have been testing and deploying them. Almost all deploy a customized image to the Surface Pro 3, and sometimes they hit a roadblock. Today, I will talk about some of the basic things to check that can help narrow down the issues.

    Before we get to that, I would like to point out a couple of articles/blogs that everyone should refer to before deploying Surface Pro 3. One of my colleagues, Scott McArthur, has an excellent blog on deploying Surface Pro 3 using MDT. I would highly recommend reading through it.

    Deploy Windows to Surface Pro 3 using Microsoft Deployment Toolkit

    We also have an updated Deployment Guide available for download.

    Deployment and Administration Guide for Surface Pro 3

    Now, on to troubleshooting issues.  The first question we want to ask is:

    Can the issue be reproduced on a Windows tablet, PC or Virtual Machine?

    If the issue can be reproduced on any other Windows tablet, PC, or VM, then most likely it is a software issue and we treat it as a regular Windows 8.1 case. As such, we would troubleshoot it as we would any other Windows issue.

    However, if the issue presents itself only on the Surface Pro 3, we need to narrow it down to the factory image or the customized image that is being deployed. If the issue happens with the factory image, it would be a good idea to engage Microsoft.

    When it happens only with the customized image, we need to narrow it down further to determine whether it is application, driver, or OS related.

    It starts with a supported operating system. The chart below, based on KB2858199, represents the supported operating systems. Please refer to the KB for any updates to this policy.


    Make sure the device is up to date with the latest drivers and firmware. Driver and firmware updates are available via Windows Updates. They are also available for download from the following link.

    Surface software, firmware, and drivers

    In addition, the following link lists the fixes that are included with these updates.

    Surface Pro 3 update history

    Note:

    Generic versions of drivers should be avoided and not included in Surface Pro 3 deployments. Surface Pro 3 drivers are written specifically for the device, and other drivers are not optimized for the power management technology we use in the Surface. Using a generic driver can cause all sorts of issues, such as crashes, reduced battery life, and an unstable system.

    Once we know the OS that is being deployed is correct and we have the latest drivers and firmware, we would want to ask some additional questions:

    Can the issue be reproduced if we simply deploy the OS imported from an .iso and no other applications installed?

    In other words, if we install Windows using a USB which has a Windows 8.1 Enterprise .iso and try to reproduce the issue, do we have it?

    If not, we know it is one of the applications being deployed.  The next step is to install one application at a time to narrow down further.

    For example, we have three applications that are installed as part of post install task sequence. Let us call them:

    Application 1
    Application 2
    Application 3

    We install Application 1 and test the behavior. If we do not see the issue, we proceed with Application 2 and so on. If the issue reproduces after we install Application 2, then it is certain that there is some compatibility issue with Application 2. At that point, contact the application vendor for an update or check if it is compatible with Windows 8.1.
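    The sequential test above is essentially a linear search for the first application that breaks the image. A minimal sketch of the idea (the application names and the `is_app_compatible` probe are hypothetical stand-ins for an actual install-and-test cycle):

```python
def find_incompatible_app(apps, is_app_compatible):
    """Install applications one at a time (in task sequence order) and test
    after each install; return the first app that reproduces the issue."""
    for app in apps:
        # In practice: deploy the app, then try to reproduce the issue.
        if not is_app_compatible(app):
            return app
    return None  # all apps installed cleanly

# Example run where Application 2 is the incompatible one:
apps = ["Application 1", "Application 2", "Application 3"]
print(find_incompatible_app(apps, lambda a: a != "Application 2"))  # → Application 2
```

    When the application list is long, a binary-search variant (install half the apps at a time) can cut down the number of test cycles.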

    A good practice would be to check and make sure that all the applications that are being included are compatible with Windows 8.1. Also, obtain updates for them if they are available.

    The issue can be reproduced with only OS installed along with drivers.

    In this scenario, if using MDT/ConfigMgr, does the driver package contain only the drivers for Surface Pro 3, or does it have drivers for other hardware too?

    As I have already mentioned above, Surface Pro 3 drivers are specifically written and optimized for the Surface Pro 3 device. We often see cases where, during deployment, a wrong driver is picked and issues appear post deployment. To make sure it's not driver related, create a new driver package (if using MDT/ConfigMgr) with only Surface Pro 3 drivers and test the deployment. The blog I mentioned above gives you an idea of how the folder structure should look for drivers. If you used that blog to set up your environment, the chances of having issues with drivers are slim.

    In case you do not have the structure mentioned above, here is what you can do as part of troubleshooting. It is similar to what has already been talked about in the blog above.

    Here, I am using MDT 2013 with ADK 8.1 Update installed on Windows Server 2012 R2 Update with WDS.

    Create a folder for Surface Pro 3 drivers called “SP Drivers”. You can download the latest drivers here.


    Next is to create a Selection Profile for the drivers.


    Create a new task sequence for deploying Windows 8.1 and modify it to point to the selection profile created above.


    Deploy this task sequence and test the behavior.

    Device unexpectedly reboots to UEFI screen or hangs at UEFI screen during startup when undocked.

    One of the common causes is an incorrect storage driver in use. The correct driver as of this writing is STORAHCI.SYS.


    It is also available to download in the Surface Pro 3 driver pack here and is located under folder "..\Surface Pro 3 - January 2015\Intel\SATA_AHCI\".

    If you do have machines that do not have the correct controller driver, download the driver mentioned above and update.

    Device unexpectedly reboots to UEFI screen or hangs at UEFI screen during startup when docked.

    In this case, we undock the machine and see if the issue can be reproduced. If it can, then check the above point for a possible cause.

    We also want to remove any external devices connected to the docking station and see if the issue persists.

    Is the issue related to Power Management?

    When you deploy a customized image, the Surface Pro 3 is not configured to hibernate after four hours. This issue is documented in KB2998588, and there is a blog on how to incorporate the commands in MDT.

    Surface enters connected standby after 1 minute when PC is locked.

    The above scenario is true irrespective of whether the device is connected to AC power. Some organizations do not want the device to enter connected standby or sleep when the Surface is docked. To work around this behavior, configure the device with the Powercfg.exe commands mentioned in KB2835052.

    The below commands can be run as part of a task sequence.

    powercfg.exe /setacvalueindex SCHEME_CURRENT SUB_VIDEO VIDEOIDLE <time in seconds>
    powercfg.exe /setacvalueindex SCHEME_CURRENT SUB_VIDEO VIDEOCONLOCK <time in seconds>
    powercfg.exe /setactive SCHEME_CURRENT

    The VIDEOIDLE timeout is used when the PC is unlocked and the VIDEOCONLOCK timeout is used when the PC is at a locked screen.

    Note:

    These commands set the timeout used when the system is plugged in and using AC power. To set the timeouts used when on DC (battery) power, use the /setdcvalueindex switch instead of /setacvalueindex.
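    For example, with a 20-minute idle timeout and a 60-minute locked-screen timeout, the AC command lines from KB2835052 would look as below. This is just a sketch that assembles the command strings; the timeout values are arbitrary examples:

```python
def powercfg_timeout_cmds(idle_seconds, lock_seconds, on_battery=False):
    """Build the powercfg command lines for the display timeouts.
    on_battery=True switches to /setdcvalueindex for DC (battery) power."""
    switch = "/setdcvalueindex" if on_battery else "/setacvalueindex"
    return [
        f"powercfg.exe {switch} SCHEME_CURRENT SUB_VIDEO VIDEOIDLE {idle_seconds}",
        f"powercfg.exe {switch} SCHEME_CURRENT SUB_VIDEO VIDEOCONLOCK {lock_seconds}",
        "powercfg.exe /setactive SCHEME_CURRENT",
    ]

for line in powercfg_timeout_cmds(1200, 3600):  # 20 min idle, 60 min on lock screen
    print(line)
```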

    Then we can change the connected standby / sleep timeout value using Group Policy preferences.

    That can be configured under Computer Configuration -> Preferences -> Power Options.

    Use the Power Plan to control when the device goes to Connected Standby / Sleep using “Turn Off display after” setting:


    I hope this information helps when working through deploying Surface Pro 3.

    Thank you,
    Saurabh Koshta
    Support Escalation Engineer

  • The Four Stages of NTFS File Growth, Part 2

    A few years ago I wrote a blog entry entitled, “The Four Stages of NTFS File Growth”.

    This attempted to explain what happens to a file as it gains complexity, complexity being akin to fragmentation.

    If you have not read the above mentioned blog entry, please do so now. This information will not make the slightest bit of sense unless you read my earlier post. I’ll wait.

    Welcome back.

    Since its posting, I have answered a number of questions, mostly about the structure called the attribute list. So today I want to cover this a little more in depth to hopefully address some of those questions.

    In the previous blog entry, I explained how very complex files had the potential of creating an attribute list (shown below).


    The base record and all the child records are each 1kb in size. Each child record keeps track of a portion of the file’s data stream. The more fragmented the data stream, the more mapping pairs are required to track the fragments, and thus the more child records will be created. Each child record must be tracked in the attribute list.

    Keep in mind that the child records can hold much more than just two mapping pairs. This is just simplified to keep the diagram from being completely unreadable.

    The problem is the attribute list itself. It is NOT a child record; it is created using free space outside the Master File Table (MFT). A file’s attribute list has a hard limit on how large it can grow. This cannot be changed. If it were, it would break backward compatibility with older versions of NTFS that wouldn’t know how to deal with a larger attribute list.

    NOTE: The diagram shows the attribute list as being smaller than the 1kb file record. And while it is true that it starts out that way, the upper limitation of the attribute list is 256kb.


    So it is possible to hit a point where a file cannot add on any additional fragments. This is often the case when the following error messages are encountered.

    • Insufficient system resources exist to complete the requested service
    • The requested operation could not be completed due to a file system limitation

    What these messages are trying to tell us is that the attribute list has grown to its maximum size and additional file fragments cannot be created.

    To put this into perspective, this isn’t simply about file SIZE. It has to do with how fragmented the file is. In fact it is very hard to MAKE happen. There are really only two scenarios where it is somewhat common.

    • Compressing very large files, like virtual hard disks (VHD)
    • Very large SQL snapshots, which are sparse

    Both compressed and sparse files introduce high levels of fragmentation because of how they are stored. So very large files that are also sparse or compressed run the risk of hitting this limitation. To add to the problem, you cannot clear this up by running defragmentation/optimization. Sparse and compressed files are going to be fragmented.

    The good news is that we figured out a way around this. The bad news is that it isn’t really well understood.

    It really starts with this hotfix.

    Installing the hotfix doesn’t resolve the issue by itself. What this hotfix does is give us the ability to create instances of NTFS that use file records that are 4kb in size, rather than the 1kb that NTFS has used for the longest time.

    How is this possible? If we can’t change the size of the attribute list, how can we change the size of file records?

    The attribute list is a hard coded limitation. Microsoft made the decision, for performance reasons, that we really should keep a lid on how big the attribute list should grow. On the other hand, file record size is self-defined. By default, the size is defined as 1kb, but records could be other sizes, as long as all the records in a volume are the same size.

    This was put to the test when 4kb sector hard drives started to become popular. Since you wouldn’t want a file record to be smaller than a sector, these 4kb sector drives were formatted to utilize a file record size of 4kb. That’s where the hotfix comes into the picture. In addition to being able to use 4kb file records on 4kb sector hard drives, an option was added to the FORMAT.EXE command to force it to create an instance of NTFS with 4kb file records, regardless of sector size.

    So why should we care about the size of the file records? Look at the diagram again.


    If the records are bigger, they can store more mapping pairs, and thus track more fragments. In theory, a file could have FOUR TIMES the number of fragments before running into the same issue.
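    A rough back-of-the-envelope sketch of that scaling. The attribute list entry size and mapping pair size below are simplified round numbers (real NTFS entries are variable length), so the absolute counts are only illustrative; the point is the 4x ratio between record sizes:

```python
def max_fragments_estimate(record_size, attr_list_limit=256 * 1024,
                           attr_list_entry=32, mapping_pair=8):
    """Illustrative estimate: child records tracked by a full attribute list,
    times the mapping pairs (fragments) each child record can hold."""
    child_records = attr_list_limit // attr_list_entry
    pairs_per_child = record_size // mapping_pair
    return child_records * pairs_per_child

one_kb = max_fragments_estimate(1024)   # 1kb file records (the old default)
four_kb = max_fragments_estimate(4096)  # 4kb file records (Format /L)
print(four_kb // one_kb)  # → 4
```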

    The catch is that the size of file records is set at the time of formatting. So if you have a volume that is running into this issue, you will need to do the following.

    1. Copy off your files
    2. Reformat the drive using the switch (Format /L)
    3. Copy the files back

    You can’t change the size of file records after the fact; it has to be set when formatting. That is why it is important to understand just what it is that we are changing before reformatting.

    This solves the problem in the short term. For the long term, other solutions were implemented to prevent fragmentation past a certain point. In the newer versions of Windows, NTFS will stop fragmenting compressed and sparse files before the attribute list reaches 100% of its maximum size.

    This should put the issue to rest once and for all. However, until everyone gets to Windows 8.1 or Windows Server 2012 R2, we will still run into this issue from time to time.

    For more information about 4kb sector drives, check out my article on Windows IT Pro.

    Robert Mitchell
    Senior Support Escalation Engineer
    Microsoft Enterprise Platforms Support

  • DST Reminder for this weekend…

    Hello Folks!  This morning's post is a friendly reminder that DST (spring forward) kicks in this weekend (March 8th at 2:00 AM – US).  Hopefully by now you are prepared and have the latest DST cumulative patch installed:

    December 2014 cumulative time zone update for Windows operating systems

    This particular update includes changes for Russia time zones, Fiji Standard time, and Cape Verde Standard time.  Per the More information section, “This is a cumulative update rollup that includes all previous Windows time zone changes.”


    Additional Resources


  • Step by Step instructions for installing RDS Session Deployment using PowerShell in Windows Server 2012 R2

    Hello AskPerf Readers! Dhiraj here from the Windows Performance team to talk about deploying RDS using Windows PowerShell on Windows Server 2012 R2.

    As you know, PowerShell has been around for quite a few years now (since November 2006, to be exact). Over the past 8 years, we have seen PowerShell become an integral part of Windows. One such example is deploying RDS within your environment. In this blog, we are going to walk you through setting this up. With that, let’s get rolling!

    Before we begin though, we need to import the RDS module using the Import-Module cmdlet:

    Import-Module RemoteDesktop


    We will use the New-SessionDeployment cmdlet to begin with the installation. Below is the syntax for this cmdlet:

    New-SessionDeployment [-ConnectionBroker] <string> [-WebAccessServer] <string> [-SessionHost] <string[]>

    Note: If you are installing the Session Host on the Connection Broker, you need to run this cmdlet on a remote server, as running it on the Connection Broker will give you the following error:


    The Session Host role needs a reboot after the install, and we receive the above error because PowerShell cannot resume the deployment after a reboot. However, this same process will work in the GUI.

    In this deployment, we will use 3 servers for the deployment:

    • RDCBWA – RD Connection Broker, RD Web Access, and RD Session Host
    • RDSH01 – Second RD Session Host
    • DC01 – RD license server

    We will need to add RDSH01 and DC01 to the All Servers pool on RDCBWA before we start the deployment.


    Now we run the below cmdlet on RDSH01 to install RD Connection Broker, RD Web Access and RD Session Host on RDCBWA:

    New-SessionDeployment –ConnectionBroker RDCBWA –WebAccessServer RDCBWA –SessionHost RDCBWA

    During the install, we’ll see the following progress meters:

    1. Validation begins:


    2. Deployment begins:


    3. Connection Broker is installed:


    4. RD Web Access role is installed:


    5. RD Session Host role is installed:


    6. After all roles are installed, the server is restarted:


    Once the PowerShell setup finishes, we go to Server Manager and verify the installation. As you can see from the screenshot below, everything except the RD Gateway and the Licensing server has been installed. We will now add another session host and a Licensing server.


    First, let’s add the second RD Session Host server to our deployment. We will use the Add-RDServer cmdlet and run it on the Connection Broker this time.

    Add-RDServer -Server RDSH01 -Role RDS-RD-SERVER -ConnectionBroker RDCBWA

    When you run the above command, you will see the following progress:




    RDSH01 is now rebooted:


    We can now verify the addition of the second Session Host server in Server Manager:


    We are now ready to add an RD Licensing server to our deployment. To install the RD Licensing role, we use the below cmdlet:

    Add-RDServer -Server DC01 -Role RDS-LICENSING -ConnectionBroker RDCBWA

    You will now see the below progress messages:





    We now need to activate our License server and install CALs via the Licensing Manager GUI on the License server. I have activated the License server and installed Per User CALs.

    Let’s configure our deployment for licensing. We use the below cmdlet for this:

    Set-RDLicenseConfiguration -LicenseServer DC01 -Mode PerUser -ConnectionBroker RDCBWA

    Running the above cmdlet requires confirmation:


    Select Yes and continue.

    When finished, it will return to the next line:


    To confirm that licensing is configured, run the following cmdlet:



    We can now confirm everything in Server Manager:



    We are halfway done here and have completed the installation of our roles. We now need to configure RDS to make Desktop Sessions and RemoteApps available to users.

    This takes us to the next step: creating a new collection using PowerShell.

    We will create two collections here, one for each of the RDSH servers: one for Desktop Sessions and the other for RemoteApps.

    To create a new collection, we use the below cmdlet:

    New-RDSessionCollection –CollectionName SessionCollection –SessionHost –CollectionDescription “This Collection is for Desktop Sessions” –ConnectionBroker

    This also shows a progress bar and summary when it finishes:



    We can verify this set up in Server Manager. As this collection is for Desktop Sessions, nothing else needs to be done.


    Let’s go ahead with creating the second collection for RemoteApps:

    New-RDSessionCollection –CollectionName RemoteAppCollection –SessionHost –CollectionDescription “This Collection is for RemoteApps” –ConnectionBroker

    When it completes, we see the summary and collection in Server Manager:



    As we will use this collection for publishing RemoteApps, let’s go ahead and add a RemoteApp to it:

    New-RDRemoteapp -Alias Wordpad -DisplayName WordPad -FilePath "C:\Program Files\Windows NT\Accessories\wordpad.exe" -ShowInWebAccess 1 -CollectionName "RemoteAppCollection" -ConnectionBroker

    Summary progress below:



    Server Manager shows the RemoteApp added:


    And with that, you are done! Users can now access the Desktop Session and Remote App Collections.


    Windows Server 2012 R2 comes with an enormous number of PowerShell cmdlets; in this article we’ve only seen a few of them. We may dive deeper into the power of PowerShell for managing RDS in Server 2012 R2 in future posts.

    If you are interested in setting up a VDI deployment using PowerShell, please check the link below:

    Setting up a new Remote Desktop Services deployment using Windows PowerShell


  • KMS Activation High Level Overview

    Hello, folks!

    This blog aims to provide a high-level overview of the Key Management Service (KMS) technology.

    You may have found a lot of dispersed activation information available elsewhere on the Internet, but I’m going to try and pull it all together for you in a concise format that I hope you’ll find is easy to digest.

    First, make sure you can meet the initial KMS requirements for deployment:

    1. By default, the following ports are required for activation:

    • 80
    • 443
    • 1688

    2. Activation requests are fulfilled after meeting the corresponding product count minimum.

    • Workstation OS: 25
    • Server OS: 5
    • Office: 5

    3. Activated products require a connection to the corporate network at least once every 180 days.
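    The product count minimums in requirement 2 can be sketched as a simple threshold check (the dictionary keys are my own labels, not official terms):

```python
# Minimum product counts a KMS host must see before it fulfills activation requests
KMS_MINIMUMS = {"workstation_os": 25, "server_os": 5, "office": 5}

def will_activate(product_type, current_count):
    """True once the KMS host's cumulative count meets the product minimum."""
    return current_count >= KMS_MINIMUMS[product_type]

print(will_activate("workstation_os", 24))  # → False (still below the 25 minimum)
print(will_activate("server_os", 5))        # → True
```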

    Next, let’s take a look at the basic KMS infrastructure:


    KMS host machines distribute activation signals, whereas KMS clients are machines that need to be activated (they can be either servers or workstations).

    KMS host or client machine roles can be distinguished by the type of key used. A KMS host key directs the host machine to create an SRV record (_VLMCS) in DNS; to obtain a host key, visit here. A KMS client setup key directs client machines to look for the SRV record in DNS that points to the KMS host machine. Obtain a client setup key here.
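    For example, a KMS client auto-discovering its host queries DNS for the _VLMCS SRV record in its domain. A tiny sketch of the record name it looks up (contoso.com is a placeholder domain):

```python
def kms_srv_record_name(dns_domain):
    """DNS SRV record name a KMS client queries to auto-discover the KMS host."""
    return f"_vlmcs._tcp.{dns_domain}"

print(kms_srv_record_name("contoso.com"))  # → _vlmcs._tcp.contoso.com
```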


    Office Volume Activation:

    The Microsoft Office Volume License Pack is required on the Office KMS host. Obtain the license packs here:

    Microsoft Office 2013 Volume License Pack
    Microsoft Office 2010 KMS Host License Pack

    After installing the license pack, it will prompt you to install Office KMS host key. If nothing goes wrong with that process, your Office KMS should be all set.


    For your reference, here are TechNet guides for setting up Office KMS activation.

    Prepare and set up the Office 2013 KMS host
    Set Up an Office 2010 KMS Host


    Additional Tool:

    The Volume Activation Management Tool (VAMT) is a free utility that is very helpful for applying product keys and managing activation status.

    Download and Installation

    • This tool is part of the Windows Assessment and Deployment Kit (ADK), available here.
    • The latest version of VAMT is 3.1 as of this writing, and it supports operating systems up to Windows 8.1 and Server 2012 R2.
    • VAMT Requirements:
    • The .NET Framework is required and is installed automatically with the ADK.
    • SQL Server Express is required and you should choose to install it as a feature when going through the ADK setup wizard.

    There are a couple of best practices to keep in mind when using KMS, and a few common mistakes you’ll want to avoid.


    Best Practices

    1. The KMS OS host and the KMS Office host can be the same server.
    2. Keep roaming users on a MAK key (roaming users are those who would not be connected to the company domain at least once every 180 days).


    Common KMS Mistakes

    1. Installing a KMS host key on clients.
    2. Using a KMS host key that does not match the host machine OS.
    3. Not applying the latest patches to the host machine.


    And now on to some common KMS commands you’ll want to keep on tap.

    Install a product key on the KMS Host

    • slmgr /ipk <KMS Host Key>

    Activate a product key:

    • slmgr /ato

    Display OS License Information:

    • slmgr /dlv

    Display All License Information (including office activation status):

    • slmgr /dlv all

    Note: The popup window for this command doesn’t scroll, so run the following command to write the output to a text file.

    cscript.exe c:\windows\system32\slmgr.vbs /dlv all > c:\temp\dlv.txt

    I hope this has been a helpful high-level overview of our KMS technology and wish you all the best!

    Kind regards,
    Sophie Fei Xu
    Support Escalation Engineer
    Microsoft Global Business Support

  • Highly Available RDS 2008 R2 License Servers

    Hello AskPerf! My name is Matt Graham, and today I want to address some questions surrounding the setup of highly available licensing servers. Anyone setting up an RDS infrastructure wants to ensure that it will keep working if a license server goes down. In Terminal Services (2003), the recommended way of setting up highly available license servers was as follows:

    1. Deploy two activated license servers
    2. Either place all active licenses on a single server or split them between the two servers. Typically you would install all licenses on a single license server in the case of a per-user licensing scenario.
    3. Ensure that both license servers are discoverable

    In the scenario where you place all licenses on a single license server, when that license server goes down, the secondary server will hand out temporary licenses until you are able to build another license server or install licenses on the secondary server. This, however, could be complicated depending on how you have your session hosts discover your license servers.

    Server 2008 R2

    Server 2008 R2 is similar, but some features work differently. For example, the Auto Discovery feature that helped TS servers find the license server is no longer available in 2008 R2. By design, you tell your session hosts how to find your license server via RD Licensing Manager, GPO, or the registry.

    It's important to keep in mind that the session host checks to see if a license is even needed before making a request to the license server for a CAL. So in most cases, even if your license server fails, most clients should still be able to connect to your session hosts. A new license will not be requested unless a new client tries to connect or a license has expired on a specific client. What that means is that in most environments, the failure of a license server does not mean that all of your clients that try to connect will be unable to connect.

    With that in mind, some people will still want to setup a backup license server in case their main license server fails.

    Configuration for Multiple License Servers (Per User Licensing)


    As before, you set up two license servers. In most cases, you would install some of your CALs on one license server and the rest of them on another. You then configure half of your session hosts to point first to license server 1 and secondarily to license server 2. The other half of your session hosts should point to license server 2 as the primary license server and to license server 1 as the secondary server.

    In this scenario, RDSH01 will first try to pull licenses from RDSL01; if it doesn't have licenses, it will pull from RDSL02. Likewise, RDSH02 will first try to get licenses from RDSL02, and if there aren't any available, it will pull licenses from RDSL01. So you should be able to utilize all of your licenses even though different session hosts are pointed to different primary license servers.

    NOTE: You will need to take into consideration how many users/computers will be connecting to which session hosts. For example, if RDSH02 is going to have twice as many users connecting to it, you will want to install more CALs on RDSL02, as it is serving as the primary licensing server for that session host.
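    That sizing guidance can be sketched as a proportional split. The server names and user counts below are just the example from the note above; this is an illustration, not a licensing tool:

```python
def split_cals(total_cals, users_per_server):
    """Allocate CALs to license servers in proportion to the users whose
    session hosts use each server as primary; any remainder goes to the busiest."""
    total_users = sum(users_per_server.values())
    alloc = {srv: total_cals * n // total_users for srv, n in users_per_server.items()}
    busiest = max(users_per_server, key=users_per_server.get)
    alloc[busiest] += total_cals - sum(alloc.values())
    return alloc

# RDSH02's users (pointed at RDSL02) are twice RDSH01's, so RDSL02 gets twice the CALs
print(split_cals(300, {"RDSL01": 100, "RDSL02": 200}))  # → {'RDSL01': 100, 'RDSL02': 200}
```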

    Configuring Session Hosts to Point to License Servers

    If you configure your session hosts through GPO, you go to the following:

    Computer Configuration\Policies\Administrative Templates\Windows Components\Remote Desktop Services\Remote Desktop Session Host\Licensing


    In this case, the session host will first look to RDSL01 for licenses and if it can't find a license it will look to RDSL02 for a license. You can also set this via the session host configuration manager. This can be done in the following way.

    1. On the RD Session Host server, open Remote Desktop Session Host Configuration. To open Remote Desktop Session Host Configuration, click Start, point to Administrative Tools, point to Remote Desktop Services, and then click Remote Desktop Session Host Configuration.
    2. If the User Account Control dialog box appears, confirm that the action it displays is what you want, and then click Yes.
    3. In the Edit settings area, under Licensing, double-click Remote Desktop license servers.
    4. On the Licensing tab of the Properties dialog box, click Add.
    5. In the Add License Server dialog box, select a license server from the list of known license servers, and then click Add. If the license server that you want to add is not listed, in the License server name or IP address box, type the name or IP address of the license server that you want to add, and then click Add.
      You can add more than one license server for the RD Session Host server to use. The RD Session Host server contacts the license servers in the order in which they appear in the Specified license servers box.
    6. Click OK to close the Add License Server dialog box, and then click OK to save your changes to the licensing settings.

    This is a basic setup for a highly available license server in Server 2008 R2.


  • MS15-010 causing font/text issues…

    Hello Folks.  I wanted to send out a quick note on an emerging issue we are seeing in Support after installing MS15-010.  If your fonts/text are distorted on the following operating systems…

    • Windows Server 2008 Service Pack 2 (SP2)
    • Windows Server 2003 SP2
    • Windows Vista SP2

    …then you can download/install the following fix:

    Fix for text quality degradation after security update 3013455 (MS15-010) is installed

    Please see this link for more information.

    Additionally, this fix will be included in March’s patch cycle.

    -Krishnan Ayyer & Susan Buchanan

  • Help! My Scheduled Task does not run…

    Good morning/afternoon/evening AskPerf! Blake here with a post I’ve been meaning to write/publish for a year or so now. Here on the Performance Team, we support a wide range of technologies, with Task Scheduler being one of them. More often than not, the number one Scheduled Task issue we encounter is as follows:

    “In Windows 2003/XP, my scheduled tasks ran with no problems. Since we’ve upgraded to Windows 2008/2008-R2/Win7/Win8/2012/2012-R2, our tasks no longer run.”

    With that, we explain that Task Scheduler was completely re-written in 2008/Vista, with one of the main changes being in Security. Here is a snippet from a Technet Article published back on March 3, 2006:

    Windows Vista Task Scheduler

    Security. In the Windows Vista Task Scheduler, security is vastly improved. Task Scheduler supports a security isolation model in which each set of tasks running in a specific security context starts in a separate session. Tasks executed for different users are launched in separate window sessions, in complete isolation from one other and from tasks running in the machine (system) context. Passwords are stored (when needed) in the Credentials Manager (CredMan) service using encryption interfaces. Using CredMan prevents malware from retrieving the stored password, tightening security further.

    In Windows Vista, the burden of credentials management in Task Scheduler has lessened. Credentials are no longer stored locally for the majority of scenarios, so tasks do not "break" when a password changes. Administrators can configure security services such as Service for Users (S4U) and CredMan, depending on whether the task requires remote or local resources. S4U relieves the need to store passwords locally on the computer, and CredMan, though it requires that passwords be updated once per computer, automatically updates scheduled tasks configured to run for the specific user with the new password.

    Enter the new world of Session 0 Isolation.

    Prior to Vista/2008 Server, all services ran in the same session as the first user who logged onto the console - this is Session 0. Well, running user apps and services in this session posed a security risk because services run at elevated privileges and can be targets for malicious code.

    Enter the new and improved Task Scheduler that uses Session 0 isolation. In Vista/2008 and higher, we mitigate this security risk by isolating services in Session 0 and making it non-interactive. Only system processes and services now run in Session 0. The first user who logs onto a machine does so in Session 1; subsequent users log into Session 2, 3, 4, etc. This isolation protects services and system processes from tasks run in user sessions.

    So, how does this isolation prevent my task from running?

    • There is no active Shell (explorer.exe)
    • If a process/service tries to display a message box, the task will not complete
    • Non-interactive
    • Apps creating globally named objects
    • Possible network communication failures

    For more information about Session 0 Isolation, please see the link above.

    At this point, we need to determine if there is a simple workaround to get your task to run, or determine if the application vendor needs to be engaged.

    Typically, I start with making the following Security changes to my Scheduled Task:

    “Run only when user is logged on”


    With this option selected, my task will only run if I am logged on with my WillyP account. I can now test and confirm to see that Task Scheduler properly launches/runs my task. Selecting this option also runs my task interactively in my session.

    You will see notepad.exe running in the same session as my logged on user – Session ID 2.


    Now, let’s look at the behavior when I have the other Security option selected.

    “Run whether user is logged on or not”

    With this option selected, I am telling Task Scheduler to run my task whether I am logged on or not – aka Session 0 isolated. Let’s see how this looks when my Willyp user is logged off and I schedule a task to run.


    As you can see, notepad.exe is running in Session 0. The other process, taskeng.exe, is the Task Scheduler Engine process that started my task.

    So, you may be asking yourself, what if I am logged on with this account and “Run whether user is logged on or not” is selected - will it be interactive? No. Session 0 is a non-interactive session, so you will not see your Action even if you are logged on as the running user account.

    Now, how do we troubleshoot this and get your task to run? Well, in troubleshooting these issues, I’ve come across multiple ways to fix them. You may have to experiment to see which of the following works for you in your scenario.

    • If your Task requires UAC Elevation, select the “Run with highest privileges” option under Security on the General tab
    • If you are launching a script (.vbs/.cmd/.bat/.ps1), modify your script to add some type of logging to see where it may be failing – see the following blog for examples: Two Minute Drill: Quickly test Task Scheduler
    • Try creating a new task, but set the Configure for: option to “Windows Server 2003, Windows XP, or Windows 2000” – this will create an XP/2003-style task
    • If running a .vbs / .ps1 script, try launching it from a .cmd / .bat script – for example: “cscript.exe myscript.vbs” would be in my .cmd/.bat script, and I would then launch it from my Scheduled Task
    • Check your scripts for environmental issues – when we run a script, we default to the “%SystemRoot%\System32” folder unless specified in the script (i.e. CD C:\Scripts\Test)
    • If you are running nested scripts/programs within one script, try breaking them out as multiple Actions – for example:


    So, when script1.cmd finishes, script2.cmd will be launched. Then when script2.cmd completes, script3.cmd will run.

    • If running a 3rd party app/script, engage the app vendor to check if their app/process will run correctly in a non-interactive session
    • Try running your script with the SYSTEM account
    • Check the History tab for clues as to why your task is not running
    • If all else fails, your only choice may be to “Run only when user is logged on”
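    One pattern that covers several of the bullets above (logging plus an explicit working directory) is to wrap the real script in a small launcher and point the Scheduled Task at the launcher. A hedged sketch; the paths and myscript.vbs are placeholders, not part of any specific product:

    ```cmd
    @echo off
    REM wrapper.cmd - hypothetical launcher for a scheduled task
    REM Set an explicit working directory instead of the default %SystemRoot%\System32
    cd /d C:\Scripts\Test
    echo %DATE% %TIME% starting myscript.vbs >> C:\Scripts\task.log
    cscript.exe //nologo myscript.vbs >> C:\Scripts\task.log 2>&1
    echo %DATE% %TIME% finished, exit code %ERRORLEVEL% >> C:\Scripts\task.log
    ```

    The log file then tells you whether the task launched at all, and where inside the script it failed.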

    As we come across different issues/fixes, I will add them to the bulleted list above.

    Play around with the options above and see if you can get your Scheduled Task to run. If you come across a different fix not mentioned above, let us know in the comments below.


  • DFSR: Limiting the Number of Imported Replicated Folders when using DB cloning

    Hello! Warren here to talk about a very specific scenario involving the new DB cloning feature added to DFSR in Windows Server 2012 R2. The topic is how to limit or control which RFs you import on the import server in a DB cloning scenario. Ned Pyle has more
  • Loss of "ssh" via VIP following the assignment of IP addresses to Linux VM's with multi-nic

    Problem: When creating a VM with multiple NICs and multiple subnets, the guest's "Default Gateway" is not automatically set. This can cause loss of "ssh" connectivity, as the "Default Gateway" is not assigned to the correct NIC more
  • How to use Dumpchk.exe to check your dump files…

    Hello AskPerf!  Today’s post is a quick one that points to one of Bob Golding's Windows Troubleshooting videos.  He talks about how to download/run Dumpchk.exe on your dump files to check for corruption.  Check it out below:

    DumpCheck – youtube video

    DumpChk – MSDN Link to more info



  • Free Webcasts from Microsoft’s US Central Marketing Organization (USCMO)

    The US Central Marketing Organization (USCMO) here at Microsoft is putting on a series of new and improved webcasts, and I wanted to list them for those who wish to view them.  Each webcast will stream live with interactive Q&A and will be made available on demand.  These webcasts run for about 30-60 minutes.  Please feel free to register at any time.

    Protect Your Business Against Online Fraud
    January 20, 2015
    In recent years the online fraud epidemic has become a reality.  Is your business secure?

    Social in the Enterprise
    January 21, 2015
    FOX Business Network anchor Maria Bartiromo, the first journalist ever to report live from the floor of the NY Stock Exchange, shares why a good social strategy is crucial. Social networking expert and best-selling author Gary Vaynerchuk shares the secrets to social success in the enterprise. Charlene Li, renowned author and leadership and social consultant, provides concrete recommendations for how organizations can build effective networks to become leaders in the digital era. Andy Sernovitz, leader of the word of mouth movement, explains how building internal communities increases productivity and effectiveness. And host Alex Bradley, Microsoft Office, presents new, innovative social solutions. 

    Windows Server 2003 Migration: Hardware Modernization
    January 22, 2015
    With the pending end of support in July 2015, organizations must understand their rationale for migration from WS03.  This is not just a support issue but, importantly, an opportunity to enlist the power and flexibility of modern infrastructures running platforms like Windows Server 2012 and Azure.  Migrating sets your infrastructure up to harness your Enterprise Cloud strategy both on and off premises.  You want to make sure that your hardware keeps pace with these dynamic technologies.  This webcast covers some of the most important aspects of upgrading the workloads on modern hardware.

    It’s a New Year, Be Ready to Adapt
    January 22, 2015
    It’s a new year, be ready to adapt. Every new year brings both the promise and the challenge of a quickly changing business environment. Stay ahead of the curve, whether it’s your customers’ needs, security risks, or compliance requirements that demand instant access to the data that supports good decisions.

    HIPAA Compliant Cloud Solutions with Microsoft BAA
    January 23, 2015
    Join us for this important webcast on January 23rd at 11:00AM PST to learn about Microsoft’s HIPAA Business Associate Agreement (BAA). This discussion will help you to better understand how healthcare organizations with a Microsoft BAA can move toward a contemporary plan for using Microsoft’s cloud services. This webcast will show how the Microsoft BAA provides healthcare organizations with the opportunity to use cloud solutions to improve patient outcomes while maintaining compliance with the privacy and security regulations that are outlined in HIPAA.

    Announcing the Enterprise Cloud Suite
    January 26, 2015
    With Enterprise Cloud Suite (ECS), Microsoft is now able to offer a comprehensive solution to customers that provides:
    • End-to-End Productivity: provide users with tools to collaborate and stay in sync anytime, anywhere
    • Data Protection: enable strong authentication, encryption and access controls across devices
    • Device Management: manage devices and applications across PCs, smartphones and tablets
    • Unified IT environment: leverage existing investments for identity and device management across on-premises software and cloud services
    • Pricing: ECS provides the best pricing through built-in suite discounts vs. buying components separately

    Get a fresh start in 2015 with new Windows devices
    January 28, 2015
    Celebrate the New Year and get more productive in 2015 with the latest technology powered by Windows 8.1. Whether you’re looking for laptops, 2-in-1 devices, or tablets, there is definitely a lot to choose from. Join us on January 28th to check out a broad range of Windows 8.1 devices and special offers. In the meantime, visit the Windows for Business website to stay up to date!

    Need fast AND affordable? Why not try SQL Server?
    January 29, 2015
    Why did RSI Retail Solutions, Lifetime Products, and Havas Media migrate to SQL Server? SQL Server runs mission critical workloads, provides top-of-the-line security features, and enables customers to leverage existing assets and knowledge base – without costing a fortune. By switching or adding new workloads to SQL Server 2014, you can improve your data platform performance and your bottom line on your terms.  Join Marcello Benati, Microsoft Solution Specialist, to learn how to easily migrate existing and new mission-critical workloads to SQL Server 2014.

    Mobile Productivity in the Modern Workplace
    February 4, 2015
    Mobility is changing our personal and professional lives.  People are bringing their personal devices and apps to work. Employees expect more dynamic work environments to take advantage of mobile capabilities and work from anywhere. Apps, including productivity tools, need to work well on mobile devices and in the business scenarios these devices are used. To get work done from anywhere, mobile devices with basic services, like email, aren’t enough. In this webcast you will learn how Microsoft provides the richest productivity solution across any device, for any type of worker, in a secure, enterprise-grade way.

    Windows Server 2003: Most Common Application Migration Concerns
    February 5, 2015
    Build your migration plan - do it yourself, collaborate with a partner, or use a service.  Find out about your options, whether moving your applications to the cloud or keeping them in your infrastructure.

    Enabling Customer Insights Using Business Analytics
    February 12, 2015
    Business analytics is about capturing information in real time and empowering people to put it to use, combining data in new ways to generate new insights. Hear from Pier 1 on how they use business analytics to drive their business.

    Windows Server 2003: Security Risk and Remediation
    February 18, 2015
    With Windows Server 2003 support ending on July 14, 2015, many organizations find themselves in the situation where legacy, mission critical workloads and applications are running on a soon to be unsupported platform. Some organizations may be considering alternate security strategies – like ring-fencing their existing Windows Server 2003 servers – as a way to delay migration. This webinar examines the viability of common risk remediation tactics for Windows Server 2003 and makes the case that migration is ultimately the best option.

    The Connected Workforce
    February 18, 2015
    The world has become a giant network, with people connecting in new ways using social and mobile technologies. Has your company adapted to this networked world? By delivering seamless social experiences across familiar work applications on an enterprise-grade platform, Microsoft helps over 400,000 companies worldwide engage, inform and connect employees. During this webcast you will learn how Microsoft can help your company connect, inform, and engage employees using enterprise social technologies.

  • We Are Hiring Windows Escalation Engineers in Munich, Germany

    Would you like to join the world’s best and most elite debuggers to enable the success of Microsoft solutions?   As a trusted advisor to our top customers you will be working with the most experienced IT professionals and developers in the industry more
  • Case of the blank print jobs

    Hello Askperf! Anshuman here again with an interesting issue I worked a few weeks ago.

    The following pop-up appeared on my workstation intermittently:


    I then realized that I had the Send To OneNote printer set as my default printer.

    The next time this occurred, I paused the print queue and noticed that the “Remote Desktop Redirected Printer Doc” document was getting spooled under my account. This was interesting because I had several remote desktop sessions open to different machines from my workstation, and had not sent any print jobs from them.


    So two questions came to mind:

    1. Which RDS session is this coming from?

    2. What was sending this print job?

    I then thought to myself, “when in doubt, run Process Monitor!”

    My first challenge was to figure out which server session this job was generated from. For this, I ensured that all the RDS sessions I established were using the command-line option of mstsc.exe (mstsc /v:servername). Next, I started Process Monitor on my workstation with a specific filter of “Process Name is mstsc.exe” and “Path contains .spl”. Since this issue was intermittent, I checked the “Drop filtered events” option. I also ensured that the Backing File option under the File menu was pointing to a file, instead of Virtual Memory (pagefile). After a while the issue occurred, and procmon captured the following events:


    One of the first things I noticed was the CloseFile operation immediately after the CreateFile operation. Typically, you will see a WriteFile operation in between these two operations. So which server was mstsc connecting to? That was easily found by examining the Command Line entry of mstsc captured in the pml file:


    I logged into the problem server and launched procmon, ensuring that the Backing file option was set to point to a .pml file on a drive with enough space, and “Drop filtered events” was selected. Next I set up a filter “Path Contains tsclient” as well as “Path Contains RdpDr”. I then established an RDS session to the server from my work station and waited for the mysterious 0Kb print job. Once it happened, I had the following events in the pml file from the ProblemServer:


    So there was an add-on service that had been installed on the problem server along with a print driver. Disabling it ensured that those mysterious 0Kb jobs ceased to occur.


  • How to migrate local ports when doing print migration

    Hello Askperf! My name is Tingu, and today I’m going to talk about an interesting print migration issue I had a few weeks ago.

    We had a case where an application server was running on Windows 2003, where more than 400 print queues were created. The port was created as a local port to forward the print job in case of a failure as noted in the “Transfer documents to another printer” Technet article.

    The port was configured as \\printservername\printer.  See example screenshot below:


    Here, we were trying to move the application to a 2012 R2 server and wanted to migrate all the print queues to the new server. We used printbrm to migrate all of the local printers.  But the problem we ran into is that it did not migrate the local ports.

    When we started the migration, we did not see the local ports listed:


    Additionally after the migration, the port was not present:


    We tried to add the port manually, but it gave us the error “port already exists”.  Additionally, the registry shows that the printer is set to use the forwarder.


    We really needed to get the local ports migrated as it can be a tedious task to re-create all the ports and map to their respective print queue. 

    We created a test lab and saw the same issue while migrating.  It did not matter which OS we were migrating from.  During the migration, we saw an event ID 81 on the 2012 R2 server. (This event is not triggered if you are migrating to 2003 or 2008 R2):

    Log Name:      Microsoft-Windows-PrintBRM/Admin
    Source:        Microsoft-Windows-PrintBRM
    Date:          12/25/2014
    Event ID:      81
    Task Category: Restore
    Level:         Error
    Keywords:      Print Queue
    User:          Joe
    Computer:      12345

    Printbrm.exe (the Printer Migration Wizard or the command-line tool) failed to restore print queue test. The restore process will continue, skipping this queue. Error: 0x80070057
    Error 0x80070057 translates to “invalid parameter”.

    So what we determined is that when you use printbrm for migration, it will not migrate the local ports.  The reason is that the local port is specific to the server, and it may cause conflicts or not work if you migrate it to a different server.  But in our case it’s a forwarder, and we need it to be migrated.

    Further testing revealed that if a local port to which the printer is mapped is already present on the destination server, then the migrated printers will use that local port for the printers.

    For example: on the source server you have a printer mapped to LPT1, and the destination server has LPT1 port available; then after the migration, the printer will be set to use that port. We created a forwarder on the destination server for a test printer before migration, and after importing the printer, we see that the port is mapped accordingly.

    Now the question is, how do we migrate multiple local ports at a time?

    Here is what we did…

    From Print Management on a 2012 R2 server, we added the 2003 server.  Then we exported the list of ports to a .csv file:


    This gave us the list of all ports needing to be migrated.  We then created a script to add the ports to the destination server.  In our case, the destination server was a Windows 2012 R2 server, so we used the PowerShell cmdlet Add-PrinterPort.

    We copied all of the required ports into notepad, and saved it as a .ps1 file:


    We ran the .ps1 file as admin, and all of the ports got created on the destination server!
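    The generated .ps1 is essentially one Add-PrinterPort call per forwarder port. If you would rather build it straight from the exported .csv than paste the ports into Notepad, here is a minimal sketch (the "Port Name" column header and the C:\Temp path are assumptions; check what your Print Management export actually produces):

    ```powershell
    # Create each exported local (forwarder) port on the destination server
    Import-Csv C:\Temp\ports.csv | ForEach-Object {
        Add-PrinterPort -Name $_.'Port Name'
    }
    ```

    Run this on the destination server before starting the migration, so the imported printers find their ports already in place.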


    Note If you have already tried the migration before creating the ports on the destination server, it may give you the error ‘port already exists’ while running the powershell command.  You may need to delete the printers migrated and restart the spooler and then retry the powershell command to complete the port creation.

    After that, we followed the normal migration procedure and all printers got mapped to the correct port.

    I hope this information will come in handy the next time you are working through a printer migration. 


  • How to make your existing BitLocker encrypted environment FIPS compliant

    Hello, my name is Mayank Sharma and I am a Technical Advisor here at Microsoft. In this blog, I will discuss FIPS compliance with BitLocker, Microsoft's solution for completely encrypting data on laptops, desktops and removable drives. So let’s get started...

    FIPS stands for Federal Information Processing Standard, a set of United States government standards that provide a benchmark for implementing cryptographic software. It basically means that if a piece of software is approved by one of the labs that test for FIPS compliance, the software meets the government standard for cryptography, and thus can be used by the US Federal government and organizations around the world. There is a lot that can be written about FIPS; rather than cover it all here, I will route you to the following link:

    FIPS Compliance

    To enable FIPS on a computer, i.e. tell it that it has to be compliant with the government standard, we need to alter a group policy setting found under:

    Computer Configuration\Windows Settings\Security Settings\Local Policies\Security Options

    The name of the policy is:

    System cryptography: Use FIPS compliant algorithms for encryption, hashing, and signing
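    Enabling that policy sets a registry value you can verify from an elevated command prompt (a value of 1 means FIPS mode is on):

    ```cmd
    reg query HKLM\SYSTEM\CurrentControlSet\Control\Lsa\FipsAlgorithmPolicy /v Enabled
    ```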

    Now that we know what FIPS is and what it does, let’s focus our attention back on BitLocker, Microsoft’s security solution for protecting data across laptops and desktops. BitLocker uses multifactor authentication to ensure BitLocker encrypted drive(s) always remain in good hands. To accomplish this, it uses multiple protectors to protect a volume. Some are ‘primary’ protectors that will be used most of the time, namely TPM, TPM and PIN, Password, etc.; others will be used when BitLocker senses something has changed and goes into lockdown mode. During lockdown, it will ask the user to prove they are genuine. Examples of such protectors include the recovery password, recovery key, Data Recovery Agent, etc.

    Now here comes the tricky part. Whether or not BitLocker is FIPS compliant is decided by whether the cryptographic keys its protectors use are themselves FIPS compliant. Password protectors for the operating system drive/fixed data drive are not compliant with the FIPS specification, and neither is the recovery password prior to Windows 8.  The below article discusses this in more detail:

    The recovery password for Windows BitLocker is not available when FIPS
    compliant policy is set in Windows Vista, Windows Server 2008, Windows 7
    and Windows Server 2008 R2

    Let’s say there is a ‘happy go lucky’ organization that uses TPM+PIN protectors to authenticate the OS drives of users’ laptops running Windows 7, and stores recovery passwords in the MBAM database. If a user gets locked out, Helpdesk provides the recovery password to the user to unlock the machine. This is the happy ending of the story, until one day FIPS compliance becomes mandatory.

    a. Will this happy go lucky organization be FIPS compliant? No, as it is using a recovery password as a protector, which is not FIPS compliant.
    b. Does this mean the whole infrastructure needs to be rebuilt from scratch? Of course not!

    Steps to make this environment FIPS compliant:

    Step 1:

    We need to get rid of the recovery password, which is what makes the infrastructure non-FIPS compliant. The first step is to delete the recovery password associated with this Windows 7 machine. Run the following from an elevated command prompt:

    manage-bde -protectors -get c:

    This lists all the protectors

    Volume C: [OSDisk]
    All Key Protectors

        TPM And PIN:
          ID: {161941A3-8CB3-439C-8FC6-1642D0C97C8D}
          PCR Validation Profile:
            0, 2, 4, 11

        Numerical Password:
          ID: {C6DF1E74-467F-4BE8-9C59-C9A9F345B9A0}

    Note the ID of the Numerical Password protector, then delete it by running the following command:

    manage-bde -protectors -delete c: -id {C6DF1E74-467F-4BE8-9C59-C9A9F345B9A0}

    This will delete the recovery password protector.
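    If you would rather not copy GUIDs around, manage-bde can also delete protectors by type; a sketch that removes every recovery password protector from C: in one shot (try it on a lab machine first):

    ```cmd
    REM Delete all Numerical Password (recovery password) protectors at once
    manage-bde -protectors -delete c: -type RecoveryPassword
    ```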

    Step 2:

    Now, imagine the user forgot the PIN, or got locked out for some other reason. We still need a way to get back into the machine, so we need to add protectors that will help us in lockdown situations. Fortunately, we still have a choice to make here: we can add either of the two protectors that are FIPS compliant.

    a. Data recovery agent

    How to use Bitlocker Data Recovery Agent to unlock Bitlocker Protected Drives

    b. Add a recovery key to the volume, this is as simple as running the command where e: is the destination drive where you want to store the .BEK file.

    manage-bde -protectors -add c: -rk e:

    Just save this file in a safe place.  If a machine gets locked out, copy it over to a USB drive.  More information can be found here:

    What is a BitLocker recovery key?

    Step 3:

    Though not mandatory: once we enable the group policy for FIPS, Windows will not allow creation of new recovery passwords anyway. We can additionally disable the creation of any more recovery passwords via policy. Just disable the policy like I did below, under Computer Configuration\Administrative Templates\Windows Components\BitLocker Drive Encryption.


    As "Password" is not a FIPS complaint protectors, you cannot use it with fixed data drive either. We can either use a smart card protector or a DRA… And happy go lucky should be happy again!

    As stated above, this is specifically meant for Windows 7/Vista and Windows Server 2008/2008 R2. Had the company been proactive in moving to a newer version of Windows (i.e. Windows 8/8.1, Windows Server 2012/2012 R2), this would not have affected them: the recovery password is FIPS compliant on Windows 8 and above operating systems.

    So this is pretty much it. Keep your machines encrypted until next time.

    I thank Himanshu Singh for taking time out to go through this blog.

    Mayank Sharma
    Technical Advisor
    Windows Deployment Services

  • Troubleshooting Windows activation failures on Azure VMs

    If you are experiencing Windows activation failures on an Azure VM, please try the following steps to resolve the issue. An example of an error message you may see is: Error(s): Activating Windows(R), ServerDatacenter edition Error: 0xC004F074 The Software more
  • Disk Performance Internals

    Abstract: Storage is the slowest component of most computer systems. As such, storage is often a performance bottleneck. This article discusses the disk performance kernel provider, partition manager.  By understanding how the disk performance provider more
  • Driver Object Corruption Triggers Bugcheck 109

    My name is Victor Mei, I am an Escalation Engineer in Platforms Global Escalation Services in GCR.  Some customers I worked with have strong interests in debugging; but usually they got frustrated when I told them “To find the cause from this dump more
  • Recovering Azure VM by attaching OS disk to another Azure VM

    If you are unable to administer an Azure VM because of RDP or SSH failures, in many cases rebooting or resizing the VM may resolve the issue. You can troubleshoot the VM by attaching the OS disk as a data disk to a different Azure VM using the steps more
  • Surface Pro 3 Hibernation Doesn’t Occur on Enterprise Install

    Hi my name is Scott McArthur and I want to call out a recently published KB article:

    Surface Pro 3 doesn't hibernate after four hours in connected standby

    If you are deploying a custom image to Surface Pro 3, you are missing out on the feature where, after 4 hours in Connected Standby, the device hibernates. This is a key feature related to battery life, so I would recommend that all Enterprise customers install KB2955769 and incorporate these PowerCfg commands into their deployments.

    If you use Microsoft Deployment Toolkit 2013 for your deployments this is super easy. Here are the steps

    1. Under Packages, import KB2955769


    2. Create PowerCfg_Sp3.bat containing the following commands:

    REM sets CS battery saver time-out to four hours:
    powercfg /setdcvalueindex SCHEME_CURRENT e73a048d-bf27-4f12-9731-8b2076e8891f 7398e821-3937-4469-b07b-33eb785aaca1 14400
    powercfg /setacvalueindex SCHEME_CURRENT e73a048d-bf27-4f12-9731-8b2076e8891f 7398e821-3937-4469-b07b-33eb785aaca1 14400

    REM sets CS battery saver trip point to 100:
    powercfg /setdcvalueindex SCHEME_CURRENT e73a048d-bf27-4f12-9731-8b2076e8891f 1e133d45-a325-48da-8769-14ae6dc1170b 100
    powercfg /setacvalueindex SCHEME_CURRENT e73a048d-bf27-4f12-9731-8b2076e8891f 1e133d45-a325-48da-8769-14ae6dc1170b 100

    REM sets the CS battery saver action to hibernate:
    powercfg /setdcvalueindex SCHEME_CURRENT e73a048d-bf27-4f12-9731-8b2076e8891f c10ce532-2eb1-4b3c-b3fe-374623cdcf07 001
    powercfg /setacvalueindex SCHEME_CURRENT e73a048d-bf27-4f12-9731-8b2076e8891f c10ce532-2eb1-4b3c-b3fe-374623cdcf07 001

    powercfg /setactive SCHEME_CURRENT

    3. Save PowerCfg_Sp3.bat to your DeploymentShare\Scripts folder

    4. Open up the task sequence you use to deploy Windows and add a custom task in the state restore phase called PowerCfg-SP3


    5. In the properties of this task sequence step, edit the following:


    6. Click the Options tab and add a condition for “Task Sequence variable Model equals Surface Pro 3”


    Note: This ensures the step only runs on Surface Pro 3 devices, using the Model variable

    Hope this helps with your Surface deployments, and keep an eye on this blog for more tips and tricks for Surface

    Scott McArthur
    Senior Support Escalation Engineer

  • Your technical answers and automated solutions via Bing

    Hello folks,

    One of our Engineering PMs who supports our Diagnostics and Automated Solutions published a blog post about Bing and how you can use it to answer your technical questions and get automated solutions. Here is a brief overview:

    Bing Technical Instant Answers provide concise answers to technical questions directly within search results, and hopefully answer your question (or help you solve an issue) without you having to actually visit the web pages linked within the answer. The answers are triggered by specific search phrases, and they try to provide a unique benefit either by precisely matching your intent or by providing additional content related to your intent. In some cases, the instant answer will link to an automated fix or troubleshooter that you can run directly from the Bing search results. Microsoft will constantly be adding new technical answers, so if you have a technical problem with a Microsoft product or service, try asking Bing to see if we have an instant answer for you!

    Go check out his blog via the link below:

    Using Bing for technical instant answers and automated solutions


  • Cross Post: Using Bing for technical instant answers and automated solutions

    This is a cross post from William Keener’s Support Diagnostics and Automated Solutions blog that we wanted to add to our site. It relates to Bing and instant answers about Microsoft products, technologies, and support issues, and here on the AskCore site we are all about getting this type of information out there. Any comments should be made on the originating post so they can be properly seen, heard, or answered.


    Using Bing for technical instant answers and automated solutions

    Bing has been providing factual instant answers (and translation instant answers) for some time now, but recently they added "technical" instant answers for questions about Microsoft products and technologies or technical support issues. My previous team built the content management system that our internal content delivery teams are now using to add technical instant answers to Bing. Here's an example technical instant answer for the "Cortana" search term:

    Now that I'm working on support diagnostics and automated solutions again, I have been working with the Bing and content delivery teams to get some instant answers created with links to some of our automated solutions.

    And I'm happy to announce that the first one is live! So you can now search for "Windows Update Troubleshooter" (or a variety of related terms and error messages) and the first result will be a technical instant answer with a link to download and run our automated troubleshooter to fix problems with Windows Update.

    When you click the link in step 3, you will be prompted to Open (or Run) or Save the troubleshooter.

    Just click Open (or Run) to launch the troubleshooter.

    The content delivery teams will be constantly adding more technical instant answers, and we hope to have more live with automated solutions soon!

    Note that technical instant answers are also available in the Bing app on Windows Phone. To see the phone experience, tap Search and then type or say "cortana" on your Windows Phone. Then click the "See More" link at the bottom of the second result (after the ad - "Meet Cortana on Windows Phone 8.1") and swipe left or right to view the content on each of the tabs.

  • Understanding ATQ performance counters, yet another twist in the world of TLAs

    Hello again, this is guest author Herbert from Germany. If you have worked an Active Directory performance issue, you might have noticed a number of AD performance counters under the NTDS and “Directory Services” objects, including some ATQ-related counters.