March, 2013

  • Live Customer Q&A with Virtualization Experts! (Microsoft Virtual Academy)

    The IT Pro Evangelism team, Microsoft Learning, and the Microsoft Virtual Academy are pleased to announce the next FREE & PUBLIC event, Live Q&A: Introduction to Hyper-V, on Wednesday, April 3rd, from 8:30am – 10:00am PST, with virtualization experts Jeff Woolsey (Principal Program Manager) & Symon Perriman (Senior Technical Evangelist).

    Ask your customers to join this live online event designed for IT professionals who have questions about Microsoft virtualization and want to learn about Windows Server 2012 Hyper-V.  Register here:  If you cannot make the live event, sign up anyway so you can receive a notification when the recording is published on the Microsoft Virtual Academy.

    Topics and demos may include:

    • Introduction to Microsoft Virtualization
    • Hyper-V Infrastructure
    • Hyper-V Networking
    • Hyper-V Storage
    • Hyper-V Management
    • Hyper-V High Availability and Live Migration
    • Integration with System Center 2012 Virtual Machine Manager
    • Integration with Other System Center 2012 Components

    Tweet: Ask us your questions about #Windows #Server 2012 #HyperV! Register for this live free public Q&A event on April 3rd:

    Also check out our recent full-day training, Microsoft Virtualization for VMware Professionals Jump Start, which is now available on the Microsoft Virtual Academy.

  • Debugging a Network Connectivity Issue - TrackNblOwner to the Rescue

    Hello Debug community, this is Karim Elsaid again.  Today I’m going to discuss a recent interesting case where a server intermittently loses access to the network.  No communication (even pings) is possible to or from the server when the… more
  • Logical Processor count changes after enabling Hyper-V role on Windows Server 2012

    Hello Everyone, 

    Today we are going to talk about Processor changes in Windows Server 2012 Hyper-V servers. Hyper-V in Windows Server® 2012 supports significantly larger configurations of virtual and physical components than in previous releases of Hyper-V. This capacity enables virtualizing high performance, scale-up workloads.

    Well, we will not talk about performance and scalability in depth, but rather about an interesting change in Hyper-V on Windows Server® 2012. Windows Server® 2012 running Hyper-V can support up to 320 Logical Processors on the host and 64 Virtual Processors for each virtualized partition (virtual machine). That being said, if you have Windows Server 2012 with Hyper-V enabled running on hardware with 10 processors of 4 cores each and Hyper-Threading enabled (long story short, hardware with 80 Logical Processors), you will see that only 64 Logical Processors are assigned to the host. 

    Here is how it looks in Task Manager on a Windows Server 2012 with no Hyper-V role enabled


    After enabling the Hyper-V role on the server you will see something like below. A new entry gets added called "Host Logical Processors" and will have a value of 64. This is the number of logical processors that are assigned to the Host Operating System. Also notice that the value of “Logical processors” did not change.


    This does not mean that the remaining processors will not be used. They will still be used for Virtual machines running on the Server.
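    You can confirm what the parent partition sees with a quick WMI query. This is a sketch; on a large Hyper-V host these numbers reflect the parent partition’s view, so the totals may be capped at 64 even though the hardware exposes more:

    ```powershell
    # Logical processors visible to the (parent) operating system
    (Get-WmiObject Win32_ComputerSystem).NumberOfLogicalProcessors

    # Per-socket breakdown: cores vs. logical processors
    Get-WmiObject Win32_Processor |
        Select-Object DeviceID, NumberOfCores, NumberOfLogicalProcessors
    ```

    Comparing this output with the Task Manager counters above makes the 64-VP cap easy to spot.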

    What changed?

    This behavior is due to architectural changes in Hyper-V in Windows Server 2012, which has been designed to support 320 Logical Processors.

    Hypervisor Early Launch

    In Windows Server® 2008 and Windows Server® 2008 R2, the OS in the parent partition booted first and then launched the Hypervisor via the driver hvboot.sys. In Windows Server 2012 and Windows 8, we do an early launch of the Hypervisor before the OS in the parent partition. The Hypervisor initializes the BSP (boot strap processor), applies any needed microcode updates, and enables virtualization. The OS in the parent partition is then booted on a Virtual Processor.

    Minimal Parent Hypervisor

    The parent partition will run no more than 64 Virtual Processors regardless of the number of Logical Processors present in the system, and this value is NOT user-configurable. The hypervisor continues to manage all Logical Processors, schedule guest Virtual Processors across all of them, and so on.

    Why was this implemented?

    • Managing more than 64 VPs presents a scalability bottleneck.
    • Beyond 64 logical processors, the parent should not need more VPs to handle I/O for the system.

    NOTE: With 64 VPs, the parent partition has been tested and demonstrated to handle over 1 million IOPS, with sustained CPU utilization in Task Manager between 20-25%. In short, 64 VPs in the parent partition are more than adequate to handle the I/O load.

    Santosh Vavilal
    Support Engineer
    Windows Core Team, Microsoft Enterprise Platforms

    Keith Hill
    Sr. Support Escalation Engineer and the Hyper-V Support Topic Owner.
    Windows Core Team, Microsoft Enterprise Platforms

  • PFE’s Troubleshooting Performance issues with Windows Performance Recorder

    Hey Folks, one of our peeps on AskPFEplat posted a great blog on troubleshooting performance issues using the Windows Performance Recorder app.  This tool allows you to easily capture XPERF traces on your machines*.  Go check it out!

    Troubleshooting Windows Performance Issues Using the Windows Performance Recorder

    *NOTE  Windows Performance Toolkit v5.0 is only compatible with Windows 7, Windows Server 2008 R2, Windows 8, and Windows Server 2012.



  • Understanding File System Minifilter and Legacy Filter Load Order

    Hello, my name is Fred Jeng from the Global Escalation Services team. For today’s post, I want to go over how Windows 7 and Windows Server 2008 R2 load file system mini-filters in a mixed environment when legacy filters are also present. I recently came… more
  • Alternate Data Streams in NTFS

    This blog has been a long time coming. There is a bit of confusion about the subject of alternate data streams (ADS) and no small amount of suspicion. So I want to take a few minutes to set the record straight on ADS.

    A couple years ago I wrote a blog on NTFS attributes.

    You might want to review that blog before continuing. I’ll wait….

    Welcome back.

    One of the common questions I get is, “Robert. What is an alternate data stream?”

    My reply is always the same, “It is a data stream that is alternate”.

    I don’t mean to be a smart aleck about it…but that’s what it is. We know from my older blog that a file is divided up into ‘attributes’, and one of these attributes is $DATA, or simply the data attribute. It is the part of the file we put data into. So if I have a text file that says, “This is my text”, the data attribute will contain a stream of data that reads, “This is my text”. However, this is the normal data stream, sometimes called the primary data stream, though more accurately it is called the unnamed data stream. Why? Because it is a data stream that has no name. In the jolly land of programming it is referred to as $DATA:“”


    The name of the stream will appear between the quotes. Since this is an unnamed data stream, there isn’t anything there.

    Now that we know what the unnamed data stream looks like, we can start thinking in terms of alternates. Knowing that the place we normally store data is the unnamed data stream, if a stream has a name, it is alternate. So if I had a file with an ADS named SecondStream, its full name would be, $DATA:”SecondStream”


    This is all good and fine, but unlike the unnamed data stream, we can’t see the ADS. Or can we? The answer is, yes we can. But you have to use a method different than just opening the file in NotePad.

    There are a number of tools out there that will allow you to view and manipulate ADS. One that Microsoft has provided for years is called STREAMS.EXE.

    STREAMS.EXE will display any ADS the file has.


    The formatting is a little different.


    STREAMS.EXE is fine, and I’ve used it for years, but with the release of Win8/Server 2012, I’ve discovered a new way of dealing with ADS: Windows PowerShell. Using the Get-Item cmdlet, I can get more information than I did with STREAMS.EXE.


    The output shows not only the name of the ADS and its size, but also the unnamed data stream and its size (shown as :$DATA). And now that I know the name of the ADS, I can use the Get-Content cmdlet to query its contents.


    STREAMS.EXE can’t display what’s actually in an ADS. Here’s another trick that STREAMS.EXE can’t do: create data streams. Using Set-Content, I’ll create a second ADS in the same file and add a line of text.


    And again, we can query for the streams using Get-Item.


    And finally, we can remove an ADS using the Remove-Item cmdlet.
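    Putting the cmdlets above together, here is the whole round trip as it might be typed at a PowerShell 3.0 prompt (the file and stream names are just examples):

    ```powershell
    # Create a test file; its text lands in the unnamed data stream
    Set-Content -Path .\test.txt -Value 'This is my text'

    # Add an alternate data stream named SecondStream
    Set-Content -Path .\test.txt -Stream SecondStream -Value 'Text hidden in the ADS'

    # List every stream on the file (':$DATA' is the unnamed stream)
    Get-Item -Path .\test.txt -Stream *

    # Read the contents of the ADS
    Get-Content -Path .\test.txt -Stream SecondStream

    # Delete the ADS; the unnamed stream is untouched
    Remove-Item -Path .\test.txt -Stream SecondStream
    ```

    The -Stream parameter requires PowerShell 3.0 or later and an NTFS volume.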


    Now we know what ADS is, how to query for ADS, how to create ADS, and how to delete ADS. So what is the big deal?

    The big deal is that since ADS isn’t easily visible, it has become a cute way to hide data. Unfortunately it has also been used in the past to hide malicious code. This is how ADS got a bad name. In fact, a number of people that approach me about ADS already know that they have files with alternate data streams and they think they are infected with viruses.

    Calm down. The mere presence of an ADS doesn’t mean that there is a problem. In fact, Microsoft uses ADS for a number of functions. I can almost guarantee that if you are reading this, you probably have some ADS on your computer. Let’s take a look at a couple examples.

    Internet Explorer: Ever download an executable file from the Internet and then get warned about it when you ran it? How does that work?

    When the file is downloaded, IE slaps an ADS on it. The stream will store a tag that tells Windows what zone the file was downloaded from.


    Look Familiar?

    So using what I’ve learned so far, I can look at one of the files I’ve downloaded from the internet and see if there is an ADS on it.


    Yes, it is called ‘Zone.Identifier’. And then we can query the contents of the ‘Zone.Identifier’ ADS.


    Now we know that the file was downloaded from zone 3. Using the zone chart we can see it came from the Internet zone.

    Value Setting
    0     My Computer
    1     Local Intranet Zone
    2     Trusted sites Zone
    3     Internet Zone
    4     Restricted Sites Zone

    Notice that my test download file is in a test directory. This means I moved the file here from my download directory. This is the cool thing about ADS, since it is part of the file, it moves with the file. Even if I copied it, the ADS would be on the new copy as well.
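    Assuming a downloaded file named download.exe (the name is only an example), the check looks like this. Unblock-File, also new in PowerShell 3.0, removes the stream the same way Explorer’s Unblock button does:

    ```powershell
    # Is there a Zone.Identifier stream on the file?
    Get-Item -Path .\download.exe -Stream *

    # View the zone tag; ZoneId=3 means the Internet zone
    Get-Content -Path .\download.exe -Stream Zone.Identifier

    # Remove the mark-of-the-web if you trust the file
    Unblock-File -Path .\download.exe
    ```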

    Other Internet browsers use ADS in a similar fashion.

    File Classification Infrastructure: FCI is very dependent on ADS. The way classification works is that it puts tags on your files that allow you to keep track of how the file was classified, no matter what happens to the file. It can be edited, copied, or moved to another server, and its classification tags remain intact.

    Others: Office files and Outlook Express files use ADS. And it isn’t limited to Microsoft programs. Numerous programs utilize the ADS functionality.

    The point is that if you discover ADS on your system, it isn’t necessarily a bad thing. And just blindly stripping these data streams out of files can actually do a great deal of harm.

    And now that you have some tools to use for querying alternate data streams, they won’t be so scary.

    Thank you for your time and I hope this was educational.

    Robert Mitchell
    Senior Support Escalation Engineer
    Microsoft Corp.

  • Creating bootable USB drive for UEFI computers

    In today’s blog I am going to discuss how to create a multi-partition bootable USB drive for use with UEFI-based computers.

    It is common to create bootable USB flash drives or hard drives so you can boot from them to do various tasks such as

    • Boot Windows PE (WINPE) for recovery purposes
    • Boot Windows PE (WINPE) to deploy an image
    • Boot a Microsoft Deployment Toolkit media deployment share

    UEFI-based systems such as the Surface Pro require that the boot files reside on a FAT32 partition.  If the partition is not FAT32, the system may not see the device as bootable. 

    FAT32 has a 4GB individual file size limitation and a 32GB maximum volume size.  If any of your files are larger than 4GB you may have to configure the drive differently.  Consider booting Windows PE 4.0 to deploy a custom image using Dism.exe where the image is 8GB: you would not be able to store the image on the FAT32 partition. 
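    One way around the 4GB file limit (an alternative to the multi-partition approach described below, sketched here with example paths) is to split the WIM into smaller .swm files that Dism.exe can apply as a set:

    ```cmd
    rem Split an 8GB install.wim into ~3800MB pieces (install.swm, install2.swm, ...)
    Dism /Split-Image /ImageFile:C:\images\install.wim /SWMFile:E:\images\install.swm /FileSize:3800

    rem Later, apply the split image from Windows PE (example target paths)
    Dism /Apply-Image /ImageFile:E:\images\install.swm /SWMFile:E:\images\install*.swm /Index:1 /ApplyDir:W:\
    ```

    Each piece then fits comfortably under the FAT32 file size limit.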

    To get around this, you have to create multiple partitions on the drive.  Most flash drives report themselves as Removable, but to create multiple partitions the drive must report itself as Fixed.  If you have access to a Windows To Go (WTG) certified drive you can use it, since a WTG requirement is that the device report as Fixed.  Some USB hard drives, like the Western Digital Passport, also report themselves as Fixed. 

    To verify if the drive is reporting itself as fixed or removable plug the drive in and open My Computer:

    • Drive shows up under “Hard Disk Drives”:  Fixed
    • Drive shows up under “Devices with Removable Storage”:  Removable

    To create a USB drive with multiple partitions, use the following steps:

    1. Open an elevated cmd prompt
    2. Type Diskpart and hit Enter
    3. Type in the following commands:

    List disk
    Sel disk X (where X is the disk number of your USB drive)
    Create part primary size=2048
    Format fs=fat32 quick label="Boot"
    Create part primary
    Format fs=ntfs quick label="Deploy"

    Note:  You can choose different sizes and volume labels depending on your needs
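    The same sequence can be saved to a text file and run non-interactively with diskpart /s. This is a sketch: it assumes the USB drive is disk 1 and can be erased, and the clean and assign commands are additions here, so adjust to your setup:

    ```cmd
    rem Save as usbprep.txt and run:  diskpart /s usbprep.txt
    rem WARNING: clean erases the selected disk entirely
    select disk 1
    clean
    create partition primary size=2048
    format fs=fat32 quick label="Boot"
    assign
    create partition primary
    format fs=ntfs quick label="Deploy"
    assign
    ```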

    At this point you can now copy your boot files to the FAT32 partition and your other files to the NTFS partition. 

    In the earlier example, you would copy the contents of your custom Windows PE (WINPE) 4.0 files in C:\winpe_amd64\media to the FAT32 partition and your custom install.wim to the NTFS partition.

    Hope this helps with your deployments!

    Scott McArthur
    Senior Support Escalation Engineer
    Microsoft Commercial Services & Support

  • Installing Volume Activation Services Role in Windows Server 2012 to Configure ADBA


    Today’s blog will walk you through installing the Volume Activation services role on Windows Server 2012 so you can configure Active Directory Based Activation (ADBA).

    Initial Configuration of Active Directory Based Activation

    1. After installing Windows Server 2012 open Server Manager and choose “Add roles and features”

    2. Choose “Volume Activation Services” role


    Figure 1. Add roles and features wizard

    3. When complete, you will see a yellow triangle in Server Manager, which means the role is installed but additional configuration is required. Click Volume Activation Tools.


    Figure 2. Post deployment configuration required

    4. Choose Active Directory-Based Activation. You can also enter credentials different from those you are currently logged on with to connect to Active Directory.


    Figure 3. Select Volume Activation Method

    5. Enter your Windows Server 2012 KMS Host key (CSVLK) to create the Windows Server 2012 Activation Object in AD. You should give it a name such as “Windows Server 2012 CSVLK”


    Figure 4. Enter CSVLK KMS host key


    • The CSVLK is used both to set up KMS hosts and to create the Active Directory Based Activation object
    • If you install the Windows Server 2012 KMS host key (CSVLK), the object created will activate both Windows Server 2012 and Windows 8
    • If you install the Windows 8 KMS host key (CSVLK), the object created will activate only Windows 8
    • It is possible to have both objects in Active Directory
    • For down-level clients such as Windows Vista, Windows Server 2008, Windows 7, and Windows Server 2008 R2 you will need to set up a KMS host
    • You can use VAMT 3.0, slmgr.vbs, adsiedit, or the Volume Activation Tools wizard to view and configure the objects
    • The objects can be ACL’d just like other objects in AD to restrict or enable access
    • For most environments, creating the Windows Server 2012 activation object is the simplest configuration
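    If you prefer the command line to the wizard, slmgr.vbs in Windows Server 2012 can create and list the activation objects as well. A sketch; the key and object name below are placeholders:

    ```cmd
    rem Create the ADBA object in AD with your CSVLK (online activation)
    cscript.exe %windir%\system32\slmgr.vbs /ad-activation-online XXXXX-XXXXX-XXXXX-XXXXX-XXXXX "Windows Server 2012 CSVLK"

    rem List the activation objects to confirm
    cscript.exe %windir%\system32\slmgr.vbs /ao-list
    ```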

    6. Choose online or phone activation and click Commit. Note: This activates the CSVLK itself, not this particular server.


    Figure 5. Online or phone activation

    Changing Configuration of Active Directory Based Activation

    Once you have configured Active Directory Based Activation you can check the configuration by opening Server Manager, clicking Tools, then “Volume Activation Tools”. Choose Active Directory-Based Activation and click Next. Click “Skip to Configuration” and you will see the following information about the activation objects in AD.

    It contains the following information:

    • Display Name
    • Activation ID
    • Partial Product Key
    • Extended PID
    • Distinguished name

    Note: You can click the object and press Ctrl-C to copy the information to the clipboard.

    You can also use the following command to view any Active Directory Based objects in AD:

    Cscript.exe %windir%\system32\slmgr.vbs /ao-list

    To manage KMS hosts, Active Directory Based Activation, or to get detailed information about activation for client machines, you should install VAMT 3.0, which is available as part of the Windows Assessment and Deployment Kit (ADK). It can be installed on any machine.

    Good luck with your activations :)

    Scott McArthur
    Senior Support Escalation Engineer
    Microsoft Commercial Support and Services

  • Installing Volume Activation Services Role in Windows Server 2012 to setup a KMS Host


    Today’s blog will walk you through installing the Volume Activation services role on Windows Server 2012 so you can setup a new Windows Server 2012 KMS host.

    Initial Configuration of a new Windows Server 2012 KMS host

    1. After installing Windows Server 2012 open Server Manager and choose “Add roles and features”

    2. Choose Role-Based or Feature-Based installation

    3. Choose your local Windows Server 2012 Server from the Server Pool. Note: These steps do not cover adding the role to offline virtual hard disk.

    4. Choose “Volume Activation Services” role.


    Figure 1: Add roles and feature wizard 

    5. When complete, you will see a yellow triangle in Server Manager, which means the role is installed but additional post-deployment configuration is required. Click Volume Activation Tools.


    Figure 2. Additional Post Deployment Configuration

    6. Choose Key Management Service (KMS), then select the Windows Server 2012 computer you want to make the KMS host. You can also enter different credentials to connect to the server.


    Figure 3. Select volume Activation Method

    7. Enter your KMS Host Key (CSVLK). Note: You can only install a Windows Server 2012 CSVLK at this point. A Windows Server 2012 CSVLK can activate Windows Vista, Windows Server 2008, Windows 7, Windows Server 2008 R2, Windows 8, and Windows Server 2012 KMS clients.


    Figure 4. Install your KMS Host Key (CSVLK)

    8. Click Yes on the warning that you are replacing the currently installed key.


    Figure 5. Uninstall existing key

    9. Click Activate Product


    Figure 6. Activate product

    10. Select the product and type of activation (online or phone) and click Commit.


    Figure 7. Activate online or phone

    11. On the summary page click Close to accept defaults or click Next to configure options


    Figure 8. Default KMS Configuration

    12. If you click Next you can configure the following options:

    • Volume license activation interval (default 2 hours): How often a KMS client attempts activation, while in the Grace and Notification states, until it is activated
    • Volume license renewal interval (default 7 days): How often a KMS client renews its activation
    • KMS TCP listening port (default 1688): Port used by KMS
    • KMS firewall exceptions: Which firewall profiles to add the KMS exception to
    • DNS records (default yes): Publish the KMS SRV record to DNS. If unchecked, you can create the record manually using these steps
    • Publish to custom DNS zones: Names of additional DNS zones to publish the KMS record to


    Figure 9. Customize default KMS Host configuration

    13. Click Commit to save changes
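    The same settings can be inspected or changed from the command line with slmgr.vbs, which is handy on Server Core. The values shown are the defaults:

    ```cmd
    rem Detailed license/KMS information for this host
    cscript.exe %windir%\system32\slmgr.vbs /dlv

    rem Set the KMS TCP listening port
    cscript.exe %windir%\system32\slmgr.vbs /sprt 1688

    rem Set the activation interval, in minutes (2 hours = 120)
    cscript.exe %windir%\system32\slmgr.vbs /sai 120

    rem Set the renewal interval, in minutes (7 days = 10080)
    cscript.exe %windir%\system32\slmgr.vbs /sri 10080

    rem Enable (/sdns) or disable (/cdns) automatic DNS publishing
    cscript.exe %windir%\system32\slmgr.vbs /sdns
    ```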

    Changing Configuration of KMS Host

    Once you have configured the KMS host you can check the configuration by opening Server Manager, clicking Tools, then “Volume Activation Tools”. Enter the computer name, and you will get a prompt where you can choose “Skip to Configuration”.


    Figure 10. Skip to Configuration

    The Configuration screen shows some of the common configuration options available; see above for additional information on these options. In previous versions of Windows, some of these options were only available through slmgr.vbs or specific registry keys. Windows Server 2012 now adds a GUI to configure them.


    Figure 11. KMS Host Configuration

    You should also verify that the KMS host registered in DNS. For additional information see the following KB article:

    Event ID 12293: 0x80072338 error registering KMS host in DNS
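    A quick way to check the DNS registration from any domain member (yourdomain.com is a placeholder for your DNS domain):

    ```cmd
    rem KMS clients discover hosts via this SRV record
    nslookup -type=srv _vlmcs._tcp.yourdomain.com
    ```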

    To manage KMS hosts, Active Directory Based Activation, or to get detailed information about activation for client machines, you should install VAMT 3.0, which is available as part of the Windows Assessment and Deployment Kit (ADK). It can be installed on the KMS host or a client machine.

    Good luck with your activations :)

    Scott McArthur
    Senior Support Escalation Engineer
    Microsoft Commercial Support & Services

  • Windows 8 and Windows Server 2012 Recommendations for Activation


    Today’s blog will cover the options available if you are deploying Windows 8 and/or Windows Server 2012 into an existing KMS activation configuration. It references two other blogs and assumes the following:

    • You have existing KMS hosts activating Windows Vista, Windows 7, Windows Server 2008, and Windows Server 2008 R2
    • Your KMS hosts are running a server operating system, not a client operating system
    • You are planning to deploy Windows 8 and/or Windows Server 2012
    • You may or may not have Office 2010 or Office 2013 configured for KMS activation

    Important points

    • Windows 8 and Windows Server 2012 have the option of using Active Directory Based Activation (ADBA), where a domain-joined computer looks for an activation object in AD and is automatically activated.
      • ADBA requires that AD DS be at the Windows Server 2012 schema level to store activation objects. Domain controllers running earlier versions of Windows Server can activate clients after their schemas have been updated using the Windows Server 2012 version of Adprep.exe.
    • Older operating systems will not be updated to support ADBA
    • There are 2 types of KMS host keys for Windows 8 and Windows Server 2012:
      • Windows 8 client KMS host key (CSVLK): Only activates Windows 8 and earlier client operating systems. Can only be installed on KMS hosts running client operating systems
      • Windows Server 2012 KMS host key (CSVLK): Activates all operating systems, and will be the most common CSVLK you install

    Configurations and Recommendations

    The following are a few common configurations and options. This should not be considered an exhaustive list of possible solutions, since every deployment can be slightly different and have different goals and planning needs. When considering options, the important question is whether you are willing to update your schema level using Adprep. If not, then Active Directory Based Activation is off the table, although you can always do it later. You also need to consider Office activation if you are using Office.

    Configuration #1: Current KMS hosts are running Windows Server 2003

    Windows Server 2003 KMS hosts cannot install Windows 8 or Windows Server 2012 CSVLKs, and there will be no update to support them.

    Option #1 (Mix of KMS host and ADBA)

    Option #2 (KMS host only)

    Configuration #2: Current KMS hosts are running Windows Server 2008

    Option #1 (mix of KMS host and ADBA)

    Option #2 (Replace KMS host)

    Option #3(Maintain existing KMS host)

    • On your Windows Server 2008 KMS host, install update KB2757817, then install your Windows Server 2012 KMS host key (CSVLK) and activate
    • All operating systems will be activated by the Windows Server 2008 KMS host
    • Note: Windows Server 2008 does not support Office KMS

    Configuration #3: Current KMS hosts are running Windows Server 2008 R2

    Option #1 (mix of KMS host and ADBA)

    Option #2 (KMS host only)

    The other factor that must be looked at when picking a solution is to determine how you will handle Office 2010 or Office 2013 activation. For additional information on Office see the following links:

    Additional resources

    Scott McArthur
    Senior Support Escalation Engineer
    Microsoft Commercial Support & Services

  • Don't Believe Everything You Read

    Recently, I was contacted by a customer who was advised by an ISV to set a registry value under one of the subkeys in HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\.  Let's call it UseQuantumComputing = 1 (value name has been… more
  • Looking for Windows Server 2012 Clustering and Hyper-V Hotfixes?

    In a previous blog, we discussed where you can get a list of the Failover Clustering and Hyper-V hotfixes for Windows Server 2008 R2.

    Looking for Windows Server 2008 R2 Cluster and Hyper-V Hotfixes?

    In this blog, we give the links to newly released hotfixes for Failover Clustering and Hyper-V so you know what hotfixes are out there. There are two Wiki pages that will make life easier for you when looking for hotfixes for Windows Server 2012 Clustering and Hyper-V.

    Hyper-V: Update List for Windows Server 2012

    List of Failover Cluster Hotfixes for Windows Server 2012

    These Wiki pages are updated on a regular basis.  You can set up an RSS subscription to each Wiki page so you can see the updates in your favorite RSS app or Outlook.

    John Marlin
    Senior Support Escalation Engineer
    Microsoft Enterprise Platforms Support

  • Where Did My Space Go?


    While working on Windows 8/Windows Server 2012, I have been tied up with other content creation projects and as such have not had time to write for AskCore. However, I’ve had a number of blog ideas rattling around in my head. Today I finally found some time to get one of them committed to paper...ah...Word document.

    One of the most involved classic cases that I’ve been called upon to provide support for is the ‘where is my hard drive space’ case.

    The call usually starts like this…

    “I have a 250GB hard drive. When I look at the properties of the drive, it tells me that I have 75GB free and that 175GB are in use. However, when I add all my files together, I only see 150GB in use. That is a difference of 25GB. Where did my space go?”

    This is not an easy question to answer. I’ve spent years improving my understanding of how files are stored in NTFS and while I can explain it, I end up having to teach concepts that customers really don’t need to know. My mentor, Dennis Middleton, even wrote a couple of blogs on the subject. But they have the same problem. You have to learn quite a bit about NTFS to really understand why these numbers never seem to work out as expected.

    So in this blog entry, I will attempt to explain what is happening without getting too deep into the internals of NTFS. Some concepts I will have to explain, but I’ll try to keep the details light.

    First of all, and I can’t stress this enough, it is NOT broken. NTFS is storing data according to its design. The problem is that when you add up all the files on your disk, you aren’t seeing what you think you are seeing. In fact, unless you are deep into file system forensics, you don’t really know what you are seeing at all.

    There are two methods of looking at the space on your disk. I call them The Right Way and The Wrong Way.

    The Right Way

    Open the properties page on your volume.


    Rule of thumb: The pie does not lie.

    This is quick and slick… and always correct. Under the covers we are querying a special metafile called $BITMAP to get these values. This metafile is like a database of the clusters on your volume. It doesn’t know anything about what’s stored in a particular cluster; it just knows whether the cluster is in use or not. This is also why the pie chart draws so quickly: we are only reading one file to get this information.
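    You can pull the same cluster-accounting numbers from the command line with FSUTIL, which reports free and total bytes for the volume:

    ```cmd
    rem Free/total byte counts for the volume, same accounting as the pie chart
    fsutil volume diskfree C:
    ```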

    The Wrong Way

    The wrong way is to use Windows Explorer, navigate to your root directory, and get the properties on all your files. To the casual observer, and honestly, to most people, this SEEMS like it should be the same thing. But it is not. It’s not even close.


    Apples and Oranges

    Comparing these two is really comparing apples and oranges. The Right Way will tell you how much free space/used space you have. The Wrong Way, however, tells you this and only this…

    The space used by the unnamed data streams for the files that the current user has access to and are not hidden.

    So any files that are hidden or that the user doesn’t have access to are not included when adding up the files using The Wrong Way. And before you start thinking to yourself, “I’m the administrator on this computer and I have access to everything.”….yes you are, but no you don’t.

    What you don’t see can really be broken down into two categories, Normal files and Metafiles.

    Normal Files

    Normal files are just that, normal files. But in this case you don’t have permissions to look at them. The best example of this is the ‘System Volume Information’ directory that is just off the root directory. By default, you don’t have permissions to view what’s in there. Since this is where VSS stores its snapshots, there can be some big files in there. And for those of you running Windows Server 2012, this is also where the Chunk Store is kept for volumes using file system deduplication.

    Also, I’ve seen applications that keep some of their logs hidden away from the user. They can be accessed via the application, but you can’t see them normally in Windows Explorer due to permissions.

    For instances like this, you can take ownership of the files/directories and change the permissions. However, doing so might cause problems to the system. So if you want to see the file sizes on files that you do not have permissions for, use the method discussed in the next few sections.

    Metafiles

    Metafiles are files hidden away by Windows that are used for specific functions. Earlier I mentioned the $BITMAP metafile. I also authored a blog entry a while ago that talked about some of the other common metafiles.

    NTFS MetaFiles

    Unlike normal files, where you can take ownership of them and change the permissions, metafiles are hidden and Windows does not want you to see them. This is for your own good. Interacting with metafiles directly just causes trouble.

    Since metafiles can take up space, and we can’t look at them with Windows Explorer, I worked out a relatively quick way to look at these files. Some metafiles can be queried using various means (CHKDSK, FSUTIL, etc) but this method will work for all metafiles.

    I do want to take a moment to point out that if a metafile is taking up space, that doesn’t necessarily mean that there is a problem. Some metafiles will take up a large amount of space and it is completely normal that they do. Don’t get focused on preventing it from happening or putting a stop to it.
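    For the metafiles that do have query tools, FSUTIL gives a quick look without a sector editor; for example, the NTFS details for a volume include the current size of the MFT:

    ```cmd
    rem Dumps NTFS details for the volume, including "Mft Valid Data Length"
    fsutil fsinfo ntfsinfo C:
    ```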

    Viewing MetaFiles

    To look at metafiles, we need a sector editor that understands NTFS. Microsoft has a sector editor (Disk Probe), but it doesn’t really have the functionality needed for this type of work. There are a number of sector editors out there. If you don’t have a favorite, I recommend you do some research to find one that suits your needs. For the purposes of screenshots, I will be using WinHex. It’s not an endorsement; I just have to use something.

    Once the volume is loaded up, you can see all the files in the root directory.


    A number of metafiles are actually in the root directory. The $BITMAP file is highlighted and we can see that it is approximately 29.1MB in size. So just by itself, this is nearly 30MB of used space that wouldn’t show up if you used The Wrong Way. However, this isn’t really that much, and this particular metafile doesn’t normally get bigger unless the volume is extended.

    Actually, most metafiles are quite small and can be ignored. So let’s look at the ones that can grow to a considerable size.

    $LOGFILE: This file can get big if its maximum size is set beyond the default.

    $MFT: The more files you have and the more fragmentation you have, the more entries will be created in the Master File Table. This will cause the $MFT metafile to grow.

    $SECURE: This is where security descriptors are stored. The more complex your NTFS security is on this volume, the larger this file can grow. It is currently listed at 0 bytes. I’ll explain this shortly.

    Before we continue, it is important to note that the sizes listed here are not always the entire size of the file. This size just refers to the file’s primary (unnamed) data stream.

    Some files will show a “…” on the icon. This means that the file has some other attribute in it that is taking up space as well.

    Using $SECURE as an example, if you drill down on that file, you can see the three additional attributes that it contains: $SDH, $SDS, and $SII.


    You don’t have to know what they are for. You just need to know that they take up space. If I added all the sizes together, I’d get the approximate amount of space that the file really uses. In this case that works out to about 2MB. So not that big in this case.

    Now let’s venture outside of the root directory. Earlier, when we listed out the root directory, at the top of the list, we could see $EXTEND.


    This is a metafile, but it is also a directory. As such, we can drill down to see what’s in it.


    We can see a few metafiles in here, including another directory. The metafile that I want to draw your attention to is $USNJRNL. This is the metafile used for the USN Journal (aka change journal or NTFS journal).

    Without getting too deep into how this file works, it is used by various applications to track changes. As such, this file can actually get pretty big. In our screenshot we can see that it’s listed as 0 bytes; however, it does have additional attributes in it.


    When we drill down into it, you can see two additional attributes, $J and $MAX. $J is normally the big one. While my test volume doesn’t have a large change journal, I have worked a number of cases where it has grown to several gigabytes in size. In fact, my home computer has a change journal of about 12GB.

    NOTE: $J is actually sparse, so the amount of space it takes up on disk will vary from what’s listed. Discussing sparse files in depth is outside the scope of this article.
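    The effect of sparse allocation is easy to demonstrate. Here is a minimal sketch (not NTFS-specific; most modern filesystems behave the same way, though on Windows `st_blocks` is unavailable and you would use a tool such as FSUTIL to see allocation instead): a file’s logical size and the space actually allocated for it can differ wildly.

```python
import os

# Create a sparse file: seek far past the start, then write a single byte.
# The filesystem records a large logical size but only allocates blocks
# for the regions that were actually written.
path = "sparse_demo.bin"
with open(path, "wb") as f:
    f.seek(100 * 1024 * 1024)  # 100 MB offset
    f.write(b"\0")

st = os.stat(path)
logical = st.st_size            # what a directory listing reports
allocated = st.st_blocks * 512  # space the file really occupies on disk

print(f"logical size:   {logical} bytes")
print(f"allocated size: {allocated} bytes")
os.remove(path)
```

    This is why a sparse file like $J can be “listed” at one size while consuming far less (or, for other files, far more) actual disk space.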

    And while we have been using this method to look at metafiles, it works for normal files as well, even if you do not have permissions for the file. It won’t allow you to read the contents of the file, but that’s not what this is about. This is about viewing file sizes.

    Notice that in this case, I didn’t have any one place where a great deal of space was hiding. Sometimes that’s the way it works. A bunch of small spaces, used up here and there, adding up to a big difference.
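    If you want a quick feel for how big that gap is on your own volume, you can compare a recursive sum of file sizes against what the filesystem itself reports as used. A minimal sketch (the starting path is just an example; files we cannot stat are silently skipped, which is exactly the blind spot The Wrong Way has):

```python
import os
import shutil

def wrong_way_total(root):
    """Sum the unnamed-stream sizes of every file we can reach,
    skipping anything we lack permission to stat -- i.e., what a
    manual right-click-and-add-up survey would see."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            try:
                total += os.stat(os.path.join(dirpath, name)).st_size
            except OSError:
                pass  # no permission, vanished file, etc.
    return total

root = "."  # example: survey from the current directory
visible = wrong_way_total(root)
used = shutil.disk_usage(root).used  # what the "pie chart" reports

print(f"sum of visible file sizes: {visible}")
print(f"volume used space:         {used}")
# The difference is metafiles, alternate streams, snapshots, and
# files the current user cannot see.
```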

    Final Notes

    There is one other place where used space can hide. It’s very rare, and I only include it in the spirit of being complete. We learned that The Wrong Way only showed us…

    The space used by the unnamed data streams for the files that the current user has access to and are not hidden.

    So what about other streams? Without getting too deep into a discussion about streams, let me cobble up a simplified view. Picture this: you open a Word document and type out some text. Then you save that file. The text and all its formatting are saved in a structure called the ‘unnamed data stream’. But files can have other streams, called alternate data streams. When you use The Wrong Way, it doesn’t take into account space used in alternate data streams.

    Most of the time alternate data streams are tiny. However, I have been involved in a couple of cases where an application stored gigabytes of data in alternate data streams. So if you are tracking down space and you just can’t find it anywhere else, look at the alternate data streams on your files.
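    On an NTFS volume you can create and observe an alternate data stream straight from Python by appending `:streamname` to the file name. A hedged sketch (on non-NTFS filesystems the colon form simply creates a separately named file rather than a stream, but the point still shows: the base file’s reported size excludes the extra data):

```python
import os

base = "ads_demo.txt"

# Write a small unnamed (main) data stream.
with open(base, "w") as f:
    f.write("hello")

# Write a much larger "alternate" stream. On NTFS, the name:stream
# syntax creates a real alternate data stream attached to the file;
# elsewhere it simply creates a separate file with a colon in its name.
with open(base + ":hidden", "w") as f:
    f.write("x" * 4096)

# A directory listing -- and The Wrong Way -- only reports the
# unnamed stream's size.
reported = os.path.getsize(base)
print(f"reported size of {base}: {reported} bytes")  # 5, not 4101

# Clean up both possible artifacts.
for p in (base + ":hidden", base):
    try:
        os.remove(p)
    except OSError:
        pass
```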

    Microsoft provides a command line tool (Streams, from Sysinternals) that will look at your alternate data streams and let you know which files have them.

    Alternate data streams are not necessarily bad. In fact, some Microsoft programs use alternate data streams to tag and classify files. But again, those streams are very small. Use the STREAMS.EXE tool to look for streams that are large, or for a large number of files that have alternate data streams.

    Also, there is a specific issue that can cause the WinSxS directory to use up additional disk space. So I want to make note of it here.

    2795190  How to address disk space issues that are caused by a large Windows component store (WinSxS) directory


    So that brings us to a close. There are a number of places on your volume that you can’t see and can’t directly interact with. This is why using The Wrong Way to look at your disk space is simply a bad idea. Trust the pie chart. And remember, just because a file or metafile is using space, doesn’t mean that there is a problem.

    Thanks for reading,

    Robert Mitchell
    Senior Support Escalation Engineer

    Interested in Azure Server Backup?  Check out my videos…

    Want to know more about Microsoft storage?  Check out my blogs...

    I also write content for Windows IT Pro magazine: