Mark Russinovich’s technical blog covering topics such as Windows troubleshooting, technologies and security.
Last time, I covered the limits and how to measure usage of one of the two key window manager resources, USER objects. This time, I’m going to cover the other key resource, GDI objects. As always, I recommend you read the previous posts before this one, because some of the limits related to USER and GDI resources are based on limits I’ve covered. Here’s a full index of my other Pushing the Limits of Windows posts:
Pushing the Limits of Windows: Physical Memory
Pushing the Limits of Windows: Virtual Memory
Pushing the Limits of Windows: Paged and Nonpaged Pool
Pushing the Limits of Windows: Processes and Threads
Pushing the Limits of Windows: Handles
Pushing the Limits of Windows: USER and GDI Objects – Part 1
GDI objects represent graphical device interface resources like fonts, bitmaps, brushes, pens, and device contexts (drawing surfaces). As it does for USER objects, the window manager limits processes to at most 10,000 GDI objects, which you can verify with Testlimit using the –g switch:
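The quota check itself is conceptually simple; here's a toy Python model of a per-process GDI quota (the names and structure are my own illustration, not win32k's actual implementation):

```python
class ToyGdiTable:
    """Toy model of the window manager's per-process GDI object quota.
    The real enforcement lives in win32k.sys; 10,000 is the default limit."""
    PER_PROCESS_LIMIT = 10_000

    def __init__(self):
        self.objects_by_process = {}

    def create_object(self, pid):
        count = self.objects_by_process.get(pid, 0)
        if count >= self.PER_PROCESS_LIMIT:
            return None  # like CreatePen/CreateBitmap returning NULL on failure
        self.objects_by_process[pid] = count + 1
        return (pid, count)  # stand-in for a handle

table = ToyGdiTable()
handles = [table.create_object(1234) for _ in range(10_001)]
print(sum(h is not None for h in handles))  # 10000 succeed; the 10,001st fails
```

This mirrors what Testlimit observes: the first 10,000 creations succeed and every one after that fails, regardless of how much memory remains.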
You can look at an individual process’s GDI object usage on the Performance page of its Process Explorer process properties dialog and add the GDI Objects column to Process Explorer to watch GDI object usage across processes:
Also like USER objects, 16-bit interoperability means that GDI objects have 16-bit identifiers, limiting them to 65,535 per session. Here’s the desktop as it appeared when Testlimit hit that limit on a Windows Vista 64-bit system:
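That ceiling follows directly from the handle format: the low 16 bits of a GDI handle index a session-wide table, so only 2^16 slots are addressable. A small Python sketch of the arithmetic (the exact layout of the upper bits is undocumented and assumed here purely for illustration):

```python
INDEX_BITS = 16
TABLE_SLOTS = 1 << INDEX_BITS   # 65,536 addressable slots
USABLE = TABLE_SLOTS - 1        # slot 0 is unusable, hence the 65,535 limit

def make_handle(index, metadata):
    # low 16 bits: table index; upper bits: type/uniqueness metadata (assumed layout)
    assert 0 < index < TABLE_SLOTS
    return (metadata << INDEX_BITS) | index

def handle_index(handle):
    # recover the 16-bit table index from a handle value
    return handle & (TABLE_SLOTS - 1)

h = make_handle(0x1234, metadata=0x05)
print(hex(handle_index(h)), USABLE)  # 0x1234 65535
```

No matter how much memory a 64-bit system has, a 16-bit index can never address more than 65,536 entries, which is why the session-wide limit doesn't grow on 64-bit Windows.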
Note that the Start button is on the bottom left where it belongs, but the rest of the taskbar has moved to the top of the screen. The desktop has turned black and the sidebar has lost most of its color. Your mileage may vary, but you can see that bizarre things start to happen, potentially making it impossible to interact with the desktop in a reliable way. Here’s what the display switched to when I pressed the Start button:
Unlike USER objects, GDI objects aren’t allocated from desktop heaps; instead, on Windows XP and Windows Server 2003 systems that don’t have Terminal Services installed, they allocate from general paged pool; on all other systems they allocate from per-session session pool.
The kernel debugger’s “!vm 4” command dumps general virtual memory information, including session information at the end of the output. On a Windows XP system it shows that session paged pool is unused:
On a Windows Server 2003 system without Terminal Services, the output is similar:
The GDI object memory limit on these systems is therefore the paged pool limit, as described in my previous post, Pushing the Limits of Windows: Paged and Nonpaged Pool. However, when Terminal Services are installed on the same Windows Server 2003 system, you can see from the non-zero session pool usage that GDI objects come from session pool:
The !vm 4 command in the above output also shows the session paged pool maximum and session pool sizes, but the session paged pool maximum and session space sizes don’t display on Windows Vista and higher because they are variable. Session paged pool usage on those systems is capped by either the amount of address space it can grow to or the System Commit Limit, whichever is smaller. Here’s the output of the command on a Windows 7 system showing the current session paged pool usage by session:
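The "whichever is smaller" rule is easy to state directly; here is a trivial sketch with made-up numbers (the function name is mine, not a Windows API):

```python
def session_paged_pool_cap(address_space_bytes, system_commit_limit_bytes):
    """On Windows Vista and later, session paged pool has no fixed boot-time
    size; its effective ceiling is the smaller of the address space it can
    grow into and the System Commit Limit."""
    return min(address_space_bytes, system_commit_limit_bytes)

GB = 1024 ** 3
print(session_paged_pool_cap(4 * GB, 6 * GB) // GB)  # 4: address space binds
print(session_paged_pool_cap(4 * GB, 2 * GB) // GB)  # 2: commit limit binds
```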
As you’d expect, the main interactive session, Session 1, is consuming the most session paged pool.
You can use the Testlimit tool with the “–g 0” switch to see what happens when the storage used for GDI objects is exhausted. The number you specify after the –g is the size of the GDI bitmap objects Testlimit allocates, but a size of 0 has Testlimit simply try and allocate the largest objects possible. Here’s the result on a 32-bit Windows XP system:
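Allocating the largest possible objects and shrinking on failure is a standard exhaustion strategy; here's a sketch of it against a fake allocator (the real tool calls the Windows bitmap APIs, and the exact halve-on-failure policy is an assumption on my part):

```python
def exhaust(alloc, start_size, min_size=1):
    """Greedy exhaustion: try the largest size; on failure, halve and retry.
    `alloc(size)` returns True on success, False when the pool is exhausted."""
    total, size = 0, start_size
    while size >= min_size:
        if alloc(size):
            total += size
        else:
            size //= 2  # shrink and keep probing with smaller objects
    return total

# A fake pool of 1000 "bytes" stands in for session pool / paged pool.
pool = {"free": 1000}
def fake_alloc(size):
    if pool["free"] >= size:
        pool["free"] -= size
        return True
    return False

print(exhaust(fake_alloc, start_size=512))  # 1000: the fake pool is fully drained
```

The point of starting large is speed: a few big allocations consume most of the pool, and the shrinking tail mops up the fragments, which is why Testlimit can demonstrate exhaustion in seconds.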
On a Windows XP or Windows Server 2003 system that doesn’t have Terminal Services installed, you can use the Poolmon utility from the Windows Driver Kit (WDK) to see the GDI object allocations by their pool tag. The output of Poolmon while Testlimit was exhausting paged pool on the Windows XP system looks like this when sorted by bytes allocated (type ‘b’ in the Poolmon display to sort by bytes allocated), by inference indicating that Gh05 is the tag for bitmap objects on Windows XP:
On a Windows Server 2003 system with Terminal Services installed, and on Windows Vista and higher, you have to use Poolmon with the /s switch to specify which session you want to view. Here’s Testlimit executed on a Windows Server 2003 system that has Terminal Services installed:
The command “poolmon /s1” shows the tags with the largest allocations for Session 1. You can see the Gh15 tag at the top, showing that a different pool tag is being used for bitmap allocations:
Note how Testlimit was able to allocate around 58 MB of bitmap data (that number doesn’t account for GDI’s internal overhead for a bitmap object) on the Windows XP system, but only 10 MB on the Windows Server 2003 system. The smaller number comes from the fact that session pool on the Windows Server 2003 Terminal Server system is only 32 MB, which is about the amount of memory Poolmon shows attributed to the Gh15 tag. The output of “!vm 4” confirms that session pool for Session 1 has been consumed and that subsequent attempts to allocate GDI objects from session pool have failed:
You can also use the !poolused kernel debugger command to look at session pool usage. First, switch to the correct session by using the .process command with the /p switch and the address of a process object that’s connected to the session. To see what processes are running in a particular session, use the !sprocess command. Here’s the output of !poolused on the same Windows Server 2003 system, where the “c” option to !poolused has it sort the output by allocated bytes:
Unfortunately, there’s no public mapping between the window manager’s heap tags and the objects they represent, but the kernel debugger’s !poolused command uses the triage.ini file from the debugger’s installation directory to print more descriptive information about a tag. The command reports that Gh15 is GDITAG_HMGR_SPRITE_TYPE, which is only slightly more helpful, but others are more clear.
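The debugger's lookup amounts to a tag-to-description table search; this sketch parses a hypothetical excerpt in the `tag - module - description` style of pooltag.txt (triage.ini's real format is richer, and the entries below are illustrative, apart from the Gh15 description reported above):

```python
# Hypothetical excerpt in a "tag - module - description" layout. Only the
# Gh15 description comes from the article; the other entry is made up.
POOLTAG_DATA = """
Gh15 - win32k.sys - GDITAG_HMGR_SPRITE_TYPE
Gla1 - win32k.sys - (example placeholder entry)
"""

def load_tags(text):
    """Parse tag lines into a dict: tag -> (module, description)."""
    tags = {}
    for line in text.strip().splitlines():
        tag, module, desc = (field.strip() for field in line.split(" - ", 2))
        tags[tag] = (module, desc)
    return tags

tags = load_tags(POOLTAG_DATA)
print(tags["Gh15"][1])  # GDITAG_HMGR_SPRITE_TYPE
```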
Fortunately, most GDI and USER object issues are limited to a particular process hitting the per-process 10,000 object limit and so more advanced investigation to figure out what process is responsible for exhausting session pool or allocating GDI objects to exhaust paged pool is unnecessary.
Next time I’ll take a look at System Page Table Entries (System PTEs), another key system resource that has limits that can be hit, especially on Remote Desktop sessions on Windows Server 2003 systems.
From what I understand this really only affects 16 bits of data. So would running 64-bit Windows stop one program from using all the resources? How about a quad core? I'm guessing a program can only take up one core at a time.
There are only 2^16 places at the dinner table, regardless of how big the room is.
A program with multiple threads can spawn a thread per CPU core, and make that thread as busy as it likes.
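That spawning pattern looks like this in Python (note that CPython's GIL serializes these particular threads, so this only illustrates the pattern; native threads really would keep every core busy):

```python
import os
import threading

def burn(stop_event):
    # Spin until told to stop; one such thread per core keeps every core busy.
    while not stop_event.is_set():
        pass

stop = threading.Event()
cores = os.cpu_count() or 1
workers = [threading.Thread(target=burn, args=(stop,)) for _ in range(cores)]
for t in workers:
    t.start()
stop.set()  # stop immediately so this sketch terminates; a real CPU hog wouldn't
for t in workers:
    t.join()
print(len(workers))  # one worker per reported core
```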
@ Harry Johnston
An application doesn't have to be poorly written or faulty to reach those limits. I am a heavy browser user (when I say heavy, I mean opening three windows of Opera, Firefox or IE at once, with more than 20 tabs each). I can see my browser using 1500 MB (1.5 GB) and running out of GDI objects very fast, depending on those webpages' design. (Why do I have so many pages open? Well, for comparisons of programs and software, to get info from tutorials and guides to make my own personalized one, etc. Most of the time I've got that info and I have to wait to confirm with someone what exactly we need/want, then go back and choose from those pages. So they sit there for days or even weeks while I open others. No, I don't use the 'favorites' shortcuts, as most of those pages get closed when I've finished and I rarely look for them again.)
But that's not the only thing. I sometimes have 20 pictures open, with two or three tools to edit them (as each tool gives exactly what I want, the way I want it), plus the browsers, plus other software. Hell, sometimes I even have a movie running on my second screen with my onboard soundcard while I'm playing a game on my main screen with my main soundcard, with all that stuff in the background. And it runs fine!!! The only problem is when I open a new tab in the browser... sometimes I can barely do anything. Screen corruption (and yes, my RAM and video card are fine; it's a problem with GDI, I know it for sure, as I always keep an eye on GDI usage with Task Manager), and sometimes the browser even crashes. But sometimes I just need to terminate it and reload it (as it keeps all my tabs without problems), though that only gives me a few moves before it happens again (so I usually get the info from some tabs and close them down).
This doesn't happen to me all the time, as I don't push the browser to its limits 24/7, but more often than not, for sure.
It has been years since I looked at these limits, but as I recall you can bump up the number of GDI and USER handles quite a bit. Furthermore, I think I even tested allocating a bunch more handles than 2^16 on 64-bit XP, but admittedly my memory is a bit fuzzy at this point. The desktop heap size is noticeably larger under 64-bit Windows, so at least there is some relief to be found there.
It's really a good piece of info. Thanks.
I also can't understand why 100% of CPU time can be taken by a realtime process, when there should always remain at least about 10% of CPU time for concurrent threads.
In other words, use 100% of CPU time if you want within a realtime process, but if there are other concurrent threads that can run and are waiting for execution (not blocked by some I/O waiting for completion, whatever their priority, and not paused by a debugger), they should be given at least some shared time within that 10%.
No realtime process or process with elevated privileges should be allowed to take full control of the OS (taking all available resources: CPU time or memory limits).
Unfortunately, this easily happens, notably within web browser extensions like Adobe Flash... and this can even block the temperature monitoring and regulation, causing a system overheat, a sudden shutdown, a failure to reboot for several minutes, and possible hardware damage (notably on notebooks from lots of vendors, including IBM/Lenovo, Compaq/HP, Acer, Asustek... and many notebooks not built with a processor or GPU specially designed for mobile applications, which don't have hardware automatic control of temperature, but only a temperature monitor that requires some driver to slow down a bus frequency, or to switch on the fan or increase the fan speed, through Power Management APIs, or by running some thread that pauses the processor for some time in a more idle state).
Allowing 100% CPU usage in all conditions to a process that gets full control over all the other ones, which are left in complete starvation for long minutes, is clearly a severe design bug in the system scheduler.
I've seen only one way to avoid this problem: run Windows as a guest OS within a VM controlled by a hypervisor (such as MS Windows Server, or Oracle/Sun VirtualBox if using a desktop version of Windows 7, because the XP Mode that now replaces MS Virtual PC is actually not working, not reliable, and in fact not really supported by Microsoft, as it is already slated for extinction and already incompatible with almost all desktop/notebook PCs except those that can run Windows Server.)
The justification given above (related to games) is really bad, because it implies that desktop/notebook/mobile PCs are left unprotected against DoS attacks and against every program that can take full control of them and burn the hardware easily.
As a consequence, Windows 7 (which was theoretically not made to create servers) is really not suited for mobile/notebook/desktop environments, and it is a major defect of the OS that it forgot to include a working and supported hypervisor enforcing some reasonable system usage limits.
What I would like to know is if there is any increase in the total number of GDI objects that can be created in Windows 7 64-bit. There are still trading applications like Bloomberg Launchpad that crash when they hit the 10,000 GDI limit. I know other applications that require 15,000 GDI objects to run. (I have "over-clocked" the quota successfully.)
Does 64-bit Windows 7, or Vista for that matter, have a higher limit than 65,535?
btw, I see Testlimit as a stress tester; this isn't supposed to be run on a production box.
To Jyorgy: There is absolutely no sense in having that many windows open. Change your work habits. If I had a trader that did that, I would box his ears. I have seen people with 54 IMs open, 97 Outlook windows, and other trading apps as well, and the knucklehead couldn't fathom what was wrong. It is the nature of the beast. The operating system can only draw so many objects before it can't anymore. I am amazed that MS hasn't done anything to rectify this bottleneck. I noted that the desktop heap has gotten larger; that just means your applications are going to gulp down memory at an even faster pace. Really smart. This was one of the primary reasons I could see to upgrade the system, but now, what do we do? Wait for Windows 8?
Hello, my IE8's GDI objects seem to be leaking... Are there any good tools to check the call stacks where leaked GDI objects were created?
Please create a blog post about performance scaling in Windows Server/Client, and how to measure and find the bottleneck: network, hardware, software.