The Case of the Insecure Security Software

A little over a year ago I set out to determine exactly why, prior to Windows Vista, the Power Users security group was considered by most to be the equivalent of the Administrators group. I knew the answer lay in the fact that default Windows permissions allow the group to modify specific Registry keys and files that enable members of the group to elevate their privileges to those of the Local System account or Administrators group, but I didn’t know of any concrete examples. I could have manually investigated the security on every file, directory and Registry key, but instead decided to write a utility, AccessChk, that would answer questions like this automatically. AccessChk quickly showed me directories, files, keys, and even Windows services written by third parties that Power Users could modify to cause an elevation of privilege. I posted my findings in my blog post The Power in Power Users.

Since the posting, AccessChk has grown in popularity as a system security auditing tool that helps identify weak permissions problems. I’ve recently received requests from groups within Microsoft and elsewhere to extend its coverage of securable objects analyzed to include the Object Manager namespace (which stores named mutexes, semaphores and memory-mapped files), the Service Control Manager, and named pipes.

When I revisited the tool to add this support, I reran some of the same queries I had performed when I wrote the blog post, like seeing what system-global objects the Everyone and Users groups can modify. The ability to change those objects almost always indicates the ability for unprivileged users to compromise other accounts, elevate to system or administrative privilege, or prevent services or programs run by the system or other users from functioning. For example, if an unprivileged user can change an executable in the %programfiles% directory, they might be able to cause another user to execute their code. Some applications include Windows services, so if a user could change the service executable they could obtain system privileges.

These local elevation-of-privilege and denial-of-service holes are unimportant on single-user systems where the user is an administrator, but on systems where a user expects to be secure when running as a standard user (like Windows Vista), and on shared computers like family PCs with unprivileged accounts, Terminal Server systems, and kiosk computers, they break down the security boundaries that Windows provides to separate unprivileged users from each other and from the system.

In my testing I executed AccessChk commands to look for potential security issues in each of the namespaces it supports. In the commands below, the -s option has AccessChk recurse a namespace, -w has it list only the objects for which the specified group – Everyone in the examples – has write access, and -u directs AccessChk to not report errors when it can’t query objects for which your account lacks permissions. The other switches indicate what namespace to examine, where the default is the file system.

File system:      accesschk everyone -wsu %programfiles%
File system:      accesschk everyone -wsu %systemroot%
Registry:         accesschk everyone -kwsu hklm
Processes:        accesschk everyone -pwu *
Named Objects:    accesschk everyone -owu \basenamedobjects
Services:         accesschk everyone -cwu *

I ran similar commands looking for write access from the Authenticated Users and Users groups. An output line, which looks like “RW C:\Program Files\Vendor”, reveals a probable security flaw.
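As a rough sketch of how such output can be triaged mechanically (the “RW path” line format is shown above; the helper and sample paths below are illustrative, not part of AccessChk):

```python
# Sketch: flag lines in AccessChk-style output where the queried group has
# write access. The exact output format is an assumption based on the
# examples in this post.
def find_writable(accesschk_output):
    """Return the object paths marked with write access."""
    flagged = []
    for line in accesschk_output.splitlines():
        line = line.strip()
        if line.startswith(("RW ", "W ")):
            # Everything after the access column is the object path.
            flagged.append(line.split(None, 1)[1])
    return flagged

sample = """\
R  C:\\Program Files\\GoodApp
RW C:\\Program Files\\Vendor
RW C:\\Program Files\\Vendor\\service.exe
"""
print(find_writable(sample))
```

Each flagged path then warrants a manual look at what the object is used for and which process owns it.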

To my surprise and dismay, I found security holes in several namespaces. The security settings on one application’s global synchronization and memory mapping objects, as well as on its installation directory, allow unprivileged users to effectively shut off the application, corrupt its configuration files, and replace its executables to elevate to Local System privileges. What application has such grossly insecure permissions? Ironically, that of a top-tier security vendor.

For instance, AccessChk showed output that indicated the Users group has write access to the application’s configuration directory (note that names have been changed):

RW C:\Program Files\SecurityVendor\Config\
RW C:\Program Files\SecurityVendor\Config\scanmaster.db
RW C:\Program Files\SecurityVendor\Config\realtimemaster.db

Because malware would run in the Users group, it could modify the configuration data or create its own version and prevent the security software from changing it. It could also watch for dynamic updates to the files and reset their contents.
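A minimal sketch of that watch-and-reset attack (the file name comes from the output above; content comparison stands in for a real change notification, and everything here runs against a temporary directory):

```python
# Sketch: malware in the Users group watching a world-writable configuration
# file and undoing every update the security product makes.
import os
import tempfile

def revert_if_changed(path, wanted):
    """Rewrite the file with attacker-chosen content if it was updated."""
    with open(path) as f:
        if f.read() == wanted:
            return False
    with open(path, "w") as f:
        f.write(wanted)
    return True

cfg = os.path.join(tempfile.mkdtemp(), "scanmaster.db")
with open(cfg, "w") as f:
    f.write("neutered settings")      # attacker's preferred configuration

with open(cfg, "w") as f:
    f.write("updated signatures")     # the security product writes an update

reverted = revert_if_changed(cfg, "neutered settings")
print(reverted, open(cfg).read())
```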

For the object namespace, it reported output lines like this:

RW [Section]    \basenamedobjects\12345678-abcd-1234-cdef-123456789abc
RW [Mutant]     \basenamedobjects\87654321-cdab-3124-efcd-6789abc12345

I executed handle searches in Process Explorer to determine which processes had these objects open and it reported those of the security software. Sections represent shared memory, so it was likely that the security agent, running in user logon sessions, was using the section to communicate data to the security software’s service process running in the Local System account. Malware could therefore modify the contents of the memory, possibly triggering a bug in the service that might allow the malware to obtain administrative rights. At a minimum it could manipulate the data to foil the communication.
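To illustrate the risk portably (a sketch, not the vendor's actual objects: a file-backed mmap stands in for a named section under \BaseNamedObjects, and two mappings stand in for the service's and the attacker's handles):

```python
# Sketch: two views of the same shared memory. Any account with write access
# to the section can rewrite data the privileged service trusts.
import mmap
import tempfile

backing = tempfile.TemporaryFile()
backing.write(b"\x00" * 64)
backing.flush()

service_view = mmap.mmap(backing.fileno(), 64)   # privileged service's mapping
attacker_view = mmap.mmap(backing.fileno(), 64)  # any user with write access

service_view[:6] = b"CLEAN\x00"      # service publishes its status
attacker_view[:6] = b"EVIL\x00\x00"  # malware rewrites it underneath the service
print(bytes(service_view[:6]))       # the service now consumes attacker data
```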

“Mutant” is the internal name for Windows mutexes, and the security software’s service was using the mutex for synchronization. That means that malware could acquire the mutex and block forward progress by the service. There were more than a few of these objects with wide-open security that could potentially be used to compromise or disable the security software.
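The denial of service is simple to sketch (a threading.Lock stands in for the world-accessible named mutex; a real attack would open the object by name):

```python
# Sketch: an attacker acquires a shared mutex and never releases it, so the
# service's next synchronized operation can never proceed.
import threading

shared_mutex = threading.Lock()

shared_mutex.acquire()   # malware grabs the mutex and holds it forever

# A timeout makes the service's stall observable instead of hanging the demo.
service_made_progress = shared_mutex.acquire(timeout=0.1)
print(service_made_progress)
```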

In the wake of my discovery, I analyzed the rest of my systems, as well as trial versions of other popular security, game, ISP and consumer applications. A number of the most popular in each category had problems similar to those of the security software installed on my development system. I felt like I was shining a flashlight under a house and finding rotten beams where I had assumed there was a sturdy foundation. The security research community has focused its efforts on uncovering local elevations via buffer overflows and unverified parameters, but has completely overlooked these obvious problems – problems often caused by third-party ISVs and, in some cases, by the security vendors themselves.

Why are these holes created? I can only speculate, but because allowing unprivileged groups write-access to global objects requires explicit override of secure defaults, my guess is that they are common in software that was originally written for Windows 9x or assumed a single administrative user. When faced with permissions issues that crop up when migrating to a version of Windows with security, or that occur when their software is run by standard user accounts, the software developers have taken the easy way out and essentially turned off security.
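The anti-pattern can be sketched portably (Windows DACL calls aren't portable, so POSIX permission bits stand in: files start out non-writable by other users, and only an explicit step opens them up, much like granting Everyone full control):

```python
# Sketch: "turning off security" requires an explicit override of the
# secure default permissions a new file gets.
import os
import stat
import tempfile

path = os.path.join(tempfile.mkdtemp(), "app.cfg")
open(path, "w").close()   # default mode: other users cannot write

os.chmod(path, 0o777)     # the easy way out: everyone gets full access

mode = stat.S_IMODE(os.stat(path).st_mode)
world_writable = bool(mode & stat.S_IWOTH)
print(world_writable)
```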

Regardless of the reason, it’s time for software vendors – especially those of security applications – to secure their software. If you discover insecure software on your system please file a bug with the publisher, and if you are a software developer please follow the guidance in “Writing Secure Code” by Michael Howard and David LeBlanc.

Comments
  • Did you only find flawed security models in third-party applications, or were there numerous findings in the baseline OS?

    I work in infrastructure for a software developer, so I'll be passing this along for consideration.

    Thanks as always

  • Thanks for the tool. I found a potential problem in a couple of minutes.

    I get lots of output for Objects on 32-bit XP, but 0 for 64-bit XP, no matter what user or group I try. Objects not working on 64-bit?


  • D'Oh, how is this news? Some year-old examples:

    Security software from G-Data gives Everyone-FullAccess on its install directory and some registry keys.

    WebDrive, FTPDrive and Novell NetDrive set NULL DACLs on their service and driver.

    The NVidia ForceWare Driver gives Everyone-FullAccess on some keys in HKLM\SYSTEM which allows a user to DoS the system by writing garbage there. Additionally, the control panel uses a shared section for no good reason.

    DeviceLock, at least until version 5.76.1, gave Everyone-FullAccess on \Device\HarddiskX objects if the access list contained at least one allow entry (and otherwise totally removed any access, even from administrators). Hurray for "dd if=\\.\C:"!

    I could list many more examples...

    @rivet: Of course in the baseline OS there are only minuscule violations, e.g. some Full and Create access on some HKCR\CLSID\{CLSID} keys, but nothing serious. Microsoft isn't dumb.

  • > I get lots of output for Objects on 32-bit XP, but 0 for 64-bit XP, no matter what user or group I try. Objects not working on 64-bit?

    That's a bug that shows up only on 64-bit XP. I'll be posting an update in the next few days that addresses it.

  • I think that third-party software not only causes most Windows crashes, but also creates very many security flaws.

    For example, a driver that creates a device object (for example, one meant as an interface to a service) accessible to everyone can be very dangerous.

  • What Mark has pointed out is quite a general practice for many companies that need to migrate their software to the new OS.

    The developers really do take the easy way out in most cases. However, that is not because we're lazy, but mostly because there are at least two reasons for it, in my experience:

    1. The companies do not wish to put much budget into supporting a new OS version and push hard to make the software run with the same experience (as on the previous OS version) in a short time frame. The marketing and top-management people are naturally interested in implementing new competitive features rather than in investing in grinding/tuning existing functionality, which might be broken by an upcoming OS but can be quickly "healed" in the way Mark described.

    2. The companies are forced (for one reason or another) to support previous versions of the OS, at least those that are still supported by Microsoft. IMHO, Vista is great in many new areas. As a sys-engineer/architect and developer (by spirit), I would certainly love to re-design the software in my abode to use Vista's features and style, dropping outdated and aged things. Yet, to meet high-level goals, the interests of the company do not always coincide with my wishes.

    These are not complaints, just my view on things, and having said all of that, I think Microsoft still does a good job of forcing apps to comply with a new OS version by running various Logo programs. But I'd like to say that the "Certified for Vista" test cases do not mention such checks for the software as Mark describes, even though they enforce such details as manifests, code signatures, etc. I'm not sure what rules the "Designed for Vista" Logo program imposes - I did not work with that, but it might be worthwhile to update the Logo programs with Mark's prescriptions. IMHO, it will promptly force the companies to resolve the mentioned holes in their software, because logo compliance is a matter of business, making top management take it into account.

    P.S. I don't belong to any security software companies.

  • Dear Mark!

    This is a little off-topic, but since I do not receive any response when writing you an e-mail, I'll try this way.

    I experienced a nasty bug in the command line tool handle - which is in my case more useful than Process Explorer.

    The bug is reproducible on every Windows XP System.

    Try to open a txt file with Excel. Excel will create several *.tmp files in your %temp% directory. These files are called something like 37.tmp or 74.tmp. Now as long as Excel is open there is a handle on those files. Process Explorer recognizes that handle. Handle.exe also recognizes it as long as you call it from inside the directory, or use the 8.3 folder names. If you use the full folder names (with quotes, of course) then it won't recognize the handle.

    So at first I thought: It might be because of the spaces in the folder name "Local Settings". But that's not the case. Create another file in the same directory (even with the same/similar filename), and open it with a process that puts a handle on it. Now check that file with handle.exe and it works, no matter which way you try it.

    So it has something to do with the way Excel opens the file, and the path using spaces, or something?

    Anyway, it would be great if that could be fixed. Wouldn't it be worth a small investigation, and a blog entry? ;-)

  • I love these articles, even though I have little interest in Windows; it's the technical, investigative techniques that make for compelling reading. Thank you!

  • Thanks for the feedback, Jody!

  • You rock Mark.  This article cracked me up because I'm a bit frustrated with the AV companies with petty complaints about ASLR and whatnot.  Bottom line: Application Developers everywhere need to go back to school.  When doing research for a project on improvements in Trustworthy Computing, SDLC, Vista, and just hanging out on your site, Channel9, & IT Showtime I've really appreciated how you all are handling business.

    Bottom line: It's nice that an hour of your research leads to thousands of hours of training and development for those who need to get with the times. That just goes to show how 3rd-party app developers never used their 3 years of Vista CTP builds to their advantage.

  • A common reason for lowering security on the Program Files folder for an app is that the app creates data files in that location, and limited users can't create files there so....

    The reason often seems to be sheer laziness. It's too easy to use files ("myfile.txt") without any folder name, instead of a proper location with a CSIDL and too much trouble to change the app.
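    A sketch of the safer pattern (the %LOCALAPPDATA% lookup is Windows-specific; the fallback path and app name are assumptions for illustration):

```python
# Sketch: put per-user data in a user-writable location instead of
# loosening permissions on the install directory.
import os

def user_data_dir(app_name):
    # Per-user, writable without touching Program Files.
    base = os.environ.get("LOCALAPPDATA") or os.path.expanduser("~/.local/share")
    return os.path.join(base, app_name)

print(user_data_dir("MyApp"))
```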

  • I think the problem is plainly and simply that the security model is too complex and hard to use effectively without just disabling it. For example, I worked on an app that had to use secure registry keys, and it was a real bitch doing stuff that was even moderately more complicated than the examples you could find on MSDN. Even there the pickings were slim, with useless or buggy example code, and the like.

    If the security model is complex, requires a lot of knowledge and use of complex APIs and interfaces, and is poorly documented on top of that, then it's hardly a surprise that people mess it up so regularly.

    If MS wants to improve the general security of their ecosystem then they really need to expose simpler APIs and provide MUCH better documentation. Every example that has possible security implications should be coded securely, and be extremely clear about how the security aspects work.

    People cut and paste code and then go from there; maybe they shouldn't, but they do, especially when working on new material, and if the examples are crap, well then that's what gets rolled out. I've seen this elsewhere: just updating the sample code in a scripting language's documentation to use best practices resulted in a general improvement of the whole ecosystem.