DLL hijacking is a common technique malware authors use to take control of vulnerable programs. It’s useful because it essentially allows an attacker to execute arbitrary code with the executing user’s credentials. A common delivery vector is a drive-by download attack where, for instance, we convince a user to download a library into a vulnerable directory. The common perception is that this is an operating system issue. Well, it is and it isn’t. Windows and Linux each have a defined order in which they search for libraries, granted, but both can be exploited and both can be misconfigured.
Here, we’re going to address manual-loading vulnerabilities specifically. Modern operating systems have specific directories they search in order to find shared libraries. On Linux, it’s some collection of /usr/lib, /usr/local/lib, /opt/lib, or similar directories. On Windows systems, the loader searches the directory the application was loaded from, the system directory, the 16-bit system directory, the Windows directory, the current directory, and the directories listed in the PATH environment variable. Developers can also load shared libraries directly, via dlopen() on POSIX-compliant systems and LoadLibrary() (often combined with GetFullPathName() to resolve an absolute path) on Windows systems.
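To make the distinction concrete, here’s a sketch in Python using ctypes, which wraps dlopen()/LoadLibrary() under the hood. Loading by bare name defers to the system’s search order; loading by absolute path does not. This assumes a Linux system with the C math library available; the commented absolute path is typical for glibc on x86-64 but will vary.

```python
import ctypes
import ctypes.util

# Loading by name: the dynamic linker walks its search order
# (rpath, LD_LIBRARY_PATH, the ld.so cache, /usr/lib, ...) to find
# a match -- this is where hijacking opportunities live.
name = ctypes.util.find_library("m") or "libm.so.6"
libm = ctypes.CDLL(name)

# Loading by absolute path performs no search, so an attacker can't
# shadow the library from a directory earlier in the search order:
# libm = ctypes.CDLL("/usr/lib/x86_64-linux-gnu/libm.so.6")

libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]
print(libm.cos(0.0))  # 1.0
```

The same distinction applies to dlopen("libfoo.so", ...) versus dlopen("/opt/app/lib/libfoo.so", ...) in C.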
Well, as developers, we may not be interested in registering all our application libraries with the operating system. For example, if we have an application that we’ve designed modularly to make updating easier, we’ll likely split the application into groups of libraries we can update independently. This way, if we have specific code we expect we’ll be frequently updating, we can easily replace single, smaller library files, making the update process faster and using less bandwidth. In this case, we’re likely to install the library files themselves next to the application executable, or in a subdirectory of the application root.
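As a sketch of that layout, an application can resolve its module directory relative to its own location rather than handing a bare name to the loader. The "modules" directory name and the suffix logic below are illustrative, not from any particular product.

```python
import os
import sys


def module_path(name: str) -> str:
    """Build an absolute path to a bundled module installed next to the app.

    Resolving against the application's own location (rather than passing
    a bare library name to the loader) pins down exactly which file loads.
    """
    # sys.argv[0] stands in for the application executable here.
    app_dir = os.path.dirname(os.path.abspath(sys.argv[0]))
    suffix = ".dll" if os.name == "nt" else ".so"
    return os.path.join(app_dir, "modules", name + suffix)


print(module_path("updater"))
```

Note that pinning the path only helps if that path can’t be tampered with, which is exactly the problem the next paragraph raises.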
Keep in mind that when we do this, many users may actually install our software in their home directories rather than in a more protected system location where they have limited write access. When you combine the design strategy I outlined above with a home-directory installation, our library files can, in most cases, be overwritten by the installing user. So now we’re in a position where our application loads code from the file system that can essentially be arbitrarily replaced. If I, as a malicious operator, can download a library into a location that an application will load from, with the same name as the library it replaces, I can now execute arbitrary code.
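One quick way to spot this condition is to check whether the directory you load modules from is writable by the current, unprivileged user; if it is, anything loaded from it can be swapped out regardless of the individual file’s permissions. A minimal sketch (using a temporary directory to stand in for a home-directory install):

```python
import os
import tempfile


def is_swappable(path: str) -> bool:
    """Return True if the current user could replace files at `path`.

    If the containing directory is writable, an existing library can be
    renamed or deleted and replaced, even if the file itself is read-only.
    """
    directory = path if os.path.isdir(path) else os.path.dirname(path)
    return os.access(directory, os.W_OK)


# A home-directory-style install is user-writable by definition:
with tempfile.TemporaryDirectory() as install_dir:
    lib = os.path.join(install_dir, "updater.so")
    print(is_swappable(lib))  # True: the user owns this directory

# System locations usually are not, unless you're running as root:
print(is_swappable("/usr/lib"))
```

This is a diagnostic, not a defense: an installer can use it to warn about risky locations, but the real fix is verifying what you load, as discussed below.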
Here, I don’t even need to save the old library file - I can wrap the old library inside the new one (after all, if I know where you’re loading the library from, I likely have a copy of that library), or I can download and load that library file at runtime. That may result in a delay on initial application load, but I don’t mind putting up a pacifier dialog while I download the old library file. I’m nice like that.
Fortunately, circumventing this threat isn’t that hard; it just requires a little diligence and forethought. From an application perspective, you need to verify that the libraries you’re loading haven’t been tampered with, and some systems make that easier than others. Microsoft has integrated code signing via Authenticode and its various code signing tools, which allow developers to sign DLLs, EXEs, and a variety of other file types. Signing code can be a hassle, no doubt, but if you’re shipping code you expect users to install on their computers, be polite enough to be professional about it. Sign your code prior to distributing. It’s surprising how common unsigned code is today. OS X has similar code signing capabilities. Linux, however, has no built-in equivalent. That said, our attack model is built around DLL hijacking, and it assumes the attacker doesn’t have access to the loading executable itself. As a result, adding verification code to the executable to manually check the signature associated with a library via a homegrown verification scheme is appropriate and would work. Ideally, this would involve signing each delivered library and validating the signature at load time, but even checking the library’s file size and hash (using a SHA-2 family hashing algorithm) is certainly better than nothing.
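As a baseline for the hash-check approach, here’s a sketch that verifies a library’s SHA-256 digest against a known-good value recorded at build time, before the library is ever loaded. The function names and the commented-out usage are illustrative.

```python
import hashlib


def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, streaming in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def verify_before_load(path: str, expected_digest: str) -> None:
    """Refuse to proceed if the on-disk library doesn't match."""
    actual = sha256_of(path)
    if actual != expected_digest:
        raise RuntimeError(
            f"library {path!r} failed integrity check: got {actual}"
        )


# Illustrative usage, with a digest recorded at build/sign time:
# verify_before_load("./modules/updater.so", expected_digest)
# lib = ctypes.CDLL(os.path.abspath("./modules/updater.so"))
```

The expected digests must live somewhere the attacker can’t reach, e.g. embedded in the (protected) executable itself - a hash list sitting in the same user-writable directory as the libraries defeats the purpose.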
I did focus on Windows and Linux, but these kinds of attacks work against any module system. Java, Ruby, Node.js - any of these application environments is similarly vulnerable. Remember, check your libraries!