A hash table (or hash map) is a data structure that uses a hash function to map identifying values, known as keys, to their associated values. A hash table thus implements an associative array. Most application servers store POST form data in hash tables so that the parameters can be accessed later by the application. When more than one key is mapped to the same hash value by the hash function, the result is a hash collision, and any application platform that uses a predictable hash function is potentially affected by this vulnerability.
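As a quick illustration, Java's HashMap is one such associative array: each key is hashed to select a bucket, and the associated value is stored there (the parameter names below are arbitrary examples):

```java
import java.util.HashMap;
import java.util.Map;

public class HashMapDemo {
    public static void main(String[] args) {
        // A hash table maps keys to their associated values.
        Map<String, String> params = new HashMap<>();
        params.put("username", "alice"); // key "username" -> value "alice"
        params.put("action", "login");

        // Lookup is (normally) constant time: hash the key, go to its bucket.
        System.out.println(params.get("username")); // prints "alice"
    }
}
```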
A recent report from n.runs AG explains: “If the language does not provide a randomized hash function or the application server does not recognize attacks using multi-collisions, an attacker can degenerate the hash table by sending lots of colliding keys. The algorithmic complexity of inserting n elements into the table then goes to O(n**2), making it possible to exhaust hours of CPU time using a single HTTP request”.
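A minimal sketch of why colliding keys degrade insertion to O(n**2) (this models a single hash-table bucket as a chain; it is an illustration, not any server's actual implementation): when every key lands in the same bucket, each insert must compare the new key against every key already in the chain, so n inserts cost about n²/2 comparisons in total:

```java
import java.util.ArrayList;
import java.util.List;

public class CollisionCost {
    public static void main(String[] args) {
        // All keys hash to the same bucket, modeled here as one chain (a list).
        List<String> bucket = new ArrayList<>();
        long comparisons = 0;
        int n = 10_000;

        for (int i = 0; i < n; i++) {
            String key = "key" + i;
            // An insert must check every existing entry for equality
            // before it can conclude the key is new.
            for (String existing : bucket) {
                comparisons++;
                if (existing.equals(key)) break;
            }
            bucket.add(key);
        }

        // 0 + 1 + ... + (n-1) = n*(n-1)/2 comparisons: quadratic growth.
        System.out.println(comparisons); // prints 49995000
    }
}
```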
This, in turn, enables DoS (Denial of Service) attacks. In general terms, a DoS attack is implemented by forcing the targeted computer(s) to reset, by consuming their resources so that they can no longer provide their intended service, or by obstructing the communication media between the intended users and the victim so that they can no longer communicate adequately.
How the attack works:
Here, three different keys, namely wolf, tiger, and elephant, are all mapped to the same hash value, 05, by the hash function. This is a hash collision, and it increases the cost of processing any request that involves these keys, since every lookup must walk through the colliding entries.
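As a concrete example (in Java, rather than with the animal names above): the strings "Aa" and "BB" have the same String.hashCode(), and any same-length concatenation of them collides too, which lets an attacker enumerate huge sets of colliding keys offline:

```java
public class CollidingKeys {
    public static void main(String[] args) {
        // "Aa" and "BB" collide under String.hashCode():
        // 'A'*31 + 'a' = 65*31 + 97 = 2112 = 66*31 + 66 = 'B'*31 + 'B'
        System.out.println("Aa".hashCode()); // prints 2112
        System.out.println("BB".hashCode()); // prints 2112

        // Concatenations of colliding blocks collide as well, so 2^k
        // colliding keys of length 2k can be generated mechanically.
        String[] blocks = {"Aa", "BB"};
        for (String a : blocks)
            for (String b : blocks)
                System.out.println(a + b + " -> " + (a + b).hashCode());
        // All four combinations print the same hash code (2031744).
    }
}
```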
Let us now take a quick glance at how these hash table vulnerabilities affect Apache Tomcat, Java, and PHP.
1) Apache Tomcat
Since Apache Tomcat uses hash tables to store HTTP request parameters, it is affected by the issue described above.
As a remedial measure, Tomcat's Mark Thomas said: “Tomcat has implemented a work-around for this issue by providing a new option (maxParameterCount) to limit the number of parameters processed for a single request. This default limit is 10000: high enough to be unlikely to affect any application; low enough to mitigate the effects of the DoS.”
The workaround is available in versions 7.0.23 onwards and 6.0.35 onwards. It is therefore recommended to apply this setting and to upgrade to these or later versions, which are less prone to such attacks.
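A sketch of how maxParameterCount might be set on a Connector in Tomcat's conf/server.xml (the port, protocol, and timeout values here are illustrative defaults, not recommendations):

```xml
<!-- conf/server.xml: cap the number of request parameters Tomcat will parse -->
<Connector port="8080" protocol="HTTP/1.1"
           connectionTimeout="20000"
           maxParameterCount="10000" />
```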
2) Java
Java uses the HashMap and Hashtable classes, which rely on the String.hashCode() hash function. Hence it can be affected by hash collision attacks as well.
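String.hashCode() computes h = 31*h + c over the characters of the string; because the multiplier is fixed and the function is not randomized, its output, and therefore its collisions, can be predicted exactly. A small sketch recomputing it by hand:

```java
public class StringHash {
    // Recompute String.hashCode() by hand: h = 31*h + c for each char.
    static int hash(String s) {
        int h = 0;
        for (int i = 0; i < s.length(); i++) {
            h = 31 * h + s.charAt(i);
        }
        return h;
    }

    public static void main(String[] args) {
        String key = "elephant";
        // The manual computation matches the JDK's value exactly,
        // which is why colliding inputs can be precomputed offline.
        System.out.println(hash(key) == key.hashCode()); // prints true
    }
}
```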
3) PHP
The case of PHP is no different. It uses its own hash function, which likewise makes these attacks possible in PHP. The PHP team advises tweaking max_input_time and max_execution_time. A bug fix has also been released, but it is not yet intended for production servers.
Refer: http://www.php.net/archive/2011.php#id2011-12-25-1
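The directives PHP points to can be sketched as php.ini settings (the values below are illustrative, not official recommendations):

```ini
; php.ini -- bound how long a request may spend parsing input and executing
max_input_time = 60      ; seconds allowed for parsing POST/GET data
max_execution_time = 30  ; seconds allowed for script execution
```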
For platforms where fixes have not yet been released, the suggested workarounds are:
Limit CPU time
Limiting the processing time for a single request can help minimize the impact of malicious requests.
Limit maximum POST size
Limiting the maximum POST request size can reduce the number of possible predictable collisions, thus reducing the impact of an attack.
Limit maximum request parameters
Some servers offer the option to limit the number of parameters per request, which can also minimize impact.
In short, the basic idea is to regulate CPU utilization so that, at the very least, you can keep such attacks under control until the respective fixes are released.