Using JavaScript hashCode to enable Cocoon caching of POST requests

I've just faced an issue with Cocoon caching of POST requests. Let me describe the use case. We use a custom XQueryGenerator to execute XQuery code against the Sedna XML Database and then process the XML results in the Cocoon pipeline. For the sake of performance, I configured pipeline caching with an expiration timeout of 60 seconds for all XQuery invocations:
<map:pipeline id="cached-services" type="expires" internal-only="true">
  <map:parameter name="cache-expires" value="60"/>
  <map:parameter name="cache-key"
                 value="{request:sitemapURI}?{request:queryString}"/>

  <map:match pattern="cached-internal-xquery/**">
    <map:generate src="cocoon:/xquery-macro/{1}" type="queryStringXquery">
      <map:parameter name="contextPath" value="{request:contextPath}"/>
    </map:generate>
    <map:transform src="xslt/postprocessXqueryResults.xslt" type="saxon"/>
    <map:serialize type="xml"/>
  </map:match>
</map:pipeline>
So you can see that both the request's sitemap URI and its query string form the cache key. This works perfectly until you want to send the XQuery parameters via POST instead of GET. Then the query string is empty and identical for all POST requests, so the results of one POST request get cached and served for all of them, and caching breaks everything.
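To see the collision concretely, compare the cache keys the pipeline would build (the request paths below are hypothetical examples against the matcher above):

GET  cached-internal-xquery/tree?id=1&id=2         -> key: cached-internal-xquery/tree?id=1&id=2
GET  cached-internal-xquery/tree?id=3              -> key: cached-internal-xquery/tree?id=3
POST cached-internal-xquery/tree (body: id=1&id=2) -> key: cached-internal-xquery/tree?
POST cached-internal-xquery/tree (body: id=3)      -> key: cached-internal-xquery/tree?  <- same key, wrong cache hits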

You may wonder why we need POST requests just to load XML data. This is because we cannot predict how many request parameters there will be, as they are generated from a list of identifiers like this:
// id_list is a Collection of identifiers to be sent as request parameters
var postData = id_list.stringJoin(
        function(object) { return "id=" + object; },
        "&"); // join the parameters with "&"

var uri = "xquery/basictype_tree";

// This sends an asynchronous request and
// inserts its results into the containerId element.
new SimpleContainerTransaction({
        "uri": uri, "containerId": "treenode-details-container",
        "method": "POST", "data": postData
});
Here the SimpleContainerTransaction is part of a custom YUI3-based Transaction utility.
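For illustration, with hypothetical identifiers 10 and 42 in id_list, the snippet above builds the POST body like this:

// id_list = [10, 42]
// postData === "id=10&id=42"
// Two requests with different id_list contents differ only in this body,
// which the cache key above never sees.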

Now it's time to fix the issue. The obvious approach is to generate a fake GET parameter in addition to the meaningful POST parameters. This fake parameter will be a hash of the POST parameters, so that identical requests get identical hash values and thus identical cache keys. Once we implement this, caching should work for this use case as well.

Since we build the POST parameters string in JavaScript, I googled for JavaScript hash implementations and discovered a nice overview of possible JavaScript hash solutions. I adapted the first one and incorporated it into our project's JS library:
String.prototype.hashCode = function() {
    var charCode, hash = 0;
    if (this.length == 0) return hash;
    for (var i = 0; i < this.length; i++) {
        charCode = this.charCodeAt(i);
        // Equivalent to hash * 31 + charCode, as in Java's String.hashCode()
        hash = ((hash << 5) - hash) + charCode;
        hash = hash & hash; // Convert to 32-bit integer
    }
    return hash;
};
This extends all String objects with a hashCode function. So let's fix the caching issue by appending the hash of the POST parameters as a GET parameter to the URL:
var uri = "xquery/basictype_tree?hash=" + postData.hashCode();
That's it, the caching works fine again.
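As a quick sanity check (a minimal sketch with hypothetical parameter strings), identical POST bodies now map to identical cache keys, while different bodies get different ones:

var a = "id=10&id=42".hashCode();
var b = "id=10&id=42".hashCode();
var c = "id=10&id=43".hashCode();
// a === b: the same POST body always yields the same "hash" GET parameter,
// so repeated identical requests hit the same cache entry.
// c will almost certainly differ, so different requests are cached separately.

One caveat worth keeping in mind: this is a 32-bit, non-cryptographic hash, so in rare cases unrelated parameter strings can collide and share a cache entry.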

