Using XML Catalogs in Cocoon

In this article I'm going to show a common use case of XML Catalogs. Using them is not only recommended to avoid certain issues, but can also drastically improve performance. I'll start by explaining the issue I faced recently and conclude with the resolution.

To start with, I got the following exception: Server returned HTTP response code: 429 for URL:
HTTP status code 429 stands for "Too Many Requests". As defined in RFC 6585, it means the user has sent too many requests in a given amount of time, and it is intended for use with rate-limiting schemes.
Just to provide some context, I have an Apache Cocoon based application that does a lot of XSLT processing with Saxon. It turns out that every time Saxon reads an XML document containing a DTD reference, it fetches the DTD source over the network for validation. Obviously, if the processing rate is high enough and there is no caching, you generate a lot of excess network traffic and hit the rate limit. The same issue has been kindly explained by W3C.
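Even before introducing a catalog, the network fetches can be intercepted by plugging a custom EntityResolver into the SAX parser that feeds Saxon. The sketch below is a minimal illustration of that idea, not the Cocoon mechanism described later: the class name is mine, and for brevity it short-circuits the SVG DTD with an empty stream rather than serving a local copy (in a real setup you would return the file from the classpath instead).

```java
import org.xml.sax.EntityResolver;
import org.xml.sax.InputSource;
import java.io.StringReader;

// Minimal sketch: intercept requests for the SVG 1.1 DTD so the parser
// never goes to the network. Class name and the empty-stream shortcut
// are illustrative assumptions, not part of Cocoon's catalog support.
public class LocalDtdResolver implements EntityResolver {
    @Override
    public InputSource resolveEntity(String publicId, String systemId) {
        if ("-//W3C//DTD SVG 1.1//EN".equals(publicId)) {
            // A real resolver would stream a local copy, e.g. from
            // getClass().getResourceAsStream("/META-INF/cocoon/entities/catalog/svg11.dtd")
            return new InputSource(new StringReader(""));
        }
        return null; // null falls back to the parser's default resolution
    }
}
```

Such a resolver would be set on the XMLReader (via `setEntityResolver`) before handing the source to Saxon. An XML Catalog achieves the same effect declaratively, without custom code.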

An XML Catalog maps resource addresses to local copies of the same resources. Thus, XML Catalogs can bring big benefits when your XML documents contain many external references. Now let's look at an example catalog that resolved the above issue by using local SVG DTD files:
PUBLIC "-//W3C//DTD SVG 1.1//EN" "svg11.dtd"
So it is a pretty simple mapping of the SVG formal public identifier to the local copy of the main DTD file. Both this file, named catalog, and all the required SVG DTD files are located under META-INF/cocoon/entities/catalog, the standard location for Cocoon. Now, as you can read in How to use a catalog file and in the Cocoon catalog documentation, we need to create a file that must be placed on the Java classpath:
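The file in question follows the conventions of the Apache xml-commons resolver that Cocoon builds on, namely CatalogManager.properties. A minimal sketch is shown below, assuming the catalog sits at the Cocoon location mentioned above; the property values are illustrative and should be adapted to your deployment:

```properties
# Semicolon-separated list of catalog files, resolved via the classpath
catalogs=META-INF/cocoon/entities/catalog
# Allow the catalog path above to be relative
relative-catalogs=yes
# 0 = silent; higher values log resolution activity, useful while debugging
verbosity=1
```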
To conclude, XML Catalogs turned out to be a little-known mechanism that deserves to be standard practice. Besides avoiding the rate-limit issue, they improved performance several-fold in certain cases. This can happen when the application sits behind a slow proxy and the same DTD is fetched dozens of times within a pipeline.
