
Elasticsearch CORS with basic authentication setup

This is a short "recipe" article explaining how to configure a remote Elasticsearch instance to support CORS requests and basic authentication using Apache HTTP Server 2.4.

Proxy
To start with, we need to configure Apache to proxy requests to the Elasticsearch instance. By default, Elasticsearch listens on port 9200:
ProxyPass /elastic http://localhost:9200/
ProxyPassReverse /elastic http://localhost:9200/
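These directives rely on mod_proxy and mod_proxy_http, and the later sections also need mod_headers. A minimal sketch of the corresponding LoadModule lines, assuming the conventional modules/ layout (paths and the way modules are enabled vary by distribution, e.g. a2enmod on Debian/Ubuntu):
LoadModule proxy_module modules/mod_proxy.so
LoadModule proxy_http_module modules/mod_proxy_http.so
LoadModule headers_module modules/mod_headers.so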

Basic authentication
Enabling basic authentication is easy. By default, Apache checks the user credentials against a local file, which you can create using the following command:
/path/to/htpasswd -c /usr/local/apache/password/.htpasswd_elasticsearch elasticsearchuser
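The -c flag creates (or overwrites) the file, so it is only needed for the first user; further users can be added to the same file without it (the user name below is just an example):
/path/to/htpasswd /usr/local/apache/password/.htpasswd_elasticsearch anotheruser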
Then you'll need to use the following directives to allow only authenticated users to access your content:
AuthType Basic
AuthName "Elastic Server"
AuthUserFile /usr/local/apache/password/.htpasswd_elasticsearch
Require valid-user
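To verify the protection, a quick curl check against the proxied path can help, assuming the directives above are applied to the /elastic location as in the final configuration below (your-apache-host is a placeholder):
# Without credentials the server should answer with 401 Unauthorized
curl -i http://your-apache-host/elastic/
# With credentials the request is proxied to Elasticsearch (curl prompts for the password)
curl -i -u elasticsearchuser http://your-apache-host/elastic/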
For more complex setups such as LDAP-based authentication and restricting access by user group or other criteria, see the official how-to.

CORS
When you want to add support for CORS requests to your server, you should configure it to set the proper response headers, starting with Access-Control-Allow-Origin. To make it work with basic authentication, you will need the following headers:
Header set Access-Control-Allow-Origin "*"
Header set Access-Control-Allow-Credentials "true"
Header set Access-Control-Allow-Headers "Authorization, Content-Type, X-Requested-With"
Note that browsers refuse to use a wildcard Access-Control-Allow-Origin for requests that include credentials, so in production you should replace "*" with the exact origin of your front-end application. If you need to support HTTP PUT requests or want more detailed explanations with examples, check the Using CORS tutorial.
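For example, allowing PUT (or any other non-simple method) is a matter of one more header; the method list below is just an illustration and should match what your client actually sends:
Header set Access-Control-Allow-Methods "GET, POST, PUT, HEAD, OPTIONS"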
The above configuration is still not enough to make basic authentication work, because the browser sends the HTTP OPTIONS pre-flight request without the Authorization header. So we need to exempt such requests from password protection:
<LimitExcept OPTIONS>
 Require valid-user
</LimitExcept>
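A pre-flight request can be simulated with curl to check that it now goes through without credentials (the host name and origin are placeholders):
# The OPTIONS pre-flight carries no Authorization header and must not get a 401
curl -i -X OPTIONS \
 -H "Origin: http://example.org" \
 -H "Access-Control-Request-Method: GET" \
 -H "Access-Control-Request-Headers: Authorization" \
 http://your-apache-host/elastic/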

Final configuration
Now let's put the complete configuration together inside a Location block:
<Location /elastic>
 ProxyPass http://localhost:9200/
 ProxyPassReverse http://localhost:9200/

 Header set Access-Control-Allow-Origin "*"
 Header set Access-Control-Allow-Credentials "true"
 Header set Access-Control-Allow-Headers "Authorization, Content-Type, X-Requested-With"

 AuthType Basic
 AuthName "Elastic Server PROD"
 AuthUserFile /usr/local/apache/password/.htpasswd_elasticsearch

 <LimitExcept OPTIONS>
  Require valid-user
 </LimitExcept>
</Location>
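After reloading Apache, the whole chain can be verified end to end with curl; the host name is a placeholder and _search is a standard Elasticsearch endpoint:
# Authenticated search request proxied to Elasticsearch
curl -u elasticsearchuser "http://your-apache-host/elastic/_search?pretty"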
