Posts

Connection to Amazon Neptune endpoint from EKS during development

This small article describes how to connect to an Amazon Neptune database endpoint from your PC during development. Amazon Neptune is a fully managed graph database service from Amazon. For security reasons, direct connections to Neptune from outside its VPC are not allowed, so it's impossible to attach a public IP address or load balancer to that service. Instead, access is restricted to the VPC where Neptune is set up, so applications must be deployed in the same VPC to be able to access the database. That's a great idea for production, however it makes it very difficult to develop, debug and test applications locally. The instructions below will help you create a tunnel towards the Neptune endpoint, assuming you use Amazon EKS - a managed Kubernetes service from Amazon. As a side note, if you don't use EKS, the same idea of creating a tunnel can be implemented using a bastion server. In Kubernetes we'll create a dedicated proxying pod. Prerequisites. Setting up a tunnel. ...
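As a rough illustration of the proxying-pod idea, the sketch below runs socat in a throwaway pod and forwards a local port to it. The endpoint hostname is a placeholder and the alpine/socat image is an assumption; Neptune's default port 8182 is used:

# Hypothetical endpoint name; a minimal sketch of the tunnel idea
kubectl run neptune-proxy --image=alpine/socat --restart=Never -- \
  tcp-listen:8182,fork,reuseaddr tcp-connect:your-neptune-endpoint.neptune.amazonaws.com:8182
kubectl port-forward pod/neptune-proxy 8182:8182
# Neptune is now reachable locally at https://localhost:8182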

How to import an untrusted website certificate to the Java keystore

Java uses the keystore file named cacerts. It should already contain all trusted root CA certificates that are used to sign intermediate and leaf certificates. Leaf certificates are end-user certificates that are used to secure websites with HTTPS. However, sometimes a root CA certificate might be missing from the Java keystore, or a website might be using a self-signed certificate, which will result in the following exception when you try to access the website from Java code:

javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target

For me it happened with a certificate issued by COMODO. In this case the easiest solution is to add the website certificate to the Java keystore. In short, it requires exporting the certificate from the website, importing it into the keystore and restarting your Java application. Please b...
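A quick, hedged sketch of that workflow (the hostname and alias are placeholders; the default cacerts password is changeit unless you changed it):

# Export the certificate presented by the website
openssl s_client -connect example.com:443 -servername example.com </dev/null | openssl x509 -outform PEM > website.crt
# Import it into the JDK's cacerts (the -cacerts flag needs JDK 9+;
# on older JDKs point -keystore at $JAVA_HOME/jre/lib/security/cacerts)
keytool -importcert -alias examplecert -file website.crt -cacerts -storepass changeit

Then restart your Java application so it picks up the updated keystore.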

PFX keystore notes

This is a short note with useful commands for PFX keystores.

Import to AWS Certificate Manager

When you need to import a PFX certificate into AWS Certificate Manager, you have to export the unencrypted private key and the certificate chain first.

Export the unencrypted private key from the PFX:
openssl pkcs12 -in domain_certificate.pfx -nocerts -nodes -out private_key.pem

Export the certificate chain from the PFX:
openssl pkcs12 -in domain_certificate.pfx -nokeys -out certificate.pem

When you have the PEM files, you can go to AWS Certificate Manager, click the "Import a certificate" button and enter the following:
Certificate body* - paste the first certificate from certificate.pem, ending with the words "-----END CERTIFICATE-----"
Certificate private key* - paste the contents of private_key.pem
Certificate chain - paste the complete contents of certificate.pem
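As an optional sanity check (not part of the note itself, and assuming an RSA key), you can confirm that the exported key and certificate belong together by comparing their moduli; the two digests must match:

openssl x509 -noout -modulus -in certificate.pem | openssl md5
openssl rsa -noout -modulus -in private_key.pem | openssl md5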

SSL certificates guide

In this article I'm going to explain how to create keys, SSL certificates and keystores. You may need this to migrate your website to HTTPS, to enable single sign-on authentication, or in other cases. SSL certificates can be used for digital signing/verification and for encryption/decryption. In the case of digital signatures, the sender signs the message using the private key, while the receiver verifies the signature using the public key certificate. In the case of encryption, the sender encrypts the message using the public key certificate, while the receiver decrypts the message using the private key. Generating keys. Generating certificates. Working with keystores.

Generating keys

The first step is generating a private/public key pair. This can be done in different ways. We'll use the openssl utility, as it will be used for certificates later as well. The important point is the key length: a longer key is harder to crack. ...
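For instance, a minimal key-pair generation with openssl might look like this (the 4096-bit length and file names are chosen here as examples):

# Generate an RSA private key
openssl genrsa -out private_key.pem 4096
# Extract the corresponding public key
openssl rsa -in private_key.pem -pubout -out public_key.pem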

Elasticsearch CORS with basic authentication setup

This is a short "recipe" article explaining how to configure remote ElasticSearch instance to support CORS requests and basic authentication using Apache HTTP Server 2.4. Proxy To start with, we need to configure Apache to proxy requests to the Elasticsearch instance. By default, Elasticsearch is running on the port 9200: ProxyPass /elastic http://localhost:9200/ ProxyPassReverse /elastic http://localhost:9200/ Basic authentication Enabling basic authentication is easy. By default, Apache checks the user credentials against the local file which you can create using the following command: /path/to/htpasswd -c /usr/local/apache/password/.htpasswd_elasticsearch elasticsearchuser Then you'll need to use the following directives to allow only authenticated users to access your content: AuthType Basic AuthName "Elastic Server" AuthUserFile /usr/local/apache/password/.htpasswd_elasticsearch Require valid-user For more complex setups such as LDAP-based...

Basic auth with Apache and Tomcat

This is a short "recipe" article explaining how to configure basic authentication for the following setup: Apache Tomcat with some application that need be partially password-protected Apache HTTP Server 2.4 as a proxy CentOS 7 Linux server Although basic authentication can be configured within Tomcat itself, my target is to use Apache for that purpose. In addition, as passing unencrypted credentials over the web is insecure, I'm going to install SSL certificates to enable HTTPS for the part of my application. This setup can be used when a part of an internal application need be secured to make it publicly accessible using a separate firewall/proxy (out of scope of this article), that part will be password-protected and SSL-encrypted. Steps Copy certificates into /etc/ssl/certs/ivanlagunov.com Create symlink: cd /etc/httpd sudo ln -s /etc/ssl/certs/ivanlagunov.com Install Apache mod_ssl sudo yum -y install mod_ssl Create file with user credentials for basi...

Using Oracle impdp utility to reload database

Here I'll show an example of using the Oracle Data Pump Import (impdp) utility, which allows importing Oracle data dumps. Specifically, below is the list of steps I used on an existing Oracle schema to reload the data from a dump.

Steps to reload the data from an Oracle dump

We start by logging into SQL*Plus as sysdba to be able to manage users:
sqlplus sys/password@test as sysdba

Dropping the existing user. The CASCADE clause will ensure that all schema objects are removed before the user:
SQL> DROP USER test CASCADE;

Creating a fresh user will automatically create an empty schema with the same name:
SQL> CREATE USER test IDENTIFIED BY "testpassword";

Granting the DBA role to the user to load the dump later. Actually, it's overkill, and loading the dump can be permitted using the more granular IMP_FULL_DATABASE role:
SQL> GRANT DBA TO test;

Registering the directory where the dump is located:
SQL> CREATE DIRECTORY dump_dir AS '/home/test/dumpd...
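The excerpt stops before the import itself; a typical impdp invocation for this setup would look something like the following, reusing the dump_dir directory created above (the dump file name is a placeholder):

impdp test/testpassword@test directory=dump_dir dumpfile=test.dmp logfile=import.log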