Adding CORS support to elasticsearch-head plugin

There are two vulnerabilities in Elasticsearch that I recently patched in my installations.

One is the dynamic scripting (‘script’) vuln.

Fix by adding

script.disable_dynamic: true

to your elasticsearch.yml config file.
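
A quick way to sanity-check the script fix, once ES has been restarted, is to send a dynamic script query and make sure it gets rejected (localhost:9200 here stands in for wherever your ES listens):

curl -XPOST 'http://localhost:9200/_search' -d '{"query": {"match_all": {}}, "script_fields": {"test": {"script": "1 + 1"}}}'

With dynamic scripting disabled, this should come back with an error complaining about dynamic scripting rather than search results.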

The other one has to do with CORS, which by default lets any website a user visits read data from the REST endpoints.

Fix by adding

http.cors.allow-origin: "http://your.FQDN.domain.name"

to your elasticsearch.yml config file.
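
Putting the two fixes together, the relevant part of elasticsearch.yml ends up looking something like this (http.cors.enabled is listed for completeness; whether it exists and what it defaults to depends on your ES version, so treat that line as an assumption):

script.disable_dynamic: true
http.cors.enabled: true
http.cors.allow-origin: "http://your.FQDN.domain.name"

To check the CORS side, send a request with an Origin header and look for Access-Control-Allow-Origin in the response; it should only show up for the origin you allowed:

curl -i -H "Origin: http://your.FQDN.domain.name" http://localhost:9200/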

In fixing the second one (CORS), I ran into a problem: the fix broke my usage of the elasticsearch-head plugin.  I use the plugin as a checked-out git repo on my laptop and port forward to the actual ES server.  E.g. the URL I use is something like this:

file:///Users/tinle/src/opensource/elasticsearch-head/index.html?base_uri=http://127.0.0.1:9200/
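
The port forward itself is just a plain SSH tunnel, something like the following (es-server.example.com is a placeholder for the real ES host):

ssh -L 9200:localhost:9200 user@es-server.example.com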

So I ended up having to patch elasticsearch-head to make it work with CORS.  The patch makes jQuery issue the requests as JSONP and treat them as cross-domain, which sidesteps the browser’s same-origin restriction (assuming the server still answers JSONP callbacks).

diff --git a/dist/app.js b/dist/app.js
index 5bce2a3..7e58acb 100644
--- a/dist/app.js
+++ b/dist/app.js
@@ -1188,6 +1188,9 @@
                request: function( params ) {
                        return $.ajax( $.extend({
                                url: this.base_uri + params.path,
+      /**
+       * 2014/06/01 tinle
+       **/
                                dataType: "jsonp",
         crossDomain: true,
                                error: function(xhr, type, message) {
diff --git a/dist/vendor.js b/dist/vendor.js
index fb1a448..2b74180 100644
--- a/dist/vendor.js
+++ b/dist/vendor.js
@@ -6838,6 +6838,10 @@ jQuery.each( [ "get", "post" ], function( i, method ) {
                return jQuery.ajax({
                        type: method,
                        url: url,
+      /**
+       * HACK 2014/06/03 tinle
+       */
+      crossDomain: true,
                        data: data,
                        success: callback,
                        dataType: type
@@ -14439,4 +14443,4 @@ under the License.
                }
                throw "could not process value " + v;
        };
-})();
\ No newline at end of file
+})();


Updated: 6/4/2014 – I think the above patch should work.  I’ve been using it for the last few days and I am able to GET/PUT/POST, i.e. make changes to ES via elasticsearch-head.
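
If you want the same change in your own checkout, save the diff above to a file and apply it with git from the elasticsearch-head directory (the filename is arbitrary, and you may need to fix up whitespace if the paste mangled it):

git apply cors-jsonp.patch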

Elasticsearch, Logstash and Kibana Meetup @ LinkedIn

We had a great ELK Meetup on Wed 5/21/2014 at LinkedIn.  The recorded video is available here:

http://www.ustream.tv/recorded/47864947

We had Kurt Hurtado, one of the Logstash devs, speaking on ELK in the DevOps Environment.  Then we had a nice long Q&A session afterward, joined by Uri Boness, one of the Elasticsearch core devs.


MongoDB and Riak

12/18/12 UPDATE

Since I am the only DevOps person working on this, and there are tons of other things requiring my attention, I had to drop Riak. The engineers only know MongoDB anyway, and they are reluctant to learn a new NoSQL store (Riak). Crap! So this project has been killed. Too bad.

I have some Python scripts that I wrote to copy MongoDB collections over to Riak; if I have time, I’ll open source them.

======================

I’ve been working with MongoDB at my current $WORK and at previous jobs. It is (or used to be) the nice, shiny toy that everyone rushed to. I’ve run into numerous limitations in trying to scale it up. Operationally, it can be a nightmare if the architecture was not set up correctly at the beginning.

Mongo is also a PITA to scale. There are major sites running Mongo nodes in the thousands, but at that point it becomes a matter of throwing hardware and money at the problem. That just seems stupid for startups.

So at my current place of $WORK, they are currently testing MongoDB, but I wanted to look for an alternative solution before we become fully committed to yet another operational nightmare.

After a lot of googling, testing, experimenting, etc., I decided to try Riak from Basho.

Googling shows a number of companies have migrated from MongoDB to Riak. Their experiences were useful, but I was looking for more concrete HOWTOs on moving a large MongoDB over to Riak.

First, of course, was to get hands-on experience with Riak: install it, play with it, etc. Then I used the riak-python-client lib to start migrating some data over. I wrote a script that works through all the collections in a Mongo DB; for each collection, it creates a Riak bucket and adds each Mongo doc to the bucket using the Mongo _id as the key.
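
From memory, the script was shaped roughly like this; it is a sketch rather than the exact code, and the hosts, ports and database name are placeholders:

#!/usr/bin/env python
# Sketch: copy every collection in one Mongo DB into Riak,
# one bucket per collection, keyed by the Mongo _id.
from pymongo import MongoClient
import riak

mongo = MongoClient("mongodb://localhost:27017")
db = mongo["mydb"]  # placeholder database name

client = riak.RiakClient()  # defaults to a local node; point it at your cluster

for name in db.collection_names():
    bucket = client.bucket(name)
    for doc in db[name].find():
        key = str(doc.pop("_id"))  # the Mongo _id becomes the Riak key
        # Mongo-specific types (ObjectId, datetime, ...) are not JSON-serializable;
        # the real script has to convert them before storing.
        bucket.new(key, data=doc).store()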

Right away, I ran into some issues with Riak. I have a 3-node Riak cluster (on 3 physical CentOS 5.8 servers). The MongoDB I was copying over was large: about 2GB of on-disk file size and over a million records. Partway through the conversion, 2 Riak nodes crashed and died…. WTF! No matter what I did, they wouldn’t start back up (the Riak logs showed some kind of Erlang errors, but I don’t know Erlang). So I stopped the only node still running, ran ‘rm -rf /var/lib/riak/*’ and ‘killall epmd’, restarted all 3 nodes, and they came back up.

I don’t have time to debug this problem, so I restarted the conversion with a smaller subset of the Mongo data. But this crash worries me. The erl_crash.dump shows Riak ran into resource issues and was unable to allocate heap memory. Hmmm.

More on my adventure in evaluating Riak vs MongoDB in the future.