Need a Docker environment capable of running my application, which requires Java, Python, Hadoop, Spark and HBase

In Progress Posted 5 years ago Paid on delivery

I have a Spark application which pulls data from an HDFS file system and inserts it into HBase, or vice versa. I need a Docker environment where I can test this Spark application. The environment can be either a single standalone node with Java, Python, Hadoop, Spark and HBase running in it, or a cluster running Spark and HBase on different nodes. I want it set up so that when I execute the spark-submit command, the request goes through the Docker Spark containers and the data is inserted into the HBase container.
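For context, here is a minimal sketch of the kind of HDFS-to-HBase flow being described. It is only an illustration, not part of the requirement: the service hostnames (namenode, hbase), the input path, the table name and the column family are placeholder assumptions, and it assumes the happybase client is installed in the Spark containers and that an HBase Thrift server is running in the hbase container.

```python
# Minimal PySpark sketch of the HDFS -> HBase flow described above.
# Assumptions: HDFS is served by a "namenode" container, the HBase Thrift
# server listens on port 9090 in an "hbase" container, happybase is
# installed on the executors, and the table/column family already exist.
from pyspark.sql import SparkSession
import happybase

spark = SparkSession.builder.appName("hdfs-to-hbase-test").getOrCreate()

# Read test data from the HDFS container (path and columns are placeholders).
df = spark.read.csv("hdfs://namenode:8020/data/input.csv", header=True)

def write_partition(rows):
    # Each executor opens its own connection to the HBase container.
    conn = happybase.Connection(host="hbase", port=9090)
    table = conn.table("test_table")
    for row in rows:
        table.put(row["id"].encode("utf-8"),
                  {b"cf:value": row["value"].encode("utf-8")})
    conn.close()

df.foreachPartition(write_partition)
spark.stop()
```

Whether the write goes through happybase/Thrift or a Spark-HBase connector is up to the implementer; the essential point is that the Spark containers can resolve and reach the HBase container by its service name.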

Reference: [login to view URL]

Go through the above reference: I want a similar kind of environment, but with the updated versions listed below. You can work with docker-compose and create a cluster where my application runs in one container and communicates with the other containers.
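As a rough illustration of the cross-container communication expected, here is a minimal Python smoke test the application container could run before submitting the Spark job. The docker-compose service names (namenode, spark-master, hbase) and the ports are assumptions about how the services might be laid out, not something specified in this posting.

```python
# Minimal connectivity smoke test, assuming docker-compose service names
# "namenode", "spark-master" and "hbase" with their default ports.
import socket

SERVICES = {
    "HDFS NameNode": ("namenode", 8020),      # HDFS RPC port (assumed)
    "Spark master":  ("spark-master", 7077),  # Spark standalone master port
    "HBase Thrift":  ("hbase", 9090),         # HBase Thrift server port
}

def reachable(host, port, timeout=5):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for name, (host, port) in SERVICES.items():
        status = "OK" if reachable(host, port) else "UNREACHABLE"
        print("{:14s} {}:{} -> {}".format(name, host, port, status))
```

Running something like this inside the application container is a quick way to confirm that the compose network wiring is correct before testing the actual spark-submit flow.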

Required versions

Java - 1.8

Spark - 2.2.1

HBase - 1.1.8

Hadoop - 2.7

Python - 3.5 or later

Note: The GitHub reference is just to give an idea of how the project looks; please don't copy that work.

Docker Hadoop HBase Linux Spark

Project ID: #17225972

About the project

5 proposals · Remote project · Active 5 years ago

5 freelancers are bidding on average $60 for this job

THREEPRAVEEN

Hi. My objective is to obtain a Linux administrative position where I can maximize my technical experience in the installation, configuration, and maintenance of Linux server-based systems and network applications.

$55 USD in 3 days
(0 Reviews)
0.0
lpalamede

I can do it the same way, but I need to know: do you want to run your tests with production data, or with a test database?

$55 USD in 3 days
(0 Reviews)
0.0