Spark Mainframe Connector Library
€30-250 EUR
Paid on delivery
Hello,
I am having trouble understanding and implementing the Spark Mainframe Connector library, and I need help/support from Spark experts with Scala and mainframe knowledge who can help me answer and implement the following:
The solution I need to implement is explained in the following GitHub link:
[login to view URL]
Questions with which I need support are:
1. Build the package: explain in plain written language, step by step, how it is built and how to implement it.
2. Add it to the --jars command-line path.
3. Explain what --driver-class-path is.
4. Explain this bit of code and help me replicate it with new parameters (e.g. specific to my needs): "SPARK_MAINFRAME_CONNECTOR_CLASSPATH=$SPARK_MAINFRAME_PATH/target/scala-2.10/[login to view URL]:$SPARK_MAINFRAME_PATH/lib/[login to view URL]:$SPARK_MAINFRAME_PATH/lib/[login to view URL]
$ bin/spark-shell --jars $SPARK_MAINFRAME_CONNECTOR_CLASSPATH --driver-class-path $SPARK_MAINFRAME_CONNECTOR_CLASSPATH"
5. Help implement the solution in a Cloudera-based Hadoop cluster, as explained in the link.
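For question 1, here is a minimal sketch of how a Scala library like this is typically built, assuming the project uses sbt (the standard Scala build tool). The repository URL and directory name below are placeholders; the repository behind the hidden link will document the exact steps.

```shell
# Assumed steps -- check the project's README for the real ones.
git clone <repository-url> spark-mainframe-connector
cd spark-mainframe-connector

# Compile and package the library; built against Scala 2.10 this
# produces a jar under target/scala-2.10/ (the path referenced in
# question 4 below).
sbt package
```

If the project defines an assembly, `sbt assembly` would instead produce a single "fat" jar that bundles the dependency jars from lib/.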
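The classpath setup from question 4 can be sketched as follows, with hypothetical jar names standing in for the hidden "[login to view URL]" entries. Two details worth knowing: every path segment needs a leading "$" for the variable to expand, and Spark's --jars flag expects a comma-separated list of jars (which it ships to the executors), while --driver-class-path is an ordinary colon-separated JVM classpath that makes the same jars visible to the driver process itself.

```shell
# Assumed install location and jar names -- replace with your real paths.
SPARK_MAINFRAME_PATH=/opt/spark-mainframe-connector

# Driver classpath: colon-separated, every segment expanded with "$".
DRIVER_CP=$SPARK_MAINFRAME_PATH/target/scala-2.10/spark-mainframe-connector.jar:$SPARK_MAINFRAME_PATH/lib/dep-one.jar:$SPARK_MAINFRAME_PATH/lib/dep-two.jar

# --jars wants commas, so derive a comma-separated copy of the same list.
JARS_CSV=$(printf '%s' "$DRIVER_CP" | tr ':' ',')

bin/spark-shell \
  --jars "$JARS_CSV" \
  --driver-class-path "$DRIVER_CP"
```

For question 5, the same flags carry over to spark-submit on a Cloudera (CDH) cluster, typically with --master yarn; CDH installations put spark-shell and spark-submit on the PATH.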
Project ID: #18863306
About the project
7 freelancers are bidding on average €219 for this job
Hello Sir, I am a developer and I will provide machine learning solutions in Python. I have experience with libraries such as NumPy, Pandas, Sklearn (Scikit-Learn), Matplotlib, and SciPy. I can …
Hi, I have 8 years of experience working with Hadoop, Spark, NoSQL, Java, BI tools (Tableau, Power BI), and cloud (Amazon, Google, Microsoft Azure). I have done end-to-end data warehouse management projects on the AWS cloud with ha…
I’d like to provide quality work in a timely manner with all expected results. Relevant Skills and Experience: With my automation testing skills, I can develop a script to troubleshoot the issue and give them a resolut…
Hello, I have a year of experience in Spark and other Hadoop components. On top of that, I have worked extensively with CDH and Cloudera Manager. Please let me know if I can assist you with your project. Regards, Ankit.