Cloudera optimization for HDFS/Spark environment -- 2

Closed Posted 7 years ago Paid on delivery

Need to optimize performance by configuring server resource utilization (memory and CPU) on the nodes of a recently installed environment running Cloudera 5.5.
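For this kind of tuning, the usual starting point on a CDH 5.x cluster is sizing YARN's per-node memory and vcore settings from the hardware. The helper below is a hypothetical sketch (not a Cloudera tool) that applies common rules of thumb; the reservation percentage and container heuristics are assumptions that would be adjusted per workload:

```python
# Hypothetical sizing helper: estimates YARN memory/CPU settings for one
# worker node, using common rules of thumb for CDH 5.x clusters.

def yarn_node_settings(total_ram_gb, cores, disks):
    """Suggest yarn-site.xml values for a single worker node."""
    # Reserve RAM for the OS and Hadoop daemons (assumption: ~20%, min 2 GB).
    reserved_gb = max(2, int(total_ram_gb * 0.2))
    usable_gb = total_ram_gb - reserved_gb
    # Container count bounded by cores, disks, and available memory
    # (assumption: at least 1 GB per container).
    containers = max(1, min(2 * cores, int(1.8 * disks), usable_gb))
    mem_per_container_gb = max(1, usable_gb // containers)
    return {
        "yarn.nodemanager.resource.memory-mb": usable_gb * 1024,
        "yarn.nodemanager.resource.cpu-vcores": cores,
        "yarn.scheduler.minimum-allocation-mb": mem_per_container_gb * 1024,
        "yarn.scheduler.maximum-allocation-mb": usable_gb * 1024,
    }

# Example: a node with 64 GB RAM, 16 cores, 8 data disks.
settings = yarn_node_settings(total_ram_gb=64, cores=16, disks=8)
for key, value in sorted(settings.items()):
    print(f"{key} = {value}")
```

The property names are the standard YARN ones set in `yarn-site.xml` (via Cloudera Manager on CDH); Spark executor memory and cores would then be sized to fit within these container limits.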

Hadoop Hive Spark Yarn

Project ID: #10626684

About the project

7 proposals Remote project Active 7 years ago

7 freelancers are bidding on average $422 for this job

winnow1

We are a group of Data Scientists based in Bangalore. Our core areas of expertise are big data and machine learning. Can assist you in Hadoop configuration, cluster optimisation and later on in implementing complex …

$775 USD in 5 days
(4 Reviews)
4.9
mahendrasinghmar

I will do it. I think you posted this earlier as well; at that time I was busy with another client, but now I am free, so let's do it.

$250 USD in 3 days
(1 Review)
1.0
ITLove007

Hello, I have experience with HDFS. I've built a search system using Solr, HDFS (Hadoop) and Nutch: Solr for indexing and searching, Hadoop for HDFS and MapReduce, and Nutch for crawling.

$736 USD in 5 days
(0 Reviews)
0.0
ankushkulkarni

A proposal has not yet been provided

$388 USD in 15 days
(0 Reviews)
0.0
ajithkumarkm0

I am a Cloudera and MapR certified Hadoop administrator with good knowledge of Linux. Please let me know how to proceed further on this. I have experience in Hadoop cluster security, optimization and sizing …

$277 USD in 3 days
(0 Reviews)
0.0