
Hadoop AWS project

$250-750 USD

Closed
Posted about 6 years ago

Paid on delivery
Minimum 4 years of experience in Hadoop technologies, primarily Spark, Spark Streaming and SQL, and Kafka. Our project mainly deals with real-time data processing using Kafka with Spark. We currently use Vertica and HDFS for data storage and are migrating to AWS S3, so at least 1 year of AWS experience is required. All coding is in Scala, so Scala is the main skill; knowledge of Akka actors, the Akka framework, and traits and classes in Scala is essential. Experience setting up CI/CD pipelines through Jenkins is a plus.
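The posting asks for knowledge of traits and classes in Scala. A minimal sketch of that style, using only the standard library (the `RecordParser` and `CsvParser` names are illustrative, not from the project):

```scala
// A trait declares a reusable interface; a class mixes it in.
trait RecordParser {
  def parse(line: String): Map[String, String]
}

// Concrete implementation: pairs a header with comma-separated values.
class CsvParser(header: Seq[String]) extends RecordParser {
  override def parse(line: String): Map[String, String] =
    header.zip(line.split(",").map(_.trim)).toMap
}

object ParserDemo {
  def main(args: Array[String]): Unit = {
    val parser = new CsvParser(Seq("id", "event", "ts"))
    println(parser.parse("42, click, 2018-03-04"))
  }
}
```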
Project ID: 16415069

About the project

13 proposals
Remote project
Active 6 yrs ago

13 freelancers are bidding an average of $548 USD for this job
Hi, it seems this requirement was created just for me. I have worked on Spark and Kafka streaming in Scala, Java and Python (PySpark, pykafka), and I have also worked with HP Vertica. Thanks!
$1,000 USD in 7 days
5.0 (14 reviews)
4.3
Hello, I have more than 4 years of experience in Hadoop technologies such as HDFS, MapReduce, Spark, Hive, MongoDB, Sqoop, Kafka, Storm and ZooKeeper. I am an expert in Spark programming (Scala, spark-shell, PySpark). Please contact me for more details.
$400 USD in 10 days
4.7 (5 reviews)
3.9
Expert in Scala and Spark, with extensive knowledge of Akka actors. My expertise comes from implementing generic components in Scala to run on Spark, and I am proficient with Rx Observables and Futures in Scala for parallel processing. I worked extensively with Kafka, HBase, ZooKeeper and Redis on the in-house big data platform at my previous employer. Please let me know if you need more details or an interview for this selection.
$555 USD in 10 days
0.0 (0 reviews)
0.0
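The bid above mentions Futures in Scala for parallel processing. A minimal self-contained sketch of that pattern, using only `scala.concurrent` from the standard library (`parallelSum` is a hypothetical example, not code from the bidder):

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object ParallelDemo {
  // Fan the chunks out across Futures, then combine the partial sums.
  def parallelSum(chunks: Seq[Seq[Int]]): Int = {
    val partials: Seq[Future[Int]] = chunks.map(c => Future(c.sum))
    Await.result(Future.sequence(partials), 10.seconds).sum
  }
}
```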
Hi, I have 7 years of experience working on Hadoop, Spark, NoSQL, Java, cloud and more. I have done end-to-end data warehouse management projects on AWS with Hadoop, Hive, Spark and PrestoDB, and worked on multiple ETL projects using Kafka, NiFi, Flume, MapReduce and Spark with XML/JSON, as well as Cassandra, MongoDB, HBase, Redis, Oracle, SAP HANA, ASE and many more. Let's discuss the requirements in detail. I am committed to getting the work done and strong at issue resolution as well. Thanks
$750 USD in 10 days
4.9 (3 reviews)
0.0
Hi, I have about 5 years of experience working on big data, mainly coding in Scala, and hands-on experience with AWS technologies such as S3. I would love to work on this project. Thanks
$555 USD in 10 days
5.0 (3 reviews)
0.0
Certified Hadoop and Linux administrator. I will give a free demo on my personal server if required. For my qualifications, please see my profile.
$333 USD in 10 days
0.0 (0 reviews)
0.0
Hi, we have good experience in Spark, Scala and Kafka, and we use AWS S3 as storage. We would like to work on this project.
$555 USD in 15 days
0.0 (0 reviews)
0.0
I am currently working on a similar kind of project with Databricks. I am very familiar with Spark Streaming, Spark SQL and Kafka.
$533 USD in 10 days
0.0 (0 reviews)
0.0
A proposal has not yet been provided
$250 USD in 3 days
0.0 (0 reviews)
0.0

About the client

United States
Member since Mar 4, 2018

Client Verification

Copyright © 2024 Freelancer Technology Pty Limited (ACN 142 189 759)