Spark application on Google Cloud Platform for fetching and processing data from HDFS

Hello, we are looking for a Scala developer with experience handling Parquet data on Spark clusters on Google Cloud Platform. The task is to read Parquet data from HDFS, query it for the relevant UIDs, fetch specific fields for those UIDs, compute derived parameters by applying mathematical operations to those fields, and store the processed values in a separate Parquet file on HDFS. The computed values then need to be aggregated, and the final summary file stored in MongoDB.
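The steps in the brief (filter by UID, compute on the fetched fields, aggregate per UID) can be sketched with plain Scala collections before porting them to Spark DataFrames. Everything here is a hypothetical placeholder: `Record`, the doubling computation, and the UID set are illustrative only, since the real Parquet schema and the actual math are not specified in the brief.

```scala
// Hypothetical record shape; the real Parquet schema will differ.
case class Record(uid: String, value: Double)

object PipelineSketch {
  // Step 1: keep only the UIDs of interest.
  def filterUids(records: Seq[Record], wanted: Set[String]): Seq[Record] =
    records.filter(r => wanted.contains(r.uid))

  // Step 2: placeholder computation on the fetched field.
  def transform(records: Seq[Record]): Seq[Record] =
    records.map(r => r.copy(value = r.value * 2.0))

  // Step 3: aggregate the computed values per UID for the summary file.
  def aggregate(records: Seq[Record]): Map[String, Double] =
    records.groupBy(_.uid).map { case (uid, rs) => uid -> rs.map(_.value).sum }

  def main(args: Array[String]): Unit = {
    val data = Seq(Record("a", 1.0), Record("a", 2.0), Record("b", 3.0))
    val out  = aggregate(transform(filterUids(data, Set("a"))))
    println(out)  // prints Map(a -> 6.0)
  }
}
```

On the cluster, the same shape maps onto the Spark DataFrame API: `spark.read.parquet(...)` to load, `filter` and `withColumn` for the per-UID computation, `write.parquet(...)` for the intermediate file, `groupBy(...).agg(...)` for the aggregation, and the MongoDB Spark Connector for the final summary export.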

The technologies you need to be comfortable with: Dataproc (Google's cloud-native Hadoop and Spark), Airflow (for scheduling), Google Cloud Platform in general, Scala (for the scripts), and MongoDB (for the data export).
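Once the Scala job is packaged as a jar, it would typically be submitted to an existing Dataproc cluster with `gcloud`. The cluster name, region, bucket path, and main class below are placeholders, not details from the brief:

```shell
# Submit the packaged Spark job to a Dataproc cluster (all names are placeholders).
gcloud dataproc jobs submit spark \
  --cluster=my-cluster \
  --region=us-central1 \
  --class=com.example.Pipeline \
  --jars=gs://my-bucket/pipeline.jar
```

For the scheduling requirement, an Airflow DAG can trigger the same submission on a cadence, e.g. via the Dataproc operators in Airflow's Google provider package.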

Skills: Google App Engine, Hadoop, NoSQL (Couch & Mongo), Scala, Spark


About the employer:
( 79 reviews ) Bangalore, India

Project ID: #16298138

8 freelancers are bidding an average of ₹12,156 for this job


Hello, I have extensive experience working with various data formats and using Spark to process them. I believe I'll be able to complete this task successfully. About me: - 3 years of experience working in the f…

₹35,000 INR in 7 days
(6 reviews)

I have 5 years of work experience in big data technology, including Elasticsearch, Java, Scala, and Spark. For more info, ping me.

₹11,111 INR in 7 days
(3 reviews)

Hi, I have more than 3 years of experience in Hadoop technologies such as MapReduce, Spark, and HDFS. I can complete your project; contact me for more details.

₹10,000 INR in 3 days
(4 reviews)

I am interested in working on this project, as I have relevant experience in Big Data: Sqoop, Hadoop, Spark, Hive, Kafka, Spark Streaming, RDD, DataFrame, Dataset, Python, Scala, Google Cloud, Azure, and AWS. I am well versed i…

₹11,111 INR in 6 days
(4 reviews)

₹12,222 INR in 3 days
(1 review)

Hi, I am a Hadoop, Spark, and NoSQL engineer with 6 years of experience. I can do this; I am comfortable with Spark, Scala, Airflow, and Google Cloud, and I can manage Dataproc.

₹2,250 INR in 1 day
(1 review)

We have 8 years of experience working in machine learning. We have built various recommendation engines, web apps, crawlers, and analytical dashboards. We have rich experience in Python, Spark, R, Scala, Cassandra, Hiv…

₹7,777 INR in 3 days
(0 reviews)

Languages: Java. Java/J2EE: Core Java, JavaFX, Advanced Java, Servlet, JSP, JSTL, EJB, JDBC, JUnit, Web Services, XML, XSD, JAX-RS, DOM, SAX, Multithreading, JTA, Custom Tags, JPA APIs. Web technologies: HTML, DHTML…

₹7,777 INR in 3 days
(0 reviews)