We have a requirement for a big data engineer with strong experience in Spark, Scala, Hive, and Sqoop, who also knows how to extract data from an Oracle database using Spark and load it into HDFS.
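The Oracle-to-HDFS step described above could be sketched roughly as follows. This is a minimal illustration, not the poster's actual pipeline: it assumes Spark 3.x with the Oracle JDBC driver (e.g. ojdbc8.jar) on the classpath, and the host, credentials, table name, and output path are all hypothetical placeholders.

```scala
// Minimal sketch: read an Oracle table over JDBC with Spark and
// write it to HDFS as Parquet. All connection details below are
// placeholders, not values from the job description.
import org.apache.spark.sql.SparkSession

object OracleToHdfs {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("oracle-to-hdfs")
      .getOrCreate()

    // Read the source table from Oracle via the JDBC data source.
    val df = spark.read
      .format("jdbc")
      .option("url", "jdbc:oracle:thin:@//db-host:1521/ORCLPDB1") // placeholder host/service
      .option("dbtable", "SALES.ORDERS")                          // placeholder table
      .option("user", "etl_user")                                 // placeholder user
      .option("password", sys.env("ORACLE_PASSWORD"))             // avoid hard-coding secrets
      .option("fetchsize", "10000")          // larger fetch size for bulk extraction
      .option("numPartitions", "8")          // parallel JDBC reads
      .option("partitionColumn", "ORDER_ID") // must be a numeric or date column
      .option("lowerBound", "1")
      .option("upperBound", "10000000")
      .load()

    // Land the data in HDFS as Parquet (overwrite for a full reload).
    df.write
      .mode("overwrite")
      .parquet("hdfs:///data/raw/orders")

    spark.stop()
  }
}
```

The partitioning options (`partitionColumn`, `lowerBound`, `upperBound`, `numPartitions`) let Spark split the JDBC read into parallel range queries instead of pulling the whole table through a single connection, which matters for large Oracle tables.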
9 freelancers are bidding on average ₹45,148 for this job
Hi, I can do this Hadoop and Hive work, since I have expertise in big data technology with 9 years of experience. Good command of Hadoop, Hive, Spark, NoSQL, Java, Linux, AWS, GCP... Please let me know if we can have a…
Hi, I am an experienced Data Engineer with a solid background in Spark. I have worked on many big data projects with Spark, Scala, Python, Cassandra, Snowflake, AWS,... Let's have a call for more details about the proj…
Hi, I have very good knowledge of this stack, as I have been working as a big data engineer for the past 6 years. I can definitely help solve this. We can catch up and you can share more information in thi…
I have good experience in Spark, Hive, Scala, and RDBMS databases. I'm interested in working on this assignment.
Good experience in Retail and Finance [login to view URL], working in an MNC using big data technologies like Hadoop, Spark, Scala, and Kafka.
Hi, I am a Data Engineer with 4.6+ years of experience in Scala and Spark, mainly in big data, along with other technologies like Python, Airflow, and Cassandra. I think I would be really helpful on this project. Thanks &…
Hi, I have more than 4 years of experience in big data with Hadoop, Spark, and Scala. I also have good hands-on experience and worked on a project where we populated data from RDBMS / S3 locations into HDFS / other preferre…
Hi there, I have 7.5 years of overall experience in Spark application development using Scala, Java, and Python. I have worked on 6 Spark projects across various ecosystems. I am a certified Spark developer by Clou…
I have 4.5 years of experience with similar use cases and can work on this one. Feel free to connect with me.