Working on a Hadoop project which involves Spark with Scala, Hive, Impala, and Sqoop.
Looking for 2 hours of support daily.
Monthly payout: INR 20,000
Please mention experience with all these technologies.
17 freelancers are bidding on average ₹22712 for this job
Hi, I am a data engineer with 3+ years of industry experience. I have deployed highly scalable, resilient, and durable big-data solutions both in the cloud and on-premises. I have expertise in f…
Hi, I have 2+ years of experience working with Spark in Scala and 4+ years working with Hive and Sqoop, for 6+ years of experience in total. Kindly reach out to discuss further details.
Hi, please review my profile. I have 4 years of experience as a BI/ETL developer and am an Oracle Business Intelligence certified professional. Adept at preparing data for analysis and at creating and deploying dashboards on serv…
I have 2 years of experience in all these technologies and certifications in Spark and the Hadoop ecosystem as well. Let's discuss in chat to finalize the deal.
I have been working as a senior Hadoop developer for the past 4+ years. I currently work with Spark, Scala, Hive, Apache NiFi, and Azure. You can check my LinkedIn profile at [login to view URL]. I am c…
I currently work as a Hadoop developer on a project involving Spark with Scala, Hive, Impala, Sqoop, and Python, so as an experienced fellow I can complete this work. I c…
For the past 4 years, I have been working on various components of the Hadoop ecosystem, including Spark, Impala, Hive, and Sqoop. Yes, I can provide the assistance required. Relevant Skills and…
• Possess 2 years of analysis and development experience on working projects and prototypes. • Hands-on experience with major components of the Hadoop ecosystem, including Apache Spark, MapReduce, HDFS, Hive, Pig, Sqoop, and HBase…
Currently, I am working with the same skills you require and have 1.5 years of experience. I graduated from one of India's topmost institutes, an NIT. I am a workaholic and can provide support for up to 6 months. Rel…
What kind of work would it be? Data scrubbing, transformations, or what other kind of processing would you need performed?
I have been working with these technologies for about a year and have gained a decent amount of knowledge: Hadoop, 1.6 years; Sqoop, 6 months; Spark with Java, 1 year; Scala, 3 months; Hive, 1 year.
I have 4 years of experience with the Hadoop ecosystem and good working experience with the Cloudera platform. I have experience in Sqoop, Flume, Kafka, MapReduce, Pig, Hive, HBase, Cassandra, Spark Core, and Spark SQL, and can cr…