
Freelancers from Big Data Domain

$75-95 AUD

Closed
Posted almost 6 years ago


Paid on delivery
Looking for 2 freelancers from the Big Data domain.
Project ID: 17519131

About the project

16 proposals
Remote project
Active 6 years ago

16 freelancers are bidding on average $91 AUD for this job
Hi, I have more than 4 years of experience in big data technologies such as MapReduce, Spark, Scala, Hive, AWS, and Azure. Contact me; I can complete your project.
$105 AUD in 10 days
4.7 (8 reviews)
I have 5+ years of experience in Java, Hadoop, Spark, and Scala, mostly working on real-time applications using Spark, Kafka, and HBase. I am also experienced with Apache NiFi, Hive, Oozie, and Falcon. As per your description you need big data resources, and I think I am the best person to help you with that, so I would ask you to consider me for this project. I hope to hear a positive response from you. Thank you.
$75 AUD in 10 days
5.0 (7 reviews)
Hi, I am a professional web developer. I am new to Freelancer, but I have 3 years of experience in web development with PHP, JavaScript, jQuery, and CodeIgniter, and I have developed many sites that are now live. I can also fix any JavaScript and PHP issues. I am ready to work on your project; if you give me a chance, I will do my best for you. Thanks.
$77 AUD in 10 days
0.0 (0 reviews)
Hello. We are a group of freelancers working extensively on data warehousing, ETL, data mining, data analytics, product development and cloud deployment, machine learning, deep learning, and microservices. After honing our skills in these fields for more than 7 years, we have formed a group of freelancers who work collaboratively across an array of technologies. We are proficient in: datasets - Hive, Pig, Spark, Hadoop, MapReduce, Flink; scripting languages - Python, Scala, Java; Amazon Web Services - EC2, EMR, CloudWatch, Lambda, RDS, ELB, Route 53, Amazon S3, and Redshift; along with HBase, Cassandra, TDCH, Sqoop, OraOop, Oozie, Azkaban, Airflow, Flume, and Kafka. We are a dedicated team of freelancers with excellent project management skills and an altruistic approach towards clients. We truly look forward to working with you. Thank you.
$94 AUD in 10 days
0.0 (0 reviews)
I have around 8 years of experience in IT, including 4 years with Apache Spark, Scala, Hive, Sqoop, and other big data technologies.
$94 AUD in 10 days
0.0 (0 reviews)
What is the problem you are trying to solve and what are the technologies you are going to use in this project?
$105 AUD in 10 days
0.0 (0 reviews)
I have been working with big data and cloud technologies, hold an AWS Solutions Architect certification, and have hands-on experience with AWS EMR, PySpark, Hive, Cloudera, Hadoop, Redshift, Amazon RDS, Python, Java, and MySQL. I have led and implemented many projects.
$111 AUD in 5 days
0.0 (0 reviews)

About the client

Bhinmal, India
5.0
60
Member since August 21, 2017

Client verification
