I am collecting tweet data with Flume into HDFS. The Flume files are stored in HDFS. I need to copy the data into my local directory on the server, then convert it to CSV and save it on my computer/laptop.
The candidate must be experienced with HDFS, Flume, and the command line.
Freelancers from South Asia preferred.
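The job above is essentially two steps: copy the Flume output out of HDFS with `hdfs dfs -get`, then flatten it to CSV locally. A minimal sketch of the conversion, assuming the Flume HDFS sink wrote one JSON tweet per line (the paths and the `id`/`user`/`text` field names are illustrative assumptions, not from the posting):

```python
import csv
import json

# Step 1 (run on the server; requires a Hadoop client on the PATH):
#   hdfs dfs -get /user/flume/tweets ./tweets_local
#
# Step 2: flatten the JSON-lines Flume output into a CSV.

def tweets_to_csv(jsonl_path, csv_path, fields=("id", "user", "text")):
    """Read one JSON tweet per line and write the chosen fields to CSV."""
    with open(jsonl_path, encoding="utf-8") as src, \
         open(csv_path, "w", newline="", encoding="utf-8") as dst:
        writer = csv.writer(dst)
        writer.writerow(fields)  # header row
        for line in src:
            line = line.strip()
            if not line:
                continue  # skip blank lines between events
            tweet = json.loads(line)
            # Missing fields become empty cells rather than raising KeyError.
            writer.writerow(tweet.get(f, "") for f in fields)
```

Called as, e.g., `tweets_to_csv("tweets_local/FlumeData.1546300800000", "tweets.csv")`; the resulting CSV can then be pulled down to a laptop with `scp`.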
7 freelancers are bidding on average ₹8,286 for this job
Your project description has piqued my interest. I am a competent Big Data engineer with extensive experience in HDFS, Flume, and Hadoop. I can work on the tweet data and save it to your local directory as per your requirements…
Hi, I have industry experience developing data pipelines and can build an application for your case. I would like to discuss your setup and requirements in detail. Regards, Ritesh
I am currently in my first year at IIT Delhi. I am an expert at Excel and Word and can do any work very effectively.
Dear sir/madam, thank you for your job post. I have finished reading about your job, and it is a great job. I would love to do this work, and I have 2 years of experience in this regard. If you are interested, leave me a message. Thank you!…
I've completed a Big Data Hadoop course and want to apply what I learned to your project. I'm not an expert, but I can do your work with the help of my notes and experience.
I am Prashant Gupta, and I am very interested in providing support as a Hadoop admin. Below are brief details for your consideration. Experience: 6+ years of overall IT experience and 4 years of hands-on experience as a Big Da…
I can easily help you complete this project; contact me. I have good experience in HDFS and data streaming.