We need a Flink application (jar) that will run inside the Kinesis Data Analytics environment. It needs to read JSON records from a Kinesis Data Stream, accumulate them for 5 minutes, and then write them to an S3 bucket.
It will be very similar to AWS Kinesis Firehose, writing the events in batches to S3 files in GZIP format.
The most important requirement, which is the reason we can't use Firehose, is that the directory structure in the S3 bucket must use information that is inside the JSON record.
So the folder structure needs to be:
And eventType is at the ROOT level of the JSON object:
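Since the concrete folder layout isn't shown above, here is a minimal sketch of the path-derivation logic a custom Flink `BucketAssigner` for the `StreamingFileSink` would implement, assuming a hypothetical layout of `<eventType>/yyyy/MM/dd/HH`. The regex-based field extraction is a simplification for illustration; a real job would parse each record with a JSON library such as Jackson.

```java
import java.time.ZoneOffset;
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class BucketPath {
    // Naive extraction of the root-level "eventType" field.
    // Production code would use a proper JSON parser (e.g. Jackson)
    // inside a custom BucketAssigner implementation.
    private static final Pattern EVENT_TYPE =
        Pattern.compile("\"eventType\"\\s*:\\s*\"([^\"]+)\"");

    // Assumed date partitioning; adjust to the layout you actually need.
    private static final DateTimeFormatter DATE_PART =
        DateTimeFormatter.ofPattern("yyyy/MM/dd/HH");

    public static String bucketFor(String json, ZonedDateTime when) {
        Matcher m = EVENT_TYPE.matcher(json);
        String eventType = m.find() ? m.group(1) : "unknown";
        return eventType + "/" + DATE_PART.format(when);
    }

    public static void main(String[] args) {
        String record = "{\"eventType\":\"purchase\",\"amount\":12.5}";
        ZonedDateTime t = ZonedDateTime.of(2020, 1, 2, 3, 0, 0, 0, ZoneOffset.UTC);
        System.out.println(bucketFor(record, t)); // purchase/2020/01/02/03
    }
}
```

In the actual Flink job, this logic would live in `BucketAssigner.getBucketId(...)`, and the 5-minute accumulation would come from the sink's rolling policy combined with checkpointing, since the `StreamingFileSink` finalizes part files on checkpoints.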