I need a Kafka setup with several Kafka clusters: one central Kafka cluster X and several decentralized satellite clusters A, B, C and D.
X should run on a central server machine, and A, B, C and D should run on satellite machines that automatically sync up to X (MirrorMaker could possibly be used for this).
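With MirrorMaker 2 (the replication tool shipped with Kafka), the sync is driven by a properties file. A minimal sketch for one satellite follows; the cluster aliases and hostnames (`kafka-a`, `kafka-x`) are my own assumptions, chosen to match a containerised setup:

```properties
# Cluster aliases
clusters = A, X

# Bootstrap servers (hostnames assumed from the Docker setup)
A.bootstrap.servers = kafka-a:9092
X.bootstrap.servers = kafka-x:9092

# Replicate all topics from satellite A up to central X
A->X.enabled = true
A->X.topics = .*

# Nothing flows back down to the satellite
X->A.enabled = false

# Single-broker demo clusters: keep internal topics at replication factor 1
replication.factor = 1
checkpoints.topic.replication.factor = 1
heartbeats.topic.replication.factor = 1
offset-syncs.topic.replication.factor = 1
```

One file like this would run per satellite (B, C, D analogously). Because MirrorMaker 2 reads from the satellite's own log at committed offsets, it should simply lag during an outage and resume where it left off once the link returns.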
It is a requirement that a satellite cluster (A, B, C or D) may temporarily lose its connection to the central cluster X, for shorter or longer periods. During these disconnected periods, data sent locally to a satellite cluster should simply be buffered there, and as soon as the connection is re-established, the satellite machine should automatically sync up to X again (again, maybe using MirrorMaker). All data posted to a satellite machine comes directly from another process running a Python program (this also needs to be set up).

This entire setup should be emulated in Docker containers, with X running in one container, A in another, B in another, etc., so that everything can easily be started and stopped using Docker Compose.

If there is something better than MirrorMaker, I am open to suggestions if you can persuade me. As a bonus, I would like to persist the data in the clusters (possibly using a Postgres backend) so that it is always possible to pull data out of a cluster again (also using an external Python process).
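The Docker Compose emulation could look roughly like the sketch below, assuming the Confluent `cp-zookeeper`/`cp-kafka` images; it shows X plus one satellite A, with B, C and D repeating the satellite pattern. All service names are my own choices, and the MirrorMaker service assumes the `connect-mirror-maker` script is on the image's path, pointed at a properties file (not shown here) that enables A->X replication:

```yaml
services:
  zookeeper-x:
    image: confluentinc/cp-zookeeper:7.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka-x:                         # central cluster X
    image: confluentinc/cp-kafka:7.5.0
    depends_on: [zookeeper-x]
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper-x:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka-x:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

  zookeeper-a:
    image: confluentinc/cp-zookeeper:7.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka-a:                         # satellite cluster A (B, C, D analogous)
    image: confluentinc/cp-kafka:7.5.0
    depends_on: [zookeeper-a]
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper-a:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka-a:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

  mirrormaker-a:                   # replicates A -> X
    image: confluentinc/cp-kafka:7.5.0
    depends_on: [kafka-a, kafka-x]
    volumes:
      - ./mm2-a.properties:/etc/mm2.properties
    command: connect-mirror-maker /etc/mm2.properties
```

A volatile link can then be emulated with `docker network disconnect`/`connect` (or by pausing `kafka-x`), which is a convenient way to test the buffering requirement.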
To summarise, I need a push model with many peers and a central server running over a volatile network connection, like this:

        X
    ^   ^   ^   ^
    |   |   |   |
    m   m   m   m
    |   |   |   |
    A   B   C   D

X is the central server
A, B, C, D are satellites
m are MirrorMaker instances
| indicates a connection
<>^v indicate the direction of a connection
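The buffer-and-flush behaviour the satellites need over this volatile link can be sketched in plain Python, with no broker required: messages queue locally while the link is down and drain in order once it is back. The class and the injected `send_fn` are illustrative stand-ins, not kafka-python API (whose `KafkaProducer` does its own in-memory buffering and retries):

```python
from collections import deque

class BufferingProducer:
    """Buffer messages while the broker is unreachable, flush in order
    once sending works again. `send_fn` stands in for a real producer
    call such as kafka-python's KafkaProducer.send (an assumption here,
    not something prescribed by the post)."""

    def __init__(self, send_fn):
        self._send = send_fn     # callable(msg); raises OSError when offline
        self._buffer = deque()   # locally buffered messages, oldest first

    def produce(self, msg):
        self._buffer.append(msg)
        return self.flush()

    def flush(self):
        # Drain in FIFO order; stop at the first failure so ordering
        # is preserved across disconnected periods.
        while self._buffer:
            try:
                self._send(self._buffer[0])
            except OSError:
                return len(self._buffer)   # still offline, keep buffering
            self._buffer.popleft()
        return 0                           # fully synced


# Tiny demo with a fake, flaky transport (no Kafka needed):
delivered = []
online = False

def send(msg):
    if not online:
        raise OSError("broker unreachable")
    delivered.append(msg)

p = BufferingProducer(send)
p.produce("a")
p.produce("b")        # link down: both messages sit in the buffer
online = True
p.flush()             # link restored: delivered == ["a", "b"]
```

In the real setup this role is mostly played by the satellite broker's own log (the Python producer talks to its local, normally-reachable cluster, and MirrorMaker catches up later), but the same stop-at-first-failure pattern applies wherever the flaky hop sits.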