· Understand business and functional requirements provided by the Architect to develop and deploy Spark- and Java-based solutions within agreed timelines.
· Design and develop applications and application features in Java, Spark, and Big Data technologies, conforming to quality standards, best practices, and expectations throughout the development life cycle.
· Own the end-to-end implementation of assigned data processing components/product features, i.e. the design, development, deployment, and testing of those components and their associated flows, conforming to best coding practices.
· Work closely with the Platform Architect and other internal stakeholders during the development life cycle; participate in design and system-testing activities and play a key role in building/upgrading the data processing modules and flows.
· Perform unit testing and integration testing.
· Manage data at scale using relational databases and SQL
· Build best-in-class ELT-based solutions for effective data ingestion and transformation.
· Work closely with IT and other internal stakeholders to deliver within timelines.
· Explore existing application systems, determine areas of complexity, proactively identify potential risks/issues, and provide resolutions for identified issues.
· Guide and mentor other data management team members in their day-to-day activities.
· Lead technical documentation efforts, i.e. documenting the technical approach to solutions, supporting test case scenarios, recommending associated configuration settings, noting technical limitations, etc.
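The ingestion-and-transformation work described in the responsibilities above can be sketched in plain Java. This is a minimal illustrative example of an extract-filter-transform step, not project code; the record layout and all names are hypothetical:

```java
import java.util.*;
import java.util.stream.*;

// Minimal sketch of a data-wrangling step: parse raw "id,name,amount"
// CSV-style records, drop malformed rows and non-positive amounts, and
// normalise the name field before loading. All names are hypothetical.
public class RecordCleaner {

    public static List<String> clean(List<String> rawLines) {
        return rawLines.stream()
                .map(line -> line.split(","))
                .filter(f -> f.length == 3)                 // drop malformed rows
                .filter(f -> {
                    try { return Double.parseDouble(f[2].trim()) > 0; }
                    catch (NumberFormatException e) { return false; }
                })
                .map(f -> f[0].trim() + "," + f[1].trim().toUpperCase()
                        + "," + f[2].trim())                // normalise fields
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> raw = Arrays.asList(
                "1, alice, 10.5",
                "2, bob, -3",        // dropped: non-positive amount
                "broken line",       // dropped: malformed
                "3, carol, 7");
        clean(raw).forEach(System.out::println);
        // prints:
        // 1,ALICE,10.5
        // 3,CAROL,7
    }
}
```

In a Spark job the same filter/map logic would typically run per partition over a Dataset rather than over an in-memory list.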
• 5 to 7 years of overall experience in software development using Java, with at least 2 years of in-depth coding in data processing using Spark (along with Big Data platforms, RDBMS, and NoSQL)
• Deep understanding of Java; expected to perform complex data transformations in Spark
• Advanced SQL capabilities are a must. Knowledge of database design techniques and experience working with extremely large data volumes are mandatory.
• Experience in Agile SDLC and AWS is a must.
• Product development experience is a must
• Expertise in ETL and data wrangling using Java
• Experience in programming in Linux/Unix environment, including shell scripting
• Excellent troubleshooting skills
• Spark Developer certification (from Databricks, MapR, Cloudera, Hortonworks, etc.) would be a plus
• Bachelor’s degree in Computer Science or Engineering or related discipline
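The "complex data transformations" and advanced SQL skills listed above usually come down to aggregations of this shape: a per-key sum, equivalent to SQL's `SELECT category, SUM(amount) ... GROUP BY category` or a Spark `groupBy().sum()`. A self-contained sketch using plain Java streams (names are illustrative only):

```java
import java.util.*;
import java.util.stream.*;

// GROUP BY-style aggregation in plain Java: sum amounts per category.
// The same logic maps directly onto Spark's groupBy().sum() or a SQL
// GROUP BY; shown with streams here so the example is self-contained.
public class CategoryTotals {

    public static Map<String, Double> totals(List<Map.Entry<String, Double>> rows) {
        return rows.stream().collect(Collectors.groupingBy(
                Map.Entry::getKey,                          // group key: category
                TreeMap::new,                               // deterministic ordering
                Collectors.summingDouble(Map.Entry::getValue)));
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Double>> rows = List.of(
                Map.entry("books", 12.0),
                Map.entry("games", 30.0),
                Map.entry("books", 8.0));
        System.out.println(totals(rows)); // {books=20.0, games=30.0}
    }
}
```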