1. Analyze, develop, refactor, fix, test, review, and deploy functionality and bug fixes in the ETL that moves data between Snowflake cloud data warehouse layers.
2. Tune databases and queries, and diagnose and resolve performance issues, leveraging ELT and push-down where required.
3. Write complex SnowSQL queries and generate data extracts and metrics per business requirements.
4. Create SnowSQL scripts to perform unit testing and document them as artifacts.
5. Analyze the data, understand the data model, and create data mapping documents for report development.
6. Tune SnowSQL performance, identifying issues and resolving them within the agreed timelines.
7. Use and improve ETL frameworks, continuous data quality frameworks, and other automation in the data pipeline.
8. Meet data availability SLOs and attend triage meetings with the Security, Infrastructure, and Workload Management teams to resolve issues.
9. Comply with Agile Jira SDLC controls, ServiceNow change and incident management, and the DataOps GitLab CI/CD pipeline.
10. Participate in daily standups, lead design reviews, and coordinate with the offshore team.
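Items 4 and 7 above call for unit-testable, continuous data-quality checks in the pipeline. As a rough illustration only, here is a minimal sketch of such a check in Python; the `validate_batch` function, field names, and threshold are hypothetical, and in this role the equivalent checks would typically be expressed as SnowSQL queries run against Snowflake layers:

```python
def validate_batch(rows, required_fields, min_rows=1):
    """Return a list of human-readable data-quality violations
    for a batch of records (hypothetical example check)."""
    violations = []
    # Volume check: the batch must contain at least min_rows records.
    if len(rows) < min_rows:
        violations.append(f"row count {len(rows)} below minimum {min_rows}")
    # Completeness check: every required field must be populated.
    for i, row in enumerate(rows):
        for field in required_fields:
            if row.get(field) in (None, ""):
                violations.append(f"row {i}: missing required field '{field}'")
    return violations

# Usage with a toy batch:
batch = [
    {"order_id": 1, "amount": 9.99},
    {"order_id": 2, "amount": None},
]
print(validate_batch(batch, required_fields=["order_id", "amount"]))
# -> ["row 1: missing required field 'amount'"]
```

An empty result list means the batch passed; non-empty results can be logged as test artifacts or used to fail the pipeline stage.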
I have 5+ years of experience with big data solutions on the AWS cloud. I have designed more than 200 data pipelines performing data ingestion, data quality validation, and ETL.