Detailed Description
- Help to design and develop a data estate that is performant, accessible, secure, scalable, maintainable, and extensible.
- Help to implement true CI/CD.
- Design and develop the EDW using Snowflake, dbt, etc.
- Design and develop an AWS/Azure data lake.
- Design and develop data ingestion pipelines (a brief sketch follows this list).
- Model EDW entities and ensure all data is complete, accurate, timely, and well documented.
- Work towards the implementation of a true self-service analytics platform.
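
To give a concrete flavour of the data lake and ingestion responsibilities above, here is a minimal sketch, assuming an AWS S3-based landing zone; the bucket name, prefix, and source file are hypothetical placeholders rather than part of the role description.

```python
"""Minimal landing-zone ingestion sketch, assuming an AWS S3 data lake.
The bucket, prefix, and source file below are hypothetical placeholders."""
from datetime import date
from pathlib import Path

import boto3

LAKE_BUCKET = "example-data-lake"   # hypothetical bucket name
RAW_PREFIX = "raw/sales"            # hypothetical landing-zone prefix


def land_csv(source_file: str) -> str:
    """Upload a raw CSV extract into a date-partitioned landing-zone path."""
    key = f"{RAW_PREFIX}/ingest_date={date.today():%Y-%m-%d}/{Path(source_file).name}"
    boto3.client("s3").upload_file(source_file, LAKE_BUCKET, key)
    return key


if __name__ == "__main__":
    print(land_csv("sales_extract.csv"))  # hypothetical local extract
```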
Job Specifications
- Bachelor’s degree in computer science or a related field.
- Some experience with and knowledge of coding languages such as Python.
- Good experience with and knowledge of SQL.
- Some understanding of star schemas and data warehouse concepts.
- Some knowledge of AWS/Azure tools and technologies.
- Beneficial: ETL and ELT experience, both batch and microservices-led.
- Minimum three (3) years’ experience in a data engineering role or similar.
- Experience with data pipeline and workflow management tools such as Azkaban, Luigi, and Airflow (see the sketch after this list).
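
As an illustration of the workflow-management requirement, below is a minimal Airflow DAG sketch, assuming Airflow 2.x; the DAG id, schedule, and task callables are hypothetical placeholders.

```python
"""Minimal Airflow DAG sketch (Airflow 2.x assumed); the DAG id, schedule,
and task callables are hypothetical placeholders."""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Placeholder: pull a raw extract from a source system."""
    print("extracting source data")


def load():
    """Placeholder: load the extract into the warehouse."""
    print("loading into the EDW")


with DAG(
    dag_id="example_daily_ingest",     # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task          # extract must complete before load
```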