Job Description
Design, build, and maintain scalable data pipelines and architectures that support analytics, reporting, and business intelligence initiatives.
- Enables data-driven decision-making by ensuring data availability and quality.
- Leads automation and optimization of data workflows.
- Supports cross-functional teams with robust data infrastructure.
Qualifications
Type of Qualification: First Degree
Field of Study: Engineering or a related field
Experience Required
5-7 years
- Experience building databases, warehouses, reporting, and data integration solutions
- Experience building and optimising big data pipelines, architectures, and data sets
- Experience creating and integrating APIs
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Deep understanding of data pipelining and performance optimisation
- Understanding of data principles and how data fits within an organisation, including customer, product, and transactional information
- Knowledge of integration patterns, styles, protocols, and systems theory
- Experience with database programming languages, including SQL and PL/SQL, with Spark and/or other appropriate data tooling
- Experience with data pipeline and workflow management tools
Additional Information
Behavioural Competencies:
- Articulating Information
- Developing Expertise
- Embracing Change
- Examining Information
- Interpreting Data
- Team Working
Technical Competencies:
- Big Data Frameworks and Tools
- Data Engineering
- Data Integrity
- Data Quality
- Stakeholder Management