To design, code, verify, test, document, amend, and secure data pipelines and data stores according to agreed architecture, solution designs, standards, policies and governance requirements. To monitor and report on own progress and proactively identify issues related to data engineering activities. To collaborate in reviews of work with others where appropriate.
Qualifications
- Completed Matric
- Information Studies Degree
- Information Technology Degree
Experience
- 5-7 years’ experience in data pipelining and performance optimization; understanding of data principles and how data fits into an organization, including customer, product and transactional information. Knowledge of integration patterns, styles, protocols and systems theory
- 5-7 years’ experience in building databases, warehouses, reporting and data integration solutions, and in building and optimizing big data pipelines, architectures and data sets. Experience in creating and integrating APIs. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- 5-7 years’ experience in database programming languages, including SQL, PL/SQL, Spark and/or appropriate data tooling. Experience with data pipeline and workflow management tools
Additional Information
Behavioural Competencies
- Adopting Practical Approaches
- Articulating Information
- Checking Things
- Developing Expertise
- Documenting Facts
Technical Competencies
- Big Data Frameworks and Tools
- Data Engineering
- Data Integrity
- IT Knowledge