Vacancies at Kenya Airways

  • Contents
  • Open Jobs
    1. Senior Manager – Data Engineering and Governance
    2. Data Governance Manager
    3. Data Engineer Manager
  • Method of Application

Senior Manager – Data Engineering and Governance

  • Job Type Full Time
  • Qualification BA/BSc/HND
  • Experience 5 years
  • Location Nairobi
  • Job Field Data, Business Analysis and AI

Brief Description

  • Job Purpose Statement: Reporting to the Head, Data, Analytics & AI, the Head of Data Engineering & Governance will be responsible for leading a team of data engineers and data governance specialists to design and deliver data engineering solutions for our internal and external facing business lines. You will get the chance to apply your strategic planning, business analysis and technical knowledge of data engineering, data governance, tools, and data architecture definition. In addition to managing our portfolio of datasets, you will play a key role in helping our data scientists and analysts leverage these datasets effectively.

Detailed Description

  • Using architecture and data engineering techniques to design and provide tools dedicated to data extraction, analysis and enhancement (build common service layers as much as possible)
  • Perform research and analysis (including technological watch) as needed to understand market trends and impact
  • Contribute to building & maintaining the global analytic environment of SGL (which includes Data Science & Big data platform, Data Catalog and Data Capture tools) to ease exploitation of data
  • Take part in the strategic committee for Data Analytics Solution
  • Ensure compliance with policies related to Data Management and Data Protection, in close relationship with the Data Protection Officer, Security & Risk regulation teams
  • Contribute to building data engineering pipelines & API for Data Science / Big Data applications
  • Take active part in data architecture conception, environments design, core components development based on conceptual architecture/design, etc.
  • Design, manage and support PoC, contribute to the choice of tools (build or buy) with all the team & the Group, test solutions. Identify and challenge partners and providers when relevant.
  • Document services and build all relevant documentation
  • Act as an SME and technical lead for all data engineering questions, and manage data engineers within the Data Analytics Solution organization.
  • Promote data cultural change within the division to build a data-driven company (convince people of the importance of data, how it should be managed and used, …)
  • Collaborate with SGL local teams, FIT department colleagues, IT SME (functional, data, solution and technical architects, data scientists, innovators, business experts…)
  • Promote services, contribute to the identification of innovative initiatives within the Group, share information on new technologies in dedicated internal communities.

Job Requirements

  • Bachelor’s degree in Statistics, Software Engineering, Engineering, Machine Learning, Mathematics, Computer Science, Economics, or any other related quantitative field.
  • Big Data, Analytics or Data Science certification from recognized institutions
  • At least 5 years’ experience in BI development
  • Proven and successful experience track record of leading high-performing data engineering teams
  • Proven experience taking innovations from exploration to production, which may include containerization, machine learning/AI, Agile environments, and on-premise and cloud platforms (Azure, AWS, Google) (mandatory).

Data Engineer Manager

Brief Description

  • The data engineer will oversee the expansion and optimization of our data architecture and data pipelines with the purpose of improving data analysis and ML workflows. The data engineer will handle the design and construction of scalable data systems and research new use cases for data acquisition.

Detailed Description

Technical

  • Help to design and develop a data estate that is performant, accessible, secure, scalable, maintainable, and extensible.
  • Help to implement true CI/CD.
  • Design and develop EDW using Snowflake, DBT etc.
  • Design and develop AWS/Azure Data Lake
  • Design and develop data ingestion pipelines
  • Model EDW entities and ensure all data is complete, accurate, timely, and well documented
  • Work towards the implementation of a true Self-Service analytics platform.

Governance & Data Protection

  • Ensure that all work follows best security practices and fully adheres to DPA and other data regulations.
  • Ensure that all work follows the correct approval and sign-off process before it is pushed into Production.
  • Ensure that all work is documented and, if needed, has a runbook to guarantee business continuity and support.
  • Work with others in the team to keep the Data Dictionary and Ubiquitous Language complete and up to date.

Process

  • Follow existing processes and work to improve/identify gaps in these processes.
  • Ensure the correct SDLC promotion processes are followed.
  • Follow the correct sign-off processes to ensure that only approved releases are deployed into Production.
  • Ensure that all AWS/Azure development follows CI/CD processes and is repeatable.

Team/People

  • Evangelize about the data team across the business.
  • Build relationships with members of the data team and the wider business team.
  • Work closely and collaboratively with all members of the data team.
  • Work closely with and learn from tech and team leads and challenge proposed solutions with your own ideas.

Job Requirements

Job Specifications

  • Bachelor’s degree in computer science or related fields
  • Some experience and knowledge of a coding language such as Python.
  • Good experience and knowledge of the SQL query language.
  • Some understanding of star schemas and data warehouse concepts.
  • Some knowledge of AWS/Azure tools and technologies
  • Beneficial: ETL and ELT experience, both batch and microservices-led.
  • Minimum three (3) years’ experience in a data engineer role or similar.
  • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
