Senior Data Engineer
John Keells Holdings PLC (JKH) is Sri Lanka’s largest conglomerate listed on the Colombo Stock Exchange. From managing hotels and resorts in Sri Lanka and the Maldives to providing port, marine fuel and logistics services, from IT solutions and food and beverage manufacturing to running a chain of supermarkets, and from tea broking and stock broking to life insurance, banking and real estate, we have made our presence felt in virtually every major sphere of the economy.
Octave will be the cornerstone of our data-driven strategic and operational decisions!
Team Octave will solve JKH’s most intractable problems across industry verticals, building pipelines with Python and Azure Data Factory on one of the country’s richest data lakes and implementing complex machine learning algorithms that will impact millions of Sri Lankans.
We are looking for driven and enthusiastic analytics practitioners for our Data and Advanced Analytics Centre of Excellence, Octave.
Role description
The Data Engineers will work with the data engineering team to help develop high-performance, highly scalable enterprise applications. Working closely with more experienced data engineers, they will learn how to build pipelines and manage data. They will also work with data scientists and business leadership to develop data pipelines for model development and productionization. The role involves a high level of interaction with senior data engineers and the tech lead.
Key responsibilities
- Gather requirements, assess gaps and build roadmaps and architectures
- Work closely with Data Analysts to ensure data quality and availability for analytical modelling
- Work with data engineers and senior data engineers to learn how to explore suitable options and to design and create data pipelines (data lakes / data warehouses) for specific analytical solutions
- Identify gaps and implement solutions for data security, quality and automation of processes
- Support maintenance, bug fixing and performance analysis along the data pipeline
- Contribute to knowledge building and sharing by researching best practices, documenting solutions and continuously iterating on new ways to solve problems
Desired Skills / Competencies
Education + Technical skills / experience
- Bachelor's degree in Computer Science, Engineering, Mathematics or Statistics.
- 2+ years of experience in the data engineering field
- Basic knowledge of at least one programming language
- Basic understanding of Microsoft Azure or other leading cloud platforms
- Basic familiarity with maintaining and configuring data engineering and ETL tools (e.g., Informatica, Azure Data Factory, SSIS)
- Basic understanding of data tools such as Hive, Spark, Hadoop, or Presto
Managerial Skills
- Ability to work well in agile environments in diverse teams with multiple stakeholders
- Strong project management and organizational skills
- Ability to effectively communicate complex analytical and technical content
Mindset & Behavior
- High energy and passionate individual who inspires teammates to reach their maximum potential
- Excited about trying new solutions outside the standard approved approaches
- Willing to adopt an iterative approach; experimental mindset to drive innovation
Interested candidates are encouraged to apply on HIVE before 21/07/2023.