The Dangote Group is one of the most diversified business conglomerates in Africa, with a hard-earned reputation for excellent business practices and product quality. Its operational headquarters are in the bustling metropolis of Lagos, Nigeria, in West Africa. The Group's activities encompass:
- Cement - Manufacturing / Importing
- Sugar - Manufacturing & Refining
- Salt - Refining
- Flour & Semolina - Milling
- Pasta - Manufacturing
- Noodles - Manufacturing
- Poly Products - Manufacturing
- Logistics - Port Management and Haulage
- Real Estate
- Dangote Foundation
Since inception, the Group has experienced phenomenal growth on account of the quality of its goods and services, its focus on cost leadership, and the efficiency of its human capital. Today, Dangote Group is a multi-billion-Naira company poised to reach new heights, in every endeavour competing with itself to better its past performance. The Group's core business focus is to provide local, value-added products and services that meet the 'basic needs' of the populace. Through the construction and operation of large-scale manufacturing facilities in Nigeria and across Africa, the Group is building local manufacturing capacity to generate employment and provide goods for the people.
Description
The ideal candidate will have a strong background in data engineering, with proficiency in ETL/ELT processes, big data technologies, and cloud platforms. You will play a critical role in ensuring data accessibility, quality, and scalability to support business intelligence, analytics, and operational objectives. This position offers an exciting opportunity to work collaboratively with cross-functional teams and leverage your expertise to shape our data infrastructure.
Responsibilities
- Design, develop, and implement scalable data pipelines to extract, transform, and load data from various sources.
- Build and maintain data warehouses and data lakes to support business intelligence and analytics.
- Write high-quality code and conduct peer code reviews to ensure best practices.
- Develop data models and schemas for effective data analysis and reporting.
- Optimize data processing, query performance, and platform efficiency.
- Ensure data quality, integrity, and security through robust governance practices.
- Lead the migration of legacy data platforms to Azure.
- Modernize applications and databases to leverage Azure's advanced capabilities.
- Implement monitoring solutions for database usage, performance, and reliability.
- Develop automated data quality checks and testing procedures.
Requirements
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 4+ years of experience in a similar role within the technology industry.
- Proven experience in data engineering, with a strong foundation in ETL/ELT processes and data warehousing.
- Proven ability to work with both OLTP (Postgres/MySQL) and OLAP systems (Redshift/Vertica/Snowflake).
- Hands-on experience in designing, optimizing, and managing batch and streaming data pipelines.
- Expertise in data migration and modernization, particularly on the Azure cloud platform.
- Microsoft Certification in Azure Data Engineering is highly desirable.
- Expertise in SQL, Python, and other programming languages relevant to data engineering.
- Experience with big data technologies such as Hadoop, Spark, and Hive.
- Knowledge and hands-on experience with cloud platforms like AWS, Azure, or GCP, including services such as Azure Data Factory, Azure Synapse Pipelines, Azure Event Hubs, and Azure IoT Hub.
- Familiarity with data visualization tools like Tableau or Power BI.
- Knowledge of machine learning and statistical modelling is a plus.
- Onsite availability required.
Benefits
- Private Health Insurance.
- Paid Time Off.
- Opportunities for Professional Growth and Career Advancement.
- Training and Development Programs.
- Competitive Salary.
- Collaborative and Supportive Work Environment.