
Data Engineering Instructor at AltSchool Africa

AltSchool Africa | Lagos, Nigeria | Data and Artificial Intelligence
Contract

AltSchool Africa is an edtech product, an alternative to school, launched with the sole motivation to increase employability and reduce unemployment by helping talents in Africa jump-start a career in tech. With AltSchool Africa, everyone can benefit from the myriad of opportunities available in tech. At AltSchool Africa we pride ourselves on a culture of execution that caters to wavemakers rather than wave riders. If you're interested in joining a diverse and dynamic team that values speed, continuous improvement, focus, and accountability, and is on an impact mission, you'll feel right at home at AltSchool Africa.

Job Summary

  • We are seeking a seasoned professional with a passion for Data Engineering and education to join our team as a Data Engineering Instructor.
  • The ideal candidate will have extensive experience in Data Engineering, proficiency in relevant tools and technologies, and a dedication to fostering a rich learning environment for students.

Technical Requirements
Core Technical Skills:

  • BLOB Storage / File Processing: Understanding of Binary Large Objects (BLOBs) and how to efficiently store, retrieve, and process large files, such as images, videos, and other unstructured data.
  • Python Programming Language: Strong command of Python, including its libraries and frameworks, to handle data processing, automation, and other programming tasks.
  • Relational Databases: In-depth knowledge of relational database management systems (RDBMS), including their design, normalization, indexing, and transaction management.
  • SQL: Advanced skills in SQL for querying, updating, and managing large datasets within relational databases, including complex joins, subqueries, and optimization techniques.
  • NoSQL or Object-Oriented Databases: Familiarity with NoSQL databases like MongoDB or Cassandra, focusing on non-relational data storage solutions that offer flexibility and scalability for unstructured data.
  • Data Warehouses/Lakes: Expertise in designing, implementing, and managing data warehouses and lakes, which are essential for storing and analyzing large volumes of structured and unstructured data.
  • Analytical Databases: Proficiency in using analytical databases designed for complex queries and data analysis, ensuring high performance and scalability.
  • Pipelines (Batch + Streaming): Knowledge of data pipelines, including batch processing for handling large volumes of data at once, and streaming for real-time data processing and analysis.
  • Job Scheduling: Experience with job scheduling tools that automate the execution of data processing tasks at specified times or in response to events.
  • Orchestration: Expertise in orchestrating complex workflows that involve multiple tasks, dependencies, and data flows using tools like Apache Airflow or Prefect (a minimal example follows this list).
  • dbt (Data Build Tool): Proficiency with dbt, a data transformation tool that enables data analysts and engineers to transform raw data into models for analysis.
  • Apache Beam: Understanding of Apache Beam for creating data processing pipelines that work across different execution engines, such as Apache Flink or Google Cloud Dataflow.
  • Apache Spark: In-depth knowledge of Apache Spark, an open-source distributed computing system, for big data processing and analytics.
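
For illustration only, here is a minimal sketch of the kind of orchestration, job scheduling, and dbt workflow the curriculum covers. It assumes Apache Airflow 2.x and dbt are installed; the DAG name, schedule, and dbt project path are hypothetical.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator
    from airflow.operators.python import PythonOperator


    def extract_raw_files():
        # Placeholder extract step: in a real pipeline this would pull raw
        # files (e.g. BLOBs from object storage) into a staging area.
        print("extracting raw files")


    with DAG(
        dag_id="daily_sales_pipeline",      # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",         # job scheduling: run once per day
        catchup=False,
        default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        extract = PythonOperator(
            task_id="extract_raw_files",
            python_callable=extract_raw_files,
        )
        transform = BashOperator(
            task_id="run_dbt_models",
            bash_command="dbt run --project-dir /opt/dbt",  # hypothetical path
        )
        extract >> transform                # run extract before the dbt models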

Who You Are:

  • You are eager to shape the skills, minds, and trajectories of the newest generation of data engineering professionals.
  • You are the person your learners see as a career mentor and can easily reach out to when they need help.
  • You have at least 4 years of industry experience with data engineering.
  • You are proficient in programming languages (SQL, Python, Java, Scala); database management (relational and NoSQL databases); data warehousing; data pipeline development (ETL/ELT, batch and stream processing); big data technologies (Apache Hadoop, Apache Spark); cloud platforms (AWS, Google Cloud, Microsoft Azure); data modeling and design; data governance and security; orchestration tools (Apache Airflow, Prefect); version control and CI/CD; problem-solving and analytical skills; collaboration and communication; attention to detail; and learning agility (a short batch-processing sketch follows this list).
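
For illustration, a short sketch of the batch-processing side of that toolset. It assumes PySpark is installed; the file paths and column names are hypothetical.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_batch_job").getOrCreate()

    # Batch read of a raw orders file, then a simple daily aggregation.
    orders = spark.read.option("header", True).csv("/data/raw/orders.csv")
    daily_totals = (
        orders.groupBy("order_date")
        .agg(F.sum(F.col("amount").cast("double")).alias("total_amount"))
    )

    # Write the curated result for downstream analytical queries.
    daily_totals.write.mode("overwrite").parquet("/data/curated/daily_totals")
    spark.stop()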

Soft Skills:

  • Communication: Ability to clearly articulate complex data engineering concepts and strategies to learners with diverse backgrounds.
  • Curriculum Design: Experience in creating comprehensive course outlines and detailed teaching materials for maximum engagement and learning.
  • Pedagogical Skills: Knowledge of virtual teaching methodologies and tools.
  • Teamwork: Proven ability to collaborate effectively with cross-functional teams.
  • Adaptability: Eagerness to continuously update the curriculum and teaching methods.

Preferred:

  • Experience teaching remotely and using the requisite tools (e.g., video conferencing, messaging, etc.).
  • A background in adult learning, pedagogy, and curriculum development.
