
Senior Data Engineer (Remote) at Yassir

Yassir | Lagos, Nigeria | Data and Artificial Intelligence
Full Time
Yassir is the leading super app for on-demand and payment services in the Maghreb region, set to change the way daily services are provided. It currently operates in 26 cities across Algeria, Morocco and Tunisia, with recent expansions into France, Canada and Sub-Saharan Africa. It is backed (+$43M in funding) by VCs from Silicon Valley, Europe and other parts of the world, including Y Combinator, the incubator behind the likes of Airbnb, Stripe, Dropbox and Doordash. We offer on-demand services such as ride-hailing and last-mile delivery. Building on this infrastructure, we are now introducing financial services to help our users pay, save and borrow digitally, helping usher the continent into a digital economy era. We're not just about serving people - we're about creating a marketplace that brings people what they need while infusing social values.

Responsibilities

  • Build a centralized data lake on GCP Data services by integrating diverse data sources throughout the enterprise.
  • Develop, maintain, and optimize Spark-powered batch and streaming data processing pipelines. Leverage GCP data services for complex data engineering tasks and ensure smooth integration with other platform components.
  • Design and implement data validation and quality checks to ensure data's accuracy, completeness, and consistency as it flows through the pipelines.
  • Work with the Data Science and Machine Learning teams to engage in advanced analytics.
  • Collaborate with cross-functional teams, including data analysts, business users, operational and marketing teams, to extract insights and value from data.
  • Collaborate with the product team to design, implement, and maintain the data models for analytical use cases.
  • Design, develop, and maintain data dashboards for various teams using Looker Studio.
  • Engage in technology exploration, research and development, and POCs, and conduct deep investigations and troubleshooting.
  • Design and manage ETL/ELT processes, ensuring data integrity, availability, and performance.
  • Troubleshoot data issues and conduct root cause analysis when reporting data is in question.
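To illustrate the validation and quality checks mentioned above, here is a minimal sketch in plain Python in the spirit of frameworks like Great Expectations (named under the required skills). All column names, thresholds, and the sample data are hypothetical, and a real pipeline would run equivalent checks inside Spark rather than over in-memory rows:

```python
# Minimal sketch of pipeline data-quality checks covering accuracy,
# completeness, and consistency. Column names and bounds are hypothetical.

def check_not_null(rows, column):
    """Completeness: every row must carry a non-null value in `column`."""
    failures = [i for i, r in enumerate(rows) if r.get(column) is None]
    return {"check": f"not_null({column})", "passed": not failures, "failures": failures}

def check_unique(rows, column):
    """Consistency: values in `column` must be unique (e.g. a primary key)."""
    seen, dupes = set(), []
    for i, r in enumerate(rows):
        value = r.get(column)
        if value in seen:
            dupes.append(i)
        seen.add(value)
    return {"check": f"unique({column})", "passed": not dupes, "failures": dupes}

def check_range(rows, column, lo, hi):
    """Accuracy: numeric values in `column` must fall within [lo, hi]."""
    bad = [i for i, r in enumerate(rows)
           if r.get(column) is not None and not lo <= r[column] <= hi]
    return {"check": f"range({column})", "passed": not bad, "failures": bad}

def validate(rows):
    """Run the suite; a pipeline would halt or quarantine rows on failure."""
    return [
        check_not_null(rows, "ride_id"),
        check_unique(rows, "ride_id"),
        check_range(rows, "fare", 0, 10_000),
    ]

rides = [
    {"ride_id": "a1", "fare": 12.5},
    {"ride_id": "a2", "fare": -3.0},  # fails the range check
    {"ride_id": "a2", "fare": 7.0},   # duplicate ride_id
]
results = validate(rides)
```

Each check reports which row indices failed, so a scheduler task (e.g. in Airflow) can log the offending records and decide whether to fail the run.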

Requirements
Required Technical Skills:

  • PySpark
  • GCP - BigQuery, Dataproc, Dataflow, Dataplex, Pub/Sub and Cloud Storage
  • Advanced SQL knowledge
  • NoSQL (Preferably MongoDB)
  • Programming languages - Scala/Python
  • Great Expectations or a similar data quality (DQ) framework
  • Familiarity with workflow management tools like Airflow, Prefect or Luigi.
  • Understanding of data governance, data warehousing (DWH) and data modelling.

Good to have skills:

  • Infrastructure as Code - Terraform
  • Docker and Kubernetes
  • Looker Studio
  • AI and ML engineering knowledge.

Method of Application

Sign up to view application details.