
Data Engineer at Zedcrest Capital Limited

Zedcrest Capital Limited | Lagos, Nigeria | Data and Artificial Intelligence
Full Time
Zedcrest Capital is a privately funded investment firm with interests in fixed-income securities trading and proprietary investments across varied sectors of the productive economy. In every venture we invest in, we are steadfastly committed to putting our clients' interests first. This fiduciary responsibility defines our relationship with clients and informs every decision we make. This core principle is the foundation of our business as we work to provide value to all our stakeholders.

JOB SUMMARY

We are seeking an experienced Senior Data Engineer to architect and build our data infrastructure supporting analytics, machine learning, and real-time processing across our fintech platform. This role will be instrumental in designing scalable, cloud-agnostic solutions that handle financial data with the highest standards of accuracy, security, and compliance.

RESPONSIBILITIES

  • Architect and build scalable data infrastructure across multi-cloud environments (AWS, GCP, Azure)
  • Design and maintain ETL/ELT pipelines processing high-volume financial data with strict accuracy requirements
  • Implement real-time streaming pipelines for fraud detection, transaction monitoring, and event-driven workflows
  • Establish data governance frameworks ensuring compliance with financial regulations (SOC2, PCI-DSS, GDPR)
  • Build and optimize data warehouses supporting analytics, ML, and business intelligence
  • Mentor engineers and drive best practices for data quality, security, and performance
  • Collaborate with data scientists, analysts, and product teams to deliver data solutions

REQUIREMENTS

  • 5+ years in data engineering, with 2+ years in a senior/lead capacity
  • Proven experience with financial data systems and regulatory compliance (PCI-DSS, SOC2)
  • Multi-cloud architecture experience (AWS, GCP, or Azure)
  • Expert SQL and data modeling across multiple databases (PostgreSQL, Snowflake, Redshift, BigQuery)
  • Strong programming skills in Python and/or .NET, Java, or Scala
  • Distributed processing frameworks (Spark, Flink, Beam)
  • Streaming technologies (Kafka, Kinesis, Pub/Sub)
  • Orchestration tools (Airflow, Prefect, Dagster)
  • Modern data stack (dbt, Fivetran, data observability tools)
  • Track record of mentoring engineers and leading technical projects
  • Strong communication skills with technical and non-technical stakeholders
  • Strategic thinking for long-term scalability and architecture decisions

Method of Application

Sign up to view application details.
