
Data Engineer at International Rescue Committee (IRC)

International Rescue Committee (IRC) | Abuja, Nigeria | Data and Artificial Intelligence
Full Time
The International Rescue Committee (IRC) is a global humanitarian aid, relief and development nongovernmental organization. Founded in 1933 at the request of Albert Einstein, the IRC offers emergency aid and long-term assistance to refugees and those displaced by war, persecution or natural disaster. The IRC is currently working in over 40 countries and 22 U.S. cities where it resettles refugees and helps them become self-sufficient. Composed of first responders, humanitarian relief workers, international development experts, health care providers, and educators, the IRC has assisted millions of people around the world since its founding in 1933.

Job Summary: 

The Data Engineer will play a pivotal role in the development, maintenance, and optimization of our data infrastructure, focusing on data integration, ETL/ELT processes, and the management of data warehouses and lakehouses.

The successful candidate will be responsible for building and maintaining data pipelines across Databricks, Synapse, and Fabric environments, with tools such as DBT and Databricks pipelines. Key responsibilities include ensuring system readiness in terms of security, performance, and health, completing data loads, and implementing data models to support various business domains. This hands-on role demands strong technical expertise alongside excellent communication and collaboration skills.

Major Responsibilities: 

  • Develop Python, SQL, and PySpark-based applications and data flows within Databricks.
  • Build and maintain data pipelines using DBT and Databricks, ensuring efficient and scalable data processes.
  • Design and implement real-time and batch data processing systems to support analytics, reporting, and business needs.
  • Monitor and analyze data pipelines for performance, reliability, cost, and efficiency.
  • Proactively address any issues or bottlenecks to ensure smooth operations.
  • Discover opportunities for process improvements, including redesigning pipelines and optimizing data delivery mechanisms.
  • Manage and maintain Azure cloud services and monitor alerts to ensure system availability and performance.

Minimum Requirements:

  • Demonstrated ability to write SQL scripts (required).
  • Some experience with Python (strong plus).
  • Exposure to Databricks is a significant advantage.
  • Experience working in a cloud environment (strong plus).
  • Experience with DBT Core or DBT Cloud is a major plus.
  • Ability to quickly learn and absorb existing and new data structures.
  • Excellent interpersonal and communication skills (both written and verbal).
  • Ability to work independently and collaboratively within a team.

Preferred Additional Requirements

  • Experience with cloud platforms (Azure preferred).
  • Databricks Data Engineer Certification or similar. 
  • Software development experience.

Key Working Relationships: 

  • Data Team
  • Enterprise system owners and the technical and analytics teams they lead
