FairMoney is a platform that helps people access instant loans within 5 minutes through our Android app.
From the corners of Kano, Nigeria, to the quarters of Chennai, India, we’re connecting people to prosperity by bringing an effective financial product to them. We’re giving them the loans they need to fuel their dreams and a payment plan that allows them to make their vision a reality.
Roles and Responsibilities
As a Data Engineer at FairMoney, your responsibilities will include, but are not limited to:
- Work closely with Data Analysts and Data Scientists to understand business problems and translate requirements into database, ETL, or reporting solutions.
- Design, build, and maintain integrations of heterogeneous data sources for DW & BI solutions to simplify analysis across products.
- Design and implement data models aligning with business needs and supporting best practices in data warehousing.
- Contribute to the ongoing development and optimization of the data warehouse architecture.
- Implement tools and processes that ensure data quality and freshness with reliable, versioned, and scalable solutions.
- Identify issues in data flow and improvements to the data stack, which comprises visualization (Tableau), data warehousing (BigQuery), data modeling (dbt), git, and other in-house and open-source tools.
- Implement best practices for indexing, partitioning, and query optimization.
- Stay up-to-date with new technologies in Data Engineering, Analytics, and Data Science.
- Implement new technologies in a production environment to create a frictionless platform for the data analytics and data science teams, reducing turnaround time from raw data to insights and ML training/serving.
- Make it easy for business stakeholders to better understand the data, and make the organization data-literate for self-serve analytics.
Requirements
- 3+ years of work experience designing, developing, and maintaining ETL pipelines, databases, OLAP schemas, and public objects (attributes, facts, metrics, etc.)
- Hands-on experience in designing and implementing data ingestion/integration processes using SQL/Python.
- Strong proficiency in SQL and experience with database systems (e.g., PostgreSQL, MySQL, or similar)
- Proficient in developing automated workflows using Airflow or similar tools.
- Proficiency in data principles, systems and architecture, dimensional data modeling for Data Warehousing and Business Intelligence, and data governance.
- Good to have: development experience building business applications, with a good command of Bash, Docker, Kubernetes, and cloud platforms.
- Ability to learn new software and technologies quickly to create prototypes for business use cases and make them ready for production.
- Effectively build relationships with business stakeholders to help drive the adoption of data-driven decision-making.
- Excellent problem-solving skills and attention to detail.
Method of Application
Sign up to view application details.