eHealth4everyone is a leading digital health social enterprise dedicated to making the world healthier. We are a new kind of mission-driven organization with services, expertise and products focused on making the business of healthcare delivery work for everyone through technology-enabled optimizations. We believe that if health is a right, proven digital health solutions and expertise such as ours should not be a privilege. Working with participants across the spectrum of healthcare, including individuals, healthcare providers, government departments and other stakeholders, insurance organizations, pharmaceutical companies, and various private sector players, we enable high performance across the continuum of healthcare delivery and management.
Job Description
- As a Data Engineer specializing in data lakes, data warehouses, and ETL, you will be responsible for designing, implementing, and maintaining our data infrastructure.
- You will work closely with data scientists, analysts, and other stakeholders to ensure seamless data flow, high-quality data, and accessibility for analytical and operational use cases.
Key Responsibilities
- Design, build, and maintain scalable data lake and data warehouse architectures to store structured and unstructured data.
- Develop and manage ETL (Extract, Transform, Load) processes to ingest data from various sources into the data lake and data warehouse (a minimal sketch of such a pipeline follows this list).
- Ensure data quality, data governance, and data security practices are implemented and maintained.
- Collaborate with data scientists and analysts to understand data requirements and provide solutions for data access and analysis.
- Optimize data storage and retrieval performance.
- Monitor and troubleshoot data infrastructure issues, ensuring high availability and reliability.
- Implement and maintain data catalog and metadata management tools.
- Stay updated with the latest trends and technologies in data engineering, data lakes, and data warehouses.
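To make the ETL responsibility above concrete, here is a minimal sketch of the kind of pipeline this role involves, written in PySpark. It is illustrative only: the bucket, paths, and column names (event_id, event_time) are assumptions, not details from this posting, and a production job would add orchestration, schema enforcement, and broader data-quality checks.

```python
# Minimal PySpark ETL sketch: raw JSON events from a data lake bucket are
# cleaned and written out as partitioned Parquet for downstream warehouse loads.
# All paths, column names, and the bucket are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("events-etl")
    .getOrCreate()
)

# Extract: read semi-structured JSON landed in the data lake (path is hypothetical).
raw = spark.read.json("s3a://example-datalake/raw/events/")

# Transform: drop malformed rows, normalize types, and derive a partition column.
clean = (
    raw
    .dropna(subset=["event_id", "event_time"])          # discard incomplete records
    .withColumn("event_time", F.to_timestamp("event_time"))
    .withColumn("event_date", F.to_date("event_time"))  # partition key
    .dropDuplicates(["event_id"])                       # basic idempotency guard
)

# Load: write partitioned Parquet that a warehouse (e.g., Redshift Spectrum,
# BigQuery external tables, or Snowflake stages) can query or ingest.
(
    clean.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3a://example-datalake/curated/events/")
)

spark.stop()
```

In practice, a job like this would typically be scheduled by an orchestrator and registered in a data catalog so analysts can discover the curated output.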
Qualifications
- Bachelor's Degree in Computer Science, Information Technology, or a related field.
- 3+ years of experience in data engineering or a similar role.
- Strong experience with data lake technologies such as AWS S3, Azure Data Lake, Google Cloud Storage, or similar.
- Proficiency in ETL tools and processes (e.g., AWS Glue, Apache NiFi, Talend).
- Experience with big data processing frameworks like Apache Spark or Hadoop.
- Knowledge of data warehousing concepts and technologies (e.g., Amazon Redshift, Google BigQuery, Snowflake).
- Experience with SQL and NoSQL databases.
- Familiarity with data governance and data security best practices.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
Preferred Qualifications
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with containerization and orchestration tools like Docker and Kubernetes.
- Knowledge of data catalog and metadata management tools (e.g., AWS Glue Data Catalog, Apache Atlas).
- Experience with data visualization tools and techniques.
- Relevant certifications in data engineering or cloud platforms.