
Data Engineer at Clickatell

Clickatell | Lagos, Nigeria | Data and Artificial Intelligence
Full Time

Clickatell is a global leader in mobile messaging and transaction services, enabling its customers to connect, interact, and transact with their business partners and communities on the mobile phone. Clickatell's global footprint means it can deliver short message services (SMS) through its next-generation Clickatell Message eXchange (CMneXt) to over 960 mobile networks in over 220 countries and territories, with the potential to reach 6 billion mobile phone users, more than 80 percent of the world's population. In addition, with Clickatell Transaction eXchange (CTX), it provides the essential link between mobile consumers and their financial institutions, with services like airtime top-up. More than 15,000 enterprise, government, medium, and small business customers and application developers have embraced Clickatell's technology solutions. Founded in 2000, Clickatell is headquartered in Redwood City, California, USA.

Purpose

  • As a Senior Data Engineer, you will join our cross-functional data team, supporting the design, build, and testing of high-performance, scalable, event-level data solutions. You will work with structured, semi-structured, and unstructured data, and you will play a critical role in strengthening our analytical presence in the Chat Commerce market.

We Do The Right Things

Responsibilities of the Role

Data Pipeline Management:

  • Design, build, and maintain scalable data pipelines for data collection, storage, and processing (experience using AWS services would be a plus), while maintaining data integrity through rigorous testing and validation. Build processes that support data transformation, workload management, data structures, dependency management, and metadata.
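The dependency-management side of the bullet above can be sketched as a minimal dependency-aware task runner using only the Python standard library. The task names are hypothetical illustrations, not part of any Clickatell system, and a real pipeline would use an orchestrator rather than this sketch:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical pipeline: each task maps to the set of tasks it depends on.
pipeline = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_join"},
}

def run_order(tasks: dict) -> list:
    """Return an execution order that respects every declared dependency."""
    return list(TopologicalSorter(tasks).static_order())

order = run_order(pipeline)
# Both extract steps precede the join; the load step runs last.
```

An orchestration tool would add scheduling, retries, and metadata capture on top of exactly this kind of dependency graph.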

Collaboration:

  • Work closely with data scientists, analysts, and other stakeholders to understand data requirements and ensure data quality. Actively participate in solution design and modelling to ensure data products are developed according to best practices, standards, and architectural principles.

ETL/ELT Processes:

  • Develop and manage ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes to convert raw data into usable formats.
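As an illustration of the ETL pattern named above, here is a minimal plain-Python sketch. The field names and the in-memory "warehouse" are assumptions for demonstration; a production process would read from real sources and load into a database:

```python
import csv
import io

def extract(raw_csv: str) -> list:
    """Extract: parse raw CSV text into row dictionaries."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list) -> list:
    """Transform: cast types and drop rows with missing amounts."""
    out = []
    for row in rows:
        if not row.get("amount"):
            continue  # skip unusable rows rather than loading bad data
        out.append({"user_id": row["user_id"], "amount": float(row["amount"])})
    return out

def load(rows: list, warehouse: list) -> None:
    """Load: append cleaned rows to the (illustrative) warehouse."""
    warehouse.extend(rows)

raw = "user_id,amount\nu1,10.5\nu2,\nu3,7.25\n"
warehouse = []
load(transform(extract(raw)), warehouse)
```

In an ELT variant the raw rows would be loaded first and the type-casting and filtering pushed down into the warehouse engine, typically as SQL.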

Data Governance:

  • Ensure data compliance and security, maintaining high standards for data integrity and privacy.

Performance Monitoring:

  • Monitor and optimize the performance of data systems, identifying and implementing improvements. Monitor and troubleshoot data platform issues, providing timely resolutions.

AI/ML:

  • Build and maintain Clickatell's AI and machine learning pipelines, ensuring seamless integration with existing data infrastructure. Experience developing and deploying large language models (LLMs) to improve natural language processing (NLP) capabilities and automate data-driven insights would be a plus.

Innovation:

  • Identify opportunities to improve existing ETL processes, enhancing data integrity and accuracy.

Mentorship:

  • Oversee and mentor junior data engineers, ensuring alignment with business objectives and best practices.

Team Player:

  • Self-directed and dedicated team player who positively engages with the team to solve problems.
  • Protect Clickatell's information, intellectual property, and corporate data systems in accordance with prescribed guidelines.
  • Work in a scrum-based team that is passionate about enabling a data culture throughout the organization.

We Are On A Learning Journey
Requirements of the Role

  • Bachelor's degree in Computer Science, Engineering, or a related field.

Work Experience

  • Experience: 3+ years of experience in data engineering, with a strong background in building and maintaining ETL/ELT processes and data pipelines, and 2+ years of experience working in cloud environments (AWS or Azure).
  • Technical Skills: Proficiency in SQL, Python, and PySpark; big data technologies such as Hadoop and Spark; cloud platforms like AWS or Azure; and streaming technologies such as Kafka and Kinesis. Proficiency in AWS services (e.g. S3, Redshift, DynamoDB, Aurora, EMR, Glue) would be great.
  • AI/ML Skills: Experience with machine learning frameworks (e.g., TensorFlow, PyTorch) and AI model deployment. Experience with generative AI models and techniques would be a plus.
  • Soft Skills: Strong verbal and written communication skills for collaborating with cross-functional teams and presenting findings to non-technical stakeholders.
  • Analytical Skills: Ability to analyse complex data sets and derive actionable insights.
  • CI/CD Skills: Experience working with DevOps tools such as GitLab, Jenkins, CodeBuild, CodePipeline, CodeDeploy, etc.
  • Nice to have: Experience working with large language models (LLMs) for NLP tasks, including model fine-tuning and deployment.

Knowledge and Abilities

  • Strong analytical and problem-solving skills.
  • Self-disciplined, eager to help, and, most importantly, possessing a thirst for continual learning.
  • Builds on our coaching culture: someone who is not only willing but also passionate about assisting colleagues.
  • A passion for working with data and a desire to learn and adapt to new technologies.
  • Ability to make a difference and a lasting impact.
