Data Engineer (CONFLUENT KAFKA) at 2AM Tech Limited

2AM Tech Limited | Lagos, Nigeria | Data and Artificial Intelligence
Full Time
2AM Tech is a major provider of outsourced tech talent, with engineers currently deployed in more than 12 countries. We also provide innovative enterprise-strength tools for automating organisational processes. Our principal activities include business process automation, software design, research and development, artificial intelligence, and big data.

Role Description:

  • This is a contract Senior Data Engineer role with some flexibility for remote work. The Senior Data Engineer will deploy, maintain, and support the Confluent Kafka platform within the client organization in line with the customer's strategy, collaborating with co-workers in an agile team to ensure the approach meets the customer's needs.
  • The Senior Data Engineer will work in a hybrid role, based in Lagos with the ability to work from home.

Qualifications:

  • Data Engineering and Data Modeling skills
  • Data Analytics skills
  • Strong programming skills in languages such as SQL and Python.
  • Experience with Agile methodologies
  • 4 years' work experience with Confluent Kafka (brokers, ZooKeeper, Kafka Connect, Schema Registry, Kafka REST).
  • 4 years' experience working in an event-streaming environment
  • Solid experience creating Kafka topics, installing and configuring relevant Kafka connectors, and administering Confluent Kafka environments (see the topic-administration sketch after this list)
  • Solid experience working with the Linux operating system
  • Experience with Kerberos authentication and authorization (krb5, JAAS).
  • Experience with SSL communication (keystores and truststores); see the security configuration sketch after this list.
  • Experience working as part of an agile team
  • Experience using Ansible
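
As a rough illustration of the topic-administration work above, the sketch below creates a topic with the confluent-kafka Python AdminClient. The broker address, topic name, and sizing values are placeholder assumptions, not details from this posting.

```python
# Minimal topic-creation sketch, assuming the confluent-kafka Python
# client and a reachable broker. All names and settings are illustrative.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "broker1:9092"})  # hypothetical broker

# Request a new topic; partition and replication counts are assumptions.
topic = NewTopic("orders.events", num_partitions=6, replication_factor=3)
futures = admin.create_topics([topic])

for name, future in futures.items():
    try:
        future.result()  # blocks until the broker confirms creation
        print(f"Created topic {name}")
    except Exception as exc:
        print(f"Failed to create topic {name}: {exc}")
```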
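
The Kerberos and SSL items above usually come together in client security settings. In the Java stack these live in a JAAS file and JKS keystores/truststores; the hedged sketch below shows the librdkafka-style equivalents used by the confluent-kafka Python client. Every principal, path, and password is a placeholder assumption.

```python
# Kerberos (GSSAPI) over TLS consumer configuration sketch, assuming the
# confluent-kafka Python client. Principals, keytabs, and stores are
# illustrative placeholders only.
from confluent_kafka import Consumer

conf = {
    "bootstrap.servers": "broker1:9093",        # hypothetical TLS listener
    "security.protocol": "SASL_SSL",            # Kerberos auth over TLS
    "sasl.mechanism": "GSSAPI",                 # Kerberos via SASL
    "sasl.kerberos.service.name": "kafka",
    "sasl.kerberos.principal": "svc-app@EXAMPLE.COM",      # placeholder
    "sasl.kerberos.keytab": "/etc/security/keytabs/svc-app.keytab",
    "ssl.ca.location": "/etc/kafka/ssl/ca.pem",            # truststore role
    "ssl.keystore.location": "/etc/kafka/ssl/client.p12",  # PKCS#12 keystore
    "ssl.keystore.password": "changeit",                   # placeholder secret
    "group.id": "demo-group",
    "auto.offset.reset": "earliest",
}

consumer = Consumer(conf)
consumer.subscribe(["orders.events"])
msg = consumer.poll(5.0)   # fetch a single message as a smoke test
if msg is not None and msg.error() is None:
    print(msg.value())
consumer.close()
```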

Responsibilities:

  • Develop and implement solutions using Confluent Kafka
  • Administer and improve the use of Confluent Kafka across the client organization, including Kafka Connect, ZooKeeper, brokers, Schema Registry, Kafka REST, ksqlDB, and custom implementations.
  • Work with multiple teams to ensure the best use of Confluent Kafka and the safe streaming of event data.
  • Understand and apply event-driven architecture patterns and Confluent Kafka best practices, and enable development teams to do the same.
  • Assist developers and the operations team in ensuring that the Confluent Kafka platform is configured, secured, and operated in line with customer expectations.
  • Learn continuously to serve as the Confluent Kafka subject matter expert within the organization.
  • Work with Kafka and Confluent APIs (e.g. metadata, metrics, admin) to provide proactive insights and automation (see the admin-API sketch after this list).
  • Work with the DevOps team to ensure that Kafka-related metrics are exported to the required platform.
  • Perform regular reviews of performance data to ensure efficiency and resiliency.
  • Contribute regularly to event-driven patterns, best practices, and guidance.
  • Review feature releases and change logs for Confluent Kafka and related components to ensure the best use of these systems across the organization.
  • Develop an expert-level understanding of data integration, migration, and deployment using CI/CD tools as they relate to Confluent Kafka.
  • Acquire a deep understanding of source and sink connector technical details for a variety of platforms, including S3, Cassandra, Oracle, and others as required (see the connector deployment sketch after this list).
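
As a sketch of the admin-API and metrics work above, the example below pulls cluster metadata and consumer-group state with the confluent-kafka AdminClient (list_consumer_groups assumes a reasonably recent client, roughly 2.x); the broker address is a placeholder. Output like this is a natural starting point for the lag and health checks a DevOps pipeline would export.

```python
# Proactive-insight sketch using the confluent-kafka AdminClient.
# The broker address is a placeholder, and list_consumer_groups
# requires a recent client version.
from confluent_kafka.admin import AdminClient

admin = AdminClient({"bootstrap.servers": "broker1:9092"})  # hypothetical broker

# Cluster metadata: brokers and topics currently visible to the client.
md = admin.list_topics(timeout=10)
print(f"{len(md.brokers)} brokers, {len(md.topics)} topics")
for broker in md.brokers.values():
    print(f"broker {broker.id} at {broker.host}:{broker.port}")

# Consumer groups: a starting point for lag monitoring and automation.
result = admin.list_consumer_groups(request_timeout=10).result()
for group in result.valid:
    print(f"group {group.group_id} state={group.state}")
```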
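
For the connector item above, a common deployment pattern is to submit connector configuration to the Kafka Connect REST API. The sketch below registers a Confluent S3 sink connector with Python's requests library; the Connect URL, bucket, region, and topic are placeholder assumptions.

```python
# Sketch of registering an S3 sink connector via the Kafka Connect
# REST API. All names and settings are illustrative placeholders.
import requests

connect_url = "http://connect:8083/connectors"  # hypothetical Connect worker

payload = {
    "name": "s3-sink-orders",
    "config": {
        "connector.class": "io.confluent.connect.s3.S3SinkConnector",
        "topics": "orders.events",
        "s3.bucket.name": "example-data-lake",   # placeholder bucket
        "s3.region": "eu-west-1",                # placeholder region
        "storage.class": "io.confluent.connect.s3.storage.S3Storage",
        "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
        "flush.size": "1000",                    # records per S3 object
        "tasks.max": "2",
    },
}

resp = requests.post(connect_url, json=payload, timeout=10)
resp.raise_for_status()   # fail loudly on a bad config
print(resp.json())
```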

Optional Skills:

  • Flink streaming experience
  • Experience with RabbitMQ
  • Experience with JulieOps
  • Experience with Apache NiFi

Method of Application

Sign up to view application details.