
Apache Kafka Jobs in Nigeria

View jobs on TechTalentZone that require Apache Kafka skills
  • Solution Architect
    Onafriq · Lagos, Nigeria · 29 November · Hybrid
    Onafriq is an omnichannel network of networks, making borders matter less by providing our partners with a single pathway to unlock the full power of ...

  • Cloud Engineer
    Moniepoint Inc. (Formerly TeamApt Inc.) · Lagos, Nigeria · 27 November · Remote
    Moniepoint is a financial technology company digitising Africa’s real economy by building a financial ecosystem for businesses, providing them with all ...

  • Java Engineer
    Princeps Credit Systems Limited · Lagos, Nigeria · 26 November · Onsite
    At Credit Wallet we are determined to make financial loans accessible to those who need it most. You can apply for a Credit Wallet loan from the comfort of ...

  • Enterprise Architect - Applications & Integrations
    Atlas Copco · Lagos, Nigeria · 19 November · Onsite
    With a tradition of innovation dating back to the Group’s foundation in 1873, Atlas Copco’s business has evolved into four distinct business areas, ...

  • Senior Data Engineer
    Aku Fintech Services Limited · Lagos, Nigeria · 19 November · Onsite
    Aku offers easy payments, simple banking - for everyone. Send / receive money and pay bills via USSD, app, cards + more. We’re a digital bank licensed ...

  • Java Developer (Senior-level)
    Zipnet Innovations & Technologies Ltd · Lagos, Nigeria · 04 October · Onsite
    At Zipnet Innovation & Technologies Ltd we are a technology services and solutions provider specialized in end-to-end IT services and solutions for business and ...

  • Middleware Engineer (Payments)
    Tek Experts · Lagos, Nigeria · 18 September · Hybrid
    Tek Experts provides the services of a uniquely passionate and expert workforce that takes intense pride in helping companies manage their business operations. ...

  • Middleware Engineer (Payments)
    Ascentech Services Limited · Lagos, Nigeria · 13 September · Onsite
    Ascentech Services Ltd acts as a gateway to provide a wide range of recruitment and selection services to companies. We are a dedicated team of professional ...

  • Solution Architect
    Onafriq · Lagos, Nigeria · 22 August · Onsite
    Onafriq is an omnichannel network of networks, making borders matter less by providing our partners with a single pathway to unlock the full power of ...

  • Apache Fineract Senior Engineer / Architect
    Mkobo Microfinance Bank Limited (Mkobobank) · Lagos, Nigeria · 20 July · Onsite
    MKOBO Microfinance Bank Limited is a fully licensed MFB by the Central Bank of Nigeria (CBN). MKOBO was conceived to help solve consumers’ need for ...

  • Application Developer
    Tranter IT Infrastructure Services Limited · Lagos, Nigeria · 26 June · Onsite
    TITIS is Tranter IT Infrastructure Services Limited, a spin-off of Tranter International Company. Tranter International Company was incorporated in ...

  • Senior Data Engineer
    Flutterwave · Lagos, Nigeria · 24 June · Onsite
    Our mission is to power a new wave of prosperity across Africa. By enabling global digital payments on a continent that’s been largely cut off from the ...

  • Cloud Infrastructure Engineer
    Moniepoint Inc. (Formerly TeamApt Inc.) · Nigeria · 11 June · Remote
    Moniepoint is a financial technology company digitising Africa’s real economy by building a financial ecosystem for businesses, providing them with all ...

  • Senior Data Engineer
    Flutterwave · Lagos, Nigeria · 02 May · Hybrid
    Our mission is to power a new wave of prosperity across Africa. By enabling global digital payments on a continent that’s been largely cut off from the ...

  • Data & Machine Learning Engineer
    TalentUp Africa · Lagos, Nigeria · 19 March · Onsite
    TalentUp Africa uses quizzes and games, all based on specific lessons, to identify candidates’ capabilities, skill sets, and personalities. Through ...

  • Group Head: Platform Engineering
    Cellulant · Lagos, Nigeria · 16 March · Onsite
    Cellulant is a mobile commerce and content company that manages, delivers, and bills for digital content and commerce services actualized over telecom networks. ...

What is Apache Kafka?

Apache Kafka is a distributed data store optimized for ingesting and processing streaming data in real time. Streaming data is data that is continuously generated by thousands of data sources, which typically send the data records simultaneously. A streaming platform needs to handle this constant influx of data and process the data sequentially and incrementally.

Kafka provides three main functions to its users:

  • Publish and subscribe to streams of records

  • Effectively store streams of records in the order in which records were generated

  • Process streams of records in real-time

Kafka is primarily used to build real-time streaming data pipelines and applications that adapt to the data streams. It combines messaging, storage, and stream processing to allow storage and analysis of both historical and real-time data.

Developers can leverage these Kafka capabilities through four APIs:

  1. Producer API: This enables an application to publish a stream to a Kafka topic. A topic is a named log that stores the records in the order they occurred relative to one another. After a record is written to a topic, it can’t be altered or deleted; instead, it remains in the topic for a preconfigured amount of time—for example, for two days—or until storage space runs out.

  2. Consumer API: This enables an application to subscribe to one or more topics and to ingest and process the stream stored in the topic. It can work with records in the topic in real time, or it can ingest and process past records.

  3. Streams API: This builds on the Producer and Consumer APIs and adds complex processing capabilities that enable an application to perform continuous, front-to-back stream processing—specifically, to consume records from one or more topics, to analyze or aggregate or transform them as required, and to publish resulting streams to the same topics or other topics. While the Producer and Consumer APIs can be used for simple stream processing, it’s the Streams API that enables the development of more sophisticated data- and event-streaming applications.

  4. Connector API: This lets developers build connectors, which are reusable producers or consumers that simplify and automate the integration of a data source into a Kafka cluster.
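The topic semantics behind the Producer and Consumer APIs can be sketched with a small in-memory model. This is an illustrative toy, not the real Kafka client API: the `Topic`, `publish`, and `consume` names are hypothetical stand-ins that show how records are appended to a named, ordered log and can be re-read from any past offset.

```python
import time
from dataclasses import dataclass, field

@dataclass
class Record:
    key: str
    value: str
    timestamp: float = field(default_factory=time.time)

class Topic:
    """A named, append-only log. Records are never altered in place; they
    are only dropped once they exceed the retention window (e.g. two days)."""
    def __init__(self, name: str, retention_seconds: int = 2 * 24 * 3600):
        self.name = name
        self.retention_seconds = retention_seconds
        self.log: list[Record] = []

    def publish(self, key: str, value: str) -> int:
        """Producer-side analogue: append a record, return its offset."""
        self.log.append(Record(key, value))
        return len(self.log) - 1

    def consume(self, from_offset: int = 0) -> list[Record]:
        """Consumer-side analogue: read in order from any past offset,
        so an application can process records in real time or replay history."""
        return self.log[from_offset:]

topic = Topic("payments")
topic.publish("user-1", "debit 500")
topic.publish("user-2", "credit 300")

# Replaying the stream from offset 0 yields every retained record in order.
replayed = [r.value for r in topic.consume(from_offset=0)]
```

A real application would use a Kafka client library and a running broker; the point here is only the append-only, offset-addressed contract the APIs expose.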

How does Kafka work?

Kafka combines two messaging models, queuing and publish-subscribe, to provide the key benefits of each to consumers. Queuing allows data processing to be distributed across many consumer instances, making it highly scalable, but traditional queues aren’t multi-subscriber. The publish-subscribe approach is multi-subscriber, but because every message goes to every subscriber it cannot be used to distribute work across multiple worker processes. Kafka uses a partitioned log model to stitch these two approaches together. A log is an ordered sequence of records, and these logs are broken up into segments, or partitions, that correspond to different subscribers. This means there can be multiple subscribers to the same topic, and each is assigned a partition, which allows for higher scalability. Finally, Kafka’s model provides replayability, which lets multiple independent applications read from the same data streams and each work at its own rate.
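The partitioned log model above can be sketched in a few lines. This is a simplified illustration with made-up names, not Kafka internals: records with the same key land in the same partition, the partitions of a topic are split across the consumers of one group (the queuing behaviour), and a second, independent group reads the same partitions at its own pace (the publish-subscribe and replayability behaviour).

```python
from collections import defaultdict

NUM_PARTITIONS = 3

def partition_for(key: str) -> int:
    # Kafka hashes the record key to choose a partition; hash() stands in here,
    # preserving the property that equal keys always map to the same partition.
    return hash(key) % NUM_PARTITIONS

log = defaultdict(list)  # partition index -> ordered list of (key, value)

def publish(key: str, value: str) -> None:
    log[partition_for(key)].append((key, value))

def assign(partitions, consumers):
    """Queuing behaviour: divide a topic's partitions among the consumers of
    one group, so each record is processed by exactly one worker in the group."""
    groups = defaultdict(list)
    for p in partitions:
        groups[consumers[p % len(consumers)]].append(p)
    return dict(groups)

for i in range(12):
    publish(f"user-{i}", f"event-{i}")

# One group of two workers splits the partitions between themselves ...
workers = assign(range(NUM_PARTITIONS), ["worker-a", "worker-b"])
# ... while a second, independent group gets its own full assignment and can
# replay the same partitions at its own rate.
analytics = assign(range(NUM_PARTITIONS), ["analytics-1"])
```

Real Kafka consumer-group rebalancing is more involved (heartbeats, rebalance protocols), but the core idea is this partition-to-consumer assignment.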

What are the benefits of Kafka's approach?

  1. Scalable: Kafka’s partitioned log model allows data to be distributed across multiple servers, making it scalable beyond what would fit on a single server.

  2. Fast: Kafka decouples data streams from the applications that produce and consume them, so messages are delivered with very low latency.

  3. Durable: Partitions are distributed and replicated across many servers, and the data is all written to disk. This helps protect against server failure, making the data very fault-tolerant and durable.
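The durability point rests on replication: each partition is copied to several brokers, so losing one server loses no acknowledged data. A minimal sketch of that idea, with hypothetical broker names and a simple round-robin placement (real Kafka placement is more sophisticated):

```python
REPLICATION_FACTOR = 3
BROKERS = ["broker-0", "broker-1", "broker-2", "broker-3"]

def replicas_for(partition: int) -> list[str]:
    """Place a partition's replicas on consecutive brokers, round-robin."""
    return [BROKERS[(partition + i) % len(BROKERS)]
            for i in range(REPLICATION_FACTOR)]

def survives_failure(partition: int, failed_broker: str) -> bool:
    """Data survives as long as at least one replica sits on a live broker."""
    return any(b != failed_broker for b in replicas_for(partition))
```

With a replication factor of 3, partition 0 is held by broker-0, broker-1, and broker-2, so the failure of any single broker still leaves two live copies.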