go-kafka-example

Data processing with Apache Kafka, REST API and Redis.

This project processes on-chain blockchain data using Apache Kafka, a REST API, and Redis. There are three Kafka topics: "address", "label" and "transaction", all handled by a single producer and a single consumer.

The producer, listening on port 8080, receives REST API requests from users and forwards the corresponding messages to the consumer (port 8081). Redis is used as the in-memory database.
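
The Kafka client library is not spelled out in this README; as a minimal sketch of the producer side, assuming the segmentio/kafka-go client and a broker at localhost:9092 (both assumptions), publishing one message to the "address" topic could look like:

    package main

    import (
        "context"
        "log"

        "github.com/segmentio/kafka-go"
    )

    func main() {
        // Assumption: Kafka broker reachable at localhost:9092.
        w := &kafka.Writer{
            Addr:     kafka.TCP("localhost:9092"),
            Topic:    "address",
            Balancer: &kafka.LeastBytes{},
        }
        defer w.Close()

        // Publish one address record; the key/value layout here is illustrative only.
        err := w.WriteMessages(context.Background(), kafka.Message{
            Key:   []byte("0x87631B45877794f9cdd50a70c827403e3C36d072"),
            Value: []byte(`{"address":"0x87631B45877794f9cdd50a70c827403e3C36d072","labels":["eoa"]}`),
        })
        if err != nil {
            log.Fatal("failed to write message: ", err)
        }
    }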

Prerequisites

(stack diagram image)

Run the application

  1. Start Docker:

    make run/docker
  2. Start the producer in a new terminal:

    make run/producer
  3. Start the consumer in a new terminal:

    make run/consumer
  4. Send API requests to localhost:8080 (see the Go request sketch after this list).

    For example:
    • create an address: send a POST request to localhost:8080/addresses/0x87631B45877794f9cdd50a70c827403e3C36d072 with body

      {
        "address": "0x87631B45877794f9cdd50a70c827403e3C36d072",
        "labels": ["eoa"]
      }
    • get an address: send a GET request to localhost:8080/addresses/0x87631B45877794f9cdd50a70c827403e3C36d072

    • create a transaction: send a POST request to localhost:8080/transactions/0x6ffa912cc7da2b5ec51a2cc1152ab39a54f0c72f1b3f32072c9bba154b585780 with body

      {
        "hash": "0x6ffa912cc7da2b5ec51a2cc1152ab39a54f0c72f1b3f32072c9bba154b585780",
        "chainid": 2,
        "from": "0x4838B106FCe9647Bdf1E7877BF73cE8B0BAD5f97",
        "to": "0x388C818CA8B9251b393131C08a736A67ccB19297",
        "status": "success"
      }
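
Any HTTP client works for step 4; as a minimal Go sketch (standard library only) of the create-address call shown in the first example above:

    package main

    import (
        "bytes"
        "fmt"
        "io"
        "log"
        "net/http"
    )

    func main() {
        // Request body from the create-address example above.
        body := []byte(`{"address":"0x87631B45877794f9cdd50a70c827403e3C36d072","labels":["eoa"]}`)

        resp, err := http.Post(
            "http://localhost:8080/addresses/0x87631B45877794f9cdd50a70c827403e3C36d072",
            "application/json",
            bytes.NewReader(body),
        )
        if err != nil {
            log.Fatal(err)
        }
        defer resp.Body.Close()

        // Print the response status and payload returned by the producer.
        out, _ := io.ReadAll(resp.Body)
        fmt.Println(resp.Status, string(out))
    }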

REST API endpoints

  • base URL: localhost:8080
  • /addresses: GET
  • /addresses/:address: GET, POST, PUT, DELETE
  • /labels: GET
  • /labels/:label: GET, POST, PUT, DELETE
  • /transactions: GET
  • /transactions/:hash: GET, POST, PUT, DELETE
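
The endpoint table above maps naturally onto a Go HTTP router. Below is a minimal sketch using only the standard library's ServeMux (Go 1.22+ method/path patterns); the actual project may use a different router, and the handlers here are placeholders rather than the project's controllers:

    package main

    import (
        "fmt"
        "log"
        "net/http"
    )

    func main() {
        mux := http.NewServeMux()

        // Placeholder handlers; in the project these would call the controllers,
        // which in turn publish messages to Kafka.
        mux.HandleFunc("GET /addresses", func(w http.ResponseWriter, r *http.Request) {
            fmt.Fprintln(w, "list addresses")
        })
        mux.HandleFunc("GET /addresses/{address}", func(w http.ResponseWriter, r *http.Request) {
            fmt.Fprintln(w, "get address", r.PathValue("address"))
        })
        mux.HandleFunc("POST /addresses/{address}", func(w http.ResponseWriter, r *http.Request) {
            fmt.Fprintln(w, "create address", r.PathValue("address"))
        })

        log.Fatal(http.ListenAndServe(":8080", mux))
    }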

Project structure

  • /cmd main applications for this project
  • /cmd/producer Kafka producer; it sends messages to Kafka.
  • /cmd/consumer Kafka consumer; it processes the received messages.
  • /config configuration
  • /controllers controllers that handle API endpoint requests
  • /models data structures and models used in the REST API
  • /utils utility/helper functions
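
As an illustration of the consumer's role (reading a topic and caching the payload in Redis), here is a rough sketch assuming the segmentio/kafka-go and redis/go-redis clients; the broker address, Redis address, and consumer group ID are placeholders, not values taken from this project:

    package main

    import (
        "context"
        "log"

        "github.com/redis/go-redis/v9"
        "github.com/segmentio/kafka-go"
    )

    func main() {
        ctx := context.Background()

        // Assumption: Redis reachable at localhost:6379.
        rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})

        // Assumption: Kafka broker at localhost:9092; group ID is illustrative.
        r := kafka.NewReader(kafka.ReaderConfig{
            Brokers: []string{"localhost:9092"},
            GroupID: "go-kafka-example",
            Topic:   "address",
        })
        defer r.Close()

        for {
            msg, err := r.ReadMessage(ctx)
            if err != nil {
                log.Fatal(err)
            }
            // Cache the raw message payload, keyed by the Kafka message key.
            if err := rdb.Set(ctx, string(msg.Key), msg.Value, 0).Err(); err != nil {
                log.Println("redis set failed: ", err)
            }
        }
    }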
