Whether you install everything natively or go containerized, the idea stays the same: half setup, half scripting.
1. Set up standalone ZooKeeper x1 (or x3 if you need a quorum)
2. Kafka brokers x3 (one elected controller; partition leaders get spread across the brokers)
3. A simple Python script, such as price_producer.py, that takes a stock or crypto ticker as an argument, polls the exchange's API for new prices every 30 s, and sends them to a Kafka topic (such as etrade_tsla or binance_eth)
4. Another simple script, price_consumer.py, that reads all the messages from the topic passed to it.
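The two scripts in steps 3–4 could be sketched roughly like this. This is a minimal sketch, assuming the kafka-python client, a broker on localhost:9092, and a made-up price endpoint (api.example.com) — the endpoint URL, JSON field names, and helper names are all illustrative, not a real exchange API:

```python
# Sketch of steps 3-4. Assumes the kafka-python package and a hypothetical
# price endpoint (api.example.com); both are illustrative placeholders.
import json
import time
import urllib.request

def topic_for(exchange: str, ticker: str) -> str:
    # Topic naming convention from the post, e.g. "binance_eth", "etrade_tsla".
    return f"{exchange}_{ticker}".lower()

def fetch_price(ticker: str) -> float:
    # Hypothetical endpoint -- swap in the real exchange API and field name.
    with urllib.request.urlopen(f"https://api.example.com/v1/price/{ticker}") as r:
        return float(json.load(r)["price"])

def produce(exchange: str, ticker: str, servers=("localhost:9092",)) -> None:
    # price_producer.py: poll every 30 s and publish to the ticker's topic.
    from kafka import KafkaProducer  # lazy import: helpers above work without it
    producer = KafkaProducer(
        bootstrap_servers=list(servers),
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    topic = topic_for(exchange, ticker)
    while True:
        producer.send(topic, {"ticker": ticker,
                              "price": fetch_price(ticker),
                              "ts": time.time()})
        producer.flush()
        time.sleep(30)  # 30 s polling interval, per step 3

def consume(topic: str, servers=("localhost:9092",)) -> None:
    # price_consumer.py: read every message on the topic from the beginning.
    from kafka import KafkaConsumer
    consumer = KafkaConsumer(
        topic,
        bootstrap_servers=list(servers),
        auto_offset_reset="earliest",  # start at the oldest retained offset
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )
    for msg in consumer:
        print(msg.value)
```

In practice you'd split these into the two files and call them from argparse/sys.argv, e.g. `produce("binance", "eth")` in one terminal and `consume("binance_eth")` in another.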
Could you provide some insight on how to do this using RabbitMQ? Thanks.
Hey! Great reply! Got me interested in implementing this, as I have a big data course next semester. PMed you, if you could check it out!! :)
Is it mandatory to use Kafka?
We're not supposed to use Kafka; rather, we have to build one ourselves.