A pastebin alternative written for Cloud Computing project by:
PES2UG22CS393 PES2UG22CS902 PES2UG22CS416 PES2UG22CS384
$ flask --app api/api run -h localhost -p 8080
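For orientation, here's a hypothetical sketch of what a minimal paste endpoint inside api/api.py could look like. The route names and the in-memory dict are illustrative assumptions, not the project's actual implementation:

```python
# Hypothetical minimal paste API; the real api/api.py differs.
from flask import Flask, jsonify, request

app = Flask(__name__)
pastes = {}  # in-memory store, for illustration only


@app.route("/paste", methods=["POST"])
def create_paste():
    # assign a simple incremental id and store the submitted content
    paste_id = str(len(pastes) + 1)
    pastes[paste_id] = request.get_json(force=True).get("content", "")
    return jsonify({"id": paste_id}), 201


@app.route("/paste/<paste_id>")
def get_paste(paste_id):
    if paste_id not in pastes:
        return jsonify({"error": "not found"}), 404
    return jsonify({"content": pastes[paste_id]})
```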
- Docker
Make sure Docker Desktop is running in the background (if you're on Windows/Mac). If you face any issues with Docker in the terminal, maybe try logging in first:
$ docker login -u <docker_username> -p <docker_password>
- confluent-kafka
$ pip install confluent-kafka
- Start Kafka Stack.
$ docker compose up --build
- Run the docker_topics.py file.
$ python docker_topics.py
- Start the flask api.py file.
$ flask --app api run
- Run the simple consumer basic.py file.
$ python basic.py
- Run the api spammer.py file.
$ python spammer.py
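The spammer just has to hammer the API so there's traffic to log. A rough sketch with requests; the base URL and endpoint path are placeholders, so point them at whatever route your Flask app actually serves:

```python
# Hypothetical sketch of an API spammer; the real spammer.py may differ.
import time

import requests

API_URL = "http://localhost:5000"  # assumed flask address; adjust if you used -p


def spam(n=100, delay=0.1):
    for i in range(n):
        try:
            # endpoint path is a placeholder for the app's real route
            r = requests.post(f"{API_URL}/paste", json={"content": f"message {i}"})
            print(i, r.status_code)
        except requests.ConnectionError:
            print("API not reachable; is flask running?")
            break
        time.sleep(delay)


if __name__ == "__main__":
    spam()
```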
Good to go :D
P.S: When you're done, use
$ docker compose down
or
$ docker compose down -v
This one in particular removes the containers and also clears the persistent data in the volumes, so no stale logs stick around.
Our main library for interacting with PostgreSQL.
$ pip install psycopg2
- Run the docker compose file as mentioned previously.
- Run the docker_topics.py file.
- Run postgre_schema.py
$ python postgre_schema.py
This creates the tables in the database, ready to be filled with logs.
pgAdmin is a PostgreSQL visualisation tool (a GUI, basically) that helps you verify everything's fine. To access it:
- Go to localhost:8080
- email: admin@admin.com
  password: admin
- Click on 'Add New Server'. (unless somehow it already exists there on the left sidebar o_o)
- Give a name for your Server.
- Move to the "Connection" tab.
  Hostname: postgres
  Port: 5432
  Maintenance Database: logs_db
  username: admin
  password: password
- Save.
You got it :)
You can navigate to the server, look for logs_db and check if everything's fine. You can also run SQL commands by clicking on logs_db and navigating to the 'cylinder with a triangle' icon. That's your Query Tool (essentially an SQL shell). Run commands. Check if logs are present. Have a bun samosa 🍔
(Should misfortune befall you, here's a 15 min video on using pgadmin: https://youtu.be/WFT5MaZN6g4?si=36_71O4zve2JD7vC)
- docker-compose.yml has been updated to accommodate Prometheus and Grafana.
- You might need prometheus_client, just to be on the safe side.
$ pip install prometheus_client
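For reference, wiring prometheus_client into the Flask app usually looks something like this; the metric name and routes are illustrative assumptions, and the real api.py may differ:

```python
# Hypothetical sketch of metrics instrumentation; the real api.py may differ.
from flask import Flask
from prometheus_client import CONTENT_TYPE_LATEST, Counter, generate_latest

app = Flask(__name__)
# counter exposed as api_requests_total, labelled per endpoint
REQUESTS = Counter("api_requests", "Total requests served", ["endpoint"])


@app.route("/")
def index():
    REQUESTS.labels(endpoint="/").inc()
    return "ok"


@app.route("/metrics")
def metrics():
    # Prometheus scrapes this endpoint (see the Targets page on localhost:9090)
    return generate_latest(), 200, {"Content-Type": CONTENT_TYPE_LATEST}
```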
- You'll need to follow the same previous steps of setting up all the necessary files.
- Configure Prometheus
- Log into Prometheus via localhost:9090
- Go to Status -> Targets. Ensure the api.py target shows as UP.
- Perfect, we can move to Grafana.
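The api.py target only shows up on that page because Prometheus was told to scrape it; the scrape job in prometheus.yml looks roughly like this (the job name and target address are assumptions tied to your compose setup):

```yaml
# Hypothetical scrape job; match it to your actual prometheus.yml
scrape_configs:
  - job_name: "api"
    static_configs:
      # wherever the flask app is reachable FROM the prometheus container
      - targets: ["host.docker.internal:5000"]
```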
- Configure Grafana
- Log into Grafana via localhost:3000
- username: admin
  password: admin
  (you'll be prompted to set a new password; if you want, you can)
- You'll now have to add a data source (Prometheus, basically)
- Connections -> Search for Prometheus -> Add new Data Source
- You can give it a name
  URL: http://prometheus:9090
- Scroll down and click "Save & Test" to confirm it's working.
- Navigate to Dashboards
- Create Dashboards
Good to go :D