## Working:
- Any process that wants to log data initializes a stream object (e.g. a NetBanking stream for NetBanking logs).
- The stream object exposes five logging methods: INFO, DEBUG, WARN, ERROR, and FATAL.
- Each logging method has its own queue, worker, and sink, all initialized on demand.
- At logging time, the LogData object is pushed onto a multiprocessing queue (this queue is process-safe).
- One or more workers, depending on the write_mode (sync or async), keep listening on the queues for data.
- While processing, the data is sent to the specific sink based on the sink configs.
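The flow above can be sketched in miniature. This is an illustrative assumption, not the actual implementation: the `Stream`, `LogData`, and `console_sink` names are made up, and a thread stands in for the worker (the real project may use processes), but the shape matches the description: a process-safe multiprocessing queue per level, a worker draining it, and a configurable sink.

```python
import multiprocessing
import threading
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical sketch of the described flow; names are illustrative.

@dataclass
class LogData:
    level: str
    namespace: str
    content: str
    log_time: datetime = field(default_factory=datetime.now)

def console_sink(record):
    # Stand-in for a configurable sink (console, file, PSQL, ...).
    print(f"{record.log_time} [{record.level}] {record.namespace}: {record.content}")

class Stream:
    LEVELS = ("INFO", "DEBUG", "WARN", "ERROR", "FATAL")

    def __init__(self, namespace, sink=console_sink):
        self.namespace = namespace
        self.sink = sink
        self._queues = {}  # one queue + worker per level, created on demand

    def _ensure_worker(self, level):
        if level not in self._queues:
            q = multiprocessing.Queue()  # process-safe queue
            # A thread stands in here for the sync/async worker.
            threading.Thread(target=self._drain, args=(q,), daemon=True).start()
            self._queues[level] = q
        return self._queues[level]

    def _drain(self, q):
        while True:
            record = q.get()      # worker blocks until data arrives
            if record is None:    # sentinel to stop the worker
                break
            self.sink(record)     # route the record to the configured sink

    def log(self, level, content):
        assert level in self.LEVELS
        self._ensure_worker(level).put(LogData(level, self.namespace, content))

# Usage: a NetBanking stream pushing one INFO record through its queue.
stream = Stream("NetBanking")
stream.log("INFO", "user logged in")
```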
PSQL database table:

```sql
CREATE TABLE log_table (
    id serial PRIMARY KEY,
    level VARCHAR(10) NOT NULL,
    mmessage_namespace VARCHAR(50) NOT NULL,
    message_content TEXT NOT NULL,
    log_time TIMESTAMP NOT NULL,
    added_on TIMESTAMP WITH TIME ZONE DEFAULT CURRENT_TIMESTAMP
);
```
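A PSQL sink would write one row per LogData record into this table. As a hedged sketch (the real sink code is not shown here, and `row_params` is a made-up helper), a psycopg2-style parameterized insert matching the schema above might look like:

```python
from datetime import datetime

# Columns id and added_on are omitted so their defaults apply.
# The column name mmessage_namespace is copied exactly from the schema above.
INSERT_SQL = (
    "INSERT INTO log_table "
    "(level, mmessage_namespace, message_content, log_time) "
    "VALUES (%s, %s, %s, %s)"
)

def row_params(level, namespace, content, log_time=None):
    """Build the parameter tuple matching INSERT_SQL's placeholders."""
    return (level, namespace, content, log_time or datetime.now())

# With a live psycopg2 cursor one would then run, e.g.:
#   cur.execute(INSERT_SQL, row_params("INFO", "NetBanking", "user logged in"))
params = row_params("INFO", "NetBanking", "user logged in")
```

Parameterized queries keep the sink safe against SQL injection from log message content.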
## Running Environment:
- Tested in a Jupyter notebook. A Flask app can also be used.
- A PSQL database server is used for testing the logging.
- Stream settings are fetched from Settings/settings.json.
- To run the tests, execute `%run Tests/TestFile.py`.
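The schema of Settings/settings.json is not documented in this README, so the shape below is purely an illustrative assumption (only `write_mode` and the notion of per-stream sinks come from the description above; every other key is invented):

```python
import json

# Hypothetical settings payload: one entry per stream, with its write_mode
# and sink configuration. Keys other than write_mode are assumptions.
EXAMPLE_SETTINGS = """
{
  "NetBanking": {
    "write_mode": "async",
    "sinks": {
      "ERROR": {"type": "psql", "table": "log_table"},
      "INFO": {"type": "console"}
    }
  }
}
"""

settings = json.loads(EXAMPLE_SETTINGS)
netbanking = settings["NetBanking"]  # config for the NetBanking stream
```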
## Further Work:
- Proper managers for queues and workers.
- Removing the Sink's dependency on the LogData object by introducing a SinkObjectInterface.
- Using Python's unittest library for the test cases and making them part of the build process via a single command.
- Testing the cost and performance of using a Redis server for temporary queuing.