rioliu-rh/rag-chatbot

RAG-based chatbot app

Loads documents from a Confluence space. Users can chat with a vLLM-served model using context retrieved from the vector database. This is an experiment in building an AI-enabled knowledge base.
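
The query path is retrieve-then-generate: look up the most relevant Confluence chunks in the Chroma store, then pass them as context to the OpenAI-compatible LLM endpoint. Below is a minimal sketch of that flow, assuming the langchain-chroma and langchain-openai integrations and the environment variables listed under Usage; the question string and prompt wording are illustrative, not code from this repository.

import os

from langchain_chroma import Chroma
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

# Reopen the persisted Chroma collection with the same embedding service used at index time
# (an explicit model= argument may also be needed, depending on the embedding provider).
embeddings = OpenAIEmbeddings(
    base_url=os.environ["EMBEDDING_API_URL"],
    api_key=os.environ["EMBEDDING_API_KEY"],
)
vector_store = Chroma(
    collection_name=os.environ["CHROMA_DB_COLLECTION_NAME"],
    persist_directory=os.environ["CHROMA_DB_PERSISTENT_DIR"],
    embedding_function=embeddings,
)
llm = ChatOpenAI(
    base_url=os.environ["LLM_API_URL"],
    api_key=os.environ["LLM_API_KEY"],
    model=os.environ["LLM_MODEL_NAME"],
)

question = "How do I onboard a new team member?"
# Retrieve the top-k most similar chunks and stuff them into the prompt as context.
docs = vector_store.similarity_search(question, k=4)
context = "\n\n".join(doc.page_content for doc in docs)
answer = llm.invoke(f"Answer using only this context:\n{context}\n\nQuestion: {question}")
print(answer.content)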

Usage

  • Create an env file (.env or another path); the required OS environment variables are listed below
  LLM_API_URL=<your LLM api url, currently only supports OpenAI compatible API>
  LLM_API_KEY=<your api key generated by LLM service provider>
  LLM_MODEL_NAME=<model name>
  EMBEDDING_API_URL=<embedding AI model api url, e.g. nomic-embed-text>
  EMBEDDING_API_KEY=<api key generated by embedding AI service provider>
  CONLFUENCE_URL=<Confluence base url e.g. https://spaces.mycompany.com/>
  CONLFUENCE_TOKEN=<private token generated on Confluence>
  CONLFUENCE_SPACE_KEY=<your project key>
  CHROMA_DB_COLLECTION_NAME=confluence_index
  CHROMA_DB_PERSISTENT_DIR=/tmp/confluence_rag/chroma_langchain_db
  SQLLITE_DB_FILE_PATH=//tmp/confluence_rag/record_manager_cache
  • Install this app locally
$ pip3 install -e .
  • Index docs from Confluence into the local vector store (Chroma-based)
$ python3 chatbot/doc/index.py --help
Usage: index.py [OPTIONS]

Options:
  --env-file-path TEXT  env file path contains the required os env vars
  --help                Show this message and exit.

Run the above command from a cron job; the vector store is updated incrementally (a cron example and a sketch of the incremental indexing step follow at the end of this section).

$ python3 chatbot/doc/index.py
{'num_added': 0, 'num_updated': 0, 'num_skipped': 60, 'num_deleted': 0}
  • Launch the chatbot UI (Streamlit-based); a sketch of a minimal chat window follows at the end of this section
$ streamlit run chatbot/ui/chatwindow.py
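
The indexing command can be scheduled from cron for periodic refreshes, for example:

0 2 * * * cd /path/to/rag-chatbot && python3 chatbot/doc/index.py --env-file-path /path/to/.env

The counters printed by index.py (num_added / num_updated / num_skipped / num_deleted) and the record_manager_cache SQLite file suggest LangChain's indexing API with a SQLRecordManager tracking what has already been embedded. A plausible sketch of the incremental indexing step under those assumptions (not the repository's actual index.py; exact ConfluenceLoader arguments depend on the langchain-community version):

import os

from dotenv import load_dotenv
from langchain.indexes import SQLRecordManager, index
from langchain_chroma import Chroma
from langchain_community.document_loaders import ConfluenceLoader
from langchain_openai import OpenAIEmbeddings

# Load the env file described above (python-dotenv assumed); index.py exposes
# --env-file-path for the same purpose.
load_dotenv()

loader = ConfluenceLoader(
    url=os.environ["CONLFUENCE_URL"],
    token=os.environ["CONLFUENCE_TOKEN"],
    space_key=os.environ["CONLFUENCE_SPACE_KEY"],
)
vector_store = Chroma(
    collection_name=os.environ["CHROMA_DB_COLLECTION_NAME"],
    persist_directory=os.environ["CHROMA_DB_PERSISTENT_DIR"],
    embedding_function=OpenAIEmbeddings(
        base_url=os.environ["EMBEDDING_API_URL"],
        api_key=os.environ["EMBEDDING_API_KEY"],
    ),
)
# The record manager stores document hashes in SQLite, so unchanged pages are skipped on
# re-runs and only added/updated/deleted pages touch the vector store. With the example
# value above, the URL becomes sqlite:////tmp/confluence_rag/record_manager_cache.
record_manager = SQLRecordManager(
    namespace="chroma/" + os.environ["CHROMA_DB_COLLECTION_NAME"],
    db_url="sqlite://" + os.environ["SQLLITE_DB_FILE_PATH"],
)
record_manager.create_schema()

result = index(
    loader.load(),
    record_manager,
    vector_store,
    cleanup="incremental",
    source_id_key="source",
)
print(result)  # e.g. {'num_added': 0, 'num_updated': 0, 'num_skipped': 60, 'num_deleted': 0}

The Streamlit chat window can be as small as the sketch below; the answer_with_context helper is a hypothetical stand-in for the retrieve-then-generate flow sketched earlier, and the real chatwindow.py may differ.

import streamlit as st


def answer_with_context(question: str) -> str:
    # Hypothetical stand-in for retrieval + LLM call (see the earlier sketch).
    return f"(answer for: {question})"


st.title("Confluence RAG chatbot")

# Streamlit reruns the script on every interaction, so keep the history in session state.
if "messages" not in st.session_state:
    st.session_state.messages = []

for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.markdown(message["content"])

if prompt := st.chat_input("Ask about your Confluence space"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.markdown(prompt)
    reply = answer_with_context(prompt)
    st.session_state.messages.append({"role": "assistant", "content": reply})
    with st.chat_message("assistant"):
        st.markdown(reply)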
