cccZone/tavily-chat: Conversational chat with web access

Tavily Chatbot Example

Tavily Chatbot Demo

👋 Welcome to the Tavily Chatbot Repository!

This repository provides a simple yet powerful example of building a conversational chatbot with real-time web access, leveraging Tavily's advanced search capabilities. It intelligently routes queries between its base knowledge and live Tavily web searches, ensuring accurate, up-to-date responses.

The core implementation is designed for easy customization, so you can extend it to:

  • Integrate proprietary data
  • Modify the chatbot architecture
  • Modify LLMs

Tavily empowers developers to easily create custom chatbots and agents that seamlessly interact with web content.
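
To make the routing idea described above concrete, here is a minimal sketch of a routing step using the OpenAI Python SDK. The model name, prompt, and function are illustrative assumptions, not the repository's actual code:

# Illustrative routing sketch (not the repository's actual implementation).
# Assumes OPENAI_API_KEY is set; the model and prompt are placeholder choices.
from openai import OpenAI

llm = OpenAI()

ROUTE_PROMPT = (
    "Decide whether the question below needs up-to-date information from the web. "
    "Reply with exactly 'web' or 'base'.\n\nQuestion: {question}"
)

def route(question: str) -> str:
    """Return 'web' to trigger a Tavily search, or 'base' to answer from base knowledge."""
    reply = llm.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": ROUTE_PROMPT.format(question=question)}],
    )
    decision = reply.choices[0].message.content.strip().lower()
    return "web" if "web" in decision else "base"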

Features

  • 🔍 Intelligent question routing between base knowledge and web search
  • 🧠 Conversational memory with LangGraph
  • 🌐 Real-time web search capabilities powered by Tavily
  • 🚀 FastAPI backend with async support
  • 🔄 Streaming of Agentic Substeps
  • 💬 Markdown support in chat responses
  • 🔗 Citations for web search results

Architecture Diagram

Chatbot Demo

📂 Repository Structure

This repository includes everything required to create a functional chatbot with web access:

📡 Backend (backend/)

The core backend logic, powered by LangGraph:

  • chatbot.py – Defines the chatbot architecture, state management, and processing nodes.
  • prompts.py – Contains customizable prompt templates.
  • utils.py – Utilities for parsing and managing conversation messages.
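
As a rough picture of how such an architecture fits together, the sketch below wires a router, a Tavily search node, and a response node into a LangGraph graph with an in-memory checkpointer. The node names, state shape, and placeholder logic are assumptions for illustration; see chatbot.py for the actual definitions.

# Hedged sketch of a LangGraph chatbot with a web-search branch.
# The real graph lives in backend/chatbot.py; names and shapes here are guesses.
import os
from typing import Annotated, TypedDict

from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import END, START, StateGraph
from langgraph.graph.message import add_messages
from tavily import TavilyClient

tavily = TavilyClient(api_key=os.environ["TAVILY_API_KEY"])

class ChatState(TypedDict):
    messages: Annotated[list, add_messages]  # conversation history
    route: str                               # "web" or "base"

def router(state: ChatState) -> dict:
    # Placeholder heuristic; the repository routes with an LLM call instead.
    question = state["messages"][-1].content
    return {"route": "web" if "latest" in question.lower() else "base"}

def web_search(state: ChatState) -> dict:
    question = state["messages"][-1].content
    results = tavily.search(question, max_results=3)
    context = "\n".join(r["content"] for r in results["results"])
    return {"messages": [("system", f"Web context:\n{context}")]}

def respond(state: ChatState) -> dict:
    # Placeholder answer; the real node calls the LLM with the gathered context.
    return {"messages": [("assistant", "Answer grounded in the context above.")]}

builder = StateGraph(ChatState)
builder.add_node("router", router)
builder.add_node("web_search", web_search)
builder.add_node("respond", respond)
builder.add_edge(START, "router")
builder.add_conditional_edges("router", lambda s: s["route"],
                              {"web": "web_search", "base": "respond"})
builder.add_edge("web_search", "respond")
builder.add_edge("respond", END)
chatbot = builder.compile(checkpointer=MemorySaver())  # memory across turns

Invoking the compiled graph with a fixed thread_id, for example chatbot.invoke({"messages": [("user", "hi")]}, {"configurable": {"thread_id": "demo"}}), is what gives the chatbot conversational memory across turns.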

🌐 Frontend (ui/)

Interactive React frontend for dynamic user interactions and chatbot responses.

Server

  • app.py – FastAPI server that handles API endpoints and streaming responses.
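
A stripped-down version of such a server might look like the sketch below. The /stream_agent route name matches the endpoint listed later in this README, but the request shape, event format, and the imported chatbot graph are assumptions, not app.py's actual code.

# Hedged sketch of a FastAPI server streaming LangGraph updates.
from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from pydantic import BaseModel

from chatbot import chatbot  # hypothetical import; adapt to however backend/chatbot.py exposes its compiled graph

app = FastAPI()

class ChatRequest(BaseModel):
    message: str
    thread_id: str = "default"

@app.post("/stream_agent")
async def stream_agent(req: ChatRequest):
    config = {"configurable": {"thread_id": req.thread_id}}

    async def event_stream():
        # Emit one line per agent substep as the graph executes.
        async for update in chatbot.astream(
            {"messages": [("user", req.message)]}, config, stream_mode="updates"
        ):
            yield f"{update}\n"

    return StreamingResponse(event_stream(), media_type="text/plain")

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8080)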

Setup Instructions

Set up environment variables:

a. Create a .env file in the root directory with:

TAVILY_API_KEY="your-tavily-api-key"
OPENAI_API_KEY="your-openai-api-key"
VITE_APP_URL=http://localhost:5173

b. Create a .env file in the ui directory with:

VITE_BACKEND_URL=http://localhost:8080
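
For reference, a common way for a Python backend to pick these values up is python-dotenv, sketched below; the repository may load them differently (for example, the Docker setup passes them in via --env-file).

# Hedged sketch: reading the keys with python-dotenv.
import os
from dotenv import load_dotenv

load_dotenv()  # reads the .env file from the current working directory

tavily_key = os.environ["TAVILY_API_KEY"]
openai_key = os.environ["OPENAI_API_KEY"]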

Backend Setup

Python Virtual Environment

  1. Create a virtual environment and activate it:
python3 -m venv venv
source venv/bin/activate  # On Windows: .\venv\Scripts\activate
  2. Install dependencies:
python3 -m pip install -r requirements.txt
  3. From the root of the project, run the backend server:
python app.py

Docker

  1. Alternatively, build and run the backend using Docker from the root of the project:
# Build the Docker image
docker build -t chat-tavily .

# Run the container
docker run -p 8080:8080 --env-file .env chat-tavily

Frontend Setup

  1. Navigate to the frontend directory:
cd ui
  2. Install dependencies:
npm install
  3. Start the development server:
npm run dev

API Endpoints

  • POST /stream_agent: Chat endpoint that handles streamed LangGraph execution
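
As an illustration, the endpoint can be consumed as a streamed response. The payload fields below are assumptions, so check the frontend code in ui/ for the exact request shape the server expects.

# Hedged example of calling the streaming endpoint with `requests`.
import requests

with requests.post(
    "http://localhost:8080/stream_agent",
    json={"message": "What's new with Tavily?", "thread_id": "demo"},
    stream=True,
) as resp:
    resp.raise_for_status()
    for line in resp.iter_lines(decode_unicode=True):
        if line:
            print(line)  # each line is one streamed agent substep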

Contributing

Feel free to submit issues and enhancement requests!

📞 Contact Us

Have questions or feedback, or are you looking to build something custom? We'd love to hear from you!



Powered by Tavily - The Web API Built for AI Agents
