TensorZero is an open-source stack for industrial-grade LLM applications. It unifies an LLM gateway, observability, optimization, evaluation, and experimentation.
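To make the gateway idea concrete, here is a minimal sketch of a client posting a chat-style inference request to a gateway running locally over HTTP. The port, endpoint path, function name, and payload shape are assumptions for illustration, not a documented TensorZero API.

```rust
// Cargo.toml (assumed): reqwest = { version = "0.12", features = ["blocking", "json"] }
//                       serde_json = "1"
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = reqwest::blocking::Client::new();

    // Post a chat-style inference request to the locally running gateway.
    let response = client
        .post("http://localhost:3000/inference") // assumed port and path
        .json(&json!({
            "function_name": "summarize_logs", // hypothetical function configured in the gateway
            "input": {
                "messages": [
                    { "role": "user", "content": "Summarize today's deployment logs." }
                ]
            }
        }))
        .send()?
        .error_for_status()?;

    // The gateway responds with JSON describing the model output; print it for inspection.
    println!("{}", response.text()?);
    Ok(())
}
```

Routing every call through a single gateway process like this is what lets the same stack layer on observability, evaluation, and experimentation: the component that forwards requests can also record inputs, outputs, and latencies.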
LSP-AI is an open-source language server that serves as a backend for AI-powered functionality, designed to assist and empower software engineers, not replace them.
Instant, controllable, local pre-trained AI models in Rust
Believes in AI democratization: LLaMA for Node.js backed by llama-rs, llama.cpp, and rwkv.cpp; runs locally on your laptop CPU. Supports llama, alpaca, gpt4all, vicuna, and rwkv models.
Rust bindings for llama.cpp
Like grep but for natural language questions. Based on Mistral 7B or Mixtral 8x7B.
Production-Ready LLM Agent SDK for Every Developer
Run open-source/open-weight LLMs locally with OpenAI-compatible APIs
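Because such servers speak the standard OpenAI chat-completions protocol, an ordinary request works unchanged once it is pointed at the local endpoint. A minimal sketch follows; the base URL and model name are placeholders and depend on how the server is started.

```rust
// Cargo.toml (assumed): reqwest = { version = "0.12", features = ["blocking", "json"] }
//                       serde_json = "1"
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Point the client at the local server instead of api.openai.com.
    let base_url = "http://localhost:8080/v1"; // placeholder address
    let client = reqwest::blocking::Client::new();

    let response = client
        .post(format!("{base_url}/chat/completions"))
        .json(&json!({
            "model": "llama-3-8b-instruct", // whichever model the server has loaded
            "messages": [
                { "role": "user", "content": "Write a haiku about Rust." }
            ]
        }))
        .send()?
        .error_for_status()?;

    // The response follows the OpenAI chat-completions JSON schema;
    // pull out the assistant's reply text.
    let body: serde_json::Value = response.json()?;
    if let Some(reply) = body["choices"][0]["message"]["content"].as_str() {
        println!("{reply}");
    }
    Ok(())
}
```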
LLaMA 7B with CUDA acceleration implemented in Rust. Minimal GPU memory needed!
A collection of serverless apps that show how Fermyon's Serverless AI (currently in private beta) works. Reference: https://developer.fermyon.com/spin/serverless-ai-tutorial
A proxy that exposes DuckDuckGo AI through an OpenAI-compatible API