gguf
Here are 99 public repositories matching this topic...
Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models locally, and with Ollama and OpenAI models remotely.
Updated Nov 4, 2024 - Dart
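The entry above splits work between a local llama.cpp/GGUF backend and remote Ollama or OpenAI backends. As a rough illustration of what the "remote Ollama" side looks like at the API level (not Maid's actual Dart code), here is a minimal Python sketch against Ollama's documented /api/generate endpoint; the model name is an assumption.

```python
# Minimal sketch of the kind of remote call an app like Maid makes to a
# local Ollama server; the model name is an assumption for illustration.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",   # Ollama's default endpoint
    json={
        "model": "llama3.2",                 # any model already pulled into Ollama
        "prompt": "Explain what a GGUF file is in one sentence.",
        "stream": False,                     # return a single JSON response
    },
    timeout=120,
)
print(resp.json()["response"])
```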
LLM agent framework in ComfyUI. Includes Omost, GPT-SoVITS, ChatTTS, GOT-OCR2.0, and FLUX prompt nodes; provides access to Feishu and Discord; and adapts to any LLM with an OpenAI- or aisuite-compatible interface, such as o1, Ollama, Gemini, Grok, Qwen, GLM, DeepSeek, Moonshot, and Doubao. Also supports local LLMs, VLMs, and GGUF models such as Llama 3.2, plus linkage GraphRAG / RAG.
Updated Dec 3, 2024 - Python
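The "OpenAI-compatible interface" pattern mentioned in the entry above usually means pointing the standard openai client at a local server (llama.cpp's server, Ollama, etc.) instead of api.openai.com. A minimal sketch, assuming a local endpoint on port 8080 and an illustrative model name:

```python
# Sketch of the OpenAI-compatible-interface pattern: the base URL and model
# name below are assumptions, not values taken from the framework above.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # local OpenAI-compatible endpoint
    api_key="not-needed-locally",         # most local servers ignore the key
)

chat = client.chat.completions.create(
    model="llama-3.2-3b-instruct",        # whatever model the local server exposes
    messages=[{"role": "user", "content": "Say hello in five words."}],
)
print(chat.choices[0].message.content)
```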
Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
Updated Dec 2, 2024 - TypeScript
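The entry above constrains output to a full JSON schema during sampling, in TypeScript. As a rough Python analogue of the same generation-level idea (not that repo's API), llama-cpp-python's JSON mode restricts the sampler to valid JSON; the model path below is an assumption.

```python
# Rough Python analogue of generation-level JSON constraints (the repo above
# does this in TypeScript with a full JSON schema). Model path is an assumption.
from llama_cpp import Llama

llm = Llama(model_path="./models/llama-3.2-3b-instruct-q4_k_m.gguf", n_ctx=2048)

out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "Answer with a JSON object containing 'name' and 'year'."},
        {"role": "user", "content": "Who created the C programming language, and when?"},
    ],
    response_format={"type": "json_object"},  # sampler is constrained to valid JSON
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```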
Practical Llama 3 inference in Java
Updated Nov 29, 2024 - Java
An open source DevOps tool for packaging and versioning AI/ML models, datasets, code, and configuration into an OCI artifact.
Updated Dec 3, 2024 - Go
Go library for embedded vector search and semantic embeddings using llama.cpp
Updated Oct 28, 2024 - Go
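The Go library above pairs llama.cpp embeddings with vector search. A Python sketch of the same idea, using llama-cpp-python for embeddings and a tiny brute-force cosine-similarity search; the embedding model file is an assumption.

```python
# Python sketch of the same idea as the Go library above: semantic embeddings
# from a GGUF model plus a tiny brute-force vector search.
import math
from llama_cpp import Llama

# Assumed local embedding model file, for illustration only.
emb = Llama(model_path="./models/nomic-embed-text-v1.5.Q4_K_M.gguf", embedding=True)

def embed(text: str) -> list[float]:
    return emb.create_embedding(text)["data"][0]["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

docs = [
    "GGUF is a binary file format for llama.cpp models.",
    "The capital of France is Paris.",
]
index = [(d, embed(d)) for d in docs]

query = embed("What file format does llama.cpp use?")
best = max(index, key=lambda item: cosine(query, item[1]))
print(best[0])
```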
Search for anything using Google, DuckDuckGo, and phind.com. Includes AI models, YouTube video transcription, temporary email and phone number generation, TTS support, WebAI (terminal GPT and Open Interpreter), and offline LLMs.
Updated Dec 1, 2024 - Python
Making offline AI models accessible to all types of edge devices.
Updated Feb 12, 2024 - Dart
Gradio-based tool to run open-source LLMs directly from Hugging Face.
Updated Jun 27, 2024 - Python
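The entry above describes a Gradio front end over Hugging Face models. A minimal sketch of that pattern, assuming an example GGUF repo and filename on the Hub (not the tool's actual configuration): download the file with hf_hub_download and wrap llama-cpp-python in a gr.Interface.

```python
# Minimal sketch of the pattern the entry above describes: a Gradio text box in
# front of a GGUF model pulled from Hugging Face. The repo_id and filename are
# assumed examples, not the tool's actual configuration.
import gradio as gr
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="bartowski/Llama-3.2-3B-Instruct-GGUF",   # assumed example repo
    filename="Llama-3.2-3B-Instruct-Q4_K_M.gguf",     # assumed quantization file
)
llm = Llama(model_path=model_path, n_ctx=2048)

def generate(prompt: str) -> str:
    out = llm(prompt, max_tokens=256)
    return out["choices"][0]["text"]

gr.Interface(fn=generate, inputs="text", outputs="text",
             title="GGUF demo").launch()
```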