Starred repositories
[Lumina Embodied AI Community] Embodied-AI-Guide: a technical guide to embodied AI
Qwen3 is the large language model series developed by the Qwen team at Alibaba Cloud.
🚀🚀 [LLM] Train a small 26M-parameter GPT completely from scratch in just 2 hours! 🌏
🚀 [LLM] Train a 26M-parameter visual multimodal VLM from scratch in just 1 hour! 🌏
Python tool for converting files and office documents to Markdown.
Subdomain enumeration and information gathering tool
Chinese translation of the LLMs-from-scratch project
Build a large language model from scratch with only basic Python; step-by-step builds of GLM4, Llama3, and RWKV6 for a deep understanding of how LLMs work
Implement a ChatGPT-like LLM in PyTorch from scratch, step by step
🌟 The Multi-Agent Framework: First AI Software Company, Towards Natural Language Programming
No fortress, purely open ground. OpenManus is Coming.
Use LLMs to dig out what you care about from massive amounts of information and a variety of sources daily.
Neo4j graph construction from unstructured data using LLMs
Stay on top of trending topics on social media and the web with AI
思通舆情 is a free, open-source public-opinion monitoring system that supports on-premises deployment. It performs multi-dimensional cross analysis and deep mining of massive public-opinion data, providing users with comprehensive opinion data and professional opinion analysis.
Code for the paper "Reliable evaluation of adversarial robustness with an ensemble of diverse parameter-free attacks"
[ICLR 2021] Tent: Fully Test-Time Adaptation by Entropy Minimization
Official PyTorch implementation of Learning to (Learn at Test Time): RNNs with Expressive Hidden States
This is an Online Test-Time Adaptation (OTTA) benchmark built on ViT backbones. The paper has been accepted at IJCV.
Test-time Prompt Tuning (TPT) for zero-shot generalization in vision-language models (NeurIPS 2022)
Code for the ICLR 2023 oral paper "Towards Stable Test-Time Adaptation in Dynamic Wild World"
ImageNet-Sketch: a data set for evaluating a model's ability to learn (out-of-domain) semantics at ImageNet scale
[NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA) built towards GPT-4V level capabilities and beyond.