Stars
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
A GPT-4 AI Tutor Prompt for customizable personalized learning experiences.
🆓 List of free ChatGPT mirror sites, continuously updated.
Code for NeurIPS 2022 Paper, "Poisson Flow Generative Models" (PFGM)
[IROS 2024] A LiDAR-inertial odometry (LIO) package that can adjust the execution frequency beyond the sweep frequency
NeRF (Neural Radiance Fields) and NeRF in the Wild using pytorch-lightning
A curated list of awesome neural radiance fields papers
Source code and errata (see Issues) for the book 《深入浅出 PyTorch——从模型到源码》 (Dive into PyTorch: From Models to Source Code)
Chinese translation of Bjarne Stroustrup's HOPL4 paper
2021 collection of mind maps covering C/C++, Golang, Linux, cloud native, databases, DPDK, audio/video development, TCP/IP, data structures, computer fundamentals, and more
Free censorship-circumvention resources: free ss/v2ray/trojan nodes, Lantern, Google Play access, and other free proxy tools
🚀 AI voice cloning: clone a voice in 5 seconds to generate arbitrary speech in real time
In-depth tutorials on deep learning. The first one is about image colorization using GANs (Generative Adversarial Nets).
MulimgViewer is a multi-image viewer that opens multiple images in one interface, convenient for image comparison and image stitching.
Learn OpenCV: C++ and Python examples
HTML5 video speed controller (for Google Chrome)
🥗 All-in-one professional pop-up dictionary and page translator which supports multiple search modes, page translations, new word notebook and PDF selection searching.
English note: this article, originally written by the well-known hacker Eric S. Raymond, teaches you how to ask technical questions the right way and get answers you will be satisfied with.
🍀 PyTorch implementations of various attention mechanisms, MLPs, re-parameterization techniques, and convolutions, helpful for further understanding the papers. ⭐⭐⭐
[ECCV 2020] Official PyTorch implementation of 'Gen-LaneNet: a generalized and scalable approach for 3D lane detection'
Improved Road Connectivity by Joint Learning of Orientation and Segmentation (CVPR2019)
Pretrain and finetune ANY AI model of ANY size on multiple GPUs and TPUs with zero code changes.
[CVPR2020] f-BRS: Rethinking Backpropagating Refinement for Interactive Segmentation https://arxiv.org/abs/2001.10331