Stanford University
- https://www.chenwangjeremy.net
Stars
Official repository of "SAMURAI: Adapting Segment Anything Model for Zero-Shot Visual Tracking with Motion-Aware Memory"
Codebase for Automated Creation of Digital Cousins for Robust Policy Learning
CUDA Python: Performance meets Productivity
Python3 library for controlling Kuka iiwa from an external PC
🕊️ HATO: Learning Visuotactile Skills with Two Multifingered Hands
Universal Monocular Metric Depth Estimation
[RSS 2024] "DexCap: Scalable and Portable Mocap Data Collection System for Dexterous Manipulation" code repository
CoTracker is a model for tracking any point (pixel) in a video.
VoxPoser: Composable 3D Value Maps for Robotic Manipulation with Language Models
TensorRT-LLM provides users with an easy-to-use Python API to define Large Language Models (LLMs) and supports state-of-the-art optimizations for efficient inference on NVIDIA GPUs.
Code base for See to Touch project: https://see-to-touch.github.io/
"MimicPlay: Long-Horizon Imitation Learning by Watching Human Play" code repository
"Sequential Dexterity: Chaining Dexterous Policies for Long-Horizon Manipulation" code repository
Benchmarking Knowledge Transfer in Lifelong Robot Learning
Web Based Visualizer for Simulation Environments
A PyTorch implementation of Perceiver, Perceiver IO and Perceiver AR with PyTorch Lightning scripts for distributed training
The repository provides code for running inference with the Segment Anything Model (SAM), links for downloading the trained model checkpoints, and example notebooks showing how to use the model.
OmniGibson: a platform for accelerating Embodied AI research built upon NVIDIA's Omniverse engine. Join our Discord for support: https://discord.gg/bccR5vGFEx
A modular, real-time controller library for Franka Emika Panda robots
Forked from https://github.com/simlabrobotics/allegro_hand_linux_v4