[NeurIPS2022] Egocentric Video-Language Pretraining
Official code and data for the EgoBody dataset (ECCV 2022)
[CVPR 2024] Official code for EgoGen: An Egocentric Synthetic Data Generator
PyTorch code for EgoHMR (ICCV 2023): Probabilistic Human Mesh Recovery in 3D Scenes from Egocentric Views
Official implementation of Balanced Spherical Grid for Egocentric View Synthesis (CVPR 2023)
[ICCV 2023] Multiple humans in 3D captured by dynamic and static cameras in 4K.
[CVPR 2022] Egocentric Action Target Prediction in 3D
Code for the paper "AMEGO: Active Memory from long EGOcentric videos" published at ECCV 2024
Codebase for "Multimodal Distillation for Egocentric Action Recognition" (ICCV 2023)
EventEgo3D: 3D Human Motion Capture from Egocentric Event Streams [CVPR'24]
Code for the paper "HOI-Ref: Hand-Object Interaction Referral in Egocentric Vision"
(ECCV 2024) Official repository of paper "EgoExo-Fitness: Towards Egocentric and Exocentric Full-Body Action Understanding"
The official PyTorch implementation of the IEEE/CVF Computer Vision and Pattern Recognition (CVPR) '24 paper PREGO: online mistake detection in PRocedural EGOcentric videos.
The champion solution for Ego4D Natural Language Queries Challenge in CVPR 2023
✌️ Hand detection and tracking from FPV: benchmarks and challenges on a rehabilitation exercises dataset
Official implementation of "A Backpack Full of Skills: Egocentric Video Understanding with Diverse Task Perspectives", accepted at CVPR 2024.
[ECCV 2024] Official code release for "Multimodal Cross-Domain Few-Shot Learning for Egocentric Action Recognition"
Official code repository to download the TREK-150 benchmark dataset and run experiments on it.
Official repository of ECCV 2024 paper - "HAT: History-Augmented Anchor Transformer for Online Temporal Action Localization"
A dataset of egocentric vision, eye-tracking, and full-body kinematics from human locomotion in out-of-the-lab environments, with example code for several use cases.