ExoNet: Egocentric Images of Walking Environments
- Submitted by: Brokoslaw Laschowski
- Last updated: Fri, 11/29/2024 - 06:53
- DOI: 10.21227/rz46-2n31
Abstract
Egocentric vision is important for environment-adaptive control and navigation of humans and robots. Here we developed ExoNet, the largest open-source dataset of wearable camera images of real-world walking environments. The dataset contains over 5.6 million RGB images of indoor and outdoor environments, which were collected during summer, fall, and winter. 923,000 of the images were human-annotated using a 12-class hierarchical labelling architecture. ExoNet serves as a communal platform to develop, optimize, and compare new deep learning models for egocentric perception, with applications in robotics and computational neuroscience.
Reference:
Laschowski B, McNally W, Wong A, and McPhee J. (2022). Environment Classification for Robotic Leg Prostheses and Exoskeletons using Deep Convolutional Neural Networks. Frontiers in Neurorobotics. DOI: 10.3389/fnbot.2021.730965.
Details on the ExoNet database are provided in the reference above. Please email Dr. Brokoslaw Laschowski (blaschow@uwaterloo.ca) with any additional questions or for technical assistance.
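Since the annotated subset is distributed as a CSV alongside the image archives, a practical first step is to inspect its schema and class balance. Below is a minimal sketch in Python assuming Labels.csv has a header row and that its last column holds the class label; the actual column names are not documented on this page and should be verified against the real file first.

```python
import csv
from collections import Counter

# Inspect Labels.csv and count annotated images per class.
# Column names are assumptions: verify the real header first.
with open("Labels.csv", newline="") as f:
    reader = csv.DictReader(f)
    print("Columns:", reader.fieldnames)
    label_column = reader.fieldnames[-1]  # assumed: last column is the class label
    counts = Counter(row[label_column] for row in reader)

for label, n in counts.most_common():
    print(f"{label}: {n:,}")
```

If the CSV covers exactly the annotated subset, the per-class counts should sum to roughly 923,000 across the 12 classes described in the abstract.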
Dataset Files
- ExoNet_Database.zip (140.29 GB)
- Labels.csv (19.67 MB)
- ExoNet_Images.zip (164.82 GB)
Open Access dataset files are accessible to all logged-in users. A free IEEE account is sufficient; IEEE membership is not required.
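Once an image archive has been downloaded and extracted, the CSV labels can drive a standard training pipeline. The sketch below is a minimal PyTorch Dataset that pairs image files with their class labels; the flat directory layout and the "image"/"label" column names are illustrative assumptions, not the documented schema.

```python
import csv
from pathlib import Path

from PIL import Image
import torch
from torch.utils.data import Dataset

class ExoNetDataset(Dataset):
    """Pairs extracted ExoNet images with their CSV class labels.

    Assumes a flat image directory and a Labels.csv with
    "image" and "label" columns -- verify against the real files.
    """

    def __init__(self, image_dir, labels_csv, transform=None):
        self.image_dir = Path(image_dir)
        self.transform = transform
        with open(labels_csv, newline="") as f:
            rows = list(csv.DictReader(f))
        # Map each class name to an integer index for training.
        classes = sorted({r["label"] for r in rows})
        self.class_to_idx = {c: i for i, c in enumerate(classes)}
        self.samples = [(r["image"], self.class_to_idx[r["label"]]) for r in rows]

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        filename, label = self.samples[idx]
        image = Image.open(self.image_dir / filename).convert("RGB")
        if self.transform is not None:
            image = self.transform(image)
        return image, torch.tensor(label)
```

In practice a torchvision transform (resize, tensor conversion, normalization) would be passed in before wrapping the dataset in a DataLoader for batched training.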
Comments
I need to access this dataset for academic reasons.
What is the difference between ExoNet_Database and ExoNet_Images?
Has anyone actually trained a model on this dataset? I get the feeling that a lot of the labels are wrong.