Extended Reality to Assess Short-Term Spatial Memory—A Comparative Study of Mixed Reality, Augmented Reality, and Virtual Reality
Figure 1. The object selection menu for setting up a task.
Figure 2. Selected 3D objects for the MR application: (A) a violin; (B) a set of books; (C) a mug; (D) an earth globe; (E) a frog-shaped decorative object; (F) a toy car; (G) a camera; and (H) a mushroom-shaped decorative object.
Figure 3. Room layout and object positions used in the MR study. The objects are highlighted with red-bordered squares.
Figure 4. Virtual objects in the physical environment as seen by the participant.
Figure 5. Box plots comparing the performance variables for the MR (red), AR (green), and VR (blue) applications.
Figure 6. Interaction plots for the performance variables in the MR application, taking into account participants' age and gender. Triangles connected by black solid lines represent the women's group; red rhombuses connected by red dashed lines represent the men's group. Lines connect only consecutive points within the same group, to highlight trends and avoid overlaps.
Figure 7. Radial graph showing the means of the subjective variables for the MR, AR, and VR applications.
Figure 8. Scatter plots for the positive correlations between satisfaction and enjoyment and between presence and enjoyment. The red line is the fitted regression line; the green band is the 95% confidence interval for predictions from the linear model.
Figure 9. Scatter plots for the negative correlations between concentration and computer experience (left) and between usability and computer experience (right). The red line is the fitted regression line; the green band is the 95% confidence interval for predictions from the linear model.
Abstract
1. Introduction
1.1. Spatial Memory
1.2. Technology-Assisted Assessment of Spatial Memory
2. Materials and Methods
2.1. Participants
2.2. Measures
2.2.1. Performance Variables
- Total Objects: The total number of objects correctly placed at the end of the evaluation phase. The minimum is 0, and the maximum is 8 if the participant placed all of the objects correctly. The application considers an object to be correctly placed if it is in the correct location with a margin of error of approximately 50 cm.
- Total Attempts: The total number of failed attempts. This variable stores the total number of failed attempts to place an object by the user, summing the number of failures for each object. The minimum is 0, and the maximum is 24 (3 for each of the 8 objects).
- Learning Time: The time (in seconds) that the user needs to memorize the positions of the objects. The user must touch each object with their hand once they think they have memorized its position. When the user touches the last object, the application registers the time spent. The order in which the objects are memorized is not taken into account; it is at the user's discretion.
- Evaluation Time: The time (in seconds) taken by the user to place all of the objects in their positions during the evaluation phase. The time is automatically saved when the participant places the last object or fails for the third time.
2.2.2. Other Performance Variables
2.2.3. Subjective Variables
2.3. Procedure
2.3.1. MR Task
- The Mixed Reality application
- Hardware and software
- MR task
- (1)
- The familiarization phase: The first phase consists of placing three virtual objects on three markers that appear to be positioned in the real world. The objective of this phase is to familiarize the participants with the operation of the HoloLens 2 headset and the MR application. No data are collected during this phase.
- (2)
- The learning phase: In this phase, the participants can visualize eight virtual objects that are located in the real world. The participants are informed that they can take as much time as they need to memorize the position of each object. When the participants think that they can remember the position of an object, they must touch it with their hand, and the object lights up. When all eight objects have been touched, the time is recorded, and the phase ends. Figure 4 shows the participants’ point of view of the room with the eight objects to be memorized.
- (3)
- The evaluation phase: In this phase, a previously memorized object appears next to the participant (the starting area). The participant must pick up the object and place it where they remember it being during the learning phase. If the object is placed correctly, a new object appears in the starting area, and this action is repeated until all eight objects have been placed. If the participant places an object in the wrong position, the application signals the error with a sound. The participant is allowed up to three attempts per object; if all three attempts fail, the application moves on to the next object. The successes, attempts, and total time spent are recorded automatically by the application.
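In the evaluation phase, the protocol reduces to a small scoring loop: up to three placement attempts per object, with success defined by the roughly 50 cm tolerance given in Section 2.2.1, and the Total Objects and Total Attempts counters updated along the way. The following is a minimal, hypothetical Python sketch of that logic only; the actual application runs on a HoloLens 2 and was built with Unity and MRTK, so `run_evaluation`, `get_placement`, and the object list here are illustrative stand-ins, and timing is omitted.

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

PLACEMENT_TOLERANCE_M = 0.5  # ~50 cm margin of error (Section 2.2.1)
MAX_ATTEMPTS = 3             # per-object attempt limit (evaluation phase)

def run_evaluation(objects, get_placement):
    """Score one evaluation phase.

    objects: list of (name, target_position) pairs.
    get_placement: callable returning the position where the participant
    placed the named object (hypothetical stand-in for the headset's
    tracking of the grabbed object).
    Returns (total_objects, total_attempts, log); total_attempts counts
    failed placements only, as in Section 2.2.1.
    """
    total_objects, total_attempts, log = 0, 0, []
    for name, target in objects:
        for attempt in range(1, MAX_ATTEMPTS + 1):
            placed = get_placement(name)
            if dist(placed, target) <= PLACEMENT_TOLERANCE_M:
                total_objects += 1  # Total Objects: correctly placed
                log.append((name, attempt, True))
                break
            total_attempts += 1     # Total Attempts: failed placements
        else:  # three failures: move on to the next object
            log.append((name, MAX_ATTEMPTS, False))
    return total_objects, total_attempts, log
```

In this sketch a repeated failure simply re-queries the same placement; in the real application, each attempt corresponds to a new physical placement by the participant.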
2.3.2. Map Task
2.3.3. Questionnaires
3. Results
3.1. Performance Variables
3.2. Gender
3.3. Subjective Variables
3.4. Relationships Between the Remaining Subjective Variables and Experience with Computers and Video Games
4. Discussion
- A more immersive experience can be created by seamlessly blending digital content with the real world.
- The freedom to interact with digital content without holding or manipulating the device allows for more natural and intuitive interactions.
- Because MR headsets project digital content directly into the user’s field of view, there are fewer environmental distractions compared to viewing content on a mobile device screen.
- Headsets are becoming more ergonomically designed, while mobile AR requires users to hold and manipulate a device.
- The environment is real and does not need to be modeled. The user’s home, a therapist’s room, or any other chosen location can be used.
- MR allows users to maintain awareness of their real environment while interacting with virtual content, increasing safety and enabling collaboration with others in physical space. In contrast, VR isolates users from the real world, which can lead to disorientation and safety issues.
- With optical see-through headsets, users can interact with virtual and physical objects using natural gestures and movements in their real environment. In VR with headsets, interaction is mainly limited to virtual objects within the modeled environment.
- MR with optical see-through headsets enables social interaction and collaboration among users in the same physical space, fostering communication and teamwork. VR with headsets tends to isolate users in individual virtual environments, limiting social interaction to online platforms.
- MR allows virtual objects to interact with real-world objects and surfaces. VR with headsets lacks this capability because the virtual content is isolated from the physical world.
- MR preserves spatial cues and depth perception from the real world, allowing users to accurately perceive distances and spatial relationships between objects. In contrast, VR with headsets generates spatial cues only in the virtual environment, which can lead to discrepancies between perceived and actual distances.
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Buttussi, F.; Chittaro, L. Acquisition and Retention of Spatial Knowledge through Virtual Reality Experiences: Effects of VR Setup and Locomotion Technique. Int. J. Hum. Comput. Stud. 2023, 177, 103067. [Google Scholar] [CrossRef]
- Chan, E.; Baumann, O.; Bellgrove, M.A.; Mattingley, J.B. Negative Emotional Experiences during Navigation Enhance Parahippocampal Activity during Recall of Place Information. J. Cogn. Neurosci. 2014, 26, 154–164. [Google Scholar] [CrossRef] [PubMed]
- Zimmermann, K.; Eschen, A. Brain Regions Involved in Subprocesses of Small-Space Episodic Object-Location Memory: A Systematic Review of Lesion and Functional Neuroimaging Studies. Memory 2017, 25, 487–519. [Google Scholar] [CrossRef] [PubMed]
- Baddeley, A. Working Memory. Science 1992, 255, 556–559. [Google Scholar] [CrossRef] [PubMed]
- Allen, R.J.; Baddeley, A.D.; Hitch, G.J. Evidence for Two Attentional Components in Visual Working Memory. J. Exp. Psychol. Learn. Mem. Cogn. 2014, 40, 1499–1509. [Google Scholar] [CrossRef]
- Allen, R.J.; Castellà, J.; Ueno, T.; Hitch, G.J.; Baddeley, A.D. What Does Visual Suffix Interference Tell Us about Spatial Location in Working Memory? Mem. Cognit. 2015, 43, 133–142. [Google Scholar] [CrossRef]
- Ruddle, R.A.; Lessels, S. The Benefits of Using a Walking Interface to Navigate Virtual Environments. ACM Trans. Comput. Interact. 2009, 16, 1–18. [Google Scholar] [CrossRef]
- Singh, J.; Singh, G.; Verma, R.; Prabha, C. Exploring the Evolving Landscape of Extended Reality (XR) Technology. In Proceedings of the 2023 3rd International Conference on Smart Generation Computing, Communication and Networking (SMART GENCON), Bangalore, India, 29–31 December 2023; pp. 1–6. [Google Scholar]
- Milgram, P.; Kishino, F. A Taxonomy of Mixed Reality Visual Displays. IEICE Trans. Inf. Syst. 1994, 77, 1321–1329. [Google Scholar]
- Azuma, R.T. A Survey of Augmented Reality. Presence Teleoper. Virtual Environ. 1997, 6, 355–385. [Google Scholar] [CrossRef]
- Koleva, B.; Benford, S.; Greenhalgh, C. The Properties of Mixed Reality Boundaries. In ECSCW ’99, Proceedings of the Sixth European Conference on Computer Supported Cooperative Work, Copenhagen, Denmark, 12–16 September 1999; Bødker, S., Kyng, M., Schmidt, K., Eds.; Springer: Dordrecht, The Netherlands, 1999; pp. 119–137. ISBN 978-94-011-4441-4. [Google Scholar]
- Lindeman, R.W.; Noma, H. A Classification Scheme for Multi-Sensory Augmented Reality. In Proceedings of the 2007 ACM Symposium on Virtual Reality Software and Technology, Newport Beach, CA, USA, 5–7 November 2007; Association for Computing Machinery: New York, NY, USA, 2007; pp. 175–178. [Google Scholar]
- Mann, S.; Furness, T.; Yuan, Y.; Iorio, J.; Wang, Z. All Reality: Virtual, Augmented, Mixed (X), Mediated (X,Y), and Multimediated Reality. arXiv 2018, arXiv:1804.08386. [Google Scholar]
- Speicher, M.; Hall, B.D.; Nebeling, M. What Is Mixed Reality? In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, Scotland, UK, 4–9 May 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 1–15. [Google Scholar]
- Skarbez, R.; Smith, M.; Whitton, M.C. Revisiting Milgram and Kishino’s Reality-Virtuality Continuum. Front. Virtual Real. 2021, 2, 647997. [Google Scholar] [CrossRef]
- Parveau, M.; Adda, M. 3iVClass: A New Classification Method for Virtual, Augmented and Mixed Realities. Procedia Comput. Sci. 2018, 141, 263–270. [Google Scholar] [CrossRef]
- Kara, M.; Çakıcı Alp, N. Assessing the Adoption of the Yavuz Battleship Application in the Mixed Reality Environment Using the Technology Acceptance Model. Multimed. Syst. 2024, 30, 76. [Google Scholar] [CrossRef]
- Munoz-Montoya, F.; Juan, M.C.; Mendez-Lopez, M.; Molla, R.; Abad, F.; Fidalgo, C. SLAM-Based Augmented Reality for the Assessment of Short-Term Spatial Memory. A Comparative Study of Visual versus Tactile Stimuli. PLoS ONE 2021, 16, e0245976. [Google Scholar] [CrossRef] [PubMed]
- Juan, M.-C.; Estevan, M.; Mendez-Lopez, M.; Fidalgo, C.; Lluch, J.; Vivo, R. A Virtual Reality Photography Application to Assess Spatial Memory. Behav. Inf. Technol. 2022, 42, 686–699. [Google Scholar] [CrossRef]
- Wallace, J.E.; Krauter, E.E.; Campbell, B.A. Animal Models of Declining Memory in the Aged: Short-Term and Spatial Memory in the Aged Rat. J. Gerontol. 1980, 35, 355–363. [Google Scholar] [CrossRef]
- Oades, R.D.; Isaacson, R.L. The Development of Food Search Behavior by Rats: The Effects of Hippocampal Damage and Haloperidol. Behav. Biol. 1978, 24, 327–337. [Google Scholar] [CrossRef]
- Zepeda, I.; Cabrera, F. Human and Rat Behavioral Variability in the Dashiell Maze: A Comparative Analysis. Int. J. Comp. Psychol. 2021, 34, 49482. [Google Scholar] [CrossRef]
- Langlois, J.; Bellemare, C.; Toulouse, J.; Wells, G.A. Spatial Abilities and Technical Skills Performance in Health Care: A Systematic Review. Med. Educ. 2015, 49, 1065–1085. [Google Scholar] [CrossRef]
- Mitolo, M.; Gardini, S.; Caffarra, P.; Ronconi, L.; Venneri, A.; Pazzaglia, F. Relationship between Spatial Ability, Visuospatial Working Memory and Self-Assessed Spatial Orientation Ability: A Study in Older Adults. Cogn. Process. 2015, 16, 165–176. [Google Scholar] [CrossRef]
- Walkowiak, S.; Foulsham, T.; Eardley, A.F. Individual Differences and Personality Correlates of Navigational Performance in the Virtual Route Learning Task. Comput. Human Behav. 2015, 45, 402–410. [Google Scholar] [CrossRef]
- Picucci, L.; Caffò, A.O.; Bosco, A. Besides Navigation Accuracy: Gender Differences in Strategy Selection and Level of Spatial Confidence. J. Environ. Psychol. 2011, 31, 430–438. [Google Scholar] [CrossRef]
- Cimadevilla, J.M.; Lizana, J.R.; Roldán, M.D.; Cánovas, R.; Rodríguez, E. Spatial Memory Alterations in Children with Epilepsy of Genetic Origin or Unknown Cause. Epileptic Disord. 2014, 16, 203–207. [Google Scholar] [CrossRef] [PubMed]
- Shore, D.I.; Stanford, L.; MacInnes, W.J.; Brown, R.E.; Klein, R.M. Of Mice and Men: Virtual Hebb—Williams Mazes Permit Comparison of Spatial Learning across Species. Cogn. Affect. Behav. Neurosci. 2001, 1, 83–89. [Google Scholar] [CrossRef] [PubMed]
- Simons, D.J.; Wang, R.F. Perceiving Real-World Viewpoint Changes. Psychol. Sci. 1998, 9, 315–320. [Google Scholar] [CrossRef]
- Frances Wang, R.; Simons, D.J. Active and Passive Scene Recognition across Views. Cognition 1999, 70, 191–210. [Google Scholar] [CrossRef]
- Zeigler, B.P.; Sheridan, T.B. Human Use of Short-Term Memory in Processing Information on a Console. IEEE Trans. Hum. Factors Electron. 1965, HFE-6, 74–83. [Google Scholar] [CrossRef]
- Rodríguez-Andrés, D.; Juan, M.-C.; Méndez-López, M.; Pérez-Hernández, E.; Lluch, J. MnemoCity Task: Assessment of Children's Spatial Memory Using Stereoscopy and Virtual Environments. PLoS ONE 2016, 11, e0161858. [Google Scholar] [CrossRef]
- Fajnerová, I.; Rodriguez, M.; Levčík, D.; Konrádová, L.; Mikoláš, P.; Brom, C.; Stuchlík, A.; Vlček, K.; Horáček, J. A Virtual Reality Task Based on Animal Research—Spatial Learning and Memory in Patients after the First Episode of Schizophrenia. Front. Behav. Neurosci. 2014, 8, 157. [Google Scholar] [CrossRef]
- Juan, M.-C.; Mendez-Lopez, M.; Perez-Hernandez, E.; Albiol-Perez, S. Augmented Reality for the Assessment of Children’s Spatial Memory in Real Settings. PLoS ONE 2014, 9, e113751. [Google Scholar] [CrossRef]
- Mendez-Lopez, M.; Perez-Hernandez, E.; Juan, M.-C. Learning in the Navigational Space: Age Differences in a Short-Term Memory for Objects Task. Learn. Individ. Differ. 2016, 50, 11–22. [Google Scholar] [CrossRef]
- Munoz-Montoya, F.; Juan, M.-C.; Mendez-Lopez, M.; Fidalgo, C. Augmented Reality Based on SLAM to Assess Spatial Short-Term Memory. IEEE Access 2019, 7, 2453–2466. [Google Scholar] [CrossRef]
- Munoz-Montoya, F.; Fidalgo, C.; Juan, M.-C.; Mendez-Lopez, M. Memory for Object Location in Augmented Reality: The Role of Gender and the Relationship Among Spatial and Anxiety Outcomes. Front. Hum. Neurosci. 2019, 13, 113. [Google Scholar] [CrossRef] [PubMed]
- Keil, J.; Korte, A.; Ratmer, A.; Edler, D.; Dickmann, F. Augmented Reality (AR) and Spatial Cognition: Effects of Holographic Grids on Distance Estimation and Location Memory in a 3D Indoor Scenario. PFG–J. Photogramm. Remote Sens. Geoinf. Sci. 2020, 88, 165–172. [Google Scholar] [CrossRef]
- Juan, M.-C.; Mendez-Lopez, M.; Fidalgo, C.; Molla, R.; Vivó, R.; Paramo, D. A SLAM-Based Augmented Reality App for the Assessment of Spatial Short-Term Memory Using Visual and Auditory Stimuli. J. Multimodal User Interfaces 2022, 16, 319–333. [Google Scholar] [CrossRef]
- Witmer, B.G.; Singer, M.J. Measuring Presence in Virtual Environments: A Presence Questionnaire. Presence Teleoper. Virtual Environ. 1998, 7, 225–240. [Google Scholar] [CrossRef]
- Brooke, J. SUS-A Quick and Dirty Usability Scale. In Usability Evaluation in Industry; Jordan, P.W., Thomas, B., Weerdmeester, B.A., McClelland, A.L., Eds.; Taylor & Francis: London, UK, 1996. [Google Scholar]
- Slater, M.; Usoh, M.; Steed, A. Depth of Presence in Virtual Environments. Presence Teleoper. Virtual Environ. 1994, 3, 130–144. [Google Scholar] [CrossRef]
- Schneegans, S.; Bays, P.M. No Fixed Item Limit in Visuospatial Working Memory. Cortex 2016, 83, 181–193. [Google Scholar] [CrossRef]
- Unity. Available online: https://unity.com (accessed on 30 October 2024).
- MRTK. Available online: https://learn.microsoft.com/en-us/windows/mixed-reality/mrtk-unity (accessed on 30 October 2024).
- Microsoft Azure Spatial Anchors. Available online: https://learn.microsoft.com/en-us/azure/spatial-anchors (accessed on 30 October 2024).
- R Core Team. R: A Language and Environment for Statistical Computing; R Foundation for Statistical Computing: Vienna, Austria, 2023; Available online: http://www.r-project.org (accessed on 30 October 2024).
- Monteiro, D.; Wang, X.; Liang, H.-N.; Cai, Y. Spatial Knowledge Acquisition in Virtual and Physical Reality: A Comparative Evaluation. In Proceedings of the 2021 IEEE 7th International Conference on Virtual Reality (ICVR), Foshan, China, 20–22 May 2021; pp. 308–313. [Google Scholar]
- Llana, T.; Garces-Arilla, S.; Juan, M.-C.; Mendez-Lopez, M.; Mendez, M. An Immersive Virtual Reality-Based Object-Location Memory Task Reveals Spatial Long-Term Memory Alterations in Long-COVID. Behav. Brain Res. 2024, 471, 115127. [Google Scholar] [CrossRef]
- Banquiero, M.; Valdeolivas, G.; Ramón, D.; Juan, M.-C. A Color Passthrough Mixed Reality Application for Learning Piano. Virtual Real. 2024, 28, 67. [Google Scholar] [CrossRef]
- Masalkhi, M.; Waisberg, E.; Ong, J.; Zaman, N.; Sarker, P.; Lee, A.G.; Tavakkoli, A. Apple Vision Pro for Ophthalmology and Medicine. Ann. Biomed. Eng. 2023, 51, 2643–2646. [Google Scholar] [CrossRef] [PubMed]

 | MR (Mdn; IQR) | AR (Mdn; IQR) | U | Z | p | r |
---|---|---|---|---|---|---|
Total Objects | 8; 1 | 8; 1 | 786 | 0.194 | 0.850 | 0.021 |
Total Attempts | 3; 4 | 3; 5 | 718 | −0.495 | 0.624 | 0.055 |
Learning Time | 167.65; 617.59 | 130.22; 63.08 | 962 | 1.877 | 0.061 | 0.207 |
Evaluation Time | 727.38; 472.63 | 147.8; 125.26 | 1537 | 7.453 | <0.001 | 0.823 |
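The U, Z, p, and r columns in this and the following tables are outcomes of Mann–Whitney U tests (the authors list R among their tools). As an illustration only, here is a pure-Python sketch of how U, a normal-approximation Z, and the effect size r = |Z|/√N can be computed; it omits tie and continuity corrections, so its values may differ slightly from those reported.

```python
from math import sqrt

def rankdata(values):
    """Rank values from 1..n, assigning tied values their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1                  # extend over a run of tied values
        avg_rank = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    return ranks

def mann_whitney_u(sample1, sample2):
    """Return (U, Z, r) for two independent samples.

    U is computed from the rank sum of sample1; Z uses the normal
    approximation without tie or continuity correction; r = |Z|/sqrt(N).
    """
    n1, n2 = len(sample1), len(sample2)
    ranks = rankdata(list(sample1) + list(sample2))
    rank_sum1 = sum(ranks[:n1])
    u = rank_sum1 - n1 * (n1 + 1) / 2
    mean_u = n1 * n2 / 2
    sd_u = sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u - mean_u) / sd_u
    r = abs(z) / sqrt(n1 + n2)
    return u, z, r
```

For real analyses, `scipy.stats.mannwhitneyu` or R's `wilcox.test` handle ties and exact p-values properly.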

 | MR (Mdn; IQR) | VR (Mdn; IQR) | U | Z | p | r |
---|---|---|---|---|---|---|
Total Objects | 8; 1 | 8; 1 | 343 | −0.411 | 0.689 | 0.056 |
Total Attempts | 3; 4 | 1; 5 | 423.5 | 1.079 | 0.284 | 0.147 |
Learning Time | 167.65; 617.59 | 71; 18 | 653 | 5.039 | <0.001 | 0.686 |
Evaluation Time | 727.38; 472.63 | 412; 234 | 607 | 4.241 | <0.001 | 0.577 |

MR (Mdn; IQR) | MAP (Mdn; IQR) | U | Z | p | r
---|---|---|---|---|---|
8; 1 | 8; 1 | 4 | −1.890 | 0.073 | 0.248 |

 | Women (Mdn; IQR) | Men (Mdn; IQR) | U | Z | p | r |
---|---|---|---|---|---|---|
Total Objects | 8; 0 | 7.5; 1.75 | 137.5 | 1.676 | 0.099 | 0.311 |
Total Attempts | 1; 2 | 5; 4.5 | 65 | −1.769 | 0.081 | 0.328 |
Learning Time | 167.65; 606.91 | 157.22; 467.16 | 123 | 0.786 | 0.451 | 0.146 |
Evaluation Time | 727.38; 254.98 | 878.52; 670.12 | 88 | −0.742 | 0.477 | 0.138 |

 | MR (Mdn; IQR) | AR (Mdn; IQR) | U | Z | p | r |
---|---|---|---|---|---|---|
Enjoyment | 6; 1.5 | 6; 1 | 589.5 | −1.028 | 0.307 | 0.118 |
Concentration | 5; 3 | 6.5; 1 | 481.0 | −2.199 | 0.028 | 0.252 |
Usability | 5.75; 1.5 | 6.66; 1 | 371.0 | −3.345 | <0.001 | 0.384 |
Competence | 7; 1 | 7; 1 | 686.5 | 0.062 | 0.955 | 0.007 |
Calmness | 7; 0 | 6; 2 | 946.0 | 3.290 | <0.001 | 0.377 |
Expertise | 6; 2 | 6; 2 | 590.5 | −1.021 | 0.310 | 0.117 |
Non-mental effort | 7; 1 | 6; 1 | 769.0 | 1.030 | 0.306 | 0.118 |
Non-physical effort | 7; 0 | 6; 2 | 977.0 | 3.638 | <0.001 | 0.417 |
Satisfaction | 7; 0.5 | 6; 1 | 1056.0 | 4.145 | <0.001 | 0.475 |
Presence | 3.75; 2.25 | 5.33; 1.16 | 305.5 | −4.024 | <0.001 | 0.462 |

 | MR (Mdn; IQR) | VR (Mdn; IQR) | U | Z | p | r |
---|---|---|---|---|---|---|
No cybersickness | 7; 1 | 7; 0 | 297.5 | −1.665 | 0.099 | 0.227 |
Enjoyment | 6; 1.5 | 7; 1 | 225.5 | −2.529 | 0.012 | 0.344 |
Concentration | 5; 3 | 6.5; 1.5 | 277.0 | −1.527 | 0.129 | 0.208 |
Usability | 5.75; 1.5 | 6.3; 1.3 | 210.5 | −2.652 | 0.008 | 0.361 |
Competence | 7; 1 | 7; 1 | 357.5 | −0.101 | 0.927 | 0.014 |
Calmness | 7; 0 | 7; 1 | 445.0 | 1.914 | 0.057 | 0.260 |
Expertise | 6; 2 | 6; 2 | 334.5 | −0.507 | 0.618 | 0.069 |
Non-mental effort | 7; 1 | 7; 1 | 362.0 | −0.010 | 1.000 | 0.001 |
Non-physical effort | 7; 0 | 7; 0 | 369.0 | 0.207 | 0.849 | 0.028 |
Satisfaction | 7; 0.5 | 7; 0.6 | 341.0 | −0.407 | 0.691 | 0.055 |
Presence | 3.75; 2.25 | 6; 2 | 156.5 | −3.601 | <0.001 | 0.490 |

 | Women (Mdn; IQR) | Men (Mdn; IQR) | U | Z | p | r |
---|---|---|---|---|---|---|
No cybersickness | 7; 0 | 7; 2.5 | 123.0 | 0.997 | 0.332 | 0.185 |
Enjoyment | 6.5; 1 | 5.75; 1.375 | 137.5 | 1.446 | 0.155 | 0.268 |
Concentration | 6; 2 | 4; 2.75 | 158.5 | 2.409 | 0.017 | 0.447 |
Usability | 6; 1.875 | 5.375; 1.5625 | 133.5 | 1.249 | 0.220 | 0.232 |
Competence | 7; 1 | 7; 1 | 122.5 | 0.887 | 0.389 | 0.165 |
Calmness | 7; 0 | 7; 0 | 121.0 | 1.166 | 0.259 | 0.217 |
Expertise | 6; 1.5 | 5.5; 1 | 129.5 | 1.121 | 0.272 | 0.208 |
Non-mental effort | 7; 1 | 7; 1 | 92.0 | −0.638 | 0.540 | 0.118 |
Non-physical effort | 7; 0 | 7; 0 | 99.5 | −0.454 | 0.680 | 0.084 |
Ergonomics | 7; 1 | 6; 1 | 118.5 | 0.642 | 0.536 | 0.119 |
Satisfaction | 7; 0.5 | 6.5; 0.875 | 121.0 | 0.762 | 0.460 | 0.142 |
Presence | 4.5; 2.37 | 3.5; 2.3125 | 105.5 | 0.022 | 1.000 | 0.004 |
Anxiety | 11; 5.5 | 9.5; 3.75 | 130.5 | 1.119 | 0.273 | 0.208 |

 | c2 | c3 | c4 | c5 | c6 | c7 | c8 | c9 | c10 | c11 | c12 |
---|---|---|---|---|---|---|---|---|---|---|---|
c1 | 0.20 | −0.08 | 0.09 | 0.12 | 0.15 | 0.12 | −0.02 | 0.02 | 0.19 | −0.03 | 0.18 |
c2 |  | 0.56 | 0.43 | 0.72 | 0.24 | 0.71 | 0.32 | 0.21 | 0.35 | 0.64 | 0.76 |
c3 |  |  | 0.55 | 0.38 | 0.18 | 0.39 | 0.02 | 0.21 | 0.07 | 0.43 | 0.38 |
c4 |  |  |  | 0.44 | 0.15 | 0.42 | 0.35 | 0.21 | 0.32 | 0.47 | 0.16 |
c5 |  |  |  |  | 0.41 | 0.50 | 0.29 | 0.14 | 0.45 | 0.66 | 0.51 |
c6 |  |  |  |  |  | 0.37 | 0.18 | −0.14 | 0.29 | 0.03 | 0.04 |
c7 |  |  |  |  |  |  | 0.25 | −0.07 | 0.35 | 0.48 | 0.64 |
c8 |  |  |  |  |  |  |  | 0.39 | 0.34 | 0.22 | 0.22 |
c9 |  |  |  |  |  |  |  |  | 0.31 | 0.42 | 0.10 |
c10 |  |  |  |  |  |  |  |  |  | 0.46 | 0.15 |
c11 |  |  |  |  |  |  |  |  |  |  | 0.33 |

 | c1 | c2 | c3 | c4 | c5 | c6 | c7 | c8 | c9 | c10 | c11 | c12 |
---|---|---|---|---|---|---|---|---|---|---|---|---|
g1 | −0.01 | −0.19 | −0.49 | −0.13 | −0.09 | 0.20 | −0.01 | 0.19 | −0.13 | −0.30 | −0.15 | −0.19 |
g2 | −0.11 | −0.28 | −0.52 | −0.29 | −0.04 | −0.07 | −0.05 | 0.14 | −0.11 | −0.06 | 0.03 | −0.16 |
g3 | 0.04 | −0.07 | −0.21 | −0.05 | 0.07 | 0.02 | 0.04 | 0.18 | 0.19 | −0.16 | 0.09 | −0.07 |
g4 | −0.08 | −0.14 | −0.19 | −0.25 | −0.01 | 0.03 | 0.11 | −0.16 | 0.00 | −0.27 | 0.21 | −0.17 |
g5 | −0.04 | −0.04 | −0.51 | −0.38 | −0.04 | 0.06 | 0.20 | 0.14 | −0.01 | −0.01 | 0.00 | 0.02 |
gt | −0.12 | −0.16 | −0.53 | −0.31 | −0.02 | 0.13 | 0.14 | 0.04 | −0.09 | −0.19 | 0.01 | −0.09 |
e | −0.02 | −0.27 | −0.60 | −0.41 | −0.18 | 0.21 | 0.12 | −0.16 | −0.22 | −0.03 | −0.15 | −0.20 |
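The coefficients in the two matrices above are bivariate correlations between the coded variables (c1–c12 and the experience items g1–g5, gt, e); this excerpt does not state whether Pearson or Spearman coefficients were used. As a hypothetical sketch, a Pearson correlation in plain Python looks like this:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)  # undefined if either variable is constant
```

For ordinal questionnaire data, a rank-based coefficient (Spearman's rho, i.e., Pearson applied to ranks) is often preferred.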
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Ponce, D.; Mendez-Lopez, M.; Lluch, J.; Juan, M.-C. Extended Reality to Assess Short-Term Spatial Memory—A Comparative Study of Mixed Reality, Augmented Reality, and Virtual Reality. Sensors 2024, 24, 7938. https://doi.org/10.3390/s24247938