Virtual Fidgets: Opportunities and Design Principles for Bringing Fidgeting to Online Learning

Published: 19 April 2023

Abstract

We present design guidelines for incorporating fidgeting into the virtual world as a tool for students in online lectures. Fidgeting is associated with increased attention and self-regulation, and has the potential to help students focus. Currently there are no fidgets, physical or virtual, designed for preserving attention specifically in online learning environments, and no heuristics for designing fidgets within this domain. We identify three virtual fidget proxies to serve as design probes for studying student experiences with virtual fidgeting. Through a study of eight students using our virtual fidget proxies in online lectures, we identify eight emergent themes that encompass student experience with virtual fidgeting in lectures. Based on these themes, we present four principles for designing domain-specific virtual fidgets for online lectures. We identify that virtual fidgets for lectures should be context-aware, visually appealing, easy to adopt, and physically interactive.

1 Introduction

Many people at some time or another feel an urge to fidget. In lectures, it is common to fiddle with objects in one’s surrounding environment. While this may seem like a distraction, fidgets afford people a way to release pent-up energy [25], allowing them to pay more attention and even increase retention of lecture material [13]. Most spontaneous fidget objects end up being non-distracting items around us, such as office supplies. In the virtual world, however, the landscape is sparse for potential virtual fidgets that are not designed to draw attention from the user. When a student watching a video lecture has trouble focusing, they are likely to turn to activities unrelated to the lecture, such as social media [35], which, in the virtual world, is often the student’s closest “fidget object”. Social media, by its very nature, is designed to firmly hold attention [2], and prior work has shown that students’ attention and retention significantly worsen when completing tasks on computers during lectures [28, 29]. These digital distractions have become even more common in the age of Zoom and online learning during the COVID-19 pandemic, with one large-scale study finding that 23–31% of people reported multitasking during meetings [4]. Trying to maintain attention during long video conferences can also lead to emotional burnout, or what has been commonly referred to as “Zoom fatigue” [27].
In traditional classroom settings, fidgeting in particular has been shown to greatly improve children’s learning and attention [5, 13, 16], which may have sparked the growth of fidget toys [3]. While physical fidget tools do provide benefit to many people, as more classes move to an online or hybrid structure due to the COVID-19 pandemic, it is essential for students to have integrated ways to both maintain focus and bounce back from temporary inattention. We propose translating these fidgeting principles into virtual learning environments as a response to these changes. We believe virtual fidgets have a high potential for impact specifically in online lectures, where students often passively watch lectures without active engagement in the form of participation.
This paper seeks to understand how students feel about using virtual fidgets on the same screen as their online lectures. We theorize that giving users a satisfying fidget located directly on the screen next to their lecture may fulfill their need for stimulation, keeping them more engaged in the lecture than if they were simultaneously using social media. To explore this, we designed a user study in which students used virtual fidgets on the same screen as an online lecture. The insights from our users’ perspectives were then used as a basis to provide recommendations for future designs. As a result of this work, we find that the fidget intervention reduced switching to other sites for a majority of participants. Based on our study, we present four core design principles to serve as guidelines for future virtual fidget tools. We suggest that virtual fidget tools should be context-aware, visually appealing, easy to adopt, and physically interactive.
Figure 1: Images of the three fidget tools chosen for this study, set up in a Zoom lecture.

2 Related Work

Building on discoveries in psychology, HCI researchers have explored computing-supported artifacts to improve the experiences of online video conferencing through movement. Active workstations, for example, focus on physical activity while working [8], which addresses Bailenson’s suggestion that the reduced physical mobility during meetings may be a key contributor to Zoom fatigue [1]. Others have created smartphone apps that use vibration to provide tactile, haptic feedback for attention regulation [33]. Research on mindfulness [7] and affective experience at work [31] has further inspired HCI systems work. A few examples include Haas & An’s fashionable wearables that promote emotional regulation for people with anxiety [17], Roquet et al.’s Purrrble teddy bears that embody emotional self-regulation for youth’s mental health [11], and Ji & Isbister’s AR fidgets that place users in immersive experiences to regulate their mood [19]. Audience response systems are another alternative that can activate attention by adding an interactive component to a lecture. However, they are not initiated by the student, and past work has found that students can feel negatively about being monitored by such systems [23].
We find much of Karlesky & Isbister’s work on “fidget widgets” useful in our investigation of virtual fidgets, and chose apps for our study inspired by their four key themes of fidget widgets: Tangential, Playful, Digital, and Tangible [21]. Our investigation is unique in that, although some of our apps do have tangible elements, we do not introduce new physical objects or devices beyond the computers that participants already had access to, whereas prior work often incorporates custom hardware. We also adopt Karlesky & Isbister’s recommendation to think about designing for the “margins” of a workspace [22], motivated by the concept of spatial contiguity, which finds that placing related instructional materials close together helps with learning [15]. Chalkley et al. also found that physical movement, which prior research has established is correlated with better retention, can actually increase when screen engagement goes up [6]. Additionally, considering that gazing away from screens is distracting and appears to be disrespectful [26], collocating the fidget with the lecture may also benefit fellow participants of the lesson.
Our work makes the unique contribution of investigating virtual fidgets as an intervention for a specific task. Prior work has developed novel fidgets for self-regulation in both the physical and virtual worlds. Our study goes further by investigating the use of virtual fidgets for a particular goal, which in this case is helping students deal with distractions during online lectures. Our experiment design resembles da Câmara et al.’s work on identifying children’s fidget object preferences by using existing apps as design probes to elicit feedback about opportunities for future virtual fidgets [10]. While fidgets have historically been associated with children, we differentiate our work by focusing specifically on adult college and graduate students in the context of online video lectures. We also do not introduce new devices in the space, and collocate the fidget “in the margins” of the lecture material (on the screen).

3 Methods

This paper aims to provide a basis for incorporating a fidget into virtual learning spaces without requiring an external object. Thus, we did not seek to design the “ideal” virtual fidget, but rather to explore guidelines for a future tool. For this investigation, we broadly consider virtual fidgets to be interactive tools whose primary input and feedback happen on the user’s device. To do this, we had participants use pre-existing apps that have the potential to serve as virtual fidgets. In addition to drawing on Karlesky & Isbister’s four themes of “fidget widgets”, we selected the following three tools because they closely mimicked a “fidget object”: users can interact with the tool without it holding their direct attention for an extended period of time [21]. The tools ultimately chosen for this study were: Desktop Pets [9], Fluid Simulation [12], and CursorEffect2 [34] (Figure 1). Desktop Pets is a macOS app that provides users with small, pixelated “pets” that walk around their screen in front of all other applications. Users can interact with the pets by clicking and dragging them with their mouse. Fluid Simulation is a web application that allows users to generate colorful swirls of light by clicking and dragging their mouse on a blank canvas. To use the Fluid Simulation, participants opened a split screen with both their virtual lecture and this fidget tool. CursorEffect2 is another macOS app that decorates users’ cursors with animations such as lights and colors as they move their mouse. Unlike the other two apps, CursorEffect2 does not require users to click with their pointing devices in order to activate its visual effects.
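To make the interaction style of these proxies concrete, the sketch below illustrates, in TypeScript, a minimal pointer-driven canvas overlay that leaves a fading, colorful trail behind the mouse while a lecture plays elsewhere on the screen. This is only an illustrative sketch under our own assumptions (the fidget-canvas element id and the trail rendering are ours); it is not the implementation of Fluid Simulation or CursorEffect2.

```typescript
// Minimal sketch (not the studied apps): a canvas that leaves a fading,
// colorful trail behind the pointer while the user clicks and drags,
// mimicking the interaction style of the Fluid Simulation proxy.
// Assumes an HTML page containing <canvas id="fidget-canvas"></canvas>.

const canvas = document.getElementById("fidget-canvas") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;
canvas.width = window.innerWidth;
canvas.height = window.innerHeight;

let hue = 0;          // cycle through colors as the user fidgets
let dragging = false; // the proxy only reacts to click-and-drag

canvas.addEventListener("pointerdown", () => { dragging = true; });
canvas.addEventListener("pointerup", () => { dragging = false; });

canvas.addEventListener("pointermove", (e: PointerEvent) => {
  if (!dragging) return;
  hue = (hue + 4) % 360;
  ctx.fillStyle = `hsla(${hue}, 90%, 60%, 0.6)`;
  ctx.beginPath();
  ctx.arc(e.clientX, e.clientY, 12, 0, Math.PI * 2);
  ctx.fill();
});

// Slowly fade old strokes so the canvas never demands sustained attention.
setInterval(() => {
  ctx.fillStyle = "rgba(0, 0, 0, 0.05)";
  ctx.fillRect(0, 0, canvas.width, canvas.height);
}, 50);
```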
We recruited eight participants at the University of Washington by publicizing the study on multiple community pages. All participants were students who had recently attended at least one online lecture. We presented participants with a Google Form that included all instructions and questions. Participants were first asked to choose one of three fidget tools based on what appealed to them. We then asked participants to watch an online lecture of their choosing, lasting at least thirty minutes, while using the tool. Afterwards, participants were asked to answer several feedback questions. The questionnaire began with questions about participants’ experience with Zoom lectures in general. We used this as a baseline to determine whether participants typically become distracted during online lectures. We also asked participants to specify what they usually do if they lose focus during virtual classes. After finishing the lecture, participants recounted their experience using the fidget tool by answering a mix of Likert scale and free-response questions.
All three authors then performed a thematic analysis together. We organized the responses to the open-ended questions by gathering themes that might inform the design of future virtual fidget tools. The questions were: “How did you feel about having the fidget on your screen along with the lecture?”, “What were 2 things you liked about using the fidget?”, “What were 2 things you disliked about using the fidget?”, and “What do you wish the fidget could do that it currently does not?”. We used a collaborative card-sorting method to complete this thematic analysis. We printed the individual responses on paper and collaboratively created codes for recurring themes in the responses. Each time we felt a response matched an existing theme or warranted a new theme, we labeled it accordingly with the code. All three researchers had to agree on the code before marking the decision.

4 Results

4.1 Experiences with Distraction in Virtual Lectures

All of the participants reported getting distracted during their most recent video lecture, and all reported using other sites during these moments of distraction, such as Facebook, Instagram, Twitter, Email, Slack, and Messenger/iMessage. These responses provided a baseline for the type of distraction common in video lectures. Each of these secondary sites involves reading text, which can make it difficult to simultaneously process the lecture itself. Moreover, many of these sites are designed to grab attention. A student with no alternative to watching their lecture may eventually find their way back to focusing simply because the lecture remains the main stimulus. However, once attention has been transferred to an attention-grabbing alternative, it can be very difficult to get back on track.

4.2 Reactions to the Virtual Fidgets

For the next section of the study, we prompted users to select one of three virtual fidget proxies based on what appealed to them. The goal of this section was to determine how users felt about our virtual fidget proxies, and to derive principles for the future design of specialized virtual fidgets for online learning environments. Collectively, participants chose only two of the three options, with half choosing Desktop Pets and half choosing the Fluid Simulation. Interestingly, no participant chose CursorEffect2, suggesting that the presentation of the app was not appealing to participants. Figure 2 shows responses to each of the four Likert scale questions.
Figure 2: Participant ratings of virtual fidget effectiveness.
The biggest impact of the fidget was in preventing 62.5% of users from switching to other sites. The other responses were mixed, with some indicating more usefulness and help with focusing than others. This indicates that there may be some individuals who benefit more from fidgeting than others, and that a tool designed for fidgeting should focus on those who perceive a benefit from fidgeting. Participants 5, 7, and 8 all rated that they were likely to use the fidget again, indicating that despite the fact that these apps were not originally designed for fidgeting, they capture some of the necessary features of fidgets.
The majority of data collected was in the form of short written responses, for which we conducted a thematic analysis. Through this activity, we identified eight themes that captured the range of participants’ thoughts about using the fidget with their virtual lecture, denoted by a designated letter code. The themes are presented in the following section with key examples of user feedback.

4.3 Themes

Sites: The user reported that the fidget helped them avoid using other sites.
“I also think it is better than watching clips on Instagram or podcasts or people talking because that tends to sit in your subconscious a lot more and now you’re just thinking about that. The Fluid Simulations don’t affect my subconscious thoughts and so it’s easy to get back to focusing on the lecture” (Participant 8, Fluid Simulation)
“While I was expecting it will further distract me from the lecture, I am surprised that sometimes the pet prevented me from switching to another app and surfing other sites” (Participant 3, Desktop Pets)
Physicality: The user commented on the physical nature of the fidget.
“[It] satisfies the inner need to fidget or get distracted or play around with something in my hand…I could still look at it in my peripherals while moving the mouse around on the screen, and I would subconsciously be really satisfied while still focusing on the lecture.” (Participant 5, Fluid Simulation)
“I will say I am more of a physical fidgeter since I still wanted to do something with my hands—i.e. fidget with my hair or spin my pencil, so I don’t think the Desktop Pets really changed that urge.” (Participant 3, Desktop Pets)
Novelty: The user had a different experience of the fidget at first compared to once they were familiar with it. For example, some reported finding the fidget distracting at first, while others found the fidget repetitive over time.
“It was a little distracting at first, but not particularly distracting after the first 5 minutes.” (Participant 8, Fluid Simulation)
“I didn’t like that the virtual pet moved back and forth in a repetitive straight line that became predictable” (Participant 7, Desktop Pets)
Companionship: The user commented on a feeling of having a companion due to the presence of the fidget.
“I liked it! Felt like the virtual pet was stuck on the Zoom with me” (Participant 7, Desktop Pets)
“With the cat at my Zoom meeting, it feels like I had a pet alongside in my apartment” (Participant 4, Desktop Pets)
Location: The user referenced the location of the fidget in relation to their experience. For example, some participants enjoyed the proximity of the fidget to their lecture, while others encountered issues where the fidget got in the way of parts of the screen.
“I didn’t like the detached interface, because I would like to have this as a background thing while I’m able to see the Zoom call in full screen” (Participant 8, Fluid Simulation)
“Once the Desktop Pets were set up with a certain browser window structure (I wanted the pets to walk on top of the browser windows) I didn’t want to change browsers—looking at the fidget when it framed the video player helped me look at the slides more instead of listening to the audio and scrolling on my phone which is what I usually do” (Participant 3, Desktop Pets)
Intrusiveness: The user described the intrusiveness of the fidget to their experience.
“It was non-intrusive, I could control when to use it” (Participant 6, Fluid Simulation)
“I also didn’t like having to click and drag it out of the way when I wanted to see something specific” (Participant 7, Desktop Pets)
Focus: The user described a change in focus as a result of the fidget.
“Just the act of interacting with the fidget made me realize that I have to focus back on the lecture, although it was a fleeting realization, it at least helped me realize it quickly” (Participant 6, Fluid Simulation)
“I enjoyed watching the pets walk around the screen. Whenever they walked close to the video player, I could feel my attention shift towards the slides that were being shared” (Participant 3, Desktop Pets)
Visual: The user commented on the aesthetic appeal of the fidget.
“It was nice for a little break and cute to look at” (Participant 1, Desktop Pets)
“It was a nice visual while watching a mainly static PowerPoint” (Participant 2, Fluid Simulation)
Figure 3: How often each theme occurred among the 32 responses.
Figure 3 shows the relative frequencies of each theme, separated by fidget used. Users most frequently brought up the location and intrusiveness of the fidget when discussing their experience, indicating that these factors have a large impact on the user experience. Some of these themes appeared more for one fidget than the other. Participants who used the Desktop Pets fidget brought up Location, Intrusiveness, and Novelty more often than those using the Fluid Simulation. The Desktop Pets fidget involved characters moving autonomously across the screen, and many people noted having to move them out of the way or having trouble figuring out how to use them. In contrast, the Fluid Simulation had more mentions of keeping people off other sites, as well as helping focus and being visually appealing.

5 Discussion

The intervention of adding a virtual fidget to the learning environment was successful as an intermediary distraction barrier for a majority of participants. It gave students an outlet for distraction that prevented switching to other sites, often helping participants return their focus to the lecture. We found that students’ impressions of the virtual fidgets were overall positive, and the majority of the issues students had with the fidgets stemmed from the fact that they were not designed for fidgeting or with lectures in mind. This suggests that tools specifically built for virtual fidgeting are a worthwhile intervention in online learning, where attention is vital and distraction via social media is common.

5.1 Principles for Designing Virtual Fidgets

Based on our user feedback, we have identified the particular features of the Fluid Simulation and Desktop Pets fidget proxies that students liked and disliked. Our themed groupings capture both areas of consideration in the design of fidgets and fidgeting outcomes. From this feedback, we propose principles for designing domain-specific virtual fidgets intended for students watching online lectures.
Principle 1: The fidget should be context-aware (based on the themes of Location and Intrusiveness). One of the most powerful aspects of the intervention was that the fidget had the capacity to re-engage students by giving them an interaction close to the lecture itself. Equally importantly, the fidget had the capacity to interrupt the experience for the user if it obstructed important content. Therefore, designs of virtual fidgets should understand the interface in which they are used, and leverage that information to promote engagement. This can come in the form of points of interest created by the fidget that relate spatially to the content of the lecture. Designs should also avoid obscuring interface elements of the lecture that the user will always need to access.
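As one concrete, hypothetical illustration of this principle, the sketch below keeps a fidget element from covering the lecture player: it reads the bounding box of an assumed lecture-video element and nudges an assumed fidget element into the margin just below it. Both element ids and the repositioning rule are our own assumptions for illustration, not behavior of the apps used in the study.

```typescript
// Hypothetical sketch of Principle 1: keep the fidget out of the lecture's way.
// The ids "lecture-video" and "fidget" are illustrative assumptions.

function keepFidgetClearOfLecture(): void {
  const lecture = document.getElementById("lecture-video");
  const fidget = document.getElementById("fidget");
  if (!lecture || !fidget) return;

  const lectureBox = lecture.getBoundingClientRect();
  const fidgetBox = fidget.getBoundingClientRect();

  // Axis-aligned overlap test between the fidget and the lecture player.
  const overlaps =
    fidgetBox.left < lectureBox.right &&
    fidgetBox.right > lectureBox.left &&
    fidgetBox.top < lectureBox.bottom &&
    fidgetBox.bottom > lectureBox.top;

  if (overlaps) {
    // Move the fidget into the margin just below the lecture content,
    // echoing the "margins of the workspace" framing from prior work [22].
    fidget.style.position = "fixed";
    fidget.style.top = `${lectureBox.bottom + 8}px`;
    fidget.style.left = `${lectureBox.left}px`;
  }
}

// Re-check whenever the window is resized, and periodically in case the
// lecture layout changes (e.g., the student resizes the Zoom window).
window.addEventListener("resize", keepFidgetClearOfLecture);
setInterval(keepFidgetClearOfLecture, 1000);
```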
Principle 2: The fidget should promote physical interaction (based on the theme of Physicality). Despite the predominantly visual nature of the fidgets, part of the satisfaction students reported relied on the physical interaction between the student and their mouse. It was not enough for students to have an engaging visual alongside their lecture; it was important that the act of fidgeting also included motor input. This is consistent with traditional fidgeting, where physical movement helps release pent-up energy. Therefore, designs of virtual fidgets should consider how the user interacts with the fidget, and consider how standard input devices can function as satisfying fidget inputs.
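As one hedged example of how a standard input device could serve as the fidget input, the sketch below maps how vigorously the user moves the pointer to the strength of the fidget’s visual response, so that fidgeting remains a motor activity rather than a purely visual one. The smoothing factor and size range are arbitrary choices for illustration.

```typescript
// Illustrative sketch of Principle 2: scale the fidget's feedback with the
// user's physical input (pointer speed). Constants are arbitrary assumptions.

let lastX = 0;
let lastY = 0;
let lastTime = performance.now();
let intensity = 0; // smoothed measure of how vigorously the user is fidgeting

window.addEventListener("pointermove", (e: PointerEvent) => {
  const now = performance.now();
  const dt = Math.max(now - lastTime, 1);                   // ms since last event
  const dist = Math.hypot(e.clientX - lastX, e.clientY - lastY);
  const speed = dist / dt;                                  // px per ms

  // Exponential smoothing keeps the response fluid rather than jittery.
  intensity = 0.9 * intensity + 0.1 * speed;

  lastX = e.clientX;
  lastY = e.clientY;
  lastTime = now;
});

// A fidget renderer could read `intensity` each frame and scale its particle
// size, color saturation, or animation speed accordingly.
function feedbackRadius(): number {
  return 6 + Math.min(intensity * 20, 24); // radius grows from 6 px up to 30 px
}
```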
Principle 3: The fidget should be visually appealing (based on the themes of Visual, Focus, Companionship, and Sites). Given that participants reported frequently navigating away from lectures, the virtual fidget should be designed with enough visual appeal to keep their attention. The fidget must be designed to capture the interest of the user, while not being so distracting as to pull all focus from the lecture. The visuals of the fidget also help keep the attention of the user close to the lecture content. Traditional fidget tools are always available but often engage the user both physically and visually, requiring the user to look away from the screen and toward the object. In contrast, virtual fidgets can be manipulated with the hands and provide visual feedback at the desired point of attention on the screen. Maintaining the visual interest of the virtual feedback gives users both an outlet to fidget and an extra reason to look where they are supposed to for their lecture.
Principle 4: The fidget should be easy to adopt (based on the theme of Novelty). Many users reported initial confusion while figuring out how to use the fidget. For example, Desktop Pets already has features that allow users to pick up and move the pets, yet several participants said they wished they could move the pets around. This implies that the interactions were not easily discoverable. Both fidgets also had various settings the user could manipulate, which was a distraction for many participants. Therefore, applications designed for fidgeting should be simple and self-explanatory.

6 Limitations

An important but intentional limitation is that our findings are not generalizable. This is because the study was conducted on a small sample of students, and our survey did not collect data about their demographic backgrounds. We also recognize that fidgeting is not universally helpful. Similar to how Homer et al. [18] find that the effects of visual educational tools depend on individuals’ pre-existing preferences for visual learning, the effects of fidgeting likely depend on individuals’ pre-existing preferences for fidgeting. This is consistent with suggestions that those with ASD or ADHD are more likely to benefit from physical fidgets than others [3]. Given our application in online learning, it would be valuable to investigate whether these virtual fidgets are helpful to those with learning disabilities. It is important to note that although we point to this literature as an example of variations in preferences, we do not claim that virtual fidgets necessarily benefit these specific groups, as we did not design this study with such populations in mind [32].
The context of our study is also limited in scope because, as intended, it was designed specifically for online lecture environments; our findings may therefore apply only to educational settings that do not involve interaction between participants, and may not generalize to, for example, discussion-based classes or conference meetings. Additionally, the apps we chose were used as design probes or proxies and have their own limitations, as they were not developed for the purpose of virtual fidgeting. The novelty effect of these apps could also diminish over time. Lastly, all data in this study were self-reported, so we do not make claims about quantifiable changes to attention, and instead focus on how these virtual fidgets improved participants’ affective experience.

7 Future Work

This investigation is a preliminary step in understanding the role of virtual fidgets in online learning and provides a basis for future work on the subject. In particular, because some responses showed more enthusiasm for the virtual fidgets than others, future work could engage more deeply with those who benefit most from fidgeting and their needs in virtual learning environments. We recommend leveraging user-centered and/or participatory design methods to ensure a future virtual fidget tool is designed well with and for a more specific group of intended users. If future work implements such a tool, a user evaluation may also be beneficial to investigate whether participants’ self-reports are consistent with measurable outcomes. Future work should also investigate the optimal level of stimulation a virtual fidget should offer to balance being engaging without being too distracting, which may vary between individuals. Additionally, this project has investigated virtual fidgets in online lectures only from the perspective of students. Future work could look at how instructors feel about their students using virtual fidgets, especially because instructors may perceive the fidgets as distracting. There are elements of social presence and perception of a virtual fidget that could be investigated, such as an indicator shown when a student is using a virtual fidget to avoid misunderstandings or incorrect assumptions [26]. This would also reduce the virtual fidget user’s stress of monitoring self-presentation [14]. Alternatively, instructors might see virtual fidgets as an opportunity to engage students in a new way, and future work could develop fidgets that are responsive and relevant to their specific lecture content. We envision that the tool may even be adapted to deploy teaching material, like Schacter and Szpunar’s [30] work on integrating regular quizzes or comprehension checks, or a lecture-related visual aid [20]. Finally, lectures are not the only times people feel a need to fidget, and future work could consider other contexts in which virtual fidgeting is appropriate and beneficial, such as pre-recorded videos or interactive meetings. Informing the design of features to support attention in video meetings with Kuzminykh and Rintel’s [24] functional recommendations for video conferencing UI would also be useful when implementing a virtual fidget that lives directly on or next to the Zoom interface.

8 Conclusion

This study was an initial investigation of how students may feel about virtual fidgets in online educational lecture environments as a tool to help with attention regulation. In addition to creating an enjoyable experience, we hope this work helps ensure that the global shift toward remote work and learning is inclusive of those who may be more susceptible to distractions online. In summary, our work contributes eight major themes extracted from participants’ responses and four key design principles based on our results. We have shown that virtual fidgets can benefit certain students for the specific task of watching online lectures and that the concept is worth pursuing in future work.

Acknowledgments

Thanks to James Fogarty and Lisa Elkin for their guidance with this project. This material is based upon work supported by the NSF Graduate Research Fellowship under Grant No. DGE-2140004. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

References

[1]
Jeremy N. Bailenson. 2021. Nonverbal Overload: A Theoretical Argument for the Causes of Zoom Fatigue. Technology, Mind, and Behavior 2, 1 (Feb. 2021). https://doi.org/10.1037/tmb0000030
[2]
Vikram R. Bhargava and Manuel Velasquez. 2021. Ethics of the Attention Economy: The Problem of Social Media Addiction. Business Ethics Quarterly 31, 3 (2021), 321–359. https://doi.org/10.1017/beq.2020.32
[3]
Lindsey Biel. 2017. Fidget toys or focus tools. Autism File 74 (2017), 12–13.
[4]
Hancheng Cao, Chia-Jung Lee, Shamsi Iqbal, Mary Czerwinski, Priscilla N Y Wong, Sean Rintel, Brent Hecht, Jaime Teevan, and Longqi Yang. 2021. Large Scale Analysis of Multitasking Behavior During Remote Meetings. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems. ACM, Yokohama Japan, 1–13. https://doi.org/10.1145/3411764.3445243
[5]
Shelley Carson, Margaret Shih, and Ellen Langer. 2001. Sit Still and Pay Attention? Journal of Adult Development 8 (Jan. 2001), 183–188. https://doi.org/10.1023/A:1009594324594
[6]
Joe D Chalkley, Thomas T Ranji, Carina EI Westling, Nachiappan Chockalingam, and Harry J Witchel. 2017. Wearable sensor metric for fidgeting: screen engagement rather than interest causes NIMI of wrists and ankles. In Proceedings of the European Conference on Cognitive Ergonomics 2017. 158–161.
[7]
Brian Chin, Emily K. Lindsay, Carol M. Greco, Kirk Warren Brown, Joshua M. Smyth, Aidan G. C. Wright, and J. David Creswell. 2021. Mindfulness interventions improve momentary and trait measures of attentional control: Evidence from a randomized controlled trial. Journal of Experimental Psychology. General 150, 4 (April 2021), 686–699. https://doi.org/10.1037/xge0000969
[8]
Woohyeok Choi, Aejin Song, Darren Edge, Masaaki Fukumoto, and Uichin Lee. 2016. Exploring user experiences of active workstations: a case study of under desk elliptical trainers. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing. 805–816.
[9]
Federico Curzel. 2021. Desktop Pets. https://apps.apple.com/us/app/desktop-pets/id1575542220?mt=12
[10]
Suzanne B. da Câmara, Rakshit Agrawal, and Katherine Isbister. 2018. Identifying Children’s Fidget Object Preferences: Toward Exploring the Impacts of Fidgeting and Fidget-Friendly Tangibles. In Proceedings of the 2018 Designing Interactive Systems Conference. ACM, Hong Kong China, 301–311. https://doi.org/10.1145/3196709.3196790
[11]
Claudia Daudén Roquet, Nikki Theofanopoulou, Jaimie L Freeman, Jessica Schleider, James J Gross, Katie Davis, Ellen Townsend, and Petr Slovak. 2022. Exploring Situated & Embodied Support for Youth’s Mental Health: Design Opportunities for Interactive Tangible Device. In CHI Conference on Human Factors in Computing Systems. ACM, New Orleans LA USA, 1–16. https://doi.org/10.1145/3491102.3502135
[12]
Pavel Dobryakov. 2017. Fluid Simulation. https://experiments.withgoogle.com/fluid-simulation
[13]
James Farley, Evan Risko, and Alan Kingstone. 2013. Everyday attention and lecture retention: the effects of time, fidgeting, and mind wandering. Frontiers in Psychology 4 (2013). https://doi.org/10.3389/fpsyg.2013.00619
[14]
Joey George, Akmal Mirsadikov, Misty Nabors, and Kent Marett. 2022. What do Users Actually Look at During ‘Zoom’ Meetings? Discovery Research on Attention, Gender and Distraction Effects. In Proceedings of the 55th Hawaii International Conference on System Sciences.
[15]
Paul Ginns. 2006. Integrating information: A meta-analysis of the spatial contiguity and temporal contiguity effects. Learning and Instruction - LEARN INSTR 16 (Dec. 2006), 511–525. https://doi.org/10.1016/j.learninstruc.2006.10.001
[16]
Kelsey Grodner. 2015. To fidget or not to fidget: The effect of movement on cognition. Murray State University.
[17]
Katelyn E Haas and Su Kyoung An. 2022. Conceptual Development of a Fashion-Forward Garment Aimed To Ease Anxiety Through Fidget Components. In Breaking Boundaries. Iowa State University Digital Press. https://doi.org/10.31274/itaa.13793
[18]
Bruce D. Homer, Jan L. Plass, and Linda Blake. 2008. The effects of video on cognitive load and social presence in multimedia-learning. Computers in Human Behavior 24, 3 (May 2008), 786–797. https://doi.org/10.1016/j.chb.2007.02.009
[19]
Chen Ji and Katherine Isbister. 2022. AR Fidget: Augmented Reality Experiences that Support Emotion Regulation through Fidgeting. In CHI Conference on Human Factors in Computing Systems Extended Abstracts. ACM, New Orleans LA USA, 1–4. https://doi.org/10.1145/3491101.3519874
[20]
Cheryl Johnson and Richard Mayer. 2012. An Eye Movement Analysis of the Spatial Contiguity Effect in Multimedia Learning. Journal of Experimental Psychology: Applied 18 (Feb. 2012), 178–191. https://doi.org/10.1037/a0026923
[21]
Michael Karlesky and Katherine Isbister. 2013. Fidget widgets: secondary playful interactions in support of primary serious tasks. In CHI ’13 Extended Abstracts on Human Factors in Computing Systems on - CHI EA ’13. ACM Press, Paris, France, 1149. https://doi.org/10.1145/2468356.2468561
[22]
Michael Karlesky and Katherine Isbister. 2016. Understanding Fidget Widgets: Exploring the Design Space of Embodied Self-Regulation. In Proceedings of the 9th Nordic Conference on Human-Computer Interaction. ACM, Gothenburg Sweden, 1–10. https://doi.org/10.1145/2971485.2971557
[23]
Robin H Kay and Ann LeSage. 2009. Examining the benefits and challenges of using audience response systems: A review of the literature. Computers & Education 53, 3 (2009), 819–827.
[24]
Anastasia Kuzminykh and Sean Rintel. 2020. Classification of Functional Attention in Video Meetings. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. ACM, Honolulu HI USA, 1–13. https://doi.org/10.1145/3313831.3376546
[25]
James A Levine, Sara J Schleusner, and Michael D Jensen. 2000. Energy expenditure of nonexercise activity. The American Journal of Clinical Nutrition 72, 6 (Dec. 2000), 1451–1454. https://doi.org/10.1093/ajcn/72.6.1451
[26]
Jennifer Marlow, Eveline van Everdingen, and Daniel Avrahami. 2016. Taking Notes or Playing Games?: Understanding Multitasking in Video Communication. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing. ACM, San Francisco California USA, 1726–1737. https://doi.org/10.1145/2818048.2819975
[27]
Hadar Nesher Shoshan and Wilken Wehrt. 2022. Understanding “Zoom fatigue”: A mixed-method approach. Applied Psychology 71, 3 (2022), 827–852. https://doi.org/10.1111/apps.12360
[28]
Evan F. Risko, Nicola Anderson, Amara Sarwal, Megan Engelhardt, and Alan Kingstone. 2012. Everyday Attention: Variation in Mind Wandering and Memory in a Lecture. Applied Cognitive Psychology 26, 2 (2012), 234–242. https://doi.org/10.1002/acp.1814
[29]
Evan F. Risko, Dawn Buchanan, Srdan Medimorec, and Alan Kingstone. 2013. Everyday attention: Mind wandering and computer use during lectures. Computers & Education 68 (Oct. 2013), 275–283. https://doi.org/10.1016/j.compedu.2013.05.001
[30]
Daniel L. Schacter and Karl K. Szpunar. 2015. Enhancing attention and memory during video-recorded lectures. Scholarship of Teaching and Learning in Psychology 1, 1 (March 2015), 60–71. https://doi.org/10.1037/stl0000011
[31]
Myeong-Gu Seo, Lisa Feldman Barrett, and Jean M Bartunek. 2004. The Role of Affective Experience in Work Motivation. Academy of Management Review (2004).
[32]
Katta Spiel, Eva Hornecker, Rua Mae Williams, and Judith Good. 2022. ADHD and technology research–investigated by neurodivergent readers. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. 1–21.
[33]
Anna Williams, Brianna Posadas, Diandra Prioleau, Isabel Laurenceau, and Juan E Gilbert. 2018. User Perceptions of Haptic Fidgets on Mobile Devices for Attention and Task Performance. In International Conference on Applied Human Factors and Ergonomics. Springer, 15–22.
[34]
Xiao Dong Zhou. 2021. CursorEffect2. https://apps.apple.com/us/app/cursoreffect2/id1585374223?mt=12
[35]
Andrew H. Zureick, Jesse Burk-Rafel, Joel A. Purkiss, and Michael Hortsch. 2018. The interrupted learner: How distractions during live and video lectures influence learning outcomes. Anatomical Sciences Education 11, 4 (2018), 366–376. https://doi.org/10.1002/ase.1754
