research-article
DOI: 10.1145/2957265.2961836

Tag your emotions: a novel mobile user interface for annotating images with emotions

Published: 06 September 2016

Abstract

People collect more and more data; this is especially true for images on mobile devices. Tagging images is a good way to organize such collections. While automatic tagging systems often focus on content, such as the objects or persons in an image, manual annotations are very important for describing an image's context. Emotions in particular are often important, e.g., when a person reflects on a situation, shows images from a very personal collection to others, or uses images to illustrate presentations. Unfortunately, manual annotation is often tedious, and users are not very motivated to do it. While there are many approaches to motivating people to annotate data in a conventional way, none of them has focused on emotions. In this poster abstract, we present the EmoWheel, an innovative interface for annotating images with emotional tags. We conducted a user study with 18 participants. The results show that the EmoWheel can enhance the motivation to annotate images.
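
The author tags indicate that the EmoWheel is built around Plutchik's model of emotions. As a rough illustration of the interaction concept, the sketch below maps a touch position on a circular widget to one of Plutchik's eight primary emotions; the emotion order, the equal 45-degree sectors, and all names in the code are illustrative assumptions, not details taken from the paper.

// Minimal sketch of a wheel-based emotion picker (assumed layout, not the
// paper's actual implementation): the angle of a touch point around the
// wheel centre selects one of eight equal sectors, each holding one of
// Plutchik's primary emotions.
import kotlin.math.PI
import kotlin.math.atan2

val EMOTIONS = listOf(
    "joy", "trust", "fear", "surprise",
    "sadness", "disgust", "anger", "anticipation"
)

fun emotionAt(touchX: Double, touchY: Double, centerX: Double, centerY: Double): String {
    // Angle of the touch point relative to the wheel centre, normalized to [0, 2*PI).
    var angle = atan2(touchY - centerY, touchX - centerX)
    if (angle < 0) angle += 2 * PI
    // Eight equal sectors of PI / 4 radians (45 degrees) each.
    val sector = (angle / (PI / 4)).toInt() % 8
    return EMOTIONS[sector]
}

fun main() {
    // A touch directly to the right of the centre falls into the first sector.
    println(emotionAt(150.0, 100.0, 100.0, 100.0))  // prints "joy"
}

In a real widget the same angle-to-sector mapping could drive both the visual highlighting of the wheel and the emotional tag that is stored with the image.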

Index Terms

  1. Tag your emotions: a novel mobile user interface for annotating images with emotions

    Recommendations

    Comments

    Please enable JavaScript to view thecomments powered by Disqus.


    Published In

    MobileHCI '16: Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct
    September 2016
    664 pages
    ISBN:9781450344135
    DOI:10.1145/2957265
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

    Publisher

    Association for Computing Machinery, New York, NY, United States

    Publication History

    Published: 06 September 2016

    Author Tags

    1. apps
    2. emotions
    3. image tagging
    4. mobile devices
    5. plutchik
    6. user interfaces

    Qualifiers

    • Research-article

    Conference

    MobileHCI '16

    Acceptance Rates

    Overall Acceptance Rate 202 of 906 submissions, 22%

    Bibliometrics

    Article Metrics

    • Downloads (last 12 months): 26
    • Downloads (last 6 weeks): 2
    Reflects downloads up to 07 Jan 2025

    Cited By

    • (2024) Exploring Retrospective Annotation in Long-Videos for Emotion Recognition. IEEE Transactions on Affective Computing 15, 3 (Jul 2024), 1514-1525. DOI: 10.1109/TAFFC.2024.3359706
    • (2024) Designing a wheel-based assessment tool to measure visual aesthetic emotions. Cognitive Systems Research 84, C (1 Mar 2024). DOI: 10.1016/j.cogsys.2023.101196
    • (2022) Automatic Tagging by Leveraging Visual and Annotated Features in Social Media. IEEE Transactions on Multimedia 24 (2022), 2218-2229. DOI: 10.1109/TMM.2021.3055037
    • (2021) Affective rating of audio and video clips using the EmojiGrid. F1000Research 9 (6 Apr 2021), 970. DOI: 10.12688/f1000research.25088.2
    • (2021) An Enriched Emoji Picker to Improve Accessibility in Mobile Communications. Human-Computer Interaction – INTERACT 2021 (26 Aug 2021), 418-433. DOI: 10.1007/978-3-030-85623-6_25
    • (2020) Affective rating of audio and video clips using the EmojiGrid. F1000Research 9 (11 Aug 2020), 970. DOI: 10.12688/f1000research.25088.1
    • (2020) Using a participatory activities toolkit to elicit privacy expectations of adaptive assistive technologies. Proceedings of the 17th International Web for All Conference (20 Apr 2020), 1-12. DOI: 10.1145/3371300.3383336
    • (2020) RCEA: Real-time, Continuous Emotion Annotation for Collecting Precise Mobile Video Ground Truth Labels. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (21 Apr 2020), 1-15. DOI: 10.1145/3313831.3376808
    • (2019) Tagging emotions using a wheel user interface. Proceedings of the 13th Biannual Conference of the Italian SIGCHI Chapter: Designing the next interaction (23 Sep 2019), 1-5. DOI: 10.1145/3351995.3352056
    • (2019) The Relation Between Valence and Arousal in Subjective Odor Experience. Chemosensory Perception (27 Nov 2019). DOI: 10.1007/s12078-019-09275-7
