DOI: 10.1145/1101149.1101243
Article

Affect-based indexing and retrieval of films

Published: 06 November 2005

Abstract

Digital multimedia systems are creating many new opportunities for rapid access to content archives. In order to explore these collections using search applications, the content must be annotated with significant features. An important and often overlooked aspect of human interpretation of multimedia data is the affective dimension. Affective labels of content can be extracted automatically from within multimedia data streams. These can then be used for content-based retrieval and browsing. In this study affective features extracted from multimedia audio content are mapped onto a set of keywords with predetermined emotional interpretations. These labels are then used to demonstrate affect-based retrieval on a range of feature films.
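The abstract describes a pipeline in which affective features extracted from audio are mapped onto keywords with predetermined emotional interpretations, which then drive retrieval. The sketch below is an illustrative reconstruction of that idea only, not the paper's implementation: the keyword set, the valence-arousal coordinates, the nearest-neighbour mapping, and the frequency-based scoring are all assumptions introduced here for demonstration.

```python
import math

# Hypothetical valence-arousal coordinates for a small emotion keyword set,
# loosely in the spirit of dimensional emotion models. Both the keywords and
# the coordinates are illustrative assumptions, not the paper's labels.
EMOTION_KEYWORDS = {
    "happy":   ( 0.8,  0.5),
    "excited": ( 0.6,  0.9),
    "calm":    ( 0.6, -0.6),
    "sad":     (-0.7, -0.4),
    "angry":   (-0.6,  0.8),
    "fearful": (-0.8,  0.6),
}

def label_segment(valence, arousal):
    """Map an extracted (valence, arousal) pair to the nearest emotion keyword."""
    return min(
        EMOTION_KEYWORDS,
        key=lambda k: math.dist((valence, arousal), EMOTION_KEYWORDS[k]),
    )

def retrieve(films, query_keyword):
    """Rank films by the fraction of their segments carrying the query label."""
    scored = []
    for title, segments in films.items():
        labels = [label_segment(v, a) for v, a in segments]
        score = labels.count(query_keyword) / len(labels)
        if score > 0:
            scored.append((title, score))
    return sorted(scored, key=lambda t: t[1], reverse=True)

# Toy data: per-film lists of (valence, arousal) pairs, as a feature
# extractor might produce for successive audio segments.
films = {
    "Film A": [(0.7, 0.6), (0.5, 0.8), (-0.6, -0.3)],
    "Film B": [(-0.75, 0.65), (-0.8, 0.5), (-0.5, 0.9)],
}
```

With this toy data, `retrieve(films, "fearful")` ranks Film B (two of three segments labelled "fearful") and omits Film A entirely, illustrating how keyword labels with fixed emotional interpretations can support affect-based queries.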





    Published In

    MULTIMEDIA '05: Proceedings of the 13th annual ACM international conference on Multimedia
    November 2005
    1110 pages
    ISBN:1595930442
    DOI:10.1145/1101149
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States



    Author Tags

    1. affect-based retrieval
    2. affective labelling of multimedia
    3. content-based retrieval

    Qualifiers

    • Article

    Conference

    MM05

    Acceptance Rates

MULTIMEDIA '05 paper acceptance rate: 49 of 312 submissions (16%)
Overall acceptance rate: 2,145 of 8,556 submissions (25%)


    Article Metrics

• Downloads (last 12 months): 1
• Downloads (last 6 weeks): 0
Reflects downloads up to 25 Dec 2024


    Cited By

• (2023) Stepwise Fusion Transformer for Affective Video Content Analysis. International Conference on Neural Computing for Advanced Applications, pp. 375-386. doi:10.1007/978-981-99-5847-4_27. Online publication date: 30-Aug-2023
• (2022) Searching, Navigating, and Recommending Movies through Emotions: A Scoping Review. Human Behavior and Emerging Technologies, 2022, pp. 1-24. doi:10.1155/2022/7831013. Online publication date: 2-Dec-2022
• (2022) P2SL: Private-Shared Subspaces Learning for Affective Video Content Analysis. 2022 IEEE International Conference on Multimedia and Expo (ICME), pp. 1-6. doi:10.1109/ICME52920.2022.9859902. Online publication date: 18-Jul-2022
• (2022) ML-TFN: Multi Layers Tensor Fusion Network for Affective Video Content Analysis. Neural Computing for Advanced Applications, pp. 184-196. doi:10.1007/978-981-19-6142-7_14. Online publication date: 21-Oct-2022
• (2021) Multimodal Local-Global Attention Network for Affective Video Content Analysis. IEEE Transactions on Circuits and Systems for Video Technology, 31(5), pp. 1901-1914. doi:10.1109/TCSVT.2020.3014889. Online publication date: 1-May-2021
• (2021) Inter-Brain EEG Feature Extraction and Analysis for Continuous Implicit Emotion Tagging During Video Watching. IEEE Transactions on Affective Computing, 12(1), pp. 92-102. doi:10.1109/TAFFC.2018.2849758. Online publication date: 1-Jan-2021
• (2020) Sharing emotions: determining films' evoked emotional experience from their online reviews. Information Retrieval Journal. doi:10.1007/s10791-020-09373-1. Online publication date: 9-May-2020
• (2018) Audio-Based Granularity-Adapted Emotion Classification. IEEE Transactions on Affective Computing, 9(2), pp. 176-190. doi:10.1109/TAFFC.2016.2598741. Online publication date: 1-Apr-2018
• (2017) What Can Self-Reports and Acoustic Data Analyses on Emotions Tell Us? Proceedings of the 2017 Conference on Designing Interactive Systems, pp. 489-501. doi:10.1145/3064663.3064770. Online publication date: 10-Jun-2017
• (2015) Video Affective Content Analysis: A Survey of State-of-the-Art Methods. IEEE Transactions on Affective Computing, 6(4), pp. 410-430. doi:10.1109/TAFFC.2015.2432791. Online publication date: 23-Nov-2015
    • Show More Cited By
