Evaluation of Fear Using Nonintrusive Measurement of Multimodal Sensors
Figures

Figure 1. Flowchart of the experimental procedure of our research (BR: blinking rate; FT: facial temperature).
Figure 2. Proposed system for evaluating fear.
Figure 3. Dual (visible-light and thermal) cameras used in our method and their images.
Figure 4. Commercial EEG device and locations of the 16 electrodes based on the international 10–20 system. (a) Emotiv EPOC headset; (b) positions of the 16 electrodes.
Figure 5. Four corresponding (calibration) pairs of points produced by four NIR illuminators to obtain the geometric transform matrix, and an example of measuring calibration accuracy. (a) Four pairs of corresponding (calibration) points; (b) pair of points for measuring calibration accuracy.
Figure 6. (a) Detected face and facial feature regions in the visible-light image; (b) mapped face and facial feature regions in the thermal image after geometric transformation.
Figure 7. Example of the defined ROIs used to measure the change of facial temperature.
Figure 8. Comparison of delta and beta waves before and after watching a horror movie. (a) Change in delta waves; (b) change in beta waves; (c) change in the ratio of delta to beta waves.
Figure 9. Example of detecting pupil regions using sub-block-based template matching.
Figure 10. Example of determining whether eyes are open or closed. (a) Open eyes; (b) closed eyes.
Figure 11. Experiment for measuring the accuracy of the geometric transform. The top and bottom figures of (a–c) are images from the visible-light and thermal cameras, respectively, with the NIR illuminator at (a) Position 1, (b) Position 5 and (c) Position 9.
Figure 12. Experimental procedure for measuring fear (BR: blinking rate; FT: facial temperature; SE: subjective evaluation).
Figure 13. Comparison of subjective evaluation scores before and after watching the horror movie.
Figure 14. Comparisons of the FTs of facial feature regions before and after watching the horror movie (FT: facial temperature).
Figure 15. Comparisons of eye blinking rates before watching the horror movie and in the last 1 min of watching it (BR: blinking rate).
Figure 16. Ratios of the delta band to the beta band of EEG data before and after watching the horror movie.
Figure 17. Comparison of subjective evaluation scores before and after watching the emotionally neutral video clip.
Figure 18. Comparisons of the FTs of facial feature regions before and after watching the emotionally neutral video clip (FT: facial temperature).
Figure 19. Comparisons of eye blinking rates before and in the last 1 min of watching the emotionally neutral video clip (BR: blinking rate).
Figure 20. Ratios of the delta band to the beta band of EEG data before and after watching the emotionally neutral video clip.
Abstract
1. Introduction
- First, to enhance the accuracy of the evaluation of fear, we measure electroencephalogram (EEG) signals, eye blinking rate (BR), facial temperature (FT) and a subjective evaluation (SE) score before and after a user watches a horror movie.
- Second, to measure each user's blinking rate accurately and conveniently, a remote eye-image capturing system is implemented using a high-speed megapixel camera. In addition, the changes in facial temperature are non-intrusively measured using dual (visible-light and thermal) cameras; the regions of interest (ROIs) for measuring the changes of facial temperature on the face are automatically detected in successive images, which improves the accuracy of the fear evaluation (a sketch of the visible-to-thermal mapping appears after this list).
- Third, we verify that our experimental results are caused by fear through comparative experiments with a video clip of the same length but emotionally neutral content. In addition, we compensate the values measured with the horror movie by using those measured with the neutral clip, which yields more accurate measurements of the fear emotion by excluding other factors.
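The visible-to-thermal mapping referenced above is defined by a geometric transform matrix obtained from four calibration point pairs produced by NIR illuminators (Figure 5). The exact transform model is not spelled out in this excerpt, so the following is a minimal sketch assuming a planar perspective (homography) model solved exactly from the four pairs; the function names and all coordinates are illustrative.

```python
import numpy as np

def homography_from_four_pairs(src, dst):
    """Solve for the 3x3 perspective transform H (with h33 = 1) mapping
    four visible-light calibration points onto their thermal counterparts."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two rows of the 8x8 linear system.
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h = np.linalg.solve(np.asarray(A, float), np.asarray(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def map_to_thermal(H, x, y):
    """Map one visible-light pixel (x, y) into thermal-image coordinates."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Hypothetical calibration pairs (visible-light -> thermal, in pixels)
src = [(100, 80), (540, 85), (535, 420), (105, 415)]
dst = [(62, 57), (249, 61), (249, 179), (67, 176)]
H = homography_from_four_pairs(src, dst)
print(map_to_thermal(H, 320, 250))  # e.g., center of a detected ROI
```

Once H is known, the face and facial-feature regions detected in the visible-light image (Figure 6a) can be mapped into the thermal image (Figure 6b), avoiding feature detection in the low-texture thermal data.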
| Category | Method | Description | Advantages | Disadvantages |
|---|---|---|---|---|
| Using a single modality | Visible-light camera-based methods [3,4,5,6] | User's emotion is recognized from facial expression in an image. | Comfortable for the user because no sensors are attached; less expensive than bio-signal or thermal camera-based methods. | Emotion is difficult to analyze if the person shows no expression; extraction of facial feature points can be affected by non-uniform illumination. |
| | Thermal camera-based methods [7,12] | Change of facial temperature is measured according to emotion. | Comfortable for the user because no sensors are attached; emotion can be analyzed even if the person shows no expression; extraction of facial feature points is not affected by illumination conditions. | More expensive than visible-light camera-based methods; facial feature regions are difficult to detect because their texture is not distinct in thermal images. |
| | Voice-based methods [2,8,9] | Change of voice features is measured according to emotion. | Less expensive than bio-signal or thermal camera-based methods. | Performance can be affected by surrounding noise. |
| | Physiological signal-based methods [10,11,16,17,18] | ECG [10] and EEG [11,16,17,18] are analyzed for emotion detection. | High accuracy of emotion detection and fast data acquisition. | Less comfortable because sensors are attached to the body; more affected by motion of the head, body and muscles than camera- or voice-based methods. |
| Using multiple modalities | Multiple physiological signal-based methods [13,14,15] | EEG, heart rate, SC, respiration, ST and psychometric ratings [13]; PPG, EMG, ECG, GSR and ST [14]; SC, blood volume pressure, ST, EMG and respiration [15]. | Higher accuracy of emotion detection than single modality-based methods. | Less comfortable because many sensors are attached to the body; more affected by motion of the head, body and muscles than single modality-based methods. |
| | Hybrid method using both physiological signals and non-intrusive cameras (proposed method) | Facial temperature, EEG, blinking rate and subjective evaluation are used to evaluate fear. | Higher accuracy of emotion evaluation than single modality-based methods; more comfortable, with greater freedom of movement, because no sensors are attached except a wireless EEG device; facial feature positions in the thermal image are easily detected using dual (visible-light and thermal) cameras. | Larger amount of data to process after acquiring image sequences from the dual cameras. |
2. Evaluating Fear by Multimodality-Based Measurements
2.1. Overall Procedure of the Proposed Method and Multimodal Sensors for Measuring Fear
2.2. The Analysis of Facial Temperature Variation
2.3. Analysis of EEG Variation
2.4. Analysis of Eye Blinking Rate Variation
3. Experimental Results
| Position | Ground-Truth X | Ground-Truth Y | Calculated X (by Geometric Transform Matrix) | Calculated Y (by Geometric Transform Matrix) | RMS Error (Pixels) |
|---|---|---|---|---|---|
| 1 | 62 | 57 | 63 | 58 | 1.41 |
| 2 | 155 | 57 | 156 | 59 | 2.24 |
| 3 | 249 | 61 | 249 | 63 | 2.00 |
| 4 | 68 | 112 | 67 | 112 | 1.00 |
| 5 | 155 | 112 | 154 | 112 | 1.00 |
| 6 | 239 | 111 | 239 | 113 | 2.00 |
| 7 | 67 | 176 | 67 | 175 | 1.00 |
| 8 | 158 | 178 | 158 | 178 | 0.00 |
| 9 | 249 | 179 | 249 | 179 | 0.00 |
| Average | | | | | 1.18 |
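Each per-position error above is the Euclidean distance between the ground-truth and transform-predicted pixel positions (e.g., Position 1 is off by one pixel in each axis, giving sqrt(2) ≈ 1.41). The reported values and their 1.18-pixel average can be reproduced directly; a quick check in Python:

```python
import numpy as np

# Ground-truth vs. transform-predicted pixel positions from the table above
gt   = np.array([[62, 57], [155, 57], [249, 61], [68, 112], [155, 112],
                 [239, 111], [67, 176], [158, 178], [249, 179]])
pred = np.array([[63, 58], [156, 59], [249, 63], [67, 112], [154, 112],
                 [239, 113], [67, 175], [158, 178], [249, 179]])

errors = np.linalg.norm(gt - pred, axis=1)   # per-position Euclidean error
print(np.round(errors, 2))                   # 1.41, 2.24, 2.0, 1.0, ...
print(round(errors.mean(), 2))               # 1.18, the table's average
```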
| Questions for Subjective Test |
|---|
| I am having difficulty seeing |
| I am scared |
| I have a headache |
| I am anxious |
| I feel unpleasant |
| | Before Watching the Horror Movie | After Watching the Horror Movie |
|---|---|---|
| Average | 1.175 | 4.113 |
| Standard deviation | 0.272 | 2.208 |
| Region | Average (Before) | Average (After) | Standard Deviation (Before) | Standard Deviation (After) | p-Value |
|---|---|---|---|---|---|
| Average of all regions | 15,111.08 | 15,039.47 | 46.77801 | 47.57241 | 0.00017 |
| Middle of the forehead | 15,101.48 | 15,038.56 | 53.48507 | 56.4484 | 0.00295 |
| Left eye | 15,119.34 | 15,050.73 | 57.7577 | 45.40563 | 0.00085 |
| Right eye | 15,112.23 | 15,039.24 | 53.06459 | 48.92598 | 0.00034 |
| Left cheek | 15,105.64 | 15,031.03 | 53.81195 | 55.63021 | 0.00057 |
| Right cheek | 15,116.71 | 15,037.77 | 38.96623 | 53.48883 | 0.00006 |
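The p-values above compare per-subject facial temperatures before and after the movie; the bibliography cites Student's t-test for this kind of comparison. A minimal sketch with hypothetical per-subject data, assuming a paired two-tailed test:

```python
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical per-subject mean FT of one ROI (raw thermal-camera levels)
ft_before = np.array([15150.2, 15090.1, 15130.5, 15075.8, 15110.0])
ft_after  = np.array([15060.7, 15010.3, 15045.9, 15001.2, 15030.4])

t_stat, p_value = ttest_rel(ft_before, ft_after)  # paired two-tailed t-test
print(f"t = {t_stat:.3f}, p = {p_value:.5f}")
```

A p-value below 0.05 for every ROI, as in the table, indicates that the decrease in facial temperature after the horror movie is statistically significant.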
| | Before Watching the Horror Movie | Last 1 min While Watching the Horror Movie |
|---|---|---|
| Average | 23.75 | 25.88 |
| Standard deviation | 14.45 | 11.92 |
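The blinking rate is obtained by counting blinks in the eye images captured by the high-speed camera, using the per-frame open/closed decisions illustrated in Figures 9 and 10. A minimal counting sketch, assuming one blink per open-to-closed transition (debouncing and missed-detection handling are omitted):

```python
def blinking_rate(eye_closed, fps):
    """Blinks per minute from a per-frame closed-eye flag sequence.

    eye_closed: one boolean per frame (True = eyes closed), e.g., from an
    open/closed classifier. A blink is counted at each open -> closed edge.
    """
    blinks = sum(1 for prev, cur in zip(eye_closed, eye_closed[1:])
                 if cur and not prev)
    minutes = len(eye_closed) / fps / 60.0
    return blinks / minutes

# Example: 3 blinks within 10 s of 30-fps video -> 18 blinks/min
frames = [False] * 300
for start in (40, 140, 250):
    frames[start:start + 5] = [True] * 5
print(blinking_rate(frames, fps=30))  # 18.0
```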
| Electrode | Average (Before) | Average (After) | Standard Deviation (Before) | Standard Deviation (After) | p-Value |
|---|---|---|---|---|---|
| AF3 | 1.2417 | 1.1899 | 0.3031 | 0.4990 | 0.7256 |
| AF4 | 1.4300 | 1.2577 | 0.5296 | 0.4095 | 0.3120 |
| F3 | 1.0548 | 1.0091 | 0.3325 | 0.4124 | 0.7324 |
| F4 | 1.0962 | 0.9638 | 0.2735 | 0.3069 | 0.2074 |
| F7 | 1.2765 | 1.2769 | 0.3760 | 0.2711 | 0.9974 |
| F8 | 1.3056 | 1.1483 | 0.3097 | 0.4150 | 0.2344 |
| FC5 | 1.1958 | 1.0737 | 0.4526 | 0.4058 | 0.4281 |
| FC6 | 1.1592 | 1.0113 | 0.3546 | 0.4092 | 0.2836 |
| O1 | 1.1204 | 0.9422 | 0.3214 | 0.3023 | 0.1166 |
| O2 | 1.1249 | 1.0264 | 0.4222 | 0.4904 | 0.5473 |
| P7 | 1.2184 | 1.0587 | 0.3464 | 0.3433 | 0.2003 |
| P8 | 1.2211 | 1.0875 | 0.4367 | 0.3492 | 0.3473 |
| T7 | 1.2337 | 1.1158 | 0.5250 | 0.7247 | 0.6026 |
| T8 | 1.2095 | 1.0458 | 0.3232 | 0.4044 | 0.2160 |
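Each row above reports the ratio of delta-band to beta-band power at one electrode, before and after the movie. A sketch of one way to compute this ratio, assuming Welch PSD estimation, a 0.5–4 Hz delta band, a 13–30 Hz beta band and the Emotiv EPOC's 128 Hz sampling rate (the paper's exact band edges and estimator are not given in this excerpt):

```python
import numpy as np
from scipy.signal import welch

def delta_beta_ratio(eeg, fs=128):
    """Delta (0.5-4 Hz) to beta (13-30 Hz) band-power ratio for one channel."""
    f, pxx = welch(eeg, fs=fs, nperseg=fs * 2)  # power spectral density
    delta = pxx[(f >= 0.5) & (f < 4.0)].sum()
    beta  = pxx[(f >= 13.0) & (f < 30.0)].sum()
    return delta / beta

# Hypothetical usage on 60 s of synthetic single-channel data
rng = np.random.default_rng(0)
print(delta_beta_ratio(rng.standard_normal(128 * 60)))
```

Although none of the per-electrode differences reaches p < 0.05, the ratio decreases after the movie at 13 of the 14 electrodes, consistent with the delta-to-beta ratio markers discussed in the cited literature [17,18,29,30,31].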
4. Analyses of Experimental Results
| Modality | Cohen’s d | Effect Size |
|---|---|---|
| Eye blinking rate | 0.1605 | Small |
| EEG | 0.5713 | Medium |
| Subjective evaluation | 1.8675 | Large |
| Facial temperature | 1.6868 | Large |
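With equal numbers of samples before and after, Cohen's d reduces to the absolute mean difference divided by the pooled standard deviation. Plugging in the subjective-evaluation values from the earlier table (1.175 ± 0.272 before vs. 4.113 ± 2.208 after) reproduces the "Large" entry up to input rounding:

```python
import math

def cohens_d(mean1, std1, mean2, std2):
    """Cohen's d using the simple pooled standard deviation
    (equal group sizes assumed)."""
    pooled = math.sqrt((std1 ** 2 + std2 ** 2) / 2)
    return abs(mean1 - mean2) / pooled

# Subjective evaluation, before vs. after the horror movie
print(round(cohens_d(1.175, 0.272, 4.113, 2.208), 4))
# ~1.868 (table value 1.8675; tiny gap comes from the rounded inputs)
```

By the usual thresholds [36] (0.2 small, 0.5 medium, 0.8 large), the subjective evaluation and facial temperature show large effects, EEG a medium effect and eye blinking rate a small one.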
| Modality Pair | Gradient | R² | Correlation |
|---|---|---|---|
| EEG vs. blinking rate | −0.0063 | 0.00004 | −0.0061 |
| EEG vs. facial temperature | 0.5965 | 0.3113 | 0.5579 |
| EEG vs. subjective evaluation | 0.1139 | 0.0085 | 0.0921 |
| Blinking rate vs. facial temperature | 0.1329 | 0.0166 | 0.1289 |
| Blinking rate vs. subjective evaluation | 0.5952 | 0.2491 | 0.4991 |
| Facial temperature vs. subjective evaluation | 0.5765 | 0.2486 | 0.4986 |
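The three columns are the least-squares slope (gradient), the coefficient of determination and the Pearson correlation between the measured changes in two modalities; for a simple linear fit, R² is the square of the Pearson r, which the table confirms (e.g., 0.5579² ≈ 0.3113). A sketch with hypothetical per-subject data:

```python
import numpy as np

def gradient_r2_corr(x, y):
    """Least-squares slope, R^2 and Pearson correlation for two modalities."""
    slope, _intercept = np.polyfit(x, y, 1)
    r = np.corrcoef(x, y)[0, 1]
    return slope, r ** 2, r

# Hypothetical per-subject changes (after - before) in two modalities
eeg_change = np.array([-0.05, -0.17, -0.02, -0.13, -0.09])
ft_change  = np.array([-71.6, -62.9, -68.6, -73.0, -74.6])
print(gradient_r2_corr(eeg_change, ft_change))
```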
| | EEG | Blinking Rate | Facial Temperature | Subjective Evaluation | Sum of Correlations with the Other Modalities |
|---|---|---|---|---|---|
| EEG | 1 | −0.0061 | 0.5579 | 0.0921 | 0.6439 |
| Blinking rate | −0.0061 | 1 | 0.1289 | 0.4991 | 0.6219 |
| Facial temperature | 0.5579 | 0.1289 | 1 | 0.4986 | 1.1854 |
| Subjective evaluation | 0.0921 | 0.4991 | 0.4986 | 1 | 1.0898 |
5. Conclusions
Acknowledgments
Author Contributions
Conflicts of Interest
References
- Kwon, D.-S.; Kwak, Y.K.; Park, J.C.; Chung, M.J.; Jee, E.-S.; Park, K.-S.; Kim, H.-R.; Kim, Y.-M.; Park, J.-C.; Kim, E.H.; et al. Emotion interaction system for a service robot. In Proceedings of the 16th IEEE International Conference on Robot and Human Interactive Communication, Jeju, Korea, 26–29 August 2007; pp. 351–356.
- Machot, F.A.; Mosa, A.H.; Dabbour, K.; Fasih, A.; Schwarzlmüller, C.; Ali, M.; Kyamakya, K. A novel real-time emotion detection system from audio streams based on Bayesian quadratic discriminate classifier for ADAS. In Proceedings of the 3rd International Workshop on Nonlinear Dynamics and Synchronization and 16th International Symposium on Theoretical Electrical Engineering, Klagenfurt, Austria, 25–27 July 2011; pp. 1–5.
- SHORE™. Object and Face Recognition. Available online: http://www.iis.fraunhofer.de/en/ff/bsy/tech/bildanalyse/shore-gesichtsdetektion.html (accessed on 15 March 2015).
- Strupp, S.; Schmitz, N.; Berns, K. Visual-based emotion detection for natural man-machine interaction. Lect. Notes Artif. Intell. 2008, 5243, 356–363. [Google Scholar]
- Sun, Y.; Sebe, N.; Lew, M.S.; Gevers, T. Authentic emotion detection in real-time video. Lect. Notes Comput. Sci. 2004, 3058, 94–104. [Google Scholar]
- Cohen, I.; Sebe, N.; Garg, A.; Chen, L.S.; Huang, T.S. Facial expression recognition from video sequences: Temporal and static modeling. Comput. Vis. Image Underst. 2003, 91, 160–187. [Google Scholar] [CrossRef]
- Pavlidis, I.; Levine, J.; Baukol, P. Thermal image analysis for anxiety detection. In Proceedings of the IEEE International Conference on Image Processing, Thessaloniki, Greece, 7–10 October 2001; pp. 315–318.
- Bedoya-Jaramillo, S.; Belalcazar-Bolaños, E.; Villa-Cañas, T.; Orozco-Arroyave, J.R.; Arias-Londoño, J.D.; Vargas-Bonilla, J.F. Automatic emotion detection in speech using mel frequency cepstral coefficients. In Proceedings of the Symposium of Image, Signal Processing, and Artificial Vision, Antioquia, Colombia, 12–14 September 2012; pp. 62–65.
- Sanchez, M.H.; Tur, G.; Ferrer, L.; Hakkani-Tür, D. Domain adaptation and compensation for emotion detection. In Proceedings of the 11th Annual Conference of the International Speech Communication Association, Makuhari, Japan, 26–30 September 2010; pp. 2874–2877.
- Agrafioti, F.; Hatzinakos, D.; Anderson, A.K. ECG pattern analysis for emotion detection. IEEE Trans. Affect. Comput. 2012, 3, 102–115. [Google Scholar] [CrossRef]
- Lin, Y.-P.; Wang, C.-H.; Wu, T.-L.; Jeng, S.-K.; Chen, J.-H. EEG-based emotion recognition in music listening: A comparison of schemes for multiclass support vector machine. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing, Taipei, Taiwan, 19–24 April 2009; pp. 489–492.
- Eom, J.-S.; Sohn, J.-H. Emotion recognition using facial thermal images. J. Ergon. Soc. Korea 2012, 31, 427–435. [Google Scholar] [CrossRef]
- Baumgartner, T.; Esslen, M.; Jäncke, L. From emotion perception to emotion experience: Emotions evoked by pictures and classical music. Int. J. Psychophysiol. 2006, 60, 34–43. [Google Scholar] [CrossRef] [PubMed]
- Cheng, K.-S.; Chen, Y.-S.; Wang, T. Physiological parameters assessment for emotion recognition. In Proceedings of the IEEE EMBS International Conference on Biomedical Engineering and Sciences, Langkawi, Malaysia, 17–19 December 2012; pp. 995–998.
- Chun, J.; Lee, H.; Park, Y.S.; Park, W.; Park, J.; Han, S.H.; Choi, S.; Kim, G.H. Real-time classification of fear/panic emotion based on physiological signals. In Proceedings of the Eighth Pan-Pacific Conference on Occupational Ergonomics, Bangkok, Thailand, 17–19 October 2007.
- Cheemalapati, S.; Gubanov, M.; Vale, M.D.; Pyayt, A. A real-time classification algorithm for emotion detection using portable EEG. In Proceedings of the IEEE 14th International Conference on Information Reuse and Integration, San Francisco, CA, USA, 14–16 August 2013; pp. 720–723.
- Schutter, D.J.; van Honk, J. Electrophysiological ratio markers for the balance between reward and punishment. Cogn. Brain Res. 2005, 24, 685–690. [Google Scholar] [CrossRef] [PubMed]
- Putman, P.; van Peer, J.; Maimari, I.; van der Werff, S. EEG theta/beta ratio in relation to fear-modulated response-inhibition, attentional control, and affective traits. Biol. Psychol. 2010, 83, 73–78. [Google Scholar] [CrossRef] [PubMed]
- Hermans, E.J.; Ramsey, N.F.; van Honk, J. Exogenous testosterone enhances responsiveness to social threat in the neural circuitry of social aggression in humans. Biol. Psychiatry 2008, 63, 263–270. [Google Scholar] [CrossRef] [PubMed]
- Gazelle. Available online: http://www.ptgrey.com/products/gazelle/gazelle_camera_link.asp (accessed on 15 March 2015).
- SFH 4550. Available online: http://www.jlab.org/accel/inj_group/laser2001/pockels_files/pockels_switch_notebook_files/SFH4550.pdf (accessed on 15 March 2015).
- Ghiass, R.S.; Arandjelović, O.; Bendada, H.; Maldague, X. Infrared face recognition: A literature review. In Proceedings of the International Joint Conference on Neural Networks, Dallas, TX, USA, 4–9 August 2013; pp. 1–10.
- ICI 7320 Pro Specifications. Available online: http://www.infraredcamerasinc.com/Thermal-Cameras/Fix-Mounted-Thermal-Cameras/ICI7320_Pro_fix-mounted_thermal_camera.html (accessed on 15 March 2015).
- Webcam C600. Available online: http://www.logitech.com/en-us/support/5869 (accessed on 15 March 2015).
- Emotiv EPOC. Available online: http://www.emotiv.com/epoc.php (accessed on 15 March 2015).
- Emotiv SDK. Available online: http://innovatec.co.jp/content/etc/ResearchEditionSDK.pdf (accessed on 15 March 2015).
- Bang, J.W.; Heo, H.; Choi, J.-S.; Park, K.R. Assessment of eye fatigue caused by 3D displays based on multimodal measurements. Sensors 2014, 14, 16467–16485. [Google Scholar] [CrossRef] [PubMed]
- Viola, P.; Jones, M.J. Robust real-time face detection. Int. J. Comput. Vis. 2004, 57, 137–154. [Google Scholar] [CrossRef]
- Schutter, D.J.; van Honk, J. Decoupling of midfrontal delta-beta oscillations after testosterone administration. Int. J. Psychophysiol. 2004, 53, 71–73. [Google Scholar] [CrossRef] [PubMed]
- Schutter, D.J.; van Honk, J. Salivary cortisol levels and the coupling of midfrontal delta-beta oscillations. Int. J. Psychophysiol. 2005, 55, 127–129. [Google Scholar] [CrossRef] [PubMed]
- Tortella-Feliu, M.; Morillas-Romero, A.; Balle, M.; Llabrés, J.; Bornas, X.; Putman, P. Spontaneous EEG activity and spontaneous emotion regulation. Int. J. Psychophysiol. 2014, 94, 365–372. [Google Scholar] [CrossRef] [PubMed]
- Shutter (2004 Film). Available online: https://en.wikipedia.org/wiki/Shutter_(2004_film) (accessed on 27 June 2015).
- Silent Hill (Film). Available online: https://en.wikipedia.org/wiki/Silent_Hill_(film) (accessed on 27 June 2015).
- Jaschinski-Kruza, W. On the preferred viewing distances to screen and document at VDU workplaces. Ergonomics 1990, 33, 1055–1063. [Google Scholar]
- Student’s t-Test. Available online: http://en.wikipedia.org/wiki/Student’s_t-test (accessed on 15 March 2015).
- Effect Size. Available online: http://en.wikipedia.org/wiki/Effect_size#Cohen.27s_d (accessed on 15 March 2015).
- Correlation and Dependence. Available online: http://en.wikipedia.org/wiki/Correlation_and_dependence (accessed on 15 March 2015).
- Lang, P.J.; Bradley, M.M.; Cuthbert, B.N. International Affective Picture System (IAPS): Affective Ratings of Pictures and Instruction Manual; Technical Report A-8; University of Florida: Gainesville, FL, USA, 2008. [Google Scholar]
- Cheng, S.-Y.; Hsu, H.-T. Mental Fatigue Measurement Using EEG. In Risk Management Trends; Intech: Rijeka, Croatia, 2011. [Google Scholar]
- Occipital Lobe. Available online: https://en.wikipedia.org/wiki/Occipital_lobe (accessed on 27 June 2015).
- Cho, D.-C.; Kim, W.-Y. Long-range gaze tracking system for large movements. IEEE Trans. Biomed. Eng. 2013, 60, 3432–3440. [Google Scholar] [CrossRef] [PubMed]
- Carbone, A.; Martinez, F.; Pissaloux, E.; Mazeika, D.; Velazquez, R. On the design of a low cost gaze tracker for interaction. Procedia Technol. 2012, 3, 89–96. [Google Scholar] [CrossRef]
- Lebedev, M.A.; Nicolelis, M.A.L. Brain-machine interfaces: Past, present and future. Trends Neurosci. 2006, 29, 536–546. [Google Scholar] [CrossRef] [PubMed]
- Reyes, J.F.; Tosunoglu, S. An overview of brain-computer interface technology applications in robotics. In Proceedings of the Florida Conference on Recent Advances in Robotics, Gainesville, FL, USA, 4–5 May 2011; pp. 1–5.
- Zhang, B.; Wang, J.; Fuhlbrigge, T. A review of the commercial brain-computer interface technology from perspective of industrial robotics. In Proceedings of the IEEE International Conference on Automation and Logistics, Hong Kong and Macau, China, 16–20 August 2010; pp. 379–384.
- Yeom, H.G.; Kim, J.S.; Chung, C.K. Estimation of the velocity and trajectory of three-dimensional reaching movements from non-invasive magnetoencephalography signals. J. Neural Eng. 2013, 10, 1–9. [Google Scholar] [CrossRef] [PubMed]
© 2015 by the authors; licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/4.0/).