eprintid: 6157
rev_number: 2
eprint_status: archive
userid: 1
dir: disk0/00/00/61/57
datestamp: 2023-11-09 16:17:54
lastmod: 2023-11-09 16:17:54
status_changed: 2023-11-09 16:05:04
type: article
metadata_visibility: show
creators_name: Babiker, A.
creators_name: Faye, I.
creators_name: Prehn, K.
creators_name: Malik, A.
title: Machine learning to differentiate between positive and negative emotions using pupil diameter
ispublished: pub
note: cited By 19
abstract: Pupil diameter (PD) has been suggested as a reliable parameter for identifying an individual's emotional state. In this paper, we introduce a machine learning technique to detect and differentiate between positive and negative emotions. We presented 30 participants with positive and negative sound stimuli and recorded their pupillary responses. The results showed a significant increase in pupil dilation during the processing of both negative and positive sound stimuli, with a greater increase for negative stimuli. We also found a more sustained dilation for negative compared to positive stimuli at the end of the trial, which we utilized to differentiate between positive and negative emotions using a machine learning approach, achieving an accuracy of 96.5% with a sensitivity of 97.93% and a specificity of 98%. The obtained results were validated using another dataset, designed for a different study, which was recorded while 30 participants processed word pairs with positive and negative emotions. © 2015 Babiker, Faye, Prehn and Malik.
date: 2015
publisher: Frontiers Media S.A.
official_url: https://www.scopus.com/inward/record.uri?eid=2-s2.0-84954200761&doi=10.3389%2ffpsyg.2015.01921&partnerID=40&md5=3582e3933fd73b7e0ea4c364bafe0d08
id_number: 10.3389/fpsyg.2015.01921
full_text_status: none
publication: Frontiers in Psychology
volume: 6
number: DEC
refereed: TRUE
issn: 1664-1078
citation: Babiker, A. and Faye, I. and Prehn, K. and Malik, A. (2015) Machine learning to differentiate between positive and negative emotions using pupil diameter. Frontiers in Psychology, 6 (DEC). ISSN 1664-1078