This HTML5 document contains 32 embedded RDF statements represented using HTML+Microdata notation.

The embedded RDF content will be recognized by any processor of HTML5 Microdata.
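As an illustration only (this is not the page's actual markup, which may differ), a single statement such as the dcterms:title triple listed below can be embedded in HTML5 Microdata like this:

```html
<!-- Illustrative sketch: absolute-URL itemprop/itemtype values map to RDF
     predicates and classes under the W3C Microdata-to-RDF mapping. -->
<div itemscope
     itemid="https://kar.kent.ac.uk/id/eprint/81490"
     itemtype="http://purl.org/ontology/bibo/AcademicArticle">
  <span itemprop="http://purl.org/dc/terms/title">
    Music Emotion Capture: Ethical issues around emotion-based music generation
  </span>
</div>
```

A Microdata-aware processor would extract from this an rdf:type statement and a dcterms:title statement about the item identified by the itemid.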

Namespace Prefixes

Prefix   IRI
dcterms  http://purl.org/dc/terms/
n2       https://kar.kent.ac.uk/id/eprint/
wdrs     http://www.w3.org/2007/05/powder-s#
dc       http://purl.org/dc/elements/1.1/
n7       http://purl.org/ontology/bibo/status/
n9       https://kar.kent.ac.uk/id/subject/
rdfs     http://www.w3.org/2000/01/rdf-schema#
n6       https://kar.kent.ac.uk/id/eprint/81490#
n12      https://demo.openlinksw.com/about/id/entity/https/raw.githubusercontent.com/annajordanous/CO644Files/main/
n3       http://eprints.org/ontology/
n18      https://kar.kent.ac.uk/id/event/
n15      https://kar.kent.ac.uk/81490/
bibo     http://purl.org/ontology/bibo/
rdf      http://www.w3.org/1999/02/22-rdf-syntax-ns#
n4       https://kar.kent.ac.uk/id/document/
n10      https://kar.kent.ac.uk/id/
xsdh     http://www.w3.org/2001/XMLSchema#
n17      https://demo.openlinksw.com/about/id/entity/https/www.cs.kent.ac.uk/people/staff/akj22/materials/CO644/
n13      https://kar.kent.ac.uk/id/person/

Statements

Subject Item
n2:81490
rdf:type
n3:ConferenceItemEPrint bibo:AcademicArticle n3:EPrint bibo:Article
rdfs:seeAlso
n15:
n3:hasAccepted
n4:3209575
n3:hasDocument
n4:3209575 n4:3209580 n4:3209581 n4:3209582 n4:3209583 n4:3209584
dc:hasVersion
n4:3209575
dcterms:title
Music Emotion Capture: Ethical issues around emotion-based music generation
wdrs:describedby
n12:export_kar_RDFN3.n3 n17:export_kar_RDFN3.n3
dcterms:date
2020-05-15
dcterms:creator
n13:ext-a.k.jordanous@kent.ac.uk n13:ext-c.li@kent.ac.uk n13:ext-grba@kent.ac.uk
bibo:status
n7:nonPeerReviewed n7:published
bibo:abstract
People’s emotions are not always detectable (e.g. if a person has difficulties or lacks skills in expressing emotions, or if people are geographically separated and communicating online). Brain-computer interfaces (BCI) could enhance non-verbal communication of emotion, particularly in detecting and responding to users’ emotions, e.g. in music therapy or interactive software. Our pilot study Music Emotion Capture [1] detects, models and sonifies people’s emotions based on their real-time emotional state, measured by mapping EEG feedback onto a valence-arousal emotional model [2] based on [3]. Though many practical applications emerge, the work raises several ethical questions which need careful consideration. This poster discusses these ethical issues. Are the work’s benefits (e.g. improved user experiences; music therapy; increased emotion-communication abilities; enjoyable applications) important enough to justify navigating the ethical issues that arise (e.g. privacy issues; control of the representation of, and reaction to, users’ emotional state; consequences of detection errors; the loop of using emotion to generate music and music affecting the emotion, with the human in the process as an “intruder”)? [1] Langroudi, G., Jordanous, A., & Li, L. (2018). Music Emotion Capture: emotion-based generation of music using EEG. Emotion Modelling and Detection in Social Media and Online Interaction symposium @ AISB 2018, Liverpool. [2] Paltoglou, G., & Thelwall, M. (2012). Seeing stars of valence and arousal in blog posts. IEEE Transactions on Affective Computing, 4(1). [3] Russell, J.A. (1980). ‘A circumplex model of affect’, Journal of Personality and Social Psychology, 39.
dcterms:isPartOf
n10:repository
dcterms:subject
n9:M1 n9:QA76.76.I59 n9:QA76.87 n9:QA76 n9:Q180.55.M67 n9:Q335
bibo:authorList
n6:authors
bibo:presentedAt
n18:ext-72bdaf553838cff63dcea71d02640b52
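Read together with the prefix table above, the statements about this item can equivalently be written in Turtle (abridged sketch; only a representative subset of properties is shown, and the prefix `ep:` is used here in place of `n3:` for readability):

```turtle
@prefix rdf:     <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs:    <http://www.w3.org/2000/01/rdf-schema#> .
@prefix dc:      <http://purl.org/dc/elements/1.1/> .
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix bibo:    <http://purl.org/ontology/bibo/> .
@prefix ep:      <http://eprints.org/ontology/> .

<https://kar.kent.ac.uk/id/eprint/81490>
    a ep:ConferenceItemEPrint , bibo:AcademicArticle , ep:EPrint , bibo:Article ;
    rdfs:seeAlso <https://kar.kent.ac.uk/81490/> ;
    dcterms:title "Music Emotion Capture: Ethical issues around emotion-based music generation" ;
    dcterms:date "2020-05-15" ;
    ep:hasAccepted <https://kar.kent.ac.uk/id/document/3209575> ;
    dc:hasVersion <https://kar.kent.ac.uk/id/document/3209575> ;
    bibo:status <http://purl.org/ontology/bibo/status/published> ;
    dcterms:isPartOf <https://kar.kent.ac.uk/id/repository> .
```

The full statement set, including the remaining documents, creators, subjects, author list and event, is available in the N3 exports referenced by wdrs:describedby above.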