This HTML5 document contains 34 embedded RDF statements represented using HTML+Microdata notation.

The embedded RDF content can be extracted by any conforming HTML5 Microdata processor.
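
The statements can also be consumed programmatically. As a minimal sketch: rdflib has no built-in HTML+Microdata parser, but the same data is published as an N3 export (listed under wdrs:describedby below). The export URL used here is an assumption, reconstructed from the n4 prefix in the table below, which wraps https://www.cs.kent.ac.uk/people/staff/akj22/materials/CO644/; substitute whichever location actually resolves for you.

    from rdflib import Graph

    # Load the N3 export of the same 34 statements. The URL is assumed,
    # not verified; the raw file location may differ from the OpenLink
    # "about" entity IRI given in the prefix table.
    g = Graph()
    g.parse(
        "https://www.cs.kent.ac.uk/people/staff/akj22/materials/CO644/export_kar_RDFN3.n3",
        format="n3",
    )
    print(len(g))  # number of parsed triples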

Namespace Prefixes

Prefix   IRI
n13      https://kar.kent.ac.uk/66564/
dcterms  http://purl.org/dc/terms/
n2       https://kar.kent.ac.uk/id/eprint/
wdrs     http://www.w3.org/2007/05/powder-s#
dc       http://purl.org/dc/elements/1.1/
n14      http://purl.org/ontology/bibo/status/
rdfs     http://www.w3.org/2000/01/rdf-schema#
n10      https://kar.kent.ac.uk/id/subject/
n18      https://demo.openlinksw.com/about/id/entity/https/raw.githubusercontent.com/annajordanous/CO644Files/main/
n8       http://eprints.org/ontology/
n16      https://kar.kent.ac.uk/id/event/
bibo     http://purl.org/ontology/bibo/
n17      https://kar.kent.ac.uk/id/eprint/66564#
rdf      http://www.w3.org/1999/02/22-rdf-syntax-ns#
n7       https://kar.kent.ac.uk/id/
n11      https://kar.kent.ac.uk/id/document/
xsdh     http://www.w3.org/2001/XMLSchema#
n4       https://demo.openlinksw.com/about/id/entity/https/www.cs.kent.ac.uk/people/staff/akj22/materials/CO644/
n6       https://kar.kent.ac.uk/id/person/
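
Each prefixed name in the statements below abbreviates a full IRI: the prefix is replaced by the IRI it maps to in the table above. A minimal sketch of that expansion using rdflib's Namespace helper; the two prefixes shown are taken from the table, and the rest expand the same way:

    from rdflib import Namespace

    # Expansion is plain string concatenation of the namespace IRI
    # and the local name.
    n8 = Namespace("http://eprints.org/ontology/")
    n2 = Namespace("https://kar.kent.ac.uk/id/eprint/")

    print(n8.EPrint)    # http://eprints.org/ontology/EPrint
    print(n2["66564"])  # https://kar.kent.ac.uk/id/eprint/66564 (the subject below)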

Statements

Subject Item
n2:66564
rdf:type
    n8:ConferenceItemEPrint, bibo:Article, bibo:AcademicArticle, n8:EPrint
rdfs:seeAlso
    n13:
n8:hasAccepted
    n11:978349
n8:hasDocument
    n11:2773913, n11:978349, n11:978422, n11:2773910, n11:2773911, n11:2773912
dc:hasVersion
    n11:978349
dcterms:title
    Music Emotion Capture: sonifying emotions in EEG data
wdrs:describedby
    n4:export_kar_RDFN3.n3, n18:export_kar_RDFN3.n3
dcterms:date
    2018-08-21
dcterms:creator
    n6:ext-a.k.jordanous@kent.ac.uk, n6:ext-c.li@kent.ac.uk, n6:ext-grba@kent.ac.uk
bibo:status
    n14:peerReviewed, n14:published
bibo:abstract
    People’s emotions are not always obviously detectable, due to difficulties expressing emotions, or geographic distance (e.g. if people are communicating online). There are also many occasions where it would be useful for a computer to be able to detect users’ emotions and respond to them appropriately. A person’s brain activity gives vital clues as to emotions they are experiencing at any one time. The aim of this project is to detect, model and sonify people’s emotions. To achieve this, there are two tasks: (1) to detect emotions based on current brain activity as measured by an EEG device; (2) to play appropriate music in real-time, representing the current emotional state of the user. Here we report a pilot study implementing the Music Emotion Capture system. In future work we plan to improve how this project performs emotion detection through EEG, and to generate new music based on emotion-based characteristics of music. Potential applications arise in collaborative/assistive software and brain-computer interfaces for non-verbal communication.
dcterms:isPartOf
    n7:repository
dcterms:subject
    n10:QA76.76, n10:QA76.76.I59, n10:QA76.87, n10:QA76.9.H85, n10:BF, n10:QA76, n10:QA76.575, n10:Q335
bibo:authorList
    n17:authors
bibo:presentedAt
    n16:ext-759090e3e53e67fc962a7d8b5164b2c0
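
Once the graph from the first sketch is loaded, the statements above can be queried with SPARQL. A hedged example reusing the graph g parsed near the top of this page; the variable names are illustrative:

    # Query the parsed graph for this eprint's title and creators.
    results = g.query("""
        PREFIX dcterms: <http://purl.org/dc/terms/>
        SELECT ?title ?creator WHERE {
            <https://kar.kent.ac.uk/id/eprint/66564>
                dcterms:title   ?title ;
                dcterms:creator ?creator .
        }
    """)
    for row in results:
        print(row.title, row.creator)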