This HTML5 document contains 40 embedded RDF statements represented using HTML+Microdata notation.

The embedded RDF content will be recognized by any HTML5 Microdata processor.
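
For example, the embedded items can be pulled out of the page with a generic Microdata extractor. A minimal sketch in Python, assuming the third-party extruct and requests packages and that the record IRI given by n2:84439 below is dereferenceable:

    import extruct
    import requests

    # Record IRI assumed from the n2 prefix and subject listed below.
    url = "https://kar.kent.ac.uk/id/eprint/84439"
    html = requests.get(url).text

    # Extract only the Microdata syntax; JSON-LD, RDFa, etc. are ignored here.
    data = extruct.extract(html, base_url=url, syntaxes=["microdata"])
    for item in data["microdata"]:
        print(item.get("type"), list(item.get("properties", {})))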

Namespace Prefixes

Prefix     IRI
n22        doi:10.1007/
dcterms    http://purl.org/dc/terms/
n2         https://kar.kent.ac.uk/id/eprint/
wdrs       http://www.w3.org/2007/05/powder-s#
n11        http://purl.org/ontology/bibo/status/
dc         http://purl.org/dc/elements/1.1/
n16        https://kar.kent.ac.uk/84439/
n20        https://kar.kent.ac.uk/id/subject/
rdfs       http://www.w3.org/2000/01/rdf-schema#
n19        https://demo.openlinksw.com/about/id/entity/https/raw.githubusercontent.com/annajordanous/CO644Files/main/
n9         http://eprints.org/ontology/
n13        https://kar.kent.ac.uk/id/event/
n7         http://www.loc.gov/loc.terms/relators/
bibo       http://purl.org/ontology/bibo/
n14        https://kar.kent.ac.uk/id/org/
rdf        http://www.w3.org/1999/02/22-rdf-syntax-ns#
owl        http://www.w3.org/2002/07/owl#
n6         https://kar.kent.ac.uk/id/eprint/84439#
n4         https://kar.kent.ac.uk/id/document/
n12        https://kar.kent.ac.uk/id/
xsdh       http://www.w3.org/2001/XMLSchema#
n18        https://demo.openlinksw.com/about/id/entity/https/www.cs.kent.ac.uk/people/staff/akj22/materials/CO644/
n8         https://kar.kent.ac.uk/id/person/
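
Each prefix above abbreviates a full IRI: a CURIE such as n2:84439 or dcterms:title is the prefix's IRI concatenated with the local name. A minimal sketch of that expansion using rdflib's Namespace helper (the rdflib package is an assumption, not something this page depends on):

    from rdflib import Namespace

    # Two of the prefixes from the table above.
    N2 = Namespace("https://kar.kent.ac.uk/id/eprint/")
    DCTERMS = Namespace("http://purl.org/dc/terms/")

    print(N2["84439"])    # -> https://kar.kent.ac.uk/id/eprint/84439
    print(DCTERMS.title)  # -> http://purl.org/dc/terms/title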

Statements

Subject Item
n2:84439

rdf:type
    bibo:BookSection, n9:EPrint, n9:BookSectionEPrint, bibo:Article
rdfs:seeAlso
    n16:
owl:sameAs
    n22:978-3-030-68793-9_15
n7:EDT
    n8:ext-008d056af499b4d5a3f6d73074042031, n8:ext-d66505250a3f4a768a86f72c5955b730, n8:ext-8eb613866c8dff75303877f8e744f60d, n8:ext-95b119be19949624b4f4fdad0b00418d, n8:ext-b04223a0c649ae0cda32514b29da2953, n8:ext-5de38aa9e0214bc0802b507a50287923, n8:ext-4e30934329d58959704d039ff27919e1, n8:ext-1f3286b05c8da4287bdfb40afc454556
n9:hasAccepted
    n4:3221799
n9:hasDocument
    n4:3221805, n4:3221806, n4:3221807, n4:3221808, n4:3221799, n4:3221804
dc:hasVersion
    n4:3221799
dcterms:title
    Adapting to Movement Patterns for Face Recognition on Mobile Devices
wdrs:describedby
    n18:export_kar_RDFN3.n3, n19:export_kar_RDFN3.n3
dcterms:date
    2021-02-21
dcterms:creator
    n8:ext-r.m.guest@kent.ac.uk, n8:ext-f.deravi@kent.ac.uk, n8:ext-mjb228@kent.ac.uk
bibo:status
    n11:peerReviewed, n11:published
dcterms:publisher
    n14:ext-6c8b7c40a5167b142d7fb1354cd46407
bibo:abstract
    Facial recognition is becoming an increasingly popular way to authenticate users, helped by the increased use of biometric technology within mobile devices, such as smartphones and tablets. Biometric systems use thresholds to identify whether a user is genuine or an impostor. Traditional biometric systems are static (such as eGates at airports), which allow the operators and developers to create an environment most suited for the successful operation of the biometric technology by using a fixed threshold value to determine the authenticity of the user. However, with a mobile device and scenario, the operational conditions are beyond the control of the developers and operators. In this paper, we propose a novel approach to mobile biometric authentication within a mobile scenario, by offering an adaptive threshold to authenticate users based on the environment, situations and conditions in which they are operating the device. Utilising smartphone sensors, we demonstrate the creation of a successful scenario classification. Using this, we propose our idea of an extendable framework to allow multiple scenario thresholds. Furthermore, we test the concept with data collected from a smartphone device. Results show that using an adaptive scenario threshold approach can improve the biometric performance, and hence could allow manufacturers to produce algorithms that perform consistently in multiple scenarios without compromising security, allowing an increase in public trust towards the use of the technology.
dcterms:isPartOf
    n12:repository
dcterms:subject
    n20:QA, n20:TK7882.B56
bibo:authorList
    n6:authors
bibo:editorList
    n6:editors
bibo:presentedAt
    n13:ext-30beb31dd2583f989a6b44f02c67ca98
bibo:volume
    12668
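
Once loaded into an RDF graph, the statements above can be queried directly. A minimal sketch with rdflib, assuming the N3 export named in the wdrs:describedby values (export_kar_RDFN3.n3) has been saved locally:

    from rdflib import Graph

    g = Graph()
    # The N3 export referenced by wdrs:describedby above, assumed saved locally.
    g.parse("export_kar_RDFN3.n3", format="n3")

    query = """
    PREFIX dcterms: <http://purl.org/dc/terms/>
    PREFIX bibo:    <http://purl.org/ontology/bibo/>

    SELECT ?title ?date ?volume WHERE {
        <https://kar.kent.ac.uk/id/eprint/84439>
            dcterms:title ?title ;
            dcterms:date  ?date ;
            bibo:volume   ?volume .
    }
    """
    for row in g.query(query):
        print(row.title, row.date, row.volume)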