Effects of Facial Movements to Expressive Speech Productions: A Computational Study

Conference proceedings article






Publication Details

Author list: Prom-On S., Onsri M.

Publisher: Hindawi

Publication year: 2019

Start page: 481

End page: 484

Number of pages: 4

ISBN: 9781728101101

ISSN: 0146-9428

eISSN: 1745-4557

URL: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85083257598&doi=10.1109%2fICKII46306.2019.9042701&partnerID=40&md5=8cc86c6fb1834107858c8ef5a2cc473b

Languages: English-Great Britain (EN-GB)




Abstract

This paper presents a computational study of the relation between visual facial movements and acoustic features. An audio-visual corpus of expressive speech production was collected for the study. The stimuli consist of 4 facial expressions and 4 Thai sentences, 16 combinations in total. Video and audio data were captured from 10 native Thai speakers, each of whom pronounced the sentences with the specified expressions, yielding 160 audio-visual tracks for analysis. Facial features were extracted and tracked using visual markers throughout the pronunciation. At the same time, acoustic data, particularly the fundamental frequency (F0), were tracked and synchronized with the facial data. Computational analysis of the landmark and dynamic features of both the visual and the audio data was performed. The result provides templates of expressive facial movements together with the corresponding acoustic adjustments. © 2019 IEEE.
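The synchronization step described in the abstract, aligning an F0 contour (sampled at the audio analysis rate) with marker trajectories (sampled at the video frame rate), can be sketched as below. The sampling rates, toy values, and linear-interpolation scheme are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: resample an F0 contour onto the video frame times
# so audio and facial-marker data share one time axis. All rates and
# values here are toy assumptions for illustration.

def resample(times, values, target_times):
    """Linearly interpolate (times, values) at each target time."""
    out = []
    for t in target_times:
        if t <= times[0]:
            out.append(values[0])        # clamp before the first sample
        elif t >= times[-1]:
            out.append(values[-1])       # clamp after the last sample
        else:
            # find the first sample at or beyond t, then interpolate
            i = next(k for k in range(1, len(times)) if times[k] >= t)
            t0, t1 = times[i - 1], times[i]
            v0, v1 = values[i - 1], values[i]
            out.append(v0 + (v1 - v0) * (t - t0) / (t1 - t0))
    return out

# F0 sampled every 10 ms (100 Hz); a marker sampled at 25 fps (40 ms)
f0_times = [i * 0.01 for i in range(11)]      # 0.00 .. 0.10 s
f0_vals = [120 + i for i in range(11)]        # toy rising contour (Hz)
marker_times = [i * 0.04 for i in range(3)]   # 0.00, 0.04, 0.08 s

# Synchronize: bring F0 onto the coarser video time axis
f0_at_frames = resample(f0_times, f0_vals, marker_times)
print(f0_at_frames)
```

In practice a library routine such as `numpy.interp` would replace the hand-rolled interpolation; the clamping at both ends mirrors its default boundary behavior.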


Keywords

Audio-synchronization; Facial expression; Facial movement; Speech production

