Effects of Facial Movements to Expressive Speech Productions: A Computational Study
Conference proceedings article
Publication Details
Author list: Prom-On S., Onsri M.
Publisher: IEEE
Publication year: 2019
Start page: 481
End page: 484
Number of pages: 4
ISBN: 9781728101101
ISSN: 0146-9428
eISSN: 1745-4557
Languages: English-Great Britain (EN-GB)
Abstract
This paper presents a computational study of the relation between visual facial movements and acoustic features. An audio-visual corpus of expressive speech production was collected for the study. The stimuli consisted of 4 facial expressions and 4 Thai sentences, giving 16 combinations in total. Video and audio data were captured from 10 native Thai speakers, each of whom pronounced every sentence with the specified expression, yielding 160 audio-visual tracks for analysis. Facial features were extracted and tracked using visual markers throughout each utterance. At the same time, acoustic data, particularly the fundamental frequency (F0), were tracked and synchronized with the facial data. Computational analyses of the landmark and dynamic features of both the visual and audio data were performed. The results provide templates of expressive facial movements together with the corresponding acoustic adjustments. © 2019 IEEE.
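The abstract describes synchronizing an F0 contour with tracked facial-marker trajectories before computing landmark and dynamic features. The sketch below is not the authors' code; it is a minimal illustration, under assumed details (file name, 12 markers, 60 fps video, pYIN pitch tracking via librosa), of how such audio-visual alignment and basic feature extraction could be done.

```python
"""Hedged sketch: align F0 frames with facial-marker frames and compute
simple landmark/dynamic features. All names and parameters are assumptions
for illustration, not details taken from the paper."""
import numpy as np
import librosa

AUDIO_PATH = "speaker01_sentence1_happy.wav"   # hypothetical recording
MARKER_FPS = 60.0                              # assumed video frame rate

# Acoustic side: fundamental frequency (F0) contour from the audio track.
y, sr = librosa.load(AUDIO_PATH, sr=None)
f0, voiced_flag, _ = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
)
f0_times = librosa.times_like(f0, sr=sr)       # time stamp of each F0 frame

# Visual side: marker trajectories (n_frames x n_markers x 2).
# Random data stands in for the marker-tracking output described in the abstract.
n_frames = int(np.ceil(f0_times[-1] * MARKER_FPS))
markers = np.random.rand(n_frames, 12, 2)      # 12 hypothetical face markers
marker_times = np.arange(n_frames) / MARKER_FPS

# Synchronization: interpolate each marker coordinate at the F0 frame times.
synced = np.empty((len(f0_times), markers.shape[1], 2))
for m in range(markers.shape[1]):
    for axis in range(2):
        synced[:, m, axis] = np.interp(f0_times, marker_times, markers[:, m, axis])

# Simple features on the synchronized streams: a landmark statistic of F0
# and a frame-to-frame marker velocity as a dynamic facial feature.
voiced = voiced_flag.astype(bool)
f0_mean = np.nanmean(f0[voiced])
marker_velocity = np.diff(synced, axis=0)
print(f"Mean voiced F0: {f0_mean:.1f} Hz, velocity frames: {marker_velocity.shape[0]}")
```

In practice the marker array would come from the visual tracking step, and per-expression templates could then be built by averaging the synchronized feature trajectories across speakers and sentences.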
Keywords
Audio-synchronization, Facial expression, Facial movement, Speech production