
Three papers from NONMANUAL at LREC-SL 2024

Figure: Visualization of one headshake in NGT measured with OpenFace, with peaks detected. Photo: Vadim Kimmelman.


In May 2024, the 11th Workshop on the Representation and Processing of Sign Languages: Evaluation of Sign Language Resources (LREC-SL 2024) took place in Torino, Italy. It is an important venue for computational and general linguists working on sign languages.

The NONMANUAL project was represented by three poster presentations.

  • Kimmelman, V., M. Oomen & R. Pfau (2024). Headshakes in NGT: Relation between Phonetic Properties & Linguistic Functions. Proceedings of LREC-SL 2024.

We use OpenFace to measure head rotation during headshakes expressing negation in NGT (Sign Language of the Netherlands). We find that some of the phonetic/kinetic measures of headshakes correlate with their linguistic functions.
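
As an illustration of this kind of pipeline, here is a minimal sketch of extracting a headshake's kinetic properties from OpenFace per-frame output. The column name pose_Ry (head yaw in radians) follows OpenFace 2.x output conventions; the file name, frame rate, and peak-detection settings are illustrative assumptions, not the settings used in the paper.

```python
# Minimal sketch: headshake measurements from an OpenFace output CSV.
# "headshake_clip.csv" and the 25 fps frame rate are assumed for illustration.
import pandas as pd
from scipy.signal import find_peaks

df = pd.read_csv("headshake_clip.csv", skipinitialspace=True)
yaw = df["pose_Ry"].to_numpy()   # left-right head rotation per frame (radians)
fps = 25                         # assumed video frame rate

# Each peak/trough pair in the yaw signal is one left-right headshake cycle.
peaks, _ = find_peaks(yaw, prominence=0.05)    # illustrative prominence value
troughs, _ = find_peaks(-yaw, prominence=0.05)

n_cycles = min(len(peaks), len(troughs))
duration = len(yaw) / fps
amplitude = yaw[peaks].mean() - yaw[troughs].mean() if n_cycles else 0.0
print(f"{n_cycles} cycles over {duration:.2f} s, "
      f"mean peak-to-trough amplitude {amplitude:.3f} rad")
```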

  • Kimmelman, V., A. Price, J. Safar, C. de Vos & J. Bulla (2024). Nonmanual Marking of Questions in Balinese Homesign Interactions: a Computer-Vision Assisted Analysis. Proceedings of LREC-SL 2024.

We look at nonmanual marking of questions in five deaf homesigners from Bali. Polar and non-polar questions turn out to be marked by opposite directions of head movement, and this pattern is consistent across homesigners. The analysis is based on a combination of manual annotation and measurements of head tilt (pitch) extracted with OpenFace.
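
A hedged sketch of how such a comparison could be run, assuming OpenFace per-frame output plus a hypothetical annotation table (question_spans.csv with start, end, and qtype columns; these names are our own, not the paper's):

```python
# Sketch: compare head pitch (pose_Rx) between polar and non-polar questions.
import pandas as pd

pose = pd.read_csv("homesign_clip.csv", skipinitialspace=True)  # OpenFace output
spans = pd.read_csv("question_spans.csv")  # hypothetical: start, end, qtype

# Mean pitch per question span, baseline-corrected against the clip mean,
# so the sign of the value indicates the direction of the head tilt.
baseline = pose["pose_Rx"].mean()
spans["mean_pitch"] = [
    pose.loc[row.start:row.end, "pose_Rx"].mean() - baseline
    for row in spans.itertuples()
]

# If the two question types use opposite head movements, the groups
# should fall on opposite sides of zero.
print(spans.groupby("qtype")["mean_pitch"].describe())
```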

  • Susman, M. & V. Kimmelman (2024). Eye Blink Detection in Sign Language Data Using CNNs and Rule-Based Methods. Proceedings of LREC-SL 2024.

Eye blinks are important prosodic markers across sign languages, but cross-linguistic research on these markers is almost non-existent. To enable cross-linguistic comparison, we develop and test two methods for automatic detection of eye blinks, one based on CNNs and one rule-based. Both methods produce promising results.
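
As one example of what a rule-based detector can look like (not necessarily the algorithm in the paper), the sketch below thresholds OpenFace's AU45_r blink-intensity output and keeps only eye-closure runs of plausible blink duration; the threshold and duration bounds are illustrative guesses.

```python
# Sketch: rule-based blink detection from OpenFace AU45_r (eye-closure intensity).
import pandas as pd

df = pd.read_csv("signer_clip.csv", skipinitialspace=True)  # hypothetical file
closed = df["AU45_r"].to_numpy() > 1.0   # illustrative intensity threshold

blinks, start = [], None
for i, c in enumerate(closed):
    if c and start is None:
        start = i                         # a closure run begins
    elif not c and start is not None:
        if 2 <= i - start <= 15:          # 2-15 frames ~ 80-600 ms at 25 fps
            blinks.append((start, i))     # plausible blink duration: keep it
        start = None

print(f"{len(blinks)} blink(s) detected: {blinks}")
```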

The three studies are published in the proceedings of the workshop and are available in open access. Follow the links above to read the papers!