The pattern of change in voice fundamental frequency during intervocalic syllables can serve as a biomarker of dysregulated laryngeal muscle tension. Methods to identify this pattern, termed relative fundamental frequency (RFF), require manual intervention, which poses a challenge for clinical translation. We show that acoustic-driven machine learning and signal processing techniques can automate RFF estimation without sacrificing accuracy; a minimal sketch of the underlying measure follows the citation below.
Gill, A., Raiff, L., Kirchgessner, E., Stepp, C.E., Kline, J.C., & Vojtech, J.M. “Validation of an automated relative fundamental frequency analysis for clinical voice evaluation,” The 15th International Conference on Advances in Quantitative Laryngology, Voice and Speech Research, Phoenix, AZ, USA, March 30–April 1, 2023. [poster presentation]
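For context, RFF expresses the fundamental frequency of each voicing cycle adjacent to a voiceless consonant in semitones relative to a steady-state reference cycle. Below is a minimal illustrative sketch of that normalization in Python, assuming cycle-level F0 estimates have already been extracted; the function name and example values are hypothetical and do not reproduce the automated pipeline validated in the study.

```python
import numpy as np

def rff_semitones(cycle_f0_hz, reference_index):
    """Express cycle-level F0 estimates (Hz) in semitones relative to a
    steady-state reference cycle, yielding relative fundamental frequency."""
    f0 = np.asarray(cycle_f0_hz, dtype=float)
    return 12.0 * np.log2(f0 / f0[reference_index])

# Hypothetical F0 values for ten voicing-offset cycles preceding a voiceless
# consonant, referenced to the first (steady-state) cycle.
offset_f0 = [200.0, 199.4, 198.9, 198.1, 197.2, 195.8, 194.0, 191.5, 188.0, 183.2]
print(np.round(rff_semitones(offset_f0, reference_index=0), 2))
```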
This study presents an evaluation of ability-based design methods extended to keyboard generation for augmentative and alternative communication (AAC) in people with dexterity impairments due to motor disabilities. We highlight key observations about the heterogeneous manifestation of motor disabilities, the perceived importance of communication technology, and the quantitative gains in communication performance achieved by characterizing an individual's movement abilities to design personalized AAC interfaces.
Mitchell, C.M., Cler, G.J., Fager, S.K., Contessa, P., Roy, S.H., De Luca, G., Kline, J.C., & Vojtech, J.M. “Ability-based Keyboards for Augmentative and Alternative Communication: Understanding How Individuals’ Movement Patterns Translate to More Efficient Keyboards,” In CHI Conference on Human Factors in Computing Systems Extended Abstracts (CHI EA '22). Association for Computing Machinery, New York, NY, USA, Article 412, 1–7, 2022. doi: 10.1145/3491101.3519845
This study evaluated two head control-based access methods that could be used for 2-D cursor control of an augmentative and alternative communication system: one based on surface electromyography and accelerometry, and one based on computer vision technology; a toy sketch of the first approach follows the citation below.
Vojtech, J.M., Hablani, S., Cler, G.J., & Stepp, C.E., “Integrated head-tilt & surface electromyographic cursor control for augmentative and alternative communication,” Conference on Motor Speech, Santa Barbara, CA, USA, February 19–23, 2020.
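To make the sEMG-and-accelerometry approach concrete, the sketch below shows one way a gravity-referenced accelerometer reading could be mapped to 2-D cursor velocity, with an sEMG amplitude threshold serving as a click. The mapping, gains, deadzone, and threshold are illustrative assumptions rather than the implementation evaluated in the study.

```python
import numpy as np

def head_tilt_to_velocity(accel_xyz, gain=8.0, deadzone_deg=3.0):
    """Map a 3-axis accelerometer reading (gravity vector, in g) to 2-D cursor
    velocity from head pitch and roll. Tilts within the deadzone produce no
    movement; beyond it, velocity scales linearly with tilt angle."""
    ax, ay, az = accel_xyz
    pitch = np.degrees(np.arctan2(ax, np.hypot(ay, az)))  # forward/back tilt
    roll = np.degrees(np.arctan2(ay, az))                 # left/right tilt

    def scale(angle):
        if abs(angle) < deadzone_deg:
            return 0.0
        return gain * (angle - np.sign(angle) * deadzone_deg)

    return scale(roll), scale(pitch)  # (vx, vy) in pixels per update

def semg_click(emg_window, baseline_rms, k=3.0):
    """Flag a 'click' when the RMS of an sEMG window exceeds k times baseline."""
    rms = np.sqrt(np.mean(np.square(emg_window)))
    return rms > k * baseline_rms

# Example: head tilted slightly right and forward, with an sEMG burst above baseline.
vx, vy = head_tilt_to_velocity((0.12, 0.10, 0.99))
clicked = semg_click(np.array([4.0, 5.2, 6.1, 5.5]), baseline_rms=1.2)
print(vx, vy, clicked)
```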
Surface electromyography (sEMG) is a promising computer access method for individuals with motor impairments. However, optimal sensor placement is a tedious, trial-and-error task requiring an expert, particularly when recording from facial musculature likely to be spared in individuals with neurological impairments. Here, we demonstrate that non-experts can place sEMG sensors in the vicinity of usable muscle sites for computer access and that healthy individuals can learn to efficiently control a human–machine interface.
Vojtech, J.M., Noordzij Jr., J.P., Cler, G.J., & Stepp, C.E. “Effects of prosody on the intelligibility, communication efficiency, and perceived naturalness of synthetic speech in augmentative and alternative communication,” Conference on Motor Speech, Savannah, GA, USA, February 22–25, 2018.
We sought to improve the clinical applicability of surface electromyography as a computer access method for individuals with motor speech disorders by reducing the complexity of sensor placement. We describe difficulties in predicting user performance, likely resulting from the diverse manifestation of motor speech disorders.
Vojtech, J.M., Cler, G.J., Fager, S., & Stepp, C.E. “Predicting optimal surface electromyographic control of communication devices in individuals with motor speech disorders,” Conference on Motor Speech, Savannah, GA, USA, February 22–25, 2018.
This work identifies quantitative features that predict sensor configurations on the face for improved electromyographic cursor control.
Vojtech, J.M., Cler, G.J., Noordzij Jr., J.P., & Stepp, C.E. “Evaluation of facial electromyographic sensor configuration for optimizing human-machine interface control,” Boston Speech Motor Control Mini Symposium, Boston University, Boston, MA, USA, March 31, 2017.