A Fast Method for High-Resolution Voiced/Unvoiced Detection and GCI/GOI Estimation of Speech

Abstract: We propose a fast speech analysis method that simultaneously performs high-resolution voiced/unvoiced detection (VUD) and accurate estimation of glottal closure and glottal opening instants (GCIs and GOIs, respectively). The proposed algorithm exploits the structure of the glottal flow derivative to estimate GCIs and GOIs only in voiced speech, using simple time-domain criteria. We compare our method with well-known GCI/GOI methods, namely the dynamic programming projected phase-slope algorithm (DYPSA), the yet another GCI/GOI algorithm (YAGA), and the speech event detection using the residual excitation and a mean-based signal (SEDREAMS). Furthermore, we examine the performance of these methods when combined with state-of-the-art VUD algorithms, namely the robust algorithm for pitch tracking (RAPT) and the summation of residual harmonics (SRH). Experiments conducted on the APLAWD and SAM databases show that, for clean speech, the proposed algorithm outperforms the state-of-the-art combinations of VUD and GCI/GOI algorithms with respect to almost all evaluation criteria. Experiments on speech contaminated with several noise types (white Gaussian, babble, and car interior) are also presented and discussed; for signal-to-noise ratios greater than 10 dB, the proposed algorithm outperforms the state-of-the-art combinations in most evaluation criteria.
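
To make the core idea concrete, the following is a minimal MATLAB sketch of the general approach the abstract describes: approximate the glottal flow derivative by linear-prediction inverse filtering and pick its strongest negative excursions as GCI candidates with a simple time-domain criterion. This is an illustration only, not the published GEFBA implementation; the file name, LPC order, and thresholds are assumptions.

    % Minimal illustrative sketch (not the published GEFBA code): approximate
    % the glottal flow derivative by LPC inverse filtering and pick its
    % strongest negative excursions as GCI candidates. Requires the MATLAB
    % Signal Processing Toolbox (lpc, findpeaks).
    [s, fs] = audioread('voiced_segment.wav');   % any short voiced segment (assumed file)
    s = s(:, 1);                                 % first channel only

    p = round(fs/1000) + 2;                      % common rule of thumb for LPC order
    a = lpc(s .* hamming(length(s)), p);         % all-pole vocal-tract estimate
    u = filter(a, 1, s);                         % LP residual ~ glottal flow derivative

    % GCIs roughly coincide with the large negative peaks of the glottal flow
    % derivative; enforce a minimum spacing of 2 ms (pitch ceiling ~500 Hz)
    % and an (assumed) amplitude threshold relative to the strongest peak.
    [~, gci] = findpeaks(-u, 'MinPeakDistance', round(0.002*fs), ...
                             'MinPeakHeight', 0.3*max(-u));
    fprintf('Found %d GCI candidates\n', numel(gci));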

Related publications

  1. A Fast Method for High-Resolution Voiced/Unvoiced Detection and Glottal Closure/Opening Instant Estimation of Speech
    A. I. Koutrouvelis, G. P. Kafentzis, N. D. Gaubitch, and R. Heusdens,
    IEEE/ACM Transactions on Audio, Speech, and Language Processing,
    Volume 24, Issue 2, pp. 316-328, February 2016. MATLAB code available from IEEE Xplore. DOI: 10.1109/TASLP.2015.2506263

Repository data

File: GEFBA.zip
Size: 459 kB
Modified: 25 January 2018
Type: software
Authors: Andreas Koutrouvelis, Richard Heusdens, Nikolay Gaubitch
Date: December 2015
Contact: Richard Heusdens