Reverberation
Computer science
Kronecker product
Linear prediction
Kronecker delta
Filter (signal processing)
Speech recognition
Mean squared prediction error
Intelligibility (philosophy)
Computational complexity theory
Algorithm
Product (mathematics)
Adaptive filter
Mathematics
Acoustics
Philosophy
Physics
Geometry
Epistemology
Quantum mechanics
Computer vision
Authors
Gongping Huang,Jacob Benesty,Israel Cohen,Jingdong Chen
Identifier
DOI:10.1109/taslp.2022.3161150
Abstract
Reverberation, which is caused by late reflections, impairs not only speech quality but also intelligibility. Consequently, dereverberation, a process to mitigate the impact of reverberation, has attracted significant research interest. Numerous approaches have been developed in the literature, among which the weighted-prediction-error (WPE) one has demonstrated promising potential for reducing or eliminating reverberation. The WPE method has been well studied and several variants have been developed. The adaptive one, called the adaptive WPE (AWPE) method, has been widely investigated for use in real applications as it can deal with reverberation in time-varying acoustic environments. However, the computational complexity of AWPE is high, which may be a problem for its implementation in real-time systems. This paper presents some new insights into AWPE-based speech dereverberation by introducing the concepts of Kronecker product and partially time-varying filtering. It then develops two algorithms for dereverberation with lower complexity than AWPE. The significant contributions of this work are as follows. First, we propose a Kronecker product filtering framework for speech dereverberation, where the linear prediction filter is formulated as the Kronecker product of two sets of shorter filters. Second, we propose a partially time-varying Kronecker product filter for dereverberation. Instead of estimating the entire linear prediction filter as in the conventional method, the proposed one only needs to update part of the filter. The proposed approaches can significantly reduce the computational complexity without sacrificing dereverberation performance as compared to AWPE. Simulation results validate the theoretical analysis and justify the advantages of the new methods.
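The key idea behind the complexity reduction, a long filter constrained to be the Kronecker product of two shorter filters, can be sketched numerically. The snippet below is a minimal illustration of that parameter-count argument only; the filter names, lengths, and random coefficients are illustrative assumptions, not the paper's actual AWPE algorithm.

```python
import numpy as np

# Illustrative lengths (assumed, not from the paper): a full prediction
# filter of L = L1 * L2 taps factored as h = kron(h1, h2).
L1, L2 = 4, 8
rng = np.random.default_rng(0)
h1 = rng.standard_normal(L1)  # short filter, L1 coefficients
h2 = rng.standard_normal(L2)  # short filter, L2 coefficients

# The full-length filter is recovered by the Kronecker product.
h = np.kron(h1, h2)
assert h.shape == (L1 * L2,)

# Estimating h1 and h2 requires L1 + L2 coefficients instead of
# L1 * L2 for the unconstrained filter -- the source of the
# complexity reduction when L1 and L2 are both large.
print("factored parameters:", L1 + L2)       # 12
print("unconstrained parameters:", L1 * L2)  # 32
```

In an adaptive setting, the partially time-varying variant the abstract describes would update only one of the two factors at each step, shrinking the per-frame update cost further; the sketch above shows only the static decomposition.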