Abstract:
High-resolution algorithms are very attractive in the field of array signal processing. However, they are rarely applied in practice because of the performance degradation caused by practical array imperfections such as gain and phase errors. To solve this problem, a new subspace-based calibration method using signal sources is proposed in this paper. By modeling the gain and phase errors of an array, the array calibration problem can be transformed into a parameter estimation problem, which is readily solved by the classical Lagrange multiplier method. The proposed calibration method requires only one signal source with a known direction. Compared with existing auto-calibration methods, it has much lower computational complexity and requires no iteration. More importantly, the method can be applied to arbitrary arrays with known sensor positions, making it well suited to calibrating a real-world array before deployment. To demonstrate the performance of the proposed method, an experiment is conducted in a semi-anechoic room to measure and calibrate a sixteen-sensor circular acoustic array. The results show that the gain and phase errors can be estimated accurately by the proposed method. Moreover, the performance of the Minimum Variance Distortionless Response (MVDR) beamformer improves substantially after calibration, which further confirms the correctness and validity of the method.
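
To make the single-source idea concrete, the sketch below illustrates the standard subspace (principal-eigenvector) route to gain/phase estimation with one calibration source at a known direction: for a single source, the dominant eigenvector of the sample covariance is proportional to the perturbed steering vector, so the element-wise ratio against the nominal steering vector recovers the errors up to a common scale. This is a minimal illustration of that general technique, not the paper's exact Lagrange-multiplier formulation; the function name, the 2-D far-field signal model, and the reference-sensor normalization are assumptions for the example.

```python
import numpy as np

def estimate_gain_phase_errors(snapshots, positions, theta0, wavelength):
    """Estimate per-sensor gain and phase errors from one calibration
    source at known azimuth theta0 (subspace/eigenvector sketch).

    snapshots  : (M, N) complex array, M sensors over N snapshots
    positions  : (M, 2) sensor coordinates in metres
    theta0     : known source azimuth in radians
    wavelength : signal wavelength in metres
    """
    M, N = snapshots.shape
    # Sample covariance matrix of the array output
    R = snapshots @ snapshots.conj().T / N
    # For a single source, the eigenvector of the largest eigenvalue
    # spans the signal subspace and is proportional to the *perturbed*
    # steering vector (gain/phase errors included)
    _, V = np.linalg.eigh(R)
    u = V[:, -1]
    # Nominal (error-free) steering vector for the known direction,
    # assuming a 2-D far-field plane-wave model
    k = 2.0 * np.pi / wavelength
    direction = np.array([np.cos(theta0), np.sin(theta0)])
    a = np.exp(1j * k * (positions @ direction))
    # Element-wise ratio yields the complex error vector up to a common
    # scale; normalize so sensor 0 has unit gain and zero phase
    g = u / a
    g = g / g[0]
    return np.abs(g), np.angle(g)
```

In use, the estimated errors would be inverted and applied to the nominal steering vectors before beamforming (e.g., MVDR), which is what restores the beamformer's performance after calibration.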