The Quantization Kernel Least Mean P-norm Algorithm Using Transfer Learning
Abstract
In α-stable distribution noise environments, the kernel least mean P-norm (KLMP) algorithm performs significantly better than the kernel least mean square (KLMS) algorithm. However, the computational complexity and storage requirements of KLMP grow linearly with the number of iterations, which is inconvenient for practical applications. To address this problem, the nearest-instance-centroid-estimation kernel least mean P-norm (NICE-KLMP) algorithm is proposed: transfer learning is applied to divide the overall filter, built on the sample instances, into subfilters with partial tight support, and the training of each subfilter is driven by different inputs. To further reduce the storage requirement, the online vector quantization technique is introduced, yielding the nearest-instance-centroid-estimation quantization kernel least mean P-norm (NICE-QKLMP) algorithm. Simulation results on prediction of a Mackey-Glass time series in α-stable distribution noise show that the complexity of the NICE-KLMP and NICE-QKLMP algorithms is significantly lower than that of the KLMP algorithm, and that their impulse-noise rejection is significantly stronger than that of the NICE-KLMS algorithm and equivalent to that of the KLMP algorithm.
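To make the two ingredients of the abstract concrete, the following is a minimal sketch (not the authors' implementation) of a quantized KLMP-style update: the weight step uses the p-norm error gradient |e|^(p-1)·sign(e), and online vector quantization merges a new input into the nearest existing dictionary center when it lies within a quantization radius ε, which bounds the dictionary growth. All hyperparameter values (eta, p, epsilon, sigma) are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(x, c, sigma=1.0):
    # Standard Gaussian (RBF) kernel between input x and center c.
    return np.exp(-np.sum((x - c) ** 2) / (2.0 * sigma ** 2))

class QKLMPSketch:
    """Illustrative quantized kernel least-mean-P-norm filter (not the paper's code)."""

    def __init__(self, eta=0.5, p=1.2, epsilon=0.5, sigma=1.0):
        self.eta, self.p, self.eps, self.sigma = eta, p, epsilon, sigma
        self.centers = []   # dictionary of kernel centers
        self.alphas = []    # expansion coefficients

    def predict(self, x):
        return sum(a * gaussian_kernel(x, c, self.sigma)
                   for a, c in zip(self.alphas, self.centers))

    def update(self, x, d):
        x = np.asarray(x, dtype=float)
        e = d - self.predict(x)
        # p-norm gradient step: eta * |e|^(p-1) * sign(e)
        coeff = self.eta * np.abs(e) ** (self.p - 1) * np.sign(e)
        if self.centers:
            dists = [np.linalg.norm(x - c) for c in self.centers]
            j = int(np.argmin(dists))
            if dists[j] <= self.eps:
                # Online vector quantization: update the nearest
                # existing center's coefficient instead of growing
                # the dictionary.
                self.alphas[j] += coeff
                return e
        self.centers.append(x)
        self.alphas.append(coeff)
        return e
```

With ε > 0 the dictionary size stays far below the number of training samples, which is the storage saving the abstract refers to; setting ε = 0 recovers an unquantized KLMP-style filter whose dictionary grows by one center per iteration.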