Abstract:
Neural network training traditionally uses the least mean square (LMS) or recursive least squares (RLS) error as its convergence criterion. However, the normalized least mean square (NLMS) criterion can achieve better network performance in some applications, such as adaptive equalization. This paper introduces the NLMS criterion into a neural network trained with the Levenberg-Marquardt (LM) algorithm. By normalizing the output error of the network and adopting the LM algorithm as the training method, the network converges quickly. Theoretical analysis and experimental results show that, compared with steepest-descent training under the NLMS criterion and LM training under the LMS criterion, the proposed network converges faster and reaches a smaller normalized error. When applied to a neural-network watermarking system, the method achieves blind extraction and better robustness against channel attacks such as additive noise, low-pass filtering, and re-quantization; moreover, performance improves by 4% on average.
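To make the NLMS criterion named above concrete, here is a minimal sketch of a standard NLMS adaptive-filter update (not the paper's LM-based method): the step size is normalized by the input power, which is the same idea the paper applies to the network's output error. The function name `nlms_step` and the parameters `mu` and `eps` are illustrative choices, not taken from the paper.

```python
import numpy as np

def nlms_step(w, x, d, mu=0.5, eps=1e-8):
    """One NLMS update: w <- w + mu * e * x / (eps + ||x||^2).

    w : current weight vector
    x : input vector
    d : desired (reference) output
    mu: step size, stable for 0 < mu < 2
    eps: small constant to avoid division by zero
    """
    e = d - w @ x                      # a priori output error
    w = w + mu * e * x / (eps + x @ x) # step normalized by input power
    return w, e
```

Because the step is scaled by the instantaneous input power, NLMS keeps a roughly constant convergence rate regardless of input signal level, which is why it can outperform plain LMS on signals with varying power.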