Wang Xiaoyu, Li Fan, Cao Lin, Li Jun, Zhang Chi, Peng Yuan, Cong Fengyu. End to End Underwater Targets Recognition Using the Modified Convolutional Neural Network[J]. JOURNAL OF SIGNAL PROCESSING, 2020, 36(6): 958-965. DOI: 10.16798/j.issn.1003-0530.2020.06.018

End to End Underwater Targets Recognition Using the Modified Convolutional Neural Network

Traditional feature-based underwater target recognition methods perform poorly due to the high complexity of underwater acoustic signals. Advanced recognition methods based on deep learning models can effectively reduce the information loss caused by feature extraction, thereby improving classification performance. In this paper, we propose a convolutional neural network (CNN) model suited to the underwater target recognition scenario. It introduces a one-dimensional convolution layer with a kernel size of 1 in the convolution module to preserve the local characteristics of underwater acoustic signals and reduce model complexity; meanwhile, it replaces the fully connected layer with a global average pooling (GAP) layer, which outputs interpretable results based on the feature vector corresponding to each feature map and reduces the number of trainable parameters to prevent overfitting. The results show that the modified CNN model achieved a classification accuracy of 91.7%, compared with 69.8% for the classification method based on a conventional CNN and 85% for the method based on higher-order statistics (HOS) features. We conclude that the proposed method better preserves the time-domain structure of underwater acoustic signals, thereby further improving classification performance.
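
The sketch below illustrates the two architectural modifications the abstract describes: a kernel-size-1 one-dimensional convolution inside the convolution module, and a GAP layer in place of the fully connected classifier. It is a minimal sketch assuming PyTorch; the channel counts, kernel sizes, number of classes, and input length are hypothetical illustrations, since the paper's exact hyperparameters are not given in the abstract.

```python
# Minimal sketch of a "modified CNN" with a kernel-size-1 Conv1d and GAP,
# assuming PyTorch. Layer sizes here are hypothetical, not the paper's.
import torch
import torch.nn as nn

class ModifiedCNN(nn.Module):
    def __init__(self, num_classes: int = 4, in_channels: int = 1):
        super().__init__()
        self.features = nn.Sequential(
            # Ordinary 1-D convolution over the raw waveform (end-to-end input).
            nn.Conv1d(in_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
            # Kernel-size-1 convolution: mixes channels at each time step,
            # preserving local time-domain structure with few extra parameters.
            nn.Conv1d(32, 64, kernel_size=1),
            nn.ReLU(),
            nn.Conv1d(64, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(4),
            # One feature map per class, so the GAP output is interpretable:
            # each class score is tied to its corresponding feature map.
            nn.Conv1d(64, num_classes, kernel_size=1),
        )
        # Global average pooling replaces the fully connected layer,
        # reducing trainable parameters to help prevent overfitting.
        self.gap = nn.AdaptiveAvgPool1d(1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_channels, signal_length), e.g. a raw acoustic segment
        x = self.features(x)
        return self.gap(x).squeeze(-1)  # (batch, num_classes) logits

if __name__ == "__main__":
    model = ModifiedCNN(num_classes=4)
    dummy = torch.randn(2, 1, 4096)  # two example 4096-sample segments
    print(model(dummy).shape)        # torch.Size([2, 4])
```

Averaging each class-specific feature map down to a single score is what makes the GAP output directly readable as per-class evidence, which is the interpretability property the abstract attributes to the GAP layer.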