SUN Jinguang, WU Mingyan. Person Re-identification Method Based on Multi-scale Weighted Feature Fusion[J]. JOURNAL OF SIGNAL PROCESSING, 2022, 38(10): 2201-2210. DOI: 10.16798/j.issn.1003-0530.2022.10.021

Person Re-identification Method Based on Multi-scale Weighted Feature Fusion

Aiming at the low person re-identification accuracy caused by frequent occlusion and varying pedestrian poses, a person re-identification method based on multi-scale weighted feature fusion (MSWF) was proposed. First, the backbone network ResNeSt-50 was used to extract features from the third, fourth, and fifth downsampling stages, which were fed into a weighted feature pyramid network and fused with a fast normalized fusion method. Introducing a weighting operation into the fusion allowed the model to learn how to assign weights to the fused features during training, so that features at different scales were fully exploited and richer person features were obtained. Finally, the fused high-level features, which carry rich semantic information, were used as global features, and the fused high-resolution features were used as local features. During training, the model was optimized with a combination of the Softmax classification loss, the triplet loss, and the center loss. In the testing phase, the global and local features were concatenated along the channel dimension to represent each person, and the Euclidean distance was used to measure the distance between persons. Extensive experiments were conducted on the Market-1501, DukeMTMC-reID, CUHK03-Labeled, and CUHK03-Detected datasets. On Market-1501, mAP and Rank-1 reached 89.2% and 95.8%; on DukeMTMC-reID, 79.7% and 90.4%; on CUHK03-Labeled, 80.1% and 82.4%; and on CUHK03-Detected, 76.6% and 80.1%. The experimental results show that both the re-identification rate and the mean average precision of this method exceed those of many current mainstream methods.
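The fast normalized fusion step described in the abstract can be illustrated with a short sketch. Below is a minimal PyTorch example, assuming the multi-scale features have already been projected to a common channel count and resized to a common resolution; the module name FastNormalizedFusion and those preprocessing assumptions are illustrative, not taken from the paper.

# A minimal sketch of fast normalized weighted fusion, assuming the inputs
# already share the same shape. Names and epsilon value are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FastNormalizedFusion(nn.Module):
    """Fuses N same-shaped feature maps with learnable, non-negative weights."""

    def __init__(self, num_inputs: int, eps: float = 1e-4):
        super().__init__()
        # One scalar weight per input branch, learned during training.
        self.weights = nn.Parameter(torch.ones(num_inputs))
        self.eps = eps

    def forward(self, features):
        # ReLU keeps the weights non-negative; dividing by their sum plus a
        # small epsilon normalizes them without the cost of a softmax.
        w = F.relu(self.weights)
        w = w / (w.sum() + self.eps)
        return sum(w[i] * f for i, f in enumerate(features))


if __name__ == "__main__":
    # Three hypothetical feature maps projected/resized to the same shape.
    feats = [torch.randn(2, 256, 32, 16) for _ in range(3)]
    fusion = FastNormalizedFusion(num_inputs=3)
    print(fusion(feats).shape)  # torch.Size([2, 256, 32, 16])

The combined training objective (Softmax classification loss, triplet loss, and center loss) and the test-time matching by concatenation and Euclidean distance can likewise be sketched as follows. The batch-hard triplet mining, the margin of 0.3, the center-loss weight of 5e-4, and the feature dimensions are common re-ID practice used here as assumptions rather than the paper's exact configuration.

# A sketch of the combined loss and of test-time matching; hyperparameters
# and helper names below are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CenterLoss(nn.Module):
    """Pulls each feature toward a learnable center of its identity class."""

    def __init__(self, num_classes: int, feat_dim: int):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))

    def forward(self, feats, labels):
        return ((feats - self.centers[labels]) ** 2).sum(dim=1).mean()


def batch_hard_triplet(feats, labels, margin: float = 0.3):
    # Hardest-positive / hardest-negative triplet loss within a batch.
    dist = torch.cdist(feats, feats)
    same = labels.unsqueeze(0) == labels.unsqueeze(1)
    hardest_pos = (dist * same.float()).max(dim=1).values
    hardest_neg = (dist + same.float() * 1e9).min(dim=1).values
    return F.relu(hardest_pos - hardest_neg + margin).mean()


def combined_loss(logits, feats, labels, center_loss, w_center: float = 5e-4):
    # Softmax classification loss + triplet loss + weighted center loss.
    return (F.cross_entropy(logits, labels)
            + batch_hard_triplet(feats, labels)
            + w_center * center_loss(feats, labels))


if __name__ == "__main__":
    num_ids, feat_dim = 751, 2048            # e.g. Market-1501 training identities
    feats = torch.randn(8, feat_dim)
    logits = torch.randn(8, num_ids)
    labels = torch.randint(0, num_ids, (8,))
    loss = combined_loss(logits, feats, labels, CenterLoss(num_ids, feat_dim))
    print(loss.item())

    # Test time: concatenate global and local descriptors along the channel
    # dimension and rank gallery images by Euclidean distance to the query.
    q_global, q_local = torch.randn(4, feat_dim), torch.randn(4, feat_dim)
    g_global, g_local = torch.randn(100, feat_dim), torch.randn(100, feat_dim)
    query = torch.cat([q_global, q_local], dim=1)
    gallery = torch.cat([g_global, g_local], dim=1)
    ranking = torch.cdist(query, gallery).argsort(dim=1)
    print(ranking[:, :5])                    # top-5 gallery indices per query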
