Abstract:
Millimeter-wave radar and optical cameras form one of the principal sensor combinations on unmanned autonomous platforms such as self-driving vehicles and disaster-monitoring robots, and mapping radar grid coordinates to optical pixel coordinates is the essential basis for subsequent fusion of microwave and visual information. Existing coordinate mapping methods mainly address the mapping between single-frame point cloud coordinates in the radar coordinate system and optical pixel coordinates. An occupancy grid map constructed by accumulating multi-frame point clouds as the radar moves provides richer geometric and semantic feature information than a single-frame radar point cloud, but it is difficult to establish a direct coordinate mapping between its grid coordinates and optical pixel coordinates. To solve this problem, this paper proposes a mapping method between radar grid coordinates and optical pixel coordinates based on a unidirectional transformation. On this basis, the influence of error factors such as the number of feature points on the mapping accuracy is analyzed, together with the conditions under which the mapping remains valid. Finally, the effectiveness of the mapping method is verified with actually collected millimeter-wave radar and optical image data.