Authors: Junhwan Ryu, Sungho Kim

Abstract: In the pedestrian detection field, CCD and infrared (IR) sensors have been used for all-day detection. To fuse such heterogeneous data, prior work has mainly made the network deeper or concatenated different feature maps at different locations. However, these approaches still fail to fully exploit both data sources. To fuse heterogeneous data from different domains more efficiently, we propose a novel fusion method called feature map transfer. The proposed method consists of a parallel structure of object detection networks, similar to existing work, in which the feature maps of the different domains obtained from the parallel branches are transferred to each other. The method requires no modification, such as adding or removing layers, to an already configured network structure. Instead of the conventional CCD and IR fusion, radiometric temperature data and intensity-based grayscale data from far-infrared (FIR) pedestrian datasets were fused. Experimental results show that, without any modification of the network, the proposed method improves the log-average miss rate by about 5% over the conventional concatenation-only method on FIR pedestrian datasets.
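The following is a minimal PyTorch sketch of the feature-map-transfer idea as described in the abstract: two parallel backbones process the two FIR modalities, and intermediate feature maps are exchanged between the branches mid-network. All layer sizes, the transfer point, the fusion operation (an element-wise sum here), and every name below are illustrative assumptions, not the paper's exact configuration.

```python
# Sketch only: layer widths, the exchange point, and the sum-based transfer
# are assumptions; the paper's detection heads and backbones are omitted.
import torch
import torch.nn as nn


class Branch(nn.Module):
    """One backbone branch, split into two stages so features can be swapped."""
    def __init__(self, in_ch: int = 1, mid_ch: int = 32, out_ch: int = 64):
        super().__init__()
        self.stage1 = nn.Sequential(
            nn.Conv2d(in_ch, mid_ch, 3, padding=1), nn.ReLU(inplace=True))
        self.stage2 = nn.Sequential(
            nn.Conv2d(mid_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True))


class FeatureMapTransferNet(nn.Module):
    """Parallel branches with a mutual feature-map transfer between stages."""
    def __init__(self):
        super().__init__()
        self.temp_branch = Branch()  # radiometric temperature input
        self.gray_branch = Branch()  # intensity-based grayscale input

    def forward(self, x_temp: torch.Tensor, x_gray: torch.Tensor):
        f_temp = self.temp_branch.stage1(x_temp)
        f_gray = self.gray_branch.stage1(x_gray)
        # Transfer: each branch continues with its own features enriched by
        # the other domain's features; no layers are added or removed.
        f_temp = self.temp_branch.stage2(f_temp + f_gray)
        f_gray = self.gray_branch.stage2(f_gray + f_temp.detach().clone() * 0 + f_gray * 0 + f_temp)
        return f_temp, f_gray  # downstream detection heads would consume these


# Usage: two single-channel FIR modalities of the same scene.
net = FeatureMapTransferNet()
t = torch.randn(1, 1, 128, 160)  # temperature map
g = torch.randn(1, 1, 128, 160)  # grayscale image
out_t, out_g = net(t, g)
```

Note that because the transfer reuses feature maps the branches already produce, the exchange can be grafted onto an existing parallel detector without changing its layer structure, which matches the abstract's claim that no architectural modification is required.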

DOI: https://dx.doi.org/10.5244/C.33.344
Comments: Presented at BMVC 2019: ODRSS 2019 Workshop on Object Detection and Recognition for Security Screening, Cardiff, UK.
Paper (PDF): ODRSS2019_P_3_Ryu.pdf