Matching Points Extraction Between Optical and TIR Images by Using SURF and Local Phase Correlation


  • Received : 2015.02.27
  • Accepted : 2015.03.18
  • Published : 2015.03.31

Abstract

Satellite sensors covering the visible, infrared, and thermal wavelength ranges have been launched as sensor hardware technologies have advanced. With this growing variety of sensors, the fusion and integration of multisensor images are increasingly being pursued, and image matching is an essential step for such applications. Algorithms such as SIFT and SURF have been proposed to co-register satellite images. However, when these existing algorithms are applied to extract matching points between optical and thermal images, high co-registration accuracy cannot be guaranteed because the images have different spectral and spatial characteristics. In this paper, the locations of control points in a reference image are extracted by SURF, and the locations of their corresponding pairs are then estimated from local similarity. For the local similarity measure, the phase correlation method, which is based on the Fourier transform, is applied. In experiments with simulated, Landsat-8, and ASTER datasets, the proposed algorithm extracted more reliable matching points than the existing SURF-based method.

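The Fourier-based local similarity step described in the abstract can be sketched as follows. This is a minimal NumPy illustration of phase correlation between a reference patch and a target patch, not the authors' implementation: the function name, patch size, and regularization constant are illustrative assumptions, and in the paper's workflow the patches would be centered on SURF keypoint locations.

```python
import numpy as np

def phase_correlation(ref_patch, tgt_patch):
    """Estimate the (row, col) shift of tgt_patch relative to ref_patch
    from the peak of the normalized cross-power spectrum."""
    F_ref = np.fft.fft2(ref_patch)
    F_tgt = np.fft.fft2(tgt_patch)
    cross = F_tgt * np.conj(F_ref)
    cross /= np.abs(cross) + 1e-12          # keep phase only (magnitude-normalized)
    corr = np.real(np.fft.ifft2(cross))     # correlation surface; peak marks the shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # offsets beyond half the patch size wrap around to negative shifts
    shift = tuple(int(p) if p <= s // 2 else int(p) - s
                  for p, s in zip(peak, corr.shape))
    return shift, float(corr.max())

# usage: recover a known circular shift of (3, 5) pixels
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
tgt = np.roll(ref, shift=(3, 5), axis=(0, 1))
shift, score = phase_correlation(ref, tgt)
```

Because the comparison happens in the phase spectrum rather than on raw intensities, this measure is less sensitive to the radiometric differences between optical and thermal bands than direct intensity correlation, which is presumably why it suits multisensor matching.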


