On-Site vs. Laboratorial Implementation of Camera Self-Calibration for UAV Photogrammetry

  • Han, Soohee (Dept. of Geoinformatics Engineering, Kyungil University) ;
  • Park, Jinhwan (School of Convergence & Fusion System Engineering, Kyungpook National University) ;
  • Lee, Wonhee (School of Convergence & Fusion System Engineering, Kyungpook National University)
  • Received : 2016.07.18
  • Accepted : 2016.08.24
  • Published : 2016.08.31

Abstract

This study investigates two camera self-calibration approaches, on-site self-calibration and laboratorial self-calibration, both of which are based on self-calibration theory and implemented using a commercial photogrammetric solution, Agisoft PhotoScan. On-site self-calibration performs camera self-calibration and aerial triangulation using the same aerial photos. Laboratorial self-calibration performs camera self-calibration using photos of a patterned target displayed on a digital panel, and then conducts aerial triangulation using the aerial photos. Aerial photos were captured by an unmanned aerial vehicle, and target photos were captured of a 27in LCD monitor and a 47in LCD TV in two experiments. Calibration parameters were estimated by the two approaches, and the errors of aerial triangulation were analyzed. The results reveal that on-site self-calibration outperforms laboratorial self-calibration in terms of vertical accuracy. By contrast, laboratorial self-calibration obtains better horizontal accuracy if the photos are captured at a greater distance from the target by using a larger display panel.

1. Introduction

Camera calibration plays a major role in photogrammetry: it determines the projection from the 3D coordinates of points to their 2D image coordinates. Once the projection is known, 3D information can be inferred from 2D and vice versa (Faugeras et al., 1992). In traditional photogrammetry, cameras are calibrated in a laboratory, where the calibration parameters are determined by analyzing photos of accurately measured targets installed at calibration sites. In contrast to such approaches, self-calibration, introduced by Faugeras et al. (1992), is popular because it does not depend on calibration references; instead, it uses only the relationships between targets in photos captured from different locations. Self-calibration is defined as the process of determining the calibration parameters of a camera directly from multiple uncalibrated images of unstructured scenes, which require no special calibration objects (Wikipedia, 2016).

Camera calibration is widely implemented in modern photogrammetric and computer vision solutions, as well as in stand-alone tools that work on the basis of self-calibration theory. Among these tools, the GML C++ camera calibration toolbox is a free tool that uses a calibration pattern to complete the calibration process automatically (Graphics and Media Lab, 2016). If multiple patterns are used, the process becomes more stable, requiring fewer target photos and yielding improved accuracy. The camera calibration toolbox for Matlab is a better-known tool, implemented in both Matlab and C and included in OpenCV as open source (Bouguet, 2015). It requires more sophisticated parameter settings but yields greater accuracy. Agisoft Lens is a stand-alone tool provided in a package with Agisoft PhotoScan (Agisoft, 2016). PhotoScan is a commercial software product capable of photogrammetric processing of digital photos and generation of 3D spatial data without intensive photogrammetric knowledge and skill. PhotoScan can perform camera calibration in two ways: 1) an on-site implementation during relative orientation and aerial triangulation, using the aerial photos themselves; or 2) a laboratorial implementation through Agisoft Lens, using photos of a patterned target displayed on a digital panel.

Many studies in aerial photogrammetry have used unmanned aerial vehicles (UAVs) and conducted aerial triangulation by using PhotoScan (Yeo et al., 2016; Kim et al., 2016; Lee, 2015; Sung and Lee, 2016; Kim, 2014). However, few have focused on whether the camera calibration itself should be implemented by an on-site or a laboratorial approach. Although such studies usually employ an on-site implementation, it is difficult to determine how much the calibration method affects the overall accuracy of aerial triangulation and ground positioning. Choi et al. (2015) tested the compatibility of a conventional digital photogrammetric workstation (DPW) with current UAV-oriented photogrammetric software; camera calibration parameters were considered, but they were supplied by the camera vendor. Lee and Oh (2012) compared the accuracy of on-site self-calibration using flat control points with that of laboratorial self-calibration in close-range photogrammetry. They concluded that on-site self-calibration is as accurate as laboratorial self-calibration, but it remains an open question whether the same result can be expected in UAV photogrammetry. This study considers that camera self-calibration may be influenced by the choice between the two approaches in UAV photogrammetry. Thus, we implement the on-site and laboratorial approaches by using Agisoft PhotoScan and Lens and present the results.

 

2. On-site vs. Laboratorial Self-calibration

The general steps of aerial triangulation using PhotoScan are: 1) “adding photos,” in which photos are imported into the software; 2) “aligning photos,” in which PhotoScan extracts feature points from the photos, identifies conjugate points among them to conduct relative orientation, and then implements self-calibration using the conjugate point pairs; 3) “creating markers,” in which ground control points (GCPs) are marked on the photos by manual on-screen digitizing and their absolute coordinates are entered manually; and 4) “optimizing cameras,” in which the camera calibration parameters are updated and the exterior orientation parameters are finally optimized. Because the calibration parameters are estimated from the on-site photos during the second step, we refer to this approach henceforth as on-site self-calibration.

Agisoft Lens is an automatic camera calibration tool that uses a series of photos of a chessboard pattern. The pattern can be displayed on a digital panel, such as an LCD monitor or TV, for use as a calibration target. The minimum number of photos for calibration is known to be three, and the vendor recommends that the pattern cover the whole area of each photo (Fig. 1). The calibration estimates the parameters listed in Table 1, which appear in Eq. (1) of Brown’s model (Brown, 1971),

Fig. 1. Recommendation in capturing a target (Agisoft, 2016)

Table 1. Camera calibration parameters estimated by Agisoft Lens

where ∆x and ∆y denote departures from collinearity due to lens distortion.
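In the standard form of Brown’s model, with radial coefficients K1–K3 and tangential coefficients P1–P2 (a common parameterization; the set estimated by Agisoft Lens in Table 1 may include further terms), the corrections are:

```latex
\begin{aligned}
\Delta x &= \bar{x}\,(K_1 r^2 + K_2 r^4 + K_3 r^6) + P_1\,(r^2 + 2\bar{x}^2) + 2P_2\,\bar{x}\bar{y},\\
\Delta y &= \bar{y}\,(K_1 r^2 + K_2 r^4 + K_3 r^6) + P_2\,(r^2 + 2\bar{y}^2) + 2P_1\,\bar{x}\bar{y},
\qquad r^2 = \bar{x}^2 + \bar{y}^2,
\end{aligned}
```

where x̄ and ȳ are image coordinates measured from the principal point.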

After the estimation, the parameters are imported into PhotoScan as a “pre-calculated” type and marked “fixed” to block further modification. Afterwards, aerial triangulation is conducted in the same manner as in the on-site approach, except that no further optimization of the calibration parameters is allowed. We refer to this approach henceforth as laboratorial self-calibration.
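As a numeric sketch of how the imported parameters act on image coordinates, the following implements the radial and tangential terms of Brown’s model (Eq. (1)); the coefficient names follow Table 1, though sign and ordering conventions for the tangential terms vary between implementations:

```python
def brown_distortion(x, y, K, P):
    """Departures (dx, dy) from collinearity for image coordinates (x, y)
    measured from the principal point, with radial coefficients
    K = [K1, K2, K3] and tangential (decentering) coefficients P = [P1, P2]."""
    r2 = x * x + y * y
    # Radial part: K1*r^2 + K2*r^4 + K3*r^6
    radial = sum(k * r2 ** (i + 1) for i, k in enumerate(K))
    dx = x * radial + P[0] * (r2 + 2 * x * x) + 2 * P[1] * x * y
    dy = y * radial + P[1] * (r2 + 2 * y * y) + 2 * P[0] * x * y
    return dx, dy

# Distortion vanishes at the principal point and grows with radial distance,
# matching the radially increasing residuals discussed in Section 4.
center = brown_distortion(0.0, 0.0, [0.1, 0.01, 0.0], [0.001, 0.001])
edge = brown_distortion(1.0, 0.0, [0.1, 0.01, 0.0], [0.0, 0.0])
```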

The two approaches of on-site and laboratorial self-calibration fundamentally share the same algorithmic base of self-calibration and are implemented by the same vendor, whether as packaged or stand-alone software. The remarkable difference lies in how the photos used for calibration are acquired. On-site self-calibration uses photos acquired from the sky, the same photos that are then applied to aerial triangulation. An expected shortcoming is that recognizing the targets and their coordinates may be ambiguous, because the targets do not share a common shape and may be blurred owing to the long observation distance.

Laboratorial self-calibration uses photos of targets captured in close proximity, which may result in better target recognition and is expected to remedy the flaws anticipated in on-site self-calibration. However, it has the problem that the shooting distance is limited by the size of the target displayed on the digital panel, so the targets are captured at a closer range than in conventional laboratorial calibration. Small calibration errors estimated from such a close-range model may not be critical in close-range photogrammetry, but they cannot be overlooked in aerial photogrammetry. The problem becomes even more serious in aerial photogrammetry using UAVs: UAVs are usually equipped with lenses of short focal length to cover larger areas at low altitudes, and a shorter focal length requires targets at an even closer range to fulfill the recommendation shown in Fig. 1.

In our study, the two approaches of on-site and laboratorial self-calibration were implemented in UAV-based aerial photogrammetry and assessed based on aerial triangulation results.

 

3. Application

3.1 Specifications of data

Photos were captured over a college campus by using a rotary-wing UAV. The camera was installed on a gimbal to control pitching and panning. The UAV flew automatically over five strips, maintaining overlaps and sidelaps. Specifications of the UAV, the installed camera, and the flight are given in Tables 2, 3, and 4, respectively.

Table 2. Specifications of UAV

Table 3. Specifications of installed camera

Table 4. Specifications of flight

Aerial targets were chosen from among traffic lines and manhole covers, and 3D coordinates were acquired by means of global navigation satellite system (GNSS) survey based on virtual reference station (VRS). Following a visual inspection of the photos, eight were designated as GCPs and twelve as checkpoints (Fig. 2).

Fig. 2. Test area and distribution of control points (GCPs: red circles; checkpoints: orange rectangles)

3.2 Implementation of on-site self-calibration

Sixty-eight photos were captured by the UAV and imported into PhotoScan. “Aligning photos” recognized approximately 70,000 tie points and conducted relative orientation; self-calibration was implemented on-site and coarse calibration parameters were estimated. Twenty points were then marked on the photos and their coordinate information was entered. “Optimizing cameras” converted the exterior orientation parameters from relative coordinates to absolute ones by using the eight GCPs and optimized the camera calibration parameters. Finally, the accuracy of aerial triangulation was estimated using the twelve checkpoints.

3.3 Implementation of laboratorial self-calibration

As described in Section 2, laboratorial self-calibration is expected to be vulnerable to the problem of short shooting distance. In this study it is even more vulnerable because the focal length of the camera is as short as 3.61mm, which corresponds to 20mm in full-frame format. Two experiments were conducted to assess the impact of the shooting distance by using two digital panels of different sizes. In the first experiment, a 27in LCD monitor was employed to display the chessboard target, and nine photos were taken at different orientations. In the second experiment, a 47in LCD TV was employed, and nine photos were captured at similar angular orientations but from a greater distance (Figs. 3 and 4).
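The full-frame equivalence quoted above implies a crop factor of roughly 5.5, typical of a small-format sensor; the sensor size itself is not stated here, so the crop factor below is an assumption back-computed from the quoted focal lengths:

```python
def full_frame_equivalent(focal_mm, crop_factor):
    """Equivalent focal length on a 36 x 24mm (full-frame) sensor."""
    return focal_mm * crop_factor

# 3.61mm with an assumed ~5.54 crop factor gives roughly the quoted 20mm equivalent.
equivalent = full_frame_equivalent(3.61, 5.54)
```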

Fig. 3.Target displayed on 47in LCD TV in the second experiment

Fig. 4.Nine photos used in the second experiment

In each experiment, the photos of the target were imported into Agisoft Lens and the calibration parameters were estimated automatically. The sixty-eight aerial photos were then imported into PhotoScan. Before the “aligning photos” process was performed, the new calibration parameters, as well as the control points marked in the on-site self-calibration, were imported into PhotoScan. Without modifying the calibration parameters, we conducted relative orientation and aerial triangulation by using the same eight GCPs. Finally, the accuracy was estimated under the same conditions.

 

4. Results and Discussion

The calibration parameters estimated in the three experiments are shown in Table 5, and the residuals in Figs. 5, 6, and 7. No noticeable differences in the lens distortion parameters could be recognized from the numerical comparison in Table 5. However, the changes in focal length affected the estimation of the ground sampling distance (GSD) together with the average flight altitude: 3.4cm/pixel at 91.5m for the on-site implementation compared with 4.0cm/pixel at 93.4m and 93.2m for the laboratorial implementations.
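The GSD figures above follow from the pinhole relation GSD = flight height × pixel pitch / focal length. A minimal sketch; the pixel pitch is not stated here, and the ~1.34 µm value used below is an assumption back-computed from the on-site figures:

```python
def ground_sampling_distance_cm(height_m, pixel_pitch_um, focal_mm):
    """GSD in cm/pixel from flight height (m), pixel pitch (um),
    and focal length (mm), via the pinhole relation GSD = H * p / f."""
    return height_m * (pixel_pitch_um * 1e-6) / (focal_mm * 1e-3) * 100.0

# On-site case: ~3.4 cm/pixel at 91.5 m altitude with an assumed 1.34 um pitch
onsite_gsd = ground_sampling_distance_cm(91.5, 1.34, 3.61)
```

A longer estimated focal length at the same altitude therefore directly shrinks the computed GSD, which is why the focal-length differences in Table 5 shift the quoted values.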

Table 5. Calibration parameters

Fig. 5. Residuals in on-site self-calibration

Fig. 6. Residuals in laboratorial self-calibration using the 27in panel

Fig. 7. Residuals in laboratorial self-calibration using the 47in panel

The residuals of on-site self-calibration, shown in Fig. 5, were smaller than the others. However, their irregular pattern did not seem physically plausible, because in ordinary lenses residuals increase along the radial directions. The spreading patterns of laboratorial self-calibration, shown in Figs. 6 and 7, seem more plausible considering that the tested lens was a wide-angle one. Compared with the 47in panel, the 27in panel revealed a more dramatic increase in residuals along the radial directions.

The errors of aerial triangulation in the horizontal and vertical directions were estimated using the GCPs, as shown in Table 6. The errors of on-site self-calibration were the lowest, whereas those of laboratorial self-calibration were the highest with the 27in panel and moderate with the 47in panel. We can reason that on-site self-calibration produces more stable results than anticipated, but that laboratorial self-calibration degrades the camera calibration if the photos are captured too close to the target. However, another aspect can be observed from the errors estimated by the checkpoints in Table 7: laboratorial self-calibration with both panels results in smaller horizontal errors than on-site self-calibration. This means that the horizontal accuracy of laboratorial self-calibration is better than that of on-site self-calibration, although the latter remains better in terms of vertical accuracy.

Table 6. Errors of aerial triangulation estimated by GCPs (unit: cm)

Table 7. Errors of aerial triangulation estimated by checkpoints (unit: cm)

However, the twofold increase in the horizontal errors of on-site self-calibration between Tables 6 and 7 can be explained as follows: the calibration parameters were estimated such that they shared the errors of the exterior orientation parameters, which is necessary to minimize the overall errors estimated using the GCPs. In other words, the calibration parameters were over-fitted to the GCPs and thus produced larger errors when evaluated with the checkpoints. The implausible pattern of residuals seen in Fig. 5 also resulted from this over-fitting.
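The error figures in Tables 6 and 7 can be read as root-mean-square values over the respective point sets. A minimal sketch of that computation, assuming per-point residuals in the east/north/up directions (the exact error metric is not spelled out here, so this is an assumption):

```python
import math

def rmse_components(residuals):
    """residuals: list of (dE, dN, dU) tuples in cm, one per GCP or checkpoint.
    Returns (horizontal RMSE, vertical RMSE): easting and northing are pooled,
    the up component is taken alone."""
    n = len(residuals)
    horizontal = math.sqrt(sum(de * de + dn * dn for de, dn, _ in residuals) / n)
    vertical = math.sqrt(sum(du * du for _, _, du in residuals) / n)
    return horizontal, vertical

# A single point lying 3 cm east, 4 cm north, and 2 cm above its surveyed position
h, v = rmse_components([(3.0, 4.0, 2.0)])
```

Under this reading, over-fitting to GCPs shows up exactly as the GCP-based values (Table 6) being much smaller than the checkpoint-based values (Table 7).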

Drawbacks of laboratorial self-calibration can also be found in the quality of the photos. To cover the whole area of a photo with the target, as recommended in Fig. 1, the photos were captured as close as possible to the panels. This caused the photos to be poorly focused, resulting in inaccurate recognition of feature points. The effect worsened when the photos were captured of the 27in panel (compare Figs. 8 and 9) and when the shooting angles were off-vertical to the panels.

Fig. 8. Poorly focused photo captured onto a 27in panel

Fig. 9. Better focused photo captured onto a 47in panel

The increase in the vertical errors in all approaches is believed to derive from the vertical errors of the GNSS survey, which are more noticeable than its horizontal errors. However, the true cause requires further investigation, which will be conducted in our next study.

 

5. Conclusion

This study investigated two camera self-calibration approaches: on-site self-calibration and laboratorial self-calibration. Both approaches are based on self-calibration theory and were implemented by means of a commercial photogrammetric solution, Agisoft PhotoScan. On-site self-calibration performs camera calibration and aerial triangulation using the same aerial photos. By contrast, laboratorial self-calibration performs camera self-calibration using photos of a patterned target displayed on a digital panel, and then conducts aerial triangulation using the aerial photos.

Aerial photos were captured by a UAV, and target photos were captured using a 27in LCD monitor and a 47in LCD TV. Calibration parameters were estimated and the errors of aerial triangulation were analyzed. The results revealed that on-site self-calibration outperforms laboratorial self-calibration in terms of vertical accuracy. By contrast, laboratorial self-calibration obtains better horizontal accuracy if the photos are captured at a greater distance from the target by using a larger panel.

Issues to be explored in a future study include the following: 1) GCP surveying should be improved to reduce the vertical errors of aerial triangulation. 2) Larger panels should be tested for laboratorial self-calibration to enable greater distances for capturing photos of the target. 3) The impact of panel size and shooting distances should be estimated in terms of error propagation.

References

  1. Agisoft (2016), Agisoft PhotoScan, Agisoft LLC, Saint Petersburg, Russia, http://www.agisoft.com/ (last date accessed: 28 May 2016).
  2. Brown, D.C. (1971), Close-range camera calibration, Proceedings of the Symposium on Close-Range Photogrammetry System, ISPRS, 28 July-1 August, Illinois, USA, pp. 855-866.
  3. Bouguet, J.Y. (2015), Camera calibration toolbox for Matlab, Computational Vision at Caltech, Pasadena, USA, http://www.vision.caltech.edu/bouguetj/calib_doc/ (last date accessed: 28 May 2016).
  4. Choi, Y.W., You, J.H., and Cho, G.S. (2015), Accuracy analysis of UAV data processing using DPW, Journal of the Korean Society for Geospatial Information Science, Vol. 23, No. 4, pp. 3-10. (in Korean with English abstract)
  5. Faugeras, O.D., Luong, Q.T., and Maybank, S.J. (1992), Camera self-calibration: theory and experiments, Proceedings of the Second European Conference on Computer Vision, LNCS, 19-22 May, Santa Margherita Ligure, Italy, pp. 321-334.
  6. Graphics and Media Lab (2016), GML C++ camera calibration toolbox, Graphics and Media Lab, Moscow, Russia, http://graphics.cs.msu.ru/en/node/909/ (last date accessed: 16 July 2013).
  7. Kim, S.G. (2014), A Study on Construction and Application of Spatial Information Utilizing Unmanned Aerial Vehicle System, Ph.D. dissertation, Mokpo National University, Mokpo, Korea, 161p. (in Korean with English abstract)
  8. Kim, M.C., Yoon, H.J., Chang, H. J., and Yoo, J.S. (2016), Damage analysis and accuracy assessment for river-side facilities using UAV images, Journal of the Korean Society for Geospatial Information Science, Vol. 24, No. 1, pp. 81-87. (in Korean with English abstract) https://doi.org/10.7319/kogsis.2016.24.1.081
  9. Lee, C.N. and Oh, J.H. (2012), A study on efficient self-calibration of a non-metric camera for close-range photogrammetry, Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography, Vol. 30, No. 6-1, pp. 511-518. (in Korean with English abstract) https://doi.org/10.7848/ksgpc.2012.30.6-1.511
  10. Lee, Y.C. (2015), Assessing the positioning accuracy of high density point clouds produced from rotary wing quadcopter unmanned aerial system based imagery, Journal of the Korean Society for Geospatial Information Science, Vol. 23, No. 2, pp. 39-48. (in Korean with English abstract)
  11. Sung, S.M. and Lee, J.O. (2016), Accuracy of parcel boundary demarcation in agricultural area using UAV-photogrammetry, Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography, Vol. 34, No. 1, pp. 53-62. (in Korean with English abstract) https://doi.org/10.7848/ksgpc.2016.34.1.53
  12. Wikipedia (2016), Camera auto-calibration, Wikimedia Foundation, Inc., https://en.wikipedia.org/wiki/Camera_auto-calibration (last date accessed: 28 May 2016).
  13. Yeo, H.J., Choi, S.P., and Yeu, Y. (2016), An improvement of efficiently establishing topographic data for small river using UAV, Journal of the Korean Society for Geospatial Information Science, Vol. 24, No. 1, pp. 3-8. (in Korean with English abstract) https://doi.org/10.7319/kogsis.2016.24.1.003
