AprilTag array-aided extrinsic calibration of camera–laser multi-sensor system
© The Author(s) 2016
Received: 1 March 2016
Accepted: 24 June 2016
Published: 25 July 2016
This paper presents a new algorithm for extrinsically calibrating a multi-sensor system comprising multiple cameras and a 2D laser scanner. Building on camera pose estimation with AprilTag, we design an AprilTag array as the calibration target and employ a nonlinear optimization to compute the single-camera extrinsic parameters when multiple tags are in the camera's field of view. The camera–camera and laser–camera extrinsic parameters are then calibrated, respectively. A global optimization finally refines all the extrinsic parameters by minimizing a re-projection error. The algorithm also handles the extrinsic calibration of multiple cameras whose fields of view do not overlap. For validation, we built a micro-aerial vehicle platform carrying the multi-sensor system to collect real data, and the experimental results confirm that the proposed algorithm performs well.
Keywords: Multi-sensor, AprilTag array, Extrinsic calibration, Non-overlapping
Nowadays, multiple sensors are widely used in various robot systems such as unmanned ground and aerial vehicles. These sensors provide abundant information about the environment, such as visual images and range measurements. Fusing the information from these sensors is necessary for a thorough understanding of the environment. However, whenever multiple sensors are combined, additional calibration issues arise, and they are frequently overlooked. Quantities are seldom measured at the same position and in the same coordinate frame, so the alignment, i.e., the relative position and/or orientation of the sensors, has to be determined. A good calibration is a prerequisite for sensor fusion.
In many challenging tasks for robot systems, such as 3D environment mapping  and self-localization , cameras and a 2D laser scanner supply intensity and depth information, respectively. At the same time, a large field of view is usually required in these tasks. Capturing a large field of view is often not possible with a single camera, so multiple cameras have to be used. Hence, multiple cameras and a 2D laser scanner will play an increasingly important role in robot systems.
We propose an extrinsic calibration algorithm for multiple cameras, even if there is no overlapping field of view among them.
We propose an extrinsic calibration algorithm between a camera and a 2D laser scanner using the AprilTag array calibration target, and we integrate it with the multi-camera extrinsic calibration into a joint multi-sensor extrinsic calibration with a final global optimization.
This paper is organized as follows: Firstly, “Related work” section provides a review of related approaches. A description of extrinsic calibration pattern of multi-sensor system is given in “Calibration pattern” section. In “Extrinsic calibration of multi-sensor system” section, the joint extrinsic calibration algorithm for multi-sensor system is introduced. Experiments and discussions are presented in “Experiments” section. “Conclusion and future work” section provides the conclusion and future work.
As multi-camera systems become preferred over single cameras, calibrating the extrinsic parameters among multiple cameras becomes a basic requirement. To maximize accuracy without being constrained by calibration time, offline estimation of the extrinsic parameters is desirable. In contrast to online calibration methods like , offline calibration relies on calibration patterns with known geometry and appearance and need not consider the real-time capability of the calibration algorithm. Conventional offline calibration uses artificial, two-dimensional calibration targets such as the checkerboard, which is popular because its corners can be detected accurately and reliably, though other patterns have also been demonstrated [5, 6]. Svoboda et al.  make use of overlapping camera fields of view and can calibrate a stereo camera or a circular camera rig with all cameras pointing inwards. However, systems with cameras pointing outwards are increasingly popular, and these usually have minimal or no overlapping fields of view. Li et al.  presented a multi-camera system calibration toolbox that handles minimal overlapping fields of view by using a feature descriptor. In [8, 9], hand-eye calibration methods are used to calibrate such systems, but they are often inaccurate due to visual odometry drift. In addition to camera models, the production convenience and expandability of the calibration pattern also deserve attention. In early research, Bouguet  made use of cubes with a chessboard or circular dots on their surfaces, but such a pattern is not convenient to build with high precision. Strauβ et al.  combine many checkerboard targets into a rigid, three-dimensional target, where each checkerboard carries a graphical 2D code that uniquely identifies every single corner. However, this calibration target also has a complex structure and is not easy to produce.
Our multi-camera calibration work focuses on both adaptability to camera models and production convenience of the calibration pattern. We group many AprilTag  tags into an array and print them on a board with high planarity. The tags are identified by their distinct appearances, and each carries a unique ID. The extrinsic parameters of multiple cameras can be estimated as long as at least one complete tag appears in each camera's field of view.
To extrinsically calibrate a laser range finder and a camera, different calibration targets and geometric constraints have been presented. Kwak et al.  and Li et al.  propose a v-shaped calibration target and a right-angled triangular board, respectively, which constrain the extrinsic parameters through line-point correspondences. Their drawback is that it is difficult to guarantee that the selected laser points lie exactly on the calibration target boundary. Zhang et al.  use a checkerboard as the calibration target. As the state of the art, this method constrains the extrinsic parameters with plane parameters and has been extended to extrinsically calibrate other kinds of range sensors and cameras [15–17]. In our research, we replace the checkerboard with a planar AprilTag array as the calibration target and employ plane-line correspondences  to build the constraints.
AprilTag-based pose estimation
AprilTag is a robust and flexible visual fiducial system proposed by Olson  in 2011. It uses a 2D bar code style “tag” as Fig. 1a shows, allowing full 6-DOF localization of features from a single image.
Design of calibration target
The coordinate system of the first tag (ID = 0) in the AprilTag array is treated as the global coordinate system. Thus, the transformation matrix T_n^0 between an arbitrary tag (ID = n) coordinate system and the global coordinate system can be computed precisely without effort.
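Because the tag layout is fixed at print time, T_n^0 follows directly from the tag's row and column in the array. Below is a minimal sketch of this computation; the grid width, tag size, and spacing are illustrative assumptions, not values from the paper, and the function name is ours:

```python
import numpy as np

def tag_to_global(tag_id, cols=6, tag_size=0.08, spacing=0.02):
    """Return T_n^0, the homogeneous transform from the frame of tag
    `tag_id` to the global frame (tag 0), for a planar, row-major grid
    with `cols` tags per row.  All tags share one orientation, so the
    rotation block stays the identity; only the translation along the
    board changes."""
    pitch = tag_size + spacing            # centre-to-centre distance (m)
    row, col = divmod(tag_id, cols)
    T = np.eye(4)
    T[0, 3] = col * pitch                 # offset along the board x-axis
    T[1, 3] = row * pitch                 # offset along the board y-axis
    return T
```

With these assumed dimensions, tag 7 in a 6-column array sits one row down and one column across, i.e., at (0.1 m, 0.1 m) in the global frame.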
Accurate localization In , the localization accuracy of the AprilTag has been evaluated using a ray-tracing simulator. Two experiments measuring orientation accuracy and distance accuracy validated its high precision and reliability in localization. Naturally, real-world performance will be somewhat lower than in synthetic experiments because of noise, lighting variation, etc., but it remains good.
Great adaptability to camera models The camera pose in a tag coordinate system can be estimated as long as there is one complete tag in the field of view. For a multi-camera system with non-overlapping views, we can decouple the cameras into pairs of neighboring cameras such that at least one tag appears in each camera's field of view. Hence, the extrinsic calibration of multiple cameras can be realized through coordinate system transformations, as discussed in the “Extrinsic calibration of multi-sensor system” section.
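Concretely, once each camera's pose is expressed in the global (tag-0) frame, the extrinsic between any two cameras follows by chaining transforms, with no shared view required. A sketch under our own naming conventions (T_G^Ci maps global-frame points into camera i's frame):

```python
import numpy as np

def camera_to_camera(T_g_c1, T_g_c2):
    """Extrinsic mapping camera-2-frame points into the camera-1 frame,
    given each camera's global-to-camera transform:
        T_{C2->C1} = T_G^{C1} (T_G^{C2})^{-1}.
    The two cameras never need overlapping views; each only needs to
    see some tag of the array."""
    return T_g_c1 @ np.linalg.inv(T_g_c2)
```

For example, a point observed by camera 2 can be re-expressed in camera 1's frame by multiplying with this matrix, which is exactly the chaining used for non-overlapping pairs.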
Reliable tag identification The estimated camera pose is expressed in a tag coordinate system. Therefore, identifying the different tags in the AprilTag array is important for localizing the camera in the global coordinate system. AprilTag provides a large number of distinguishable tag codes and a reliable tag identification algorithm.
Convenient production and expansibility The AprilTag array can be generated easily with expected size and distribution by a CDE package available at https://github.com/ethz-asl/kalibr/wiki/downloads.
Furthermore, we can conveniently extend the AprilTag array by printing an additional part and joining it to the original target.
Extrinsic calibration of multi-sensor system
Extrinsic calibration of multiple sensors identifies the rigid transformations among the sensor coordinate systems. This paper focuses on the extrinsic calibration of a multi-sensor system comprising multiple cameras and a 2D laser scanner. The calibration can be carried out in both static and dynamic environments; a static environment is advised, since a dynamic one may reduce accuracy owing to unsynchronized multi-sensor data. In general, calibration is performed before the final task and a static environment can easily be established, so most calibration is done in a static environment.
This calibration is composed of two components: camera to camera and camera to 2D laser scanner. These two calibration processes are combined as a joint procedure.
Multiple-camera extrinsic calibration
Dual-camera extrinsic calibration
Single-camera localization optimization
Several complete tags may appear in the field of view of a single camera at the same time, in which case AprilTag provides several camera poses, one relative to each detected tag. After transformation into the global coordinate system, these poses may differ, because the pose estimation error differs across tag coordinate systems. Obviously, these measurement errors affect the performance of the extrinsic calibration.
The function above assumes that n tags are detected in the image. R_n^C and t_n^C denote the rotation matrix and translation vector of tag n relative to the camera, as given by AprilTag. We model the minimization of F(R_C^G, t_C^G) as a nonlinear optimization problem and solve it using the Levenberg–Marquardt method [19, 20].
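A minimal sketch of this refinement, assuming SciPy is available; the rotation-vector parameterization and residual form below are our own choices for illustration, not details specified in the paper:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def refine_pose(measured_Ts):
    """Fuse several per-tag camera-pose measurements (4x4 transforms,
    already mapped into the global frame via T_n^0) into one refined
    pose by Levenberg-Marquardt.  Residuals stack a rotation-vector
    error and a translation error against each measurement."""
    def residuals(x):
        R = Rotation.from_rotvec(x[:3])   # candidate global rotation
        t = x[3:]                         # candidate global translation
        res = []
        for T in measured_Ts:
            dR = Rotation.from_matrix(T[:3, :3]) * R.inv()
            res.extend(dR.as_rotvec())    # rotation discrepancy
            res.extend(T[:3, 3] - t)      # translation discrepancy
        return np.asarray(res)

    # Initialize from the first measurement.
    x0 = np.hstack([Rotation.from_matrix(measured_Ts[0][:3, :3]).as_rotvec(),
                    measured_Ts[0][:3, 3]])
    sol = least_squares(residuals, x0, method='lm')
    T_ref = np.eye(4)
    T_ref[:3, :3] = Rotation.from_rotvec(sol.x[:3]).as_matrix()
    T_ref[:3, 3] = sol.x[3:]
    return T_ref
```

With identical rotations and translations that disagree, the least-squares solution lands at the average translation, which matches the intuition that the refined pose balances the per-tag measurement errors.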
Camera and laser extrinsic calibration
It is assumed that at least one camera shares a common field of view with the 2D laser scanner, i.e., the laser points fall within that camera's field of view. We extrinsically calibrate the 2D laser scanner and the camera using plane-line correspondences. Unlike [14, 21], our method employs the AprilTag array as the calibration target.
Although the forward camera and the laser scanner may have no common field of view, we can still refine T_L^F by the nonlinear minimization process described in the previous section.
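The core geometric constraint behind this calibration (in the style of Zhang and Pless [14]) is that every laser point on the target, once mapped into the camera frame, must lie on the target plane estimated from the AprilTag array. A sketch of the per-point residual, with function and variable names of our own choosing:

```python
import numpy as np

def plane_constraint_residuals(R, t, n, d, laser_pts):
    """Residuals of the plane constraint for laser-camera calibration:
    each laser point p (3-D, in the laser frame) is mapped into the
    camera frame as R p + t and must satisfy the target-plane equation
    n . x = d, where (n, d) come from the AprilTag array pose.  Returns
    one signed point-to-plane distance per laser point; stacking these
    over several target poses yields the system solved linearly and
    then refined nonlinearly."""
    pts_cam = laser_pts @ R.T + t         # laser frame -> camera frame
    return pts_cam @ n - d                # signed distance to the plane
```

Driving these residuals to zero over multiple target placements constrains all six degrees of freedom of the laser-camera extrinsic.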
Algorithm: Multi-sensor system extrinsic calibration
Input:
1. Synchronized images and laser scan point coordinates.
2. Intrinsic parameters of each camera.
3. The transformations between two arbitrary tags in the calibration target.
1. Dual-camera extrinsic calibration
(a) Single-camera extrinsic calibration using AprilTag.
(b) Optimization of single-camera extrinsic parameters by a nonlinear minimization process.
(c) Computing the extrinsic parameters (R_D^F, t_D^F) by matrix transformations.
2. Camera and laser extrinsic calibration
(a) Using plane-line correspondence for the geometry constraints.
(b) Solving these equations for the linear solution.
(c) Optimizing the linear solution to refine the camera and laser extrinsic parameters.
3. Global optimization to refine the multi-sensor system extrinsic parameters.
(a) Computing the extrinsic parameters between other cameras and the laser.
(b) Global optimization of all the sensors extrinsic parameters.
Output: Extrinsic parameters of multi-sensor system
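The global refinement in step 3 minimizes a re-projection error over all sensors. As a minimal sketch of the per-camera residual under a standard pinhole model (the function and parameter names are ours, for illustration only):

```python
import numpy as np

def reprojection_residuals(K, T_c_g, corners_g, corners_px):
    """Re-projection error for one camera: known 3-D tag corners in the
    global frame are mapped through the candidate extrinsic T_C^G into
    the camera frame, projected by the intrinsic matrix K, and compared
    with the detected pixel corners.  Stacking these residuals over all
    cameras (together with the laser plane constraints) gives the cost
    minimized in the global optimization."""
    pts_c = corners_g @ T_c_g[:3, :3].T + T_c_g[:3, 3]   # global -> camera
    proj = pts_c @ K.T
    uv = proj[:, :2] / proj[:, 2:3]                       # perspective divide
    return (uv - corners_px).ravel()
```

A corner on the optical axis projects to the principal point, so a detection there yields a zero residual; any pose error shows up directly as a pixel offset.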
Single-camera pose estimation
Table: RMSE of position estimation with multiple tags
Table: RMSE of attitude estimation with multiple tags and the optimization result errors
Multi-sensor joint extrinsic calibration
Conclusion and future work
In this paper, we proposed a new algorithm for extrinsically calibrating a multi-sensor system comprising multiple cameras and a 2D laser scanner, using an AprilTag array as the calibration target. The algorithm uses AprilTag to estimate camera poses and employs a nonlinear optimization to refine these poses when multiple tags are in the field of view. The camera–camera and laser–camera extrinsic parameters are then estimated on the basis of the single-camera poses. Finally, a global optimization refines all the extrinsic parameters. The algorithm handles multi-camera extrinsic calibration with non-overlapping fields of view, is simple to use, and performs well, as validated on real data.
In future work, more experiments and analyses should examine how factors such as the number of tags in the field of view and the camera model affect calibration accuracy. In addition, the multi-sensor system could be extended to other sensor configurations, and accurate and stable dynamic calibration could be considered.
DT proposed the calibration algorithm, designed the validation process, and wrote the manuscript. TH, LS, ZM and CP built the experiment system for the validation, analyzed the data, and revised the manuscript. All authors read and approved the final manuscript.
This work was supported in part by the Scientific Research Foundation of Graduate School of National University of Defense Technology, in part by the National Science Foundation of China under Grant 61005077 and Grant 61075072 and in part by the Foundation for the Author of Excellent Doctoral Dissertation of Hunan Province under Grant YB2011B0001. The authors would like to thank Boxin Zhao, Zhaowei Ma, Shulong Zhao, Zhiwei Zhong, Dou Hu, etc. for their contribution in building the multi-sensor system.
The authors declare that they have no competing interests.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
- Biber P, Andreasson H, Duckett T, Schilling A. 3D modeling of indoor environments by a mobile robot with a laser scanner and panoramic camera. In: IEEE/RSJ international conference on intelligent robots and systems (IROS), 2004, pp. 3430–35.
- Huh S, Shim D, Kim J. Integrated navigation system using camera and gimbaled laser scanner for indoor and outdoor autonomous flight of UAVs. In: IEEE/RSJ international conference on intelligent robots and systems (IROS), 2013, pp. 3158–63.
- Olson E. AprilTag: a robust and flexible visual fiducial system. In: IEEE international conference on robotics and automation (ICRA), 2011, pp. 3400–07.
- Knorr M, Niehsen W, Stiller C. Online extrinsic multi-camera calibration using ground plane induced homographies. In: IEEE intelligent vehicles symposium, 2013, pp. 236–41.
- Li B, Heng L, Koser K, Pollefeys M. A multiple-camera system calibration toolbox using a feature descriptor-based calibration pattern. In: IEEE/RSJ international conference on intelligent robots and systems (IROS), 2013, pp. 1301–07.
- Schmalz C, Forster F, Angelopoulou E. Camera calibration: active versus passive targets. Opt Eng. 2011;50(11):113601.
- Svoboda T, Martinec D, Pajdla T. A convenient multicamera self-calibration for virtual environments. Presence Teleoperators Virtual Environ. 2005;14(4):407–22.
- Tsai R, Lenz R. A new technique for fully autonomous and efficient 3D robotics hand/eye calibration. IEEE Trans Robot Autom. 1989;5(3):345–58.
- Shiu Y, Ahmad S. Calibration of wrist-mounted robotic sensors by solving homogeneous transform equations of the form AX = XB. IEEE Trans Robot Autom. 1989;5(1):16–29.
- Bouguet J. Camera calibration toolbox for Matlab. 2004.
- Strauβ T, Ziegler J, Beck J. Calibrating multiple cameras with non-overlapping views using coded checkerboard targets. In: International conference on intelligent transportation systems (ITSC), 2014, pp. 2623–28.
- Kwak K, Huber D, Badino H, Kanade T. Extrinsic calibration of a single line scanning lidar and a camera. In: IEEE/RSJ international conference on intelligent robots and systems (IROS), 2011, pp. 25–30.
- Li G, Liu Y, Dong L, Cai X, Zhou D. An algorithm for extrinsic parameters calibration of a camera and a laser range finder using line features. In: IEEE/RSJ international conference on intelligent robots and systems (IROS), 2007, pp. 3854–59.
- Zhang Q, Pless R. Extrinsic calibration of a camera and laser range finder (improves camera calibration). In: IEEE/RSJ international conference on intelligent robots and systems (IROS), 2004, pp. 2301–06.
- Geiger A, Moosmann F, Car O, Schuster B. Automatic camera and range sensor calibration using a single shot. In: IEEE international conference on robotics and automation (ICRA), 2012, pp. 3926–43.
- Zhou L, Deng Z. Extrinsic calibration of a camera and a LIDAR based on decoupling the rotation from the translation. In: IEEE intelligent vehicles symposium (IV), 2012, pp. 642–48.
- Mirzaei F, Kottas D, Roumeliotis S. 3D LIDAR-camera intrinsic and extrinsic calibration: identifiability and analytical least-squares-based initialization. Int J Robot Res. 2012;31(4):452–67.
- Corke P. Robotics, vision and control. Stanford: Stanford University; 2014.
- Levenberg K. A method for the solution of certain problems in least squares. Q Appl Math. 1944;2:164–68.
- Marquardt D. An algorithm for least squares estimation of nonlinear parameters. SIAM J Appl Math. 1963;11(2):431–41.
- Zhou L. A new minimal solution for the extrinsic calibration of a 2D LIDAR and a camera using three plane-line correspondences. IEEE Sens J. 2014;14(2):442–54.