ISSN: 2229-371X


LINEAR AND NON-LINEAR CAMERA CALIBRATION TECHNIQUES

Suchi Upadhyay 1, S.K. Singh 2, Manoj Gupta 3, Ashok K. Nagawat 4
  1. M.Tech Student, Suresh Gyan Vihar University, Jaipur (Rajasthan), India
  2. Associate Professor & M.Tech Co-ordinator, Suresh Gyan Vihar University, Jaipur (Rajasthan), India
  3. Assistant Professor, Department of Computer Science & Engineering, Central University of Rajasthan, Kishangarh, Ajmer (Rajasthan), India
  4. Professor, University of Rajasthan, Jaipur, India

Visit for more related articles at Journal of Global Research in Computer Sciences

Abstract

This paper deals with calibrating a camera to find the intrinsic and extrinsic camera parameters that are necessary to recover the depth of an object in a stereovision system.

Keywords

Camera Calibration, Tsai’s algorithm, Stereovision, Linear Calibration, Non-Linear Calibration, Depth estimation

INTRODUCTION

A 3D projection is a mathematical transformation used to project three-dimensional points onto a two-dimensional plane. Often this is done to simulate the relationship of the camera to the subject. 3D projection is often the first step in representing three-dimensional shapes two-dimensionally in computer graphics. Perspective projection is a type of rendering that graphically approximates, on a planar (2D) surface, the images of 3D objects so as to approximate actual visual perception.
Camera calibration is a necessary step in 3D computer vision in order to extract metric information from 2D images. Much work has been done, starting in the photogrammetry community, and more recently in computer vision. Zhengyou Zhang [1] gives an idea in "Camera Calibration with One-Dimensional Objects". According to the dimension of the calibration object, calibration techniques can be classified into three categories:
Self-calibration: Here, no calibration object is used and only image point correspondences are required.
2D plane based calibration
3D reference object based calibration
Camera calibration is the process of determining the internal and external parameters of the camera so that the location of the objects observed by the camera can be determined [2]. If accurate camera calibration methods are used, the problem of recovering depth information from stereo image pairs is significantly simplified.
Basically, a camera calibration technique involves two sets of parameters: intrinsic parameters and extrinsic parameters. The intrinsic parameters define the internal geometric and optical characteristics of the camera, whereas the extrinsic parameters define the position and orientation of the camera within an arbitrarily defined 3D coordinate system [3].
2D Case of Camera Calibration
The techniques in this category require observing a planar pattern (Figure 2). Unlike Tsai's technique, knowledge of the plane's motion is not necessary, because almost anyone can make such a calibration pattern themselves.
Assumption
Pixel coordinates:
The measurement in pixel coordinates is taken from the 2D projected image plane.
(a) Unit in pixels
(b) Origin: top left corner
(c) x values increase from left to right
(d) y values increase from top to bottom
World coordinates:
The measurement in world coordinates is taken from the arbitrary world reference frame.
User defined
In order to measure the real size of objects there must be a mapping from each pixel coordinate to a world coordinate.
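As a minimal illustration of one direction of that mapping, a pixel coordinate can be converted to metric coordinates on the sensor plane given the pixel spacing and the principal point (the values below are hypothetical, chosen only for illustration):

```python
# Hypothetical sensor geometry (illustrative values only).
dx, dy = 0.01, 0.01      # centre-to-centre pixel spacing, mm
xp, yp = 320.0, 240.0    # principal point, pixel coordinates

def pixel_to_sensor_mm(u, v):
    """Map a pixel coordinate (origin top-left, x right, y down) to
    millimetres on the sensor plane, relative to the principal point."""
    return (u - xp) * dx, (v - yp) * dy
```

Note that this only reaches the metric sensor plane; recovering full world coordinates additionally requires the exterior parameters described below.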
Camera parameters introduced
Camera parameters include both interior and exterior parameters, as follows:

INTERIOR PARAMETERS:

Geometry of CCD
dx : Center-to-center distance of pixels in x direction
dy : Center-to-center distance of pixels in y direction
Principal point
xp : x-coordinate of principal point, relative to center of image
yp : y-coordinate of principal point, relative to center of image
Camera constant
f : focal length
Lens distortion coefficients
k1 : first order lens distortion coefficient
k2 : second order lens distortion coefficient
k3 : third order lens distortion coefficient
Frame buffer property
sx : scaling factor
EXTERIOR PARAMETERS:
Rigid body transform
Rx: rotation around x-axis
Ry: rotation around y-axis
Rz: rotation around z-axis
Tx: translation in x direction
Ty: translation in y direction
Tz: translation in z direction
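The interior parameters listed above are commonly assembled into a single 3x3 intrinsic matrix. This is a standard convention rather than something spelled out in the paper; the sketch below assumes zero skew and uses illustrative numeric values:

```python
import numpy as np

# Hypothetical interior-parameter values (for illustration only; the
# paper does not give numbers here).
f = 8.0                 # focal length f, mm
dx = dy = 0.01          # centre-to-centre pixel spacing, mm
sx = 1.0                # frame-buffer scale factor
Cx, Cy = 320.0, 240.0   # principal point, pixels

# Zero-skew convention: K collects the interior parameters into one
# matrix that maps camera-frame coordinates to homogeneous pixels.
K = np.array([[sx * f / dx, 0.0,    Cx],
              [0.0,         f / dy, Cy],
              [0.0,         0.0,    1.0]])

# Projecting a camera-frame point (xc, yc, zc) to a pixel coordinate.
p = K @ np.array([0.1, 0.0, 2.0])
pixel = p[:2] / p[2]
```

The exterior parameters (rotation and translation) then place this camera model in the world coordinate frame.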
3D Case of Camera Calibration
3D Camera calibration is performed by observing a calibration object whose geometry in 3D space is known with very good precision.
The calibration object usually consists of two or three planes orthogonal to each other (Figure 5). Sometimes, a plane undergoing a precisely known translation is also used, which equivalently provides 3D reference points.
Tsai’s Perspective Projection Camera Model
Tsai uses a pinhole camera model to describe the transformation of points in 3D space to pixels in the camera's frame buffer. Tsai's camera model consists of 11 parameters: six extrinsic, "exterior-orientation" parameters (Rx, Ry, Rz, Tx, Ty, Tz) and five intrinsic, "interior-orientation" parameters (f, Cx, Cy, sx, k1). For a fixed lens all 11 camera parameters are constants estimated from calibration data taken from a single camera view (i.e. the exterior and interior orientation of the camera is fixed).
Tsai’s camera model contains the following parameters:
R : a 3x3 rotation matrix
T : a translation vector
f : the focal length of the camera
sx : an uncertainty scale factor introduced by the image capture hardware
k1 : the radial lens distortion coefficient
(Cx, Cy) : the image centre
Of these R, T, f, sx and k1 are to be determined using Tsai’s algorithm for calibration. Cx and Cy can be determined beforehand, and will not need to be re-calibrated.
In Tsai's model, illustrated in Fig. 1, the origin of the camera-centered coordinate system (xc,yc,zc) coincides with the front nodal point of the camera; the zc axis coincides with the camera's optical axis. The image plane is assumed to be parallel to the (xc,yc) plane and at a distance f from the origin, where f is the pinhole camera's effective focal length. The relationship between the position of a point P within the world coordinates (xw,yw,zw) and the point's image in the camera's frame buffer (Xf,Yf) is defined by a sequence of coordinate transformations. The first transformation is a rigid body rotation and translation from the world-coordinate system (xw,yw,zw) to the camera-centered coordinate system (xc, yc, zc). This is described by
(xc, yc, zc)^T = R (xw, yw, zw)^T + T (1)
where T = (Tx, Ty, Tz)^T is the translation vector and
R = [ r1 r2 r3 ; r4 r5 r6 ; r7 r8 r9 ] (2)
is the 3x3 rotation matrix describing the orientation of the camera in the world-coordinate system. R can also be expressed as
R = Rot (Rx) Rot (Ry) Rot (Rz) (3)
the product of three rotations around the x, y, and z axes of the world-coordinate system.
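This product of elementary rotations, equation (3), can be sketched as follows (a Python/NumPy illustration, not the paper's MATLAB code):

```python
import numpy as np

def rot_x(a):
    # Elementary rotation about the x axis by angle a (radians).
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rotation(Rx, Ry, Rz):
    # Equation (3): R = Rot(Rx) Rot(Ry) Rot(Rz)
    return rot_x(Rx) @ rot_y(Ry) @ rot_z(Rz)

R = rotation(0.1, -0.2, 0.3)
# Any such product is orthonormal with determinant +1.
assert np.allclose(R @ R.T, np.eye(3))
assert np.isclose(np.linalg.det(R), 1.0)
```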
The second transformation is a perspective projection (using an ideal pinhole-camera model) of the point in the camera coordinates to the position of its image in undistorted sensor-plane coordinates, (Xu,Yu). This transformation is described by
Xu = f xc / zc (4)
Yu = f yc / zc (5)
The third transformation, illustrated in Figure 6, is from the undistorted (ideal) position of the point's image in the sensor plane to the true position of the point's image, (Xd,Yd), which results from geometric lens distortion. This is described by
Xu = Xd (1 + k1 r^2) (6)
Yu = Yd (1 + k1 r^2) (7)
where r = sqrt(Xd^2 + Yd^2) and k1 is the coefficient of radial lens distortion.
The final transformation is between the true position of the point's image on the sensor plane and its coordinates in the camera's frame buffer, (Xf,Yf). This is described by
Xf = sx Xd / dx + Cx (8)
Yf = Yd / dy + Cy (9)
where, Cx and Cy are the coordinates (in pixels) of the intersection of the zc axis and the camera's sensor plane; dx and dy are the effective center-to-center distances between the camera's sensor elements in the xc and yc directions; and sx is a scaling factor to compensate for any uncertainty in the ratio between the number of sensor elements on the CCD and the number of pixels in the camera's frame buffer in the x direction [7].
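The four transformations above can be chained into a forward projection from world coordinates to frame-buffer pixels. The sketch below (Python/NumPy, with illustrative parameter values) inverts the radial-distortion step numerically, since the model gives undistorted coordinates in terms of distorted ones:

```python
import numpy as np

def project_tsai(Pw, R, T, f, k1, sx, dx, dy, Cx, Cy, iters=20):
    """Forward projection through Tsai's four transformations.
    The distortion model expresses undistorted coordinates in terms
    of distorted ones, so that step is inverted by fixed-point
    iteration (assumption: k1 small enough for it to converge)."""
    # 1) world -> camera (rigid body rotation and translation)
    xc, yc, zc = R @ Pw + T
    # 2) pinhole perspective projection onto the sensor plane
    Xu, Yu = f * xc / zc, f * yc / zc
    # 3) invert Xu = Xd (1 + k1 r^2) by fixed-point iteration
    Xd, Yd = Xu, Yu
    for _ in range(iters):
        r2 = Xd**2 + Yd**2
        Xd, Yd = Xu / (1 + k1 * r2), Yu / (1 + k1 * r2)
    # 4) sensor plane (mm) -> frame-buffer pixels
    Xf = sx * Xd / dx + Cx
    Yf = Yd / dy + Cy
    return Xf, Yf
```

For example, with no distortion, identity rotation and a point 10 units in front of the camera, an offset of 0.5 in x lands 40 pixels right of the image centre for the hypothetical values f = 8 mm and dx = 0.01 mm.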

TSAI’S ALGORITHM

The algorithm given by Tsai is a two-stage process designed to be performed without operator assistance. It calibrates the R, T, f, k1 and sx parameters from the above camera model (Figure 3). The algorithm executes quickly on PC hardware due to the absence of large non-linear searches.
A calibration pattern is required by this algorithm, and Tsai provides different versions for coplanar and non-coplanar calibration patterns. It is a single-view algorithm; however, it can be adapted for use with multiple views of the calibration pattern. The first stage of the process determines the rotation R, the scale factor sx, and the first two components of the translation vector, Tx and Ty. The focal length f and the z component of the translation vector Tz are also estimated at this stage. This is achieved by solving a system of linear equations whose input is the coordinates of points in the calibration pattern, both in the image and in the real world. The various parameters are then recovered from the solution to this system. The second stage of the process involves a steepest descent search. This is used to determine the radial distortion factor k1, which cannot be recovered by the linear first stage; f and Tz are also refined during the search [3].
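As a simplified illustration of what the second-stage search converges to: when f and Tz are held fixed, the radial-distortion model is linear in k1, so the minimiser of the stage-two objective has a closed form. The sketch below demonstrates this on synthetic sensor-plane data; Tsai's actual implementation searches over k1, f and Tz together by steepest descent:

```python
import numpy as np

rng = np.random.default_rng(0)
k1_true = 5e-3  # synthetic "ground truth" distortion coefficient

# Synthetic distorted sensor-plane points (mm), and the undistorted
# positions the pinhole model would predict for them.
Xd = rng.uniform(-4, 4, 50)
Yd = rng.uniform(-3, 3, 50)
r2 = Xd**2 + Yd**2
Xu = Xd * (1 + k1_true * r2)
Yu = Yd * (1 + k1_true * r2)

# The residual Xd (1 + k1 r^2) - Xu is linear in k1, so setting the
# derivative of the sum-of-squares objective to zero gives k1 directly.
num = np.sum(Xd * r2 * (Xu - Xd) + Yd * r2 * (Yu - Yd))
den = np.sum(r2**2 * (Xd**2 + Yd**2))
k1_est = num / den
```

On noise-free synthetic data this recovers the generating coefficient; with real measurements the iterative search trades this closed form for joint refinement of all three parameters.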

SYSTEM IMPLEMENTATION

Calibrate a projective camera using a linear least-squares approach, without taking radial distortion into account.
Given a MATLAB data file that contains the 3D coordinates of some points in the scene along with their 2D projections in the image, we write a MATLAB function called LinearCalib that computes the projective camera parameters. The signature of the function is as follows:
function [CamCalib] = LinearCalib(Points3D, Points2D)
Inputs: Points2D = a 2xN matrix of N 2D points; Points3D = a 4xN matrix of N homogeneous 3D points.
Outputs: CamCalib = a 3x4 projective camera matrix
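The linear least-squares estimation can be sketched as follows. The paper's LinearCalib is a MATLAB function; this Python/NumPy version is an illustrative port of the standard direct linear transformation (DLT), not the authors' code:

```python
import numpy as np

def linear_calib(points3d, points2d):
    """Linear least-squares (DLT) estimate of the 3x4 projection matrix.
    points3d: 4xN homogeneous world points; points2d: 2xN pixel points.
    Returns the matrix up to an arbitrary overall scale."""
    n = points2d.shape[1]
    A = []
    for i in range(n):
        X = points3d[:, i]
        u, v = points2d[:, i]
        # Each correspondence contributes two homogeneous equations.
        A.append(np.concatenate([X, np.zeros(4), -u * X]))
        A.append(np.concatenate([np.zeros(4), X, -v * X]))
    A = np.asarray(A)
    # The solution is the right singular vector of A with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4)
```

Since the matrix is recovered only up to scale (and sign), comparisons against a reference matrix should be made after normalisation.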
H/W and S/W Requirements
Basic H/W Requirements
CPU: Intel-Original, Pentium IV, 2.4 GHz or faster
RAM: 256 MB or greater for best performance
Hard Disk: 40 GB with at least 10 GB of free space
Webcam: Logitech Quick cam Pro 3000
Digital Camera: Canon PowerShot A420, 4.0 megapixels with 11x magnification ratio.
Basic S/W Requirements
Operating System: Windows XP with Service Pack 2
Development Tool: MATLAB 7.1
DirectX: Release 9 or later
Flow Diagram of the Project
This work consists of two parts: linear and non-linear camera calibration techniques. For the linear camera calibration technique we used a single camera to take one view of a grid-pattern calibration object; for the non-linear camera calibration technique, two cameras (or the same camera used twice) take separate views of the same object. The non-linear camera calibration technique gives us the depth information of an object in a stereovision system. The flow of the work can therefore be divided into the following two parts:
Flow Diagram for Camera Calibration
Step1: Two planes are placed at a right angle with checkerboard patterns.
Step2: We know the positions of the selected points with respect to the world coordinate system of the target.
Step3: We position the camera in front of the target and find image coordinates in pixels.
Step4: We obtain functions in MATLAB that calculate the projection matrix M.
Step5: Now we find the camera intrinsic and extrinsic parameters with respect to the target in the world reference frame.
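The last two steps, recovering the individual parameters from the projection matrix M, can be sketched as below. This uses a standard zero-skew closed-form decomposition (as in Forsyth & Ponce [9]) rather than the paper's own MATLAB code, and the numeric check uses illustrative values:

```python
import numpy as np

def decompose_projection(M):
    """Recover fx, fy, Ox, Oy, R, T from a 3x4 projection matrix,
    assuming zero skew."""
    A, b = M[:, :3], M[:, 3]
    # Fix the overall scale so the third rotation row has unit length;
    # pick the sign that puts the scene in front of the camera (Tz > 0).
    rho = 1.0 / np.linalg.norm(A[2])
    if rho * b[2] < 0:
        rho = -rho
    A, b = rho * A, rho * b
    r3 = A[2]
    Ox, Oy = A[0] @ r3, A[1] @ r3          # principal point
    fx = np.sqrt(A[0] @ A[0] - Ox**2)      # focal lengths in pixels
    fy = np.sqrt(A[1] @ A[1] - Oy**2)
    r1 = (A[0] - Ox * r3) / fx
    r2 = (A[1] - Oy * r3) / fy
    R = np.vstack([r1, r2, r3])
    Tz = b[2]
    T = np.array([(b[0] - Ox * Tz) / fx, (b[1] - Oy * Tz) / fy, Tz])
    return fx, fy, Ox, Oy, R, T
```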
Flow Diagram for Depth Estimation
Step1: Two planes are placed at a right angle with checkerboard patterns.
Step2: Capture the image with respect to a fixed world coordinate frame.
Step3: Capture the same image from a shifted position with respect to the same fixed world coordinate frame.
Step4: Generate the 2D projected image, taken as input to calculate the projection matrix M.
Step5: The projection matrix M gives us the camera parameters fx, fy, Ox, Oy, R and T.
Step6: From these parameters we can obtain the depth ratio Zl / Zr.
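Once the two projection matrices are known, the depth of a point in each camera frame follows from triangulation: the point is reconstructed linearly from its two images, and the third row of each projection matrix reads off its depth. A sketch (synthetic matrices, chosen for illustration; not the paper's data):

```python
import numpy as np

def triangulate(Ml, Mr, xl, xr):
    """Linear triangulation of one point from two views, given the
    3x4 projection matrices and pixel coordinates in each image."""
    A = np.array([xl[0] * Ml[2] - Ml[0],
                  xl[1] * Ml[2] - Ml[1],
                  xr[0] * Mr[2] - Mr[0],
                  xr[1] * Mr[2] - Mr[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X / X[3]          # homogeneous world point

def depth_ratio(Ml, Mr, X):
    # Depth in each camera is the third row of M applied to X.
    return (Ml[2] @ X) / (Mr[2] @ X)
```

For two cameras related by a pure sideways translation, the ratio is 1 for every point, since both cameras see the point at the same depth.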

CONCLUSION

In this paper we discuss an approach to calibrate a single camera and estimate the depth of an object in a stereovision system. Camera calibration is done to get the intrinsic and extrinsic parameters of a camera. These parameters can further be used to acquire depth information of an object in a stereovision system. The depth, together with the orientation, can be used to reconstruct the 3D shape of the object from its two-dimensional projections, a process known as 3D reconstruction.
Our previous work gives an idea of the depth information of an object in a stereovision system; from that depth information a mesh representation of the object can be obtained, and we hope this will help us reconstruct 3D objects in the near future.

References

  1. Zhengyou Zhang. Camera Calibration with One-Dimensional Objects, Microsoft Research Technical Report MSR-TR-2001-120, August 2002.
  2. Barry McCullagh, Fergal Shevlin. Coplanar Camera Calibration With Small Depth Of Field Lens, Trinity College, Ireland.
  3. Michael Tapper and Phillip J. McKerrow and Jo Abrantes. Problems Encountered in the Implementation of Tsai’s Algorithm for Camera Calibration, University of Wollongong, North Wollongong 2510.
  4. R. Y. Tsai. A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses. IEEE Journal of Robotics and Automation, RA-3(4):323-344, August 1987.
  5. R. K. Lenz and R. Y. Tsai. "Techniques for Calibration of the Scale Factor and Image Center for High Accuracy 3-D Machine Vision Metrology", IEEE Trans. Pattern Analysis and Machine Intelligence, vol. PAMI-10, no. 5, pp. 713-720, 1988.
  6. Horn, B.K.P. (2000). Tsai’s Camera Calibration method revisited, MIT Press, Cambridge, Massachusetts and McGraw-Hill, New York.
  7. Reg G. Willson. Modeling and Calibration of Automated Zoom Lenses, 3M Center, Building 518-1-01, Saint Paul, MN 55144-1000 USA.
  8. Van de Loosdrecht Machine Vision, Noordelijke Hogeschool Leeuwarden. 2D Camera Calibration on Computer vision, 12 January 2007.
  9. David A. Forsyth & Jean Ponce. Book: Computer Vision (A Modern Approach), Pearson Education.
  10. Eric Marchand and François Chaumette. A New Formulation for Non-Linear Camera Calibration Using Virtual Visual Servoing, INRIA Research Report No. 4096 (Thème 3), January 2001.
  11. Benoit Telle and Marie-Josée Aldon, LIRMM, UMR CNRS/UMII n.C55060, 161 rue ADA, 34392 Montpellier Cedex 5, France, and Nacim Ramdani, CERTES, Université Paris XII - Val de Marne, av. G. de Gaulle, 94000 Créteil. Camera Calibration and 3D Reconstruction Using Interval Analysis.
  12. O. Faugeras. Three-Dimensional Computer Vision: A Geometric Approach, MIT Press, 1996, pp. 33-68.
  13. R. Hartley and A. Zisserman. Multiple View Geometry in Computer Vision, Cambridge University, Press, 2000, pp. 138-183.
  14. T. A. Clarke. City University, London, UK and J. G. Fryer. The University of Newcastle, Australia, The Development of Camera Calibration Methods and Models.
  15. Peter Sturm and Srikumar Ramalingam. A Generic Concept for Camera Calibration, INRIA Rhone-Alpes, 38330 Montbonnot, France, University of California, Santa Cruz, CA 95064, USA.
  16. Philippe Guermeur. ENSTA - laboratoire LEI - 32 Bd Victor - 75015 Paris – France and Jean Louchet. INRIA Rocquencourt, Fractales project - B.P. 105 - 78153 Le Chesnay Cedex – France, An Evolutionary Algorithm for Camera Calibration.
  17. Barry McCullagh, Fergal Shevlin. Coplanar Camera Calibration with Small Depth of Field Lens, Trinity College, Dublin 2, Ireland.
  18. H. Zollner and R. Sablatnig. Comparison of Methods for Geometric Camera Calibration using Planar Calibration Targets, Vienna University of Technology (PRIP Group).