SG191452A1 - Automatic calibration method and apparatus - Google Patents
- Publication number
- SG191452A1
- Authority
- SG
- Singapore
- Prior art keywords
- nfov
- camera
- wfov
- cameras
- image
- Prior art date
- 2011-12-30
Abstract
A method and apparatus for automatically calibrating between a narrow field of view (NFOV) camera and a wide field of view (WFOV) camera by performing feature matching between the respective NFOV and WFOV images, using the matched points to form a transformation matrix, and using this transformation matrix to determine the (X, Y) coordinates of the WFOV image that correspond to the specific set of (P, T, Z) values used.
Description
AUTOMATIC CALIBRATION METHOD AND APPARATUS
[001] The present invention relates to a method for automatic calibration of a
Narrow Field-Of-View (NFOV) camera to a Wide Field-Of-View (WFOV) camera and an apparatus for carrying out the method.
[002] There has been considerable innovation in the area of surveillance and detection with NFOV and WFOV camera systems. NFOV and WFOV camera systems remain the best option for surveillance because of the combination of expansive and detailed views that can be achieved. A single WFOV camera has a field of view of 60 degrees or more and enables surveillance from a distance of, for example, 50 meters, while some clustered WFOV camera systems allow views of 150 degrees or more. NFOV cameras cover far less area, since their lenses are designed for a narrower scope; an NFOV camera typically has a field of view of 10 degrees or less. When the two cameras are combined into one system, detailed surveillance can be accomplished over a wide field of view. One major application for such NFOV and WFOV camera systems is gathering biometric data.
[003] Most conventional non-intrusive biometric systems use a combination of WFOV and NFOV cameras. These systems are typically used to track a person or thing and gather biometric data from a distance. Normally, the WFOV camera(s) is or are kept stationary and used to monitor a wide area. Once the WFOV camera detects the presence of a human or object moving through the field of view, each NFOV camera pans, tilts and zooms in on the target and gathers detailed images. However, calibrating between the NFOV and WFOV cameras can sometimes be a problem, because the coordinates of a WFOV camera do not align with the coordinates of an NFOV camera.
[004] Manually calibrating from an NFOV camera to a WFOV camera is an extremely tedious and time-consuming task. Identifiable markers are placed at pre-designated (X, Y) coordinates of the WFOV image. The NFOV camera is then manually adjusted so the identifiable marker can be seen, and the pan, tilt and zoom (P, T, Z) values of the NFOV camera are manually recorded. This process is repeated for the next set of coordinates. Consequently, this method requires a lot of time to accomplish very little.
[005] An object within the field of view of the WFOV camera can be viewed in more detail using the NFOV camera if the system is properly calibrated.
Therefore, an efficient, automatic method to map the (P, T, Z) values of an NFOV camera to the (X, Y) coordinates of a WFOV camera is highly desired. Recently, some automatic calibration methods have been devised to address this need for accurate and simple mapping between an NFOV camera and a WFOV camera.
[006] One attempt at addressing this need can be found in US Patent No. 5,062,056, which discloses a calibration method that uses a correlation tracker and an object tracker simultaneously and cooperatively to “correlate a reference region image” and then “provide a precisely defined track point for an object within the image.” Similarly, the present invention also relies on feature matching between images taken from different cameras to provide detailed images of an object over an expansive range. However, this patent does not contain any reference to a method of calibration between the correlation tracker and the object tracker. It only details the use of both cameras in cooperation for target tracking and reduction of background clutter during target tracking. The present invention deals with an automatic method of calibration between two cameras.
[007] US Patent No. 5,434,617 discloses an automatic surveillance system which utilizes two cameras: a fixed spotting camera which provides a general field of view, and a tracking camera with pan/tilt/zoom/focus features that is used to track an object in detail. This patent discusses the mapping of points from the general image provided by the spotting camera to the pan/tilt/zoom/focus of the tracking camera. One objective of the present invention is to map pan/tilt/zoom values of an NFOV camera (tracking camera) to the (X, Y) coordinates of a WFOV camera (spotting camera). This patent requires the use of reference points to complete the mapping process. The mapping between the spotting image and the tracking camera is based on knowing the settings which aim the tracking camera to predetermined locations within the viewing angle of the spotting camera. These predetermined locations are typically the four corners of the spotting image. These settings are tabulated during alignment procedures that take place during the system setup process. Interpolation is then used to estimate the pan/tilt/zoom/focus of the tracking camera based on the location of the person in the image of the spotting camera. With the present invention, the mapping between the pan/tilt/zoom of the NFOV camera and the (X, Y) coordinates of the WFOV image is done by way of a transformation matrix which is determined via feature matching between the NFOV and WFOV images. Hence, there is no need for interpolation, knowledge of camera settings, or reference points.
[008] US Patent No. 6,215,519 discloses a surveillance and monitoring system that includes an imaging system with a wide field of view and one or more additional systems which have adjustable view settings (P, T, Z) and high resolution capabilities. This patent discloses a system which maps the (X, Y) coordinate system of the wide field of view image to the coordinates of the PTZ imaging system. The present invention sets out to accomplish the same goal of mapping between a WFOV image and the PTZ values of an NFOV camera; however, the present invention accomplishes this objective with the only required inputs being the images from the respective cameras. The cited patent requires additional parameters such as the physical location, distance and angles of the object relative to the wide field of view and PTZ imaging systems.
Overall, the patent is directed towards mapping between the wide field of view and the PTZ camera while the present invention is directed to a method of automatic calibration between said cameras.
[009] US Patent No. 6,734,911 discloses an alternative embodiment which is similar to the present invention. The alternative embodiment described in this patent concerns a narrow-angle camera and a wide-angle camera mounted on independent bases. The patent states that aiming may be synchronized by way of a variety of techniques including “gears, servos, or a pair of independent aiming motors operating under the control of a single controller.” It does not disclose any method of automatic calibration between the two cameras. While the present invention also concerns a system set-up similar to the one described in this patent, the present invention is directed to a method of automatic calibration between the NFOV and WFOV images using a transformation matrix.
[010] US Patent No. 6,778,207 discloses a method of using several mounted cameras whose fields of view overlap to create a virtual PTZ camera. This patent uses the images captured from the mounted cameras as the inputs for its method. These images are merged by transforming them to a common planar surface and blending them together, accounting for overlapping regions. In addition, as disclosed in this patent, zooming is done by way of interpolating the pixels so that a low resolution image can be mapped to a high resolution signal as the virtual PTZ camera zooms in. The present invention uses a WFOV camera to produce a wide field of view image while an NFOV camera is used to produce a close-up view. In addition, the cited patent warps high resolution images, while the present invention warps a low resolution image captured from an NFOV camera to a high resolution image captured from a WFOV camera, reducing the complexities involved in the process. Overall, the method stated in the cited patent requires more computations and is more complex and involved than the method of the present invention, since steps such as high resolution image warping, image mosaicing and pixel interpolation are required.
[011] US Patent No. 7,050,085 discloses a method of calibrating multiple cameras using a multi-lens camera and a structure, the edges of which have at least one row of indicia. For this patent, the camera is placed in the center of the structure and records an image that includes the indicia along each edge, combined with an edge of another image to form a panorama, each image being captured by two lenses on the camera. Human intervention is then required to select the feature points so the two lenses can be calibrated. With the present invention, on the other hand, the calibration process is completely automated and no human intervention is needed. In addition, the only inputs required for the invention are the images from the different cameras, unlike the case in this patent where a reference object is needed. Furthermore, in the present invention, there is no panoramic image created; rather, the WFOV camera is used to provide an expansive view of the subject while the NFOV camera is used to produce a close-up view.
[012] US Patent No. 7,212,228 discloses a method for automatic calibration of a system comprising a plurality of cameras. In this patent, prior camera information must be collected before the calibration process, including the posture in the world coordinate system of at least one camera and the positions in the world coordinate system of at least two cameras. Reference objects are also required; specifically, humans are used as the reference objects. Features such as the position of the representative point of the person being tracked, the position of the head of the person being tracked and the color of the clothes of the person being tracked are needed to correlate between cameras. In the present invention, no such intrinsic or extrinsic parameters of the cameras or reference objects are required. The only necessary inputs are images from the NFOV and WFOV cameras.
[013] US Patent No. 7,548,253 discloses a method of self-calibration of a wide field of view camera using a sequence of omni-directional images of a scene obtained from a camera. This patent uses tracked features collected from the images to determine unknown calibration parameters. The present invention, on the other hand, uses matched feature points in images taken by NFOV and WFOV cameras to calibrate the camera system. In addition, the system disclosed in this patent was developed for the purpose of “correcting” the images collected rather than for the purpose of comparing and matching between images.
[014] US Patent No. 7,629,995 discloses a method of correlating views of two or more cameras. However, this patent does not teach comparing the images obtained from the camera systems. The patent simply states that “each data point will have a set of data point coordinates corresponding to its location relative to the immersive camera system.” In addition, the patent recommends the use of interpolation or extrapolation with regard to the data point coordinate sets for correlation between the views of the cameras. In contrast, the present invention does not require interpolation or extrapolation of data points.
[015] While this known prior art provides various methods of automatically calibrating between narrow and wide FOV cameras, a simpler method that does not require interpolation, reference points, or intrinsic or extrinsic parameters of the cameras, but that still provides an expansive view and retains its accuracy, is highly desirable.
[016] Any and all discussion of documents, devices, acts or knowledge which is provided in this specification is included only to explain the context of the invention. This discussion should not be taken as an admission that any of the material forms a part of the state of the art, or is common general knowledge in the relevant art, on or before the priority date of the disclosure and claims herein.
[017] It is a primary objective of the present invention to provide a method for the automatic calibration of a NFOV camera to a WFOV camera, where such cameras are aimed to produce overlapping camera images.
[018] This object, as well as other objects which will become apparent from the discussion that follows, are achieved, in accordance with the present invention, by providing a method of mapping from an NFOV camera image to a WFOV camera image which comprises the steps of: performing feature matching between the NFOV and WFOV images to obtain a group of matched feature points; removing any “outliers” (incorrectly matched feature points) that present themselves during the feature matching between the NFOV and WFOV images; using each matched feature point to form a transformation matrix [A]; selecting and retaining a plurality, preferably five, of the most accurately matched feature points, with each feature point containing an NFOV point and a WFOV point; determining the centroid position (Ncx, Ncy) of the NFOV image; and multiplying the NFOV centroid position by the transformation matrix [A] to obtain the (X, Y) coordinates of the WFOV image for the (P, T, Z) values used.
[019] It is also an objective of the present invention to provide apparatus for implementing the aforementioned method. According to one aspect of the invention, this apparatus comprises: (a) a single NFOV camera and a single WFOV camera, which are calibrated using the method described above, fixed at a certain location and placed at an elevated level, wherein the NFOV camera is allowed to pan, tilt and zoom and the WFOV camera is stationary; and (b) a personal computer coupled to the cameras and used for storing the mapping between the PTZ values of the NFOV camera and the (X, Y) coordinates of the WFOV image.
[020] According to another aspect of the present invention, the apparatus for implementing the aforementioned method comprises: (a) a single NFOV camera and multiple WFOV cameras, which are calibrated using the method described above, fixed at a certain location and placed at an elevated level, wherein the NFOV camera is allowed to pan, tilt and zoom and the WFOV cameras are stationary; and (b) a personal computer coupled to the cameras and used for storing the mapping between the PTZ values of the NFOV camera and the (X, Y) coordinates of the WFOV images.
[021] According to still another aspect of the present invention, the apparatus for implementing the aforementioned method comprises: (a) multiple NFOV cameras and a single WFOV camera, which are calibrated using the method above, fixed at a certain location and placed at an elevated level, wherein the NFOV cameras are allowed to pan, tilt and zoom and the WFOV camera is stationary; and (b) a personal computer coupled to the cameras and used for storing the mapping between the PTZ values of the NFOV cameras and the (X, Y) coordinates of the WFOV image.
[022] In still another aspect of the present invention, the apparatus comprises: (a) multiple NFOV cameras and multiple WFOV cameras, which are calibrated using the method above, fixed at a certain location and placed at an elevated level, wherein the NFOV cameras are allowed to pan, tilt and zoom and the WFOV cameras are stationary; and (b) a personal computer coupled to the cameras and used for storing the mapping between the PTZ values of the NFOV cameras and the (X, Y) coordinates of the WFOV images.
[023] In all cases there must be overlap between the NFOV and WFOV images for application of the present invention.
[024] One advantage of the present invention is that the camera system does not require intrinsic parameters such as focal length, image format or principal point, or extrinsic parameters such as the position of the camera center or the camera heading, to perform the calibration. The only necessary inputs are the images from the various cameras implemented in the camera system. The present invention is also not restricted to camera systems having a single NFOV camera and a single WFOV camera; rather, the method may be applied to systems that have multiple NFOV and WFOV cameras. Finally, the present invention makes possible an improvement in the speed of re-calibration when a camera in the system is changed or repositioned.
[025] The advantages of the present invention can be seen in more detail in the following description taken in connection with the accompanying drawings, wherein, by way of illustration and example, preferred embodiments of the present invention are disclosed.
[026] Figure 1 is a representational diagram showing a camera system wherein one movable NFOV camera and one stationary WFOV camera are set at a certain location. Both cameras are connected to a personal computer (PC) for storage of mapping data: namely, the P, T and Z values of the NFOV camera and the (X, Y) coordinates of the WFOV image.
[027] Figure 2 is a flowchart of the automatic calibration method according to the present invention.
[028] Figure 3 shows a camera system featuring a single NFOV camera with multiple WFOV cameras. This system is connected to a PC in the same fashion as in Figure 1.
[029] Figure 4 shows a camera system featuring multiple NFOV cameras with a single WFOV camera. This system is connected to a PC in the same fashion as in Figure 1.
[030] Figure 5 shows the camera system from Figure 4 providing multiple views of the same object. The system is connected to a PC in the same fashion as in Figure 1.
[031] Figure 6 illustrates a camera system containing multiple NFOV cameras and multiple WFOV cameras providing multiple views of a single object. The system is connected to a PC in the same fashion as in Figure 1.
[032] Figure 7 is a diagram illustrating the mapping between the PTZ values of the NFOV cameras and the (X, Y) coordinates of the WFOV images for a system containing multiple NFOV and WFOV cameras.
[033] The present invention will now be described in detail in connection with the various embodiments with reference to Figures 1-7 in the accompanying drawings. Identical elements in the various figures are designated with the same reference numerals.
[034] The present invention provides a method of automatic calibration wherein the Cartesian (X, Y) coordinates of a Wide Field-Of-View (WFOV) image are determined for each set of pan, tilt and zoom (P, T, Z) values of a Narrow Field-Of-View (NFOV) camera used to make overlapping images. P, T and Z thus represent the pan/tilt/zoom factors of the NFOV camera, and X and Y represent the coordinates of the WFOV image where the NFOV camera is focusing. The only inputs needed for automatic calibration using this method are the images from the respective NFOV and WFOV cameras used in the system.
[035] Conventional approaches to calibrating between wide and narrow FOV cameras are mostly manual, although a few automatic approaches have been proposed in recent years. However, these automatic calibration approaches require the parameters of each camera to compute the transformation matrix between the camera images. With the present invention, none of these parameters are needed and the only necessary inputs are the images from the different cameras. This significantly speeds up the recalibration process when any camera in the system is changed or when the position of any camera is changed. Moreover, a state-of-the-art feature descriptor called Speeded Up Robust Features (SURF) may be used to improve the speed and accuracy of the present invention.
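By way of illustration only, the following is a minimal sketch of such a SURF-based matching front end in OpenCV; the file names, Hessian threshold and ratio-test constant are assumptions, not values taken from the patent:

```python
# A minimal sketch of the feature-matching front end, assuming
# opencv-contrib-python built with the non-free modules (SURF is patented);
# cv2.ORB_create() is a freely available drop-in alternative detector.
import cv2

nfov = cv2.imread("nfov.png", cv2.IMREAD_GRAYSCALE)   # illustrative file names
wfov = cv2.imread("wfov.png", cv2.IMREAD_GRAYSCALE)

surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
kp_n, des_n = surf.detectAndCompute(nfov, None)
kp_w, des_w = surf.detectAndCompute(wfov, None)

# Match descriptors, keeping only confident matches via Lowe's ratio test.
matcher = cv2.BFMatcher(cv2.NORM_L2)
candidates = matcher.knnMatch(des_n, des_w, k=2)
good = [m for m, n in candidates if m.distance < 0.7 * n.distance]

# Each surviving match links a point (Nx, Ny) on the NFOV image to a
# point (Wx, Wy) on the WFOV image, as required by the method below.
points = [(kp_n[m.queryIdx].pt, kp_w[m.trainIdx].pt) for m in good]
```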
[036] The objective of the present invention is to tabulate each set of P, T, Z values of the NFOV camera against the (X, Y) coordinates of the WFOV image, where P, T and Z are the pan, tilt and zoom values of the NFOV camera and X and Y are the coordinate values of the WFOV image that the NFOV camera is pointing at. The method generally consists of the following steps, illustrated by the sketch that follows the list:
[037] For each set of P, T, Z values of the NFOV camera:
1. Feature matching is performed between the NFOV image and the WFOV image. Each matched feature point corresponds to a point on the NFOV image (Nx, Ny) and a point on the WFOV image (Wx, Wy).
2. Any outliers produced during the feature matching are removed, using an algorithm such as RANdom SAmple Consensus (RANSAC). Outliers are incorrectly matched points mapped between the NFOV and WFOV images; they contribute to errors in the pairings between the PTZ values of the NFOV camera and the (X, Y) coordinates of the WFOV image, so removing them helps improve the quality of the calibration.
3. After outliers are removed, a transformation matrix [A] is formed from the matched feature points.
4. Each matched feature point has a corresponding accuracy value; a high accuracy value means there is high confidence that the point on the NFOV image corresponds to the point on the WFOV image. The list of matched feature points is sorted by this accuracy value, and the top five most accurate matched feature points are retained. If the accuracy value is less than a threshold, for example 0.6, another set of P, T and Z values is used.
5. The centroid position (Ncx, Ncy) of the NFOV image is determined and multiplied by the transformation matrix [A].
6. The transformed NFOV centroid position is used as the (X, Y) coordinates of the WFOV image for the set of (P, T, Z) values of the NFOV camera.
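These steps translate fairly directly into OpenCV. The sketch below is one plausible reading, not the patent's reference implementation: a 3x3 planar homography estimated by cv2.findHomography stands in for the transformation matrix [A], the RANSAC inlier mask serves as the outlier filter of step 2, the image centre is taken as the centroid (Ncx, Ncy), and an inverted match distance is an assumed stand-in for the accuracy value of step 4:

```python
import cv2
import numpy as np

def calibrate_ptz_setting(nfov_img, wfov_img, detector, threshold=0.6):
    """Sketch of steps 1-6: returns the WFOV (X, Y) that the NFOV camera's
    current (P, T, Z) setting points at, or None if confidence is too low."""
    # Step 1: feature matching between the NFOV and WFOV images.
    kp_n, des_n = detector.detectAndCompute(nfov_img, None)
    kp_w, des_w = detector.detectAndCompute(wfov_img, None)
    matches = cv2.BFMatcher(cv2.NORM_L2).match(des_n, des_w)
    if len(matches) < 4:  # a homography needs at least four point pairs
        return None
    src = np.float32([kp_n[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_w[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Steps 2-3: RANSAC rejects outliers while fitting the transformation
    # (a 3x3 planar homography stands in for the patent's matrix [A]).
    A, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    if A is None:
        return None
    inliers = [m for m, keep in zip(matches, mask.ravel()) if keep]

    # Step 4: score matches (1 / (1 + distance) is an assumed accuracy
    # measure, not the patent's), retain the five best, and give up below
    # the example threshold of 0.6 so another (P, T, Z) set can be tried.
    inliers.sort(key=lambda m: m.distance)
    top_five = inliers[:5]
    if not top_five or 1.0 / (1.0 + top_five[-1].distance) < threshold:
        return None

    # Steps 5-6: multiply the NFOV centroid (Ncx, Ncy), taken here as the
    # image centre, by [A] to obtain the WFOV (X, Y) coordinates.
    h, w = nfov_img.shape[:2]
    centroid = np.float32([[[w / 2.0, h / 2.0]]])
    x, y = cv2.perspectiveTransform(centroid, A)[0, 0]
    return float(x), float(y)
```

A caller would invoke this once per (P, T, Z) setting, for example with the SURF detector from the earlier sketch, recording each successful result in the mapping table described below.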
[038] Figure 1 is a schematic representation of the basic system required for the present invention. A single WFOV camera (1) and a single NFOV camera (2), both placed at an elevated, fixed position, are connected to a personal computer (6). The WFOV camera (1) is stationary while the NFOV camera (2) is allowed to pan, tilt and zoom. The WFOV camera (1) provides a more expansive view (4) of an object (3) being examined while the NFOV camera (2) provides a narrow, more detailed view (5) of the particular object (3) in both camera images. The attached personal computer (6) is used for storage of mapping data between the (P, T, Z) values of the NFOV camera (2) and the (X,
Y) coordinates of the WFOV camera (1).
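Concretely, the stored mapping amounts to a lookup table from (P, T, Z) to (X, Y). The following is a minimal sketch of how the attached personal computer might hold and persist it; the dictionary layout and CSV file name are assumptions, not a format disclosed in the patent:

```python
import csv

# Hypothetical calibration table: (pan, tilt, zoom) -> (X, Y) in the WFOV image.
mapping: dict[tuple[float, float, float], tuple[float, float]] = {}

def record(p: float, t: float, z: float, x: float, y: float) -> None:
    mapping[(p, t, z)] = (x, y)

def save(path: str = "nfov_wfov_mapping.csv") -> None:
    # Persist the tabulation so it survives restarts and can be reloaded.
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["pan", "tilt", "zoom", "wfov_x", "wfov_y"])
        for (p, t, z), (x, y) in sorted(mapping.items()):
            writer.writerow([p, t, z, x, y])
```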
[039] The method according to the present invention for automatic calibration is illustrated in the flowchart of Figure 2. In greater detail, and with regard to Figure 1, the preferred embodiment performs, for each set of (P, T, Z) values obtained from the NFOV camera (2), feature matching between the overlapping NFOV image (5) and WFOV image (4). Each matched feature point corresponds to a point on the NFOV image (5) and a point on the WFOV image (4), and has a corresponding accuracy value, which can be determined using the RANSAC algorithm, for example.
[040] If any outliers are found during the feature matching process, using the RANSAC algorithm for example, they should be deselected and removed. Outliers represent incorrectly matched feature points and can lead to errors in the calibration process; removing them helps improve the accuracy of the tabulations. Once the outliers are removed, the matched feature points are used to formulate a transformation matrix [A].
[041] The matched feature points are then sorted according to their corresponding accuracy value and the most accurate feature points, preferably five such points, are selected. Only those with an accuracy value above a prescribed threshold, for example a threshold value of 0.6, are retained. This threshold value is preferably selected in dependence upon the particular surveillance application of the NFOV and WFOV cameras.
[042] The centroid position (Ncx, Ncy) of the NFOV image (5) is then determined. This centroid position is multiplied by the transformation matrix [A] to obtain the transformed NFOV centroid position (N'cx, N'cy), thus:

    [x']   [a11  a12  a13  a14] [x]
    [y'] = [a21  a22  a23  a24] [y]
    [z']   [a31  a32  a33  a34] [z]
                                [1]
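As a concrete illustration (assuming, as in the earlier sketch, that [A] is estimated as a 3x3 planar homography rather than the 3x4 form reproduced above), the multiplication and the perspective divide look like this; the matrix entries and centroid values are made up for the example:

```python
import numpy as np

# Assumed 3x3 homography [A] from the NFOV image to the WFOV image
# (identity plus a translation here, purely for illustration).
A = np.array([[1.0, 0.0, 250.0],
              [0.0, 1.0, 120.0],
              [0.0, 0.0, 1.0]])

ncx, ncy = 320.0, 240.0                  # NFOV centroid (Ncx, Ncy)
x, y, w = A @ np.array([ncx, ncy, 1.0])  # homogeneous multiplication by [A]
wfov_x, wfov_y = x / w, y / w            # perspective divide -> (X, Y)
print(wfov_x, wfov_y)                    # 570.0 360.0
```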
[043] This transformed centroid position, expressed in the coordinates of the WFOV image (4), represents the (X, Y) coordinates of the WFOV image for the set of PTZ values that were used. The steps may be repeated for another set of P, T, Z values of the NFOV camera (2) to map to further (X, Y) coordinates of the WFOV image (4).
[044] In a particular preferred embodiment of the present invention, the corresponding accuracy value of the matched feature points must be greater than 0.6 to qualify. In this embodiment, if the accuracy values are below this threshold value, then the process should be started over again with a new set of P, T, Z values for a different feature point.
[045] The automatic calibration method in the present invention is not limited to a single NFOV camera and a single WFOV camera system. The invention may be applied to multiple camera systems as well. Figure 3 illustrates a camera system having a single NFOV camera (2) paired with two WFOV cameras (1, 7). All three cameras are placed at elevated, fixed positions. The NFOV camera (2) is allowed to pan, tilt and zoom while both WFOV cameras (1, 7) are stationary. Since the NFOV image 1 (5) and the WFOV image 1 (4) overlap, calibration between the P, T, Z values of the NFOV camera, with regard to NFOV image 1 (5), and the (X, Y) coordinates of WFOV image 1 (4) can be conducted using the method described above. Similarly, calibration between the P, T, Z values of the NFOV camera, with regard to NFOV image 2 (9), and the (X, Y) coordinates of the WFOV image 2 (8) can be conducted as well. Once again, in this system, the NFOV camera (2) and both WFOV cameras (1, 7) are connected to a personal computer (6) for storage of mapping tabulations between the P, T, Z values of the NFOV camera and the (X, Y) coordinates of the WFOV images. This camera system is advantageous because it allows for surveillance and tracking of the object between the areas covered by the WFOV cameras while also providing detailed views of an object over a wide area of coverage.
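In such a multi-WFOV configuration, the stored tabulation simply gains a camera index. A hypothetical extension of the earlier mapping table, keyed by WFOV camera as well as by (P, T, Z); the sample values are invented for illustration:

```python
# Hypothetical per-camera calibration table for one PTZ NFOV camera and
# several stationary WFOV cameras: each entry records which WFOV image,
# and where within it, a given (P, T, Z) setting points at.
mapping: dict[tuple[int, float, float, float], tuple[float, float]] = {}

def record(wfov_id: int, p: float, t: float, z: float,
           x: float, y: float) -> None:
    mapping[(wfov_id, p, t, z)] = (x, y)

record(1, 30.0, -10.0, 2.0, 512.3, 388.9)  # setting overlaps WFOV image 1
record(2, 85.0,  -8.0, 2.0, 140.7, 402.2)  # setting overlaps WFOV image 2
```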
[046] The camera system may also include multiple NFOV cameras (2, 10) paired with a single WFOV camera (1) as shown in Figures 4 and 5. All three cameras are placed at elevated, fixed positions. Both NFOV cameras (2, 10) are allowed to pan, tilt and zoom while the WFOV camera (1) is held stationary.
In this system, NFOV images 1 (5) and 2 (9) overlap with WFOV image 1 (4).
This once again allows the aforementioned calibration method to be used to calibrate between the P, T, Z values of the NFOV camera 1 (2) or NFOV camera 2 (10) and the (X, Y) coordinates of the WFOV image (4). Since the
NFOV cameras (2, 10) are allowed to pan, tilt and zoom, it is possible to focus both cameras on the same object (3) and obtain multiple views of the object (3), as shown in Figure 5. The NFOV cameras (2, 10) could also focus in on different areas of the WFOV image (4) as shown in Figure 4 to allow tracking of multiple objects over a wide area. Once again, the WFOV camera (1) and both
NFOV cameras (2, 10) are connected to a personal computer (6) for storage of mapping tabulations between the P, T, Z values of the NFOV cameras and the (X, Y) coordinates of the WFOV image.
[047] The camera system may also include multiple NFOV cameras (2, 10, 12) paired with multiple WFOV cameras (1, 7, 11) as shown in Figure 6. In this system, the (X, Y) coordinates of the WFOV camera 1 image (4) can be directly mapped to the (X, Y) coordinates of the WFOV camera 2 image (8) by using the transformation matrix [A] calculated by way of feature matching in accordance with the automatic calibration method described above. In a similar fashion, the transformation matrix [A] may be used to map the (X, Y) coordinates of the WFOV camera 2 image (8) to the (X, Y) coordinates of the WFOV camera 3 image (13). A diagram of the described mapping relationship between the different cameras of this camera system is illustrated in Figure 7.
As shown in this figure, the P, T and Z values of the NFOV camera 1 (2) can still be mapped to the (X, Y) coordinates of the WFOV camera 1 image (4).
Similarly, the P, T and Z values of the NFOV camera 2 (10) can still be mapped to the (X, Y) coordinates of the WFOV camera 2 image (8) and the P, T, Z values of the NFOV camera 3 (12) can be mapped to the (X, Y) coordinates of the WFOV camera 3 image (13). The mapping between the respective NFOV and WFOV images is possible because of the overlap between the NFOV and
WFOV images.
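Because these per-pair transformations compose, a direct WFOV-1-to-WFOV-3 mapping can be obtained by multiplying the chained matrices. A hypothetical sketch, again using 3x3 homographies as the matrix form and invented entries:

```python
import numpy as np

# Hypothetical homographies estimated by feature matching, as above:
# A_12 maps WFOV image 1 -> WFOV image 2, A_23 maps image 2 -> image 3.
A_12 = np.array([[1.0, 0.0, 80.0], [0.0, 1.0, -30.0], [0.0, 0.0, 1.0]])
A_23 = np.array([[1.0, 0.0, 60.0], [0.0, 1.0,  15.0], [0.0, 0.0, 1.0]])

# Composing the two gives a direct image-1 -> image-3 mapping.
A_13 = A_23 @ A_12

def map_point(A, x, y):
    """Apply homography A to the point (x, y) with a perspective divide."""
    xh, yh, w = A @ np.array([x, y, 1.0])
    return xh / w, yh / w

print(map_point(A_13, 100.0, 200.0))  # (240.0, 185.0)
```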
[048] The angle between WFOV cameras is dependent upon the level of confidence achieved during feature matching. The more acute the angle between the respective WFOV cameras, the greater the level of accuracy will be during mapping between (X, Y) coordinates of WFOV images.
[049] For application of this method to all the various camera systems described above, it is assumed that the environment is neither featureless nor dominated by repetitive patterns.
[050] There has thus been shown and described a novel automatic calibration method and apparatus for NFOV and WFOV cameras which fulfills all the objects and advantages sought therefor. Many changes, modifications, variations and other uses and applications of the subject invention will, however, become apparent to those skilled in the art after considering this specification and the accompanying drawings which disclose the preferred embodiments thereof. All such changes, modifications, variations and other uses and applications which do not depart from the spirit and scope of the invention are deemed to be covered by the invention, which is to be limited only by the claims which follow.
Claims (13)
1. A method for automatically calibrating at least one movable narrow field-of-view (NFOV) camera with respect to at least one stationary wide field-of-view (WFOV) camera, wherein said cameras are aimed to produce overlapping camera images which are supplied to a computer together with pan (P), tilt (T) and zoom (Z) values, respectively, of said at least one NFOV camera, said calibration method comprising the computer-implemented steps of:
(a) receiving a WFOV image produced by a WFOV camera;
(b) receiving an overlapping NFOV image produced by an NFOV camera having a selected set of P, T and Z values;
(c) identifying a feature point in said WFOV image;
(d) identifying a feature point in said NFOV image which may match the feature point in said WFOV image;
(e) forming a transformation matrix [A] for the matched feature point in the WFOV and NFOV images;
(f) determining an accuracy value of the match between said feature points in said WFOV and NFOV images;
(g) repeating steps (c) to (e) for a plurality of other matched feature points in said WFOV and NFOV images;
(h) selecting a plurality of matched feature points having a highest accuracy of match;
(i) determining a centroid position (Ncx, Ncy) of the NFOV image;
(j) multiplying the transformation matrix [A] by the centroid position (Ncx, Ncy) of the NFOV image to obtain a transformed centroid position (N'cx, N'cy) of the NFOV image;
whereby the transformed centroid position (N'cx, N'cy) of the NFOV image is used as the (X, Y) coordinates of the WFOV image for the selected set of P, T and Z values of the NFOV camera.
2. The method defined in claim 1, wherein steps (b) to (j) are repeated for another selected set of P, T and Z values of the NFOV camera.
3. The method defined in claim 1, further comprising the steps of (1) determining the combined matching accuracy value of the selected plurality of matched feature points, and (2) selecting another set of P, T and Z values if the combined accuracy value is less than a prescribed threshold.
4. The method defined in claim 3, wherein the prescribed threshold is selected in dependence upon the particular surveillance application of the NFOV and WFOV cameras.
5. The method defined in claim 1, wherein the selected number of matched feature points in step (g) is five.
6. The method defined in claim 1, further comprising the step of removing outliers during feature matching between the NFOV and WFOV images.
7. The method defined in claim 1, wherein said WFOV and NFOV cameras are placed at an elevated level above ground to view a scene.
8. The method defined in claim 1, wherein said computer is operative to store the mapping between the P, T and Z values of said at least one NFOV camera and the (X, Y) coordinates of the WFOV image of said at least one WFOV camera.
9. The method defined in claim 1, wherein said method is operative to calibrate a single movable NFOV camera with respect to a single stationary WFOV camera.
10. The method defined in claim 1, wherein said method is operative to calibrate a single movable NFOV camera with respect to a plurality of stationary WFOV cameras.
11. The method defined in claim 1, wherein said method is operative to calibrate a plurality of movable NFOV cameras with respect to a single stationary WFOV camera.
12. The method defined in claim 1, wherein said method is operative to calibrate a plurality of movable NFOV cameras with respect to a plurality of stationary WFOV cameras.
13. The method defined in claim 12, wherein each one of said NFOV cameras is calibrated with respect to a corresponding one of said WFOV cameras.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SG2011097888A SG191452A1 (en) | 2011-12-30 | 2011-12-30 | Automatic calibration method and apparatus |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SG2011097888A SG191452A1 (en) | 2011-12-30 | 2011-12-30 | Automatic calibration method and apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
SG191452A1 true SG191452A1 (en) | 2013-07-31 |
Family
ID=49301167
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
SG2011097888A SG191452A1 (en) | 2011-12-30 | 2011-12-30 | Automatic calibration method and apparatus |
Country Status (1)
Country | Link |
---|---|
SG (1) | SG191452A1 (en) |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI689870B (en) * | 2017-08-29 | 2020-04-01 | 瑞典商安訊士有限公司 | A method of calibrating a direction of a pan, tilt, zoom, camera with respect to a fixed camera, and a system in which such a calibration is carried out |
EP3451650A1 (en) | 2017-08-29 | 2019-03-06 | Axis AB | A method of calibrating a direction of a pan, tilt, zoom, camera with respect to a fixed camera, and a system in which such a calibration is carried out |
US10430972B2 (en) | 2017-08-29 | 2019-10-01 | Axis Ab | Method of calibrating a direction of a pan, tilt, zoom, camera with respect to a fixed camera, and a system in which such a calibration is carried out |
CN109427076A (en) * | 2017-08-29 | 2019-03-05 | 安讯士有限公司 | Method and system relative to fixed camera calibration Pan/Tilt/Zoom camera direction |
US11079285B2 (en) | 2018-05-04 | 2021-08-03 | Raytheon Technologies Corporation | Automated analysis of thermally-sensitive coating and method therefor |
US11268881B2 (en) | 2018-05-04 | 2022-03-08 | Raytheon Technologies Corporation | System and method for fan blade rotor disk and gear inspection |
US10685433B2 (en) | 2018-05-04 | 2020-06-16 | Raytheon Technologies Corporation | Nondestructive coating imperfection detection system and method therefor |
US10902664B2 (en) | 2018-05-04 | 2021-01-26 | Raytheon Technologies Corporation | System and method for detecting damage using two-dimensional imagery and three-dimensional model |
US10914191B2 (en) | 2018-05-04 | 2021-02-09 | Raytheon Technologies Corporation | System and method for in situ airfoil inspection |
US10928362B2 (en) | 2018-05-04 | 2021-02-23 | Raytheon Technologies Corporation | Nondestructive inspection using dual pulse-echo ultrasonics and method therefor |
US10943320B2 (en) | 2018-05-04 | 2021-03-09 | Raytheon Technologies Corporation | System and method for robotic inspection |
US10958843B2 (en) | 2018-05-04 | 2021-03-23 | Raytheon Technologies Corporation | Multi-camera system for simultaneous registration and zoomed imagery |
US10473593B1 (en) | 2018-05-04 | 2019-11-12 | United Technologies Corporation | System and method for damage detection by cast shadows |
US10488371B1 (en) | 2018-05-04 | 2019-11-26 | United Technologies Corporation | Nondestructive inspection using thermoacoustic imagery and method therefor |
US11880904B2 (en) | 2018-05-04 | 2024-01-23 | Rtx Corporation | System and method for robotic inspection |
WO2022072901A1 (en) * | 2020-10-02 | 2022-04-07 | Facebook Technologies, Llc | Multi-sensor camera systems, devices and methods for providing image pan, tilt and zoom functionality |
US20220230348A1 (en) * | 2021-01-19 | 2022-07-21 | The Boeing Company | Method and apparatus for determining a three-dimensional position and pose of a fiducial marker |
US12073582B2 (en) * | 2021-01-19 | 2024-08-27 | The Boeing Company | Method and apparatus for determining a three-dimensional position and pose of a fiducial marker |
US11941840B2 (en) | 2021-09-21 | 2024-03-26 | The Boeing Company | Method and apparatus for hand-off and tracking for pose estimation of a fiducial marker |
US12106517B2 (en) | 2021-09-21 | 2024-10-01 | The Boeing Company | Method and apparatus for modeling dynamic intrinsic parameters of a camera |
US20230292015A1 (en) * | 2022-03-08 | 2023-09-14 | Nec Corporation Of America | Image based localization |
US12069380B2 (en) | 2022-03-08 | 2024-08-20 | Nec Corporation Of America | Solar blind imaging |
US12067805B2 (en) | 2022-03-08 | 2024-08-20 | Nec Corporation Of America | Facial gesture recognition in SWIR images |
US12114084B2 (en) * | 2022-03-08 | 2024-10-08 | Nec Corporation Of America | Image based localization |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
SG191452A1 (en) | Automatic calibration method and apparatus | |
US11503275B2 (en) | Camera calibration system, target, and process | |
US10033924B2 (en) | Panoramic view imaging system | |
CN111750820B (en) | Image positioning method and system | |
US7479982B2 (en) | Device and method of measuring data for calibration, program for measuring data for calibration, program recording medium readable with computer, and image data processing device | |
US5801970A (en) | Model-based feature tracking system | |
US8755624B2 (en) | Image registration device and method thereof | |
CN103971375B (en) | A kind of panorama based on image mosaic stares camera space scaling method | |
JP2008538474A (en) | Automated monitoring system | |
US20210120194A1 (en) | Temperature measurement processing method and apparatus, and thermal imaging device | |
TW201921288A (en) | A method of calibrating a direction of a pan, tilt, zoom, camera with respect to a fixed camera, and a system in which such a calibration is carried out | |
CN108010086A (en) | Camera marking method, device and medium based on tennis court markings intersection point | |
US9679382B2 (en) | Georeferencing method and system | |
JP4270949B2 (en) | Calibration chart image display device, calibration device, and calibration method | |
Cvišić et al. | Recalibrating the KITTI dataset camera setup for improved odometry accuracy | |
CN101447073A (en) | Zoom lens calibration method | |
CN114022560A (en) | Calibration method and related device and equipment | |
CN112017238A (en) | Method and device for determining spatial position information of linear object | |
KR102138333B1 (en) | Apparatus and method for generating panorama image | |
CN111598956A (en) | Calibration method, device and system | |
CN114067000B (en) | Multi-target monitoring method and system based on panoramic camera and galvanometer camera | |
CN112711982B (en) | Visual detection method, device, system and storage device | |
Tehrani et al. | A practical method for fully automatic intrinsic camera calibration using directionally encoded light | |
CN114331834A (en) | Panoramic image splicing method in optical simulation training system | |
Krishnasamy et al. | High precision target tracking with a compound-eye image sensor |