CN108496354B - Camera calibration device and camera calibration method - Google Patents
- Publication number: CN108496354B (application CN201780007829.6A)
- Authority: CN (China)
- Legal status: Active
Classifications
- H04N23/60 — Control of cameras or camera modules comprising electronic image sensors
- B60R11/02 — Arrangements for holding or mounting radio sets, television sets, telephones, or the like in vehicles
- H04N17/00 — Diagnosis, testing or measuring for television systems or their details
- H04N7/18 — Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Abstract
The camera calibration device includes a feature point extraction unit, an area calculation unit, and an adjustment unit. The feature point extraction unit extracts a plurality of feature points from an image of a calibration marker captured by a camera. The area calculation unit calculates the area of a figure defined by the extracted feature points. The adjustment unit adjusts a camera parameter for correcting the camera based on the coordinates of the extracted feature points and the calculated area of the figure.
Description
Technical Field
The present disclosure relates to a camera calibration device and a camera calibration method for performing calibration of a camera.
Background
A known technique captures the surroundings of a vehicle with a vehicle-mounted camera and uses the captured image for driving assistance. In such a technique, the captured image deviates from its design value because of errors in the camera's mounting position, manufacturing errors in the camera itself, and the like. The mounting-position error therefore has to be corrected in advance, for example at the factory. In this calibration, a calibration marker is photographed by the camera, feature points are extracted from the image of the marker, and a camera parameter for correcting the camera is adjusted based on the coordinates of the extracted feature points (see, for example, Patent Document 1). The larger the proportion of the captured image occupied by the marker, and the more uniformly and densely the feature points are distributed, the higher the correction accuracy that can be achieved.
Patent Document 1: Japanese Patent Laid-Open Publication No. 2011-155687
Disclosure of Invention
A camera calibration device according to an aspect of the present disclosure includes a feature point extraction unit, an area calculation unit, and an adjustment unit. A feature point extraction unit extracts a plurality of feature points from an image of a calibration marker captured by a camera. The area calculation unit calculates the area of the graph defined by the feature points extracted by the feature point extraction unit. The adjusting unit adjusts a camera parameter for correcting the camera based on the coordinates of the feature points extracted by the feature point extracting unit and the area of the figure calculated by the area calculating unit.
Another aspect of the present disclosure is a camera calibration method. The method includes: extracting a plurality of feature points from an image of a calibration marker captured by a camera; calculating the area of a figure defined by the extracted feature points; and adjusting a camera parameter for correcting the camera based on the coordinates of the extracted feature points and the calculated area of the figure.
Advantageous Effects of Invention
According to the present disclosure, the correction accuracy of the camera can be improved.
Drawings
Fig. 1A is a diagram showing a positional relationship between a vehicle and a correction mark at the time of correction according to an embodiment.
Fig. 1B is a plan view showing an example of the correction mark of fig. 1A.
Fig. 2 is a block diagram showing a schematic configuration of the camera calibration device of fig. 1A.
Fig. 3A is a diagram showing an image of a correction mark in an ideal case where there is no error in the attachment position of the camera.
Fig. 3B is a diagram showing an image of a correction mark when there is a yaw-angle error in the mounting position of the camera.
Fig. 4A is a diagram showing the feature points of fig. 3A and 3B superimposed.
Fig. 4B is a diagram showing the feature points of fig. 3A and 3B in a state of being superimposed on a graphic.
Fig. 5 is a flowchart showing the processing of the camera calibration device of fig. 1A.
Detailed Description
Before describing the embodiments of the present disclosure, a problem with conventional camera calibration devices will be briefly described. In camera calibration, a small calibration marker is desirable because of installation-space constraints. However, the smaller the marker, the fewer feature points can be extracted, and the more densely they concentrate in one part of the captured image. In that case, the movement of the feature points caused by a camera mounting error is barely affected by lens distortion and approaches a pure translation, so the correction accuracy drops.
The present disclosure has been made in view of such circumstances, and provides a technique capable of improving the correction accuracy of a camera.
Fig. 1A is a diagram showing the positional relationship between the vehicle C1 and the calibration markers M1 and M2 at the time of calibration according to the embodiment, and Fig. 1B is a plan view showing an example of the calibration markers M1 and M2 of Fig. 1A. Fig. 1A is a top view of the surroundings of the vehicle C1 while the camera 10 is being calibrated, for example at the vehicle's production plant. The vehicle C1 includes the camera 10 and the camera calibration device 20. The camera 10 is attached to the rear of the vehicle C1, for example on the rear door, and captures an image behind the vehicle. The camera 10 may be mounted on or off the vehicle's center axis. The camera calibration device 20 adjusts camera parameters for calibrating the camera 10.
The correction marks M1 and M2 are disposed at predetermined positions in the rear of the vehicle C1 within the imaging range of the camera 10 so as to be substantially perpendicular to the ground. The correction marks M1 and M2 are arranged substantially symmetrically on both sides of the center axis of the vehicle.
As shown in fig. 1B, each of the calibration marks M1 and M2 has a checkerboard pattern in which 16 squares are arranged in a matrix. The camera 10 photographs the checkerboard pattern.
Fig. 2 is a block diagram showing a schematic configuration of the camera calibration device 20 of fig. 1A. The camera calibration device 20 includes an image storage unit 22, a feature point extraction unit 24, an area calculation unit 26, and an adjustment unit 28.
The image storage unit 22 stores images of the calibration markers M1 and M2 captured by the camera 10.
The feature point extraction unit 24 extracts a plurality of feature points from the images of the calibration markers M1 and M2 stored in the image storage unit 22. The extraction method is not particularly limited; for example, the feature point extraction unit 24 may use pattern matching. In that case, it scans a template for each feature point across the image and extracts feature points at the image positions that match the template most closely.
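The pattern-matching step can be pictured with a small sketch. The following is only an illustration of normalized cross-correlation template matching, not the patent's actual implementation; `match_template` and the toy data are invented for this example.

```python
import numpy as np

def match_template(image, template):
    """Return the (row, col) offset and score of the best normalized
    cross-correlation match of `template` inside `image`."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            p = image[r:r + th, c:c + tw]
            p = p - p.mean()
            denom = np.sqrt((p * p).sum() * (t * t).sum())
            score = (p * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# A 2x2 corner-like template embedded at row 2, column 3 of a blank image.
template = np.array([[1.0, 0.0],
                     [0.0, 1.0]])
image = np.zeros((6, 6))
image[2:4, 3:5] = template
pos, score = match_template(image, template)  # pos → (2, 3), score → 1.0
```

A production extractor would typically refine the best integer position to subpixel accuracy, since calibration accuracy depends directly on feature-point localization.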
The area calculation unit 26 calculates the area of each figure defined by the feature points extracted by the feature point extraction unit 24. Each figure is a polygon whose vertices are three or more feature points.
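The patent does not say how the polygon area is computed; one standard choice is the shoelace formula, sketched here with an illustrative function name.

```python
def polygon_area(vertices):
    """Shoelace formula: area of a simple polygon whose vertices are
    given in order as (x, y) pairs."""
    n = len(vertices)
    twice_area = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]  # wrap around to close the polygon
        twice_area += x1 * y2 - x2 * y1
    return abs(twice_area) / 2.0

# Right triangle with legs 4 and 3:
area = polygon_area([(0, 0), (4, 0), (0, 3)])  # → 6.0
```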
The adjusting unit 28 adjusts the camera parameters based on the coordinates of the feature points extracted by the feature point extracting unit 24 and the area of the figure calculated by the area calculating unit 26.
The detailed functions of area calculating unit 26 and adjusting unit 28 will be described later.
The configuration of the camera calibration device 20 can be realized by a CPU, a memory, or another LSI of an arbitrary computer in terms of hardware, and can be realized by a program loaded in a memory in terms of software. Accordingly, those skilled in the art will appreciate that these functional blocks can be implemented in various forms using only hardware, or by a combination of hardware and software.
The adjustment of the camera parameters will be described with reference to fig. 3A and 3B and fig. 4A and 4B.
Fig. 3A is a diagram showing an image I1 of the correction marks M1 and M2 in the ideal case where there is no error in the mounting position of the camera 10, and Fig. 3B is a diagram showing an image I2 of the correction marks M1 and M2 when there is a yaw-angle error in the mounting position. The images I1 and I2 are captured by the camera 10. An example of adjusting the camera parameters in the case of Fig. 3B is described below.
Fig. 4A is a diagram showing the feature points P1C to P10C and P1 to P10 of Figs. 3A and 3B superimposed on each other, and Fig. 4B is a diagram showing the feature points P1C to P10C and P1 to P10 of Figs. 3A and 3B together with the figures F1C to F8C and F1 to F8. For clarity, the correction marks M1 and M2 are omitted from Figs. 4A and 4B.
As shown in fig. 3A and 3B, 10 feature points P1C to P10C exist in the image I1. The image I2 includes 10 feature points P1 to P10. The feature points P1 and P1C are located at the center of the correction mark M1. The feature points P2 and P2C are located at the centers of the four upper left squares of the correction marker M1, and the feature points P3 and P3C are located at the centers of the four upper right squares of the correction marker M1. The feature points P4 and P4C are located at the centers of the four squares on the lower left of the calibration marker M1, and the feature points P5 and P5C are located at the centers of the four squares on the lower right of the calibration marker M1. The relationships between the feature points P6 to P10, P6C to P10C and the correction mark M2 are also the same, and therefore, the description thereof is omitted.
The graph F1 is a triangle having the feature points P1 to P3 as vertices. The graph F2 is a triangle having the feature points P1, P4, and P5 as vertices. The graph F3 is a triangle having the feature points P1, P2, and P4 as vertices. The graph F4 is a triangle having the feature points P1, P3, and P5 as vertices.
The graph F5 is a triangle having the feature points P6 to P8 as vertices. The graph F6 is a triangle having the feature points P6, P9, and P10 as vertices. The graph F7 is a triangle having the feature points P6, P7, and P9 as vertices. The graph F8 is a triangle having the feature points P6, P8, and P10 as vertices.
The relationships between the graphs F1C to F8C and the feature points P1C to P10C are also the same, and therefore, the description thereof is omitted.
In this way, the area calculation unit 26 of Fig. 2 calculates the areas of the group of figures F1 and F2, which are arranged roughly along the first direction (vertical direction) y in the image of the correction mark M1, and the areas of the group of figures F3 and F4, which are arranged roughly along the second direction (horizontal direction) x. Likewise, it calculates the areas of the group of figures F5 and F6 arranged roughly along the first direction y and of the group of figures F7 and F8 arranged roughly along the second direction x in the image of the correction mark M2. The second direction x intersects the first direction y.
Since the correction marks M1 and M2 are placed at predetermined positions as described above, the coordinates of each feature point on the marks, expressed in a reference coordinate system (world coordinate system) based on the vehicle C1, are known. The coordinates of the feature points on the image can therefore be computed from these known reference-coordinate positions and the camera parameters by a known method. Here, the image coordinates computed with the initial camera parameters are assumed to equal the coordinates of the feature points P1C to P10C in the ideal case of Fig. 3A.
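As an illustration of how known world coordinates and camera parameters yield image coordinates, here is a minimal pinhole-projection sketch. The patent only refers to "a known method"; the model, parameter names, and values below are all assumptions.

```python
import numpy as np

def project_points(world_pts, yaw, fx, fy, cx, cy, t):
    """Project world-coordinate points into the image with a pinhole model:
    rotate by `yaw` about the vertical axis, translate by camera position
    `t`, then apply the perspective divide and intrinsics (fx, fy, cx, cy)."""
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, 0.0, s],
                  [0.0, 1.0, 0.0],
                  [-s, 0.0, c]])
    pts = []
    for X in world_pts:
        Xc = R @ (np.asarray(X, dtype=float) - t)  # world -> camera frame
        u = fx * Xc[0] / Xc[2] + cx
        v = fy * Xc[1] / Xc[2] + cy
        pts.append((u, v))
    return pts

# With zero yaw, a point straight ahead on the optical axis lands on the
# principal point (cx, cy).
pts = project_points([(0.0, 0.0, 2.0)], yaw=0.0,
                     fx=500.0, fy=500.0, cx=320.0, cy=240.0,
                     t=np.zeros(3))  # → [(320.0, 240.0)]
```

A real calibration model would also include pitch, roll, and lens-distortion terms; they are omitted here to keep the sketch short.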
If the camera parameters were at their optimum values, the computed coordinates of the feature points P1C to P10C would equal the coordinates of the corresponding feature points P1 to P10 extracted from the captured image. The coordinate differences between the feature points P1C to P10C and the corresponding feature points P1 to P10 grow as the camera parameters deviate further from the optimum. The adjustment unit 28 of Fig. 2 therefore repeatedly adjusts the camera parameters so that an evaluation value becomes smaller, driving the parameters toward the optimum. The evaluation value is the sum of a first evaluation value and a second evaluation value.
The first evaluation value is the sum of the distances L1 to L10 shown in Fig. 4A. The distance L[i] (i = 1 to 10) is the distance between the computed coordinates of the feature point P[i]C and the coordinates of the feature point P[i] extracted from the captured image. Note that the distances L1 and L6 are not illustrated in Fig. 4A.
The second evaluation value is the sum of the area differences S1 to S8. The area difference S[j] (j = 1 to 8) is the difference between the area of the figure F[j]C, computed from the calculated feature points P1C to P10C, and the area of the figure F[j], computed from the feature points P1 to P10 extracted from the image.
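Combining the two evaluation values gives a single objective to minimize. The following is a hypothetical sketch; the function names and toy data are illustrative, not from the patent.

```python
import math

def triangle_area(a, b, c):
    """Area of the triangle with vertices a, b, c (cross-product formula)."""
    return abs((b[0] - a[0]) * (c[1] - a[1])
               - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def evaluation_value(ideal_pts, extracted_pts, triangles):
    """First evaluation value: sum of distances between corresponding ideal
    and extracted points. Second: sum of area differences over the listed
    triangles (index triples). Returns their sum."""
    first = sum(math.dist(p, q) for p, q in zip(ideal_pts, extracted_pts))
    second = sum(abs(triangle_area(*(ideal_pts[i] for i in tri))
                     - triangle_area(*(extracted_pts[i] for i in tri)))
                 for tri in triangles)
    return first + second

# One triangle; the second extracted point is shifted 1 unit to the right.
ideal = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)]
extracted = [(0.0, 0.0), (3.0, 0.0), (0.0, 2.0)]
ev = evaluation_value(ideal, extracted, triangles=[(0, 1, 2)])
# distance sum = 1.0, area difference = |2.0 - 3.0| = 1.0 → ev = 2.0
```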
As a method of adjusting the camera parameters so as to reduce the evaluation value, a known method such as a steepest descent method can be used. For example, when the evaluation value is substantially constant or when the evaluation value is smaller than a predetermined threshold value, adjustment unit 28 in fig. 2 determines that the evaluation value converges, and ends the adjustment of the camera parameters.
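A finite-difference steepest-descent loop of the kind referred to here might look like the following sketch. The patent gives no details, so the step size, convergence test, and gradient computation are assumptions, demonstrated on a toy objective rather than real camera parameters.

```python
def steepest_descent(f, params, lr=0.1, eps=1e-6, tol=1e-9, max_iter=1000):
    """Minimize f over a list of parameters with forward-difference
    gradients, stopping when the evaluation value stops changing."""
    params = list(params)
    prev = f(params)
    cur = prev
    for _ in range(max_iter):
        grad = []
        for i in range(len(params)):
            bumped = params.copy()
            bumped[i] += eps
            grad.append((f(bumped) - prev) / eps)  # forward difference
        params = [p - lr * g for p, g in zip(params, grad)]
        cur = f(params)
        if abs(prev - cur) < tol:  # treated as convergence, as in S4 of Fig. 5
            break
        prev = cur
    return params, cur

# Toy objective standing in for the evaluation value: minimum at (1, -2).
params, val = steepest_descent(lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2,
                               [0.0, 0.0])
```

In practice an analytic Jacobian or a Gauss-Newton-type solver would converge faster, but the stopping rule (evaluation value nearly constant or below a threshold) matches the text.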
For example, the area difference S3 between the figures F3C and F3 in Fig. 4B is numerically larger than the distance L2 between the computed feature point P2C and the extracted feature point P2 in Fig. 4A. Compared with using the feature points alone, changes between iterations are therefore captured with greater sensitivity, which improves the correction accuracy.
In addition, the area difference S1 between the figures F1C and F1 is smaller than the area difference S3 between the figures F3C and F3, and the area difference S2 between the figures F2C and F2 is smaller than the area difference S4 between the figures F4C and F4. That is, a yaw-angle error in the mounting position of the camera 10 changes the areas of the figures F1 to F8 with different tendencies. By adjusting the camera parameters so that the total of the area differences S1 to S8 becomes smaller, a yaw-angle error in the mounting position of the camera 10 can therefore be corrected with higher accuracy.
Similarly, when the mounting position of the camera 10 has a pitch-angle or roll-angle error, the areas of the figures F1 to F8 change with different tendencies, so such errors can also be corrected with higher accuracy.
Fig. 5 is a flowchart showing the processing of the camera calibration device 20 of Fig. 1A. First, the feature point extraction unit 24 extracts the feature points P1 to P10 from the images of the correction marks M1 and M2 captured by the camera 10 (S1). Next, the area calculation unit 26 calculates the areas of the figures F1 to F8 defined by the extracted feature points P1 to P10 (S2). The adjustment unit 28 then adjusts the camera parameters based on the coordinates of the extracted feature points P1 to P10 and the calculated areas of the figures F1 to F8 (S3). The adjustment unit 28 determines whether the evaluation value has converged (S4); if not (No in S4), processing returns to S3. When the evaluation value has converged (Yes in S4), the adjustment unit 28 ends the processing.
The adjusted camera parameters are stored in a storage unit, not shown, in the vehicle C1. Then, an image processing device, not shown, in the vehicle C1 corrects the image captured by the camera 10 using the stored camera parameters, and generates an image in which distortion due to an error in the attachment position of the camera 10 or the like is corrected. The corrected image is used for the driver to confirm the rear of the vehicle C1, and the like.
As described above, according to the present embodiment, more feature information is used to adjust the camera parameters than when only the coordinates of the feature points are used. Moreover, the areas respond to a mounting-position error of the camera 10 differently than the feature-point coordinates do, and the change in area caused by such an error is larger than the corresponding change in the feature-point coordinates. The correction accuracy can therefore be improved compared with using the feature points alone.
Therefore, a smaller calibration mark can be used without degrading the calibration accuracy as compared with the case of using only the feature points.
In addition, due to a deviation of at least one of the pitch angle, yaw angle, and roll angle of the camera 10, the areas of the group of patterns F1, F2 aligned in the first direction y and the areas of the group of patterns F3, F4 aligned in the second direction x respectively change in different tendencies. Similarly, the areas of the group of patterns F5 and F6 and the areas of the group of patterns F7 and F8 respectively change in different trends. Therefore, the influence of errors in the pitch angle, yaw angle, and roll angle of the mounting position of the camera 10 can be corrected with higher accuracy.
The present disclosure has been described above based on the embodiments. The present embodiment is an example, and those skilled in the art will understand that various modifications can be made to the combination of the components or the processing steps of the embodiments, and that such modifications are also within the scope of the present disclosure.
For example, a single correction mark may be used. In this case, although the correction accuracy is lower than in the above-described embodiment, the installation space for the correction mark can be reduced.
In addition, three or more calibration marks may be used. In this case, although the correction accuracy is improved as compared with the above-described embodiment, the installation space of the correction mark increases.
In addition, the specific pattern of the correction mark is not particularly limited, as long as three or more feature points defining at least one figure can be extracted from a single correction mark. When the camera parameters are adjusted based on the area of only one figure, the correction accuracy is lower than in the above-described embodiment, but the adjustment time of the camera parameters may be shortened.
The figure may be a polygon other than a triangle, or some other shape whose area can be calculated.
The first evaluation value may be any value based on the distances L1 to L10 shown in Fig. 4A, such as their average. Similarly, the second evaluation value may be any value based on the area differences S1 to S8, such as their average.
The position where the camera 10 is mounted may be the front or side of the vehicle C1. The camera 10 attached to a device other than the vehicle C1 may be corrected by the camera correction device 20.
One mode of the present disclosure is as follows.
[ item 1]
A camera calibration device includes: a feature point extraction unit that extracts a plurality of feature points from an image of a calibration marker captured by a camera; an area calculation unit that calculates the area of a figure defined by the feature points extracted by the feature point extraction unit; and an adjustment unit that adjusts a camera parameter for correcting the camera based on the coordinates of the feature points extracted by the feature point extraction unit and the area of the figure calculated by the area calculation unit.
According to this aspect, the area responds to an error in the camera's mounting position differently than the feature points do, and the change in area caused by the mounting error is larger than the change in the feature-point coordinates, so the correction accuracy can be improved.
[ item 2]
The camera calibration device according to item 1, wherein the area calculation unit calculates the areas of a group of the figures arranged in a first direction and the areas of a group of the figures arranged in a second direction intersecting the first direction.
In this case, since the area of the pattern group arranged in the first direction and the area of the pattern group arranged in the second direction change with different tendencies due to the deviation of at least one of the pitch angle, yaw angle, and roll angle of the camera, the influence of the error in the attachment position of the camera can be corrected with higher accuracy.
[ item 3]
A camera calibration method, comprising: extracting a plurality of feature points from an image of a correction mark captured by a camera; calculating an area of a figure defined by the extracted feature points; adjusting a camera parameter for correcting the camera based on the extracted coordinates of the feature points and the calculated area of the figure.
According to this aspect, the area responds to an error in the camera's mounting position differently than the feature points do, and the change in area caused by the mounting error is larger than the change in the feature-point coordinates, so the correction accuracy can be improved.
Industrial applicability
According to the camera calibration device and the camera calibration method of the present disclosure, since the calibration accuracy of the camera can be improved, the camera calibration device and the camera calibration method can be applied to calibration of a camera mounted on a mobile body such as an automobile.
Description of the reference numerals
10: a camera; 20: a camera calibration device; 24: a feature point extraction unit; 26: an area calculating unit; 28: an adjusting part.
Claims (6)
1. A camera calibration device includes:
a feature point extraction unit that extracts at least a first extracted feature point, a second extracted feature point, and a third extracted feature point from an image of a first calibration marker captured by a camera;
an area calculation unit that calculates a first area of a first graph defined by the first extracted feature point, the second extracted feature point, and the third extracted feature point, and calculates a second area of a second graph defined by a first ideal feature point calculated using an initial value camera parameter and corresponding to the first extracted feature point, a second ideal feature point calculated using an initial value camera parameter and corresponding to the second extracted feature point, and a third ideal feature point calculated using an initial value camera parameter and corresponding to the third extracted feature point; and
an adjustment unit that adjusts a camera parameter for correcting the camera so that a sum of two evaluation values, one of which is a value based on a sum of a first distance between the first ideal feature point and the first extracted feature point, a second distance between the second ideal feature point and the second extracted feature point, and a third distance between the third ideal feature point and the third extracted feature point becomes smaller, and the other of which is a value based on a first area difference that is a difference between the first area and the second area.
2. The camera calibration device according to claim 1,
the adjusting section adjusts a camera parameter for correcting the camera so that a sum of the first distance, the second distance, the third distance, and the first area difference becomes smaller.
3. The camera calibration device according to claim 1 or 2,
the feature point extraction unit further extracts a fourth extracted feature point, a fifth extracted feature point, and a sixth extracted feature point from the image of the second correction mark captured by the camera,
the area calculation unit further calculates a third area of a third graph defined by the fourth extracted feature point, the fifth extracted feature point, and the sixth extracted feature point, and calculates a fourth area of a fourth graph defined by a fourth ideal feature point calculated using the camera parameter of the initial value and corresponding to the fourth extracted feature point, a fifth ideal feature point calculated using the camera parameter of the initial value and corresponding to the fifth extracted feature point, and a sixth ideal feature point calculated using the camera parameter of the initial value and corresponding to the sixth extracted feature point,
the adjustment unit adjusts the camera parameter so that a sum of two evaluation values, one of which is a value based on a sum of the first distance, the second distance, the third distance, a fourth distance between the fourth ideal feature point and the fourth extracted feature point, a fifth distance between the fifth ideal feature point and the fifth extracted feature point, and a sixth distance between the sixth ideal feature point and the sixth extracted feature point, and the other of which is a value based on a sum of the first area difference and a second area difference that is a difference between the third area and the fourth area, becomes smaller.
4. A camera calibration method, comprising:
extracting at least a first extracted feature point, a second extracted feature point, and a third extracted feature point from an image of a first calibration marker photographed by a camera;
calculating a first area of a first graph defined by the first extracted feature point, the second extracted feature point, and the third extracted feature point, and calculating a second area of a second graph defined by a first ideal feature point calculated using a camera parameter of an initial value and corresponding to the first extracted feature point, a second ideal feature point calculated using a camera parameter of an initial value and corresponding to the second extracted feature point, and a third ideal feature point calculated using a camera parameter of an initial value and corresponding to the third extracted feature point; and
a camera parameter for correcting the camera is adjusted so that a sum of two evaluation values, one of which is a value based on a sum of a first distance between the first ideal feature point and the first extracted feature point, a second distance between the second ideal feature point and the second extracted feature point, and a third distance between the third ideal feature point and the third extracted feature point becomes small, and the other of which is a value based on a first area difference that is a difference between the first area and the second area.
5. The camera calibration method according to claim 4,
in adjusting the camera parameter, a camera parameter for correcting the camera is adjusted so that a sum of the first distance, the second distance, the third distance, and the first area difference becomes small.
6. The camera calibration method according to claim 4 or 5, further comprising,
extracting a fourth extracted feature point, a fifth extracted feature point, and a sixth extracted feature point from an image of a second correction marker captured by the camera,
calculating a third area of a third figure defined by the fourth extracted feature point, the fifth extracted feature point, and the sixth extracted feature point, and calculating a fourth area of a fourth figure defined by a fourth ideal feature point corresponding to the fourth extracted feature point, a fifth ideal feature point corresponding to the fifth extracted feature point, and a sixth ideal feature point corresponding to the sixth extracted feature point, each calculated using the camera parameter of the initial value, and
adjusting the camera parameter so that a sum of two evaluation values becomes smaller, one of which is a value based on a sum of the first distance, the second distance, the third distance, a fourth distance between the fourth ideal feature point and the fourth extracted feature point, a fifth distance between the fifth ideal feature point and the fifth extracted feature point, and a sixth distance between the sixth ideal feature point and the sixth extracted feature point, and the other of which is a value based on a sum of the first area difference and a second area difference that is a difference between the third area and the fourth area.
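The adjustment step of claims 4 through 6 amounts to minimizing a cost that combines point-to-point reprojection distances with a triangle-area difference. A minimal Python sketch under stated assumptions: `project` is a hypothetical stand-in for the real camera model (the "camera parameter" here is just a 2-D image offset, not actual intrinsic/extrinsic parameters), and the optimizer is a plain steepest-descent loop with a numerical gradient (the description mentions the steepest descent method); none of this code appears in the patent itself.

```python
def triangle_area(p1, p2, p3):
    """Shoelace formula: area of the triangle (the claims' 'figure')
    spanned by three feature points."""
    return 0.5 * abs((p2[0] - p1[0]) * (p3[1] - p1[1])
                     - (p3[0] - p1[0]) * (p2[1] - p1[1]))

def evaluation_value(ideal, extracted):
    """Combined cost of claim 4: sum of ideal-to-extracted distances
    plus the difference between the two triangle areas."""
    dist_sum = sum(((i[0] - e[0]) ** 2 + (i[1] - e[1]) ** 2) ** 0.5
                   for i, e in zip(ideal, extracted))
    area_diff = abs(triangle_area(*ideal) - triangle_area(*extracted))
    return dist_sum + area_diff

def project(offset, marker_points):
    """Hypothetical camera model: maps known marker points to ideal image
    points; the parameter is a 2-D offset standing in for the real model."""
    return [(x + offset[0], y + offset[1]) for x, y in marker_points]

# Known calibration-marker geometry and the feature points extracted from
# the image (simulated here as if the camera were offset by (0.3, -0.2)).
marker = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
extracted = [(0.3, -0.2), (1.3, -0.2), (0.3, 0.8)]

# Steepest descent with a central-difference gradient and a decaying step:
# adjust the parameter so the sum of evaluation values becomes smaller.
params, eps = [0.0, 0.0], 1e-6
for i in range(200):
    lr = 0.05 * 0.98 ** i
    grad = []
    for k in range(2):
        hi, lo = list(params), list(params)
        hi[k] += eps
        lo[k] -= eps
        grad.append((evaluation_value(project(hi, marker), extracted)
                     - evaluation_value(project(lo, marker), extracted)) / (2 * eps))
    params = [p - lr * g for p, g in zip(params, grad)]
```

Note that with this toy model the area term is always zero, since a pure offset preserves area; it starts to matter once the parameter distorts the projection, which is where the claims' combined evaluation value differs from a distance-only cost.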
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016016382A JP6688989B2 (en) | 2016-01-29 | 2016-01-29 | Camera calibration device and camera calibration method |
JP2016-016382 | 2016-01-29 | ||
PCT/JP2017/001637 WO2017130823A1 (en) | 2016-01-29 | 2017-01-19 | Camera calibration device and camera calibration method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108496354A CN108496354A (en) | 2018-09-04 |
CN108496354B true CN108496354B (en) | 2020-12-08 |
Family
ID=59397764
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201780007829.6A Active CN108496354B (en) | 2016-01-29 | 2017-01-19 | Camera calibration device and camera calibration method |
Country Status (3)
Country | Link |
---|---|
JP (1) | JP6688989B2 (en) |
CN (1) | CN108496354B (en) |
WO (1) | WO2017130823A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10504244B2 (en) * | 2017-09-28 | 2019-12-10 | Baidu Usa Llc | Systems and methods to improve camera intrinsic parameter calibration |
ES2968764T3 (en) * | 2018-09-28 | 2024-05-13 | Nexion Spa | Calibrating a vehicle camera |
CN109272474B (en) * | 2018-11-21 | 2022-04-12 | 大陆汽车车身电子系统(芜湖)有限公司 | Method for determining correction parameters of imaging system and pre-correction method of imaging system |
CN110827357B (en) * | 2019-09-30 | 2024-03-29 | 深圳市安思疆科技有限公司 | Combined pattern calibration plate and structured light camera parameter calibration method |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08110206A (en) * | 1994-10-12 | 1996-04-30 | Ricoh Co Ltd | Method and apparatus for detecting position and posture |
JP2003078811A (en) * | 2001-09-04 | 2003-03-14 | Nippon Hoso Kyokai <Nhk> | Method for associating marker coordinate, method and system for acquiring camera parameter and calibration pattern |
CN101286235A (en) * | 2008-06-10 | 2008-10-15 | 北京航空航天大学 | Video camera calibration method based on flexible stereo target |
CN102342088A (en) * | 2009-03-31 | 2012-02-01 | 爱信精机株式会社 | Calibration Index For Use In Calibration Of Onboard Camera, Method Of Onboard Camera Calibration Using The Calibration Index And Program For Calibration Apparatus For Onboard Camera Using The Calibration Index |
CN103854271A (en) * | 2012-11-28 | 2014-06-11 | 华中科技大学 | Plane type camera calibration method |
CN104008548A (en) * | 2014-06-04 | 2014-08-27 | 无锡观智视觉科技有限公司 | Feature point extraction method for vehicle-mounted around view system camera parameter calibration |
CN104217429A (en) * | 2014-08-25 | 2014-12-17 | 太仓中科信息技术研究院 | Design and detection method of camera calibration board |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4751939B2 (en) * | 2009-03-31 | 2011-08-17 | アイシン精機株式会社 | Car camera calibration system |
2016
- 2016-01-29 JP JP2016016382A patent/JP6688989B2/en active Active

2017
- 2017-01-19 CN CN201780007829.6A patent/CN108496354B/en active Active
- 2017-01-19 WO PCT/JP2017/001637 patent/WO2017130823A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2017130823A1 (en) | 2017-08-03 |
JP2017135680A (en) | 2017-08-03 |
JP6688989B2 (en) | 2020-04-28 |
CN108496354A (en) | 2018-09-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108496354B (en) | Camera calibration device and camera calibration method | |
JP6348093B2 (en) | Image processing apparatus and method for detecting image of detection object from input data | |
CN111242031B (en) | Lane line detection method based on high-precision map | |
US10192309B2 (en) | Camera calibration device | |
EP3041228A1 (en) | Camera calibration device, camera calibration system, and camera calibration method | |
US10659762B2 (en) | Stereo camera | |
KR102647929B1 (en) | Apparatus and method for calibrating camera and lidar sensor of vehicle | |
US9361668B2 (en) | Method and apparatus for generating disparity map | |
CN109523585B (en) | Multisource remote sensing image feature matching method based on direction phase consistency | |
US20210181747A1 (en) | Robert climbing control method and robot | |
CN110084743B (en) | Image splicing and positioning method based on multi-flight-zone initial flight path constraint | |
CN112862895B (en) | Fisheye camera calibration method, device and system | |
EP3534333A1 (en) | Method for calibrating the position and orientation of a camera relative to a calibration pattern | |
KR102297683B1 (en) | Method and apparatus for calibrating a plurality of cameras | |
JP2022039895A (en) | Image correction method and system based on deep learning | |
WO2018074302A1 (en) | Vehicle-mounted camera calibration device and vehicle-mounted camera calibration method | |
CN114332237A (en) | Method for calculating conversion relation between camera coordinate system and laser coordinate system | |
CN113978512A (en) | Rail train positioning method and device | |
JP2020107938A (en) | Camera calibration device, camera calibration method, and program | |
KR102277828B1 (en) | Method and apparatus for calibratiing a plurality of cameras | |
CN113077478B (en) | Alignment method, compensation method and system of display panel and readable storage medium | |
CN109829950B (en) | Method and device for detecting calibration parameters of binocular camera and automatic driving system | |
JP6890293B2 (en) | Camera calibration device, camera calibration system, camera calibration method and program | |
US9703189B2 (en) | Method of calculating a shift vale of a cell contact | |
CN117422746B (en) | Partition nonlinear geographic registration method, device, equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right | ||
Effective date of registration: 20240401
Address after: Kanagawa Prefecture, Japan
Patentee after: Panasonic Automotive Electronic Systems Co.,Ltd.
Country or region after: Japan
Address before: Osaka, Japan
Patentee before: PANASONIC INTELLECTUAL PROPERTY MANAGEMENT Co.,Ltd.
Country or region before: Japan