CN112288825A - Camera calibration method and device, electronic equipment, storage medium and road side equipment - Google Patents
Camera calibration method and device, electronic equipment, storage medium and roadside equipment
- Publication number
- CN112288825A (application CN202011177580.3A)
- Authority
- CN
- China
- Prior art keywords
- image
- bullet camera
- fisheye camera
- equivalent
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G — PHYSICS · G06 — COMPUTING; CALCULATING OR COUNTING · G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL · G06T7/00 — Image analysis · G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G — PHYSICS · G06 — COMPUTING; CALCULATING OR COUNTING · G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL · G06T3/00 — Geometric image transformations in the plane of the image · G06T3/04 — Context-preserving transformations, e.g. by using an importance map · G06T3/047 — Fisheye or wide-angle transformations
Abstract
The application discloses a camera calibration method and device, electronic equipment, a storage medium and roadside equipment, relating to the fields of computer vision and intelligent transportation. The specific implementation scheme is as follows: performing a projective transformation on an image of marker points acquired by a fisheye camera to obtain a bullet-camera view image; calibrating extrinsic parameters of the equivalent bullet camera corresponding to the bullet-camera view image according to the real-world positioning information of the marker points and the bullet-camera view image; and obtaining extrinsic parameters of the fisheye camera from the extrinsic parameters of the equivalent bullet camera. According to the technical solution of the application, the number of manually placed marker points can be reduced and the calibration accuracy can be improved.
Description
Technical Field
The application relates to the technical field of computers, and in particular to the fields of computer vision and intelligent transportation.
Background
Applications of Internet of Vehicles (vehicle networking) technology rely on comprehensive perception of people, vehicles and objects on the road. A camera that captures road images is an important sensing device, and the image information it provides underpins a variety of traffic applications and information services. In order to recover three-dimensional information about the real world from a two-dimensional image, the camera needs to be calibrated. For a camera deployed on a road, the calibration process generally involves setting marker points on the road and deriving the camera parameters from image information of those marker points.
Disclosure of Invention
The application provides a camera calibration method and device, electronic equipment, a storage medium and roadside equipment.
According to an aspect of the present application, there is provided a camera calibration method, including:
performing a projective transformation on an image of marker points acquired by a fisheye camera to obtain a bullet-camera view image;
calibrating extrinsic parameters of an equivalent bullet camera corresponding to the bullet-camera view image according to real-world positioning information of the marker points and the bullet-camera view image; and
obtaining extrinsic parameters of the fisheye camera from the extrinsic parameters of the equivalent bullet camera.
According to another aspect of the present application, there is provided a camera calibration apparatus, including:
an image projection module configured to perform a projective transformation on an image of marker points acquired by a fisheye camera to obtain a bullet-camera view image;
an equivalent bullet camera calibration module configured to calibrate extrinsic parameters of an equivalent bullet camera corresponding to the bullet-camera view image according to real-world positioning information of the marker points and the bullet-camera view image; and
a fisheye camera calibration module configured to obtain extrinsic parameters of the fisheye camera from the extrinsic parameters of the equivalent bullet camera.
According to another aspect of the present application, there is provided an electronic device including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method of any of the embodiments of the present application.
According to another aspect of the present application, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any of the embodiments of the present application.
According to another aspect of the present application, there is provided a roadside apparatus including:
one or more processors; and
a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method provided by the embodiments of the present application.
According to the technical solution of the application, the calibration accuracy of the fisheye camera can be improved.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present application, nor do they limit the scope of the present application. Other features of the present application will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
FIG. 1 is a schematic diagram of a camera calibration method according to one embodiment of the present application;
fig. 2 is an example of an image of marker points acquired by a fisheye camera in an embodiment of the present application;
fig. 3 is an example of a bullet-camera view image in an embodiment of the present application;
FIG. 4 is a schematic diagram of a camera calibration method according to another embodiment of the present application;
FIG. 5 is a schematic diagram of a spherical projective transformation model in an embodiment of the present application;
FIG. 6 is a schematic diagram of a camera calibration apparatus according to an embodiment of the present application;
FIG. 7 is a schematic diagram of a camera calibration apparatus according to another embodiment of the present application;
fig. 8 is a block diagram of an electronic device for implementing a camera calibration method according to an embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of those embodiments to aid understanding, and these details are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications may be made to the embodiments described herein without departing from the scope and spirit of the present application. Likewise, descriptions of well-known functions and constructions are omitted below for clarity and conciseness.
In computer vision applications, in order to determine the relationship between the three-dimensional position of a point on the surface of an object in space and its corresponding point in an image, a geometric model of the camera's imaging process must be established; the parameters of this geometric model are the camera parameters. The process of determining the camera parameters is referred to as camera calibration. Camera parameters comprise intrinsic parameters and extrinsic parameters. The intrinsic parameters relate to the characteristics of the camera itself, for example the focal length and distortion coefficients; the extrinsic parameters represent the relative positional relationship between the camera coordinate system and the world coordinate system and depend on the position and orientation of the camera, for example rotation and translation parameters between the two coordinate systems.
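For illustration only, the standard pinhole-camera formulation (a generic relation, not a definition taken from this application) links the two kinds of parameters as:

```latex
s\begin{bmatrix}u\\ v\\ 1\end{bmatrix}
= K\,[R\mid t]\,
\begin{bmatrix}X_w\\ Y_w\\ Z_w\\ 1\end{bmatrix},
\qquad
K=\begin{bmatrix}f_x & 0 & c_x\\ 0 & f_y & c_y\\ 0 & 0 & 1\end{bmatrix}
```

where (u, v) is the image pixel of the world point (X_w, Y_w, Z_w), K collects the intrinsic parameters (focal lengths f_x, f_y and principal point (c_x, c_y)), and the rotation R and translation t are the extrinsic parameters that map world coordinates into camera coordinates.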
In the field of intelligent transportation, new infrastructure such as street lamps, road monitoring cameras and traffic lights supporting V2X (Vehicle-to-Everything) can provide a vehicle with perceived information from beyond its field of view, such as positioning information of obstacles it cannot see. The road monitoring camera is one of the main sensors in an intelligent traffic perception system, and the accuracy of its extrinsic parameters plays a crucial role in the precision and robustness of the perception system. Road monitoring cameras may include one or more of fisheye cameras, bullet cameras (fixed-direction cameras, also called box cameras), dome cameras, and the like.
The camera calibration method provided by the present application can be used for extrinsic calibration of fisheye cameras in an intelligent traffic perception system. Illustratively, the method may be executed by various roadside devices, such as a roadside sensing device with computing capability, a roadside computing device connected to a roadside sensing device, a server device connected to the roadside computing device, or a server device directly connected to the roadside sensing device.
In one system architecture for intelligent transportation vehicle-road cooperation, the roadside equipment comprises a roadside sensing device and a roadside computing device. The roadside sensing device (for example, a roadside camera for acquiring traffic-light images) is connected to the roadside computing device (for example, a Road Side Computing Unit (RSCU)), the roadside computing device is connected to a server device, and the server device can communicate with autonomous or assisted-driving vehicles in various ways. In another system architecture, the roadside sensing device itself includes computing capability and is directly connected to the server device. The above connections may be wired or wireless. The server device in this application is, for example, a cloud control platform, a vehicle-road cooperative management platform, a central subsystem, an edge computing platform, or a cloud computing platform.
Fig. 1 shows a schematic diagram of a camera calibration method according to an embodiment of the present application. As shown in fig. 1, the method includes:
step S11: performing a projective transformation on an image of marker points acquired by a fisheye camera to obtain a bullet-camera view image;
step S12: calibrating extrinsic parameters of an equivalent bullet camera corresponding to the bullet-camera view image according to the real-world positioning information of the marker points and the bullet-camera view image;
step S13: obtaining extrinsic parameters of the fisheye camera from the extrinsic parameters of the equivalent bullet camera.
For example, a fisheye camera may be installed near an intersection, for instance above the roads of each driving direction that meet at the intersection.
Fig. 2 is an example of an image of marker points acquired by a fisheye camera. As shown in fig. 2, the fisheye camera has a large viewing angle and can capture a long stretch of the several lanes beneath it, but barrel distortion is present; for example, the lane line 22 on which the marker point 21 lies is clearly bent in the image. By applying a projective transformation to the image acquired by the fisheye camera, the barrel distortion can be reduced or eliminated, yielding a bullet-camera view image, i.e. an image that could equivalently have been acquired by a bullet camera with a fixed orientation; that bullet camera is referred to as the equivalent bullet camera corresponding to the bullet-camera view image. Fig. 3 shows an example of a bullet-camera view image: although nearer objects appear larger and farther objects smaller, no obvious barrel distortion remains, and the lane line 22 on which the marker point 21 lies is comparatively straight.
In practical applications, a plurality of marker points can be set on the road surface, the fisheye camera captures an image of the road area containing the marker points, and RTK (Real-Time Kinematic) equipment is carried onto the road to measure the positioning information of the marker points in the real world, i.e. the real-world positioning information, which may comprise three-dimensional coordinates.
As an alternative, the coordinates of the marker points in the bullet-camera view image can be determined by identifying the marker points in the image, either automatically through image recognition or manually. Performing a PnP (Perspective-n-Point) computation with the coordinates of the marker points in the bullet-camera view image and their three-dimensional coordinates yields the extrinsic parameters of the equivalent bullet camera. Because the bullet-camera view image is obtained by a projective transformation of the image acquired by the fisheye camera, the extrinsic parameters of the fisheye camera can then be computed from the extrinsic parameters of the equivalent bullet camera corresponding to that view image.
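As an illustrative sketch only (the application does not prescribe a particular solver; OpenCV's solvePnP is used here, and all variable and function names are assumptions), the PnP step for the equivalent bullet camera could look like:

```python
import cv2
import numpy as np

def calibrate_equivalent_camera(world_points, image_points, K_eq):
    """Estimate the extrinsic parameters of the equivalent bullet camera by PnP.

    world_points: (N, 3) array of real-world marker coordinates (e.g. surveyed with RTK).
    image_points: (N, 2) array of the same markers' pixel coordinates in the bullet-camera view image.
    K_eq:         (3, 3) intrinsic matrix of the equivalent bullet camera.
    Returns (R1, T1): rotation matrix and translation vector of the equivalent bullet camera.
    """
    world_points = np.asarray(world_points, dtype=np.float64)
    image_points = np.asarray(image_points, dtype=np.float64)
    # Distortion coefficients are taken as zero: the projective transformation is assumed
    # to have removed the fisheye distortion already.
    ok, rvec, tvec = cv2.solvePnP(world_points, image_points, K_eq, None)
    if not ok:
        raise RuntimeError("PnP solution failed")
    R1, _ = cv2.Rodrigues(rvec)  # convert the rotation vector into a rotation matrix
    return R1, tvec
```

The intrinsic matrix K_eq of the equivalent bullet camera is discussed in a later section on deriving it from the projective transformation parameters.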
Because the imaging model of a fisheye camera is complex, calibrating it directly requires approximating it with a pinhole camera model, which is computationally involved and error-prone. With the camera calibration method above, the image of the marker points acquired by the fisheye camera is transformed by projection into a bullet-camera view image, so that the extrinsic calibration of the fisheye camera is converted into the extrinsic calibration of the equivalent bullet camera; this reduces the complexity of solving for the fisheye camera's extrinsic parameters and gives higher accuracy. Applied to an intelligent traffic perception system, the method can improve perception precision and robustness. In addition, a fisheye camera deployed on a road is often used to cover the blind zone between the bullet cameras on that road; converting the fisheye camera's extrinsic calibration into the extrinsic calibration of an equivalent bullet camera makes it convenient to calibrate both the road's bullet cameras and the fisheye camera with the same marker points, which reduces the number of marker points and therefore the cost and road-safety risk of placing them manually.
For example, in step S11, performing the projective transformation on the image of the marker points acquired by the fisheye camera to obtain the bullet-camera view image may include:
performing a projective transformation on the image of the marker points acquired by the fisheye camera based on a spherical projective transformation model to obtain a bullet-camera view image having the same visual range as an image acquired by a target bullet camera.
The spherical projective transformation model represents the projective transformation relationship between a spherical image and the bullet-camera view image, where the spherical image is obtained by converting the image acquired by the fisheye camera. The process of obtaining the bullet-camera view image based on the spherical projective transformation model may be as shown in fig. 4: the image acquired by the fisheye camera is first projected onto a spherical projection surface using the fisheye lens design model, such as an equidistant projection model or an equisolid-angle projection model, to obtain the spherical image; the spherical image is then converted into the bullet-camera view image based on the spherical projective transformation model.
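A minimal sketch of the first stage is given below, assuming the equidistant projection model r = f·θ for the fisheye lens (an equisolid-angle model would only change the mapping between r and θ); the function name and variables are illustrative assumptions rather than part of the application:

```python
import numpy as np

def fisheye_pixel_to_sphere(u, v, f, cx, cy):
    """Back-project a fisheye pixel onto the unit sphere, assuming the equidistant model r = f * theta.

    (u, v): pixel coordinates; f: fisheye focal length in pixels; (cx, cy): principal point.
    Returns a unit vector (X, Y, Z) on the spherical projection surface centred at the projection centre O.
    """
    dx, dy = u - cx, v - cy
    r = np.hypot(dx, dy)            # radial distance from the principal point
    theta = r / f                   # equidistant model: angle from the optical axis
    phi = np.arctan2(dy, dx)        # azimuth around the optical axis
    # Spherical direction, taking the optical axis as the +Z axis of the fisheye camera frame.
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])
```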
Fig. 5 shows a schematic diagram of the spherical projective transformation model. O is the projection centre, the spherical surface 51 is the spherical projection surface of the image acquired by the fisheye camera, and points on the spherical surface 51 can be expressed in the XYZ spatial coordinate system. The plane 52 is the projection plane corresponding to the bullet-camera view image, equivalent to the imaging plane of the equivalent bullet camera, and points on the plane 52 can be expressed in a uv plane coordinate system. The spherical surface 51 and the plane 52 are tangent at the point D. Let OD' denote the projection of OD onto the XOY plane; θ1 is the angle between OD and the Z-axis direction, and θ2 is the counter-clockwise angle from the X axis to OD'. θ1 may be referred to as the pitch angle and θ2 as the yaw angle. The spherical projection surface 51 may be determined from the intrinsic parameters or lens design model of the fisheye camera, and the point D, and hence the plane 52, may be determined from the pitch angle θ1 and the yaw angle θ2. Once the plane 52 is determined, the point on the plane 52 corresponding to any point on the spherical projection surface 51 follows from the geometric relationship; for example, the coordinates of the point P projected onto the plane 52 can be determined from the coordinates of the point R on the spherical projection surface 51 in fig. 5, so that the spherical image can be converted into the equivalent bullet-camera image. In practical applications, the pitch angle θ1 and the yaw angle θ2 may be preset to obtain the corresponding spherical projective transformation model.
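The geometric relationship of fig. 5 can be sketched as follows, assuming a unit sphere centred at O and one particular choice of in-plane axes; this is an illustrative reading of the model, not the application's prescribed implementation:

```python
import numpy as np

def sphere_to_bullet_view(point_on_sphere, theta1, theta2):
    """Project a point on the spherical projection surface onto the plane tangent at D.

    theta1: pitch angle (angle between OD and the Z axis), in radians.
    theta2: yaw angle (counter-clockwise angle from the X axis to OD's projection on XOY), in radians.
    Returns the intersection of ray O->point with the tangent plane, expressed in the plane's
    (u, v) axes, or None if the point lies on the hemisphere facing away from the plane.
    """
    # Unit vector OD determined by the pitch and yaw angles.
    d = np.array([np.sin(theta1) * np.cos(theta2),
                  np.sin(theta1) * np.sin(theta2),
                  np.cos(theta1)])
    s = np.asarray(point_on_sphere, dtype=np.float64)
    denom = np.dot(s, d)
    if denom <= 1e-9:               # the ray from O never reaches the tangent plane
        return None
    p = s / denom                   # intersection of ray O->s with the plane {x : x . d = 1}
    # In-plane axes: u along increasing yaw, v completing a right-handed frame (one possible convention).
    u_axis = np.array([-np.sin(theta2), np.cos(theta2), 0.0])
    v_axis = np.cross(d, u_axis)
    return np.array([np.dot(p - d, u_axis), np.dot(p - d, v_axis)])
```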
For example, bullet cameras that monitor traffic at fixed angles, such as a forward-looking bullet camera facing the driving direction of the road and/or a rearward-looking bullet camera facing the opposite direction, may be installed beside the fisheye camera above the road. The fisheye camera may face downward toward the road surface to cover the blind zone between the forward-looking and rearward-looking bullet cameras. In an embodiment of the application, the target bullet camera comprises a forward-looking bullet camera and/or a rearward-looking bullet camera that monitors the same road as the fisheye camera.
For example, taking the position of the fisheye camera as the origin, suppose the visual range of the image acquired by the fisheye camera covers 50 meters in front of and behind the origin. The pitch angle θ1 and yaw angle θ2 can then be adjusted so that the visual range of the bullet-camera view image falls within those 50 meters in front of and behind the origin. Assuming the visual range of the image acquired by the forward-looking bullet camera is 10 to 70 meters in front of the origin, then, according to the above embodiment, the visual range of the bullet-camera view image may be adjusted to any region within 10 to 50 meters in front of the origin, so that the bullet-camera view image and the image acquired by the forward-looking bullet camera contain the same visual range. For example, if the visual range of the bullet-camera view image is 5 to 30 meters in front of the origin, or 20 to 50 meters in front of the origin, and a set of marker points is placed within the shared visual range, the forward-looking bullet camera and the fisheye camera can both be calibrated with that set of marker points. Furthermore, the size of the shared visual range within the bullet-camera view image can be adjusted so that the area around the marker points is enlarged in the bullet-camera view, which improves the calibration accuracy.
According to this embodiment, since the parameters of the spherical projective transformation model are adjustable, obtaining the bullet-camera view image from the spherical projective transformation model makes it convenient to adjust the visual range of the bullet-camera view image so that it matches the visual range of the image acquired by the target bullet camera, and also to adjust how large that shared visual range appears in the image. The fisheye camera and the target bullet camera can therefore be calibrated with the same marker points, which reduces the number of marker points, and the calibration accuracy can be improved.
As an exemplary embodiment, the camera calibration method may further include:
calibrating extrinsic parameters of the target bullet camera according to the real-world positioning information of the marker points and an image of the marker points acquired by the target bullet camera.
According to this embodiment, the same marker points are used both to calibrate the equivalent bullet camera (and thereby the fisheye camera) and to calibrate the target bullet camera, which reduces the number of marker points and thus the cost and safety risk of placing them manually.
As an exemplary embodiment, in step S11, performing the projective transformation on the image of the marker points acquired by the fisheye camera based on the spherical projective transformation model to obtain a bullet-camera view image having the same visual range as the image acquired by the target bullet camera includes:
traversing a rotation angle within a predetermined range;
determining a corresponding spherical projective transformation model based on the traversed rotation angle;
performing, based on that spherical projective transformation model, a projective transformation on the image of the marker points acquired by the fisheye camera to obtain a projection image corresponding to the spherical projective transformation model; and
determining the projection image corresponding to the spherical projective transformation model as the bullet-camera view image in the case where that projection image and the image acquired by the target bullet camera have the same visual range and the display region of the visual range in the projection image meets a predetermined condition.
Illustratively, the rotation angle may include a pitch angle and/or a yaw angle of the imaging plane of the bullet-camera view image relative to the spherical projection surface of the fisheye camera.
For example, the pitch angle may be fixed at a preset value, e.g. 45°, and the yaw angle may then be traversed in 5° steps within 0° to 90°, i.e. 0°, 5°, 10°, and so on. Each traversed angle determines a spherical projective transformation model and yields a corresponding projection image, and the projection image that meets the conditions is taken as the bullet-camera view image.
As another example, the pitch angle may be traversed in 5° steps within 0° to 90°; for each traversed pitch angle, e.g. 5°, the yaw angle is in turn traversed in 5° steps within 0° to 90°. Each traversed combination of pitch and yaw angle determines a spherical projective transformation model and yields a corresponding projection image, and the projection image that meets the conditions is taken as the bullet-camera view image.
Illustratively, the predetermined condition may include: the display region of the shared visual range in the projection image exceeds a predetermined area threshold, and/or that display region is centered in the projection image, and the like.
According to this embodiment, projection images are generated by traversing the rotation angles, and the projection image that meets the conditions is selected as the bullet-camera view image. By controlling the traversal granularity, the camera calibration method can ensure that a qualifying bullet-camera view image is found and that the camera calibration remains accurate.
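A hedged sketch of this traversal is given below; the 5° step, the 0° to 90° ranges and the two callables (which would wrap the projection step and the acceptance test described above) are illustrative assumptions rather than values fixed by this application:

```python
import numpy as np

def find_bullet_view(fisheye_image, project_fn, accept_fn,
                     pitch_step_deg=5.0, yaw_step_deg=5.0):
    """Traverse rotation angles and return the first projection image that meets the acceptance test.

    project_fn(image, theta1, theta2) -> projection image for the spherical projective
        transformation model at pitch theta1 and yaw theta2 (radians).
    accept_fn(projection) -> True when the projection shares the target bullet camera's visual
        range and the display region of that range meets the predetermined condition
        (e.g. large enough and roughly centred).
    Both callables stand in for implementation-specific steps.
    """
    for theta1 in np.deg2rad(np.arange(0.0, 90.0, pitch_step_deg)):
        for theta2 in np.deg2rad(np.arange(0.0, 90.0, yaw_step_deg)):
            proj = project_fn(fisheye_image, theta1, theta2)
            if accept_fn(proj):
                return proj, theta1, theta2
    return None  # no angle combination satisfied the conditions
```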
As an exemplary embodiment, in step S13, obtaining the extrinsic parameters of the fisheye camera from the extrinsic parameters of the equivalent bullet camera may include:
determining a rotation parameter between the equivalent bullet camera and the fisheye camera according to the projective transformation parameters between the image of the marker points acquired by the fisheye camera and the bullet-camera view image; and
obtaining the extrinsic parameters of the fisheye camera from the rotation parameter and the extrinsic parameters of the equivalent bullet camera.
Illustratively, the projective transformation parameters may include the pitch angle and yaw angle of the imaging plane of the bullet-camera view image relative to the spherical projection surface of the fisheye camera, the position of the tangent point between that imaging plane and the spherical projection surface, and the like. The rotation parameter may include a rotation matrix.
For example, the rotation matrix R between the fisheye camera and the equivalent bullet camera can be determined from the pitch angle and yaw angle of the plane in which the bullet-camera view image lies relative to the spherical projection surface of the fisheye camera. Suppose the extrinsic parameters of the equivalent bullet camera comprise a rotation matrix R1 and a translation vector T1. Then:
the rotation matrix of the fisheye camera is R2 = R · R1;
the translation vector of the fisheye camera is T2 = T1.
According to this embodiment, an accurate rotation parameter between the equivalent bullet camera and the fisheye camera can be obtained from the projective transformation parameters, so that accurate extrinsic parameters of the fisheye camera can be obtained from the extrinsic parameters of the equivalent bullet camera, improving the accuracy of the fisheye camera's extrinsic calibration.
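Following the relations stated above (R2 = R·R1 and T2 = T1), a brief sketch of the conversion is shown below; constructing R from the pitch and yaw angles by rotating the fisheye optical axis onto the OD direction of fig. 5 is an illustrative assumption, not a detail fixed by the text:

```python
import numpy as np

def fisheye_extrinsics(R1, T1, theta1, theta2):
    """Convert the equivalent bullet camera's extrinsics (R1, T1) into fisheye-camera extrinsics.

    theta1, theta2: pitch and yaw angles (radians) of the projective transformation.
    Returns (R2, T2) with R2 = R @ R1 and T2 = T1, as stated in the description.
    """
    # Assumed construction of R: yaw about Z, then pitch about the new Y axis, which maps
    # the fisheye optical axis (+Z) onto the OD direction of fig. 5.
    cz, sz = np.cos(theta2), np.sin(theta2)
    cy, sy = np.cos(theta1), np.sin(theta1)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    R = Rz @ Ry
    R2 = R @ R1   # rotation matrix of the fisheye camera
    T2 = T1       # translation vector of the fisheye camera, per the relation above
    return R2, T2
```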
As an exemplary embodiment, in step S12, calibrating the extrinsic parameters of the equivalent bullet camera corresponding to the bullet-camera view image according to the real-world positioning information of the marker points and the bullet-camera view image includes:
determining intrinsic parameters of the equivalent bullet camera corresponding to the bullet-camera view image according to the projective transformation parameters between the image of the marker points acquired by the fisheye camera and the bullet-camera view image; and
calibrating the extrinsic parameters of the equivalent bullet camera according to the real-world positioning information of the marker points, the bullet-camera view image and the intrinsic parameters of the equivalent bullet camera.
Because the intrinsic parameters relate to the characteristics of a camera, including its focal length and distortion coefficients, and the projective transformation is what reduces or eliminates the distortion of the image acquired by the fisheye camera, the intrinsic parameters of the equivalent bullet camera can be determined from the projective transformation parameters. The intrinsic parameters give the transformation between the image pixel coordinate system and the camera coordinate system. From the real-world positioning information of the marker points and their coordinates in the bullet-camera view image, the transformation between the image pixel coordinate system and the world coordinate system can be obtained; combining the two transformations yields the transformation between the camera coordinate system and the world coordinate system, i.e. the calibrated extrinsic parameters of the equivalent bullet camera.
According to this embodiment, determining the intrinsic parameters of the equivalent bullet camera makes it possible to obtain accurate extrinsic parameters of the fisheye camera, improving the accuracy of the fisheye camera's extrinsic calibration.
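One possible realisation is sketched below, assuming the bullet-camera view image samples the tangent plane uniformly around the tangent point D, with a freely chosen output size and field of view (all parameter names are assumptions):

```python
import numpy as np

def equivalent_intrinsics(width, height, hfov_deg):
    """Build the intrinsic matrix of the equivalent bullet camera from projection parameters.

    width, height: size of the bullet-camera view image in pixels.
    hfov_deg:      horizontal field of view spanned by the tangent-plane sampling.
    With the tangent point D mapped to the image centre and square pixels assumed,
    the focal length in pixels follows from f = (width / 2) / tan(hfov / 2).
    """
    f = (width / 2.0) / np.tan(np.deg2rad(hfov_deg) / 2.0)
    cx, cy = width / 2.0, height / 2.0
    return np.array([[f, 0.0, cx],
                     [0.0, f, cy],
                     [0.0, 0.0, 1.0]])
```

The resulting matrix is what the PnP sketch earlier would take as K_eq for the equivalent bullet camera.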
As an implementation of the above methods, the application further provides a camera calibration apparatus. Fig. 6 shows a schematic diagram of the camera calibration apparatus. As shown in fig. 6, the apparatus includes:
an image projection module 610 configured to perform a projective transformation on an image of marker points acquired by a fisheye camera to obtain a bullet-camera view image;
an equivalent bullet camera calibration module 620 configured to calibrate extrinsic parameters of an equivalent bullet camera corresponding to the bullet-camera view image according to the real-world positioning information of the marker points and the bullet-camera view image; and
a fisheye camera calibration module 630 configured to obtain extrinsic parameters of the fisheye camera from the extrinsic parameters of the equivalent bullet camera.
Illustratively, the image projection module is configured to perform a projective transformation on the image of the marker points acquired by the fisheye camera based on a spherical projective transformation model to obtain a bullet-camera view image having the same visual range as an image acquired by a target bullet camera.
Exemplarily, as shown in fig. 7, the camera calibration apparatus further includes:
a target bullet camera calibration module 640 configured to calibrate extrinsic parameters of the target bullet camera according to the real-world positioning information of the marker points and an image of the marker points acquired by the target bullet camera.
Illustratively, as shown in fig. 7, the image projection module 610 includes:
a traversing unit 611 configured to traverse a rotation angle within a predetermined range;
a first determining unit 612 configured to determine a corresponding spherical projective transformation model based on the traversed rotation angle;
a projection unit 613 configured to perform, based on the spherical projective transformation model, a projective transformation on the image of the marker points acquired by the fisheye camera to obtain a projection image corresponding to the spherical projective transformation model; and
a second determining unit 614 configured to determine the projection image corresponding to the spherical projective transformation model as the bullet-camera view image in the case where that projection image and the image acquired by the target bullet camera have the same visual range and the display region of the visual range in the projection image meets a predetermined condition.
Illustratively, as shown in fig. 7, the fisheye camera calibration module 630 includes:
a third determining unit 631 configured to determine a rotation parameter between the equivalent bullet camera and the fisheye camera according to the projective transformation parameters between the image of the marker points acquired by the fisheye camera and the bullet-camera view image; and
a fisheye calibration unit 632 configured to obtain the extrinsic parameters of the fisheye camera from the rotation parameter and the extrinsic parameters of the equivalent bullet camera.
Illustratively, as shown in fig. 7, the equivalent bullet camera calibration module 620 includes:
an intrinsic calibration unit 621 configured to determine intrinsic parameters of the equivalent bullet camera corresponding to the bullet-camera view image according to the projective transformation parameters between the image of the marker points acquired by the fisheye camera and the bullet-camera view image; and
an extrinsic calibration unit 622 configured to calibrate the extrinsic parameters of the equivalent bullet camera according to the real-world positioning information of the marker points, the bullet-camera view image and the intrinsic parameters of the equivalent bullet camera.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
Fig. 8 is a block diagram of an electronic device for the camera calibration method according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown here, their connections and relationships, and their functions are meant as examples only and are not meant to limit the implementations of the present application described and/or claimed herein.
As shown in fig. 8, the electronic device includes: one or more processors 801, a memory 802, and interfaces for connecting the various components, including a high-speed interface and a low-speed interface. The components are interconnected by different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executed within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to an interface). In other embodiments, multiple processors and/or multiple buses may be used together with multiple memories, if desired. Likewise, multiple electronic devices may be connected, with each device providing part of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). In fig. 8, one processor 801 is taken as an example.
The memory 802 is a non-transitory computer readable storage medium as provided herein. The memory stores instructions executable by the at least one processor to cause the at least one processor to perform the camera calibration method provided by the present application. The non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to perform the camera calibration method provided herein.
The memory 802, as a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as the program instructions/modules corresponding to the camera calibration method in the embodiments of the present application (for example, the image projection module 610, the equivalent bullet camera calibration module 620 and the fisheye camera calibration module 630 shown in fig. 6). By running the non-transitory software programs, instructions and modules stored in the memory 802, the processor 801 executes the various functional applications and data processing of the server, thereby implementing the camera calibration method of the above method embodiments.
The memory 802 may include a program storage area and a data storage area, where the program storage area may store an operating system and application programs required by at least one function, and the data storage area may store data created through use of the electronic device for the camera calibration method, and the like. Further, the memory 802 may include high-speed random access memory and may also include non-transitory memory, such as at least one magnetic disk storage device, a flash memory device, or other non-transitory solid-state storage device. In some embodiments, the memory 802 optionally includes memory located remotely from the processor 801, and such remote memory may be connected to the electronic device for the camera calibration method over a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device of the camera calibration method may further include: an input device 803 and an output device 804. The processor 801, the memory 802, the input device 803, and the output device 804 may be connected by a bus or other means, and are exemplified by a bus in fig. 8.
The input device 803 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device for the camera calibration method; examples include a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, a joystick, and other input devices. The output device 804 may include a display device, auxiliary lighting devices (e.g., LEDs), haptic feedback devices (e.g., vibration motors), and the like. The display device may include, but is not limited to, a liquid crystal display (LCD), a light-emitting diode (LED) display, and a plasma display. In some implementations, the display device may be a touch screen.
According to the embodiment of the application, the application also provides the roadside device. The apparatus may include:
one or more processors; and
a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the camera calibration method of the above method embodiments.
For the functions and implementations of the processor and the storage device of the roadside apparatus, reference may be made to the description of the processor and the memory in the above embodiment of the electronic device.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application specific ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and addresses the drawbacks of difficult management and weak service scalability in traditional physical hosts and Virtual Private Server (VPS) services. The server may also be a server of a distributed system, or a server combined with a blockchain.
According to the technical solution of the embodiments of the application, the image of the marker points acquired by the fisheye camera is transformed by projection into a bullet-camera view image, so that the extrinsic calibration of the fisheye camera is converted into the extrinsic calibration of the equivalent bullet camera; this reduces the complexity of solving for the fisheye camera's extrinsic parameters and gives higher accuracy. In addition, a fisheye camera deployed on a road is often used to cover the blind zone between the bullet cameras on that road; converting the fisheye camera's extrinsic calibration into the extrinsic calibration of an equivalent bullet camera makes it convenient to calibrate both the road's bullet cameras and the fisheye camera with marker points at the same locations, which reduces the number of marker points and therefore the cost and road-safety risk of placing them manually. Applied to an intelligent traffic perception system, the embodiments of the application can improve perception precision and robustness.
It should be understood that the various forms of flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present application can be achieved; no limitation is imposed herein.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.
Claims (15)
1. A camera calibration method, comprising:
performing a projective transformation on an image of marker points acquired by a fisheye camera to obtain a bullet-camera view image;
calibrating extrinsic parameters of an equivalent bullet camera corresponding to the bullet-camera view image according to real-world positioning information of the marker points and the bullet-camera view image; and
obtaining extrinsic parameters of the fisheye camera from the extrinsic parameters of the equivalent bullet camera.
2. The method of claim 1, wherein performing the projective transformation on the image of the marker points acquired by the fisheye camera to obtain the bullet-camera view image comprises:
performing a projective transformation on the image of the marker points acquired by the fisheye camera based on a spherical projective transformation model to obtain a bullet-camera view image having the same visual range as an image acquired by a target bullet camera.
3. The method of claim 2, further comprising:
calibrating extrinsic parameters of the target bullet camera according to the real-world positioning information of the marker points and an image of the marker points acquired by the target bullet camera.
4. The method of claim 2, wherein performing the projective transformation on the image of the marker points acquired by the fisheye camera based on the spherical projective transformation model to obtain the bullet-camera view image having the same visual range as the image acquired by the target bullet camera comprises:
traversing a rotation angle within a predetermined range;
determining a corresponding spherical projective transformation model based on the traversed rotation angle;
performing, based on the spherical projective transformation model, a projective transformation on the image of the marker points acquired by the fisheye camera to obtain a projection image corresponding to the spherical projective transformation model; and
determining the projection image corresponding to the spherical projective transformation model as the bullet-camera view image in the case where that projection image and the image acquired by the target bullet camera have the same visual range and the visual range in the projection image meets a predetermined condition.
5. The method according to any one of claims 1 to 4, wherein obtaining the extrinsic parameters of the fisheye camera from the extrinsic parameters of the equivalent bullet camera comprises:
determining a rotation parameter between the equivalent bullet camera and the fisheye camera according to projective transformation parameters between the image of the marker points acquired by the fisheye camera and the bullet-camera view image; and
obtaining the extrinsic parameters of the fisheye camera from the rotation parameter and the extrinsic parameters of the equivalent bullet camera.
6. The method according to any one of claims 1 to 4, wherein calibrating the extrinsic parameters of the equivalent bullet camera corresponding to the bullet-camera view image according to the real-world positioning information of the marker points and the bullet-camera view image comprises:
determining intrinsic parameters of the equivalent bullet camera corresponding to the bullet-camera view image according to projective transformation parameters between the image of the marker points acquired by the fisheye camera and the bullet-camera view image; and
calibrating the extrinsic parameters of the equivalent bullet camera according to the real-world positioning information of the marker points, the bullet-camera view image and the intrinsic parameters of the equivalent bullet camera.
7. A camera calibration apparatus, comprising:
an image projection module configured to perform a projective transformation on an image of marker points acquired by a fisheye camera to obtain a bullet-camera view image;
an equivalent bullet camera calibration module configured to calibrate extrinsic parameters of an equivalent bullet camera corresponding to the bullet-camera view image according to real-world positioning information of the marker points and the bullet-camera view image; and
a fisheye camera calibration module configured to obtain extrinsic parameters of the fisheye camera from the extrinsic parameters of the equivalent bullet camera.
8. The apparatus of claim 7, wherein the image projection module is configured to perform a projective transformation on the image of the marker points acquired by the fisheye camera based on a spherical projective transformation model to obtain a bullet-camera view image having the same visual range as an image acquired by a target bullet camera.
9. The apparatus of claim 8, further comprising:
a target bullet camera calibration module configured to calibrate extrinsic parameters of the target bullet camera according to the real-world positioning information of the marker points and an image of the marker points acquired by the target bullet camera.
10. The apparatus of claim 9, wherein the image projection module comprises:
a traversing unit configured to traverse a rotation angle within a predetermined range;
a first determining unit configured to determine a corresponding spherical projective transformation model based on the traversed rotation angle;
a projection unit configured to perform, based on the spherical projective transformation model, a projective transformation on the image of the marker points acquired by the fisheye camera to obtain a projection image corresponding to the spherical projective transformation model; and
a second determining unit configured to determine the projection image corresponding to the spherical projective transformation model as the bullet-camera view image in the case where that projection image and the image acquired by the target bullet camera have the same visual range and the visual range in the projection image meets a predetermined condition.
11. The apparatus of any one of claims 7 to 10, wherein the fisheye camera calibration module comprises:
a third determining unit configured to determine a rotation parameter between the equivalent bullet camera and the fisheye camera according to projective transformation parameters between the image of the marker points acquired by the fisheye camera and the bullet-camera view image; and
a fisheye calibration unit configured to obtain the extrinsic parameters of the fisheye camera from the rotation parameter and the extrinsic parameters of the equivalent bullet camera.
12. The apparatus of any one of claims 7 to 10, wherein the equivalent bullet camera calibration module comprises:
an intrinsic calibration unit configured to determine intrinsic parameters of the equivalent bullet camera corresponding to the bullet-camera view image according to projective transformation parameters between the image of the marker points acquired by the fisheye camera and the bullet-camera view image; and
an extrinsic calibration unit configured to calibrate the extrinsic parameters of the equivalent bullet camera according to the real-world positioning information of the marker points, the bullet-camera view image and the intrinsic parameters of the equivalent bullet camera.
13. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-6.
14. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-6.
15. A roadside apparatus, the apparatus comprising:
one or more processors; and
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the method of any one of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011177580.3A CN112288825B (en) | 2020-10-29 | 2020-10-29 | Camera calibration method, camera calibration device, electronic equipment, storage medium and road side equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112288825A true CN112288825A (en) | 2021-01-29 |
CN112288825B CN112288825B (en) | 2024-04-12 |
Family
ID=74373753
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011177580.3A Active CN112288825B (en) | 2020-10-29 | 2020-10-29 | Camera calibration method, camera calibration device, electronic equipment, storage medium and road side equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112288825B (en) |
Patent Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101025824A (en) * | 2007-03-30 | 2007-08-29 | 东南大学 | Calibrating method based on fixed parameters and variable parameters for three-dimensional scanning system |
CN102096923A (en) * | 2011-01-20 | 2011-06-15 | 上海杰图软件技术有限公司 | Fisheye calibration method and device |
CN103559707A (en) * | 2013-10-30 | 2014-02-05 | 同济大学 | Industrial fixed-focus camera parameter calibration method based on moving square target calibration object |
CN107886547A (en) * | 2017-11-10 | 2018-04-06 | 长沙全度影像科技有限公司 | A kind of fisheye camera scaling method and system |
CN108257183A (en) * | 2017-12-20 | 2018-07-06 | 歌尔科技有限公司 | A kind of camera lens axis calibrating method and device |
CN108495085A (en) * | 2018-03-14 | 2018-09-04 | 成都新舟锐视科技有限公司 | A kind of ball machine automatic tracking control method and system based on moving target detection |
WO2019227079A1 (en) * | 2018-05-25 | 2019-11-28 | Aquifi, Inc. | Systems and methods for multi-camera placement |
CN109003311A (en) * | 2018-08-22 | 2018-12-14 | 上海庄生晓梦信息科技有限公司 | A kind of fish-eye scaling method |
Non-Patent Citations (1)
Title |
---|
FU DAN et al.: "Camera parameter calibration based on geometric invariance of straight lines", Journal of Image and Graphics (中国图象图形学报), vol. 14, no. 06, pages 1058-1063 *
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112967344A (en) * | 2021-03-09 | 2021-06-15 | 北京百度网讯科技有限公司 | Method, apparatus, storage medium, and program product for camera external reference calibration |
CN112991459A (en) * | 2021-03-09 | 2021-06-18 | 北京百度网讯科技有限公司 | Camera calibration method, device, equipment and storage medium |
CN112967344B (en) * | 2021-03-09 | 2023-12-08 | 阿波罗智联(北京)科技有限公司 | Method, device, storage medium and program product for calibrating camera external parameters |
CN112991459B (en) * | 2021-03-09 | 2023-12-12 | 阿波罗智联(北京)科技有限公司 | Camera calibration method, device, equipment and storage medium |
CN113592951A (en) * | 2021-07-14 | 2021-11-02 | 阿波罗智联(北京)科技有限公司 | Method and device for calibrating external parameters of vehicle-road cooperative middle-road side camera and electronic equipment |
CN114742897A (en) * | 2022-03-31 | 2022-07-12 | 阿波罗智联(北京)科技有限公司 | Method, device and equipment for processing camera installation information of roadside sensing system |
CN115018967A (en) * | 2022-06-30 | 2022-09-06 | 联通智网科技股份有限公司 | Image generation method, device, equipment and storage medium |
CN115018967B (en) * | 2022-06-30 | 2024-05-03 | 联通智网科技股份有限公司 | Image generation method, device, equipment and storage medium |
CN115731525A (en) * | 2022-11-21 | 2023-03-03 | 禾多科技(北京)有限公司 | Lane line recognition method and device, electronic equipment and computer readable medium |
CN115731525B (en) * | 2022-11-21 | 2023-07-25 | 禾多科技(北京)有限公司 | Lane line identification method, lane line identification device, electronic equipment and computer readable medium |
Also Published As
Publication number | Publication date |
---|---|
CN112288825B (en) | 2024-04-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112288825B (en) | Camera calibration method, camera calibration device, electronic equipment, storage medium and road side equipment | |
JP7240367B2 (en) | Methods, apparatus, electronic devices and storage media used for vehicle localization | |
CN109461211B (en) | Semantic vector map construction method and device based on visual point cloud and electronic equipment | |
JP6552729B2 (en) | System and method for fusing the outputs of sensors having different resolutions | |
CN112652016B (en) | Point cloud prediction model generation method, pose estimation method and pose estimation device | |
CN111220154A (en) | Vehicle positioning method, device, equipment and medium | |
CN112509057B (en) | Camera external parameter calibration method, device, electronic equipment and computer readable medium | |
EP3968266B1 (en) | Obstacle three-dimensional position acquisition method and apparatus for roadside computing device | |
CN111815719A (en) | External parameter calibration method, device and equipment of image acquisition equipment and storage medium | |
CN112967344B (en) | Method, device, storage medium and program product for calibrating camera external parameters | |
JP2021119507A (en) | Traffic lane determination method, traffic lane positioning accuracy evaluation method, traffic lane determination apparatus, traffic lane positioning accuracy evaluation apparatus, electronic device, computer readable storage medium, and program | |
CN112344855B (en) | Obstacle detection method and device, storage medium and drive test equipment | |
CN112101209B (en) | Method and apparatus for determining world coordinate point cloud for roadside computing device | |
CN114283201A (en) | Camera calibration method and device and road side equipment | |
CN112184914B (en) | Method and device for determining three-dimensional position of target object and road side equipment | |
CN110929669A (en) | Data labeling method and device | |
CN111523471A (en) | Method, device and equipment for determining lane where vehicle is located and storage medium | |
CN111998959B (en) | Temperature calibration method and device based on real-time temperature measurement system and storage medium | |
CN111340890A (en) | Camera external reference calibration method, device, equipment and readable storage medium | |
CN111666876A (en) | Method and device for detecting obstacle, electronic equipment and road side equipment | |
CN111369632A (en) | Method and device for acquiring internal parameters in camera calibration | |
CN111932611B (en) | Object position acquisition method and device | |
CN111612851B (en) | Method, apparatus, device and storage medium for calibrating camera | |
CN111833443A (en) | Landmark position reconstruction in autonomous machine applications | |
CN112565730A (en) | Roadside sensing method and device, electronic equipment, storage medium and roadside equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
TA01 | Transfer of patent application right | Effective date of registration: 20211013; Address after: 100176 Room 101, 1st floor, building 1, yard 7, Ruihe West 2nd Road, Economic and Technological Development Zone, Daxing District, Beijing; Applicant after: Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd.; Address before: 2/F, Baidu Building, 10 Shangdi 10th Street, Haidian District, Beijing 100085; Applicant before: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd.
GR01 | Patent grant | |