CN112288825B - Camera calibration method, camera calibration device, electronic equipment, storage medium and road side equipment - Google Patents

Camera calibration method, camera calibration device, electronic equipment, storage medium and road side equipment Download PDF

Info

Publication number
CN112288825B
CN112288825B (application number CN202011177580.3A)
Authority
CN
China
Prior art keywords
camera
image
equivalent
projection
gun
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011177580.3A
Other languages
Chinese (zh)
Other versions
CN112288825A (en)
Inventor
苑立彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apollo Intelligent Connectivity Beijing Technology Co Ltd
Original Assignee
Apollo Intelligent Connectivity Beijing Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apollo Intelligent Connectivity Beijing Technology Co Ltd filed Critical Apollo Intelligent Connectivity Beijing Technology Co Ltd
Priority to CN202011177580.3A priority Critical patent/CN112288825B/en
Publication of CN112288825A publication Critical patent/CN112288825A/en
Application granted granted Critical
Publication of CN112288825B publication Critical patent/CN112288825B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 - Geometric image transformations in the plane of the image
    • G06T3/04 - Context-preserving transformations, e.g. by using an importance map
    • G06T3/047 - Fisheye or wide-angle transformations

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses a camera calibration method, a camera calibration device, electronic equipment, a storage medium and road side equipment, and relates to the fields of computer vision and intelligent transportation. The specific implementation scheme is as follows: performing projection transformation on the image of the mark points acquired by the fisheye camera to obtain a gun camera view angle image; calibrating the external parameters of the equivalent gun camera corresponding to the gun camera view angle image according to the real positioning information of the mark points and the gun camera view angle image; and obtaining the external parameters of the fisheye camera according to the external parameters of the equivalent gun camera. According to the technical scheme, the amount of manual dotting can be reduced and the calibration precision improved.

Description

Camera calibration method, camera calibration device, electronic equipment, storage medium and road side equipment
Technical Field
The application relates to the field of computer technology, in particular to the field of computer vision and intelligent transportation.
Background
The application of internet of vehicles technology depends on comprehensive perception of the people, vehicles and objects on the road. Cameras that capture road images are an important class of sensing devices; the image information they provide offers basic support for various traffic applications and information services. In order to obtain three-dimensional information about the real world from two-dimensional images, the cameras need to be calibrated. For cameras deployed on a road, the calibration process typically includes placing mark points on the road and deriving the camera parameters from the image information of the mark points.
Disclosure of Invention
The application provides a camera calibration method, a camera calibration device, electronic equipment, a storage medium and road side equipment.
According to an aspect of the present application, there is provided a camera calibration method, including:
performing projection transformation on the image of the mark point acquired by the fisheye camera to obtain a gun camera visual angle image;
calibrating external parameters of the equivalent gun camera corresponding to the gun camera visual angle image according to the real positioning information of the mark points and the gun camera visual angle image;
and obtaining the external parameters of the fisheye camera according to the external parameters of the equivalent gun camera.
According to another aspect of the present application, there is provided a camera calibration apparatus including:
the image projection module is used for carrying out projection transformation on the image of the mark points acquired by the fisheye camera to obtain a gun camera view angle image;
the equivalent gun camera calibration module is used for calibrating the external parameters of the equivalent gun camera corresponding to the gun camera view angle image according to the real positioning information of the mark points and the gun camera view angle image;
and the fisheye camera calibration module is used for obtaining the external parameters of the fisheye camera according to the external parameters of the equivalent gun camera.
According to another aspect of the present application, there is provided an electronic device including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the methods of any one of the embodiments of the present application.
According to another aspect of the present application, there is provided a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method of any of the embodiments of the present application.
According to another aspect of the present application, there is provided a computer program product comprising a computer program which, when executed by a processor, implements a method as described above.
According to another aspect of the present application, there is provided a roadside apparatus, the apparatus comprising:
one or more processors; and
and the storage device is used for storing one or more programs, and when the one or more programs are executed by the one or more processors, the one or more processors are enabled to realize the method provided by the embodiment of the application.
According to the technical scheme, the calibration precision of the fisheye camera can be improved.
It should be understood that the description of this section is not intended to identify key or critical features of the embodiments of the application or to delineate the scope of the application. Other features of the present application will become apparent from the description that follows.
Drawings
The drawings are for better understanding of the present solution and do not constitute a limitation of the present application. Wherein:
FIG. 1 is a schematic diagram of a camera calibration method according to one embodiment of the present application;
FIG. 2 is an example of an image of a landmark point acquired by a fisheye camera in an embodiment of the application;
FIG. 3 is an example of a gun camera view angle image in an embodiment of the present application;
FIG. 4 is a schematic diagram of a camera calibration method according to another embodiment of the present application;
FIG. 5 is a schematic illustration of a spherical projective transformation model in an embodiment of the present application;
FIG. 6 is a schematic diagram of a camera calibration apparatus according to one embodiment of the present application;
FIG. 7 is a schematic diagram of a camera calibration apparatus according to another embodiment of the present application;
fig. 8 is a block diagram of an electronic device for implementing a camera calibration method of an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present application to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
In computer vision applications, in order to determine the correlation between the three-dimensional geometric position of a point on the surface of a spatial object and its corresponding point in an image, it is necessary to build the geometric model by which a camera forms images; the parameters of this geometric model are the camera parameters. The process of determining the camera parameters may be referred to as camera calibration. The camera parameters include internal parameters and external parameters. The internal parameters are related to the characteristics of the camera itself and may include, for example, the focal length and distortion coefficients of the camera; the external parameters express the relative positional relationship between the camera coordinate system and the world coordinate system, are related to the position and orientation of the camera, and may include, for example, rotation parameters and translation parameters between the camera coordinate system and the world coordinate system.
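For concreteness, the relationship expressed by the external parameters can be written in the following standard form (a general relation added here for illustration, not a formula from the original application), where R is the rotation matrix, T is the translation vector, X_world is a point's coordinates in the world coordinate system, and X_cam is its coordinates in the camera coordinate system:
X_cam = R * X_world + T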
In the field of intelligent transportation, novel infrastructure such as street lamps supporting V2X (Vehicle to Everything, internet of vehicles), road monitoring cameras and traffic lights can provide vehicles with perception information beyond their field of view, such as the positioning information of obstacles outside the field of view. The road monitoring camera is one of the most important sensors in an intelligent traffic perception system, and its accuracy plays a vital role in the accuracy and robustness of the perception system. The road monitoring cameras may include one or more of fisheye cameras, gun cameras (fixed-view bullet or box cameras), dome cameras, and the like.
The camera calibration method can be used for external parameter calibration of the fish-eye camera in the intelligent traffic perception system. The execution subject of the method may be various road side devices, such as a road side sensing device with a computing function, a road side computing device connected with the road side sensing device, a server device connected with the road side computing device, or a server device directly connected with the road side sensing device.
In one system architecture for intelligent traffic roadway collaboration, a roadside device includes a roadside awareness device and a roadside computing device. Wherein a road side perception device, such as a road side camera for acquiring traffic light images, is connected to a road side computing device, such as a road side computing unit (Road Side Computing Unit, RSCU), which is connected to a server device, which can communicate with an autonomous or assisted driving vehicle in various ways. In another system architecture for intelligent traffic road collaboration, the road side awareness device itself includes a computing function, and then the road side awareness device is directly connected to the server device. The above connections may be wired or wireless; the server device in the application is, for example, a cloud control platform, a vehicle-road collaborative management platform, a central subsystem, an edge computing platform, a cloud computing platform and the like.
FIG. 1 shows a schematic diagram of a camera calibration method according to one embodiment of the present application. As shown in fig. 1, the method includes:
step S11, performing projection transformation on the image of the mark points acquired by the fisheye camera to obtain a gun camera view angle image;
step S12, calibrating external parameters of the equivalent gun camera corresponding to the gun camera view angle image according to the real positioning information of the mark points and the gun camera view angle image;
and S13, obtaining the external parameters of the fisheye camera according to the external parameters of the equivalent gun camera.
For example, a fisheye camera may be installed near an intersection, e.g., above the road for each direction of travel of the roads that meet at the intersection.
Fig. 2 is an example of an image of mark points acquired by a fisheye camera. As shown in fig. 2, the field of view of the fisheye camera is large, and a plurality of lanes below the fisheye camera can be captured over a long range, but barrel distortion exists; for example, the lane line 22 on which the mark point 21 is located shows obvious bending deformation in the image. By performing projection transformation on the image acquired by the fisheye camera, the barrel distortion in the image can be reduced or eliminated, and a gun camera view angle image is obtained. The gun camera view angle image refers to an image that could equivalently be acquired by a gun camera at a fixed orientation angle, and this gun camera may be recorded as the equivalent gun camera corresponding to the gun camera view angle image. Fig. 3 is an example of a gun camera view angle image. As shown in fig. 3, the image may exhibit the usual near-large, far-small perspective, but without significant barrel distortion; the lane line 22 on which the mark point 21 is located is relatively straight.
In practical application, a plurality of mark points can be set on the road surface, an image of the road surface area containing the mark points is acquired by the fisheye camera, and RTK (Real-Time Kinematic) equipment is carried manually onto the road surface to obtain the positioning information of the mark points in the real world, i.e., the real positioning information, which may contain three-dimensional coordinates.
As an alternative embodiment, the coordinates of the mark points in the gun camera view angle image may be determined by identifying the mark points in the image through image recognition or manual annotation. A PnP (Perspective-n-Point) solution is then performed using the coordinates of the mark points in the gun camera view angle image and their three-dimensional coordinates, so as to obtain the external parameters of the equivalent gun camera. Because the gun camera view angle image is obtained by projection transformation of the image acquired by the fisheye camera, the external parameters of the fisheye camera can be calculated from the external parameters of the equivalent gun camera corresponding to the gun camera view angle image.
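As an illustrative sketch (not part of the original disclosure), the PnP solution step described above could be implemented with OpenCV as follows; the intrinsic matrix K of the equivalent gun camera and the mark-point correspondences are assumed inputs here.

```python
# A minimal sketch of the PnP-based extrinsic calibration of the equivalent gun
# camera, assuming the mark points' pixel coordinates in the gun camera view
# angle image and their RTK-measured world coordinates are already available.
# The intrinsic matrix K of the equivalent gun camera is an assumed input.
import cv2
import numpy as np

def calibrate_equivalent_gun_camera(world_points, image_points, K, dist=None):
    """Solve the external parameters (R1, T1) of the equivalent gun camera.

    world_points: (N, 3) real positioning information of the mark points (meters)
    image_points: (N, 2) pixel coordinates of the mark points in the gun camera view image
    K:            (3, 3) intrinsic matrix of the equivalent gun camera
    """
    world_points = np.asarray(world_points, dtype=np.float64)
    image_points = np.asarray(image_points, dtype=np.float64)
    dist = np.zeros(5) if dist is None else dist  # projection already removes distortion

    ok, rvec, tvec = cv2.solvePnP(world_points, image_points, K, dist,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("PnP solution failed")
    R1, _ = cv2.Rodrigues(rvec)   # rotation matrix of the equivalent gun camera
    return R1, tvec               # external parameters R1, T1
```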
Because the imaging model of the fisheye camera is complex, calibrating the fisheye camera directly requires approximating it by a pinhole camera model, which has high computational complexity and large error. With the camera calibration method of the embodiments, the image of the mark points acquired by the fisheye camera is projected into a gun camera view angle image, so that the external parameter calibration of the fisheye camera is converted into the external parameter calibration of the equivalent gun camera; this reduces the complexity of solving for the external parameters of the fisheye camera and yields higher accuracy. Applied to an intelligent traffic perception system, the method can improve perception precision and robustness. In addition, a fisheye camera deployed on a road is often used to cover the blind area between the fields of view of the gun cameras on the road; converting the external parameter calibration of the fisheye camera into the external parameter calibration of an equivalent gun camera makes it convenient to calibrate the gun cameras and the fisheye camera on the road with the same mark points, thereby reducing the number of mark points and reducing the cost of manual dotting and the road safety risk.
Illustratively, in the step S11, performing projection transformation on the image of the mark points acquired by the fisheye camera to obtain a gun camera view angle image may include:
performing projection transformation on the image of the mark points acquired by the fisheye camera based on a spherical projection transformation model to obtain a gun camera view angle image having the same visual range as the image acquired by the target gun camera.
The spherical projection transformation model is used to represent the projective transformation relationship between a spherical image and the gun camera view angle image, where the spherical image can be obtained by converting the image acquired by the fisheye camera. The process of obtaining the gun camera view angle image based on the spherical projection transformation model can be as shown in fig. 4: first, the image collected by the fisheye camera is projected onto a spherical projection surface based on the lens design model of the fisheye camera, such as an equidistant projection model or an equal solid angle projection model, to obtain the spherical image; then the spherical image is converted into the gun camera view angle image based on the spherical projection transformation model.
Fig. 5 shows a schematic diagram of the spherical projection transformation model. O is the projection center, the spherical surface 51 is the spherical projection surface of the image acquired by the fisheye camera, and points on the spherical surface 51 can be represented in the XYZ spatial coordinate system. The plane 52 is the projection plane corresponding to the gun camera view angle image, i.e. the imaging plane of the equivalent gun camera, and points on the plane 52 can be represented in the uv plane coordinate system. The spherical surface 51 and the plane 52 are tangent at the point D, and the projection of OD on the XOY plane is OD'. θ1 is the included angle between OD and the Z-axis direction, and θ2 is the angle of counter-clockwise rotation from the X-axis direction to OD'; θ1 may be referred to as the pitch angle and θ2 as the yaw angle. The spherical projection surface 51 can be determined based on the internal parameters or the lens design model of the fisheye camera; on the basis of the determined spherical projection surface 51 and given the pitch angle θ1 and the yaw angle θ2, the point D can be determined, and thereby the plane 52. Once the plane 52 is determined, the corresponding point on the plane 52 can be determined from any point on the spherical projection surface 51 according to the geometric relationship. For example, the coordinates of the point P, the projection onto the plane 52 of the point R on the spherical projection surface 51 in fig. 5, can be determined from the coordinates of R, so that the spherical image can be converted into the equivalent gun camera image. In practical application, the pitch angle θ1 and the yaw angle θ2 can be preset to obtain the corresponding spherical projection transformation model.
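The following sketch (added for illustration, not part of the original disclosure) shows one way the projection just described could be realized in Python, assuming an equidistant-projection fisheye lens model; the fisheye focal length f_fish, the equivalent focal length f_eq, the output size and the angles theta1/theta2 are illustrative parameters rather than values fixed by the patent.

```python
# A sketch of the spherical projection transformation: for each pixel of the
# gun camera view angle image (tangent plane 52), form the viewing ray, map it
# back to the fisheye image through the equidistant model, and resample.
import cv2
import numpy as np

def fisheye_to_gun_view(fish_img, f_fish, f_eq, out_size, theta1, theta2):
    h_out, w_out = out_size
    h_f, w_f = fish_img.shape[:2]
    cx_f, cy_f = w_f / 2.0, h_f / 2.0      # fisheye principal point (assumed)
    cx, cy = w_out / 2.0, h_out / 2.0      # equivalent gun camera principal point

    # Tangent point D on the unit sphere, defined by pitch theta1 and yaw theta2
    d = np.array([np.sin(theta1) * np.cos(theta2),
                  np.sin(theta1) * np.sin(theta2),
                  np.cos(theta1)])
    # Orthonormal axes of the tangent (imaging) plane 52
    e_u = np.cross([0.0, 0.0, 1.0], d)
    n = np.linalg.norm(e_u)
    e_u = e_u / n if n > 1e-9 else np.array([1.0, 0.0, 0.0])
    e_v = np.cross(d, e_u)

    # Inverse mapping: viewing ray for every output pixel
    u, v = np.meshgrid(np.arange(w_out), np.arange(h_out))
    rays = (f_eq * d[None, None, :]
            + (u - cx)[..., None] * e_u[None, None, :]
            + (v - cy)[..., None] * e_v[None, None, :])
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

    # Equidistant fisheye model: image radius proportional to the angle between
    # the ray and the fisheye optical axis (Z axis)
    theta = np.arccos(np.clip(rays[..., 2], -1.0, 1.0))
    phi = np.arctan2(rays[..., 1], rays[..., 0])
    r = f_fish * theta
    map_x = (cx_f + r * np.cos(phi)).astype(np.float32)
    map_y = (cy_f + r * np.sin(phi)).astype(np.float32)
    return cv2.remap(fish_img, map_x, map_y, cv2.INTER_LINEAR)
```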
By way of example, gun cameras for monitoring traffic conditions at certain fixed angles may also be provided beside the fisheye camera above the road, for example a forward-looking gun camera facing the direction of travel of the road and/or a rear-looking gun camera facing the opposite direction. The fisheye camera may be directed toward the road surface below to cover the blind area between the fields of view of the forward-looking gun camera and the rear-looking gun camera. In the embodiments of the present application, the target gun camera includes the forward-looking gun camera and/or the rear-looking gun camera that monitors the same road as the fisheye camera.
For example, taking the coordinates of the fisheye camera as the origin, the visual range of the image captured by the fisheye camera is 50 meters in front of and behind the origin, and by adjusting the pitch angle θ1 and the yaw angle θ2, the visual range of the gun camera view angle image can be adjusted within 50 meters in front of or behind the origin. Assuming that the visual range of the image collected by the forward-looking gun camera is 10 to 70 meters in front of the origin, then according to the above embodiment the visual range of the gun camera view angle image may be adjusted to any area within 10 to 50 meters in front of the origin, so that the gun camera view angle image and the image collected by the forward-looking gun camera contain the same visual range. For example, the visual range of the gun camera view angle image may be 5 to 30 meters or 20 to 50 meters in front of the origin. If a group of mark points is arranged within the same visual range, this group of mark points can be used to calibrate both the forward-looking gun camera and the fisheye camera. Furthermore, the same visual range can be adjusted within the gun camera view angle image so that the area around the mark points is enlarged in the gun camera view, thereby improving the calibration precision.
According to the above embodiment, because the parameters of the spherical projection transformation model are adjustable, obtaining the gun camera view angle image based on the spherical projection transformation model makes it convenient to adjust the visual range of the gun camera view angle image, so that the gun camera view angle image and the image acquired by the target gun camera can have the same visual range, and this shared visual range is adjustable within the image. The same mark points can thus be used to calibrate both the fisheye camera and the target gun camera, which reduces the number of mark points and can improve the calibration precision.
As an exemplary embodiment, the camera calibration method may further include:
and calibrating the external parameters of the target gun camera according to the real positioning information of the mark points and the images of the mark points acquired by the target gun camera.
According to this embodiment, the same mark points are used not only to calibrate the equivalent gun camera and thereby the fisheye camera, but also to calibrate the target gun camera, so that the number of mark points is reduced and the cost and safety risk of manual dotting are reduced.
As an exemplary embodiment, in the step S11, performing projection transformation on the image of the mark points collected by the fisheye camera based on the spherical projection transformation model to obtain a gun camera view angle image having the same visual range as the image collected by the target gun camera includes:
traversing the rotation angle within a predetermined range;
determining a corresponding spherical projection transformation model based on the traversed rotation angle;
based on the spherical projection transformation model, performing projection transformation on the image of the mark point acquired by the fisheye camera to obtain a projection image corresponding to the spherical projection transformation model;
and, in the case that the projection image corresponding to the spherical projection transformation model and the image acquired by the target gun camera have the same visible range and the display area of this visible range in the projection image meets a predetermined condition, determining the projection image corresponding to the spherical projection transformation model as the gun camera view angle image.
For example, the rotation angle may include a pitch angle and/or a yaw angle of the imaging plane of the gun camera view angle image relative to the spherical projection surface of the fisheye camera.
For example, the pitch angle may be preset to a certain fixed value, for example 45°, and the yaw angle is then traversed every 5° within 0° to 90°, i.e. 0°, 5°, 10°, and so on. Each time an angle is traversed, a spherical projection transformation model is determined and the corresponding projection image is obtained. The projection image meeting the condition is determined as the gun camera view angle image.
As another example, the pitch angle may be traversed every 5° within 0° to 90°. When the pitch angle traversal reaches a certain angle, for example 5°, the yaw angle is traversed every 5° within 0° to 90°; for each combination a spherical projection transformation model is determined and the corresponding projection image is obtained. The projection image meeting the condition is determined as the gun camera view angle image.
Illustratively, the predetermined conditions may include: the area of the display area of the same visual range in the projection image exceeds a predetermined threshold and/or the display area is in a centered position in the projection image, etc.
According to the above embodiment, the corresponding projection images are obtained by traversing the rotation angle, and the projection image meeting the condition is selected as the gun camera view angle image. The traversal precision can be controlled to ensure that a gun camera view angle image meeting the condition is obtained, which ensures the accuracy of the camera calibration.
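As an illustrative sketch of this traversal (added here, not part of the original disclosure), reusing fisheye_to_gun_view() from the earlier snippet; markers_visible() and the area/centering thresholds are hypothetical stand-ins for the predetermined condition described above.

```python
# Traverse the yaw angle at a fixed pitch and return the first projection image
# whose shared visible range satisfies the (assumed) predetermined condition.
import numpy as np

def search_gun_view(fish_img, f_fish, f_eq, out_size, pitch_deg=45.0,
                    markers_visible=None, min_area_ratio=0.2):
    theta1 = np.deg2rad(pitch_deg)
    for yaw_deg in range(0, 91, 5):            # traverse yaw every 5 degrees
        theta2 = np.deg2rad(yaw_deg)
        proj = fisheye_to_gun_view(fish_img, f_fish, f_eq, out_size, theta1, theta2)
        # markers_visible() is assumed to return the pixel bounding box of the
        # visible range shared with the target gun camera, or None.
        box = markers_visible(proj) if markers_visible else None
        if box is None:
            continue
        x0, y0, x1, y1 = box
        area_ok = (x1 - x0) * (y1 - y0) >= min_area_ratio * proj.shape[0] * proj.shape[1]
        center_ok = abs((x0 + x1) / 2 - proj.shape[1] / 2) < 0.25 * proj.shape[1]
        if area_ok and center_ok:
            return proj, pitch_deg, yaw_deg
    return None
```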
As an exemplary embodiment, the step S13 of obtaining the external parameters of the fisheye camera according to the external parameters of the equivalent gun camera may include:
determining a rotation parameter between the equivalent gun camera and the fisheye camera according to projection transformation parameters between the image of the mark point acquired by the fisheye camera and the gun camera view angle image;
and obtaining the external parameters of the fisheye camera according to the rotation parameters and the external parameters of the equivalent gun camera.
The projection transformation parameters may include the pitch angle and yaw angle of the imaging plane of the gun camera view angle image relative to the spherical projection surface of the fisheye camera, the position of the tangent point between the imaging plane of the gun camera view angle image and the spherical projection surface of the fisheye camera, and the like. The rotation parameters may include a rotation matrix.
For example, the rotation matrix R between the fisheye camera and the equivalent gun camera can be determined from the pitch angle and yaw angle of the imaging plane of the gun camera view angle image relative to the spherical projection surface of the fisheye camera. Assuming that the external parameters of the equivalent gun camera include a rotation matrix R1 and a translation vector T1, then:
the rotation matrix of the fisheye camera is R2 = R * R1;
the translation vector of the fisheye camera is T2 = T1.
According to the embodiment, the rotation parameters between the equivalent gun camera and the fisheye camera can be obtained based on the projection transformation parameters, so that the accurate fisheye camera external parameters can be obtained based on the external parameters of the equivalent gun camera, and the accuracy of calibrating the fisheye camera external parameters is improved.
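The composition just described could look as follows (an illustrative sketch, not part of the original disclosure); it follows the formulas R2 = R * R1 and T2 = T1 given above, while the axis convention used to build R from the tangent-plane axes is an assumption for illustration.

```python
# A sketch of composing the fisheye extrinsics from the equivalent gun camera
# extrinsics obtained by the PnP step.  Assumes theta1 != 0 so the tangent
# direction is not along the Z axis.
import numpy as np

def fisheye_extrinsics(theta1, theta2, R1, T1):
    """theta1/theta2: pitch and yaw of the gun view projection (radians);
    R1, T1: external parameters of the equivalent gun camera."""
    d = np.array([np.sin(theta1) * np.cos(theta2),
                  np.sin(theta1) * np.sin(theta2),
                  np.cos(theta1)])
    e_u = np.cross([0.0, 0.0, 1.0], d)
    e_u /= np.linalg.norm(e_u)
    e_v = np.cross(d, e_u)
    # Rotation from the equivalent gun camera frame to the fisheye camera frame
    R = np.column_stack([e_u, e_v, d])
    R2 = R @ R1        # rotation matrix of the fisheye camera
    T2 = T1            # translation vector, per the description above
    return R2, T2
```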
As an exemplary embodiment, in the step S12, calibrating the external parameters of the equivalent gun camera corresponding to the gun camera view angle image according to the real positioning information of the mark points and the gun camera view angle image includes:
according to the projection transformation parameters between the image of the mark points acquired by the fisheye camera and the gun camera view angle image, determining the internal parameters of the equivalent gun camera corresponding to the gun camera view angle image;
and calibrating the external parameters of the equivalent gun camera according to the real positioning information of the mark points, the gun camera view angle image and the internal parameters of the equivalent gun camera.
Since the internal parameters are related to the characteristics of the camera itself, including the focal length and distortion coefficients, the internal parameters of the equivalent gun camera can be determined from the projection transformation parameters used when the projection transformation is performed to reduce or eliminate the distortion of the image acquired by the fisheye camera. The conversion relationship between the image pixel coordinate system and the camera coordinate system can be obtained based on the internal parameters. Based on the real positioning information of the mark points and the coordinates of the mark points in the gun camera view angle image, the conversion relationship between the image pixel coordinate system and the world coordinate system can be obtained. Using the conversion relationship between the image pixel coordinate system and the camera coordinate system together with the conversion relationship between the image pixel coordinate system and the world coordinate system, the conversion relationship between the camera coordinate system and the world coordinate system can be obtained, and the external parameters of the equivalent gun camera are thus calibrated.
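A minimal sketch of forming such internal parameters (added for illustration, not part of the original disclosure) is given below, assuming a distortion-free pinhole model whose focal length equals the f_eq used to generate the gun camera view angle image and whose principal point is the output image center; these assumptions match the projection sketch above.

```python
# Intrinsic matrix of the equivalent gun camera derived from the projection
# transformation parameters, under the assumptions stated in the lead-in.
import numpy as np

def equivalent_gun_intrinsics(f_eq, out_size):
    h_out, w_out = out_size
    K = np.array([[f_eq, 0.0,  w_out / 2.0],
                  [0.0,  f_eq, h_out / 2.0],
                  [0.0,  0.0,  1.0]])
    return K   # usable as the K input of calibrate_equivalent_gun_camera()
```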
According to the embodiment, the external parameters of the fish-eye camera can be accurately obtained by determining the internal parameters of the equivalent gun camera, and the accuracy of calibrating the external parameters of the fish-eye camera is improved.
As the implementation of the method, the application also provides a camera calibration device. Fig. 6 shows a schematic diagram of a camera calibration device. As shown in fig. 6, the apparatus includes:
the image projection module 610 is configured to perform projection transformation on the image of the mark points acquired by the fisheye camera to obtain a gun camera view angle image;
the equivalent gun camera calibration module 620 is configured to calibrate the external parameters of the equivalent gun camera corresponding to the gun camera view angle image according to the real positioning information of the mark points and the gun camera view angle image;
the fisheye camera calibration module 630 is configured to obtain the external parameters of the fisheye camera according to the external parameters of the equivalent gun camera.
The image projection module is used for projecting and transforming the image of the mark point acquired by the fish-eye camera based on the spherical projection transformation model, so as to obtain a gun camera view angle image with the same visual range as the image acquired by the target gun camera.
Illustratively, as shown in fig. 7, the camera calibration apparatus further includes:
the target bolt face calibration module 640 is used for calibrating external parameters of the target bolt face according to the real positioning information of the mark points and the images of the mark points acquired by the target bolt face.
Illustratively, as shown in FIG. 7, the image projection module 610 includes:
a traversing unit 611 for traversing the rotation angle within a predetermined range;
a first determining unit 612, configured to determine a corresponding spherical projective transformation model based on the traversed rotation angle;
a projection unit 613, configured to perform projective transformation on the image of the marker point acquired by the fisheye camera based on the spherical projective transformation model, to obtain a projected image corresponding to the spherical projective transformation model;
and the second determining unit 614 is configured to determine the projection image corresponding to the spherical projection transformation model as the gun camera view angle image in the case that the projection image corresponding to the spherical projection transformation model and the image acquired by the target gun camera have the same visible range and the display area of this visible range in the projection image meets a predetermined condition.
Illustratively, as shown in fig. 7, the fisheye camera calibration module 630 includes:
a third determining unit 631, configured to determine the rotation parameters between the equivalent gun camera and the fisheye camera according to the projection transformation parameters between the image of the mark points acquired by the fisheye camera and the gun camera view angle image;
and a fisheye calibration unit 632, configured to obtain the external parameters of the fisheye camera according to the rotation parameters and the external parameters of the equivalent gun camera.
Illustratively, as shown in FIG. 7, the equivalent gun camera calibration module 620 includes:
an internal reference calibration unit 621, configured to determine the internal parameters of the equivalent gun camera corresponding to the gun camera view angle image according to the projection transformation parameters between the image of the mark points acquired by the fisheye camera and the gun camera view angle image;
and an external parameter calibration unit 622, configured to calibrate the external parameters of the equivalent gun camera according to the real positioning information of the mark points, the gun camera view angle image and the internal parameters of the equivalent gun camera.
According to embodiments of the present application, there is also provided an electronic device, a readable storage medium and a computer program product.
As shown in fig. 8, a block diagram of an electronic device according to a camera calibration method according to an embodiment of the present application is shown. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the application described and/or claimed herein.
As shown in fig. 8, the electronic device includes: one or more processors 801, memory 802, and interfaces for connecting the components, including high-speed interfaces and low-speed interfaces. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executing within the electronic device, including instructions stored in or on memory to display graphical information of the GUI on an external input/output device, such as a display device coupled to the interface. In other embodiments, multiple processors and/or multiple buses may be used, if desired, along with multiple memories and multiple memories. Also, multiple electronic devices may be connected, each providing a portion of the necessary operations (e.g., as a server array, a set of blade servers, or a multiprocessor system). One processor 801 is illustrated in fig. 8.
Memory 802 is a non-transitory computer-readable storage medium provided herein. The memory stores instructions executable by the at least one processor to cause the at least one processor to perform the camera calibration methods provided herein. The non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to perform the camera calibration method provided by the present application.
The memory 802, which is a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as program instructions/modules corresponding to the camera calibration method in the embodiments of the present application (e.g., the image projection module 610, the equivalent gun camera calibration module 620, and the fisheye camera calibration module 630 shown in fig. 6). The processor 801 executes various functional applications of the server and data processing, i.e., implements the camera calibration method in the above-described method embodiments, by running non-transitory software programs, instructions, and modules stored in the memory 802.
Memory 802 may include a storage program area that may store an operating system, at least one application program required for functionality, and a storage data area; the storage data area may store data created according to the use of the electronic device of the camera calibration method, etc. In addition, memory 802 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, memory 802 may optionally include memory located remotely from processor 801, which may be connected to the electronics of the camera calibration method via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device of the camera calibration method may further include: an input device 803 and an output device 804. The processor 801, memory 802, input devices 803, and output devices 804 may be connected by a bus or other means, for example in fig. 8.
The input device 803 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device of the camera calibration method, such as a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointer stick, one or more mouse buttons, a track ball, a joystick, etc. The output device 804 may include a display apparatus, auxiliary lighting devices (e.g., LEDs), and haptic feedback devices (e.g., vibration motors), among others. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device may be a touch screen.
According to an embodiment of the application, the application further provides road side equipment. The apparatus may include:
one or more processors; and
and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the camera calibration method in the above method embodiments.
The functions and implementation of the processor and the storage device of the roadside apparatus may refer to the description about the processor and the memory in the above-mentioned electronic device embodiment.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application-specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor, that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computing programs (also referred to as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, and is a host product in a cloud computing service system, so as to solve the defects of high management difficulty and weak service expansibility in the traditional physical host and Virtual Private Server (VPS) service. The server may also be a server of a distributed system or a server that incorporates a blockchain.
According to the technical scheme of the embodiments of the application, the image of the mark points acquired by the fisheye camera is projected into a gun camera view angle image, so that the external parameter calibration of the fisheye camera is converted into the external parameter calibration of the equivalent gun camera, which reduces the complexity of solving for the external parameters of the fisheye camera and yields higher accuracy. In addition, a fisheye camera deployed on a road is often used to cover the blind area between the fields of view of the gun cameras on the road; converting the external parameter calibration of the fisheye camera into the external parameter calibration of an equivalent gun camera makes it convenient to calibrate the gun cameras and the fisheye camera on the road with mark points at the same positions, thereby reducing the number of mark points and reducing the cost of manual dotting and the road safety risk. Applied to an intelligent traffic perception system, the embodiments of the application can improve perception precision and robustness.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present application may be performed in parallel, sequentially, or in a different order, provided that the desired results of the technical solutions disclosed in the present application can be achieved, and are not limited herein.
The above embodiments do not limit the scope of the application. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present application are intended to be included within the scope of the present application.

Claims (11)

1. A camera calibration method, comprising:
based on a spherical projection transformation model, performing projection transformation on the image of the mark point acquired by the fish-eye camera to obtain a gun camera view angle image with the same visual range as the image acquired by the target gun camera;
calibrating external parameters of an equivalent gun camera corresponding to the gun camera view angle image according to the real positioning information of the mark points and the gun camera view angle image;
determining a rotation parameter between the equivalent gun camera and the fisheye camera according to projection transformation parameters between the image of the mark point and the gun camera visual angle image acquired by the fisheye camera;
and obtaining the external parameters of the fisheye camera according to the rotation parameters and the external parameters of the equivalent gun camera.
2. The method of claim 1, further comprising:
and calibrating external parameters of the target gun camera according to the real positioning information of the mark points and the images of the mark points acquired by the target gun camera.
3. The method of claim 1, wherein performing projection transformation on the image of the mark points acquired by the fisheye camera based on the spherical projection transformation model to obtain a gun camera view angle image having the same visual range as the image acquired by the target gun camera comprises:
traversing the rotation angle within a predetermined range;
determining a corresponding spherical projection transformation model based on the traversed rotation angle;
based on the spherical projection transformation model, performing projection transformation on the image of the mark point acquired by the fisheye camera to obtain a projection image corresponding to the spherical projection transformation model;
and determining the projection image corresponding to the spherical projection transformation model as the gun camera view angle image under the condition that the projection image corresponding to the spherical projection transformation model and the image acquired by the target gun camera have the same visual range and the visual range in the projection image meets a preset condition.
4. A method according to any one of claims 1 to 3, wherein calibrating the external parameters of the equivalent gun camera corresponding to the gun camera view angle image according to the real positioning information of the mark points and the gun camera view angle image comprises:
according to projection transformation parameters between the image of the mark point and the view angle image of the gun camera, which are acquired by the fisheye camera, determining an internal reference of an equivalent gun camera corresponding to the view angle image of the gun camera;
and calibrating the external parameters of the equivalent gun camera according to the real positioning information of the mark points, the gun camera view angle image and the internal parameters of the equivalent gun camera.
5. A camera calibration apparatus comprising:
the image projection module is used for carrying out projection transformation on the image of the mark points acquired by the fisheye camera to obtain a gun camera view angle image; the image projection module is used for carrying out projection transformation on the image of the mark points acquired by the fisheye camera based on a spherical projection transformation model to obtain a gun camera view angle image with the same visual range as the image acquired by the target gun camera;
the equivalent gun camera calibration module is used for calibrating the external parameters of the equivalent gun camera corresponding to the gun camera view angle image according to the real positioning information of the mark points and the gun camera view angle image;
the fisheye camera calibration module is used for obtaining the external parameters of the fisheye camera according to the external parameters of the equivalent gun camera; wherein, the fisheye camera calibration module includes:
the third determining unit is used for determining a rotation parameter between the equivalent gun camera and the fisheye camera according to a projection transformation parameter between the image of the mark point acquired by the fisheye camera and the gun camera view angle image;
and the fish-eye calibration unit is used for obtaining the external parameters of the fish-eye camera according to the rotation parameters and the external parameters of the equivalent gun camera.
6. The apparatus of claim 5, further comprising:
and the target gun camera calibration module is used for calibrating the external parameters of the target gun camera according to the real positioning information of the mark points and the images of the mark points acquired by the target gun camera.
7. The apparatus of claim 6, wherein the image projection module comprises:
a traversing unit for traversing the rotation angle within a predetermined range;
a first determining unit, configured to determine a corresponding spherical projective transformation model based on the traversed rotation angle;
the projection unit is used for carrying out projection transformation on the image of the mark point acquired by the fisheye camera based on the spherical projection transformation model to obtain a projection image corresponding to the spherical projection transformation model;
and the second determining unit is used for determining the projection image corresponding to the spherical projection transformation model as the gun camera view angle image under the condition that the projection image corresponding to the spherical projection transformation model and the image acquired by the target gun camera have the same visible range and the display area of the visible range in the projection image meets the predetermined condition.
8. The apparatus of any one of claims 5 to 7, wherein the equivalent gun camera calibration module comprises:
the internal reference calibration unit is used for determining the internal reference of the equivalent gun camera corresponding to the gun camera visual angle image according to projection transformation parameters between the mark point image acquired by the fisheye camera and the gun camera visual angle image;
and the external parameter calibration unit is used for calibrating the external parameters of the equivalent gun camera according to the real positioning information of the mark points, the gun camera view angle image and the internal parameters of the equivalent gun camera.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-4.
10. A non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the method of any one of claims 1-4.
11. A roadside apparatus, the apparatus comprising:
one or more processors; and
storage means for storing one or more programs which when executed by the one or more processors cause the one or more processors to implement the method of any of claims 1-4.
CN202011177580.3A 2020-10-29 2020-10-29 Camera calibration method, camera calibration device, electronic equipment, storage medium and road side equipment Active CN112288825B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011177580.3A CN112288825B (en) 2020-10-29 2020-10-29 Camera calibration method, camera calibration device, electronic equipment, storage medium and road side equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011177580.3A CN112288825B (en) 2020-10-29 2020-10-29 Camera calibration method, camera calibration device, electronic equipment, storage medium and road side equipment

Publications (2)

Publication Number Publication Date
CN112288825A CN112288825A (en) 2021-01-29
CN112288825B true CN112288825B (en) 2024-04-12

Family

ID=74373753

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011177580.3A Active CN112288825B (en) 2020-10-29 2020-10-29 Camera calibration method, camera calibration device, electronic equipment, storage medium and road side equipment

Country Status (1)

Country Link
CN (1) CN112288825B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112967344B (en) * 2021-03-09 2023-12-08 阿波罗智联(北京)科技有限公司 Method, device, storage medium and program product for calibrating camera external parameters
CN112991459B (en) * 2021-03-09 2023-12-12 阿波罗智联(北京)科技有限公司 Camera calibration method, device, equipment and storage medium
CN113592951A (en) * 2021-07-14 2021-11-02 阿波罗智联(北京)科技有限公司 Method and device for calibrating external parameters of vehicle-road cooperative middle-road side camera and electronic equipment
CN114742897B (en) * 2022-03-31 2023-02-28 阿波罗智联(北京)科技有限公司 Method, device and equipment for processing camera installation information of roadside sensing system
CN115018967B (en) * 2022-06-30 2024-05-03 联通智网科技股份有限公司 Image generation method, device, equipment and storage medium
CN115731525B (en) * 2022-11-21 2023-07-25 禾多科技(北京)有限公司 Lane line identification method, lane line identification device, electronic equipment and computer readable medium

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101025824A (en) * 2007-03-30 2007-08-29 东南大学 Calibrating method based on fixed parameters and variable parameters for three-dimensional scanning system
CN102096923A (en) * 2011-01-20 2011-06-15 上海杰图软件技术有限公司 Fisheye calibration method and device
CN103559707A (en) * 2013-10-30 2014-02-05 同济大学 Industrial fixed-focus camera parameter calibration method based on moving square target calibration object
CN107886547A (en) * 2017-11-10 2018-04-06 长沙全度影像科技有限公司 A kind of fisheye camera scaling method and system
CN108257183A (en) * 2017-12-20 2018-07-06 歌尔科技有限公司 A kind of camera lens axis calibrating method and device
CN108495085A (en) * 2018-03-14 2018-09-04 成都新舟锐视科技有限公司 A kind of ball machine automatic tracking control method and system based on moving target detection
WO2019227079A1 (en) * 2018-05-25 2019-11-28 Aquifi, Inc. Systems and methods for multi-camera placement
CN109003311A (en) * 2018-08-22 2018-12-14 上海庄生晓梦信息科技有限公司 A kind of fish-eye scaling method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Calibration of camera parameters based on the geometric invariance of straight lines; Fu Dan et al.; Journal of Image and Graphics (中国图象图形学报); Vol. 14, No. 06; pp. 1058-1063 *

Also Published As

Publication number Publication date
CN112288825A (en) 2021-01-29

Similar Documents

Publication Publication Date Title
CN112288825B (en) Camera calibration method, camera calibration device, electronic equipment, storage medium and road side equipment
CN112861653B (en) Method, system, equipment and storage medium for detecting fused image and point cloud information
JP7240367B2 (en) Methods, apparatus, electronic devices and storage media used for vehicle localization
CN111815719B (en) External parameter calibration method, device and equipment of image acquisition equipment and storage medium
CN112509057B (en) Camera external parameter calibration method, device, electronic equipment and computer readable medium
CN109270534B (en) Intelligent vehicle laser sensor and camera online calibration method
WO2020098316A1 (en) Visual point cloud-based semantic vector map building method, device, and electronic apparatus
CN111324115B (en) Obstacle position detection fusion method, obstacle position detection fusion device, electronic equipment and storage medium
CN110793544B (en) Method, device and equipment for calibrating parameters of roadside sensing sensor and storage medium
CN111220154A (en) Vehicle positioning method, device, equipment and medium
CN112598750B (en) Road side camera calibration method and device, electronic equipment and storage medium
CN112967344B (en) Method, device, storage medium and program product for calibrating camera external parameters
EP3968266B1 (en) Obstacle three-dimensional position acquisition method and apparatus for roadside computing device
WO2022078513A1 (en) Positioning method and apparatus, self-moving device, and storage medium
WO2022078488A1 (en) Positioning method and apparatus, self-moving device, and storage medium
CN111626206A (en) High-precision map construction method and device, electronic equipment and computer storage medium
CN114283201A (en) Camera calibration method and device and road side equipment
CN111324945B (en) Sensor scheme determining method, device, equipment and storage medium
CN110794844B (en) Automatic driving method, device, electronic equipment and readable storage medium
KR102694715B1 (en) Method for detecting obstacle, electronic device, roadside device and cloud control platform
CN112184914B (en) Method and device for determining three-dimensional position of target object and road side equipment
CN112344855B (en) Obstacle detection method and device, storage medium and drive test equipment
CN111721281B (en) Position identification method and device and electronic equipment
CN111523471A (en) Method, device and equipment for determining lane where vehicle is located and storage medium
WO2022217988A1 (en) Sensor configuration scheme determination method and apparatus, computer device, storage medium, and program

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20211013

Address after: 100176 Room 101, 1st floor, building 1, yard 7, Ruihe West 2nd Road, economic and Technological Development Zone, Daxing District, Beijing

Applicant after: Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd.

Address before: 2 / F, baidu building, 10 Shangdi 10th Street, Haidian District, Beijing 100085

Applicant before: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd.

GR01 Patent grant
GR01 Patent grant