CN112967345A - External parameter calibration method, device and system of fisheye camera - Google Patents


Info

Publication number
CN112967345A
Authority
CN
China
Prior art keywords
fisheye camera
image
focal length
perception information
determining
Legal status
Granted
Application number
CN202110254664.0A
Other languages
Chinese (zh)
Other versions
CN112967345B
Inventor
苑立彬
Current Assignee
Apollo Zhilian Beijing Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202110254664.0A
Publication of CN112967345A
Application granted
Publication of CN112967345B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 3/047
    • G06T 5/80

Abstract

The application discloses an external parameter calibration method, device and system for a fisheye camera, relating to artificial intelligence, automatic driving, intelligent transportation, vehicle-road cooperative roadside perception and computer vision within computer technology. The method comprises: performing distortion removal processing on an original image captured by the fisheye camera according to a target focal length to obtain an undistorted image, wherein the target focal length is the focal length corresponding to a preset image acquisition overlap region between the fisheye camera and the gun camera; and determining external parameters of the fisheye camera from the undistorted image. This avoids the situation in which, owing to various factors, images captured by the fisheye camera fail to cover all inner corner points of a checkerboard, which would otherwise leave a large number of captured images unusable.

Description

External parameter calibration method, device and system of fisheye camera
Technical Field
The application relates to artificial intelligence, automatic driving, intelligent transportation, vehicle-road cooperative roadside perception and computer vision within computer technology, and in particular to an external parameter calibration method, device and system for a fisheye camera.
Background
With the development of vehicle networking technology, Vehicle-to-Everything (V2X) roadside sensing systems provide beyond-line-of-sight perception information to vehicles in vehicle-road cooperation. The roadside camera is one of the most important sensors of a roadside sensing system and generally comprises a gun camera and a fisheye camera.
In the prior art, perception information is usually determined from images captured by the gun camera, while the fisheye camera is usually used only for monitoring, so its external parameters generally do not need to be calibrated. When the fisheye camera is applied to other scenarios, the commonly adopted external parameter calibration method is the checkerboard calibration method: a checkerboard is manufactured (the size of each square can be measured), the inner corner points of the checkerboard are extracted, images are captured by the fisheye camera, the pixels of the images are processed by erosion, dilation and similar operations, and the processed pixels and the inner corner points of the checkerboard are fed into a preset calibration function, thereby obtaining the external parameters of the fisheye camera.
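For illustration only, a minimal sketch of this conventional checkerboard approach is given below, using OpenCV's fisheye module. The board size, square length and file names are assumptions, and the erosion/dilation pre-processing mentioned above is omitted; this is not the calibration procedure claimed by the application.

    # Hypothetical sketch of the conventional checkerboard calibration (prior art).
    # Board size, square length and file names are assumed values.
    import cv2
    import numpy as np

    PATTERN = (9, 6)      # inner corners per row / column (assumed)
    SQUARE_M = 0.05       # measured edge length of one square, in metres (assumed)

    # 3-D template of the inner corner points on the board plane (z = 0)
    objp = np.zeros((1, PATTERN[0] * PATTERN[1], 3), np.float32)
    objp[0, :, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_M

    obj_points, img_points, image_size = [], [], None
    for path in ["board_000.jpg", "board_001.jpg"]:        # assumed file names
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, PATTERN)
        if not found:          # the image is discarded if any inner corner is missing,
            continue           # which is exactly the weakness criticised below
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]

    K, D = np.zeros((3, 3)), np.zeros((4, 1))
    rms, K, D, rvecs, tvecs = cv2.fisheye.calibrate(
        obj_points, img_points, image_size, K, D,
        flags=cv2.fisheye.CALIB_RECOMPUTE_EXTRINSIC | cv2.fisheye.CALIB_FIX_SKEW)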
However, calibrating the external parameters of the fisheye camera in this way discards every image in which not all inner corner points of the checkerboard can be detected, which leads to low reliability and high cost.
Disclosure of Invention
The application provides an external parameter calibration method, device and system for a fisheye camera to improve calibration reliability.
According to a first aspect of the present application, there is provided an external parameter calibration method for a fisheye camera, comprising:
performing distortion removal processing on an original image acquired by the fisheye camera according to a target focal length to obtain an undistorted image; wherein the target focal length is a focal length corresponding to a preset image acquisition overlap region between the fisheye camera and the gun camera;
and determining external parameters of the fisheye camera according to the undistorted image.
According to a second aspect of the present application, there is provided an external parameter calibration apparatus for a fisheye camera, comprising:
a first processing unit, configured to perform distortion removal processing on an original image acquired by the fisheye camera according to a target focal length to obtain an undistorted image; wherein the target focal length is a focal length corresponding to a preset image acquisition overlap region between the fisheye camera and the gun camera;
a first determining unit, configured to determine an external parameter of the fisheye camera according to the undistorted image.
According to a third aspect of the present application, there is provided an electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of the first aspect as described above.
According to a fourth aspect of the present application, there is provided a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method of the first aspect as described above.
According to a fifth aspect of the present application, there is provided a computer program product comprising: a computer program, stored in a readable storage medium, from which at least one processor of an electronic device can read the computer program, execution of the computer program by the at least one processor causing the electronic device to perform the method of the first aspect.
According to a sixth aspect of the present application, there is provided a roadside apparatus including the electronic apparatus as described in the third aspect above.
According to a seventh aspect of the present application, there is provided a cloud control platform including the electronic device according to the third aspect.
According to an eighth aspect of the present application, there is provided an external parameter calibration system for a fisheye camera, comprising: a fisheye camera, a gun camera, and the apparatus of the second aspect as described above; wherein
the fisheye camera and the gun camera are used for determining the target focal length.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present application, nor do they limit the scope of the present application. Other features of the present application will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
FIG. 1 is a schematic diagram according to a first embodiment of the present application;
FIG. 2 is a schematic diagram according to a second embodiment of the present application;
FIG. 3 is a schematic diagram of an original image captured by a fisheye camera;
FIG. 4 is a schematic diagram of an original image being subjected to a de-distortion process, resulting in a de-distorted image;
FIG. 5 is a schematic illustration according to a third embodiment of the present application;
FIG. 6 is a schematic illustration according to a fourth embodiment of the present application;
FIG. 7 is a block diagram of an electronic device used to implement embodiments of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments to aid understanding, and these details are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted from the following description for clarity and conciseness.
In order to improve the safety and reliability of vehicle driving, the roadside sensing system may provide the vehicle with sensing information, which may include information related to obstacles (such as vehicles, pedestrians, etc.), such as position information of the obstacles, etc.
The roadside camera is one of the most important sensors of the roadside sensing system. A roadside camera installation may generally include gun cameras and a fisheye camera, and the gun cameras may include a front-view gun camera and a rear-view gun camera. Generally, a monitoring pole (i.e., a vertical pole) can be erected at an intersection (e.g., a crossroads or a T-shaped intersection), and a fisheye camera, a rear-view gun camera and a front-view gun camera can be mounted on the monitoring pole at the same time.
In the related art, the fisheye camera is usually used for monitoring, so its external parameters generally do not need to be calibrated. When the fisheye camera is applied to other scenarios, the calibration method adopted is the checkerboard calibration method: the inner corner points of a checkerboard are detected, and the external parameters of the fisheye camera are obtained after calculation.
However, owing to various factors, images shot by the fisheye camera often do not capture all inner corner points of the checkerboard, so a large number of the captured images cannot be used and only a few remain available, leading to technical problems such as a high image rejection rate, low calibration efficiency, high cost and low reliability.
To avoid at least one of the above technical problems, the inventors arrived at the following inventive concept: take the focal length corresponding to a preset image acquisition overlap region between the fisheye camera and the gun camera as the target focal length, perform distortion removal processing on the original image acquired by the fisheye camera based on the target focal length to obtain an undistorted image, and generate the external parameters of the fisheye camera based on the undistorted image.
Based on this inventive concept, the application provides an external parameter calibration method and device for a fisheye camera and roadside equipment, applied to artificial intelligence, automatic driving, intelligent transportation, vehicle-road cooperative roadside perception and computer vision within computer technology, so as to save calibration cost and improve calibration accuracy.
Fig. 1 is a schematic diagram according to a first embodiment of the present application. As shown in Fig. 1, the method comprises:
S101: performing distortion removal processing on the original image acquired by the fisheye camera according to the target focal length to obtain an undistorted image.
The target focal length is a focal length corresponding to a preset image acquisition overlap region between the fisheye camera and the gun camera.
For example, the execution subject of this embodiment may be an external parameter calibration device of a fisheye camera (hereinafter referred to as the external parameter calibration device). The external parameter calibration device may be a server (including a local server or a cloud server, where the server may be a cloud control platform, a vehicle-road cooperative management platform, a central subsystem, an edge computing platform, a cloud computing platform, or the like), a roadside device, a terminal device, a processor, a chip, or the like; this embodiment is not limited in this respect.
In a system architecture of intelligent transportation vehicle-road cooperation, the road side equipment comprises road side sensing equipment with a computing function and road side computing equipment connected with the road side sensing equipment, the road side sensing equipment (such as a road side camera) is connected to the road side computing equipment (such as a Road Side Computing Unit (RSCU)), the road side computing equipment is connected to a server, and the server can communicate with an automatic driving vehicle or an auxiliary driving vehicle in various modes; or the roadside sensing device comprises a calculation function, and the roadside sensing device is directly connected to the server. The above connections may be wired or wireless.
It should be noted that this embodiment introduces the feature of taking the focal length corresponding to a preset image acquisition overlap region between the fisheye camera and the gun camera as the target focal length, and performing distortion removal processing on the original image captured by the fisheye camera based on that target focal length. With this feature, an overlap region exists between the undistorted image and the image captured by the gun camera. In particular, when the overlap region is relatively large, the range covered by the undistorted image is also relatively large, so that when the external parameters of the fisheye camera are subsequently determined from the undistorted image, comprehensive coverage of the undistorted image can be better ensured, improving the accuracy and reliability of the determined external parameters. Compared with determining the external parameters of the fisheye camera by the checkerboard calibration method of the related art, this avoids the problem that images shot by the fisheye camera cannot capture all inner corner points of the checkerboard, which would otherwise leave a large number of the captured images unusable.
S102: determining external parameters of the fisheye camera according to the undistorted image.
In one example, the external parameter calibration device may determine the external parameter of the fisheye camera according to the undistorted image by means of manual labeling.
For example, the external parameter calibration device may manually perform an operation of acquiring world coordinates in a world coordinate system based on pixel points in the undistorted image, and perform coordinate conversion calculation of different coordinate systems (i.e., the world coordinate system and the image coordinate system) based on the undistorted image and the acquired world coordinates, thereby obtaining the external parameters of the fisheye camera.
In another example, the external parameter calibration device can determine the external parameters of the fisheye camera in a manner assisted by historical images.
For example, the external parameter calibration device may acquire a historical image captured by another camera, determine the world coordinates of at least one pixel in the historical image according to the parameters (such as the external parameters) of that camera and the historical image, determine the pixel coordinates of the corresponding at least one pixel in the undistorted image, and perform coordinate conversion between the different coordinate systems (i.e., the world coordinate system and the image coordinate system) based on the pixel coordinates and the determined world coordinates, thereby obtaining the external parameters of the fisheye camera.
The installation position of the other camera can be the same as that of the fisheye camera, so that the historical images acquired by the other camera and the undistorted image cover the same objects.
It should be noted that the above examples are only exemplary and are not to be construed as limiting the manner in which the external parameters of the fisheye camera may be determined from the undistorted image in this embodiment.
Based on the above analysis, an embodiment of the present application provides an external parameter calibration method for a fisheye camera, comprising: performing distortion removal processing on an original image acquired by the fisheye camera according to a target focal length to obtain an undistorted image, where the target focal length is the focal length corresponding to a preset image acquisition overlap region between the fisheye camera and the gun camera, and determining the external parameters of the fisheye camera from the undistorted image. Because the distortion removal is based on the focal length corresponding to the preset overlap region, the problems of the related art are avoided, namely that images acquired by the fisheye camera fail to capture all inner corner points of the checkerboard, so that a large number of the images cannot be used and only a few are available, resulting in a high image rejection rate and low calibration efficiency. Relatively speaking, the range of the undistorted image determined by this embodiment is large, and the blind zone between the undistorted image and the image acquired by the gun camera is reduced or even eliminated, so that when the external parameters of the fisheye camera are subsequently determined from the undistorted image, comprehensive coverage of the undistorted image can be better ensured, thereby improving the accuracy and reliability of the determined external parameters.
Fig. 2 is a schematic diagram according to a second embodiment of the present application. As shown in Fig. 2, the method comprises:
S201: determining the image acquisition area of the fisheye camera at each focal length according to the field angle of the fisheye camera at each focal length.
Illustratively, the relationship between the focal length and the field angle may be determined from equation 1:
θ=2*atan((L/2)/f)
where θ is the field angle, L is the diagonal length of the image sensor (film) of the fisheye camera, and f is the focal length.
This embodiment can be understood as follows: different focal lengths correspond to different field angles, and the coverage areas (i.e., acquisition areas) of the images acquired by the fisheye camera under those field angles differ accordingly. For a given focal length, the external parameter calibration device can determine the corresponding field angle from equation 1 and thereby determine the image acquisition area of the fisheye camera at that field angle.
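As a small illustration of equation 1, the helper below computes the field angle for a given focal length; the 6.46 mm sensor diagonal used in the example calls is an assumed value for a small surveillance sensor, not a figure from the application.

    # Illustration of equation 1: field angle (degrees) from focal length and sensor
    # diagonal. The diagonal value is an assumption.
    import math

    def field_angle_deg(diagonal_mm: float, focal_mm: float) -> float:
        return math.degrees(2.0 * math.atan((diagonal_mm / 2.0) / focal_mm))

    print(field_angle_deg(6.46, 1.4))   # short focal length -> very wide field angle
    print(field_angle_deg(6.46, 8.0))   # longer focal length -> much narrower field angle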
In some embodiments, S201 may include: determining an image acquisition area of the fisheye camera at each focal length according to the field angle of the fisheye camera at each focal length and the acquired height information of the monitoring pole on which the fisheye camera is located.
For example, for a given focal length, the external parameter calibration device may determine the field angle of the fisheye camera at that focal length and the corresponding image obtained at that field angle. Since the calibration device can also obtain the height information of the monitoring pole on which the fisheye camera is mounted, it can determine, from the image obtained at that field angle and the pole height, the distance between the left edge of that image and the monitoring pole and the distance between the right edge of that image and the monitoring pole, thereby obtaining the image acquisition area at that focal length.
The calculation of the distances between the left and right edges of the image obtained at the field angle and the monitoring pole follows the imaging principle of the camera and is not detailed here.
It should be noted that, in this embodiment, determining the image acquisition area of the fisheye camera at each focal length by combining the field angle with the height information of the monitoring pole improves the accuracy, reliability and convenience of determining the image acquisition area.
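The sketch below shows one hedged way such an acquisition area could be computed; it assumes the simplest geometry, with the camera at the top of the monitoring pole and its optical axis pointing straight down, which is an assumption made only for this illustration.

    # Hedged sketch for S201: ground strip covered by a downward-looking camera
    # mounted at the top of the pole (simplifying assumption).
    import math

    def acquisition_halfwidth_m(pole_height_m: float, field_angle_deg: float) -> float:
        """Distance from the pole to the left/right edge of the imaged ground area."""
        return pole_height_m * math.tan(math.radians(field_angle_deg) / 2.0)

    # e.g. an assumed 6 m pole and a 90-degree field angle cover about +/- 6 m of ground
    print(acquisition_halfwidth_m(6.0, 90.0))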
S202: determining the image acquisition area of the gun camera at each focal length according to the field angle of the gun camera at each focal length.
Similarly, in this step, the image acquisition area of the gun camera at each focal length may be determined based on equation 1; the implementation principle is the same as that of S201 and is not repeated here.
In some embodiments, S202 may include: determining an image acquisition area of the gun camera at each focal length according to the field angle of the gun camera at each focal length and the acquired height information of the monitoring pole.
Similarly, determining the image acquisition area of the gun camera at each focal length from its field angle at each focal length and the height information of the monitoring pole follows the same principle described above for the fisheye camera and is not repeated here. Determining the image acquisition area of the gun camera at each focal length by combining the field angle with the height information of the monitoring pole likewise improves the accuracy, reliability and convenience of determining the image acquisition area.
S203: determining the preset image acquisition overlap region according to the image acquisition area of the fisheye camera at each focal length and the image acquisition area of the gun camera at each focal length.
For example, for a given focal length, the external parameter calibration device can determine the image acquisition region corresponding to the fisheye camera and the image acquisition region corresponding to the gun camera, and determine the common region of the two; this common region is the overlap region of the fisheye camera image and the gun camera image at that focal length.
In one example, the external parameter calibration device may preset a focal length set containing a plurality of focal lengths. It may select one focal length from the set at a time, determine the image corresponding to the fisheye camera and the image corresponding to the gun camera at the selected focal length, and determine the overlap region of the two images; if the area of this overlap region is greater than a preset area, the overlap region is taken as the preset image acquisition overlap region and the selected focal length is accordingly taken as the target focal length.
In another example, the external parameter calibration device may preset a plurality of focal lengths, compute for each focal length the image corresponding to the fisheye camera and the image corresponding to the gun camera, and determine the overlap region of the two images. Once the overlap regions for all focal lengths have been determined, the one with the largest area is taken as the preset image acquisition overlap region, and the focal length corresponding to it is accordingly taken as the target focal length.
It should be noted that the above examples are only illustrative ways of determining the preset image acquisition overlap region and are not to be construed as limiting how it may be determined.
It should be noted that, in this embodiment, the image acquisition areas of the fisheye camera and the gun camera are determined from the field angle at each focal length, and the preset image acquisition overlap region is determined from those areas, which improves the accuracy and reliability of determining the preset image acquisition overlap region.
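The following sketch illustrates the second example above, i.e., picking the focal length whose overlap is largest. Representing each acquisition area as a one-dimensional ground interval, and the fisheye_area/gun_area callables, are simplifying assumptions made only for this sketch.

    # Illustrative sketch: choose the target focal length by maximum overlap.
    # The 1-D interval model and the two area functions are assumptions.
    def overlap_length(a, b):
        """Length of the overlap of two ground intervals a = (a0, a1), b = (b0, b1)."""
        return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

    def pick_target_focal_length(candidate_f, fisheye_area, gun_area):
        """fisheye_area / gun_area map a focal length to a ground interval (metres)."""
        best_f, best_overlap = None, -1.0
        for f in candidate_f:
            o = overlap_length(fisheye_area(f), gun_area(f))
            if o > best_overlap:
                best_f, best_overlap = f, o
        return best_f, best_overlap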
S204: performing distortion removal processing on the original image acquired by the fisheye camera according to the target focal length to obtain an undistorted image.
The target focal length is a focal length corresponding to the preset image acquisition overlap region between the fisheye camera and the gun camera.
For the description of S204, reference may be made to S101, which is not repeated here.
The original image can be seen in Fig. 3.
In some embodiments, S204 may include: acquiring a calibration plate image captured by the fisheye camera based on the target focal length, projecting the calibration plate image onto a spherical surface to obtain a spherical image, and performing distortion removal processing on the spherical image to obtain the undistorted image.
For example, there may be one calibration plate image or several; when there are several, they may be averaged to obtain the calibration plate image that is projected onto the spherical surface to obtain the spherical image. Obtaining the spherical image from a plurality of calibration plate images reduces the error of the spherical image, so the distortion removal processing is more accurate and reliable.
In some embodiments, performing distortion removal processing on the spherical image to obtain the undistorted image may include: determining a conversion relation between the spherical image and a square image, and converting each pixel point of the spherical image based on that relation to obtain the position of each pixel point in the square image, thereby obtaining the undistorted image. The sphere corresponding to the spherical image is the circumscribed or inscribed circle of the square image.
The undistorted image can be seen in Fig. 4.
That is, by the scheme of S204, the undistorted image shown in Fig. 4 can be obtained from the distorted image shown in Fig. 3.
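For readers who want something runnable, the sketch below performs the distortion removal with OpenCV's equidistant fisheye model instead of the spherical projection described above; this is a swapped-in, commonly available approximation, not the application's own procedure. It assumes the intrinsic parameters K and D are already known and simply writes the target focal length into the new pinhole camera matrix used for the undistorted output.

    # Hedged alternative to the spherical-projection step, using OpenCV's fisheye model.
    # K and D are assumed to come from an earlier intrinsic calibration.
    import cv2
    import numpy as np

    def undistort_with_target_focal(raw, K, D, f_target, out_size):
        w, h = out_size
        K_new = np.array([[f_target, 0.0, w / 2.0],
                          [0.0, f_target, h / 2.0],
                          [0.0, 0.0, 1.0]])
        map1, map2 = cv2.fisheye.initUndistortRectifyMap(
            K, D, np.eye(3), K_new, (w, h), cv2.CV_16SC2)
        return cv2.remap(raw, map1, map2, interpolation=cv2.INTER_LINEAR)

    # raw = cv2.imread("fisheye_frame.jpg")   # assumed input path
    # undist = undistort_with_target_focal(raw, K, D, f_target=450.0, out_size=(1920, 1080))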
S205: determining external parameters of the fisheye camera according to the undistorted image.
For the description of S205, reference may be made to S102, which is not repeated here.
In some embodiments, the undistorted image is a two-dimensional image, and S205 may include the steps of:
s2051: and acquiring at least one pixel point in the undistorted image, a two-dimensional coordinate in the undistorted image and a three-dimensional coordinate in a world coordinate system.
It should be understood that the undistorted image includes a plurality of pixel points, each pixel point has a two-dimensional coordinate in the undistorted image, and the two-dimensional coordinate may also be understood as an image coordinate of the pixel point in an image coordinate system.
The world coordinate system may also be understood as a ground coordinate system (e.g., a world geodetic coordinate system), or as a physical coordinate system, i.e., the coordinate system in which real objects such as the ground are located.
In this embodiment, for any pixel point in the undistorted image, the external parameter calibration device may determine its two-dimensional coordinates in the image coordinate system (i.e., its coordinates in the undistorted image) and its three-dimensional coordinates in the world coordinate system (i.e., its coordinates in the physical, ground-based coordinate system, such as Global Positioning System (GPS) coordinates).
In some embodiments, the three-dimensional coordinates of a certain pixel point in the world coordinate system may be three-dimensional coordinates of a ground point corresponding to the pixel point on the ground.
For example, the GPS coordinate of any pixel point may be determined in advance by a high-precision map or a real-time kinematic (RTK) device dotting manner, which is not limited in this embodiment.
In this embodiment, the number of pixel points is not limited and may be one or more. When multiple pixel points are used, they may be selected randomly, or pixel points uniformly distributed across the undistorted image may be selected.
It should be noted that selecting multiple pixel points improves the accuracy and reliability of the subsequent determination of the external parameters from the two-dimensional and three-dimensional coordinates; in particular, when the pixel points are uniformly distributed across the undistorted image, the influence of points in every region of the image on the determined external parameters is fully taken into account, further improving the accuracy and reliability of the determined external parameters.
S2052: determining external parameters of the fisheye camera according to the two-dimensional coordinates and the three-dimensional coordinates.
The external parameters of the fisheye camera include the rotation matrix and the translation vector of the fisheye camera in the world coordinate system.
For example, the external parameter calibration device may apply the coordinate conversion principle between the image coordinate system and the world coordinate system from the related art to the two-dimensional and three-dimensional coordinates, so as to obtain the rotation matrix and translation vector of the fisheye camera in the world coordinate system, i.e., the external parameters of the fisheye camera.
In some embodiments, the external parameters of the fisheye camera can be obtained by performing a Perspective-n-Point (PnP) calculation on the two-dimensional coordinates and the three-dimensional coordinates.
It should be noted that, in this embodiment, determining for at least one pixel point of the undistorted image both its two-dimensional coordinates in the undistorted image and its three-dimensional coordinates in the world coordinate system, and determining the external parameters of the fisheye camera from those coordinates, avoids the low reliability and high cost of the related art caused by discarding images in which not all inner corner points of the checkerboard can be detected, improves the accuracy and reliability of the determined external parameters, and reduces the cost of acquiring images.
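A hedged sketch of S2051/S2052 follows: a few manually picked pixel points and their surveyed world coordinates (e.g. from RTK dotting) are fed to a Perspective-n-Point solver. Every numeric value and the camera matrix below are placeholders, not data from the application.

    # Hedged PnP sketch; all coordinate values are placeholders.
    import cv2
    import numpy as np

    pts_2d = np.array([[412.0, 388.0], [955.0, 401.0], [1480.0, 395.0],
                       [430.0, 840.0], [960.0, 852.0], [1475.0, 845.0]])        # pixels
    pts_3d = np.array([[10.0, -3.5, 0.0], [10.0, 0.0, 0.0], [10.0, 3.5, 0.0],
                       [4.0, -3.5, 0.0], [4.0, 0.0, 0.0], [4.0, 3.5, 0.0]])     # metres

    K_new = np.array([[450.0, 0.0, 960.0],   # pinhole matrix of the undistorted image
                      [0.0, 450.0, 540.0],
                      [0.0, 0.0, 1.0]])

    ok, rvec, tvec = cv2.solvePnP(pts_3d, pts_2d, K_new, None)
    R, _ = cv2.Rodrigues(rvec)   # R and tvec are the external parameters:
                                 # world-to-camera rotation matrix and translation vector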
S206: processing the two-dimensional perception information of the target object in the image acquired by the fisheye camera according to the external parameters of the fisheye camera to obtain first perception information of the target object.
The first perception information is three-dimensional perception information. The three-dimensional perception information may include location information and the like, such as world coordinates of the target object.
The execution subject of S206 to S208 may be the same as or different from that of the above-described embodiment. For example:
the execution subject of the above embodiment may be a roadside unit or a server connected to the roadside unit, and accordingly the execution subject of this embodiment may be the same roadside unit or the same server connected to the roadside unit; or
the execution subject of the above embodiment may be a roadside unit, and the execution subject of this embodiment may be a server connected to the roadside unit; or
the execution subject of the above embodiment may be a server connected to the roadside unit, and the execution subject of this embodiment may be the roadside unit; or
the execution subject of the above embodiments may be a roadside unit or a server connected to the roadside unit, and the execution subject of this embodiment may be a vehicle.
For example, if the execution subject of S201 to S205 is the roadside unit or a server connected to the roadside unit, then after the roadside unit or the server determines the external parameters of the fisheye camera, the image captured by the fisheye camera and the external parameters may be transmitted to a vehicle connected to the roadside unit or to the server, and the vehicle may execute S206 to S208 based on the external parameters of the fisheye camera.
Taking the execution subject of this step as the vehicle and the execution subject of S201 to S205 as the roadside unit, this embodiment can be understood as follows: the vehicle receives, from the roadside unit, an image collected by the fisheye camera and the external parameters of the fisheye camera; the image contains a target object; the vehicle determines two-dimensional perception information of the target object and processes it according to the external parameters of the fisheye camera to obtain the first perception information of the target object, which is three-dimensional perception information.
The target object may be determined by the vehicle from the image captured by the fisheye camera based on driving demands and the like, and may include other vehicles, pedestrians and the like, which is not limited in this embodiment.
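One possible (assumed) realisation of S206 is sketched below: if the target object is taken to stand on the ground plane z = 0, the calibrated external parameters reduce the image-to-world mapping to a plane homography.

    # Hedged sketch: lift a 2-D detection to a 3-D world point via the ground plane.
    import numpy as np

    def pixel_to_ground(u, v, K, R, t):
        """R, t map world coordinates to camera coordinates; returns (X, Y, 0)."""
        H = K @ np.column_stack((R[:, 0], R[:, 1], np.ravel(t)))   # ground-plane homography
        w = np.linalg.inv(H) @ np.array([u, v, 1.0])
        return np.array([w[0] / w[2], w[1] / w[2], 0.0])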
S207: processing the two-dimensional perception information of the target object in the image acquired by the gun camera according to preset external parameters of the gun camera to obtain second perception information of the target object.
The second perception information is three-dimensional perception information.
Similarly, in this embodiment, the vehicle may receive the external parameters of the gun camera sent by the roadside unit and process the two-dimensional perception information of the target object based on them to obtain the second perception information of the target object, which is three-dimensional perception information.
S208: performing fusion processing on the first perception information and the second perception information to obtain fused three-dimensional perception information of the target object.
It should be noted that, in this embodiment, the first perception information and the second perception information are determined separately and then fused to obtain the fused three-dimensional perception information of the target object, and the first perception information is determined based on the external parameters of the fisheye camera. In other words, the three-dimensional perception information of the target object is determined jointly by the fisheye camera and the gun camera, which avoids the low accuracy caused by blind areas and the like when the three-dimensional perception information is determined from the gun camera alone in the related art, thereby improving the accuracy and reliability of determining the three-dimensional perception information of the target object.
In some embodiments, the first perception information and the second perception information are world coordinate information of the target object, and S208 includes: performing weighting processing on the first perception information and the second perception information to obtain the fused three-dimensional perception information of the target object.
For example, different weight coefficients may be assigned to the gun camera and the fisheye camera; preferably, the weight coefficient of the gun camera is greater than that of the fisheye camera, so as to reduce problems such as information loss caused by the distortion removal processing, thereby improving the accuracy and reliability of the finally determined three-dimensional perception information.
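A minimal sketch of such a weighted fusion is given below; the weight values are illustrative only, with the gun camera weighted higher as suggested above.

    # Hedged sketch of the weighted fusion in S208; the weights are assumed values.
    import numpy as np

    def fuse(first_xyz, second_xyz, w_fisheye=0.4, w_gun=0.6):
        """first_xyz: world coordinates from the fisheye camera; second_xyz: from the gun camera."""
        first, second = np.asarray(first_xyz), np.asarray(second_xyz)
        return (w_fisheye * first + w_gun * second) / (w_fisheye + w_gun)

    print(fuse([12.1, 3.4, 0.0], [11.9, 3.6, 0.0]))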
Based on the above analysis, in some embodiments, the execution subject of S206 to S208 may be the same as that of S201 to S205, for example both are the roadside unit, and after S208 the method may, in one example, further include: the roadside unit sends the fused three-dimensional perception information of the target object to the vehicle, where the fused three-dimensional perception information is used to generate a driving strategy for adjusting or maintaining the driving information of the vehicle.
For example, the roadside unit sends the fused three-dimensional perception information of the target object to the vehicle; the vehicle determines the distance between the target object and itself according to the fused three-dimensional perception information and its current driving information, such as positioning information and speed, and can decide, based on the speed, whether to decelerate or detour, thereby obtaining a driving strategy for adjusting or maintaining the driving information of the vehicle.
In another example, the roadside unit acquires the driving information of the vehicle and, based on that driving information and the fused three-dimensional perception information of the target object, generates and outputs (e.g., transmits) a driving strategy for instructing the vehicle to adjust or maintain its driving information.
In some embodiments, the execution subject of S206 to S208 may be different from that of S201 to S205; for example, the execution subject of S201 to S205 is the roadside unit and the execution subject of S206 to S208 is the vehicle, and after S208 the method may further include: the vehicle generates a driving strategy for adjusting or maintaining its driving information based on its driving information and the fused three-dimensional perception information of the target object.
It should be noted that, in this embodiment, generating the driving strategy for adjusting or maintaining the driving information of the vehicle from the driving information and the fused three-dimensional perception information of the target object improves the safety and reliability of vehicle driving.
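Purely as an illustration of this step, the sketch below derives a decision from the fused world coordinates and the vehicle's own state; the time-gap threshold and the decision rule are assumptions, not taken from the application.

    # Illustrative driving-strategy sketch; threshold and rule are assumptions.
    import math

    def driving_strategy(ego_xy, ego_speed_mps, obstacle_xy, safe_gap_s=3.0):
        dist_m = math.dist(ego_xy, obstacle_xy)
        if ego_speed_mps > 0 and dist_m / ego_speed_mps < safe_gap_s:
            return "decelerate or detour"
        return "maintain current driving information"

    print(driving_strategy((0.0, 0.0), 15.0, (30.0, 1.5)))   # roughly a 2 s gap -> decelerate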
Fig. 5 is a schematic diagram of an external parameter calibration apparatus 500 of a fisheye camera according to a third embodiment of the present application. As shown in Fig. 5, the apparatus includes:
The first processing unit 501 is configured to perform distortion removal processing on an original image acquired by the fisheye camera according to a target focal length to obtain an undistorted image; the target focal length is a focal length corresponding to a preset image acquisition overlap region between the fisheye camera and the gun camera.
The first determining unit 502 is configured to determine external parameters of the fisheye camera from the undistorted image.
Fig. 6 is a schematic diagram of an external parameter calibration apparatus 600 of a fisheye camera according to a fourth embodiment of the present application. As shown in Fig. 6, the apparatus includes:
The second determining unit 601 is configured to determine an image acquisition area of the fisheye camera at each focal length according to the field angle of the fisheye camera at each focal length, and to determine an image acquisition area of the gun camera at each focal length according to the field angle of the gun camera at each focal length.
The third determining unit 602 is configured to determine a preset image acquisition overlap region according to the image acquisition area of the fisheye camera at each focal length and the image acquisition area of the gun camera at each focal length.
As can be seen in fig. 6, in some embodiments, the second determining unit 601 includes:
the first determining subunit 6011 is configured to determine an image acquisition area of the fisheye camera at each focal length according to the field angle of the fisheye camera at each focal length and the acquired height information of the monitoring rod where the fisheye camera is located.
The second determining subunit 6012 is configured to determine an image acquisition area of the bolt machine at each focal length according to the field angle of the bolt machine at each focal length and the height information of the monitoring rod.
The first processing unit 603 is configured to perform distortion removal processing on an original image acquired by the fisheye camera according to the target focal length to obtain a distortion-removed image; and the target focal length is a focal length corresponding to a preset image acquisition overlapping region between the fisheye camera and the gunlock.
As can be seen in fig. 6, in some embodiments, the first processing unit 603 includes:
and a second acquisition sub-unit 6031 configured to acquire a calibration plate image acquired by the fisheye camera based on the target focal length.
And a projection subunit 6032, configured to project the calibration board image onto a spherical surface, so as to obtain a spherical image.
And a processing subunit 6033, configured to perform distortion removal processing on the spherical image to obtain a distortion-removed image.
A first determining unit 604 for determining an external parameter of the fisheye camera from the undistorted image.
As can be appreciated in conjunction with fig. 6, in some embodiments, the undistorted image is a two-dimensional image; the first determination unit 604 includes:
a first obtaining subunit 6041, configured to obtain at least one pixel point in the undistorted image, a two-dimensional coordinate in the undistorted image, and a three-dimensional coordinate in the world coordinate system.
A third determining subunit 6042, configured to determine the external parameters of the fisheye camera according to the two-dimensional coordinates and the three-dimensional coordinates.
In some embodiments, the external parameters of the fisheye camera include the rotation matrix and the translation vector of the fisheye camera in the world coordinate system.
In some embodiments, the number of the pixels is multiple, and the multiple pixels are multiple pixels evenly distributed in the undistorted image.
The second processing unit 605 is configured to process the two-dimensional perception information of the target object in the image acquired by the fisheye camera according to the external parameter of the fisheye camera, to obtain first perception information of the target object, where the first perception information is three-dimensional perception information.
The third processing unit 606 is configured to process the two-dimensional perception information of the target object in the image acquired by the gun camera according to preset external parameters of the gun camera to obtain second perception information of the target object, where the second perception information is three-dimensional perception information.
And the fusion unit 607 is configured to perform fusion processing on the first perception information and the second perception information to obtain fused three-dimensional perception information of the target object.
In some embodiments, the first perception information and the second perception information are world coordinate information of the target object; the fusion unit 607 is configured to perform weighting processing on the first perception information and the second perception information to obtain fused three-dimensional perception information of the target object.
An obtaining unit 608 is configured to obtain the driving information of the vehicle.
A generating unit 609 is configured to generate a driving strategy for adjusting or maintaining the driving information of the vehicle according to the driving information and the fused three-dimensional perception information of the target object.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
There is also provided, in accordance with an embodiment of the present application, a computer program product, including: a computer program, stored in a readable storage medium, from which at least one processor of the electronic device can read the computer program, the at least one processor executing the computer program causing the electronic device to perform the solution provided by any of the embodiments described above.
FIG. 7 illustrates a schematic block diagram of an example electronic device 700 that can be used to implement embodiments of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 7, the electronic device 700 includes a computing unit 701, which may perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM)702 or a computer program loaded from a storage unit 708 into a Random Access Memory (RAM) 703. In the RAM 703, various programs and data required for the operation of the device 700 can also be stored. The computing unit 701, the ROM 702, and the RAM 703 are connected to each other by a bus 704. An input/output (I/O) interface 705 is also connected to bus 704.
Various components in the device 700 are connected to the I/O interface 705, including: an input unit 706 such as a keyboard, a mouse, or the like; an output unit 707 such as various types of displays, speakers, and the like; a storage unit 708 such as a magnetic disk, optical disk, or the like; and a communication unit 709 such as a network card, modem, wireless communication transceiver, etc. The communication unit 709 allows the device 700 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The computing unit 701 may be any of various general purpose and/or special purpose processing components with processing and computing capabilities. Some examples of the computing unit 701 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various specialized artificial intelligence (AI) computing chips, various computing units running machine learning model algorithms, a digital signal processor (DSP), and any suitable processor, controller, microcontroller, and so forth. The computing unit 701 performs the respective methods and processes described above, such as the external parameter calibration method of the fisheye camera. For example, in some embodiments, the external parameter calibration method of a fisheye camera may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 708. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 700 via the ROM 702 and/or the communication unit 709. When the computer program is loaded into the RAM 703 and executed by the computing unit 701, one or more steps of the external parameter calibration method for a fisheye camera described above may be performed. Alternatively, in other embodiments, the computing unit 701 may be configured by any other suitable means (e.g., by means of firmware) to perform the external parameter calibration method of the fisheye camera.
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on a chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or a cloud host, which is a host product in a cloud computing service system and overcomes the defects of high management difficulty and weak service expansibility of traditional physical hosts and Virtual Private Server (VPS) services. The server may also be a server of a distributed system, or a server incorporating a blockchain.
According to another aspect of the embodiments of the present application, there is also provided a roadside device including the electronic device according to the above embodiments.
According to another aspect of the embodiment of the present application, an embodiment of the present application further provides a cloud control platform, including the electronic device according to the above embodiment.
According to another aspect of the embodiments of the present application, there is also provided an external parameter calibration system of a fisheye camera, including: the external parameter calibration device of the fisheye camera according to the above embodiments, a bullet camera, and the fisheye camera; wherein
the fisheye camera and the bullet camera are used to determine the target focal length.
It should be understood that the various forms of flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in a different order, which is not limited herein as long as the desired results of the technical solutions disclosed in the present application can be achieved.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (26)

1. An external parameter calibration method of a fisheye camera comprises the following steps:
according to a target focal length, carrying out distortion removal processing on an original image acquired by the fisheye camera to obtain an undistorted image, wherein the target focal length is a focal length corresponding to a preset image acquisition overlapping area between the fisheye camera and a bullet camera;
and determining external parameters of the fisheye camera according to the undistorted image.
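A minimal sketch of the de-distortion step in claim 1, assuming OpenCV's fisheye (equidistant) model; the function name, the 4-coefficient distortion vector D, and the centred principal point are illustrative assumptions rather than anything specified by the claim:

```python
import cv2
import numpy as np

def undistort_to_target_focal(original_img, K, D, target_focal):
    """Remap a fisheye image to a pinhole image whose focal length is the
    target focal length (the focal length matching the preset overlapping
    image acquisition area between the fisheye camera and the bullet camera)."""
    h, w = original_img.shape[:2]
    # Pinhole matrix of the undistorted image; principal point at the image
    # centre is an assumption made only for illustration.
    P = np.array([[target_focal, 0.0, w / 2.0],
                  [0.0, target_focal, h / 2.0],
                  [0.0, 0.0, 1.0]])
    map1, map2 = cv2.fisheye.initUndistortRectifyMap(
        K, D, np.eye(3), P, (w, h), cv2.CV_16SC2)
    return cv2.remap(original_img, map1, map2, interpolation=cv2.INTER_LINEAR)
```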
2. The method of claim 1, further comprising:
determining an image acquisition area of the fisheye camera at each focal length according to the field angle of the fisheye camera at each focal length, and determining an image acquisition area of the bullet camera at each focal length according to the field angle of the bullet camera at each focal length;
and determining the preset image acquisition overlapping area according to the image acquisition area of the fisheye camera at each focal length and the image acquisition area of the bullet camera at each focal length.
3. The method of claim 2, wherein determining an image acquisition area of the fisheye camera at each focal length according to the field angle of the fisheye camera at each focal length, and determining an image acquisition area of the bullet camera at each focal length according to the field angle of the bullet camera at each focal length, comprises:
determining the image acquisition area of the fisheye camera at each focal length according to the field angle of the fisheye camera at each focal length and acquired height information of a monitoring pole on which the fisheye camera is mounted;
and determining the image acquisition area of the bullet camera at each focal length according to the field angle of the bullet camera at each focal length and the acquired height information of the monitoring pole.
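One way to read claims 2 and 3 is as plane geometry on the ground: each camera's footprint follows from its field angle and the monitoring pole height, and the overlapping area is the intersection of the footprints. The sketch below assumes circular, downward-facing footprints and a known spacing between the cameras, none of which is dictated by the claims:

```python
import math

def footprint_radius(field_angle_deg, pole_height_m):
    # Radius of the ground area covered by a downward-facing camera:
    # r = h * tan(field_angle / 2); camera tilt is ignored for simplicity.
    return pole_height_m * math.tan(math.radians(field_angle_deg) / 2.0)

def footprints_overlap(fisheye_fov_deg, bullet_fov_deg, pole_height_m, spacing_m):
    # The two circular footprints overlap when the distance between the
    # camera positions is smaller than the sum of the footprint radii.
    r_fisheye = footprint_radius(fisheye_fov_deg, pole_height_m)
    r_bullet = footprint_radius(bullet_fov_deg, pole_height_m)
    return spacing_m < r_fisheye + r_bullet
```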
4. The method of any of claims 1 to 3, wherein the undistorted image is a two-dimensional image, and determining external parameters of the fisheye camera according to the undistorted image comprises:
acquiring, for at least one pixel point in the undistorted image, two-dimensional coordinates of the pixel point in the undistorted image and three-dimensional coordinates of the pixel point in a world coordinate system;
and determining external parameters of the fisheye camera according to the two-dimensional coordinates and the three-dimensional coordinates.
5. The method of claim 4, wherein the external parameters of the fisheye camera comprise: a rotation matrix and a translation vector of the fisheye camera in the world coordinate system.
6. The method of claim 4, wherein there are a plurality of the pixel points, and the plurality of pixel points are uniformly distributed in the undistorted image.
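Claims 4 to 6 describe a standard 2D-3D pose estimation: uniformly distributed pixel points in the undistorted image are paired with their surveyed world coordinates, and the extrinsics are solved from the correspondences. A sketch using OpenCV's PnP solver; the library choice and the pinhole matrix P of the undistorted image are assumptions:

```python
import cv2
import numpy as np

def extrinsics_from_correspondences(points_2d, points_3d, P):
    """points_2d: Nx2 pixel coordinates in the undistorted image.
    points_3d: Nx3 coordinates of the same points in the world coordinate system.
    P: 3x3 pinhole matrix of the undistorted image (built from the target focal length)."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(points_3d, dtype=np.float64),
        np.asarray(points_2d, dtype=np.float64),
        P,
        distCoeffs=None)           # the undistorted image carries no lens distortion
    if not ok:
        raise RuntimeError("pose estimation failed")
    R, _ = cv2.Rodrigues(rvec)     # rotation matrix of the fisheye camera
    return R, tvec                 # rotation matrix and translation vector, as in claim 5
```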
7. The method of any of claims 1 to 3, further comprising:
processing two-dimensional perception information of a target object in an image acquired by the fisheye camera according to external parameters of the fisheye camera to obtain first perception information of the target object, wherein the first perception information is three-dimensional perception information;
processing two-dimensional perception information of the target object in an image acquired by the bullet camera according to preset external parameters of the bullet camera to obtain second perception information of the target object, wherein the second perception information is three-dimensional perception information;
and performing fusion processing on the first perception information and the second perception information to obtain fused three-dimensional perception information of the target object.
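Claim 7 converts two-dimensional perception results into three-dimensional ones using the calibrated extrinsics. A common way to do this, shown here as an assumed approach rather than the patent's prescribed method, is to back-project the pixel and intersect the viewing ray with the ground plane z = 0 of the world coordinate system:

```python
import numpy as np

def pixel_to_world_on_ground(uv, P, R, t):
    """uv: pixel (u, v) of the target object in the undistorted image.
    P: 3x3 pinhole matrix; R, t: extrinsics mapping world to camera coordinates."""
    camera_center = -R.T @ np.asarray(t, dtype=np.float64).reshape(3)
    ray_cam = np.linalg.inv(P) @ np.array([uv[0], uv[1], 1.0])
    ray_world = R.T @ ray_cam
    # Intersect the viewing ray with the ground plane z_world = 0.
    scale = -camera_center[2] / ray_world[2]
    return camera_center + scale * ray_world   # three-dimensional perception information
```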
8. The method of claim 7, wherein the first perception information and the second perception information are both world coordinate information of the target object; performing fusion processing on the first perception information and the second perception information to obtain fused three-dimensional perception information of the target object, including:
and performing weighting processing on the first perception information and the second perception information to obtain fused three-dimensional perception information of the target object.
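The fusion in claim 8 reduces to a weighted combination of two world-coordinate estimates of the same target. A sketch with hand-picked equal weights; the weighting scheme itself is not fixed by the claim:

```python
import numpy as np

def fuse_world_coordinates(first_info, second_info, w_fisheye=0.5, w_bullet=0.5):
    # first_info / second_info: world coordinates of the target object obtained
    # from the fisheye camera and from the bullet camera, respectively.
    p1 = np.asarray(first_info, dtype=np.float64)
    p2 = np.asarray(second_info, dtype=np.float64)
    return (w_fisheye * p1 + w_bullet * p2) / (w_fisheye + w_bullet)
```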
9. The method of claim 7, further comprising:
and acquiring the driving information of the vehicle, and generating a driving strategy for adjusting or maintaining the driving information of the vehicle according to the driving information and the fused three-dimensional perception information of the target object.
10. The method according to any one of claims 1 to 3, wherein performing distortion removal processing on the original image acquired by the fisheye camera according to the target focal length to obtain the undistorted image comprises:
acquiring a calibration plate image captured by the fisheye camera based on the target focal length, and projecting the calibration plate image onto a spherical surface to obtain a spherical image;
and carrying out distortion removal processing on the spherical image to obtain the undistorted image.
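Claim 10 routes the calibration plate image through a spherical surface before de-distortion. The sketch below compresses both steps into a single remap under an assumed equidistant model (r = f·θ); the intermediate spherical image is not materialized, and the focal lengths and image centring are illustrative:

```python
import cv2
import numpy as np

def undistort_via_sphere(fisheye_img, f_fisheye, f_target, out_size):
    """Map each pixel of the undistorted output back onto the fisheye image
    through angles (theta, phi) on the unit sphere, assuming r = f_fisheye * theta."""
    out_h, out_w = out_size
    cx_out, cy_out = out_w / 2.0, out_h / 2.0
    cy_in, cx_in = fisheye_img.shape[0] / 2.0, fisheye_img.shape[1] / 2.0

    xs, ys = np.meshgrid(np.arange(out_w), np.arange(out_h))
    x = (xs - cx_out) / f_target            # normalised pinhole coordinates
    y = (ys - cy_out) / f_target
    theta = np.arctan(np.hypot(x, y))       # angle between the ray and the optical axis
    phi = np.arctan2(y, x)                  # azimuth on the sphere

    r = f_fisheye * theta                   # equidistant fisheye projection
    map_x = (cx_in + r * np.cos(phi)).astype(np.float32)
    map_y = (cy_in + r * np.sin(phi)).astype(np.float32)
    return cv2.remap(fisheye_img, map_x, map_y, interpolation=cv2.INTER_LINEAR)
```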
11. An external parameter calibration device of a fisheye camera, comprising:
the first processing unit is used for carrying out distortion removal processing on an original image acquired by the fisheye camera according to a target focal length to obtain an undistorted image, wherein the target focal length is a focal length corresponding to a preset image acquisition overlapping area between the fisheye camera and a bullet camera;
a first determining unit, configured to determine an external parameter of the fisheye camera according to the undistorted image.
12. The apparatus of claim 11, further comprising:
the second determining unit is used for determining an image acquisition area of the fisheye camera at each focal length according to the field angle of the fisheye camera at each focal length, and determining an image acquisition area of the bullet camera at each focal length according to the field angle of the bullet camera at each focal length;
and the third determining unit is used for determining the preset image acquisition overlapping area according to the image acquisition area of the fisheye camera at each focal length and the image acquisition area of the bullet camera at each focal length.
13. The apparatus of claim 12, wherein the second determining unit comprises:
the first determining subunit is configured to determine an image acquisition area of the fisheye camera at each focal length according to the field angle of the fisheye camera at each focal length and the acquired height information of the monitoring pole on which the fisheye camera is mounted;
and the second determining subunit is used for determining an image acquisition area of the bullet camera at each focal length according to the field angle of the bullet camera at each focal length and the acquired height information of the monitoring pole.
14. The apparatus of any of claims 11 to 13, wherein the undistorted image is a two-dimensional image, and the first determining unit comprises:
the first acquisition subunit is used for acquiring, for at least one pixel point in the undistorted image, two-dimensional coordinates of the pixel point in the undistorted image and three-dimensional coordinates of the pixel point in a world coordinate system;
and the third determining subunit is used for determining the external parameters of the fisheye camera according to the two-dimensional coordinates and the three-dimensional coordinates.
15. The apparatus of claim 14, wherein the external parameters of the fisheye camera comprise: a rotation matrix and a translation vector of the fisheye camera in the world coordinate system.
16. The apparatus of claim 14, wherein there are a plurality of the pixel points, and the plurality of pixel points are uniformly distributed in the undistorted image.
17. The apparatus of any of claims 11 to 13, further comprising:
the second processing unit is used for processing the two-dimensional perception information of the target object in the image acquired by the fisheye camera according to the external parameters of the fisheye camera to obtain first perception information of the target object, wherein the first perception information is three-dimensional perception information;
the third processing unit is used for processing two-dimensional perception information of the target object in an image acquired by the bullet camera according to preset external parameters of the bullet camera to obtain second perception information of the target object, wherein the second perception information is three-dimensional perception information;
and the fusion unit is used for carrying out fusion processing on the first perception information and the second perception information to obtain fused three-dimensional perception information of the target object.
18. The apparatus of claim 17, wherein the first perception information and the second perception information are both world coordinate information of the target object; the fusion unit is used for weighting the first perception information and the second perception information to obtain fused three-dimensional perception information of the target object.
19. The apparatus of claim 17, further comprising:
an acquisition unit configured to acquire travel information of a vehicle;
and the generating unit is used for generating a driving strategy for adjusting or maintaining the driving information of the vehicle according to the driving information and the fused three-dimensional perception information of the target object.
20. The apparatus of any of claims 11 to 13, wherein the first processing unit comprises:
the second acquisition subunit is used for acquiring a calibration plate image acquired by the fisheye camera based on the target focal length;
the projection subunit is used for projecting the calibration plate image onto a spherical surface to obtain a spherical surface image;
and the processing subunit is used for carrying out distortion removal processing on the spherical image to obtain the undistorted image.
21. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-10.
22. A non-transitory computer readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any one of claims 1-10.
23. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1-10.
24. A roadside apparatus comprising the electronic apparatus of claim 21.
25. A cloud controlled platform comprising the electronic device of claim 21.
26. An external parameter calibration system of a fisheye camera, comprising: a fisheye camera, a bullet camera, and the apparatus of any one of claims 11-20; wherein
the fisheye camera and the bullet camera are used for determining the target focal length.
CN202110254664.0A 2021-03-09 2021-03-09 External parameter calibration method, device and system of fish-eye camera Active CN112967345B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110254664.0A CN112967345B (en) 2021-03-09 2021-03-09 External parameter calibration method, device and system of fish-eye camera

Publications (2)

Publication Number Publication Date
CN112967345A true CN112967345A (en) 2021-06-15
CN112967345B CN112967345B (en) 2024-04-26

Family

ID=76277381

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110254664.0A Active CN112967345B (en) 2021-03-09 2021-03-09 External parameter calibration method, device and system of fish-eye camera

Country Status (1)

Country Link
CN (1) CN112967345B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106846415A (en) * 2017-01-24 2017-06-13 长沙全度影像科技有限公司 A kind of multichannel fisheye camera binocular calibration device and method
CN106683139A (en) * 2017-02-20 2017-05-17 南京航空航天大学 Fisheye-camera calibration system based on genetic algorithm and image distortion correction method thereof
CN107248178A (en) * 2017-06-08 2017-10-13 上海赫千电子科技有限公司 A kind of fisheye camera scaling method based on distortion parameter
US20190043219A1 (en) * 2018-07-02 2019-02-07 Intel Corporation Dual Model for Fisheye Lens Distortion and an Algorithm for Calibrating Model Parameters
US20200267310A1 (en) * 2020-05-07 2020-08-20 Intel Corporation Single image ultra-wide fisheye camera calibration via deep learning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
杨宇;赵成星;张晓玲;: "Visual calibration and distortion correction of fisheye cameras" (鱼眼相机的视觉标定及畸变校正), Laser Journal (激光杂志), no. 09, pages 26-29 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113344906A (en) * 2021-06-29 2021-09-03 阿波罗智联(北京)科技有限公司 Vehicle-road cooperative camera evaluation method and device, road side equipment and cloud control platform
CN113344906B (en) * 2021-06-29 2024-04-23 阿波罗智联(北京)科技有限公司 Camera evaluation method and device in vehicle-road cooperation, road side equipment and cloud control platform
CN113256540A (en) * 2021-07-14 2021-08-13 智道网联科技(北京)有限公司 Image distortion removal method and apparatus, electronic device, and computer-readable storage medium
CN115376313A (en) * 2022-07-26 2022-11-22 四川智慧高速科技有限公司 Method for realizing image fusion and distortion correction based on monitoring camera group
CN115423804A (en) * 2022-11-02 2022-12-02 依未科技(北京)有限公司 Image calibration method and device and image processing method
CN115423804B (en) * 2022-11-02 2023-03-21 依未科技(北京)有限公司 Image calibration method and device and image processing method

Also Published As

Publication number Publication date
CN112967345B (en) 2024-04-26

Similar Documents

Publication Publication Date Title
CN112967345B (en) External parameter calibration method, device and system of fish-eye camera
CN108319655B (en) Method and device for generating grid map
CN113657224B (en) Method, device and equipment for determining object state in vehicle-road coordination
CN112967344B (en) Method, device, storage medium and program product for calibrating camera external parameters
CN113378760A (en) Training target detection model and method and device for detecting target
CN113989450A (en) Image processing method, image processing apparatus, electronic device, and medium
CN115578433B (en) Image processing method, device, electronic equipment and storage medium
CN112106111A (en) Calibration method, calibration equipment, movable platform and storage medium
CN115410167A (en) Target detection and semantic segmentation method, device, equipment and storage medium
EP3940636A2 (en) Method for acquiring three-dimensional perception information based on external parameters of roadside camera, and roadside device
CN112560769B (en) Method for detecting obstacle, electronic device, road side device and cloud control platform
CN114299242A (en) Method, device and equipment for processing images in high-precision map and storage medium
CN113344906A (en) Vehicle-road cooperative camera evaluation method and device, road side equipment and cloud control platform
CN111553342B (en) Visual positioning method, visual positioning device, computer equipment and storage medium
CN112509126A (en) Method, device, equipment and storage medium for detecting three-dimensional object
CN113112551B (en) Camera parameter determining method and device, road side equipment and cloud control platform
CN116129422A (en) Monocular 3D target detection method, monocular 3D target detection device, electronic equipment and storage medium
JP7258101B2 (en) Image stabilization method, device, electronic device, storage medium, computer program product, roadside unit and cloud control platform
CN114742726A (en) Blind area detection method and device, electronic equipment and storage medium
CN113497897B (en) Vehicle-road cooperative roadside camera installation parameter adjusting method and device and electronic equipment
CN113470103A (en) Method and device for determining camera action distance in vehicle-road cooperation and road side equipment
CN113483771A (en) Method, device and system for generating live-action map
CN113312979B (en) Image processing method and device, electronic equipment, road side equipment and cloud control platform
CN112598750B (en) Road side camera calibration method and device, electronic equipment and storage medium
CN115294234B (en) Image generation method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20211022

Address after: 100176 101, floor 1, building 1, yard 7, Ruihe West 2nd Road, Beijing Economic and Technological Development Zone, Daxing District, Beijing

Applicant after: Apollo Zhilian (Beijing) Technology Co.,Ltd.

Address before: 2 / F, baidu building, 10 Shangdi 10th Street, Haidian District, Beijing 100085

Applicant before: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd.

GR01 Patent grant