CN109978954A - Method and apparatus for joint calibration of radar and camera based on a cabinet - Google Patents
- Publication number: CN109978954A
- Application number: CN201910155349.5A
- Authority
- CN
- China
- Prior art keywords
- cabinet
- point
- angle point
- radar
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Abstract
The invention discloses a method and apparatus for joint calibration of a radar and a camera based on a cabinet. The method comprises: obtaining image data containing the cabinet from a camera; obtaining point cloud data containing the cabinet from a lidar; building a point cloud height map from the point cloud data and extracting the cabinet region; performing corner fitting on the cabinet region to calculate the corners of the cabinet; performing corner fitting on the image data to extract the image corners; obtaining corresponding point pairs based on the calculated cabinet corners and the extracted image corners; and computing a stable and accurate transformation matrix from the corresponding point pairs.
Description
Technical field
The present invention relates generally to the field of autonomous driving and, in particular, to a method and apparatus for joint calibration of a radar and a camera based on a cabinet.
Background art
With the development of technology, autonomous driving of robotic vehicles (for example, unmanned aerial vehicles, or "drones") has become an increasingly hot topic. In the field of autonomous driving, environment perception is one of the key technologies of robotic vehicles. Each robotic vehicle may be equipped with many sensors distributed at different locations on the vehicle, each with its own coordinate system based on its mounting position. Since the final processing stage needs to fuse the results of all the sensors, the coordinate transformation relationships between the different sensors must be obtained; this process is referred to as joint calibration (calibration).
Once the lidar and the camera image are calibrated together, the corresponding information from the two sensors can be used to comprehensively analyze obstacles and make accurate decisions, which is of great significance to the environment perception technology of robotic vehicles.
In the autonomous driving field, long-focal-length cameras are commonly used. Such cameras have a long detection range but a large blind area, and in the region where corresponding point pairs are sought the radar point cloud is relatively sparse, so traditional calibration methods suffer from large errors. The present invention therefore designs a point cloud detection and image corner detection algorithm based on a cabinet, to obtain an accurate camera-radar transformation matrix.
Summary of the invention
In view of this, the present disclosure provides a method, apparatus, device, and computer storage medium for joint calibration of a radar and a camera based on a cabinet. The calibration method and apparatus provided by the present invention are adaptable to a variety of road conditions and scenes and achieve high-precision calibration between the laser and the camera image, and thus can be widely used in fields such as visual navigation of robotic vehicles and vision-assisted autonomous driving. Whereas traditional board-based joint calibration has a point cloud detection deviation of 20-30 cm at 40 m, the present disclosure keeps the point cloud detection error at 40 m within 2 cm, thereby improving the safety of autonomous driving of robotic vehicles.
In one aspect, embodiments of the invention provide a method for joint calibration of a radar and a camera based on a cabinet. The method comprises: obtaining image data containing the cabinet from a camera; obtaining point cloud data containing the cabinet from a lidar; building a point cloud height map from the point cloud data and extracting the cabinet region; performing corner fitting on the cabinet region to calculate the corners of the cabinet; performing corner fitting on the image data to extract the image corners; obtaining corresponding point pairs based on the calculated cabinet corners and the extracted image corners; and computing a stable and accurate transformation matrix from the corresponding point pairs.
In one embodiment of the present disclosure, performing corner fitting on the cabinet region to calculate the corners of the cabinet comprises: segmenting the two side faces of the cabinet swept in the point cloud data; performing plane fitting on the two segmented side faces and on the road surface around the cabinet to obtain the equations of three planes; and calculating a corner of the cabinet from the equations of the three planes.
In one embodiment of the present disclosure, the plane fitting method is the RANSAC (Random Sample Consensus) method.
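As an illustration, the RANSAC plane fitting referred to above might be sketched as follows in Python with NumPy. This is a minimal sketch under assumed parameter values (iteration count, inlier threshold); the patent itself specifies no code.

```python
import numpy as np

def ransac_plane(points, n_iters=200, thresh=0.02, seed=0):
    """Fit a plane A*x + B*y + C*z + D = 0 to an N x 3 point cloud with RANSAC.

    Returns (A, B, C, D) with a unit normal, plus the boolean inlier mask.
    """
    rng = np.random.default_rng(seed)
    best_model, best_inliers = None, None
    for _ in range(n_iters):
        # Sample 3 points and form the candidate plane through them
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:          # degenerate (collinear) sample
            continue
        normal /= norm
        d = -normal @ p0
        # Score the candidate by how many points lie within `thresh` of it
        inliers = np.abs(points @ normal + d) < thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_model, best_inliers = np.append(normal, d), inliers
    return best_model, best_inliers
```

Each of the three planes (two side faces and the ground) would be fitted this way on its own segmented subset of the point cloud.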
In one embodiment of the present disclosure, performing corner fitting on the image data to extract the image corners comprises: detecting the edges of the image; applying Gaussian smoothing to the image edges to eliminate breakpoints; and finding the hierarchical structure of the image edges according to a minimum bounding box algorithm, to extract the image corners. In one aspect, the hierarchical structure of the image edges may be 6.
In one embodiment of the present disclosure, computing the stable and accurate transformation matrix from the corresponding point pairs comprises: solving for the transformation matrix over the corresponding point pairs using the solvePnP algorithm provided in OpenCV.
In one embodiment of the present disclosure, the image edge detection is performed using the Canny operator.
In one embodiment of the present disclosure, building a point cloud height map from the point cloud data and extracting the cabinet region comprises: building an m*n feature map in which each point represents a block; traversing the point cloud data to obtain a height-difference map of each block; and obtaining the specific region of the cabinet from the height-difference map based on the known height of the cabinet.
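The height-map extraction described above can be sketched as follows. The block size, grid extent, and cabinet height here are assumed values for illustration only.

```python
import numpy as np

def extract_box_region(points, cell=0.2, grid=(50, 50), box_height=1.0, tol=0.1):
    """Build an m x n height-difference map over cell x cell blocks and return
    a boolean mask of blocks whose height span matches the known cabinet height."""
    m, n = grid
    zmin = np.full(grid, np.inf)
    zmax = np.full(grid, -np.inf)
    ix = np.floor(points[:, 0] / cell).astype(int)
    iy = np.floor(points[:, 1] / cell).astype(int)
    ok = (ix >= 0) & (ix < m) & (iy >= 0) & (iy < n)
    # Traverse the point cloud, tracking per-block min/max height
    for i, j, z in zip(ix[ok], iy[ok], points[ok, 2]):
        zmin[i, j] = min(zmin[i, j], z)
        zmax[i, j] = max(zmax[i, j], z)
    height_diff = np.where(zmax >= zmin, zmax - zmin, 0.0)  # empty blocks -> 0
    # Blocks whose height difference matches the cabinet height are candidates
    return np.abs(height_diff - box_height) < tol
```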
In one embodiment of the present disclosure, the side faces of the cabinet are laid out with a specific pattern, to improve the accuracy of the radar-camera joint calibration.
In one embodiment of the present disclosure, the acquired point cloud data and image data are preprocessed, wherein the preprocessing includes time synchronization, data screening, point cloud data correction, and the like.
In another aspect, embodiments of the invention provide an apparatus for joint calibration of a radar and a camera based on a cabinet. The apparatus may include an acquisition module configured to obtain image data containing the cabinet from a camera and to obtain point cloud data containing the cabinet from a lidar. The apparatus may also include a cabinet extraction module configured to build a point cloud height map from the point cloud data and extract the cabinet region. The apparatus may also include a cabinet corner computation module configured to perform corner fitting on the cabinet region to calculate the corners of the cabinet. The apparatus may also include an image corner extraction module configured to perform corner fitting on the image data to extract the image corners. The apparatus may also include a matrix computation module configured to obtain corresponding point pairs based on the calculated cabinet corners and the extracted image corners, and to compute a stable and accurate transformation matrix from the corresponding point pairs.
Embodiments may also include a robotic vehicle with a radar-camera joint calibration apparatus, the apparatus including a transceiver, a memory, and a processor configured with processor-executable instructions to perform the operations of the method outlined above. Embodiments include a device for use in processing in a robotic vehicle, configured to perform the operations of the method outlined above. Embodiments include a non-transitory processor-readable medium storing processor-executable instructions configured to cause a processor of a robotic vehicle to perform the operations of the method outlined above.
Brief description of the drawings
The accompanying drawings, which are incorporated herein and constitute a part of this specification, depict exemplary embodiments and, together with the general description given above and the detailed description given below, serve to explain the features of the various embodiments.
Fig. 1 shows an environment or system suitable for implementing embodiments of the present invention;
Fig. 2 is a block diagram showing components of a radar-camera joint calibration device for use in a robotic vehicle, according to embodiments of the present invention;
Fig. 3 is a schematic diagram showing the definition of the laser coordinate system and the camera coordinate system used for joint calibration, according to embodiments of the present invention;
Fig. 4 shows the specific pattern designed on the side face of the cabinet, according to embodiments of the present invention;
Fig. 5 shows exemplary results of projecting the three-dimensional point cloud onto the image using the solved parameters, based on the design scheme herein, according to embodiments of the present invention;
Fig. 6 is a schematic flow chart showing a method for joint calibration of a radar and a camera based on a cabinet, according to embodiments of the present invention;
Fig. 7 is a schematic block diagram showing an apparatus for joint calibration of a radar and a camera based on a cabinet, according to embodiments of the present invention.
In the accompanying drawings, the same or similar labels are used to represent the same or similar elements.
Detailed description of embodiments
Preferred embodiments of the present disclosure are described in more detail below with reference to the accompanying drawings. Although the drawings show preferred embodiments of the present disclosure, it should be appreciated that the present invention can also be realized in various other forms and should not be limited to the specific embodiments described below. These specific embodiments are provided herein so that the present disclosure will be thorough and complete, and will fully convey the disclosed scope to those skilled in the art.
" illustrative " word used herein means " being used as example, illustration or explanation ".Here depicted as " showing
Any aspect of example property " is not necessarily to be construed as or more advantage more more preferable than other aspects.
As used herein, the terms "robotic vehicle", "drone", and "unmanned vehicle" refer to one of various types of vehicles that include an onboard computing device configured to provide some autonomous or semi-autonomous capability. Examples of robotic vehicles include, but are not limited to: aircraft such as unmanned aerial vehicles (UAVs); ground vehicles (for example, autonomous or semi-autonomous automobiles); water-based vehicles (that is, vehicles configured to operate on or under the water surface); space-based vehicles (for example, spacecraft or space probes); and/or some combination thereof. In some embodiments, the robotic vehicle may be manned. In other embodiments, the robotic vehicle may be unmanned. In some implementations, the robotic vehicle may be an aircraft (unmanned or manned), which may be a rotorcraft or a winged aircraft.
The embodiments may be realized in various robotic vehicles; an example suitable for use in conjunction with the various embodiments is shown in Fig. 1.
Referring to Fig. 1, a system or environment 1 may include one or more robotic vehicles 10 and a target cabinet 20. The robotic vehicle 10 in Fig. 1 may or may not communicate with any communication network. The target cabinet 20 may be a three-dimensional cabinet with a specially designed pattern for radar-camera joint calibration.
In various embodiments, the robotic vehicle 10 may include one or more cameras 140 configured to obtain images and to supply the image data to a processing device 110 of the robotic vehicle 10.
In various embodiments, the robotic vehicle 10 may include one or more lidars 150 configured to obtain radar point cloud data and to supply the acquired radar point cloud data to the processing device 110 of the robotic vehicle 10. The target cabinet 20 may be within the coverage range of the camera 140 and the lidar 150 of the robotic vehicle 10.
The robotic vehicle 10 may use a navigation system such as a Global Navigation Satellite System (GNSS) or the Global Positioning System (GPS) to navigate or determine its position, and may use GNSS/IMU to obtain the pose information of the robotic vehicle. In some embodiments, the robotic vehicle 10 may use an alternative source of positioning signals (that is, other than GNSS, GPS, etc.).
The robotic vehicle 10 may include a processing device 110, which may be configured to monitor and control various functions, subsystems, and/or other components of the robotic vehicle 10. For example, the processing device 110 may be configured to monitor and control various functions of the robotic vehicle 10, such as modules, software, instructions, circuits, and hardware related to propulsion, power management, sensor management, navigation, communication, actuation, steering, braking, and/or vehicle operation mode management.
The processing device 110 may house various circuits and devices for controlling the operation of the robotic vehicle 10. For example, the processing device 110 may include a processor 120 that directs the control of the robotic vehicle 10. The processor 120 may include one or more processors configured to execute processor-executable instructions (for example, applications, routines, scripts, instruction sets, etc.) to control the operation of the robotic vehicle 10, including the operations of the various embodiments herein. In some embodiments, the processing device 110 may include a memory 122 coupled to the processor 120 and configured to store data (for example, acquired image data, radar point cloud data, received messages, applications, etc.). The processor 120 and the memory 122, together with other elements, may be configured as or include a system on chip (SOC) 115. The processing device 110 may include more than one SOC 115, thereby increasing the number of processors 120 and processor cores. The processing device 110 may also include processors 120 that are not associated with an SOC 115. Each processor 120 may be a multi-core processor.
As used herein, the term "system on chip" or "SOC" refers to a set of interconnected electronic circuits that typically (but not exclusively) includes one or more processors (for example, 120), memory (for example, 122), and a communication interface. The SOC 115 may include various types of processors 120 and processor cores, such as a general-purpose processor, a central processing unit (CPU), a digital signal processor (DSP), a graphics processing unit (GPU), an accelerated processing unit (APU), a subsystem processor for a specific component of the processing device (for example, an image processor for the radar-camera joint calibration apparatus (e.g., 130), or a display processor for a display), an auxiliary processor, a single-core processor, and a multi-core processor. The SOC 115 may also include other hardware and hardware combinations, such as field programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), other programmable logic devices, discrete gate logic, transistor logic, performance monitoring hardware, watchdog hardware, and time references. The integrated circuit may be configured so that the components of the integrated circuit reside on a single piece of semiconductor material (for example, silicon).
The processing device 110 may also include or be connected to one or more sensors 136, which the processor 120 may use to determine information associated with vehicle operation and/or information associated with the external environment of the robotic vehicle 10, in order to control various processes on the robotic vehicle 10. Examples of such sensors 136 include accelerometers, gyroscopes, and electronic compasses, which are configured to provide the processor 120 with data about changes in the direction and movement of the robotic vehicle 10. For example, in some embodiments, the processor 120 may use data from the sensors 136 as input for determining or predicting movement data of the robotic vehicle 10. The various components in the processing device 110 and/or the SOC 115 may be coupled by various circuits (for example, a bus or other similar circuitry).
The processing device 110 may also include a radar-camera joint calibration apparatus 130, which may obtain image data containing the cabinet 20 from the camera 140 and obtain point cloud data containing the cabinet 20 from the lidar 150. The radar-camera joint calibration apparatus 130 may build a point cloud height map from the point cloud data, extract the region of the cabinet 20, perform corner fitting on the cabinet region to calculate the corners of the cabinet 20, perform corner fitting on the image data to extract the image corners, obtain corresponding point pairs based on the calculated cabinet corners and the extracted image corners, and compute a stable and accurate transformation matrix from these corresponding point pairs. The transformation matrix can be used to map three-dimensional point coordinates in the radar frame into the image coordinate system.
Although the various components of the processing device 110 are illustrated as separate components in the figure, some or all of the components (for example, the processor 120, the memory 122, and other units) may be integrated together in a single device or module (for example, a system-on-chip module).
The embodiments may be realized in a radar-camera joint calibration device 200 of a robotic vehicle, an example of which is shown in Fig. 2. Referring to Figs. 1-2, the radar-camera joint calibration device 200 suitable for the various embodiments may include a camera 140, a processor 208, a memory 210, a lidar unit 212, and a joint calibration unit 214.
The camera 140 may include at least one image sensor 204 and at least one optical system 206 (for example, one or more lenses). The camera 140 may obtain one or more digital images (sometimes referred to herein as image frames). The camera 140 may include a single monocular camera, a stereo camera, and/or an omnidirectional camera. In some embodiments, the camera 140 may be physically separate from the radar-camera joint calibration device 200, for example located outside the robotic vehicle and connected to the processor 208 via a data cable (not shown). In some embodiments, the camera 140 may include another processor (not shown), which may be configured with processor-executable instructions to perform one or more of the operations of the various embodiment methods.
In some embodiments, the memory 210, or another memory such as a frame buffer (not shown), may be realized in the camera 140. For example, the camera 140 may be configured to cache (that is, temporarily store) image data from the image sensor 204 before that data is processed (for example, by the processor 208). In some embodiments, the radar-camera joint calibration device 200 may include an image data buffer configured to cache (that is, temporarily store) the image data from the camera 140. Such cached image data may be supplied to the processor 208, or accessed by the processor 208 or by other processors configured to perform some or all of the operations of the various embodiments.
The lidar unit 212 may be configured to capture one or more lidar point clouds. The captured lidar point cloud data may be stored in the memory 210.
The radar-camera joint calibration device 200 optionally includes an inertial measurement unit (IMU) configured to measure various parameters of the robotic vehicle 10. The IMU may include one or more of a gyroscope, an accelerometer, and a magnetometer. The IMU may be configured to detect changes in the pitch, roll, and yaw axes associated with the robotic vehicle 10. The IMU's measurement output may be used to determine the altitude, angular velocity, linear velocity, and/or position of the robotic vehicle.
In some embodiments, the joint calibration unit 214 may be configured to use information extracted from images captured by the camera 140, one or more lidar point clouds captured by the lidar unit 212, and, optionally, pose information obtained from the IMU, to perform cabinet-based radar-camera joint calibration and to determine various parameters for navigating in the environment, so as to navigate the robotic vehicle 10 in its environment.
In various embodiments, timestamps may be added to one or more of the images captured by the cameras 140 and the lidar point clouds captured by the lidar unit 212. The joint calibration unit 214 may use this timestamp information to extract information from the one or more images captured by the camera 140 and/or the one or more lidar point clouds captured by the lidar unit 212, and/or to navigate in the environment of the robotic vehicle 10.
The processor 208 may be coupled to (for example, in communication with) the camera 140, the one or more image sensors 204, the one or more optical systems 206, the memory 210, the lidar unit 212, and the joint calibration unit 214. The processor 208 may be a general-purpose single-chip or multi-chip microprocessor (for example, an ARM processor), a special-purpose microprocessor (for example, a digital signal processor (DSP)), a microcontroller, a programmable gate array, or the like. The processor 208 may be referred to as a central processing unit (CPU). Although a single processor 208 is shown in Fig. 2, the radar-camera joint calibration device 200 may include multiple processors (for example, a multi-core processor) or a combination of different types of processors (for example, an ARM and a DSP).
The processor 208 may be configured to realize the methods of the various embodiments, to perform cabinet-based radar-camera joint calibration and/or to navigate the robotic vehicle 10 in the environment.
The memory 210 may store data (for example, image data, radar point cloud data, timestamps, data associated with the joint calibration unit 214, etc.) and instructions executable by the processor 208. In various embodiments, examples of instructions and/or data that may be stored in the memory 210 include image data, gyroscope measurement data, radar point cloud data, automatic camera calibration instructions, and the like. The memory 210 may be any electronic component capable of storing electronic information, including, for example, random access memory (RAM), read-only memory (ROM), magnetic disk storage media, optical storage media, flash memory devices in RAM, on-board memory attached to the processor, erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, and the like, including combinations thereof.
Of course, those skilled in the art should understand that the radar-camera joint calibration device 200 may be, for example, a server or a computer, or an intelligent terminal such as an electronic lock, a smartphone, or a smart tablet; the present invention is not limited in this respect.
The mechanisms and principles of the embodiments of the present invention are described in detail below. Unless specifically stated otherwise, the term "based on" used below and in the claims means "based at least in part on". The term "comprising" denotes open-ended inclusion, i.e., "including but not limited to". The term "plurality" means "two or more". The term "one embodiment" means "at least one embodiment". The term "another embodiment" means "at least one other embodiment". Definitions of other terms are provided in the description below.
Fig. 3 is a schematic diagram showing the definition of the laser coordinate system and the camera coordinate system used for joint calibration, according to embodiments of the present invention. As shown in the figure, the lidar coordinate system and the camera coordinate system are defined respectively, where (X, Y, Z) are coordinates in the lidar coordinate system and (U, V) are image coordinates in the camera coordinate system.
The radar-camera joint calibration process includes the calibration of the camera intrinsic parameters and of the extrinsic parameters. The present invention uses the widely used Zhang's calibration method to calibrate the camera intrinsic parameters, from which the effective focal length f of the camera, the image origin (u, v), and the scale factors fx and fy are obtained.
The process of radar and camera extrinsic calibration can be expressed by the following formula:

s * [u, v, 1]^T = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]] * [[r11, r12, r13, t1], [r21, r22, r23, t2], [r31, r32, r33, t3]] * [x, y, z, 1]^T

where (x, y, z) is a coordinate in the radar coordinate system, (u, v) is a pixel of the picture, and s is a scale factor. The first, 3*3, matrix is the camera intrinsic matrix: the scale factors fx and fy represent the focal lengths along the respective axes of the camera, and cx and cy represent the camera's center point, generally half the camera resolution. The second, 3*4, matrix is the extrinsic matrix to be solved: the r part is the rotation matrix and the t part is the translation vector.
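For illustration, the projection formula can be exercised with assumed intrinsic and extrinsic values; all numbers below are hypothetical placeholders, not values from the patent.

```python
import numpy as np

# Assumed intrinsics for illustration: fx, fy focal lengths, (cx, cy) image centre
fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])
# Assumed extrinsics [R|t]: identity rotation, radar origin 5 m in front of the camera
Rt = np.hstack([np.eye(3), [[0.0], [0.0], [5.0]]])

def project(xyz):
    """Apply s*[u, v, 1]^T = K @ [R|t] @ [x, y, z, 1]^T, then divide out s."""
    uvw = K @ Rt @ np.append(xyz, 1.0)
    return uvw[:2] / uvw[2]
```

A radar point on the optical axis lands at the image centre (cx, cy), and shifting it laterally moves the pixel by fx times the ratio of lateral offset to depth.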
In the joint calibration of the radar and the camera, the main task is to compute the 3*4 matrix in the above formula, by which the three-dimensional point coordinates in the radar frame can be mapped into the image coordinate system. In this computation, the most important step is to obtain corresponding points in the camera picture and the radar point cloud. Corresponding points in the camera picture and the radar point cloud can be obtained by means of the cabinet corner fitting algorithm for the point cloud data and the cabinet corner fitting for the image data.
Cabinet corner fitting for the point cloud data
A cabinet is placed in the lidar's field of view; the laser sweeping over the cabinet and the surrounding road surface forms point clouds of two side faces. By segmenting out the point cloud of each plane and fitting the equation of each plane using the RANSAC (Random Sample Consensus) method, the corner of the cabinet can be calculated. The specific steps include: (1) build an m*n feature map in which each point represents a block; here the size of each block is set to 20cm*20cm. By traversing the point cloud, the height-difference map of each block is obtained; since the height of the cabinet is known, the specific region of the cabinet can be obtained from the height-difference map. (2) Segment the two side faces of the cabinet according to the inflection point of each laser line, and then search for nearby points around the cabinet boundary points as ground points. The equations of the following three planes can thereby be obtained by the RANSAC algorithm:

A1x + B1y + C1z + D1 = 0
A2x + B2y + C2z + D2 = 0
A3x + B3y + C3z + D3 = 0

Solving this system of equations gives the coordinates of the intersection point of the three planes in the radar coordinate system.
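Intersecting the three fitted planes is a 3x3 linear solve; a minimal sketch (the plane coefficients in the usage below are placeholders for illustration):

```python
import numpy as np

def corner_from_planes(planes):
    """Intersect three planes Ai*x + Bi*y + Ci*z + Di = 0.

    planes: a 3x4 array of rows (Ai, Bi, Ci, Di).
    Solves the linear system N @ [x, y, z]^T = -D, where N stacks the normals.
    """
    P = np.asarray(planes, dtype=float)
    return np.linalg.solve(P[:, :3], -P[:, 3])
```

For example, the planes x = 1, y = 2, z = 3 intersect at the point (1, 2, 3); in the calibration setup the three rows would instead hold the RANSAC-fitted coefficients of the two side faces and the ground.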
Cabinet corner fitting for the image data
Since the camera's focal length is long, the cabinet must be placed at a distant position for corner detection, at which range common QR-code detection algorithms become unstable. A pattern was therefore designed to overcome this situation: the pattern is attached to the side face of the cabinet, and the position of the cabinet in the picture is obtained by detecting the corners of the box. Fig. 4 shows the specific pattern designed on the side face of the cabinet, according to embodiments of the present invention. The specific steps of cabinet corner fitting for the image data may include: (1) first perform image edge detection with the Canny operator; (2) because of varying illumination, continuous lines may show breakpoints after edge detection, so apply Gaussian smoothing to the binarized picture after edge detection to eliminate the breakpoints; (3) according to the bounding box algorithm, find the hierarchical structure of the image edges; here the hierarchical structure of the layout is 6 (levels greater than or equal to 5 are chosen) as the bounding box to be found, and the corner is one of the four vertices of the bounding box.
In the above manner, point pairs in the camera coordinate system and the lidar coordinate system can be obtained. These point pairs serve as the input to the equation to be solved, from which the 3*4 transformation matrix in the above formula is iteratively solved.
For example, based on two kinds of algorithms described above, corresponding point pair can be accurately acquired, by using
The transition matrix that the solvepnp algorithm provided in opencv both solves, to be stablized and accurately transition matrix.
The transition matrix can be used for for the three-dimensional point coordinate in radar being mapped in image coordinate system.Fig. 5 is according to embodiments of the present invention, shows
The design scheme based on this paper is gone out, three-dimensional point cloud is projected to the exemplary results on image using the parameter solved.Knot
Fruit shows that error of the point cloud detection of the disclosure of invention at 40 meters is within 2cm, and the pixel error of Corner Detection is two
Within a pixel.Red in Fig. 5 is the data that three-dimensional point projects two-dimensional surface, it can be seen that the electric pole and point of distant place
Cloud is ideally bonded.
Fig. 6 shows, according to an exemplary embodiment of the present invention, a schematic flowchart of a method 600 for radar and camera joint calibration based on a cabinet. Method 600 can be executed by the radar-camera joint calibration apparatus 130 described with reference to Fig. 1, or by the radar-camera joint calibration device 200 described with reference to Fig. 2. Each step included in method 600 is described in detail below with reference to Fig. 6.
Method 600 starts at step 602, in which image data is obtained from the camera. Those skilled in the art will appreciate that obtaining image data here may, for example, mean obtaining acquired image data, obtaining acquired image data after processing, or obtaining it by other means; the present invention is not limited in this respect.
In step 604, point cloud data containing the cabinet is obtained from the lidar.
In step 606, a point cloud height map is established according to the point cloud data, and the cabinet region is extracted. In one aspect, establishing the point cloud height map and extracting the cabinet region according to the point cloud data includes: establishing an m*n feature map, wherein each point represents a block; traversing the point cloud data to obtain a height-difference map of each block; and, based on the height of the cabinet, obtaining the specific region of the cabinet from the height-difference map.
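The three sub-steps above can be sketched in plain Python as follows (an illustrative sketch: the 20cm block size comes from the description, while the sample points and the 0.1m tolerance are hypothetical):

```python
BLOCK = 0.2  # block size in meters (20cm * 20cm, per the description)

def height_diff_map(points):
    """Traverse the point cloud once, keeping per-block min/max height.
    Returns {(i, j): max_z - min_z} keyed by block index in the m*n grid."""
    blocks = {}
    for x, y, z in points:
        key = (int(x // BLOCK), int(y // BLOCK))
        lo, hi = blocks.get(key, (z, z))
        blocks[key] = (min(lo, z), max(hi, z))
    return {k: hi - lo for k, (lo, hi) in blocks.items()}

def cabinet_blocks(points, cabinet_height, tol=0.1):
    """Blocks whose height difference matches the known cabinet height."""
    return [k for k, d in height_diff_map(points).items()
            if abs(d - cabinet_height) <= tol]

# A 1m-tall structure in block (0, 0) and near-flat ground in block (2, 2):
pts = [(0.05, 0.05, 0.0), (0.05, 0.05, 1.0),
       (0.45, 0.45, 0.0), (0.46, 0.44, 0.02)]
```

With a known 1m cabinet, `cabinet_blocks(pts, 1.0)` keeps only block (0, 0); the flat-ground block is rejected by its near-zero height difference.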
In step 608, corner fitting is performed on the cabinet region to calculate the corner point of the cabinet. In one aspect, performing corner fitting on the cabinet region to calculate the corner point of the cabinet includes: segmenting the two side faces of the cabinet swept in the point cloud data; performing plane fitting on the two segmented side faces, and performing plane fitting on the road surface around the cabinet, to obtain the equations of three planes; and calculating the corner point of the cabinet according to the equations of the three planes. In one aspect, the plane fitting method is the RANSAC method.
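A minimal RANSAC plane fit in the spirit of this step can be sketched as follows (illustrative only; the patent does not specify its iteration count, inlier threshold, or sampling scheme, so those values here are assumptions):

```python
import random

def fit_plane(p, q, r):
    """Plane (A, B, C, D) through three points, normal via cross product."""
    u = [q[i] - p[i] for i in range(3)]
    v = [r[i] - p[i] for i in range(3)]
    n = (u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0])
    d = -(n[0] * p[0] + n[1] * p[1] + n[2] * p[2])
    return (n[0], n[1], n[2], d)

def ransac_plane(points, iters=200, thresh=0.05, seed=0):
    """Return the unit-normalized plane with the most inliers within thresh meters."""
    rng = random.Random(seed)
    best, best_count = None, -1
    for _ in range(iters):
        a, b, c, d = fit_plane(*rng.sample(points, 3))
        norm = (a * a + b * b + c * c) ** 0.5
        if norm < 1e-9:
            continue  # degenerate sample: the three points were collinear
        count = sum(abs(a * x + b * y + c * z + d) / norm <= thresh
                    for x, y, z in points)
        if count > best_count:
            best, best_count = (a / norm, b / norm, c / norm, d / norm), count
    return best

# Ground plane z = 0 with one stray point well above it (synthetic data):
pts = [(i * 0.1, j * 0.1, 0.0) for i in range(5) for j in range(5)] + [(0.2, 0.2, 2.0)]
plane = ransac_plane(pts)
```

The recovered plane has a unit normal along z and a zero offset, i.e. the outlier is rejected; the same routine would be run once per cabinet side face and once for the road surface.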
In step 610, corner fitting is performed on the image data to extract the corner points of the image. In one aspect, performing corner fitting on the image data to extract the corner points of the image includes: detecting the edges of the image; performing Gaussian smoothing on the image edges to eliminate breakpoints; and finding the hierarchical structure of the image edges according to the minimum bounding box algorithm to extract the corner points of the image. In one aspect, the hierarchical structure of the image edges can be 6. In one aspect, the Canny operator can be used to perform the image edge detection.
In step 612, corresponding point pairs are obtained based on the calculated cabinet corner points and the extracted image corner points.
In step 614, a stable and accurate transformation matrix is calculated based on the corresponding point pairs. In one aspect, calculating the stable and accurate transformation matrix based on the corresponding point pairs includes: solving the transformation matrix for the corresponding point pairs using the solvePnP algorithm provided in OpenCV. The transformation matrix can be used to map the three-dimensional point coordinates in the radar frame into the image coordinate system.
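The mapping that the solved transformation performs, taking a 3D radar point in homogeneous form to a pixel, can be sketched as follows (the projection matrix here is hypothetical, combining assumed intrinsics with an identity rotation; in practice OpenCV's solvePnP would supply the rotation and translation):

```python
def project(P, point):
    """Apply a 3x4 projection matrix P to a 3D point (x, y, z).
    Returns pixel coordinates (u, v) after the homogeneous divide."""
    x, y, z = point
    hom = [sum(row[i] * c for i, c in enumerate((x, y, z, 1.0))) for row in P]
    return (hom[0] / hom[2], hom[1] / hom[2])

# Hypothetical camera: 500px focal length, principal point (320, 240),
# radar and camera frames assumed coincident (identity extrinsics).
P = [[500.0, 0.0, 320.0, 0.0],
     [0.0, 500.0, 240.0, 0.0],
     [0.0, 0.0, 1.0, 0.0]]

pixel = project(P, (1.0, 0.5, 10.0))  # a radar point 10m ahead of the camera
```

Applying this per point is exactly how Fig. 5 is produced: every lidar return is pushed through the solved matrix and drawn at the resulting (u, v).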
In one aspect, a specific pattern is arranged on the side of the cabinet to improve the accuracy of the radar and camera joint calibration.
In one aspect, the acquired point cloud data and image data are preprocessed, wherein the preprocessing includes time synchronization, data screening, point cloud data correction, and the like.
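The time-synchronization part of this preprocessing can be sketched as nearest-timestamp matching between the two sensor streams (illustrative; the patent does not specify its matching rule or tolerance, so the 50ms window and sample clocks are assumptions):

```python
def sync_pairs(lidar_ts, camera_ts, max_dt=0.05):
    """Match each lidar timestamp to its nearest camera timestamp.
    Pairs farther apart than max_dt seconds are dropped."""
    pairs = []
    for t in lidar_ts:
        nearest = min(camera_ts, key=lambda c: abs(c - t))
        if abs(nearest - t) <= max_dt:
            pairs.append((t, nearest))
    return pairs

# A 10Hz lidar against a faster camera (hypothetical clocks, seconds):
lidar = [0.00, 0.10, 0.20]
camera = [0.01, 0.04, 0.07, 0.11, 0.14, 0.23]
```

Each matched pair then contributes one synchronized frame from which cabinet corners can be fitted on both sides.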
Fig. 7 is a schematic block diagram of an apparatus 700 for cabinet-based radar and camera joint calibration, provided according to an embodiment of the present invention.
The apparatus 700 includes an acquisition module 702, configured to obtain image data containing the cabinet from the camera and to obtain point cloud data containing the cabinet from the lidar. The apparatus 700 may also include a cabinet extraction module 704, configured to establish a point cloud height map according to the point cloud data and extract the cabinet region. The apparatus 700 may also include a cabinet corner calculation module 706, configured to perform corner fitting on the cabinet region to calculate the corner points of the cabinet. The apparatus 700 may also include an image corner extraction module 708, configured to perform corner fitting on the image data to extract the corner points of the image. The apparatus 700 may also include a matrix calculation module 710, configured to: obtain corresponding point pairs based on the calculated cabinet corner points and the extracted image corner points; and calculate a stable and accurate transformation matrix based on the corresponding point pairs. The apparatus 700 then uses the transformation matrix to map the three-dimensional point coordinates in the radar frame into the image coordinate system.
For the specific implementation of the apparatus 700 provided in this embodiment, reference can be made to the corresponding method embodiments; details are not repeated here.
For clarity, not all optional units or subunits included in the apparatus 700 are shown in Fig. 7, and optional modules are shown with dashed lines. All features and operations described in the above method embodiments, and the combinations obtainable by reference to those embodiments, apply to the apparatus 700 respectively, so details are not repeated here.
Those skilled in the art will understand that the division of units or subunits in the apparatus 700 is not limiting but exemplary, and is made in order to describe its major functions or operations logically and so that they are more easily understood. In the apparatus 700, the function of one unit can be realized by multiple units; conversely, multiple units can also be realized by one unit. The present invention is not limited in this respect.
Likewise, those skilled in the art will understand that the units included in the apparatus 700 can be realized in various ways, including but not limited to software, hardware, firmware, or any combination thereof; the present invention is not limited in this respect.
The present invention can be a system, a method, a computer-readable storage medium and/or a computer program product. The computer-readable storage medium can be, for example, a tangible device capable of holding and storing instructions used by an instruction execution device.
The computer-readable/executable program instructions can be downloaded from the computer-readable storage medium to each computing/processing device, or downloaded to an external computer or external storage device through various communication means. The present invention does not limit the specific programming language used to realize the computer-readable/executable program instructions, or the instructions themselves.
Aspects of the present invention are described herein with reference to flowcharts and/or block diagrams of methods and apparatuses (systems) according to embodiments of the present invention. It should be understood that each box of the flowcharts and/or block diagrams, and combinations of boxes in the flowcharts and/or block diagrams, can be realized by computer-readable/executable program instructions.
The above method descriptions and process flow diagrams are intended merely as illustrative examples, and are not intended to require or imply that the operations of each embodiment must be executed in the given order. As those of ordinary skill in the art will understand, the operations in the above embodiments can be executed in any order.
The various illustrative logical boxes, modules, circuits, and algorithm operations described in connection with the presently disclosed embodiments can be implemented as electronic hardware, computer software, or a combination of both. To clearly show this interchangeability between hardware and software, various illustrative components, boxes, modules, circuits, and operations have been described above generally in terms of their functions. Whether such functions are implemented as hardware or as software depends on the specific application and the design constraints imposed on the whole system. Those skilled in the art can realize the described functions in flexible ways for each specific application, but such implementation decisions should not be interpreted as causing a departure from the scope of this disclosure.
The hardware used to realize or execute the various illustrative logics, logical boxes, modules, and circuits described in connection with the aspects disclosed herein can be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor can be a multiprocessor; alternatively, the processor can be any conventional processor, controller, microcontroller, or state machine. A processor can also be implemented as a combination of computing devices, for example a combination of a DSP and a microprocessor, several microprocessors, one or more microprocessors together with a DSP core, or any other such configuration. Alternatively, some operations or methods can be executed by circuits specific to a given function.
The above description of the disclosed embodiments is provided to enable any person of ordinary skill in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the general principles defined herein can be applied to other embodiments without departing from the spirit or protection scope of the invention. Therefore, the present disclosure is not limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the appended claims and the principles and novel features disclosed herein.
Claims (12)
1. A method of radar and camera joint calibration based on a cabinet, the method comprising:
obtaining image data containing the cabinet from a camera;
obtaining point cloud data containing the cabinet from a lidar;
establishing a point cloud height map according to the point cloud data, and extracting the cabinet region;
performing corner fitting on the region of the cabinet to calculate the corner point of the cabinet;
performing corner fitting on the image data to extract the corner points of the image;
obtaining corresponding point pairs based on the calculated cabinet corner point and the extracted image corner points;
calculating a stable and accurate transformation matrix based on the corresponding point pairs.
2. The method according to claim 1, wherein performing corner fitting on the cabinet region to calculate the corner point of the cabinet comprises:
segmenting two side faces of the cabinet swept in the point cloud data;
performing plane fitting on the two segmented side faces, and performing plane fitting on the road surface around the cabinet, to obtain the equations of three planes;
calculating the corner point of the cabinet according to the equations of the three planes.
3. The method according to claim 2, wherein the plane fitting method is the RANSAC (random sample consensus) method.
4. The method according to claim 1, wherein performing corner fitting on the image data to extract the corner points of the image comprises:
detecting the edges of the image;
performing Gaussian smoothing on the image edges to eliminate breakpoints;
finding the hierarchical structure of the image edges according to the minimum bounding box algorithm, to extract the corner points of the image.
5. The method according to claim 1, wherein calculating the stable and accurate transformation matrix based on the corresponding point pairs comprises: solving the transformation matrix for the corresponding point pairs using the solvePnP algorithm provided in OpenCV.
6. The method according to claim 4, wherein the edge detection of the image is performed using the Canny operator.
7. The method according to claim 4, wherein the hierarchical structure of the image edges is 6.
8. The method according to claim 1, wherein establishing the point cloud height map and extracting the cabinet region according to the point cloud data comprises:
establishing an m*n feature map, wherein each point represents a block;
traversing the point cloud data to obtain a height-difference map of each block;
obtaining the specific region of the cabinet from the height-difference map, based on the height of the cabinet.
9. The method according to claim 1, wherein a specific pattern is arranged on the side of the cabinet to improve the accuracy of the radar and camera joint calibration.
10. The method according to claim 1, further comprising:
preprocessing the acquired point cloud data and image data, wherein the preprocessing comprises time synchronization, data screening, and point cloud data correction.
11. An apparatus for radar and camera joint calibration based on a cabinet, for performing the method of any one of claims 1 to 10.
12. A computer-readable storage medium for radar and camera joint calibration based on a cabinet, the computer-readable storage medium storing at least one executable computer program instruction, the computer program instructions comprising computer program instructions for performing each step of the method of any one of claims 1 to 10.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2019100901725 | 2019-01-30 | ||
CN201910090172 | 2019-01-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109978954A true CN109978954A (en) | 2019-07-05 |
Family
ID=67077687
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910155349.5A Pending CN109978954A (en) | 2019-01-30 | 2019-03-01 | The method and apparatus of radar and camera combined calibrating based on cabinet |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109978954A (en) |
2019-03-01: CN application CN201910155349.5A filed, published as CN109978954A, status Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106097348A (en) * | 2016-06-13 | 2016-11-09 | 大连理工大学 | A kind of three-dimensional laser point cloud and the fusion method of two dimensional image |
CN107464265A (en) * | 2017-06-14 | 2017-12-12 | 深圳市圆周率软件科技有限责任公司 | A kind of parameter calibration system and method for binocular panorama camera |
CN109100741A (en) * | 2018-06-11 | 2018-12-28 | 长安大学 | A kind of object detection method based on 3D laser radar and image data |
Non-Patent Citations (3)
Title |
---|
YOONSU PARK et al.: "Calibration between Color Camera and 3D LIDAR Instruments with a Polygonal Planar Board", Sensors
ZOLTAN PUSZTAI et al.: "Accurate Calibration of LiDAR-Camera Systems using Ordinary Boxes", 2017 IEEE International Conference on Computer Vision Workshops (ICCVW)
Ying Hong: "Vision-Based Detection Method for Cement Pavement Distress", 30 November 2014
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110349221A (en) * | 2019-07-16 | 2019-10-18 | 北京航空航天大学 | A kind of three-dimensional laser radar merges scaling method with binocular visible light sensor |
CN110320506A (en) * | 2019-08-06 | 2019-10-11 | 阿尔法巴人工智能(深圳)有限公司 | A kind of automobile-used laser radar automatic calibration device and method |
CN110320506B (en) * | 2019-08-06 | 2023-05-02 | 深圳市海梁科技有限公司 | Automatic calibration device and method for automotive laser radar |
CN110376573A (en) * | 2019-08-28 | 2019-10-25 | 上海禾赛光电科技有限公司 | Laser radar assembling & adjusting system and its Method of Adjustment |
CN110376573B (en) * | 2019-08-28 | 2021-08-20 | 上海禾赛科技有限公司 | Laser radar installation and adjustment system and installation and adjustment method thereof |
CN110673115A (en) * | 2019-09-25 | 2020-01-10 | 杭州飞步科技有限公司 | Combined calibration method, device, equipment and medium for radar and integrated navigation system |
CN110673115B (en) * | 2019-09-25 | 2021-11-23 | 杭州飞步科技有限公司 | Combined calibration method, device, equipment and medium for radar and integrated navigation system |
CN112485774A (en) * | 2020-11-26 | 2021-03-12 | 中国第一汽车股份有限公司 | Vehicle-mounted laser radar calibration method, device, equipment and storage medium |
CN112485774B (en) * | 2020-11-26 | 2024-03-15 | 中国第一汽车股份有限公司 | Vehicle-mounted laser radar calibration method, device, equipment and storage medium |
CN116777903A (en) * | 2023-08-11 | 2023-09-19 | 北京斯年智驾科技有限公司 | Box door detection method and system |
CN116777903B (en) * | 2023-08-11 | 2024-01-26 | 北京斯年智驾科技有限公司 | Box door detection method and system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109887057B (en) | Method and device for generating high-precision map | |
CN109978954A (en) | The method and apparatus of radar and camera combined calibrating based on cabinet | |
US10509983B2 (en) | Operating device, operating system, operating method, and program therefor | |
CN111448476B (en) | Technique for sharing mapping data between unmanned aerial vehicle and ground vehicle | |
CN113657224B (en) | Method, device and equipment for determining object state in vehicle-road coordination | |
US10681269B2 (en) | Computer-readable recording medium, information processing method, and information processing apparatus | |
US20210019535A1 (en) | Systems and methods for pose determination | |
JP6988197B2 (en) | Controls, flying objects, and control programs | |
CN108845335A (en) | Unmanned aerial vehicle ground target positioning method based on image and navigation information | |
EP3750133A2 (en) | Method of and apparatus for analyzing images | |
CN112184812B (en) | Method for improving identification and positioning precision of unmanned aerial vehicle camera to april tag and positioning method and system | |
CN112116651A (en) | Ground target positioning method and system based on monocular vision of unmanned aerial vehicle | |
CN112379352A (en) | Laser radar calibration method, device, equipment and storage medium | |
CN108225273A (en) | A kind of real-time runway detection method based on sensor priori | |
KR102288609B1 (en) | Method and system for position estimation of unmanned aerial vehicle using graph structure based on multi module | |
WO2021081958A1 (en) | Terrain detection method, movable platform, control device, system, and storage medium | |
KR102490521B1 (en) | Automatic calibration through vector matching of the LiDAR coordinate system and the camera coordinate system | |
CN114662587A (en) | Three-dimensional target sensing method, device and system based on laser radar | |
US20210272289A1 (en) | Sky determination in environment detection for mobile platforms, and associated systems and methods | |
US20210156710A1 (en) | Map processing method, device, and computer-readable storage medium | |
US10275940B2 (en) | Image processing device, image processing method, and image processing program | |
WO2021262704A1 (en) | Post-processing of mapping data for improved accuracy and noise-reduction | |
Mansur et al. | Real time monocular visual odometry using optical flow: study on navigation of quadrotors UAV | |
CN113034347A (en) | Oblique photographic image processing method, device, processing equipment and storage medium | |
CN116952229A (en) | Unmanned aerial vehicle positioning method, device, system and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20190705