CN115131423B - Distance measurement method and device integrating millimeter wave radar and vision

Publication number: CN115131423B (application CN202110283671.3A; earlier publication CN115131423A)
Authority: CN (China)
Prior art keywords: target, target object, millimeter wave radar
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Inventors: 郭怀勇, 何速, 罗明柱, 欧阳大亮, 周东旭, 黄智捷, 覃炳庆, 湛建
Assignee (the listed assignee may be inaccurate): Aerospace Science and Industry Shenzhen Group Co Ltd (also the original assignee and applicant)
Other languages: Chinese (zh)
Application filed by Aerospace Science and Industry Shenzhen Group Co Ltd; application granted; patent active.
Abstract

According to the distance measurement method and device fusing millimeter wave radar and vision, the millimeter wave radar and a monocular camera collect data of the scene ahead. A target detection algorithm is applied to the image collected by the monocular camera at the current moment to obtain a plurality of target detection frames and the category information of the target in each frame, each target detection frame corresponding to one target object. The millimeter wave radar scan yields the coordinates and distance information of the target objects; the coordinates of the target objects collected by the millimeter wave radar are converted into the image coordinate system, and the radar target objects are matched against the target detection frames. If a target object matches a target detection frame, the distance information of that radar target object is taken as the distance measurement result. The method fully exploits the advantages of the millimeter wave radar and the monocular camera and matches target objects through coordinate conversion, thereby successfully obtaining both the category and the distance information of each target object.

Description

Distance measurement method and device integrating millimeter wave radar and vision
Technical Field
The invention belongs to the field of unmanned multi-sensor fusion detection, and particularly relates to a distance measurement method and device for fusing millimeter wave radar and vision.
Background
Unmanned control systems are typically composed of technical modules for navigation, environmental awareness, dynamic planning and decision making, automatic control, and the like. Environmental awareness technology, one of the key technologies, is a research hotspot in this field.
By means of sensing technologies such as cameras, radars and ultrasonic sensors, the system detects the state of the unmanned vehicle or unmanned aerial vehicle together with environmental information such as surrounding roads, traffic signs, signal lights, vehicles, pedestrians and obstacles, so that the unmanned system can accurately locate surrounding targets. Comprehensive and accurate environment perception information provides a sufficient data guarantee for unmanned system decision-making, thereby ensuring the safety and stability of the unmanned control system.
In the prior art, distance measurement of a target is performed by combining lidar with vision, but lidar is strongly affected by weather and cannot measure target distance stably. A new technology is therefore urgently needed that can measure target distance stably and provide a sufficient data guarantee for unmanned system decision-making.
Disclosure of Invention
The technical problem the invention aims to solve is to provide the unmanned system with the category and distance information of the target ahead, so that the unmanned system can make correct decision judgments and achieve accurate obstacle avoidance; to this end, the invention provides a distance measurement method and device fusing millimeter wave radar and vision.
To solve the above technical problem, the invention adopts the following technical solution:
A distance measurement method integrating millimeter wave radar and vision comprises the following steps:
Step 1: installing a monocular camera right above the millimeter wave radar, installing the millimeter wave radar on an unmanned control system for distance detection, and starting the monocular camera and the millimeter wave radar;
Step 2: performing joint calibration on the monocular camera and the millimeter wave radar, and calculating a coordinate transformation matrix from a radar coordinate system to an image coordinate system;
Step 3: collecting image data right in front of a monocular camera at the current moment, recording as f (x, y) i, wherein i represents the current moment, and detecting the collected image data by using an image target detection method to obtain a plurality of target detection frames, wherein each target detection frame comprises coordinate information of the target detection frame in an image coordinate system, category information of a target object positioned in the target detection frame and category score information;
Step 4: scanning a space right in front of the millimeter wave radar at the current moment by the millimeter wave radar while acquiring the latest image data to obtain coordinates and distance information of each target object at the current moment, preprocessing the coordinate information of each target object at the current moment, and converting the preprocessed coordinate information of each target object into an image coordinate system through a coordinate conversion matrix to obtain corresponding coordinates of each target object in the image coordinate system;
Step 5: for each target detection frame, judge in turn whether the corresponding coordinates in the image coordinate system of each target object acquired by the millimeter wave radar at the current moment lie inside the target detection frame; after detection is finished, the target objects acquired by the millimeter wave radar at the current moment are divided into two sets: the in-frame target points T_i_in = [p_1, p_2, …, p_n] and the out-of-frame target points T_i_out = [p_1, p_2, …, p_m], where 0 ≤ n ≤ N, 0 ≤ m ≤ N, i is the current moment, and N is the number of target objects acquired by the millimeter wave radar;
Step 6: when the in-frame set T_i_in contains target object points acquired by the millimeter wave radar, the minimum distance matching method is used to find the point whose longitudinal coordinate value in the radar coordinate system is smallest among the target object points in T_i_in as the fusion matching result of the current target detection frame, and the distance of that point is taken as the distance of the target detected in the current target detection frame;
When T_i_in contains no target object point acquired by the millimeter wave radar, the nearest neighbor matching method is used to find the M radar target object data points closest to the target detection frame from T_i_out, where M ≤ 8; the minimum distance matching method is then used to find the point among them whose longitudinal coordinate value in the radar coordinate system is smallest as the fusion matching result of the current target detection frame, and the distance of that point is taken as the distance of the target detected in the current target detection frame;
Step 7: repeat steps 5 and 6 until all target detection frames formed from the acquired image data have been processed, obtaining the distance information of the detected targets in all target detection frames.
Further, in step 4, the method for preprocessing the coordinate information of each target object at the current moment is as follows:
Step 4.1: use preset thresholds to filter out millimeter wave radar target object points at the current moment whose angle or distance exceeds the limit range, obtaining the filtered radar object data T_new_i at the current moment i;
Step 4.2: filter the radar object data T_new_i using an adaptive radius determination method to obtain the filtered radar object data.
Further, the adaptive radius determination method refers to: based on the radar target object data T_{i-1}_data of the previous moment, calculate the size of an adaptive radius using the speed and direction angle information of each target object point, then draw a circle with that radius; check each target object point in the target object data T_i_data of the current moment, retain it if it lies inside the circle corresponding to some target object point in T_{i-1}_data, and filter it out if it lies inside no point's circle.
Further, in step 6, the method also performs an isolation judgment on the obtained fusion matching result of the current target detection frame. The isolation judgment refers to judging whether the Euclidean distance between the radar target object point matched with the current target detection frame and the points matched at the previous m moments (m ≥ 3) is greater than a threshold; if it is greater than the threshold, the fusion matching result is isolated and is discarded, so no result is output.
Further, in step 6, noise detection is further performed on the obtained fusion matching result of the current target detection frame.
Further, the monocular camera is a monocular color camera.
The invention also provides a distance measurement device fusing millimeter wave radar and vision, comprising the following modules:
Hardware configuration module: used for installing the monocular camera directly above the millimeter wave radar, installing the millimeter wave radar on an unmanned control system for distance detection, and starting the monocular camera and the millimeter wave radar;
Coordinate conversion module: used for performing joint calibration of the monocular camera and the millimeter wave radar and calculating the coordinate transformation matrix from the radar coordinate system to the image coordinate system;
Image acquisition and target detection module: used for collecting the image data directly in front of the monocular camera at the current moment, denoted f(x, y)_i, where i denotes the current moment, and detecting the collected image data with an image target detection method to obtain a plurality of target detection frames, where each target detection frame contains its coordinate information in the image coordinate system, the category information of the target object located in the frame, and the category score information;
Radar target acquisition and coordinate conversion module: used for making the millimeter wave radar scan the space directly in front of it at the current moment while the image data are being collected, obtaining the coordinates and distance information of each target object at the current moment, preprocessing the coordinate information of each target object at the current moment, and converting the preprocessed coordinate information of each target object into the image coordinate system through the coordinate conversion matrix to obtain the corresponding coordinates of each target object in the image coordinate system;
Radar target object and target detection frame matching module: used for judging, for each target detection frame and in turn, whether the corresponding coordinates in the image coordinate system of each target object acquired by the millimeter wave radar at the current moment lie inside the target detection frame; after detection is finished, the target objects acquired by the millimeter wave radar at the current moment are divided into two sets: the in-frame target points T_i_in = [p_1, p_2, …, p_n] and the out-of-frame target points T_i_out = [p_1, p_2, …, p_m], where 0 ≤ n ≤ N, 0 ≤ m ≤ N, i is the current moment, and N is the number of target objects acquired by the millimeter wave radar;
when the in-frame set T_i_in contains target object points acquired by the millimeter wave radar, the minimum distance matching method is used to find the point whose longitudinal coordinate value in the radar coordinate system is smallest among the target object points in T_i_in as the fusion matching result of the current target detection frame, and the distance of that point is taken as the distance of the target detected in the current target detection frame;
when T_i_in contains no target object point acquired by the millimeter wave radar, the nearest neighbor matching method is used to find the M radar target object data points closest to the target detection frame from T_i_out, where M ≤ 8; the minimum distance matching method is then used to find the point among them whose longitudinal coordinate value in the radar coordinate system is smallest as the fusion matching result of the current target detection frame, and the distance of that point is taken as the distance of the target detected in the current target detection frame;
Detection result output module: used for repeating the detection for each target detection frame until all target detection frames formed from the acquired image data have been processed, obtaining and outputting the category and distance information of the detected targets in all target detection frames.
By adopting the technical scheme, the invention has the following beneficial effects:
According to the distance measurement method and device fusing millimeter wave radar and vision, the millimeter wave radar and the monocular camera collect data of the scene ahead. A target detection algorithm is applied to the image collected by the monocular camera at the current moment to obtain a plurality of target detection frames and the category information of the target in each frame, each target detection frame corresponding to one target object. The millimeter wave radar obtains the coordinates and distance information of the scanned target objects. The coordinates of the target object points collected by the millimeter wave radar are converted into the image coordinate system of the monocular camera, and it is judged whether each radar target object point lies inside a target detection frame of the image collected by the monocular camera. Among all target object points lying inside a target detection frame, the point with the smallest longitudinal coordinate value is taken as the fusion matching candidate result of the current target detection frame, and the distance of that point is taken as the distance measurement result. The invention fully exploits the advantages of the millimeter wave radar and the monocular camera, and then matches through coordinate conversion, thereby successfully obtaining the category and distance information of each target object point.
Drawings
FIG. 1 is a flow chart of a system of the present invention;
Detailed Description
The following description of the embodiments of the present invention is made clearly and completely with reference to the accompanying drawings, in which some, but not all, embodiments of the invention are shown. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
Fig. 1 shows a specific embodiment of a distance measurement method combining millimeter wave radar and vision according to the present invention, comprising the steps of:
Step 1: install the monocular camera directly above the millimeter wave radar, install the millimeter wave radar on an unmanned control system for distance detection, and start the monocular camera and the millimeter wave radar. In this embodiment, the monocular camera and the millimeter wave radar each collect target object data of the scene ahead; to make the spaces they capture as consistent as possible, the monocular camera is installed directly above the millimeter wave radar with a vertical separation of 3 cm, and the two together are called the fusion ranging module. The monocular camera in this embodiment is a monocular color camera.
Step 2: performing joint calibration on the monocular camera and the millimeter wave radar, and calculating a coordinate transformation matrix from a radar coordinate system to an image coordinate system; the joint calibration means that a series of corresponding points in an image coordinate system and a radar coordinate system are obtained through pre-measurement, and then a coordinate system conversion matrix is calculated according to the points. The conversion matrix calculation formula is:
Wherein, Representing a transformation matrix, x γ=rsinα,yγ = rcos a. (u, v) represents coordinates of the object in the image pixel coordinate system; (x r,yr) represents the coordinates of the target in the radar coordinate system; r represents the linear distance between the target and the fusion ranging module; alpha represents the offset angle of the target in the radar coordinate system, and the formula can directly complete the conversion from the radar coordinate system to the image coordinate system. The related parameters of the matrix in the conversion relation are obtained by the following calculation mode:
And (3) making:
Ti=[ti1 ti2 ti3]',U=[u1 u2 … un]',V=[v1 v2 … vn]',In×1=[1 1 … 1]' And:
Where n is the number of index points, Is the position of the calibration point in the radar coordinate system, and the matrix is convertedThe method can be calculated by the following linear least square method:
T1=(PPT)-1PTU
T2=(PPT)-1PTV
T3=(PPT)-1PTIn×1
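As an illustration, the least-squares calibration above can be sketched in Python with NumPy. This is a minimal sketch: the function names `calibrate_radar_to_image` and `radar_to_pixel` are hypothetical, and `np.linalg.lstsq` is used in place of the explicit normal-equation form (P'P)⁻¹P'U, to which it is numerically equivalent.

```python
import numpy as np

def calibrate_radar_to_image(radar_points, pixel_points):
    """Fit the 3x3 radar->image transform H by linear least squares.

    radar_points: (n, 2) array of calibration positions (x_r, y_r),
                  with x_r = r*sin(alpha), y_r = r*cos(alpha).
    pixel_points: (n, 2) array of corresponding (u, v) pixel coordinates.
    """
    radar_points = np.asarray(radar_points, dtype=float)
    pixel_points = np.asarray(pixel_points, dtype=float)
    n = radar_points.shape[0]
    # P stacks one homogeneous radar point [x_r, y_r, 1] per row (n x 3).
    P = np.hstack([radar_points, np.ones((n, 1))])
    # Solve min ||P T_k - b|| for b = U (pixel u's), V (pixel v's), and ones.
    T1, *_ = np.linalg.lstsq(P, pixel_points[:, 0], rcond=None)
    T2, *_ = np.linalg.lstsq(P, pixel_points[:, 1], rcond=None)
    T3, *_ = np.linalg.lstsq(P, np.ones(n), rcond=None)
    return np.vstack([T1, T2, T3])  # rows T_1, T_2, T_3 of H

def radar_to_pixel(H, r, alpha):
    """Project one radar detection (range r, offset angle alpha, radians) to pixels."""
    x_r, y_r = r * np.sin(alpha), r * np.cos(alpha)
    u, v, w = H @ np.array([x_r, y_r, 1.0])
    return u / w, v / w
```

With exact, noise-free correspondences the fit reproduces the transform exactly; with measured calibration points it returns the least-squares estimate, and each returned row corresponds to one T_i above.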
Step 3: collecting image data right in front of a monocular camera at the current moment, recording as f (x, y) i, wherein i represents the current moment, and detecting the collected image data by using an image target detection method to obtain a plurality of target detection frames, wherein each target detection frame comprises coordinate information of the target detection frame in an image coordinate system and category information of a target object positioned in the target detection frame; each target detection frame result contains the upper left corner coordinates, the lower right corner coordinates and the category information of the frame. In this embodiment, the image target detection method used is the YOLO-v3 method of fusion MobileNet.
Step 4: scanning a space right in front of the millimeter wave radar at the current moment by the millimeter wave radar while acquiring the latest image data to obtain coordinates and distance information of each target object at the current moment, preprocessing the coordinate information of each target object at the current moment, and converting the preprocessed coordinate information of each target object into an image coordinate system through a coordinate conversion matrix to obtain corresponding coordinates of each target object in the image coordinate system;
In this embodiment, the millimeter wave radar and the monocular camera acquire target object data at the same time under software synchronization, and the data of each period scanned by the millimeter wave radar correspond to the image data at the same moment. For uniformity, the data of the latest period scanned by the millimeter wave radar are collectively called the data of the current moment, so that they conveniently correspond to the image acquired by the monocular camera at the current moment.
In this embodiment, the method for preprocessing the coordinate information of each target object at the current moment in step 4 is as follows:
Step 4.1: filtering millimeter wave radar target object points at the current moment when the angle and the distance exceed the limit range by using a preset threshold value to obtain filtered radar object data T_new i at the current moment i; in order to avoid the inconformity of the spatial range scanned by the millimeter wave radar and the spatial range of the image shot by the monocular camera, the target object points with the angles and the distances exceeding the limit range are filtered by using a preset threshold, so that redundant target object points are reduced, and the matching speed is improved.
Step 4.2: filter the radar object data T_new_i using the adaptive radius determination method to obtain the filtered radar object data.
In this embodiment, the adaptive radius determination method refers to: based on the radar target object data T_{i-1}_data of the previous moment, calculate the size of an adaptive radius using the speed and direction angle information of each target object point acquired by the millimeter wave radar, then draw a circle with that radius; check each target object point in the target object data T_i_data of the current moment, retain it if it lies inside the circle corresponding to some target object point in T_{i-1}_data, and filter it out if it lies inside no point's circle. From the speed and angle of a target object point at the previous moment, it can be judged that the point's range of movement at the next moment should not exceed a circle centered on that point with the adaptive radius as its size; in this way, some erroneous target object points can be filtered out.
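The adaptive radius filter might be sketched as follows. The radius formula `abs(speed) * dt + margin` is an assumed form — the patent states only that the radius is computed from each point's speed and direction angle information — and the point fields are hypothetical.

```python
import math

def adaptive_radius_filter(prev_points, curr_points, dt, margin=0.5):
    """Step 4.2: keep a current-moment point only if it falls inside the
    circle drawn around some previous-moment point.

    prev_points: dicts with 'x', 'y' (radar coords, m) and 'speed' (m/s).
    curr_points: dicts with 'x', 'y'.
    dt: time between the two radar periods, seconds.
    The radius |speed| * dt + margin is an assumed form of the adaptive radius.
    """
    kept = []
    for q in curr_points:
        for p in prev_points:
            radius = abs(p["speed"]) * dt + margin
            if math.hypot(q["x"] - p["x"], q["y"] - p["y"]) <= radius:
                kept.append(q)
                break  # inside at least one circle: retain
    return kept
```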
Step 5: for each target detection frame, judge in turn whether the corresponding coordinates in the image coordinate system of each target object acquired by the millimeter wave radar at the current moment lie inside the target detection frame; after detection is finished, the target objects acquired by the millimeter wave radar at the current moment are divided into two sets: the in-frame target points T_i_in = [p_1, p_2, …, p_n] and the out-of-frame target points T_i_out = [p_1, p_2, …, p_m], where 0 ≤ n ≤ N, 0 ≤ m ≤ N, i is the current moment, and N is the number of target objects acquired by the millimeter wave radar.
In this embodiment, the data ahead are collected by the millimeter wave radar and the monocular camera; the target detection algorithm obtains a plurality of target detection frames and the category information of the target in each frame from the image collected by the monocular camera at the current moment, each target detection frame corresponding to one target object; the millimeter wave radar obtains the coordinates and distance information of the scanned target objects. The coordinates of the target object points collected by the millimeter wave radar are converted into the image coordinate system of the monocular camera, and it is judged whether each radar target object point lies inside a target detection frame of the image, so that the target object points scanned by the millimeter wave radar are preliminarily associated with the target detection frames in the image collected by the monocular camera. A target detection frame may contain more than one radar target point.
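The in-frame/out-of-frame split of step 5 can be sketched as below. The box representation as top-left and bottom-right corners follows the detection-frame description in step 3; the function name is an illustrative choice, not the patent's code.

```python
def partition_radar_points(points_uv, box):
    """Step 5: split radar points (already projected to pixel coordinates)
    into in-frame and out-of-frame sets for one detection box.

    points_uv: list of (u, v) tuples.
    box: (u_min, v_min, u_max, v_max), i.e. the top-left and bottom-right
         corners of the target detection frame.
    """
    u_min, v_min, u_max, v_max = box
    t_in = [p for p in points_uv
            if u_min <= p[0] <= u_max and v_min <= p[1] <= v_max]
    t_out = [p for p in points_uv if p not in t_in]
    return t_in, t_out
```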
Step 6: when the in-frame set T_i_in contains target object points acquired by the millimeter wave radar, the minimum distance matching method is used to find the point whose longitudinal coordinate value in the radar coordinate system is smallest among the target object points in T_i_in as the fusion matching result of the current target detection frame, and the distance of that point is taken as the distance of the target detected in the current target detection frame;
When T_i_in contains no target object point acquired by the millimeter wave radar, the nearest neighbor matching method is used to find the M radar target object data points closest to the target detection frame from T_i_out, where M ≤ 8; the minimum distance matching method is then used to find the point among them whose longitudinal coordinate value in the radar coordinate system is smallest as the fusion matching result of the current target detection frame, and the distance of that point is taken as the distance of the target detected in the current target detection frame;
In this embodiment, the M radar data points closest to the target detection frame are chosen by dividing the region outside the target detection frame into 8 parts by azimuth and selecting 1 point from each part. If fewer than 8 points are obtained in this way, points with shorter distances are selected from the other regions to make up the number; if the number is still insufficient, the current set is kept as it is. The nearest neighbor matching method finds, among all target object data points, the points closest to the target detection frame.
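Steps 5 and 6 together might look like the following sketch. The point representation (a dict holding the projected pixel coordinates `uv`, the radar longitudinal coordinate `y_r`, and the measured distance `dist`) and the exact sector formula are illustrative assumptions; only the 8-way azimuth split around the frame, the top-up rule, and the minimum-longitudinal-coordinate choice come from the text.

```python
import math

def min_longitudinal_match(candidates):
    """Minimum distance matching: among candidate radar points, pick the one
    with the smallest longitudinal radar coordinate y_r and report its distance."""
    best = min(candidates, key=lambda p: p["y_r"])
    return best, best["dist"]

def sector_nearest(points, box, m_max=8):
    """Nearest neighbor matching fallback: divide the plane around the box
    center into 8 azimuth sectors, take the nearest point in each sector,
    then top up with the next-nearest remaining points if fewer than m_max."""
    u_min, v_min, u_max, v_max = box
    cu, cv = (u_min + u_max) / 2.0, (v_min + v_max) / 2.0
    dist = lambda p: math.hypot(p["uv"][0] - cu, p["uv"][1] - cv)
    best = {}
    for p in points:
        ang = math.atan2(p["uv"][1] - cv, p["uv"][0] - cu)  # in [-pi, pi]
        sector = int((ang + math.pi) / (2 * math.pi) * 8) % 8
        if sector not in best or dist(p) < dist(best[sector]):
            best[sector] = p
    chosen = list(best.values())
    if len(chosen) < m_max:
        rest = sorted((p for p in points if p not in chosen), key=dist)
        chosen += rest[:m_max - len(chosen)]
    return chosen

def match_box(t_in, t_out, box):
    """Step 6: match one detection box; fall back to out-of-frame points
    when no radar point projects inside the box. Returns (point, distance)."""
    candidates = t_in if t_in else sector_nearest(t_out, box)
    if not candidates:
        return None
    return min_longitudinal_match(candidates)
```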
In this embodiment, the method also performs an isolation judgment on the obtained matching result of the current target detection frame. The isolation judgment refers to judging whether the Euclidean distance between the radar target object point matched with the current target detection frame and the points matched at the previous m moments (m ≥ 3) is greater than a threshold; if it is greater than the threshold, the result is isolated and is discarded, and no result is output in this fusion.
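The isolation judgment might be sketched as below. Treating a too-short history as non-isolated, and requiring the distance to exceed the threshold for every one of the previous m matched points, are assumed interpretations of the text.

```python
import math

def is_isolated(current_point, history, threshold, m=3):
    """Isolation judgment: compare the radar point matched to the current
    detection box with the points matched at the previous m moments (m >= 3).
    If it is farther than `threshold` from every one of them, the match is
    considered isolated and should be discarded.

    current_point / history entries: dicts with radar coordinates 'x', 'y'.
    """
    if len(history) < m:
        return False  # not enough history yet; assumed behaviour
    recent = history[-m:]
    return all(math.hypot(current_point["x"] - h["x"],
                          current_point["y"] - h["y"]) > threshold
               for h in recent)
```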
In this embodiment, noise detection also needs to be performed on the obtained fusion matching result of the current target detection frame to judge whether the obtained target is noise; if the result is noise, it is discarded and no result is output in this fusion.
Step 7: repeat steps 5 and 6 until all target detection frames formed from the acquired image data have been processed, obtaining the category and distance information of the detected targets in all target detection frames. All detected target detection frames in the image acquired by the monocular camera are fusion-matched in the manner of steps 5 and 6, and the distance information of the radar-coordinate-system target object point matched with each target detection frame is obtained from the distance information of the target objects measured by the millimeter wave radar.
In this embodiment, the fusion matching candidate results of all the target detection frames together serve as the target object distance detection result at the current moment i, which is then appended to the detection result data list of the previous moments.
The ranging algorithm based on millimeter wave radar and vision fusion has strong environmental adaptability, can provide accurate category and distance information of a front target for the unmanned vehicle, and enables the unmanned vehicle to make correct judgment according to a decision algorithm, thereby realizing the functions of obstacle avoidance, ranging, striking and the like.
The invention also provides a distance measuring device integrating millimeter wave radar and vision, which comprises the following modules:
hardware configuration module: the method comprises the steps of installing a monocular camera right above a millimeter wave radar, installing the millimeter wave radar on an unmanned control system for distance detection, and starting the monocular camera and the millimeter wave radar;
And a coordinate conversion module: the method comprises the steps of performing joint calibration on a monocular camera and a millimeter wave radar, and calculating a coordinate transformation matrix from a radar coordinate system to an image coordinate system;
Image acquisition and target detection module: acquires the image data directly in front of the monocular camera at the current moment, recorded as f(x, y)_i, where i represents the current moment, and detects the acquired image data with an image target detection method to obtain a plurality of target detection frames, wherein each target detection frame comprises coordinate information of the target detection frame in the image coordinate system, and category information and category score information of the target object located in the target detection frame;
Radar target acquisition and coordinate conversion module: while the image data is collected, the millimeter wave radar scans the space directly in front of it at the current moment to obtain coordinates and distance information of each target object at the current moment; the coordinate information of each target object at the current moment is preprocessed and then converted into the image coordinate system through the coordinate transformation matrix, yielding the corresponding coordinates of each target object in the image coordinate system;
Radar target object and target detection frame matching module: for each target detection frame, judges in sequence whether the corresponding coordinates, in the image coordinate system, of each target object acquired by the millimeter wave radar at the current moment lie inside the target detection frame; after detection is finished, the target objects acquired by the millimeter wave radar at the current moment are divided into two classes: in-frame target points T_i_in = [p_1, p_2, …, p_n], 0 ≤ n ≤ N, and out-of-frame target points T_i_out = [p_1, p_2, …, p_m], 0 ≤ m ≤ N, where i represents the current moment and N is the number of target objects acquired by the millimeter wave radar;
When the in-frame target points T_i_in contain target object points acquired by the millimeter wave radar, a minimum distance matching method is used to find the point in T_i_in whose longitudinal coordinate value in the radar coordinate system is smallest as the fusion matching result of the current target detection frame, and the distance of that point in the radar coordinate system is taken as the distance of the target detected in the current target detection frame;
When T_i_in contains no target object point acquired by the millimeter wave radar, a nearest neighbour matching method is used to find, from T_i_out, the M radar target object data points closest to the target detection frame, M ≤ 8; a minimum distance matching method is then used to find, among those points, the point whose longitudinal coordinate value in the radar coordinate system is smallest as the fusion matching result of the current target detection frame, and the distance of that point in the radar coordinate system is taken as the distance of the target detected in the current target detection frame;
Detection result output module: repeats the detection for each target detection frame until every target detection frame formed from the acquired image data has been detected, obtaining and outputting the category and distance information of the detected targets in all the target detection frames.
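The module chain above (projection into the image coordinate system, in-frame/out-of-frame partition, and fusion matching with the M ≤ 8 fallback) can be illustrated with a minimal, self-contained Python sketch. The projection matrix values, the point fields, the flat-ground assumption (radar height taken as 0), and the mapping of the radar's longitudinal axis to camera depth are illustrative assumptions, not values from the patent:

```python
import math

# A radar point is (x, y, r): lateral x and longitudinal y in the radar
# frame plus the measured range r. Projection adds pixel coords (u, v).

def project(point, P):
    """Project radar coordinates into pixel coordinates (u, v) with a 3x4
    projection matrix P from the joint calibration (hypothetical values
    below); the radar point's height is assumed to be 0."""
    x, y, r = point
    vec = (x, 0.0, y, 1.0)  # radar forward axis y mapped to camera depth
    u, v, w = (sum(P[i][j] * vec[j] for j in range(4)) for i in range(3))
    return {'u': u / w, 'v': v / w, 'y': y, 'r': r}

def split_in_out(points_uv, box):
    """Partition projected points into in-frame T_in and out-of-frame
    T_out for one detection box (x_min, y_min, x_max, y_max)."""
    x0, y0, x1, y1 = box
    t_in = [p for p in points_uv if x0 <= p['u'] <= x1 and y0 <= p['v'] <= y1]
    t_out = [p for p in points_uv if p not in t_in]
    return t_in, t_out

def match_box(t_in, t_out, box, m=8):
    """Fusion matching for one box: if T_in is non-empty, take its point
    with the smallest longitudinal coordinate y; otherwise fall back to
    the m <= 8 points of T_out nearest to the box centre and take the
    smallest-y one of those. Returns the matched point's range."""
    if t_in:
        cands = t_in
    else:
        cx, cy = (box[0] + box[2]) / 2, (box[1] + box[3]) / 2
        cands = sorted(t_out,
                       key=lambda p: math.hypot(p['u'] - cx, p['v'] - cy))[:m]
    if not cands:
        return None
    return min(cands, key=lambda p: p['y'])['r']
```

Looping `match_box` over every detection box, with the box's category label attached, reproduces the output of the detection result output module.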
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solution of the present invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents, and such modifications and substitutions do not depart from the spirit of the invention.

Claims (5)

1. A distance measurement method integrating millimeter wave radar and vision, characterized by comprising the following steps:
Step 1: installing a monocular camera right above the millimeter wave radar, installing the millimeter wave radar on an unmanned control system for distance detection, and starting the monocular camera and the millimeter wave radar;
Step 2: performing joint calibration on the monocular camera and the millimeter wave radar, and calculating a coordinate transformation matrix from a radar coordinate system to an image coordinate system;
Step 3: collecting image data right in front of the monocular camera at the current moment, and recording as The method comprises the steps of representing the current moment, detecting acquired image data by using an image target detection method to obtain a plurality of target detection frames, wherein each target detection frame comprises coordinate information of the target detection frame in an image coordinate system, category information of a target object positioned in the target detection frame and category score information;
Step 4: scanning a space right in front of the millimeter wave radar at the current moment by the millimeter wave radar while acquiring the latest image data to obtain coordinates and distance information of each target object at the current moment, preprocessing the coordinate information of each target object at the current moment, and converting the preprocessed coordinate information of each target object into an image coordinate system through a coordinate conversion matrix to obtain corresponding coordinates of each target object in the image coordinate system; the method for preprocessing the coordinate information of each target object at the current moment comprises the following steps:
Step 4.1: filtering out, with preset thresholds, the millimeter wave radar target object points at the current moment whose angle or distance exceeds the limited range, to obtain the filtered radar target object data at the current moment i;
Step 4.2: for radar object dataFiltering by using an adaptive radius determination method to obtain filtered radar object data;
The adaptive radius determination method refers to: based on the radar target object data at the previous time point, calculating an adaptive radius from the speed and direction angle information of each target object point and drawing a circle with that radius; then, for each target object point in the radar target object data at the current time point, reserving the point if it lies within the circle corresponding to some previous target object point, and filtering it out if it lies within no point's circle;
Step 5: for each target detection frame, sequentially judging whether the corresponding coordinates of each target object at the current moment acquired by the millimeter wave radar in an image coordinate system are inside the target detection frame, and classifying each target object at the current moment acquired by the millimeter wave radar into two types after detection is finished, wherein the targets are target points in the frame And out-of-frame target pointIndicating the current time of day and,Representing the number of target objects acquired by the millimeter wave radar;
Step 6: when the in-frame target points T_i_in contain target object points acquired by the millimeter wave radar, using a minimum distance matching method to find the point in T_i_in whose longitudinal coordinate value in the radar coordinate system is smallest as the fusion matching result of the current target detection frame, and taking the distance of that point in the radar coordinate system as the distance of the target detected in the current target detection frame;
When T_i_in contains no target object point acquired by the millimeter wave radar, using a nearest neighbour matching method to find, from T_i_out, the M radar target object data points closest to the target detection frame, M ≤ 8; then using a minimum distance matching method to find, among those points, the point whose longitudinal coordinate value in the radar coordinate system is smallest as the fusion matching result of the current target detection frame, and taking the distance of that point in the radar coordinate system as the distance of the target detected in the current target detection frame;
Step 7: repeating steps 5 and 6 until every target detection frame formed from the acquired image data has been detected, to obtain the distance and category information of the detected targets in all the target detection frames.
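The adaptive radius preprocessing of step 4.2 might be sketched as follows. The claim does not give the radius formula, so the predicted-position model and the constants dt, k, and r_min below are illustrative assumptions only:

```python
import math

def adaptive_radius_filter(prev_points, cur_points, dt=0.05, k=1.5, r_min=0.5):
    """Keep only current-moment radar points lying inside the adaptive
    circle of at least one previous-moment target object point.
    prev_points: (x, y, speed, heading_rad); cur_points: (x, y).
    The circle is centred on the previous point's predicted position and
    its radius grows with the distance the target can travel in one scan
    interval dt (assumed constants, not values fixed by the patent)."""
    kept = []
    for cx, cy in cur_points:
        for px, py, speed, heading in prev_points:
            # predicted position after dt, from speed and direction angle
            ex = px + speed * dt * math.sin(heading)
            ey = py + speed * dt * math.cos(heading)
            radius = max(r_min, k * abs(speed) * dt)
            if math.hypot(cx - ex, cy - ey) <= radius:
                kept.append((cx, cy))
                break  # inside some circle: reserve and stop checking
    return kept
```

Points far from every predicted circle, such as multipath ghosts or clutter, are dropped before the coordinate conversion and matching of steps 4 to 6.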
2. The method according to claim 1, further comprising performing an isolation judgment on the fusion matching result of the current target detection frame obtained in step 6: judging whether the Euclidean distance between the radar target object point matched to the current target detection frame and the matched point at each earlier time point is greater than a threshold; if every such distance is greater than the threshold, the point is isolated, the result is discarded, and no result is output for this fusion.
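The isolation judgment of claim 2 can be sketched as a simple distance test. The threshold value and the set of earlier matched points compared against are assumptions, since the claim leaves both unspecified:

```python
import math

def is_isolated(candidate, earlier_matches, threshold):
    """Return True when the candidate fusion result (x, y) is farther
    than `threshold` from every earlier matched radar point; an isolated
    result is discarded and no output is produced for this fusion."""
    return all(math.hypot(candidate[0] - x, candidate[1] - y) > threshold
               for x, y in earlier_matches)
```

With an empty history the test trivially reports isolation, so a practical implementation would likely skip the check until enough matched points have accumulated.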
3. The method according to claim 1, further comprising performing noise detection on the obtained fusion matching result of the current target detection frame in step 6.
4. A distance measuring method according to any one of claims 1 to 3, wherein the monocular camera is a monocular color camera.
5. A distance measuring apparatus for fusing millimeter wave radar and vision, for performing a distance measuring method for fusing millimeter wave radar and vision as set forth in any one of claims 1 to 4, comprising:
Hardware configuration module: for installing a monocular camera directly above the millimeter wave radar and starting the monocular camera and the millimeter wave radar;
Coordinate conversion module: for performing joint calibration on the monocular camera and the millimeter wave radar and calculating the coordinate transformation matrix from the radar coordinate system to the image coordinate system;
Image acquisition and target detection module: for acquiring image data directly in front of the monocular camera at the current moment, recorded as f(x, y)_i, where i represents the current moment, and detecting the acquired image data using an image target detection method to obtain a plurality of target detection frames, wherein each target detection frame comprises coordinate information of the target detection frame in the image coordinate system, and category information and category score information of the target object located in the target detection frame;
Radar target acquisition and coordinate conversion module: for scanning, by the millimeter wave radar, the space directly in front of the millimeter wave radar at the current moment while the image data is collected, to obtain coordinates and distance information of each target object at the current moment; preprocessing the coordinate information of each target object at the current moment, and converting the preprocessed coordinate information of each target object into the image coordinate system through the coordinate transformation matrix to obtain the corresponding coordinates of each target object in the image coordinate system; the preprocessing of the coordinate information of each target object at the current moment comprises:
filtering out, with preset thresholds, the millimeter wave radar target object points at the current moment whose angle or distance exceeds the limited range, to obtain the filtered radar target object data at the current moment i; filtering the radar target object data with an adaptive radius determination method to obtain the filtered radar target object data; the adaptive radius determination method refers to: based on the radar target object data at the previous time point, calculating an adaptive radius from the speed and direction angle information of each target object point and drawing a circle with that radius; then, for each target object point in the radar target object data at the current time point, reserving the point if it lies within the circle corresponding to some previous target object point, and filtering it out if it lies within no point's circle;
Radar target object and target detection frame matching module: for each target detection frame, sequentially judging whether the corresponding coordinates, in the image coordinate system, of each target object acquired by the millimeter wave radar at the current moment lie inside the target detection frame; after detection is finished, dividing the target objects acquired by the millimeter wave radar at the current moment into two classes: in-frame target points T_i_in = [p_1, p_2, …, p_n], 0 ≤ n ≤ N, and out-of-frame target points T_i_out = [p_1, p_2, …, p_m], 0 ≤ m ≤ N, where i represents the current moment and N represents the number of target objects acquired by the millimeter wave radar;
When the in-frame target points T_i_in contain target object points acquired by the millimeter wave radar, using a minimum distance matching method to find the point in T_i_in whose longitudinal coordinate value in the radar coordinate system is smallest as the fusion matching result of the current target detection frame, and taking the distance of that point in the radar coordinate system as the distance of the target detected in the current target detection frame;
When T_i_in contains no target object point acquired by the millimeter wave radar, using a nearest neighbour matching method to find, from T_i_out, the M radar target object data points closest to the target detection frame, M ≤ 8; then using a minimum distance matching method to find, among those points, the point whose longitudinal coordinate value in the radar coordinate system is smallest as the fusion matching result of the current target detection frame, and taking the distance of that point in the radar coordinate system as the distance of the target detected in the current target detection frame;
Detection result output module: for repeating the detection for each target detection frame until every target detection frame formed from the acquired image data has been detected, and obtaining and outputting the category and distance information of the detected targets in all the target detection frames.
CN202110283671.3A 2021-03-17 Distance measurement method and device integrating millimeter wave radar and vision Active CN115131423B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110283671.3A CN115131423B (en) 2021-03-17 Distance measurement method and device integrating millimeter wave radar and vision


Publications (2)

Publication Number Publication Date
CN115131423A CN115131423A (en) 2022-09-30
CN115131423B true CN115131423B (en) 2024-07-16


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110532896A (en) * 2019-08-06 2019-12-03 北京航空航天大学 A kind of road vehicle detection method merged based on trackside millimetre-wave radar and machine vision
CN111462237A (en) * 2020-04-03 2020-07-28 清华大学 Target distance detection method for constructing four-channel virtual image by using multi-source information


Similar Documents

Publication Publication Date Title
CN109444911B (en) Unmanned ship water surface target detection, identification and positioning method based on monocular camera and laser radar information fusion
CN107161141B (en) Unmanned automobile system and automobile
CN108960183B (en) Curve target identification system and method based on multi-sensor fusion
CN112215306B (en) Target detection method based on fusion of monocular vision and millimeter wave radar
CN112396650A (en) Target ranging system and method based on fusion of image and laser radar
CN109085570A (en) Automobile detecting following algorithm based on data fusion
CN111045000A (en) Monitoring system and method
CN105302151A (en) Aircraft docking guidance and type recognition system and method
CN105654732A (en) Road monitoring system and method based on depth image
KR20200001471A (en) Apparatus and method for detecting lane information and computer recordable medium storing computer program thereof
CN112997093B (en) Method and processing unit for determining information about objects in a vehicle environment
CN111123262B (en) Automatic driving 3D modeling method, device and system
KR102264152B1 (en) Method and system for ground truth auto labeling advanced sensor data and image by camera
CN113936198A (en) Low-beam laser radar and camera fusion method, storage medium and device
CN114252884A (en) Method and device for positioning and monitoring roadside radar, computer equipment and storage medium
CN115546741A (en) Binocular vision and laser radar unmanned ship marine environment obstacle identification method
CN116699602A (en) Target detection system and method based on millimeter wave radar and camera fusion
CN117111085A (en) Automatic driving automobile road cloud fusion sensing method
CN115690746A (en) Non-blind area sensing method and system based on vehicle-road cooperation
CN113988197A (en) Multi-camera and multi-laser radar based combined calibration and target fusion detection method
CN114252883B (en) Target detection method, apparatus, computer device and medium
CN117173666A (en) Automatic driving target identification method and system for unstructured road
CN115131423B (en) Distance measurement method and device integrating millimeter wave radar and vision
CN114252859A (en) Target area determination method and device, computer equipment and storage medium
CN115546595A (en) Track tracking method and system based on fusion sensing of laser radar and camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant