CN113093178A - Obstacle target detection method and device, domain controller and vehicle - Google Patents


Info

Publication number
CN113093178A
Authority
CN
China
Prior art keywords
sensing result
obstacle target
real
camera
millimeter wave
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110431055.8A
Other languages
Chinese (zh)
Inventor
刘柯旺
吕颖
刘斌
崔茂源
孙连明
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FAW Group Corp
Original Assignee
FAW Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FAW Group Corp
Priority to CN202110431055.8A
Publication of CN113093178A
Legal status: Pending


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/867 Combination of radar systems with cameras
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 Radar or analogous systems specially adapted for specific applications
    • G01S 13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S 13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 Control of position or course in two dimensions
    • G05D 1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D 1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiments of the invention disclose an obstacle target detection method and device, a domain controller, and a vehicle. The method comprises: acquiring a time-synchronized millimeter wave radar sensing result and camera sensing result; determining, according to the sensing results, a first obstacle target identified jointly, a second obstacle target identified by the millimeter wave radar alone, and a third obstacle target identified by the camera alone; fusing the obstacle target information corresponding to the first obstacle target in the sensing results to obtain a first real sensing result; determining a second real sensing result and a third real sensing result corresponding respectively to the second and third obstacle targets, and aggregating the real sensing results to obtain the real sensing result of the current detection; and filtering the real sensing result of the current detection to obtain an accurate sensing result of the current detection. This solves the false-recognition and missed-recognition problems of related schemes and achieves accurate output of obstacle target sensing results.

Description

Obstacle target detection method and device, domain controller and vehicle
Technical Field
Embodiments of the invention relate to intelligent driving technology, and in particular to an obstacle target detection method and device, a domain controller, and a vehicle.
Background
With the popularization of intelligent automobiles, advanced intelligent driving functions place increasingly stringent requirements on a vehicle's perception performance.
At present, the main perception sensors of automobiles include lidar, millimeter wave radar, cameras, and the like. Existing intelligent driving schemes use a single camera or a single millimeter wave radar to collect physical environment information around the vehicle and then detect obstacle targets from the collected information. However, a single-camera scheme is prone to missed recognition, a single-millimeter-wave-radar scheme is prone to false recognition, and accurate output of sensing results is hard to guarantee. A single sensor therefore struggles to meet the perception requirements of complex road and weather environments and increasingly falls short of the performance requirements of intelligent driving.
The millimeter wave radar offers accurate range and speed measurement, low sensitivity to weather, and long sensing distance, but its target resolution and angular resolution are weak and it produces many false identifications of static targets. The camera offers strong target type recognition and accurate lateral distance estimation for obstacles, but it is strongly affected by weather and limited in the types of obstacles it can identify. Based on these respective advantages and disadvantages, some intelligent driving schemes combine a camera and a millimeter wave radar for obstacle target detection. However, such schemes take the camera sensing result as the basis and do not consider obstacle targets detected by the millimeter wave radar but not by the camera, so missed recognition easily occurs, which affects the comprehensiveness and accuracy of obstacle detection and, in severe cases, may even cause traffic accidents.
Disclosure of Invention
The embodiment of the invention provides a method and a device for detecting an obstacle target, a domain controller and a vehicle, which can improve the accuracy and comprehensiveness of the obstacle target detection and improve the safety of intelligent driving.
In a first aspect, an embodiment of the present invention provides a method for detecting an obstacle target, including:
acquiring a millimeter wave radar sensing result and a camera sensing result after time synchronization;
determining a first obstacle target identified by the millimeter wave radar and the camera together, a second obstacle target identified by the millimeter wave radar alone and a third obstacle target identified by the camera alone according to the sensing result of the millimeter wave radar and the sensing result of the camera;
fusing obstacle target information corresponding to the first obstacle target in the millimeter wave radar sensing result and the camera sensing result to obtain a first real sensing result;
determining a second real sensing result and a third real sensing result respectively corresponding to the second obstacle target and the third obstacle target, and obtaining a real sensing result of the detection based on the first real sensing result, the second real sensing result and the third real sensing result;
and filtering the real sensing result of the current detection according to the accurate sensing result of the previous detection to obtain the accurate sensing result of the current detection.
In a second aspect, an embodiment of the present invention further provides an obstacle target detection apparatus, where the apparatus includes:
the sensing result acquisition module is used for acquiring the millimeter wave radar sensing result and the camera sensing result after time synchronization;
the sensing result matching module is used for determining a first obstacle target identified by the millimeter wave radar and the camera together, a second obstacle target identified by the millimeter wave radar alone and a third obstacle target identified by the camera alone according to the sensing result of the millimeter wave radar and the sensing result of the camera;
the sensing result fusion module is used for fusing obstacle target information corresponding to the first obstacle target in the millimeter wave radar sensing result and the camera sensing result to obtain a first real sensing result;
a real result determining module, configured to determine a second real sensing result and a third real sensing result that correspond to the second obstacle target and the third obstacle target, respectively, and obtain a real sensing result of the current detection based on the first real sensing result, the second real sensing result, and the third real sensing result;
and the accurate result determining module is used for filtering the real sensing result of the detection according to the accurate sensing result of the previous detection to obtain the accurate sensing result of the detection.
In a third aspect, an embodiment of the present invention further provides a domain controller, where the domain controller includes:
one or more processors;
a memory for storing one or more programs,
which, when executed by the one or more processors, cause the one or more processors to implement the obstacle target detection method described in any embodiment of the invention.
In a fourth aspect, an embodiment of the present invention further provides a vehicle, including:
the millimeter wave radar is used for collecting physical environment information around the self-vehicle and outputting a millimeter wave sensing result;
the camera is used for collecting physical environment information around the self-vehicle and outputting a camera sensing result;
and a domain controller according to any of the embodiments of the present invention.
In a fifth aspect, embodiments of the present invention further provide a storage medium containing computer-executable instructions, which when executed by a computer processor, are configured to perform the obstacle target detection method according to any of the embodiments of the present invention.
The embodiments of the invention provide an obstacle target detection method and device, a domain controller, and a vehicle. The millimeter wave radar sensing result and the camera sensing result are time-synchronized. From the time-synchronized results, a first obstacle target identified jointly by the millimeter wave radar and the camera, a second obstacle target identified by the millimeter wave radar alone, and a third obstacle target identified by the camera alone are determined. The information for the first obstacle target is fused to obtain a first real sensing result. The real sensing result of the current detection is formed from the first real sensing result, a second real sensing result corresponding to the second obstacle target, and a third real sensing result corresponding to the third obstacle target, and is then filtered according to the accurate sensing result of the previous detection to obtain the accurate sensing result of the current detection. By obtaining a more accurate obstacle sensing result through fusion and filtering, and by taking into account the obstacle targets identified by the millimeter wave radar alone and by the camera alone, missed recognition is avoided and the false- and missed-recognition problems of related schemes are solved. Accurate output of obstacle target sensing results is achieved, providing more accurate sensing information for advanced cruise functions, enabling more precise vehicle control, and giving the driver a better driving experience.
Drawings
Fig. 1 is a flowchart of a method for detecting an obstacle target according to an embodiment of the present invention;
FIG. 2 is a schematic diagram showing the arrangement positions of a millimeter wave radar and a camera on a vehicle;
fig. 3 is a flowchart of another method for detecting an obstacle target according to an embodiment of the present invention;
fig. 4 is a block diagram of an obstacle target detection apparatus according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a domain controller according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Fig. 1 is a flowchart of a method for detecting an obstacle target according to an embodiment of the present invention, where the present embodiment is applicable to an intelligent driving scenario, and the method may be executed by an obstacle target detection apparatus, which may be implemented by software and/or hardware, and is generally configured in a domain controller. As shown in fig. 1, the method includes:
and step 110, acquiring the millimeter wave radar sensing result and the camera sensing result after time synchronization.
The sensing result of the millimeter wave radar is a set of obstacle target information obtained by acquiring physical environment information around the vehicle by the millimeter wave radar configured on the vehicle and processing the acquired physical environment information by the millimeter wave radar controller. Optionally, the millimeter wave radar controller outputs the sensing result of the millimeter wave radar to the domain controller.
The camera sensing result is a set of obstacle target information obtained by collecting physical environment information around the vehicle with the camera configured on the vehicle and processing the collected information with the camera controller. Optionally, the camera controller outputs the camera sensing result to the domain controller.
It should be noted that, because the arrangement positions of the millimeter wave radar and the camera on the vehicle are different, after receiving the sensing result of the millimeter wave radar and the sensing result of the camera, the domain controller needs to unify the two sensing results in the same target coordinate system. The target coordinate system may be a rectangular coordinate system with the center of the front bumper of the vehicle as the origin of coordinates, so as to better detect the obstacle target in front of the vehicle. Alternatively, the target coordinate system may be a rectangular coordinate system with the center of the front axle of the automobile as the origin of coordinates. The embodiment of the invention does not specifically limit the specific meaning of the target coordinate system, and the target coordinate system can be adjusted according to the actual detection scene.
Fig. 2 is a schematic diagram of the arrangement positions of the millimeter wave radar and the camera on the vehicle. As shown in fig. 2, the millimeter wave radar 210 is respectively disposed at the head and tail of the vehicle, and the camera 220 is disposed near the windshield in the middle of the vehicle. The millimeter wave radar sensing obstacle coordinate system is a rectangular coordinate system taking the millimeter wave radar as a coordinate origin. The camera sensing obstacle coordinate system is a rectangular coordinate system with the camera as the origin of coordinates. Because the coordinate systems corresponding to the sensing result of the millimeter wave radar and the sensing result of the camera are different, subsequent fusion operation cannot be performed, and the coordinate system conversion needs to be performed first.
Exemplarily, physical environment information around the vehicle is collected by the millimeter wave radar and the camera configured on the vehicle, yielding a reference millimeter wave radar sensing result and a reference camera sensing result; these reference results are then converted, through coordinate system conversion, into the millimeter wave radar sensing result and the camera sensing result in the target coordinate system. Specifically, after the reference results are obtained, the reference millimeter wave radar sensing result in the millimeter wave radar coordinate system is converted, according to the position of the millimeter wave radar in the whole vehicle, into an intermediate millimeter wave radar sensing result in the whole-vehicle coordinate system. That intermediate result is then converted, according to the position of the center of the front bumper in the whole vehicle, into the millimeter wave radar sensing result in the rectangular coordinate system whose origin is the center of the front bumper. Similarly, according to the position of the camera in the whole vehicle, the reference camera sensing result in the camera coordinate system is converted into an intermediate camera sensing result in the whole-vehicle coordinate system, which is then converted, according to the position of the center of the front bumper in the whole vehicle, into the camera sensing result in the rectangular coordinate system whose origin is the center of the front bumper.
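The two-stage conversion above (sensor frame to whole-vehicle frame, then whole-vehicle frame to the front-bumper target frame) can be sketched as a pair of planar rigid transforms. The function below is illustrative only: the mounting offsets, the 2D (x, y) representation, and the function name are assumptions, not part of the patent.

```python
import math

def sensor_to_target_frame(x, y, sensor_offset, bumper_offset, yaw=0.0):
    """Transform a point from a sensor's frame into a target frame whose
    origin is the center of the front bumper (illustrative sketch).

    x, y          : obstacle position in the sensor's frame (m)
    sensor_offset : (x, y) of the sensor in the whole-vehicle frame (m)
    bumper_offset : (x, y) of the front-bumper center in that frame (m)
    yaw           : sensor mounting yaw relative to the vehicle axis (rad)
    """
    # Stage 1: rotate by the mounting yaw, then translate by the sensor's
    # position to obtain the intermediate result in the whole-vehicle frame.
    xv = sensor_offset[0] + x * math.cos(yaw) - y * math.sin(yaw)
    yv = sensor_offset[1] + x * math.sin(yaw) + y * math.cos(yaw)
    # Stage 2: re-express the point relative to the front-bumper origin.
    return xv - bumper_offset[0], yv - bumper_offset[1]
```

With an assumed radar mounted 3.6 m forward of the whole-vehicle origin and a bumper center at 3.8 m, a target 10 m ahead of the radar ends up 9.8 m ahead of the bumper.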
It should be noted that obstacle targets detected at different timestamps can differ significantly, so timestamp calibration must be performed after the millimeter wave radar sensing result and the camera sensing result are placed in the same coordinate system. The timestamp can be taken as the time at which the domain controller acquires the millimeter wave radar sensing result, together with the time at which it acquires the camera sensing result within the current reporting period of the millimeter wave radar. Alternatively, the timestamp can be taken as the time at which the domain controller acquires the camera sensing result, together with the time at which it acquires the millimeter wave radar sensing result within the current reporting period of the camera.
For example, the millimeter wave radar reports the sensing result of the millimeter wave radar to the domain controller in the current reporting period. And the domain controller also acquires the sensing result of the camera in the current reporting period, and then takes the acquisition time of the sensing result of the millimeter wave radar as the timestamp of the sensing result of the millimeter wave radar and the acquisition time of the sensing result of the camera as the timestamp of the sensing result of the camera.
Or the camera reports the sensing result of the camera to the domain controller in the current reporting period. And the domain controller also acquires the sensing result of the millimeter wave radar in the current reporting period, and then takes the acquisition time of the sensing result of the camera as the timestamp of the sensing result of the camera and the acquisition time of the sensing result of the millimeter wave radar as the timestamp of the sensing result of the millimeter wave radar.
Exemplarily, calculating a timestamp difference value of a current millimeter wave radar sensing result and a current camera sensing result; if the timestamp difference is smaller than a set threshold, determining that the current millimeter wave radar sensing result and the current camera sensing result are time-synchronized millimeter wave radar sensing results and camera sensing results; otherwise, determining that the current sensing result of the millimeter wave radar and the current sensing result of the camera are not the sensing result of the millimeter wave radar and the sensing result of the camera which are time-synchronized. The set threshold may be a default value or a parameter manually set according to different application scenarios.
Optionally, the time stamps of the millimeter wave radar and the camera may be calibrated in advance, so that the time stamps of the millimeter wave radar and the camera are synchronized, and the effect of time synchronization of the sensing result output to the domain controller is ensured.
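The timestamp-difference check described above reduces to comparing the two acquisition times against a set threshold. A minimal sketch, in which the 25 ms default is an assumed value (the patent leaves the threshold as a configurable parameter):

```python
def is_time_synchronized(radar_ts, camera_ts, threshold_s=0.025):
    """Return True when the millimeter wave radar sensing result and the
    camera sensing result can be treated as time-synchronized, i.e. their
    timestamp difference is below the set threshold (illustrative value)."""
    return abs(radar_ts - camera_ts) < threshold_s
```

A pair of results 10 ms apart would pass; a pair 50 ms apart would be rejected and not used for this detection cycle.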
Step 120: determining, according to the millimeter wave radar sensing result and the camera sensing result, a first obstacle target identified jointly by the millimeter wave radar and the camera, a second obstacle target identified by the millimeter wave radar alone, and a third obstacle target identified by the camera alone.
The millimeter wave radar sensing result is a set of obstacle target information items, each representing an obstacle target sensed by the millimeter wave radar; likewise, the camera sensing result is a set of obstacle target information items, each representing an obstacle target sensed by the camera. Any obstacle target sensed by both the millimeter wave radar and the camera is taken as a first obstacle target. An obstacle target sensed by the millimeter wave radar alone is taken as a second obstacle target, and an obstacle target sensed by the camera alone is taken as a third obstacle target.
Exemplarily, each piece of obstacle target information in the camera sensing result is respectively matched with each piece of obstacle target information in the millimeter wave radar sensing result to obtain a type matching result, a transverse position error, a longitudinal position error and an orientation angle matching result;
if the type matching result, the transverse position error, the longitudinal position error and the orientation angle matching result of the current obstacle target all meet preset conditions, determining that the current obstacle target is a first obstacle target identified by the millimeter wave radar and the camera together;
otherwise, determining that the current obstacle target is a second obstacle target individually identified by the millimeter wave radar or a third obstacle target individually identified by the camera according to the source of the obstacle target information corresponding to the current obstacle target.
The preset condition may be understood as a set of conditions, that is, a first condition for determining a type matching result, a second condition for determining a lateral position error, a third condition for determining a longitudinal position error, a fourth condition for determining an orientation angle matching result, and the like are included. The preset condition can be a default value, and can also be manually adjusted or automatically adjusted according to actual driving scenes such as actual road conditions and the like.
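The matching step can be sketched as associating detections by type, lateral error, longitudinal error, and orientation angle. The thresholds, the dict field names, and the greedy first-match strategy below are illustrative assumptions; the patent only requires that all four preset conditions be satisfied for a joint identification.

```python
def match_targets(radar_targets, camera_targets,
                  max_lat_err=1.0, max_lon_err=2.0, max_heading_err=0.2):
    """Split detections into jointly identified pairs (first), radar-only
    targets (second), and camera-only targets (third). Each target is a
    dict with assumed keys 'type', 'x' (longitudinal, m), 'y' (lateral, m),
    and 'heading' (rad); thresholds are illustrative."""
    first, second, matched_cam = [], [], set()
    for r in radar_targets:
        hit = None
        for i, c in enumerate(camera_targets):
            if i in matched_cam:
                continue
            # All four preset conditions must hold for a joint identification.
            if (r['type'] == c['type']
                    and abs(r['y'] - c['y']) <= max_lat_err
                    and abs(r['x'] - c['x']) <= max_lon_err
                    and abs(r['heading'] - c['heading']) <= max_heading_err):
                hit = i
                break
        if hit is None:
            second.append(r)                        # radar-only target
        else:
            matched_cam.add(hit)
            first.append((r, camera_targets[hit]))  # jointly identified
    third = [c for i, c in enumerate(camera_targets) if i not in matched_cam]
    return first, second, third
```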
Step 130: fusing obstacle target information corresponding to the first obstacle target in the millimeter wave radar sensing result and the camera sensing result to obtain a first real sensing result.
The first real sensing result is a fusion result obtained by fusing the components of the obstacle target information corresponding to the first obstacle target. Specifically, the components participating in fusion may include the lateral distance, longitudinal distance, lateral speed, and longitudinal speed of the first obstacle target relative to the host vehicle in the millimeter wave radar sensing result, and the lateral distance, longitudinal distance, lateral speed, longitudinal speed, and obstacle type of the first obstacle target relative to the host vehicle in the camera sensing result. The former are the values recognized by the millimeter wave radar; the latter are the values recognized by the camera.
Exemplarily, the lateral distance of the first obstacle target relative to the host vehicle in the millimeter wave radar sensing result and the camera sensing result is fused to obtain the real lateral distance of the first obstacle target relative to the host vehicle;
fusing the longitudinal distance of the first obstacle target relative to the vehicle in the millimeter wave radar sensing result and the camera sensing result to obtain the real longitudinal distance of the first obstacle target relative to the vehicle;
fusing the transverse speed of the first obstacle target relative to the own vehicle in the millimeter wave radar sensing result and the camera sensing result to obtain the real transverse speed of the first obstacle target relative to the own vehicle;
fusing the longitudinal speed of the first obstacle target relative to the self-vehicle in the sensing result of the millimeter wave radar and the sensing result of the camera to obtain the real longitudinal speed of the first obstacle target relative to the self-vehicle;
taking the type of the first obstacle target in the camera sensing result as the real type of the first obstacle target after fusion;
and forming the first real sensing result according to the real transverse distance, the real longitudinal distance, the real transverse speed, the real longitudinal speed and the real type.
Specifically, a weighted sum or an average value of the lateral distances of the first obstacle target relative to the host vehicle, and the like, in the millimeter wave radar sensing result and the camera sensing result may be calculated, and the calculation result may be used as the true lateral distance of the first obstacle target relative to the host vehicle. And calculating the weighted sum or average value of the longitudinal distance of the first obstacle target relative to the vehicle in the sensing result of the millimeter wave radar and the sensing result of the camera, and taking the calculation result as the real longitudinal distance of the first obstacle target relative to the vehicle. And respectively determining the real transverse speed and the real longitudinal speed of the first obstacle target relative to the self-vehicle in a similar mode. And taking the barrier type of the first barrier target sensed by the camera as the real type of the fused first barrier target. Thus, a first true perception result including the above-mentioned true lateral distance, true longitudinal distance, true lateral velocity, true longitudinal velocity, and true type can be obtained.
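The weighted-sum fusion described above might look like the following sketch. The 0.7/0.3 weights and the field names are assumed for illustration (the patent equally allows a plain average); the obstacle type is taken from the camera, which classifies targets better.

```python
def fuse_joint_target(radar, cam, w_radar=0.7, w_cam=0.3):
    """Fuse a jointly identified first obstacle target by weighted
    averaging of distances and speeds; weights and key names are
    illustrative assumptions. The real type comes from the camera."""
    fused = {
        key: w_radar * radar[key] + w_cam * cam[key]
        for key in ('lat_dist', 'lon_dist', 'lat_speed', 'lon_speed')
    }
    fused['type'] = cam['type']  # camera's classification wins
    return fused
```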
Step 140: determining a second real sensing result and a third real sensing result corresponding respectively to the second obstacle target and the third obstacle target, and obtaining the real sensing result of the current detection based on the first, second, and third real sensing results.
It should be noted that an obstacle target detected by the millimeter wave radar alone may be a false identification, so the millimeter wave radar sensing result needs to be filtered to obtain a second real sensing result formed from real obstacle target information. Likewise, an obstacle target detected by the camera alone may be a false identification, so the camera sensing result needs to be filtered to obtain a third real sensing result formed from real obstacle target information.
The real sensing result of the detection is a set of obstacle target information obtained by performing fusion processing on the first obstacle target and performing filtering processing on the second obstacle target and the third obstacle target respectively. Specifically, the real sensing result of the current detection includes a first real sensing result, a second real sensing result, and a third real sensing result.
Illustratively, recording the continuous occurrence time of the second obstacle target in the millimeter wave radar sensing result;
obtaining an obstacle target confidence corresponding to the second obstacle target in the sensing result of the millimeter wave radar;
if the continuous occurrence time and the confidence coefficient of the obstacle target corresponding to the second obstacle target both meet preset conditions, taking obstacle target information corresponding to the second obstacle target as data in the second real sensing result;
otherwise, abandoning the obstacle target information corresponding to the second obstacle target as the data in the second real sensing result;
obtaining an obstacle target confidence corresponding to the third obstacle target in the camera sensing result;
if the confidence coefficient of the obstacle target corresponding to the third obstacle target meets a preset condition, taking the obstacle target information corresponding to the third obstacle target as data in the third real sensing result;
otherwise, abandoning the obstacle target information corresponding to the third obstacle target as the data in the third real sensing result;
and sequencing the data in the first real sensing result, the data in the second real sensing result and the data in the third real sensing result according to the position of the obstacle target relative to the vehicle to obtain the real sensing result of the current detection.
It should be noted that the millimeter wave radar reports the sensing result periodically, for example, the sensing result of the millimeter wave radar may be reported every 50 ms. And if the obstacle A is continuously reported by the millimeter wave radar for 3 times, determining that the continuous occurrence time of the obstacle A is 150 ms.
The preset conditions may be a condition set including a fifth condition for judging whether the continuous occurrence time meets the requirement, a sixth condition for judging whether the radar confidence meets the requirement, a seventh condition for judging whether the camera confidence meets the requirement, and the like. Specifically, for the continuous occurrence time, the preset condition is the fifth condition. For the confidence of an obstacle target in the sensing result of the millimeter wave radar, the preset condition is the sixth condition. For the confidence of an obstacle target in the sensing result of the camera, the preset condition is the seventh condition.
The position of the obstacle target relative to the host vehicle may be understood as a lateral distance or a longitudinal distance or the like of the obstacle target relative to the host vehicle. For example, the data in the first real sensing result, the data in the second real sensing result, and the data in the third sensing result may be sorted according to a lateral distance of the obstacle target with respect to the own vehicle.
Specifically, the continuous occurrence time of the second obstacle target is determined according to whether obstacle target information of the second obstacle target is continuously reported within a continuous reporting period of the millimeter wave radar. And obtaining the confidence coefficient of the obstacle target corresponding to the second obstacle target in the sensing result of the millimeter wave radar, and taking the confidence coefficient as the confidence coefficient of the obstacle target corresponding to the second obstacle target. And if the confidence coefficient of the obstacle target corresponding to the second obstacle target is greater than a preset threshold value and the continuous occurrence time of the second obstacle target exceeds a preset time, determining that the current second obstacle target is a real obstacle and is a real obstacle independently identified by the millimeter wave radar. Otherwise, the current second obstacle target is considered to be a non-real obstacle, and obstacle target information corresponding to the non-real obstacle is removed from the sensing result of the millimeter wave radar.
And obtaining the confidence coefficient of the obstacle target corresponding to the third obstacle target in the sensing result of the camera as the confidence coefficient of the obstacle target corresponding to the third obstacle target. And if the confidence coefficient of the obstacle target corresponding to the third obstacle target is greater than the set threshold, the current third obstacle target is considered to be a real obstacle and is the real obstacle independently identified by the camera. Otherwise, the current third obstacle target is considered to be a non-real obstacle, and obstacle target information corresponding to the non-real obstacle is removed from the camera sensing result.
And summarizing the first real sensing result, the second real sensing result and the third real sensing result to obtain a set of real sensing results; the obstacle targets represented by this set are sorted according to their positions relative to the vehicle, and the obstacle target information corresponding to each obstacle target is arranged according to the sorting result to obtain the real sensing result of the current detection.
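The filtering and merging steps above can be sketched as follows. The thresholds here (radar confidence above 80% with continuous occurrence above 1 s, camera confidence above 70%) are taken from the illustrative values in the detailed embodiment of steps 307 and 308; the dictionary field names are assumptions for illustration:

```python
# Assumed thresholds, following the illustrative values of steps 307-308.
RADAR_CONF_MIN = 0.8      # sixth preset condition: radar-only confidence
RADAR_DURATION_MIN = 1.0  # fifth preset condition: continuous occurrence time, s
CAMERA_CONF_MIN = 0.7     # seventh preset condition: camera-only confidence

def second_real_result(radar_only):
    """Keep radar-only targets whose confidence and duration both qualify."""
    return [t for t in radar_only
            if t["confidence"] > RADAR_CONF_MIN
            and t["duration"] > RADAR_DURATION_MIN]

def third_real_result(camera_only):
    """Keep camera-only targets whose confidence qualifies."""
    return [t for t in camera_only if t["confidence"] > CAMERA_CONF_MIN]

def real_sensing_result(first, radar_only, camera_only):
    """Merge the three partial results, sorted by lateral distance to the vehicle."""
    merged = first + second_real_result(radar_only) + third_real_result(camera_only)
    return sorted(merged, key=lambda t: t["lateral"])
```

The sort key could equally be the longitudinal distance; the text allows either.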
And 150, filtering the real sensing result of the current detection according to the accurate sensing result of the previous detection to obtain the accurate sensing result of the current detection.
The accurate sensing result of the previous detection is the set of obstacle target information obtained from the fusion, filtering and other operations of the detection immediately preceding the current one. The accurate sensing result of each detection serves as the data basis for scenarios such as cruise path planning.
Exemplarily, each piece of obstacle target information in the real sensing result of the current detection is respectively matched with each piece of obstacle target information in the accurate sensing result of the previous detection, and a continuously tracked obstacle target and a newly appeared obstacle target are determined according to the matching result;
for the continuously tracked obstacle target, continuing to use the identification information of the corresponding obstacle target in the accurate sensing result of the previous detection, and performing filtering processing on the real sensing result of the current detection based on the identification information and the accurate sensing result of the previous detection to obtain a first accurate sensing result;
and distributing new identification information to the newly appeared obstacle target to obtain a second accurate sensing result, and forming the accurate sensing result of the detection according to the first accurate sensing result and the second accurate sensing result.
Specifically, suppose that the real sensing result of the current detection is represented as [M0, M1, ……, Mm] and the accurate sensing result of the previous detection is represented as [N0, N1, ……, Nn], where M0, M1, ……, Mm represent the obstacle target information of m obstacle targets in the real sensing result of the current detection, N0, N1, ……, Nn represent the obstacle target information of n obstacle targets in the accurate sensing result of the previous detection, and m and n are natural numbers. For M0, the accurate sensing result of the previous detection is traversed, and M0 is matched against N0, N1, ……, Nn respectively to obtain a matching result. Taking the matching of M0 and N0 as an example, the lateral distance difference, longitudinal distance difference, lateral velocity difference, longitudinal velocity difference and heading angle difference between M0 and N0 are calculated. M0 and N0 are considered to match if the following conditions are satisfied simultaneously: the lateral distance difference is smaller than a first preset value, the longitudinal distance difference is smaller than a second preset value, the lateral velocity difference is smaller than a third preset value, the longitudinal velocity difference is smaller than a fourth preset value, and the heading angle difference is smaller than a fifth preset value. M0 is matched against N1, ……, Nn by similar means to obtain all the sensing results in the accurate sensing result of the previous detection that match M0. Similarly, all the sensing results in the accurate sensing result of the previous detection that match M1, ……, Mm are determined respectively.
A weighted optimal matching algorithm is then adopted: among all the sensing results in the accurate sensing result of the previous detection that match M0, M1, ……, Mm, the optimal matching result for each of M0, M1, ……, Mm is determined respectively.
For convenience of description, assuming that n is 4 and m is 5, the weight factor table shown in table 1 may be used to determine the optimal matching result between the real sensing result of the current detection and the accurate sensing result of the previous detection.
Table 1 is a weight factor table of the actual sensing result of the current detection and the accurate sensing result of the previous detection.
        N0    N1    N2    N3    N4
M0      K1    K2    K3    K4    K5
M1      K6    K7    K8    K9    K10
M2      K11   K12   K13   K14   K15
M3      K16   K17   K18   K19   K20
M4      K21   K22   K23   K24   K25
M5      K26   K27   K28   K29   K30
Wherein Ki represents the weight factor of the match between Nx and My, the weight factor being based on the lateral distance difference, longitudinal distance difference, lateral velocity difference, longitudinal velocity difference and heading angle difference between Nx and My; i = 1, 2, 3, ……, 30; x = 0, 1, 2, 3, 4; y = 0, 1, 2, 3, 4, 5. If Nx and My do not match, the corresponding Ki is set to a default value; for example, Ki may be set to 0 when there is no match. The weight factor is determined from the result of adding the lateral distance difference, the longitudinal distance difference, the lateral velocity difference, the longitudinal velocity difference and the heading angle difference (the heading angle difference may need to be multiplied by a preset scale factor). It should be noted that the larger the addition result, the worse the match is considered to be, and accordingly the smaller the weight factor is set; conversely, the larger the weight factor is set.
Based on the weight factors in Table 1, the KM (Kuhn-Munkres) optimal matching algorithm is adopted to determine, in the accurate sensing result of the previous detection, the optimal matching results for M0, M1, ……, M5 respectively.
And for the real sensing result of the detection, marking the obstacle target with the optimal matching result as a continuously tracked obstacle target, and marking the rest obstacle targets as newly appeared obstacle targets. And for the continuously tracked obstacle target, the identification information of the corresponding obstacle target in the accurate sensing result of the previous detection is continuously used. And taking the obstacle target information in the accurate sensing result of the previous detection as an observed value, and performing Kalman filtering processing on the real sensing result of the current detection according to the accurate sensing result of the previous detection with the same identification information to obtain a first accurate sensing result. And for the newly appeared obstacle target, distributing new identification information to the newly appeared obstacle target to obtain a second accurate sensing result. And combining the first accurate sensing result and the second accurate sensing result to form an accurate sensing result of the detection.
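A minimal sketch of this weighted matching between the current real sensing result M and the previous accurate result N. The weight of a pair is the negated sum of the five differences (heading-angle difference scaled by an assumed factor), so larger summed differences yield smaller weights, as described above. The patent names the KM (Kuhn-Munkres) algorithm; for the small target counts here, a brute-force search over assignments stands in for it:

```python
from itertools import permutations

ANGLE_SCALE = 0.1  # assumed scale factor for the heading-angle difference

def pair_weight(m, n):
    """Weight factor K for one (M, N) pair: smaller summed difference -> larger weight."""
    diff = (abs(m["x"] - n["x"]) + abs(m["y"] - n["y"])
            + abs(m["vx"] - n["vx"]) + abs(m["vy"] - n["vy"])
            + ANGLE_SCALE * abs(m["heading"] - n["heading"]))
    return -diff

def best_matching(current, previous):
    """Assign each current target to a distinct previous target, maximizing total weight.

    Assumes len(current) <= len(previous); unmatched previous targets are dropped.
    Returns a dict mapping current index -> previous index.
    """
    best, best_total = {}, float("-inf")
    for perm in permutations(range(len(previous)), len(current)):
        total = sum(pair_weight(current[i], previous[j]) for i, j in enumerate(perm))
        if total > best_total:
            best, best_total = dict(enumerate(perm)), total
    return best
```

A production implementation would use the KM/Hungarian algorithm proper (polynomial time) and would first zero out pairs that fail the five threshold conditions.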
According to the technical scheme of this embodiment, the sensing result of the millimeter wave radar and the sensing result of the camera are time-synchronized. From the time-synchronized results, a first obstacle target identified by the millimeter wave radar and the camera together, a second obstacle target identified by the millimeter wave radar alone, and a third obstacle target identified by the camera alone are determined. Fusion processing is performed on the first obstacle target to obtain a first real sensing result, and the real sensing result of the current detection is formed based on the first real sensing result, the second real sensing result corresponding to the second obstacle target, and the third real sensing result corresponding to the third obstacle target. The real sensing result of the current detection is then filtered according to the accurate sensing result of the previous detection to obtain the accurate sensing result of the current detection. By obtaining a more accurate obstacle sensing result through fusion and filtering processing, and by taking into account the obstacle targets identified by the millimeter wave radar alone and by the camera alone, missed identification is avoided. This solves the problem of false identification or missed identification in the related-art scheme, realizes accurate output of obstacle target sensing results, and provides more accurate sensing information for the advanced cruise function of the automobile, thereby achieving more accurate vehicle control and a better driving experience for the driver.
Fig. 3 is a flowchart of another method for detecting an obstacle target according to an embodiment of the present invention, and the present embodiment describes the method for detecting an obstacle target in detail based on the above embodiment. As shown in fig. 3, the method includes:
and 301, respectively acquiring physical environment information around the vehicle by the domain controller through the millimeter wave radar and the camera.
And step 302, the millimeter wave radar processes the acquired physical environment information through its own controller to obtain a reference millimeter wave radar sensing result, and outputs the reference millimeter wave radar sensing result to the domain controller.
And 303, the camera processes the acquired physical environment information through its own controller to obtain a reference camera sensing result, and outputs the reference camera sensing result to the domain controller.
And step 304, the domain controller unifies the reference millimeter wave radar sensing result and the reference camera sensing result into a coordinate system taking the center of the front bumper of the automobile as the coordinate origin through coordinate system conversion, obtaining the millimeter wave radar sensing result and the camera sensing result.
And 305, the domain controller carries out time stamp calibration processing on the sensing result of the millimeter wave radar and the sensing result of the camera to obtain the sensing result of the millimeter wave radar and the camera at the same time.
And step 306, preprocessing the sensing results of the millimeter wave radar and the camera at the same time by the domain controller through a matching algorithm to determine the obstacle target identified by the camera and the millimeter wave radar together, the obstacle target identified by the millimeter wave radar alone and the obstacle target identified by the camera alone.
The matching algorithm may specifically be as follows. For each piece of obstacle target information Object_Camera in the camera sensing result, each piece of obstacle target information Object_Radar in the millimeter wave radar sensing result is traversed. If the type of an Object_Radar in the millimeter wave radar sensing result is consistent with that of the Object_Camera, the lateral position error between them is smaller than the lateral position threshold, the longitudinal position error is smaller than the longitudinal position threshold, and the orientation angles of Object_Radar and Object_Camera relative to the self-vehicle are both smaller than the orientation angle threshold (where the orientation angle threshold is related to the distance between the obstacle target and the self-vehicle), the two are considered to be the same obstacle, recognized by the camera and the millimeter wave radar together. If any one of these conditions is not met, the match is not considered successful, and the obstacle target is classified as an obstacle identified by the millimeter wave radar alone or by the camera alone, respectively. Since the orientation angle changes with the distance between the obstacle target and the vehicle, a plurality of orientation angle thresholds may be set according to that distance. For example, when the lateral or longitudinal distance of the obstacle target from the host vehicle is less than 15 m, the orientation angle threshold is set to 30°. When the lateral or longitudinal distance is greater than 15 m and less than 30 m, the orientation angle threshold is set to 50°. When the lateral or longitudinal distance is greater than 30 m, the orientation angle threshold is set to 80°.
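The matching rule of step 306 can be sketched as below. The distance-banded orientation angle thresholds (30°/50°/80° at 15 m and 30 m) come from the text; the lateral and longitudinal position thresholds, the field names, and the literal reading that both targets' orientation angles must fall under the threshold are assumptions:

```python
LAT_THRESHOLD = 2.0  # m, assumed lateral position threshold
LON_THRESHOLD = 3.0  # m, assumed longitudinal position threshold

def orientation_threshold(lateral, longitudinal):
    """Orientation angle threshold (deg) grows with distance from the self-vehicle."""
    d = max(abs(lateral), abs(longitudinal))
    if d < 15:
        return 30.0
    if d < 30:
        return 50.0
    return 80.0

def same_obstacle(obj_camera, obj_radar):
    """True when type, position errors and both orientation angles all qualify."""
    if obj_camera["type"] != obj_radar["type"]:
        return False
    if abs(obj_camera["x"] - obj_radar["x"]) >= LAT_THRESHOLD:
        return False
    if abs(obj_camera["y"] - obj_radar["y"]) >= LON_THRESHOLD:
        return False
    thr = orientation_threshold(obj_camera["x"], obj_camera["y"])
    return abs(obj_camera["heading"]) < thr and abs(obj_radar["heading"]) < thr
```

An unmatched Object_Radar then feeds the radar-only path (step 307) and an unmatched Object_Camera the camera-only path (step 308).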
And 307, filtering the sensing result independently identified by the millimeter wave radar by the domain controller to filter obstacle target information with low obstacle confidence in the sensing result to obtain a second real sensing result.
Illustratively, the confidence degree and the continuous occurrence time of any obstacle target are determined by traversing the sensing result of the millimeter wave radar for individual recognition. If the confidence coefficient of the obstacle target is greater than 80% and the continuous occurrence time is greater than 1s (because the false alarm and missing report rate of the millimeter wave radar is relatively high, whether the obstacle target is the obstacle target which is stably reported or not needs to be determined), the obstacle target is considered to be a real obstacle and is an obstacle which is independently identified by the millimeter wave radar, otherwise, the confidence coefficient of the obstacle is considered to be too low, and the obstacle is removed.
And 308, filtering the sensing result independently identified by the camera by the domain controller to filter out obstacle target information with low obstacle confidence in the sensing result to obtain a third real sensing result.
Illustratively, the sensing result independently identified by the camera is traversed, and for any obstacle target, the corresponding obstacle target confidence is determined. If the confidence of the obstacle target is greater than 70%, the obstacle target is considered to be a real obstacle separately identified by the camera; otherwise, it is considered an unreliable obstacle and is removed.
And 309, for the obstacle identified by the millimeter wave radar and the camera together, the domain controller integrates the respective advantages of the two sensors to fuse the obstacle target information to obtain the real obstacle target information of the obstacle.
Illustratively, the fusion algorithm is as follows:
the relative self-vehicle transverse distance after fusion is as follows: x _ Fusion 80% × X _ Camera + 20% × X _ rad;
the relative self-vehicle longitudinal distance after fusion is as follows: y _ Fusion ═ 20% Y _ Camera + 80% Y _ Radar;
relative self-vehicle transverse speed after fusion: xseed _ Fusion is 80% xseed _ Camera + 20% xseed _ Radar;
relative self-vehicle longitudinal speed after fusion: yspeed _ Fusion is 20% Yspeed _ Camera + 80% Yspeed _ Rada;
post-fusion obstacle type: type _ Fusion is Type _ Camera.
Wherein X_Camera, Y_Camera, Xspeed_Camera, Yspeed_Camera and Type_Camera are respectively the lateral distance, longitudinal distance, lateral velocity, longitudinal velocity and obstacle type (for example, stationary or moving) of the obstacle identified by the camera, relative to the vehicle. X_Radar, Y_Radar, Xspeed_Radar and Yspeed_Radar are respectively the lateral distance, longitudinal distance, lateral velocity and longitudinal velocity of the obstacle identified by the millimeter wave radar, relative to the vehicle. Note that the 80% and 20% above are fusion weight factors, which are default values set in advance according to the detection advantages of each sensor.
And obtaining the relative real position, speed and category information of the obstacle at the moment after fusion, namely the first real sensing result.
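The fusion formulas of step 309 transcribe directly into code: the camera dominates laterally (80/20) and the radar longitudinally (20/80), matching each sensor's measurement strength, and the camera's classification is kept. The dictionary field names are illustrative assumptions:

```python
# Fusion weight factors from the text: camera-dominant laterally,
# radar-dominant longitudinally.
W_CAM_LAT, W_RADAR_LAT = 0.8, 0.2
W_CAM_LON, W_RADAR_LON = 0.2, 0.8

def fuse(camera, radar):
    """Fuse one co-detected obstacle into a first-real-sensing-result entry."""
    return {
        "x": W_CAM_LAT * camera["x"] + W_RADAR_LAT * radar["x"],      # X_Fusion
        "y": W_CAM_LON * camera["y"] + W_RADAR_LON * radar["y"],      # Y_Fusion
        "vx": W_CAM_LAT * camera["vx"] + W_RADAR_LAT * radar["vx"],   # Xspeed_Fusion
        "vy": W_CAM_LON * camera["vy"] + W_RADAR_LON * radar["vy"],   # Yspeed_Fusion
        "type": camera["type"],                                        # Type_Fusion
    }
```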
And 310, the domain controller collects the first real sensing result, the second real sensing result and the third real sensing result to obtain a real sensing result of the detection.
And 311, the domain controller processes the accurate sensing result of the previous detection and the real sensing result of the current detection through a matching algorithm.
Illustratively, each piece of obstacle target information in the real sensing result of the current detection is respectively matched with each piece of obstacle target information in the accurate sensing result of the previous detection, and a continuously tracked obstacle target and a newly appeared obstacle target are determined according to the matching result. For the continuously tracked obstacle target, the identification information of the corresponding obstacle target in the accurate sensing result of the previous detection is used to execute the filtering algorithm of step 312. And for the newly appeared obstacle target, distributing new identification information to the newly appeared obstacle target to obtain a second accurate sensing result.
And step 312, the domain controller performs kalman filtering processing on the real sensing result of the current detection according to the accurate sensing result of the previous detection to obtain a first accurate sensing result.
It should be noted that the detection of obstacle targets is a continuous process, and the sensing result of each detection carries error; even the sensing result of the same obstacle target may jump significantly between detections. To obtain a stable sensing result and a stable tracking result, the accurate sensing result of the previous detection can be combined, and the real sensing result of the current detection filtered, to obtain the first accurate sensing result. The filtering algorithm is not limited to the Kalman filtering algorithm and may be another filtering algorithm.
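As a sketch, a scalar Kalman update of the kind step 312 describes, applied to one tracked quantity such as the lateral distance: the previous accurate result acts as the prediction and the current real result as the measurement. The noise variances are assumed illustrative values:

```python
def kalman_update(prev_estimate, prev_variance, measurement,
                  process_var=0.1, measurement_var=0.5):
    """One predict/update cycle for a single tracked quantity."""
    # Predict: carry the previous estimate forward, inflating uncertainty.
    pred, pred_var = prev_estimate, prev_variance + process_var
    # Update: blend prediction and measurement by the Kalman gain.
    gain = pred_var / (pred_var + measurement_var)
    estimate = pred + gain * (measurement - pred)
    variance = (1 - gain) * pred_var
    return estimate, variance
```

A full tracker would run a vector-state filter over position and velocity jointly, but the smoothing effect is the same: the filtered estimate lies between the previous accurate result and the new measurement, damping jumps.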
And 313, obtaining an accurate sensing result of the detection after filtering by the domain controller.
Illustratively, the accurate sensing result of the current detection is formed according to the first accurate sensing result and the second accurate sensing result.
And step 314, outputting the accurate sensing result of the detection to the planning control module by the domain controller.
The planning control module is a functional module for realizing the cruise path planning. For example, the planning control module implements cruise path planning by executing planning control program code based on accurate sensing results of the respective detections. Alternatively, the planning control module may be configured at the domain controller or exist independently of the domain controller.
According to the embodiment of the invention, the unification of the millimeter wave radar sensing barrier coordinate system and the camera sensing barrier coordinate system is completed through the change of the coordinate system; obtaining a fused real sensing result through matched filtering fusion operation, and comprehensively considering all millimeter wave radar independent sensing results and camera independent sensing results of the barrier targets with stable time domains and high confidence coefficients to obtain a real sensing result of the detection; the Kalman filtering processing is carried out on the real sensing result detected this time by combining the accurate detection result detected last time, so that the accurate output of the position, the speed and the type information of the obstacle target is realized, and more accurate sensing information is provided for the advanced cruise function of the automobile, so that more accurate vehicle control is realized, and better driving experience is provided for a driver.
Fig. 4 is a block diagram of a structure of an obstacle target detection apparatus according to an embodiment of the present invention. The device can be suitable for intelligent driving scenes, and accurate output of sensing results of the obstacle targets is achieved by executing the obstacle target detection method. The apparatus may be implemented by software and/or hardware and is generally configured in a domain controller. As shown in fig. 4, the apparatus includes:
a sensing result obtaining module 410, configured to obtain a millimeter wave radar sensing result and a camera sensing result after time synchronization;
the sensing result matching module 420 is configured to determine, according to the sensing result of the millimeter wave radar and the sensing result of the camera, a first obstacle target identified by the millimeter wave radar and the camera together, a second obstacle target identified by the millimeter wave radar alone, and a third obstacle target identified by the camera alone;
the sensing result fusion module 430 is configured to fuse obstacle target information corresponding to the first obstacle target in the millimeter wave radar sensing result and the camera sensing result to obtain a first real sensing result;
a real result determining module 440, configured to determine a second real sensing result and a third real sensing result that correspond to the second obstacle target and the third obstacle target, respectively, and obtain a real sensing result of the current detection based on the first real sensing result, the second real sensing result, and the third real sensing result;
and the accurate result determining module 450 is configured to perform filtering processing on the real sensing result of the current detection according to the accurate sensing result of the previous detection, so as to obtain the accurate sensing result of the current detection.
The embodiment of the invention provides an obstacle target detection device, which obtains a more accurate obstacle sensing result through fusion and filtering processing, considers an obstacle target independently identified by a millimeter wave radar and an obstacle independently identified by a camera, avoids missing identification, solves the problem of false identification or missing identification in the scheme of the related technology, realizes accurate output of the sensing result of the obstacle target, provides more accurate sensing information for the advanced cruise function of an automobile, further realizes more accurate vehicle control and provides better driving experience for a driver.
Optionally, the apparatus further comprises:
the sensing result acquisition module is used for acquiring physical environment information around the vehicle through a millimeter wave radar and a camera which are configured on the vehicle before acquiring the millimeter wave radar sensing result and the camera sensing result after time synchronization to obtain a reference millimeter wave radar sensing result and a reference camera sensing result which correspond to the physical environment information;
and the coordinate system conversion module is used for converting the reference millimeter wave radar sensing result and the reference camera sensing result into a millimeter wave radar sensing result and a camera sensing result in a target coordinate system through coordinate system conversion.
Optionally, the sensing result obtaining module 410 is specifically configured to:
calculating a timestamp difference value of a current sensing result of the millimeter wave radar and a current sensing result of the camera;
and if the timestamp difference is smaller than the set threshold, determining that the current sensing result of the millimeter wave radar and the current sensing result of the camera are the sensing result of the millimeter wave radar and the sensing result of the camera which are synchronized in time.
Optionally, the sensing result matching module 420 is specifically configured to:
aiming at each obstacle target information in the camera sensing result, respectively matching each obstacle target information in the millimeter wave radar sensing result to obtain a type matching result, a transverse position error, a longitudinal position error and an orientation angle matching result;
if the type matching result, the transverse position error, the longitudinal position error and the orientation angle matching result of the current obstacle target all meet preset conditions, determining that the current obstacle target is a first obstacle target identified by the millimeter wave radar and the camera together;
otherwise, determining that the current obstacle target is a second obstacle target individually identified by the millimeter wave radar or a third obstacle target individually identified by the camera according to the source of the obstacle target information corresponding to the current obstacle target.
Optionally, the sensing result fusion module 430 is specifically configured to:
fusing the transverse distance of the first obstacle target relative to the vehicle in the millimeter wave radar sensing result and the camera sensing result to obtain the real transverse distance of the first obstacle target relative to the vehicle;
fusing the longitudinal distance of the first obstacle target relative to the vehicle in the millimeter wave radar sensing result and the camera sensing result to obtain the real longitudinal distance of the first obstacle target relative to the vehicle;
fusing the transverse speed of the first obstacle target relative to the own vehicle in the millimeter wave radar sensing result and the camera sensing result to obtain the real transverse speed of the first obstacle target relative to the own vehicle;
fusing the longitudinal speed of the first obstacle target relative to the self-vehicle in the sensing result of the millimeter wave radar and the sensing result of the camera to obtain the real longitudinal speed of the first obstacle target relative to the self-vehicle;
taking the type of the first obstacle target in the camera sensing result as the real type of the first obstacle target after fusion;
and forming the first real sensing result according to the real transverse distance, the real longitudinal distance, the real transverse speed, the real longitudinal speed and the real type.
Optionally, the real result determining module 440 is specifically configured to:
recording the continuous occurrence time of the second obstacle target in the millimeter wave radar sensing result;
obtaining an obstacle target confidence corresponding to the second obstacle target in the millimeter wave radar sensing result;
if the continuous occurrence time and the obstacle target confidence corresponding to the second obstacle target both meet preset conditions, taking the obstacle target information corresponding to the second obstacle target as data in the second real sensing result;
otherwise, discarding the obstacle target information corresponding to the second obstacle target rather than using it as data in the second real sensing result;
obtaining an obstacle target confidence corresponding to the third obstacle target in the camera sensing result;
if the obstacle target confidence corresponding to the third obstacle target meets a preset condition, taking the obstacle target information corresponding to the third obstacle target as data in the third real sensing result;
otherwise, discarding the obstacle target information corresponding to the third obstacle target rather than using it as data in the third real sensing result;
and sorting the data in the first real sensing result, the data in the second real sensing result and the data in the third real sensing result according to the position of each obstacle target relative to the own vehicle to obtain the real sensing result of the current detection.
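The gating and sorting steps above can be sketched as follows. The "preset conditions" are not disclosed in the patent, so the thresholds, the sort key, and all function and field names are illustrative assumptions:

```python
def keep_radar_only_target(occurrence_time_s, confidence,
                           min_time_s=0.5, min_confidence=0.7):
    # A second (radar-only) obstacle target is kept only if it has appeared
    # continuously for long enough AND its confidence is high enough.
    return occurrence_time_s >= min_time_s and confidence >= min_confidence


def keep_camera_only_target(confidence, min_confidence=0.7):
    # A third (camera-only) obstacle target is gated on confidence alone.
    return confidence >= min_confidence


def assemble_real_result(first, second, third):
    # Merge the three partial results and sort by position relative to the
    # own vehicle; longitudinal-then-lateral distance is one plausible order.
    targets = first + second + third
    return sorted(targets, key=lambda t: (t["long_dist"], abs(t["lat_dist"])))
```

`assemble_real_result` returns the real sensing result of the current detection as one ordered list.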
Optionally, the accurate result determining module 450 is specifically configured to:
for each piece of obstacle target information in the real sensing result of the current detection, matching it against each piece of obstacle target information in the accurate sensing result of the previous detection, and determining continuously tracked obstacle targets and newly appearing obstacle targets according to the matching results;
for a continuously tracked obstacle target, reusing the identification information of the corresponding obstacle target in the accurate sensing result of the previous detection, and filtering the real sensing result of the current detection based on the identification information and the accurate sensing result of the previous detection to obtain a first accurate sensing result;
and assigning new identification information to each newly appearing obstacle target to obtain a second accurate sensing result, and forming the accurate sensing result of the current detection from the first accurate sensing result and the second accurate sensing result.
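The tracking step above — reusing identifiers for matched targets, assigning fresh ones to new targets, and filtering against the previous accurate result — can be sketched as below. The patent does not specify the matching criterion or the filter; a caller-supplied predicate and a first-order low-pass filter with gain `alpha` stand in for them, and all names are assumptions:

```python
import itertools

_new_ids = itertools.count(1)  # source of identifiers for new targets


def track_and_filter(current, previous, same_target, alpha=0.6):
    # current / previous: lists of obstacle-target dicts ('id' present in
    # previous); same_target(cur, prev) -> bool decides whether two records
    # describe the same physical obstacle.
    unmatched_prev = list(previous)
    accurate = []
    for cur in current:
        prev = next((p for p in unmatched_prev if same_target(cur, p)), None)
        if prev is not None:
            unmatched_prev.remove(prev)
            cur = dict(cur, id=prev["id"])  # reuse the tracked target's identifier
            # Low-pass filter the kinematic fields toward the previous result.
            for k in ("lat_dist", "long_dist", "lat_speed", "long_speed"):
                cur[k] = alpha * cur[k] + (1.0 - alpha) * prev[k]
        else:
            cur = dict(cur, id=next(_new_ids))  # newly appeared obstacle target
        accurate.append(cur)
    return accurate
```

The returned list plays the role of the current detection's accurate sensing result (first and second accurate sensing results combined).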
The obstacle target detection apparatus provided by the embodiment of the present invention can execute the obstacle target detection method provided by any embodiment of the present invention, and has the functional modules and beneficial effects corresponding to the executed method.
It should be noted that, in the above embodiment of the obstacle target detection apparatus, the included units and modules are divided only according to functional logic, and the division is not limited thereto as long as the corresponding functions can be realized; in addition, the specific names of the functional units are only for ease of distinguishing them from each other and are not intended to limit the protection scope of the present invention.
Fig. 5 is a schematic structural diagram of a domain controller according to an embodiment of the present invention, as shown in fig. 5, the domain controller includes a processor 50 and a memory 51; the number of the processors 50 in the domain controller may be one or more, and one processor 50 is taken as an example in fig. 5; the processor 50 and the memory 51 in the domain controller may be connected by a bus or other means, and the connection by the bus is exemplified in fig. 5.
The memory 51 is used as a computer-readable storage medium, and can be used for storing software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the obstacle target detection method in the embodiment of the present invention (for example, the sensing result obtaining module 410, the sensing result matching module 420, the sensing result fusion module 430, the real result determining module 440, and the accurate result determining module 450 in the obstacle target detection apparatus). The processor 50 executes various functional applications of the domain controller and data processing by running software programs, instructions, and modules stored in the memory 51, that is, implements the above-described obstacle target detection method.
The memory 51 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system and an application program required for at least one function; the storage data area may store data created according to the use of the terminal, and the like. Further, the memory 51 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, the memory 51 may further include memory remotely located from the processor 50, which may be connected to the domain controller over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
An embodiment of the present invention further provides a vehicle, including: millimeter wave radar, camera and domain controller.
The millimeter wave radar is used for collecting physical environment information around the self-vehicle and outputting a millimeter wave sensing result;
the camera is used for collecting physical environment information around the self-vehicle and outputting a camera sensing result;
and, a domain controller according to an embodiment of the present invention.
Embodiments of the present invention also provide a storage medium containing computer-executable instructions, which when executed by a computer processor, perform a method of obstacle target detection, the method comprising:
acquiring a millimeter wave radar sensing result and a camera sensing result after time synchronization;
determining a first obstacle target identified by the millimeter wave radar and the camera together, a second obstacle target identified by the millimeter wave radar alone and a third obstacle target identified by the camera alone according to the sensing result of the millimeter wave radar and the sensing result of the camera;
fusing obstacle target information corresponding to the first obstacle target in the millimeter wave radar sensing result and the camera sensing result to obtain a first real sensing result;
determining a second real sensing result and a third real sensing result respectively corresponding to the second obstacle target and the third obstacle target, and obtaining a real sensing result of the detection based on the first real sensing result, the second real sensing result and the third real sensing result;
and filtering the real sensing result of the current detection according to the accurate sensing result of the previous detection to obtain the accurate sensing result of the current detection.
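The first of the five steps above — obtaining time-synchronized sensing results — is elaborated in claim 3 as a timestamp-difference check, which can be sketched as follows (the 50 ms threshold and the function name are assumed values, not taken from the patent):

```python
def time_synchronized(radar_timestamp_s, camera_timestamp_s, threshold_s=0.05):
    # A radar/camera frame pair counts as time-synchronized when the
    # absolute timestamp difference is below the set threshold.
    return abs(radar_timestamp_s - camera_timestamp_s) < threshold_s
```

Only frame pairs passing this check would be handed to the subsequent matching and fusion steps.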
Of course, the storage medium containing the computer-executable instructions provided by the embodiments of the present invention is not limited to the method operations described above, and may also perform related operations in the obstacle target detection method provided by any embodiments of the present invention.
From the above description of the embodiments, it will be apparent to those skilled in the art that the present invention can be implemented by software plus necessary general-purpose hardware, and certainly can also be implemented by hardware alone, although the former is the preferred embodiment in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which can be stored in a computer-readable storage medium, such as a floppy disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a FLASH memory (FLASH), a hard disk or an optical disk of a computer, and which includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods according to the embodiments of the present invention.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. An obstacle target detection method, comprising:
acquiring a millimeter wave radar sensing result and a camera sensing result after time synchronization;
determining a first obstacle target identified by the millimeter wave radar and the camera together, a second obstacle target identified by the millimeter wave radar alone and a third obstacle target identified by the camera alone according to the sensing result of the millimeter wave radar and the sensing result of the camera;
fusing obstacle target information corresponding to the first obstacle target in the millimeter wave radar sensing result and the camera sensing result to obtain a first real sensing result;
determining a second real sensing result and a third real sensing result respectively corresponding to the second obstacle target and the third obstacle target, and obtaining a real sensing result of the detection based on the first real sensing result, the second real sensing result and the third real sensing result;
and filtering the real sensing result of the current detection according to the accurate sensing result of the previous detection to obtain the accurate sensing result of the current detection.
2. The method according to claim 1, before obtaining the millimeter wave radar sensing result and the camera sensing result after time synchronization, further comprising:
respectively acquiring physical environment information around a vehicle through a millimeter wave radar and a camera which are configured on the vehicle to obtain a reference millimeter wave radar sensing result and a reference camera sensing result which correspond to the physical environment information;
and converting the reference millimeter wave radar sensing result and the reference camera sensing result into a millimeter wave radar sensing result and a camera sensing result in a target coordinate system through coordinate system conversion.
3. The method according to claim 1 or 2, wherein the obtaining of the millimeter wave radar sensing result and the camera sensing result after time synchronization includes:
calculating a timestamp difference value of a current sensing result of the millimeter wave radar and a current sensing result of the camera;
and if the timestamp difference is smaller than the set threshold, determining that the current sensing result of the millimeter wave radar and the current sensing result of the camera are the sensing result of the millimeter wave radar and the sensing result of the camera which are synchronized in time.
4. The method according to claim 1, wherein the determining, according to the sensing result of the millimeter wave radar and the sensing result of the camera, a first obstacle target recognized by the millimeter wave radar and the camera together, a second obstacle target recognized by the millimeter wave radar alone, and a third obstacle target recognized by the camera alone comprises:
for each piece of obstacle target information in the camera sensing result, matching it against each piece of obstacle target information in the millimeter wave radar sensing result to obtain a type matching result, a transverse position error, a longitudinal position error and an orientation angle matching result;
if the type matching result, the transverse position error, the longitudinal position error and the orientation angle matching result of the current obstacle target all meet preset conditions, determining that the current obstacle target is a first obstacle target identified by the millimeter wave radar and the camera together;
otherwise, determining that the current obstacle target is a second obstacle target individually identified by the millimeter wave radar or a third obstacle target individually identified by the camera according to the source of the obstacle target information corresponding to the current obstacle target.
5. The method according to claim 1, wherein the fusing obstacle target information corresponding to the first obstacle target in the millimeter wave radar sensing result and the camera sensing result to obtain a first true sensing result includes:
fusing the transverse distance of the first obstacle target relative to the vehicle in the millimeter wave radar sensing result and the camera sensing result to obtain the real transverse distance of the first obstacle target relative to the vehicle;
fusing the longitudinal distance of the first obstacle target relative to the vehicle in the millimeter wave radar sensing result and the camera sensing result to obtain the real longitudinal distance of the first obstacle target relative to the vehicle;
fusing the transverse speed of the first obstacle target relative to the own vehicle in the millimeter wave radar sensing result and the camera sensing result to obtain the real transverse speed of the first obstacle target relative to the own vehicle;
fusing the longitudinal speed of the first obstacle target relative to the own vehicle in the millimeter wave radar sensing result and the camera sensing result to obtain the real longitudinal speed of the first obstacle target relative to the own vehicle;
taking the type of the first obstacle target in the camera sensing result as the real type of the first obstacle target after fusion;
and forming the first real sensing result according to the real transverse distance, the real longitudinal distance, the real transverse speed, the real longitudinal speed and the real type.
6. The method according to claim 1, wherein the determining a second real sensing result and a third real sensing result corresponding to the second obstacle target and the third obstacle target, respectively, and obtaining the real sensing result of the current detection based on the first real sensing result, the second real sensing result, and the third real sensing result comprises:
recording the continuous occurrence time of the second obstacle target in the sensing result of the millimeter wave radar;
obtaining an obstacle target confidence corresponding to the second obstacle target in the sensing result of the millimeter wave radar;
if the continuous occurrence time and the obstacle target confidence corresponding to the second obstacle target both meet preset conditions, taking the obstacle target information corresponding to the second obstacle target as data in the second real sensing result;
otherwise, discarding the obstacle target information corresponding to the second obstacle target rather than using it as data in the second real sensing result;
obtaining an obstacle target confidence corresponding to the third obstacle target in the camera sensing result;
if the obstacle target confidence corresponding to the third obstacle target meets a preset condition, taking the obstacle target information corresponding to the third obstacle target as data in the third real sensing result;
otherwise, discarding the obstacle target information corresponding to the third obstacle target rather than using it as data in the third real sensing result;
and sorting the data in the first real sensing result, the data in the second real sensing result and the data in the third real sensing result according to the position of each obstacle target relative to the own vehicle to obtain the real sensing result of the current detection.
7. The method according to claim 1, wherein the filtering the real sensing result of the current detection according to the accurate sensing result of the previous detection to obtain the accurate sensing result of the current detection comprises:
for each piece of obstacle target information in the real sensing result of the current detection, matching it against each piece of obstacle target information in the accurate sensing result of the previous detection, and determining continuously tracked obstacle targets and newly appearing obstacle targets according to the matching results;
for a continuously tracked obstacle target, reusing the identification information of the corresponding obstacle target in the accurate sensing result of the previous detection, and filtering the real sensing result of the current detection based on the identification information and the accurate sensing result of the previous detection to obtain a first accurate sensing result;
and assigning new identification information to each newly appearing obstacle target to obtain a second accurate sensing result, and forming the accurate sensing result of the current detection from the first accurate sensing result and the second accurate sensing result.
8. An obstacle object detection apparatus, comprising:
the sensing result acquisition module is used for acquiring the millimeter wave radar sensing result and the camera sensing result after time synchronization;
the sensing result matching module is used for determining a first obstacle target identified by the millimeter wave radar and the camera together, a second obstacle target identified by the millimeter wave radar alone and a third obstacle target identified by the camera alone according to the sensing result of the millimeter wave radar and the sensing result of the camera;
the sensing result fusion module is used for fusing obstacle target information corresponding to the first obstacle target in the millimeter wave radar sensing result and the camera sensing result to obtain a first real sensing result;
a real result determining module, configured to determine a second real sensing result and a third real sensing result that correspond to the second obstacle target and the third obstacle target, respectively, and obtain a real sensing result of the current detection based on the first real sensing result, the second real sensing result, and the third real sensing result;
and the accurate result determining module is used for filtering the real sensing result of the detection according to the accurate sensing result of the previous detection to obtain the accurate sensing result of the detection.
9. A domain controller, characterized in that the domain controller comprises:
one or more processors;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the obstacle target detection method of any one of claims 1-7.
10. A vehicle, characterized in that the vehicle comprises:
the millimeter wave radar is used for collecting physical environment information around the self-vehicle and outputting a millimeter wave sensing result;
the camera is used for collecting physical environment information around the self-vehicle and outputting a camera sensing result;
and a domain controller as claimed in claim 9.
CN202110431055.8A 2021-04-21 2021-04-21 Obstacle target detection method and device, domain controller and vehicle Pending CN113093178A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110431055.8A CN113093178A (en) 2021-04-21 2021-04-21 Obstacle target detection method and device, domain controller and vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110431055.8A CN113093178A (en) 2021-04-21 2021-04-21 Obstacle target detection method and device, domain controller and vehicle

Publications (1)

Publication Number Publication Date
CN113093178A true CN113093178A (en) 2021-07-09

Family

ID=76679060

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110431055.8A Pending CN113093178A (en) 2021-04-21 2021-04-21 Obstacle target detection method and device, domain controller and vehicle

Country Status (1)

Country Link
CN (1) CN113093178A (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114488047A (en) * 2022-01-27 2022-05-13 中国第一汽车股份有限公司 Vehicle sensor calibration system
CN114637003A (en) * 2022-03-16 2022-06-17 中国第一汽车股份有限公司 Target identification method and device for vehicle, vehicle and storage medium
CN114779790A (en) * 2022-06-16 2022-07-22 小米汽车科技有限公司 Obstacle recognition method, obstacle recognition device, vehicle, server, storage medium and chip
CN114792469A (en) * 2022-04-06 2022-07-26 大唐高鸿智联科技(重庆)有限公司 Method and device for testing sensing system and testing equipment
CN115079155A (en) * 2022-05-27 2022-09-20 中国第一汽车股份有限公司 Target detection method and device and vehicle
CN115230684A (en) * 2021-08-20 2022-10-25 广州汽车集团股份有限公司 Forward anti-collision method and system
CN115257717A (en) * 2022-08-09 2022-11-01 上海保隆汽车科技股份有限公司 Intelligent obstacle avoidance method and system for vehicle, medium, vehicle machine and vehicle
WO2023142814A1 (en) * 2022-01-30 2023-08-03 中国第一汽车股份有限公司 Target recognition method and apparatus, and device and storage medium
WO2024007570A1 (en) * 2022-07-04 2024-01-11 惠州市德赛西威汽车电子股份有限公司 Obstacle recognition method and apparatus, electronic device, and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108764373A (en) * 2018-06-08 2018-11-06 北京领骏科技有限公司 A kind of sensing data filtering and fusion method in automatic Pilot
CN109747643A (en) * 2017-11-07 2019-05-14 郑州宇通客车股份有限公司 A kind of information fusion method of intelligent vehicle sensory perceptual system
CN109885056A (en) * 2019-03-07 2019-06-14 格陆博科技有限公司 A kind of more scene selection methods merged based on monocular cam and millimetre-wave radar
CN110378360A (en) * 2018-08-01 2019-10-25 北京京东尚科信息技术有限公司 Target designation method, apparatus, electronic equipment and readable storage medium storing program for executing
CN110850413A (en) * 2019-11-26 2020-02-28 奇瑞汽车股份有限公司 Method and system for detecting front obstacle of automobile
CN111060911A (en) * 2018-10-16 2020-04-24 天津所托瑞安汽车科技有限公司 Vehicle anti-collision recognition method based on scene analysis
CN111157994A (en) * 2019-12-31 2020-05-15 中汽数据(天津)有限公司 Sensing algorithm of millimeter wave radar


Similar Documents

Publication Publication Date Title
CN113093178A (en) Obstacle target detection method and device, domain controller and vehicle
US11353553B2 (en) Multisensor data fusion method and apparatus to obtain static and dynamic environment features
CN106255899B (en) Device for signaling an object to a navigation module of a vehicle equipped with such a device
JP5713106B2 (en) Vehicle identification system and vehicle identification device
US8989915B2 (en) Vehicular wireless communication apparatus and communication system
CN109099920B (en) Sensor target accurate positioning method based on multi-sensor association
CN111798698B (en) Method and device for determining front target vehicle and vehicle
CN112693466A (en) System and method for evaluating performance of vehicle environment perception sensor
CN111753623B (en) Method, device, equipment and storage medium for detecting moving object
CN112731296B (en) Method and system for condensing points of millimeter wave radar of automobile
CN113253257B (en) Strip mine obstacle detection method based on multi-millimeter-wave radar and vision
CN110866544B (en) Sensor data fusion method and device and storage medium
CN110969178A (en) Data fusion system and method for automatic driving vehicle and automatic driving system
CN115856872A (en) Vehicle motion track continuous tracking method
CN114460598A (en) Target identification method, device, equipment and storage medium
CN114842445A (en) Target detection method, device, equipment and medium based on multi-path fusion
CN114475593B (en) Travel track prediction method, vehicle, and computer-readable storage medium
CN109416885B (en) Vehicle identification method and system
Bouain et al. Multi-sensor fusion for obstacle detection and recognition: A belief-based approach
CN115223131A (en) Adaptive cruise following target vehicle detection method and device and automobile
CN113179303A (en) Method, device and program carrier for reporting traffic events
EP3467545A1 (en) Object classification
CN113611112B (en) Target association method, device, equipment and storage medium
CN114488065A (en) Track data processing method, device, vehicle and medium
CN115966084A (en) Holographic intersection millimeter wave radar data processing method and device and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210709