CN112689842A - Target detection method and device

Target detection method and device

Info

Publication number
CN112689842A
Authority
CN
China
Prior art keywords
target
threshold
sensor
detection error
error
Prior art date
Legal status
Granted
Application number
CN202080004870.XA
Other languages
Chinese (zh)
Other versions
CN112689842B (en)
Inventor
林永兵
马莎
张慧
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Publication of CN112689842A
Application granted
Publication of CN112689842B
Legal status: Active


Classifications

    • G01S 13/931: Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G01S 13/42: Simultaneous measurement of distance and other co-ordinates
    • G01S 13/867: Combination of radar systems with cameras
    • G01S 7/41: Use of analysis of echo signal for target characterisation; target signature; target cross-section
    • G06V 10/811: Fusion of classification results, where the classifiers operate on different input data, e.g. multi-modal recognition
    • G06V 20/56: Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Electromagnetism (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application provides a target detection method and a target detection device, relates to the field of autonomous driving, and is used for improving the accuracy of determining that targets detected by different sensors are the same target. The method includes: determining the overlap region of a first target area and a second target area, where the first target area is determined by the detection error of a first sensor and the position of a first target detected by the first sensor, and the second target area is determined by the detection error of a second sensor and the position of a second target detected by the second sensor; calculating the degree of overlap, namely the proportion of the overlap region in the union of the first target area and the second target area; and, when the degree of overlap is greater than or equal to a first threshold, determining the confidence that the first target and the second target are the same target (the target confidence) according to the relationship between the positions of the two targets and a first region, and/or the relationship between the distance between the two positions and a second threshold.

Description

Target detection method and device
Technical Field
Embodiments of this application relate to the field of autonomous driving, and in particular to a target detection method and device.
Background
Environment perception for autonomous driving relies on vehicle-mounted sensors, which may include cameras and millimeter-wave radars, among others. Multiple sensors working cooperatively complement one another's strengths and can enhance the perception capability and reliability of autonomous driving.
The camera offers passive, non-contact measurement, high resolution, ease of use, and low cost, and is an essential sensor for autonomous-driving environment perception, target detection, and advanced driver assistance systems (ADAS). Compared with the camera, the millimeter-wave radar is an active measurement method; it offers high ranging accuracy, all-weather operation, and moderate cost, and is increasingly widely used in autonomous driving.
The detection ranges of the camera and the millimeter-wave radar overlap, so the two sensors can detect the same target simultaneously, and fusing the target detected by the camera with the target detected by the radar can improve detection accuracy and reliability. However, because sensors have detection errors, how to effectively fuse the same target detected by different sensors is a fundamental problem for functions such as target tracking and vehicle obstacle avoidance.
This in turn raises the question of how to determine whether a target detected by the camera and a target detected by the millimeter-wave radar are the same target.
Disclosure of Invention
Embodiments of this application provide a target detection method and a target detection apparatus to improve the accuracy of determining that targets detected by different sensors are the same target.
In order to achieve the above purpose, the embodiments of the present application provide the following technical solutions:
In a first aspect, an embodiment of this application provides a target detection method, including: a first device determines the overlap region of a first target area and a second target area, where the first target area is determined by the detection error of a first sensor and the position of a first target detected by the first sensor, and the second target area is determined by the detection error of a second sensor and the position of a second target detected by the second sensor. The first target area includes the position of the first target, and the second target area includes the position of the second target. The first device calculates the degree of overlap, that is, the proportion of the overlap region in the union of the first target area and the second target area. When the degree of overlap is greater than or equal to a first threshold, the first device determines the confidence that the first target and the second target are the same target (the target confidence) according to the relationship between the positions of the two targets and a first region, and/or the relationship between the distance between the two positions and a second threshold. The first region is determined by a first detection error of the first sensor and a third detection error of the second sensor. The detection error of the first sensor includes the first detection error and a second detection error, and the first detection error is more accurate than the second detection error. The detection error of the second sensor includes the third detection error and a fourth detection error, and the third detection error is more accurate than the fourth detection error.
In this method, the first device first determines the degree of overlap of the overlap region of the first target area and the second target area. When the degree of overlap is greater than or equal to the first threshold, the first device then determines the target confidence from the relationship between the target positions and the first region, and/or the relationship between the inter-target distance and the second threshold. Because the decision is not based on the degree of overlap alone, this further improves the accuracy of the confidence that the first target and the second target are the same target and reduces misjudgment.
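As an illustration of this flow, the following is a minimal sketch in Python under simplifying assumptions: the target areas are taken to be axis-aligned rectangles (the patent also allows other shapes, such as the parallelogram of Fig. 2), and all names here (Box, overlap_region, degree_of_overlap) are hypothetical, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class Box:
    # An axis-aligned target area: a sensor's detected position expanded by
    # its detection errors along X (vehicle width) and Y (vehicle length).
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def area(self) -> float:
        return max(0.0, self.x_max - self.x_min) * max(0.0, self.y_max - self.y_min)

def overlap_region(a: Box, b: Box) -> Box:
    # The overlap region of the first target area and the second target area.
    return Box(max(a.x_min, b.x_min), max(a.y_min, b.y_min),
               min(a.x_max, b.x_max), min(a.y_max, b.y_max))

def degree_of_overlap(a: Box, b: Box) -> float:
    # Proportion of the overlap region in the union of the two target areas.
    inter = overlap_region(a, b).area()
    union = a.area() + b.area() - inter
    return inter / union if union > 0.0 else 0.0
```

Only when degree_of_overlap(...) is greater than or equal to the first threshold does the method go on to evaluate the first region and the distance criterion, as described in the implementations below.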
In one possible implementation, the relationship between the positions of the first and second targets and the first region is one of the following: the positions of the first target and the second target are both located within the first region; or at least one of the two positions is located outside the first region.
In one possible implementation, when the degree of overlap is greater than or equal to the first threshold and the positions of the first and second targets are both within the first region, the target confidence is a first confidence; when the degree of overlap is greater than or equal to the first threshold and at least one of the two positions is outside the first region, the target confidence is a second confidence, where the first confidence is greater than the second confidence. In other words, given sufficient overlap, two targets whose positions both fall within the first region are more likely to be the same target than two targets at least one of whose positions falls outside it.
In one possible implementation, the relationship between the distance between the positions of the first and second targets and the second threshold is one of the following: the distance is less than or equal to the second threshold; or the distance is greater than the second threshold.
In one possible implementation, when the degree of overlap is greater than or equal to the first threshold and the distance between the positions of the first and second targets is less than or equal to the second threshold, the target confidence is a third confidence; when the degree of overlap is greater than or equal to the first threshold and the distance is greater than the second threshold, the target confidence is a fourth confidence, where the third confidence is greater than the fourth confidence. That is, given sufficient overlap, two targets that are close together (distance less than or equal to the second threshold) are more likely to be the same target than two targets that are far apart (distance greater than the second threshold).
In one possible implementation, the first device determining the target confidence according to the relationship between the target positions and the first region, and/or the relationship between the inter-target distance and the second threshold, includes: when the positions of the first and second targets are both within the first region, and/or the distance between them is less than or equal to the second threshold, the first device determines the target confidence to be a value greater than or equal to a first confidence threshold. This further improves the accuracy of the confidence that the first target and the second target are the same target.
In one possible implementation, the first device determining the target confidence according to the relationship between the target positions and the first region includes: when the positions of the first and second targets are both within the first region, the first device determines the target confidence to be a value greater than or equal to the first confidence threshold.
In one possible implementation, the first device determining the target confidence according to the relationship between the target positions and the first region includes: when at least one of the positions of the first and second targets is outside the first region, the first device determines the target confidence to be a value less than or equal to a second confidence threshold.
In one possible implementation, the first device determining the target confidence according to the relationship between the inter-target distance and the second threshold includes: when the distance between the positions of the first and second targets is less than or equal to the second threshold, the first device determines the target confidence to be a value greater than or equal to the first confidence threshold.
In one possible implementation, the first device determining the target confidence according to the relationship between the inter-target distance and the second threshold includes: when the distance between the positions of the first and second targets is greater than the second threshold, the first device determines the target confidence to be a value less than or equal to the second confidence threshold. This makes it convenient for the first device to recognize that two targets that are far apart are unlikely to be the same target even when the degree of overlap is greater than or equal to the first threshold.
In one possible implementation, when the degree of overlap is greater than or equal to the first threshold, the first device determining the confidence that the first and second targets are the same target according to both the relationship between the target positions and the first region and the relationship between the inter-target distance and the second threshold includes: when the degree of overlap is greater than or equal to the first threshold, the positions of the first and second targets are both within the first region, and the distance between them is less than or equal to the second threshold, the first device determines the target confidence to be a value greater than or equal to the first confidence threshold. Introducing both criteria, namely that the two positions lie within the first region and that their distance does not exceed the second threshold, further strengthens the judgment that the first target and the second target are the same target.
In one possible implementation, when the degree of overlap is greater than or equal to the first threshold, the positions of the first and second targets are both within the first region, and the distance between them is less than or equal to the second threshold, the target confidence is a fifth confidence; when the degree of overlap is greater than or equal to the first threshold but at least one of the two positions is outside the first region and/or the distance between them is greater than the second threshold, the target confidence is a sixth confidence. The fifth confidence is greater than the sixth confidence.
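Continuing the Box sketch above, this combined rule could look as follows; the numeric confidences are placeholders, since the patent only requires the fifth confidence to be greater than the sixth, and the helper names (contains, target_confidence) are assumptions.

```python
import math

def contains(region: Box, p: tuple[float, float]) -> bool:
    # True if position p lies within the (axis-aligned) first region.
    return region.x_min <= p[0] <= region.x_max and region.y_min <= p[1] <= region.y_max

def target_confidence(p1: tuple[float, float], p2: tuple[float, float],
                      area1: Box, area2: Box, first_region: Box,
                      first_threshold: float, second_threshold: float) -> float | None:
    if degree_of_overlap(area1, area2) < first_threshold:
        return None  # insufficient overlap: this rule does not apply
    both_inside = contains(first_region, p1) and contains(first_region, p2)
    close = math.dist(p1, p2) <= second_threshold
    if both_inside and close:
        return 0.9  # "fifth confidence" (placeholder value)
    return 0.4      # "sixth confidence" (placeholder value)
```

Comparing the returned value against the first and second confidence thresholds then yields the same-target or not-same-target decision described next.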
In embodiments of this application, when the target confidence is greater than or equal to the first confidence threshold, the first target and the second target may be considered the same target; when the target confidence is less than or equal to the second confidence threshold, the first target and the second target may be considered not to be the same target.
In a possible implementation, the first threshold is determined by the second detection error or the first threshold is determined by the fourth detection error.
In one possible implementation, the first threshold value increases with an increase in the second detection error, or the first threshold value increases with an increase in the fourth detection error.
In one possible implementation, when the second detection error is a first error, the first threshold is a first value; when the second detection error is a second error, the first threshold is a second value, where the first error is greater than the second error and the first value is greater than the second value. Alternatively, when the fourth detection error is a third error, the first threshold is a third value; when the fourth detection error is a fourth error, the first threshold is a fourth value, where the third error is greater than the fourth error and the third value is greater than the fourth value.
In one possible implementation, the second threshold is determined by the second detection error and the fourth detection error.
In one possible implementation, the second threshold decreases as the second detection error and the fourth detection error increase.
In one possible implementation, when the second detection error is a fifth error and the fourth detection error is a sixth error, the second threshold is a fifth value; when the second detection error is a seventh error and the fourth detection error is an eighth error, the second threshold is a sixth value, where the fifth error is greater than the seventh error, the sixth error is greater than the eighth error, and the fifth value is smaller than the sixth value.
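One way to realize these monotonic relationships is with simple clamped linear functions; the linear form and the constants below are illustrative assumptions only, since the patent specifies only the direction of the dependence.

```python
def adaptive_first_threshold(coarse_error: float, base: float = 0.3, k: float = 0.05) -> float:
    # The first (overlap) threshold grows with the second or fourth detection
    # error: the coarser the sensor, the more overlap is demanded.
    return min(1.0, base + k * coarse_error)

def adaptive_second_threshold(second_error: float, fourth_error: float,
                              base: float = 5.0, k: float = 0.5) -> float:
    # The second (distance) threshold shrinks as the second and fourth
    # detection errors grow.
    return max(0.0, base - k * (second_error + fourth_error))
```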
In one possible implementation, the degree of overlap is the ratio of the area of the overlap region of the first target area and the second target area to the area of their union.
In one possible implementation, the first device's calculation of the degree of overlap of the overlap region in the union of the first and second target areas may be replaced by the following: the first device calculates the area of the overlap region, and the degree of overlap is replaced by that area.
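Under that variant, the ratio is replaced by the absolute area of the overlap region, and the first threshold is then expressed in area units (e.g., square meters); a one-function sketch reusing the Box helpers above.

```python
def overlap_area(a: Box, b: Box) -> float:
    # Variant metric: absolute area of the overlap region instead of the
    # ratio; the first threshold must then also be an area.
    return overlap_region(a, b).area()
```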
In one possible implementation, the first sensor is a radar and the second sensor is a camera; for example, both are vehicle-mounted sensors, such as a vehicle-mounted radar and a vehicle-mounted camera.
In one possible implementation, the first detection error is the ranging error of the first sensor in the vertical direction, and the second detection error is the angle-measurement error of the first sensor or the ranging error of the first sensor in the horizontal direction. The third detection error is the angle-measurement error of the second sensor, and the fourth detection error is the ranging error of the second sensor.
In one possible implementation, the position of the first target is its position relative to a reference point on an XY plane, where the X-axis direction of the XY plane is the vehicle width direction and the Y-axis direction is the vehicle length direction. The position of the second target is its position relative to the same reference point on the XY plane.
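For illustration, a hypothetical conversion of a sensor detection given as range and azimuth into this XY vehicle frame; the patent does not prescribe this representation, and the sign conventions here are assumptions.

```python
import math

def polar_to_xy(r: float, azimuth_rad: float) -> tuple[float, float]:
    # X axis: vehicle width direction; Y axis: vehicle length direction.
    # Azimuth is measured from the Y axis (straight ahead) toward the X axis.
    return r * math.sin(azimuth_rad), r * math.cos(azimuth_rad)

# e.g., a detection 50 m ahead and 5 degrees off-center:
x, y = polar_to_xy(50.0, math.radians(5.0))
```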
In one possible implementation, the first device may be a vehicle, a server, or a component in the vehicle capable of implementing the vehicle's processing or communication functions (e.g., a circuit inside the vehicle), such as a telematics box (T-BOX). Alternatively, the first device may be a chip in the T-BOX.
In a second aspect, an embodiment of this application provides a target detection apparatus that can implement the method of the first aspect or any possible implementation of the first aspect, and can therefore also achieve the corresponding beneficial effects. The apparatus may be the first device, or an apparatus that supports the first device in implementing that method, for example a chip or a circuit component applied in the first device. The apparatus may implement the method by software, by hardware, or by hardware executing corresponding software.
In one example, a target detection apparatus includes: a first determining module, configured to determine the overlap region of a first target area and a second target area, where the first target area is determined by the detection error of the first sensor and the position of the first target detected by the first sensor, the second target area is determined by the detection error of the second sensor and the position of the second target detected by the second sensor, the first target area includes the position of the first target, and the second target area includes the position of the second target; a calculation module, configured to calculate the degree of overlap of the overlap region in the union of the first target area and the second target area; and a second determining module, configured to, when the degree of overlap is greater than or equal to the first threshold, determine the confidence that the first target and the second target are the same target (the target confidence) according to the relationship between the positions of the two targets and the first region, and/or the relationship between the distance between the two positions and the second threshold. The first region is determined by a first detection error of the first sensor and a third detection error of the second sensor; the detection error of the first sensor includes the first detection error and a second detection error, and the first detection error is more accurate than the second detection error; the detection error of the second sensor includes the third detection error and a fourth detection error, and the third detection error is more accurate than the fourth detection error.
In one possible implementation, the relationship between the positions of the first and second targets and the first region is one of the following: the positions of the first target and the second target are both located within the first region; or at least one of the two positions is located outside the first region.
In one possible implementation, when the degree of overlap is greater than or equal to the first threshold and the positions of the first and second targets are both within the first region, the target confidence is a first confidence; when the degree of overlap is greater than or equal to the first threshold and at least one of the two positions is outside the first region, the target confidence is a second confidence, where the first confidence is greater than the second confidence. In other words, given sufficient overlap, two targets whose positions both fall within the first region are more likely to be the same target than two targets at least one of whose positions falls outside it.
In one possible implementation, the relationship between the distance between the positions of the first and second targets and the second threshold is one of the following: the distance is less than or equal to the second threshold; or the distance is greater than the second threshold.
In one possible implementation, when the degree of overlap is greater than or equal to the first threshold and the distance between the positions of the first and second targets is less than or equal to the second threshold, the target confidence is a third confidence; when the degree of overlap is greater than or equal to the first threshold and the distance is greater than the second threshold, the target confidence is a fourth confidence, where the third confidence is greater than the fourth confidence. That is, given sufficient overlap, two targets that are close together (distance less than or equal to the second threshold) are more likely to be the same target than two targets that are far apart (distance greater than the second threshold).
In one possible implementation, when the positions of the first and second targets are both within the first region, and/or the distance between them is less than or equal to the second threshold, the second determining module is configured to determine the target confidence to be a value greater than or equal to the first confidence threshold. This further improves the accuracy of the confidence that the first target and the second target are the same target.
In one possible implementation, the second determining module determining the target confidence according to the relationship between the target positions and the first region includes: when the positions of the first and second targets are both within the first region, the second determining module is configured to determine the target confidence to be a value greater than or equal to the first confidence threshold.
In one possible implementation, the second determining module determining the target confidence according to the relationship between the target positions and the first region includes: when at least one of the positions of the first and second targets is outside the first region, the second determining module is configured to determine the target confidence to be a value less than or equal to the second confidence threshold.
In one possible implementation, the second determining module determining the target confidence according to the relationship between the inter-target distance and the second threshold includes: when the distance between the positions of the first and second targets is less than or equal to the second threshold, the second determining module is configured to determine the target confidence to be a value greater than or equal to the first confidence threshold.
In one possible implementation, the second determining module determining the target confidence according to the relationship between the inter-target distance and the second threshold includes: when the distance between the positions of the first and second targets is greater than the second threshold, the second determining module is configured to determine the target confidence to be a value less than or equal to the second confidence threshold. This makes it convenient for the first device to recognize that two targets that are far apart are unlikely to be the same target even when the degree of overlap is greater than or equal to the first threshold.
In one possible implementation, when the degree of overlap is greater than or equal to the first threshold, the second determining module determining the confidence that the first and second targets are the same target according to both the relationship between the target positions and the first region and the relationship between the inter-target distance and the second threshold includes: when the degree of overlap is greater than or equal to the first threshold, the positions of the first and second targets are both within the first region, and the distance between them is less than or equal to the second threshold, the second determining module is configured to determine the target confidence to be a value greater than or equal to the first confidence threshold.
In one possible implementation, when the degree of overlap is greater than or equal to the first threshold, the positions of the first and second targets are both within the first region, and the distance between them is less than or equal to the second threshold, the target confidence is a fifth confidence; when the degree of overlap is greater than or equal to the first threshold but at least one of the two positions is outside the first region and/or the distance between them is greater than the second threshold, the target confidence is a sixth confidence. The fifth confidence is greater than the sixth confidence.
In embodiments of this application, when the target confidence is greater than or equal to the first confidence threshold, the first target and the second target may be considered the same target; when the target confidence is less than or equal to the second confidence threshold, the first target and the second target may be considered not to be the same target.
In a possible implementation, the first threshold is determined by the second detection error or the first threshold is determined by the fourth detection error.
In one possible implementation, the first threshold value increases with an increase in the second detection error, or the first threshold value increases with an increase in the fourth detection error.
In one possible implementation, when the second detection error is a first error, the first threshold is a first value; when the second detection error is a second error, the first threshold is a second value, where the first error is greater than the second error and the first value is greater than the second value. Alternatively, when the fourth detection error is a third error, the first threshold is a third value; when the fourth detection error is a fourth error, the first threshold is a fourth value, where the third error is greater than the fourth error and the third value is greater than the fourth value.
In one possible implementation, the second threshold is determined by the second detection error and the fourth detection error.
In one possible implementation, the second threshold decreases as the second detection error and the fourth detection error increase.
In one possible implementation, when the second detection error is a fifth error and the fourth detection error is a sixth error, the second threshold is a fifth value; when the second detection error is a seventh error and the fourth detection error is an eighth error, the second threshold is a sixth value, where the fifth error is greater than the seventh error, the sixth error is greater than the eighth error, and the fifth value is smaller than the sixth value.
In one possible implementation, the degree of overlap is the ratio of the area of the overlap region of the first target area and the second target area to the area of their union.
In one possible implementation, the first sensor is a radar and the second sensor is a camera; for example, both are vehicle-mounted sensors, such as a vehicle-mounted radar and a vehicle-mounted camera.
In one possible implementation, the first detection error is the ranging error of the first sensor in the vertical direction, and the second detection error is the angle-measurement error of the first sensor or the ranging error of the first sensor in the horizontal direction. The third detection error is the angle-measurement error of the second sensor, and the fourth detection error is the ranging error of the second sensor.
In one possible implementation, the position of the first target is its position relative to a reference point on an XY plane, where the X-axis direction of the XY plane is the vehicle width direction and the Y-axis direction is the vehicle length direction; the position of the second target is its position relative to the same reference point on the XY plane.
In another example, an embodiment of this application provides a target detection apparatus, which may be the first device or a chip in the first device. The apparatus may include a processing unit. When the apparatus is the first device, the processing unit may be a processor and may take the place of the first determining module, the second determining module, and the calculation module described above. The apparatus may further include a communication unit, which may be an interface circuit, and a storage unit, which may be a memory. The storage unit stores computer program code comprising instructions; the processing unit executes the instructions stored by the storage unit so that the apparatus implements the method described in the first aspect or any one of its possible implementations. When the apparatus is a chip within the first device, the processing unit may be a processor, and the communication unit may collectively be referred to as a communication interface, for example an input/output interface, a pin, or a circuit. The processing unit executes computer program code stored in a storage unit, which may be a storage unit within the chip (e.g., a register or a cache) or a storage unit external to the chip within the first device (e.g., a read-only memory or a random access memory), so that the first device implements the method described in the first aspect or any one of its possible implementations.
Optionally, the processor, the communication interface and the memory are coupled to each other.
In a third aspect, an embodiment of this application provides a computer-readable storage medium storing a computer program or instructions which, when run on a computer, cause the computer to perform the target detection method described in the first aspect or any of its possible implementations.
In a fourth aspect, embodiments of this application provide a computer program product comprising instructions that, when executed on a computer, cause the computer to perform the target detection method of the first aspect or one of its various possible implementations.
In a fifth aspect, an embodiment of this application provides a communication apparatus comprising a processor and a storage medium, where the storage medium stores instructions that, when executed by the processor, implement the target detection method described in the first aspect or its various possible implementations.
In a sixth aspect, the present application provides a communication apparatus, which includes one or more modules for implementing the method of the first aspect, where the one or more modules may correspond to each step in the method of the first aspect.
In a seventh aspect, an embodiment of this application provides a chip comprising a processor and a communication interface coupled to the processor, where the processor is configured to execute a computer program or instructions to implement the target detection method described in the first aspect or one of its various possible implementations, and the communication interface is configured to communicate with modules outside the chip.
In particular, the chip provided in the embodiments of the present application further includes a memory for storing a computer program or instructions.
In an eighth aspect, embodiments of this application provide a communication device comprising a processor and a memory coupled to the processor, where the memory stores computer programs or instructions, and the processor is configured to execute the computer programs or instructions stored in the memory to implement the target detection method described in the first aspect or one of its various possible implementations.
Any of the apparatuses, computer storage media, computer program products, chips, or communication systems provided above is configured to execute the corresponding method provided above. For the beneficial effects they can achieve, refer to the beneficial effects of the corresponding solutions in the corresponding methods; details are not repeated here.
Drawings
Fig. 1 is a first schematic diagram of the positions of an image target and a radar target according to an embodiment of this application;
Fig. 2 is a second schematic diagram of the positions of an image target and a radar target according to an embodiment of this application;
Fig. 3 is a third schematic diagram of the positions of an image target and a radar target according to an embodiment of this application;
Fig. 4 is a fourth schematic diagram of the positions of an image target and a radar target according to an embodiment of this application;
Fig. 5 is a fifth schematic diagram of the positions of an image target and a radar target according to an embodiment of this application;
Fig. 6 is a schematic structural diagram of a communication device according to an embodiment of this application;
Fig. 7 is a schematic flowchart of a target detection method according to an embodiment of this application;
Fig. 8 is a schematic diagram of a first region according to an embodiment of this application;
Fig. 9a is a first schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
Fig. 9b is a second schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
Fig. 10a is a third schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
Fig. 10b is a fourth schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
Fig. 10c is a fifth schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
Fig. 11 is a sixth schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
Fig. 12 is a seventh schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
Fig. 13 is an eighth schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
Fig. 14 is a ninth schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
Fig. 15 is a schematic structural diagram of a target detection apparatus according to an embodiment of this application;
Fig. 16 is a schematic structural diagram of another target detection apparatus according to an embodiment of this application;
Fig. 17 is a schematic structural diagram of a chip according to an embodiment of this application.
Detailed Description
In the embodiments of this application, terms such as "first" and "second" are used to distinguish items that are substantially the same in function and effect. For example, the first sensor and the second sensor are distinguished only as different sensors, with no order implied. Those skilled in the art will appreciate that terms such as "first" and "second" do not limit quantity or execution order, nor do they indicate relative order or importance.
It should be noted that, in this application, words such as "exemplary" or "for example" are used to indicate an example, illustration, or description. Any embodiment or design described as "exemplary" or "for example" should not be construed as preferred or more advantageous than other embodiments or designs; rather, such words are intended to present related concepts in a concrete fashion.
The network architecture and service scenarios described in the embodiments of this application are intended to illustrate the technical solutions more clearly and do not limit them. A person of ordinary skill in the art knows that, as network architectures evolve and new service scenarios emerge, the technical solutions provided in the embodiments of this application remain applicable to similar technical problems.
In this application, "at least one" means one or more, and "a plurality of" means two or more. "And/or" describes an association between associated objects and indicates three possible relationships; for example, "A and/or B" may indicate that only A exists, that both A and B exist, or that only B exists, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the preceding and following associated objects. "At least one of the following items" or a similar expression means any combination of these items, including a single item or any combination of plural items. For example, at least one of a, b, or c may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where each of a, b, and c may be singular or plural.
In the embodiments of this application, a target detected from an image captured by a monocular camera is referred to as an image target, and a target detected by a millimeter-wave radar is referred to as a radar target.
As shown in Fig. 1, take as an example the vehicle located at the coordinate-system origin O, with the X-axis representing the vehicle width direction and the Y-axis representing the vehicle length direction. In Fig. 1, EYr denotes the position error of the radar target in the vertical direction and EXr its position error in the horizontal direction, where EYr is smaller than EXr; R_r denotes the error region of the radar target, determined by EYr and EXr. Eθi denotes the angle-measurement accuracy of the image target and EYi its ranging accuracy; because image-target detection is based on a monocular camera, Eθi is higher than EYi. Likewise, R_i denotes the error region of the image target, determined by Eθi and EYi. The size of the overlap region of the radar target's error region and the image target's error region (for example, the hatched portion R_x in Fig. 1) reflects the likelihood that the image target and the radar target come from the same object, in other words, the likelihood that they are the same target. When the area of the overlap region is larger than a certain threshold, the radar target and the image target are judged to be the same target.
As shown in Fig. 2, the difference from Fig. 1 is that the error region of the radar target in Fig. 2 is determined by the vertical-direction error EYr and the angle-measurement error Eθr, so the error region R_r in Fig. 2 is shaped like a parallelogram (in Fig. 1 it is rectangular). In this case, whether the image target and the radar target are the same target can still be determined from the relationship between the size of the overlap region of the two error regions and a certain threshold. In the embodiments of this application, determining whether two targets (e.g., an image target and a radar target) are the same target can also be understood as determining whether the two targets come from the same object.
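The error regions of Figs. 1 and 2 could be constructed as follows, here using the third-party Shapely library for general polygon overlap; the exact geometric construction (in particular the parallelogram) is an assumption for illustration, with variable names following the figure labels (EXr, EYr, and Eθr written as E0r).

```python
import math
from shapely.geometry import Polygon  # third-party: pip install shapely

def radar_region_fig1(x: float, y: float, EXr: float, EYr: float) -> Polygon:
    # Fig. 1: rectangle around the radar detection (x, y), bounded by the
    # horizontal position error EXr and the vertical position error EYr.
    return Polygon([(x - EXr, y - EYr), (x + EXr, y - EYr),
                    (x + EXr, y + EYr), (x - EXr, y + EYr)])

def radar_region_fig2(r: float, theta: float, EYr: float, E0r: float) -> Polygon:
    # Fig. 2: parallelogram from the vertical error EYr and the angle error
    # E0r; the lateral half-extent is approximated as r*tan(E0r) perpendicular
    # to the line of sight (theta measured from the Y axis, in radians).
    cx, cy = r * math.sin(theta), r * math.cos(theta)
    dx = r * math.tan(E0r) * math.cos(theta)
    dy = -r * math.tan(E0r) * math.sin(theta)
    return Polygon([(cx - dx, cy - dy - EYr), (cx + dx, cy + dy - EYr),
                    (cx + dx, cy + dy + EYr), (cx - dx, cy - dy + EYr)])

# Overlap of a radar region Rr and an image region Ri (the hatched R_x of Fig. 1):
# overlap_area = Rr.intersection(Ri).area
# degree_of_overlap = Rr.intersection(Ri).area / Rr.union(Ri).area
```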
Although whether the image target and the radar target are the same target can be determined by checking that the overlap area exceeds a certain threshold, this approach has the following problem: even when the overlap areas are equal, the positional relationships between the targets may differ, which can lead to misjudging the image target and the radar target as the same target. As shown in Fig. 3, suppose the overlap area of the error region of the radar target at position X with the error region of the image target is A, and the overlap area of the error region of the radar target at position X' with the error region of the image target is B. If A and B are both greater than the threshold and A equals B, then, theoretically, according to the above scheme the probability that the radar target at X and the image target are the same target should equal the probability that the radar target at X' and the image target are the same target; that is, both pairs would be judged the same target. In practice, however, the radar target at X and the radar target at X' may not be the same radar target, in which case the two likelihoods should differ: for example, the radar target at X' and the image target may be the same target while the radar target at X and the image target are not. Relying only on the relationship between the overlap area and the threshold, since both A and B exceed the threshold, both pairs would nevertheless be judged the same target, contrary to fact; misjudgment can therefore result. Moreover, even if the radar targets at X and X' did belong to the same radar target, the likelihoods of each being the same target as a given image target would still differ.
Further, as shown in fig. 4, suppose the overlapping area of the error region Ri of the image target at position Y and the error region Rr of the radar target is C, and the overlapping area of the error region Ri of the image target at position Y' and the error region Rr of the radar target is D, where C and D are both greater than a certain threshold and C equals D. Theoretically, according to the above scheme, the probability that the image target at position Y' and the radar target are the same target should equal the probability that the image target at position Y and the radar target are the same target; that is, if one pair is the same target, so is the other. However, if the image targets at the two positions are not the same image target, the probability that the image target at position Y' and the radar target are the same target should differ from the probability that the image target at position Y and the radar target are the same target. The above scheme relies only on the relationship between the overlapping area and the threshold, and since both C and D are greater than the threshold, it may be concluded that both pairs are the same target, which contradicts the facts; hence relying only on the relationship between the overlapping area and the threshold may cause misjudgment.
As shown in fig. 5, when the detection errors of the camera and the radar in the horizontal direction keep increasing, the overlapping area of the error region of the image target and the error region of the radar target eventually reaches a maximum value. At that point, the image target and the radar target may appear at many different positions while the overlapping area remains unchanged. That is, it is no longer possible to accurately judge, based on the overlapping area alone, whether the image target and the radar target are the same target, and the likelihood of misjudgment in this case increases greatly.
From the above analysis, it can be seen that determining whether the image target and the radar target are the same target based only on a measurement of the overlapping area between the error region of the radar target and the error region of the image target may cause misjudgment. Based on this, the present application provides a target detection method that considers not only the size of the overlapping area of the error region of the radar target and the error region of the image target, but also other measurements, to improve the accuracy of determining whether the image target and the radar target are the same target.
As shown in fig. 6, fig. 6 is a schematic diagram illustrating a hardware structure of an object detection apparatus in an embodiment of the present application. The structure of the first device in the embodiment of the present application may refer to the structure shown in fig. 6. The object detection apparatus includes a processor 41 and a communication line 43.
Optionally, the object detection means may further comprise a memory 42.
Processor 41 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of programs in accordance with the teachings of the present disclosure.
The communication line 43 may include a path for transmitting information between the aforementioned components.
The memory 42 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact disc, laser disc, optical disc, digital versatile disc, blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be separate and coupled to the processor via the communication line 43, or may be integrated with the processor.
The memory 42 is used for storing computer-executable instructions for executing the present application, and is controlled by the processor 41 to execute. The processor 41 is configured to execute the computer-executable instructions stored in the memory 42, so as to implement an object detection method provided by the following embodiments of the present application.
It will be appreciated that if the first device is a vehicle, the object detecting means may also comprise a first sensor and a second sensor.
It is to be understood that, in the embodiment of the present application, the steps performed by the first device in the target detection method may also be performed by a chip applied to the first device; the following embodiments are described by taking the first device as the execution subject of the target detection method as an example.
Embodiments of the present application may refer to one another; for example, the same or similar steps may be shared among method embodiments, apparatus embodiments, or system embodiments, without limitation.
As shown in fig. 7, fig. 7 illustrates an embodiment of a target detection method provided in an embodiment of the present application, where the method includes:
Step 701: the first device determines an overlapping region of a first target area and a second target area, the first target area being determined by the detection error of the first sensor and the position of the first target detected by the first sensor, and the second target area being determined by the detection error of the second sensor and the position of the second target detected by the second sensor.
Wherein the first target area comprises the position of the first target, in other words the position of the first target is located within the first target area. The second target area comprises the position of the second target, in other words the position of the second target is located within the second target area.
In this embodiment of the application, an overlapping area of the first target area and the second target area is a common area where the first target area and the second target area exist, in other words, the overlapping area is an intersection of the first target area and the second target area, or the overlapping area belongs to both the first target area and the second target area.
In the embodiment of the present application, a target detected by the first sensor may be referred to as a first target, and a target detected by the second sensor may be referred to as a second target.
The first device in the embodiment of the present application may be a vehicle, or a chip applied in the vehicle, or a component applied in the vehicle (for example, an Electronic Control Unit (ECU) or an automatic driving controller), or the first device may be a server, or a chip applied in the server.
Illustratively, the detection error of the first sensor is a position error of the first sensor. For example, the detection error of the first sensor includes a first detection error and a second detection error, where the first detection error is a distance measurement error of the first sensor in the vertical direction, and the second detection error is an angle measurement error of the first sensor or a distance measurement error of the first sensor in the horizontal direction. The detection error of the first sensor is predetermined based on the characteristics of the first sensor.
Illustratively, the detection error of the second sensor is a position error of the second sensor. For example, the detection error of the second sensor includes a third detection error and a fourth detection error, where the third detection error is an angle measurement error of the second sensor, and the fourth detection error is a distance measurement error of the second sensor. The detection error of the second sensor is predetermined based on the characteristics of the second sensor. The distance measurement error of the second sensor may be a distance measurement error in the horizontal direction or in the vertical direction.
The "error" of the sensor in the embodiment of the present application may also be replaced with the "accuracy" of the sensor; for example, the angle measurement error may be replaced with the angle measurement accuracy. The detection error in the embodiment of the present application may also be referred to as a measurement error.
As a possible implementation, the first target area in the embodiment of the present application is a closed area enclosed by a fifth line and a sixth line, which are parallel and separated in the vertical direction, and a seventh line and an eighth line, which are parallel and separated in the horizontal direction. The shortest distance between the fifth line and the position of the first target equals the shortest distance between the sixth line and the position of the first target, and this shortest distance is the distance measurement error of the first sensor in the vertical direction. The shortest distance between the seventh line and the position of the first target equals the shortest distance between the eighth line and the position of the first target, and this shortest distance is the distance measurement error of the first sensor in the horizontal direction.
In other words, in the embodiment of the present application, the first target area is the closed area obtained by extending, from the position of the first target, the distance measurement error of the first sensor in the vertical direction upward and downward, and the distance measurement error of the first sensor in the horizontal direction leftward and rightward; that is, the position of the first target is the center of the first target area.
As a specific implementation, the first target area in the embodiment of the present application may be denoted as Y1 - EYr ≤ Yr ≤ Y1 + EYr and X1 - EXr ≤ Xr ≤ X1 + EXr, where Y1 denotes the coordinate of the first target in the vertical direction and X1 denotes the coordinate of the first target in the horizontal direction. ±EYr denotes the distance measurement error of the first sensor in the vertical direction (Y-axis direction), and ±EXr denotes the distance measurement error of the first sensor in the horizontal direction (X-axis direction). Yr denotes the coordinate of the first target area in the vertical direction, and Xr denotes the coordinate of the first target area in the horizontal direction.
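To make this rectangular form concrete, the following minimal Python sketch (an illustration, not part of the patent; the Rect type and function name are assumptions) constructs the first target area from the target position (X1, Y1) and the errors EXr and EYr:

    from dataclasses import dataclass

    @dataclass
    class Rect:
        x_min: float
        y_min: float
        x_max: float
        y_max: float

    def first_target_area(x1: float, y1: float, exr: float, eyr: float) -> Rect:
        # Y1 - EYr <= Yr <= Y1 + EYr and X1 - EXr <= Xr <= X1 + EXr,
        # so the target position (x1, y1) is the center of the area.
        return Rect(x1 - exr, y1 - eyr, x1 + exr, y1 + eyr)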
As another possible implementation, the first target area in the embodiment of the present application is the closed area formed by extending, from the position of the first target as the center, the distance measurement error of the first sensor in the vertical direction upward and downward, and by rotating the angle measurement error of the first sensor leftward and rightward about the line connecting the position of the first target and the reference point. In this case, the specific shape of the first target region may refer to Rr in fig. 4.
As a specific implementation, the first target area in the embodiment of the present application may be denoted as Y1 - EYr ≤ Yr ≤ Y1 + EYr and X1 - Eθr ≤ Xr ≤ X1 + Eθr, where Y1 denotes the coordinate of the first target in the vertical direction and X1 denotes the coordinate of the first target in the horizontal direction. ±EYr denotes the distance measurement error of the first sensor in the vertical direction (Y-axis direction), and ±Eθr denotes the angle measurement error of the first sensor. Yr denotes the coordinate of the first target area in the vertical direction, and Xr denotes the coordinate of the first target area in the horizontal direction.
As one possible implementation, the second target area in the embodiment of the present application is the closed area formed by extending, from the position of the second target as the center, the distance measurement error of the second sensor in the vertical direction upward and downward, and by rotating the angle measurement error of the second sensor leftward and rightward about the line connecting the position of the second target and the reference point.
As a specific implementation, the second target area in the embodiment of the present application may be denoted as Y2 - EYi ≤ Yi ≤ Y2 + EYi and X2 - Eθi ≤ Xi ≤ X2 + Eθi, where Y2 denotes the coordinate of the second target in the vertical direction (Y-axis direction) and X2 denotes the coordinate of the second target in the horizontal direction (X-axis direction). ±EYi denotes the distance measurement error of the second sensor in the vertical direction (Y-axis direction), and ±Eθi denotes the angle measurement error of the second sensor in the horizontal azimuth (X-axis direction). Yi denotes the coordinate of the second target area in the vertical direction, and Xi denotes the coordinate of the second target area in the horizontal direction.
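Since this area is bounded by a distance error along the line of sight to the reference point and an angular error about it, it is an annular-sector-shaped region rather than a rectangle. The following minimal Python sketch (an illustration, not part of the patent; the polygon approximation and function name are assumptions) builds such a region for a target at (x, y) with the reference point at the origin:

    import math

    def sector_target_area(x: float, y: float, eyi: float, etheta: float, n: int = 16):
        # Approximate the area bounded by range r +/- eyi along the line from
        # the reference point (origin) to the target, and azimuth phi +/- etheta
        # (radians) about that line, as a closed polygon of (x, y) vertices.
        r = math.hypot(x, y)
        phi = math.atan2(y, x)
        outer = [((r + eyi) * math.cos(phi - etheta + 2 * etheta * i / n),
                  (r + eyi) * math.sin(phi - etheta + 2 * etheta * i / n))
                 for i in range(n + 1)]
        inner = [((r - eyi) * math.cos(phi + etheta - 2 * etheta * i / n),
                  (r - eyi) * math.sin(phi + etheta - 2 * etheta * i / n))
                 for i in range(n + 1)]
        return outer + inner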
As shown in fig. 8, Rir denotes the overlapping region of the first target region and the second target region (i.e., the hatched portion in fig. 8). Rr denotes the first target area, i.e., the first target area is defined by Xr and Yr; the width of the coordinate range of the first target region in the vertical direction (Y-axis direction) is 2EYr, and the width of its coordinate range in the horizontal direction (X-axis direction) is 2EXr. Ri denotes the second target area, i.e., the second target area is defined by Xi and Yi; the width of the coordinate range of the second target region in the vertical direction (Y-axis direction) is 2EYi, and the azimuth range width of the second target area in the horizontal azimuth (X-axis direction) is 2Eθi.
Wherein the position of the first target is a position of the first target with respect to a reference point on an XY plane, an X-axis direction of the XY plane is a vehicle width direction, and a Y-axis direction of the XY plane is a vehicle length direction. The position of the second target is a position of the second target with respect to the reference point on the XY plane.
It is to be understood that the reference point on the XY plane may be the origin of coordinates O shown in fig. 8, which may be the position of the vehicle on which the first sensor and the second sensor are disposed.
Step 702: the first device calculates the degree of overlap, i.e., the proportion of the union of the first target area and the second target area that is occupied by the overlapping region.
As a specific implementation: the overlap degree is the proportion of the overlap area of the first target area and the second target area in the union of the first target area and the second target area. For example, the overlap size (R)rAnd RiIntersection of (c)/(R)rAnd RiUnion of (d). Wherein R isrRepresenting a first target region, RiRepresenting a second target area.
For another example, step 702 in the embodiment of the present application may be replaced by the following steps: the first device calculates an overlap area of the overlap region. As to how to calculate the area of the overlapping region, reference may be made to the description in the prior art, which is not limited in the embodiments of the present application.
It is understood that, if the overlapping degree is a ratio of the overlapping area of the first target area and the second target area in the union of the first target area and the second target area, the first threshold value may also be a preset ratio value.
If the first device calculates the overlapping area, the first threshold may be a preset area threshold, which is not limited in this embodiment of the application.
Step 703: when the degree of overlap is greater than or equal to the first threshold, the first device determines, as the target confidence, the confidence that the first target and the second target are the same target, according to the relationship between the position of the first target and the position of the second target and the first region, and/or the relationship between the distance between the position of the first target and the position of the second target and the second threshold.
Wherein the first area is determined by a first detection error of the first sensor and a third detection error of the second sensor, the detection error of the first sensor includes the first detection error and the second detection error, the accuracy of the first detection error is higher than the accuracy of the second detection error, the detection error of the second sensor includes the third detection error and a fourth detection error, and the accuracy of the third detection error is higher than the accuracy of the fourth detection error.
The confidence that the first target and the second target are the same target in the embodiment of the present application may be understood as: the likelihood that the first and second objects are the same object, or the likelihood that the first and second objects originate from the same object.
The first threshold and the second threshold in the embodiment of the present application may be values predefined by a protocol or values determined by the first device.
The first region in the embodiment of the present application is determined by the first detection error of the first sensor and the third detection error of the second sensor, which is based on the fact that:
taking the first sensor as a radar as an example, the measurement accuracy of the radar in the vertical direction is higher than that in the horizontal direction (in other words, because a vehicle-mounted radar measures distance along the Y direction, its accuracy in measuring the X coordinate/azimuth angle is lower than its accuracy in the Y direction). Taking the second sensor as a camera as an example, the azimuth angle measurement accuracy of the camera is higher than its accuracy in the horizontal or vertical direction (a monocular camera measures the azimuth angle accurately, while its distance measurement is relatively inaccurate).
Illustratively, as shown in fig. 8, the first area is the closed area enclosed by the intersections of two parallel lines (line 1 and line 2), whose distance from the position of the first target in the vertical direction is the error of the first sensor in the vertical direction, with line 3 and line 4. Line 3 and line 4 intersect, and the included angle between each of them and the line passing through the position of the second target is the angle measurement error of the second sensor.
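As a hedged illustration of this construction (the azimuth convention, function name, and parameters are assumptions, not from the patent), membership in the first region can be tested as the intersection of a horizontal band around the first target's Y coordinate and an angular wedge around the second target's azimuth, both relative to the reference point at the origin:

    import math

    def in_first_region(px, py, radar_y, eyr, cam_x, cam_y, etheta):
        # Band: |py - radar_y| <= EYr (the first sensor's vertical error).
        in_band = abs(py - radar_y) <= eyr
        # Wedge: the point's azimuth lies within +/- Etheta (radians) of the
        # second target's azimuth, with azimuth measured from the Y-axis
        # (vehicle length direction); wrap-around is ignored for simplicity.
        in_wedge = abs(math.atan2(px, py) - math.atan2(cam_x, cam_y)) <= etheta
        return in_band and in_wedge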
The embodiment of the application provides a target detection method in which a first device determines the degree of overlap of the overlapping region of a first target region and a second target region. When the degree of overlap is greater than or equal to a first threshold, the first device determines, as the target confidence, the confidence that the first target and the second target are the same target, according to the relationship between the positions of the first target and the second target and the first region, and/or the relationship between the distance between the two positions and a second threshold. In this way, when the degree of overlap is greater than or equal to the first threshold, the confidence that the first target and the second target are the same target is further determined based on these additional relationships, which improves accuracy and reduces misjudgment.
Alternatively, the first sensor and the second sensor may detect objects within the same range, in other words, the position of the first object and the position of the second object are located within the same range.
After the first device determines the confidence degrees that the first target and the second target are the same target, the subsequent execution of target fusion is facilitated. Since the measurement errors (also referred to as detection errors) of different sensors are different, the first target is detected by the first sensor, and the second target is detected by the second sensor, if the first device determines that the first target and the second target are the same target, the position of the first target and the position of the second target may be fused subsequently to obtain the final position of the fused target. If the first device determines that the first object and the second object are not the same object, then the location of the first object and the location of the second object do not need to be merged at a later time.
In this embodiment of the application, if the degree of overlap is less than the first threshold, the first device may determine that the first target and the second target are not the same target.
As a possible example, the first sensor in the embodiment of the present application is a radar, and the second sensor is a camera. For example, the radar may be a millimeter wave radar, and the camera may be a monocular camera.
The millimeter wave radar in the embodiment of the present application is mounted at the center front of the vehicle and detects targets such as other vehicles and pedestrians using millimeter waves. The millimeter wave radar transmits millimeter waves forward from the vehicle while scanning along a horizontal plane, receives the reflected millimeter waves, and transmits the transmitted and received data to the first device in the form of radar signals; alternatively, the millimeter wave radar transmits the data to the vehicle in the form of radar signals, and the vehicle forwards it to the server. The monocular camera includes a charge-coupled device (CCD) camera and is mounted at the center front of the vehicle. The monocular camera transmits data of the captured image to the first device in the form of image signals, or transmits it to the vehicle, which forwards it to the server.
In the embodiment of the present application, the relationship between the position of the first target and the position of the second target and the first area includes: the position of the first target and the position of the second target are both located within the first region. Alternatively, the relationship between the position of the first object and the position of the second object and the first area includes: one or more of the location of the first target and the location of the second target are outside the first region.
In the embodiment of the present application, the relationship between the distance between the position of the first target and the position of the second target and the second threshold includes: the distance between the position of the first target and the position of the second target is greater than the second threshold, or the distance between the position of the first target and the position of the second target is less than or equal to the second threshold.
In a possible implementation manner, when the size of the degree of overlap is greater than or equal to the first threshold in step 703 of the embodiment of the present application, the determining, by the first device, the confidence that the first target and the second target are the same target as the target confidence according to the relationship between the position of the first target and the position of the second target and the first region includes: when the degree of overlap is greater than or equal to the first threshold, and the position of the first target and the position of the second target are both located in the first region, the first device determines that the confidence degrees that the first target and the second target are the same target are target confidence degrees, and the target confidence degree is greater than or equal to the first confidence degree threshold.
It is understood that, in the embodiment of the present application, the target confidence being greater than or equal to the first confidence threshold may be replaced by: the first target and the second target are the same target. The first confidence threshold in the embodiment of the present application may be set as needed, and the embodiment of the present application does not limit this.
As shown in fig. 9a, taking the first target as target A and the second target as target B as an example, the position of target A and the position of target B are both located in the first region, and the degree of overlap of the overlapping region (the shaded portion Rir in fig. 9a) is greater than or equal to the first threshold. In this case, the first device may determine that the confidence that target A and target B are the same target is the target confidence, and the target confidence is greater than or equal to the first confidence threshold; or the first device determines that target A and target B are the same target.
As a possible implementation manner, when the size of the overlap degree is greater than or equal to the first threshold, and the position of the first target and the position of the second target are both located in the first region, the target confidence degree is the first confidence degree. And when the degree of overlap is larger than or equal to the first threshold, and one or more of the position of the first target and the position of the second target are positioned outside the first area, the target confidence is a second confidence. In other words, when the degree of overlap is greater than or equal to the first threshold, the target confidence when the position of the first target and the position of the second target are both within the first region is higher than the target confidence when at least one of the position of the first target and the position of the second target is outside the first region.
As shown in fig. 9b, taking the first target as target A and the second target as target B as an example, fig. 9b shows the confidence that target B and target A are the same target when target B is at different positions, for example at position Q (X21, Y21) or at position Q' (X22, Y22); position Q' is outside the first region, and position Q is within the first region. Assuming target B is at position Q, the second target region determined by position Q and the detection error of the second sensor is Ri, and the overlapping region between the first target region Rr and Ri (shaded in fig. 9b) is Rir. Assuming target B is at position Q', the second target region determined by position Q' and the detection error of the second sensor is Ri', and the overlapping region between Ri' and the first target region Rr (shaded in fig. 9b) is Rir'. Suppose the degrees of overlap of both Rir and Rir' are greater than or equal to the first threshold; that is, whether target B is at position Q or at position Q', the degrees of overlap of Ri and Ri' with the first target region are both greater than or equal to the first threshold. However, when target B is at position Q its position is within the first region, while when target B is at position Q' its position is outside the first region; therefore the target confidence that target A and target B at position Q are the same target is higher than the target confidence that target A and target B at position Q' are the same target.
In a possible implementation manner, when the size of the degree of overlap is greater than or equal to the first threshold in step 703 of the embodiment of the present application, the determining, by the first device, the confidence that the first target and the second target are the same target as the target confidence according to the relationship between the position of the first target and the position of the second target and the first region includes: when the degree of overlap is greater than or equal to the first threshold, if one or more of the position of the first target and the position of the second target are located outside the first region, the first device determines that the confidence degrees that the first target and the second target are the same target are target confidence degrees, and the target confidence degrees are less than or equal to the second confidence degree threshold.
In this embodiment of the present application, "the confidence that the first target and the second target are the same target is the target confidence, and the target confidence is less than or equal to the second confidence threshold" may be replaced with: the first target and the second target are not the same target. The first confidence threshold may be the same as or different from the second confidence threshold, which is not limited in this embodiment of the application.
For example, as shown in fig. 9B, taking a first target as a target a located at a position P and a second target as a target B located at a position Q 'as an example, it can be seen from fig. 9B that the position P of the target a is located in a first region, and the position Q' of the target B is located outside the first region, the first device may determine that the confidence of the target a and the target B being the same target is a target confidence, and the target confidence is less than or equal to the second confidence threshold.
In a possible implementation manner, when the size of the degree of overlap is greater than or equal to a first threshold in step 703 of the embodiment of the present application, the determining, by the first device, the confidence that the first target and the second target are the same target as the target confidence according to a relationship between a distance between the position of the first target and the position of the second target and the second threshold by the first device includes: when the degree of overlap is greater than or equal to a first threshold, and the distance between the position of the first target and the position of the second target is less than or equal to a second threshold, the first device determines that the confidence degrees that the first target and the second target are the same target are target confidence degrees, and the target confidence degrees are greater than or equal to the first confidence degree threshold.
For example, as shown in fig. 10a, the degree of overlap of the overlapping region between the first target region and the second target region (e.g., the shaded portion Rir in fig. 10a) is greater than or equal to the first threshold, and the distance between the position of the first target and the position of the second target is less than or equal to the second threshold; then the confidence that the first target and the second target are the same target is the target confidence, and the target confidence is greater than or equal to the first confidence threshold.
In the embodiment of the present application, reference may be made to the prior art for a way of calculating a distance between positions of two targets, and details are not described here.
In a possible implementation manner, when the size of the degree of overlap is greater than or equal to the first threshold, and the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the confidence that the first target and the second target are the same target is the third confidence. When the degree of overlap is greater than or equal to the first threshold, and the distance between the position of the first target and the position of the second target is greater than the second threshold, the confidence that the first target and the second target are the same target is the fourth confidence. The third confidence level is greater than the fourth confidence level.
For example, as shown in fig. 10b, taking the first target as target A and the second target as target B, fig. 10b shows target A at different positions (e.g., position P (X11, Y11) or position P' (X12, Y12)). When target A is at position P, the first target region determined by position P and the detection error of the first sensor is Rr; the overlapping region of Rr and the second target region Ri is Rri, whose degree of overlap is S1. Assuming target A is at position P', the first target region determined by position P' and the detection error of the first sensor is Rr'; the overlapping region of Rr' and the second target region Ri is Rri', whose degree of overlap is S2. Assume S1 and S2 are both greater than or equal to the first threshold; that is, the degrees of overlap of the overlapping regions of the first target region Rr corresponding to target A at position P, and of the first target region Rr' corresponding to target A at position P', with the second target region are each greater than or equal to the first threshold. If target A is at position P, the distance between the position of target A and the position of target B is L1; when target A is at position P', the distance between target A and target B is L2, and L1 is smaller than L2, i.e., target B is closer to target A at position P than to target A at position P'. Therefore, the third confidence that target B and target A at position P are the same target is greater than the fourth confidence that target B and target A at position P' are the same target.
For example, as shown in fig. 10c, taking the second target as target B and the first target as target A, fig. 10c shows the confidence that target B at different positions (e.g., position Q (X21, Y21) or position Q' (X22, Y22)) is the same target as target A. Assuming target B is at position Q, the second target region determined by position Q and the detection error of the second sensor is Ri; the overlapping region of the first target region Rr and Ri is Rir, whose degree of overlap is S3. When target B is at position Q', the second target region determined by position Q' and the detection error of the second sensor is Ri'; the overlapping region of Ri' and the first target region Rr is Rir', whose degree of overlap is S4. If S3 and S4 are both greater than or equal to the first threshold, the degrees of overlap of Ri corresponding to target B at position Q, and of Ri' corresponding to target B at position Q', with the first target region are each greater than or equal to the first threshold. If target B is at position Q, the distance between the position of target A and target B at position Q is L3; when target B is at position Q', the distance between the position of target A and the position of target B at position Q' is L4, and L3 is smaller than L4, i.e., the position of target A is closer to target B at position Q than to target B at position Q'. Therefore, the confidence that target A and target B at position Q are the same target is greater than the confidence that target A and target B at position Q' are the same target.
It can be seen from fig. 10b and fig. 10c that, when the degree of overlap is greater than or equal to the first threshold, the closer the distance between the positions of any two targets (for example, less than or equal to the second threshold), the higher the likelihood that those two targets are the same target.
It should be noted that fig. 10b and fig. 10c each take, as an example, one target located at different positions and compute the confidence that it is the same target as a given other target; it should be understood that target B located at position Q and target B located at position Q' may not be the same target.
In a possible implementation manner, when the size of the degree of overlap is greater than or equal to the first threshold in step 703 of the embodiment of the present application, the determining, by the first device, the confidence that the first target and the second target are the same target as the target confidence according to the relationship between the distance between the position of the first target and the position of the second target and the second threshold includes: when the degree of overlap is greater than or equal to the first threshold, and the distance between the position of the first target and the position of the second target is greater than the second threshold, the first device determines the confidence that the first target and the second target are the same target as the target confidence, and the target confidence is less than or equal to the second confidence threshold.
For example, as shown in fig. 10c, taking the first target as target A located at position P and the second target as target B located at position Q', it can be seen from fig. 10c that the distance between the position of target A and the position of target B at position Q' is L4; if L4 is greater than the second threshold, the confidence that target A and target B are the same target is less than or equal to the second confidence threshold.
In a possible implementation manner, in step 703 in this embodiment of the present application, the first device may determine, according to a relationship between the positions of the first object and the second object and the first region, and according to a relationship between a distance between the position of the first object and the position of the second object and the second threshold, that the first object and the second object are the same object, by: when the degree of overlap is greater than or equal to the first threshold, the position of the first target and the position of the second target are both located in the first region, and the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, then the first device determines the confidence that the first target and the second target are the same target as a target confidence, and the target confidence is greater than or equal to the first confidence threshold.
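Putting the combined condition above into code, a minimal hypothetical sketch (the confidence values 0.9 and 0.1, the function name, and the boolean both_in_first_region input are illustrative assumptions, not values from the patent):

    import math

    def target_confidence(overlap, first_threshold, p1, p2, second_threshold,
                          both_in_first_region, high=0.9, low=0.1):
        # p1, p2: (x, y) positions of the first target and the second target.
        if overlap < first_threshold:
            return low  # the two targets are not the same target
        close = math.dist(p1, p2) <= second_threshold  # second-threshold test
        if both_in_first_region and close:
            return high  # target confidence >= first confidence threshold
        return low       # target confidence <= second confidence threshold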
As shown in fig. 11, fig. 11 differs from fig. 9a in that in fig. 11 the position of target A and the position of target B are not only located within the first region, but the distance between them is also less than or equal to the second threshold, whereas in fig. 9a, although the positions of target A and target B are located within the first region, the distance between them is not necessarily less than or equal to the second threshold. Therefore, the confidence that target A and target B are the same target in fig. 11 is higher than the corresponding confidence in fig. 9a. Further, the area of the overlapping region in fig. 11 is larger than that in fig. 9a, so comparing the two figures it can be concluded that the likelihood that target A and target B in fig. 11 are the same target is higher. This is consistent with the fact that the distance between the position of target A and the position of target B in fig. 11 is smaller than that in fig. 9a; in other words, target A and target B are closer together in fig. 11 than in fig. 9a.
It should be noted that, denoting by target b' the target b located at position 1 and by target b'' the target b located at position 2, where position 1 and position 2 are different: in this embodiment of the present application, if the positions of target a and target b' are both located in the first region, and the positions of target a and target b'' are also located in the first region, and if the area of the overlapping region corresponding to target a and target b' is greater than the area of the overlapping region corresponding to target a and target b'', while the degrees of overlap of both overlapping regions are greater than or equal to the first threshold, then the confidence that target a and target b' are the same target is higher than the confidence that target a and target b'' are the same target.
It should also be noted that, if the area of the overlapping region corresponding to target a and target b' is larger than the area of the overlapping region corresponding to target a and target b'', and the degrees of overlap of both overlapping regions are greater than or equal to the first threshold, but the distance between target a and target b' is greater than the distance between target a and target b'', then the confidence that target a and target b'' are the same target is higher than the likelihood that target a and target b' are the same target.
In a possible implementation manner, when the size of the degree of overlap is greater than or equal to a first threshold, the position of the first target and the position of the second target are both located in the first region, the distance between the position of the first target and the position of the second target is less than or equal to a second threshold, and the confidence that the first target and the second target are the same target is a fifth confidence. When the degree of overlap is greater than or equal to the first threshold, at least one of the position of the first target and the position of the second target is located outside the first region, and/or the distance between the position of the first target and the position of the second target is greater than the second threshold, and the confidence that the first target and the second target are the same target is a sixth confidence. The fifth confidence level is greater than the sixth confidence level.
In the embodiment of the present application, "at least one of the position of the first target and the position of the second target is located outside the first region" may be replaced with: at most one of the position of the first target and the position of the second target is located within the first region. This covers the case where the position of the first target is within the first region while the position of the second target is outside it, the case where the position of the second target is within the first region while the position of the first target is outside it, and the case where both positions are outside the first region.
It is understood that, when the degree of overlap is greater than or equal to the first threshold, the position of the first object is located within the first region, the position of the second object is located outside the first region, and the distance between the position of the first object and the position of the second object is greater than the second threshold, the confidence that the first object and the second object are the same object is the sixth confidence. Or when the degree of overlap is greater than or equal to the first threshold, the position of the first target is located within the first region, and the position of the second target is located outside the first region, and then the confidence that the first target and the second target are the same target is the sixth confidence. Or when the degree of overlap is greater than or equal to the first threshold, if the distance between the position of the first target and the position of the second target is greater than the second threshold, the confidence that the first target and the second target are the same target is the sixth confidence.
When the degree of overlap is greater than or equal to the first threshold, the position of the second target is located within the first region, the position of the first target is located outside the first region, and the distance between the position of the first target and the position of the second target is greater than the second threshold, then the confidence that the first target and the second target are the same target is the sixth confidence. Or, when the degree of overlap is greater than or equal to the first threshold, the position of the second target is located in the first region, and the position of the first target is located outside the first region, then the confidence that the first target and the second target are the same target is the sixth confidence.
When the degree of overlap is greater than or equal to the first threshold, the position of the first target and the position of the second target are both located outside the first area, and the distance between the position of the first target and the position of the second target is greater than the second threshold, then the confidence that the first target and the second target are the same target is the sixth confidence. Or when the degree of overlap is greater than or equal to the first threshold, the position of the first target and the position of the second target are both located outside the first region, and the confidence that the first target and the second target are the same target is the sixth confidence.
In a possible embodiment, in order to further improve the accuracy of determining whether the first target and the second target are the same target, the first threshold may be determined according to the detection error of the first sensor or the detection error of the second sensor.
As a possible embodiment, the first threshold in the embodiment of the present application is determined by the second detection error, or the first threshold is determined by the fourth detection error.
As an example, the first threshold increases with an increase in the second detection error.
As another example, the first threshold increases with an increase in the fourth detection error.
In one possible implementation, the first threshold is a first value when the second detection error is the first error, and the first threshold is a second value when the second detection error is the second error, where the first error is greater than the second error, and the first value is greater than the second value.
For example, taking the second detection error as the distance measurement error of the first sensor in the horizontal direction, compare fig. 9a and fig. 12: in fig. 12, the distance measurement error (EXr) of the first sensor in the horizontal direction is the first error, while in fig. 9a it is the second error, and the comparison shows that the first error is larger than the second error. Comparing fig. 9a and fig. 12, it can be seen that as the distance measurement error (EXr) of the first sensor in the horizontal direction increases, the size of the overlapping region of the first target region and the second target region gradually increases (for example, the area of the first target region Rr in fig. 12 is larger than the area of the first target region Rr in fig. 9a). Therefore, if the first threshold is fixed, the accuracy of determining whether the first target and the second target are the same target using the relationship between the degree of overlap and the first threshold may be reduced. Based on this, the first threshold needs to be adjusted dynamically according to the distance measurement error of the first sensor in the horizontal direction.
As shown in fig. 12, when the distance measurement error (EXr) of the first sensor in the horizontal direction reaches a certain threshold a, the first target region covers the entire first area in the horizontal direction. As EXr continues to increase beyond a, the first target area keeps growing, but the size of the overlapping region of the first target area and the second target area no longer changes; that is, the overlapping region is in a "saturated" state. In this case, to reduce the influence of such saturation, the first threshold may be set to the first value for the situation in fig. 12 and to the second value for the situation in fig. 9a.
It will also be appreciated that the first value is a value set when the overlapping area is in a "saturated" state, and the second value is a value set when the overlapping area is not in a "saturated" state.
It can be understood that, in the embodiment of the present application, the first device stores a plurality of second detection errors and a value corresponding to each of them. Thus, when the second detection error is the first error, the first device may use the value corresponding to the first error as the first threshold.
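For the dynamic adjustment of the first threshold, one hedged possibility is a lookup table mapping the second detection error to a threshold value (all numbers below are made-up placeholders; the patent only requires the first threshold to increase with the error):

    FIRST_THRESHOLD_TABLE = [  # (second detection error, first threshold)
        (0.5, 0.30),
        (1.0, 0.40),
        (2.0, 0.55),
    ]

    def first_threshold(second_detection_error: float) -> float:
        # Use the value for the largest tabulated error not exceeding the
        # input, so the threshold increases as the detection error increases.
        threshold = FIRST_THRESHOLD_TABLE[0][1]
        for err, value in FIRST_THRESHOLD_TABLE:
            if second_detection_error >= err:
                threshold = value
        return threshold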
As another possible implementation, when the fourth detection error is a third error, the first threshold is a third value, and when the fourth detection error is a fourth error, the first threshold is a fourth value, where the third error is greater than the fourth error, and the third value is greater than the fourth value.
For example, taking the fourth detection error as the distance measurement error of the second sensor in the vertical direction, compare fig. 9a and fig. 13: the distance measurement error (EYi) of the second sensor in the vertical direction in fig. 13 is the third error, and in fig. 9a it is the fourth error; the comparison shows that the third error is larger than the fourth error. As the distance measurement error (EYi) of the second sensor in the vertical direction increases, the size of the overlapping region of the first target region and the second target region also gradually increases (for example, the area of the second target region Ri in fig. 13 is larger than the area of the second target region Ri in fig. 9a). Therefore, if the first threshold is fixed, the accuracy of determining whether the first target and the second target are the same target using the relationship between the degree of overlap and the first threshold may be reduced. Based on this, the first threshold needs to be adjusted dynamically according to the distance measurement error (EYi) of the second sensor in the vertical direction.
As shown in fig. 13, when the distance measurement error (EYi) of the second sensor in the vertical direction reaches a certain threshold b, the second target region covers the entire first area in the vertical direction. As EYi continues to increase, the second target area keeps growing, but this no longer affects the size of the overlapping region of the first target area and the second target area; that is, the overlapping region is in a "saturated" state. In this case, to reduce the influence of such saturation, the first threshold may be set to the third value for the situation in fig. 13 and to the fourth value for the situation in fig. 9a.
It can be understood that, in the embodiment of the present application, the first device stores a plurality of fourth detection errors and a value corresponding to each of them. Thus, when the fourth detection error is the third error, the first device may use the value corresponding to the third error as the first threshold. For example, the first device stores a plurality of distance measurement errors (EYi) of the second sensor in the vertical direction and a value corresponding to each of them; when the distance measurement error (EYi) of the second sensor in the vertical direction is the third error, the first device may use the value corresponding to the third error as the first threshold.
It should be noted that fig. 12 illustrates an increase of the distance measurement error of the first sensor in the horizontal direction, and fig. 13 illustrates an increase of the distance measurement error of the second sensor in the vertical direction. When both errors increase, the first threshold may be determined jointly by the distance measurement error of the first sensor in the horizontal direction and the distance measurement error of the second sensor in the vertical direction. For example, the first threshold varies (e.g., increases, as shown in fig. 14) as the two errors increase. Comparing fig. 14 and fig. 9a, in fig. 14 not only is the distance measurement error of the first sensor in the horizontal direction greater than in fig. 9a, but the distance measurement error of the second sensor in the vertical direction is also greater than in fig. 9a; so the first threshold for the situation shown in fig. 14 should be greater than the first threshold for the situation in fig. 9a. Alternatively, when both errors increase, the first threshold is a, or the first threshold is b, where a is greater than b, a is a parameter set when the overlapping region covers the first area, and b is a parameter set when the overlapping region does not cover the first area.
Of course, it is also understood that the first threshold value decreases as the second detection error of the first sensor decreases, and the first threshold value decreases as the fourth detection error of the second sensor decreases.
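When both errors are available, one simple realisation is a monotone blend of the two, capped so that the threshold never exceeds a meaningful overlap degree. Only the monotonicity is taken from the text above; the linear form, the coefficients, and the cap in this sketch are assumptions.

```python
def first_threshold_joint(ex_error: float, ey_error: float,
                          base: float = 0.3, gain: float = 0.05,
                          cap: float = 0.7) -> float:
    """Sketch: the first threshold grows with the horizontal ranging error
    of the first sensor (ex_error) and the vertical ranging error of the
    second sensor (ey_error), and shrinks when either error shrinks."""
    return min(base + gain * (ex_error + ey_error), cap)
```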
In a possible embodiment, the second threshold in this embodiment of the application is determined jointly by the second detection error and the fourth detection error. For example, the second threshold decreases as the second detection error and the fourth detection error increase. Alternatively, the second threshold increases as the second detection error and the fourth detection error decrease.
Specifically, when the second detection error of the first sensor is a fifth error and the fourth detection error is a sixth error, the second threshold is a fifth value; when the second detection error of the first sensor is a seventh error and the fourth detection error is an eighth error, the second threshold is a sixth value. The fifth value is smaller than the sixth value, the fifth error is larger than the seventh error, and the sixth error is larger than the eighth error.
For example, as shown in fig. 14, take the second detection error to be the distance measurement error of the first sensor in the horizontal direction and the fourth detection error to be the distance measurement error of the second sensor in the vertical direction. In fig. 14, the distance measurement error of the first sensor in the horizontal direction (for example, the fifth error) and the distance measurement error of the second sensor in the vertical direction (for example, the sixth error) are greater than the corresponding errors in fig. 9a (for example, the seventh error and the eighth error). In fig. 14, the overlapping region of the first target region and the second target region (the shaded portion in fig. 14) covers the whole first region, and the overlapping area has reached the "saturated" state; that is, even if the two errors continue to increase, the overlapping area no longer changes. Comparing fig. 14 and fig. 9a shows that the distance between the position of the first target and the position of the second target in fig. 14 is smaller than that in fig. 9a. Therefore, to reduce misjudgment, the second threshold corresponding to fig. 9a (where the distance measurement error of the first sensor in the horizontal direction is the seventh error and the distance measurement error of the second sensor in the vertical direction is the eighth error) should be larger than the second threshold corresponding to fig. 14 (where these errors are the fifth error and the sixth error), so as to reduce the influence caused by this saturation.
In other words, the second threshold is determined by the relationship between the overlap region and the first region. Specifically, the second threshold is a seventh value, or the second threshold is an eighth value, where the seventh value is greater than the eighth value, the eighth value is a parameter set when the overlapping area covers the first area, and the seventh value is a parameter set when the overlapping area does not cover the first area.
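A sketch of this coverage-based rule for the second threshold is given below; the region test is assumed to be supplied by the caller, and the two default values are invented for illustration.

```python
def select_second_threshold(overlap_covers_first_region: bool,
                            seventh_value: float = 3.0,
                            eighth_value: float = 1.5) -> float:
    """Sketch: second (distance) threshold in metres. The smaller eighth
    value applies once the overlap region covers the first region (the
    "saturated" case of fig. 14); the larger seventh value otherwise."""
    return eighth_value if overlap_covers_first_region else seventh_value
```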
Fig. 8 to 14 take the distance measurement error of the first sensor in the horizontal direction as an example of the second detection error of the first sensor.
It can be understood that, in this embodiment of the application, the first device may determine the confidence that the first target and the second target are the same target, for example, a confidence of 90% or 80%. Alternatively, the first device may directly determine whether the first target and the second target are the same target. For example, the first device outputs a first indicator or a second indicator, where the first indicator indicates that the first target and the second target are the same target, and the second indicator indicates that the first target and the second target are not the same target.
As a possible implementation, the method provided in this embodiment of the application may further include: the first device determines, according to the target confidence, whether the first target and the second target are the same target. For example, if the first device determines that the target confidence is greater than or equal to a first confidence threshold, it determines that the first target and the second target are the same target; if the first device determines that the target confidence is less than or equal to a second confidence threshold, it determines that the first target and the second target are not the same target.
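Reading steps 701 to 703 together, the whole decision can be sketched end to end. The axis-aligned rectangle model for the target regions, the confidence values, and the confidence thresholds below are assumptions; the embodiment does not prescribe a region shape or concrete numbers.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Region:
    """Axis-aligned rectangle on the XY plane (assumed region shape)."""
    x_min: float
    x_max: float
    y_min: float
    y_max: float

    def area(self) -> float:
        return max(0.0, self.x_max - self.x_min) * max(0.0, self.y_max - self.y_min)

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

def overlap_degree(a: Region, b: Region) -> float:
    """Step 702: overlap degree = area(intersection) / area(union)."""
    inter = Region(max(a.x_min, b.x_min), min(a.x_max, b.x_max),
                   max(a.y_min, b.y_min), min(a.y_max, b.y_max)).area()
    union = a.area() + b.area() - inter
    return inter / union if union > 0 else 0.0

def target_confidence(r1: Region, r2: Region,
                      p1: tuple, p2: tuple, first_region: Region,
                      first_threshold: float = 0.5,
                      second_threshold: float = 2.0) -> float:
    """Steps 701-703 sketch: confidence that the two detections are the
    same target (the 0.9/0.8/0.2 confidence values are illustrative)."""
    if overlap_degree(r1, r2) < first_threshold:
        return 0.0  # overlap gate of step 703 not reached
    both_inside = first_region.contains(*p1) and first_region.contains(*p2)
    close = hypot(p1[0] - p2[0], p1[1] - p2[1]) <= second_threshold
    if both_inside and close:
        return 0.9
    if both_inside or close:
        return 0.8
    return 0.2

# Hypothetical usage, ending with the confidence-threshold decision:
conf = target_confidence(Region(0, 4, 0, 6), Region(0.5, 4.5, 0.5, 6.5),
                         (2.0, 3.0), (2.4, 3.4), Region(0.5, 4, 0.5, 6))
same_target = conf >= 0.85       # first confidence threshold (assumed)
not_same_target = conf <= 0.3    # second confidence threshold (assumed)
```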
The above mainly describes the solution of the embodiment of the present application from the perspective of the first device. It is to be understood that the first device and the like include corresponding hardware structures and/or software modules for performing the respective functions in order to implement the above-described functions. Those of skill in the art would readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the first device may perform the division of the functional units according to the method example described above, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
The method of the embodiment of the present application is described above with reference to fig. 7 to 14, and the object detection apparatus provided by the embodiment of the present application for performing the method is described below. Those skilled in the art will appreciate that the method and the apparatus may be combined and cross-referenced; the object detection apparatus provided in the embodiments of the present application can perform the steps performed by the first device in the above-mentioned object detection method.
The following description will be given by taking the division of each function module corresponding to each function as an example:
In the case where each functional module is divided corresponding to each function, fig. 15 shows an object detection device according to the above embodiment, which may include: a first determination module 1501, a calculation module 1502, and a second determination module 1503.
In one example, the object detection apparatus is a first device or a chip applied in the first device. In this case, the first determining module 1501 is configured to support the object detecting apparatus to perform the step 701 performed by the first device in the above embodiment. A calculation module 1502 for enabling the object detection means to perform the step 702 performed by the first device in the above embodiments. A second determining module 1503, configured to support the target detecting apparatus to perform step 703 executed by the first device in the foregoing embodiment.
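As a purely structural sketch (the module names follow fig. 15; the bodies are placeholders, since the step logic itself is described above, and this is not the patented code):

```python
class ObjectDetectionApparatus:
    """Skeleton mirroring fig. 15: three functional modules mapped to
    steps 701-703 of the method embodiment (a sketch, not an implementation)."""

    def first_determine(self, detection_1, detection_2):
        """Module 1501 / step 701: determine the overlap region of the two
        target regions derived from the two sensors' detections."""
        raise NotImplementedError

    def calculate(self, overlap_region, union_region):
        """Module 1502 / step 702: compute the overlap degree of the overlap
        region within the union of the two target regions."""
        raise NotImplementedError

    def second_determine(self, degree, positions):
        """Module 1503 / step 703: derive the target confidence once the
        overlap degree reaches the first threshold."""
        raise NotImplementedError
```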
The object detection device may further include a storage module. The storage module is configured to store computer program code, where the computer program code includes instructions. If the target detection apparatus is a chip applied in the first device, the storage module may be a storage module in the chip (e.g., a register or a cache), or a storage module in the first device outside the chip (e.g., a read-only memory or a random access memory).
The first determination module 1501, the calculation module 1502, and the second determination module 1503 may be integrated on a processor or controller, which may be, for example, a central processing unit, a general-purpose processor, a digital signal processor, an application specific integrated circuit, a field programmable gate array or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. The processor may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with this disclosure. A processor may also be a combination implementing computing functions, e.g., a combination of one or more microprocessors, or a combination of a digital signal processor and a microprocessor.
For example, the object detection device according to the present application may be the object detection device shown in fig. 6. At this time, the first determination module 1501, the calculation module 1502, and the second determination module 1503 may be integrated on the processor 41 or the processor 45.
In the case of an integrated unit, fig. 16 shows a schematic diagram of a possible logical structure of the object detection apparatus according to the above embodiment. The object detection device includes: a processing unit 1601. The processing unit 1601 is used for control management of the operation of the target detection apparatus, and for example, the processing unit 1601 is used for executing steps of information/data processing in the target detection apparatus.
In a possible embodiment, the object detection device may further comprise a storage unit 1602 for storing program codes and data usable by the object detection device. In a possible embodiment, when the object detection apparatus is a server, the object detection apparatus may further comprise a communication unit 1603. The communication unit 1603 is used to support the step of information/data transmission or reception by the target detection apparatus.
Illustratively, the object detection device is the first device, or a chip applied in the first device. In this case, the processing unit 1601 is configured to support the target detection apparatus to execute steps 701 to 703 executed by the first device in the above-described embodiment.
Fig. 17 is a schematic structural diagram of a chip 150 according to an embodiment of the present disclosure. Chip 150 includes one or more (including two) processors 1510 and a communication interface 1530.
Optionally, the chip 150 further includes a memory 1540, which may include both read-only memory and random access memory, and provides operating instructions and data to the processor 1510. A portion of memory 1540 may also include non-volatile random access memory (NVRAM).
In some embodiments, memory 1540 stores elements, execution modules, or data structures, or a subset thereof, or an expanded set thereof.
In the embodiment of the present application, by calling an operation instruction stored in the memory 1540 (the operation instruction may be stored in an operating system), a corresponding operation is performed.
One possible implementation is: the first device is similar in structure and different devices may use different chips to implement their respective functions.
The processor 1510 controls processing operations of the first device, and the processor 1510 may also be referred to as a Central Processing Unit (CPU).
Memory 1540 can include both read-only memory and random-access memory, and provides instructions and data to processor 1510. A portion of memory 1540 may also include non-volatile random access memory (NVRAM). In application, the processor 1510, the communication interface 1530, and the memory 1540 are coupled together by a bus system 1520, where the bus system 1520 may include a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 1520 in fig. 17.
The method disclosed in the embodiments of the present application may be applied to the processor 1510 or implemented by the processor 1510. The processor 1510 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by an integrated logic circuit of hardware in the processor 1510 or by instructions in the form of software. The processor 1510 may be a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as a RAM, a flash memory, a ROM, a PROM, an EPROM, or a register. The storage medium is located in the memory 1540, and the processor 1510 reads the information in the memory 1540 and performs the steps of the above method in combination with its hardware.
In one possible implementation, the processor 1510 is configured to perform the steps of the process performed by the first device in the embodiment shown in fig. 7.
The above communication unit may be an interface circuit or a communication interface of the apparatus for receiving signals from other apparatuses. For example, when the device is implemented in the form of a chip, the communication unit is an interface circuit or a communication interface for the chip to receive signals from or transmit signals to other chips or devices.
In the above embodiments, the instructions stored by the memory for execution by the processor may be implemented in the form of a computer program product. The computer program product may be written in the memory in advance or may be downloaded in the form of software and installed in the memory.
The computer program product includes one or more computer instructions. The procedures or functions according to the embodiments of the present application are all or partially generated when the computer program instructions are loaded and executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) means. A computer-readable storage medium may be any available medium that a computer can access, or a data storage device, such as a server or a data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), among others.
In one aspect, a computer-readable storage medium is provided, in which instructions are stored, and when executed, the instructions cause a first device or a chip applied in the first device to perform steps 701, 702, and 703 in the embodiment.
The aforementioned readable storage medium may include: u disk, removable hard disk, read only memory, random access memory, magnetic or optical disk, etc. for storing program codes.
In one aspect, a computer program product including instructions is provided; when the instructions are executed, a first device or a chip applied in the first device performs steps 701, 702, and 703 in the embodiment.
In one aspect, a chip is provided, where the chip is applied to a first device, and the chip includes at least one processor and a communication interface, where the communication interface is coupled to the at least one processor, and the processor is configured to execute instructions to perform steps 701, 702, and 703 in the embodiments.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented using a software program, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The procedures or functions according to the embodiments of the present application are all or partially generated when the computer program instructions are loaded and executed on a computer. The computer may be a general purpose computer, a special purpose computer, a network of computers, or other programmable device. The computer instructions may be stored on a computer readable storage medium or transmitted from one computer readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a Solid State Disk (SSD)), among others.
While the present application has been described in connection with various embodiments, other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed application, from a review of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the word "a" or "an" does not exclude a plurality. A single processor or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
Although the present application has been described in conjunction with specific features and embodiments thereof, it will be evident that various modifications and combinations can be made thereto without departing from the spirit and scope of the application. Accordingly, the specification and figures are merely exemplary of the present application as defined in the appended claims and are intended to cover any and all modifications, variations, combinations, or equivalents within the scope of the present application. It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is also intended to include such modifications and variations.

Claims (25)

1. A method of object detection, comprising:
determining an overlapping area of a first target area determined by a detection error of a first sensor and a position of a first target detected by the first sensor and a second target area determined by a detection error of a second sensor and a position of a second target detected by the second sensor, the first target area including the position of the first target and the second target area including the position of the second target;
calculating the overlapping degree of the overlapping area in the union of the first target area and the second target area;
when the degree of overlap is greater than or equal to a first threshold, determining a target confidence that the first target and the second target are the same target according to a relationship between the position of the first target and the position of the second target and the first region, and/or according to a relationship between a distance between the position of the first target and the position of the second target and a second threshold;
wherein the first region is determined by a first detection error of the first sensor and a third detection error of the second sensor, the detection error of the first sensor comprising the first detection error and a second detection error, the detection error of the second sensor comprising the third detection error and a fourth detection error, and the third detection error being higher in accuracy than the fourth detection error.
2. The method of claim 1, wherein the relationship between the location of the first target and the location of the second target and the first region comprises:
the position of the first target and the position of the second target are both located within the first region.
3. The method of claim 1, wherein the relationship between the distance between the location of the first target and the location of the second target and the second threshold comprises:
a distance between the position of the first target and the position of the second target is less than or equal to a second threshold.
4. The method according to any one of claims 1 to 3, wherein the determining the target confidence that the first target and the second target are the same target according to the relationship between the position of the first target and the position of the second target and the first region, and/or according to the relationship between the distance between the position of the first target and the position of the second target and the second threshold comprises:
when the first target and the second target are both located in the first region, and/or the distance between the first target and the second target is smaller than or equal to the second threshold, determining the target confidence that the first target and the second target are the same target, wherein the target confidence is greater than or equal to a first confidence threshold.
5. The method according to any one of claims 1 to 4,
the first threshold is determined by the second detection error or the first threshold is determined by the fourth detection error.
6. The method of claim 5, wherein the first threshold increases with an increase in the second detection error or wherein the first threshold increases with an increase in the fourth detection error.
7. The method according to any one of claims 1 to 6,
the second threshold is determined by the second detection error and the fourth detection error.
8. The method of claim 7, wherein the second threshold decreases as the second detection error and the fourth detection error increase.
9. The method of any one of claims 1 to 8, wherein the first sensor is a radar and the second sensor is a camera.
10. The method according to any one of claims 1 to 9, wherein the first detection error is a range error of the first sensor in a vertical direction, and the second detection error is an angle measurement error of the first sensor or a range error of the first sensor in a horizontal direction;
the third detection error is an angle measurement error of the second sensor; the fourth detection error is a ranging error of the second sensor.
11. The method according to any one of claims 1 to 10, wherein the position of the first target is a position of the first target with respect to a reference point on an XY plane, an X-axis direction of the XY plane is a vehicle width direction, and a Y-axis direction of the XY plane is a vehicle length direction; the position of the second target is a position of the second target with respect to a reference point on the XY plane.
12. An object detection device, comprising:
a first determination module for determining an overlap region of a first target region determined by a detection error of a first sensor and a position of a first target detected by the first sensor and a second target region determined by a detection error of a second sensor and a position of a second target detected by the second sensor, the first target region including the position of the first target and the second target region including the position of the second target;
the calculation module is used for calculating the overlapping degree of the overlapping area in the union of the first target area and the second target area;
a second determining module, configured to: when the degree of overlap is greater than or equal to a first threshold, determine, as a target confidence, the confidence that the first target and the second target are the same target according to a relationship between the position of the first target and the position of the second target and the first region, and/or a relationship between a distance between the position of the first target and the position of the second target and a second threshold;
wherein the first region is determined by a first detection error of the first sensor and a third detection error of the second sensor, the detection error of the first sensor comprising the first detection error and a second detection error, the detection error of the second sensor comprising the third detection error and a fourth detection error, and the third detection error being higher in accuracy than the fourth detection error.
13. The apparatus of claim 12, wherein the relationship between the location of the first target and the location of the second target and the first region comprises:
the position of the first target and the position of the second target are both located within the first region.
14. The apparatus of claim 12, wherein the relationship between the distance between the location of the first target and the location of the second target and the second threshold comprises:
a distance between the position of the first target and the position of the second target is less than or equal to a second threshold.
15. The apparatus according to any one of claims 12 to 14,
when the first target and the second target are both located in the first region, and/or the distance between the first target and the second target is smaller than or equal to the second threshold, the second determining module is configured to determine the target confidence, wherein the target confidence is greater than or equal to a first confidence threshold.
16. The apparatus according to any one of claims 12 to 15,
the first threshold is determined by the second detection error or the first threshold is determined by the fourth detection error.
17. The apparatus of claim 16, wherein the first threshold increases with an increase in the second detection error or wherein the first threshold increases with an increase in the fourth detection error.
18. The apparatus according to any one of claims 12 to 17,
the second threshold is determined by the second detection error and the fourth detection error.
19. The apparatus of claim 18, wherein the second threshold decreases as the second detection error and the fourth detection error increase.
20. The apparatus of any one of claims 12 to 19, wherein the first sensor is a radar and the second sensor is a camera.
21. The apparatus according to any one of claims 12 to 20, wherein the first detection error is a distance measurement error of the first sensor in a vertical direction, and the second detection error is an angle measurement error of the first sensor or a distance measurement error of the first sensor in a horizontal direction;
the third detection error is an angle measurement error of the second sensor; the fourth detection error is a ranging error of the second sensor.
22. The apparatus according to any one of claims 12 to 21, wherein the position of the first target is a position of the first target with respect to a reference point on an XY plane, an X-axis direction of the XY plane is a vehicle width direction, and a Y-axis direction of the XY plane is a vehicle length direction; the position of the second target is a position of the second target with respect to a reference point on the XY plane.
23. A computer-readable storage medium having stored thereon instructions which, when executed, implement the object detection method of any one of claims 1-11.
24. A chip, characterized in that the chip comprises at least one processor and a communication interface, the communication interface being coupled to the at least one processor, the at least one processor being configured to execute a computer program or instructions to implement the object detection method according to any of claims 1-11, the communication interface being configured to communicate with other modules than the chip.
25. An object detection device, comprising: at least one processor coupled to a memory having a computer program or instructions stored therein, the at least one processor configured to execute the computer program or instructions stored in the memory to implement the object detection method of any of claims 1-11.
CN202080004870.XA 2020-03-26 2020-03-26 Target detection method and device Active CN112689842B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/081506 WO2021189385A1 (en) 2020-03-26 2020-03-26 Target detection method and apparatus

Publications (2)

Publication Number Publication Date
CN112689842A true CN112689842A (en) 2021-04-20
CN112689842B CN112689842B (en) 2022-04-29

Family

ID=75457662

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080004870.XA Active CN112689842B (en) 2020-03-26 2020-03-26 Target detection method and device

Country Status (3)

Country Link
EP (1) EP4123337A4 (en)
CN (1) CN112689842B (en)
WO (1) WO2021189385A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115035186A (en) * 2021-12-03 2022-09-09 荣耀终端有限公司 Target object marking method and terminal equipment
WO2023284333A1 (en) * 2021-07-14 2023-01-19 魔门塔(苏州)科技有限公司 Method and apparatus for determining confidence of automatic driving strategy

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102749626A (en) * 2012-07-17 2012-10-24 奇瑞汽车股份有限公司 Radar sensor, automobile and target direction identification method
CN103837872A (en) * 2012-11-22 2014-06-04 株式会社电装 Object detection apparatus
US20150359512A1 (en) * 2014-06-11 2015-12-17 The Johns Hopkins University Synthetic aperture ultrasound system
US20180218228A1 (en) * 2017-01-31 2018-08-02 Denso Corporation Apparatus and method for controlling vehicle
CN108549876A (en) * 2018-04-20 2018-09-18 重庆邮电大学 The sitting posture detecting method estimated based on target detection and human body attitude
CN109212513A (en) * 2018-09-29 2019-01-15 河北德冠隆电子科技有限公司 Multiple target between radar data transmitting, data fusion and localization method is continuously tracked
CN110133637A (en) * 2019-06-05 2019-08-16 中国科学院长春光学精密机械与物理研究所 Object localization method, apparatus and system
WO2019184709A1 (en) * 2018-03-29 2019-10-03 上海智瞳通科技有限公司 Data processing method and device based on multi-sensor fusion, and multi-sensor fusion method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4428277B2 (en) * 2005-04-07 2010-03-10 トヨタ自動車株式会社 Object detection device
JP4876118B2 (en) * 2008-12-08 2012-02-15 日立オートモティブシステムズ株式会社 Three-dimensional object appearance detection device
JP6075408B2 (en) * 2012-11-22 2017-02-08 株式会社デンソー Target detection device
JP5812061B2 (en) * 2013-08-22 2015-11-11 株式会社デンソー Target detection apparatus and program
JP6512164B2 (en) * 2016-04-22 2019-05-15 株式会社デンソー Object detection apparatus, object detection method
CN106249239B (en) * 2016-08-23 2019-01-01 深圳市速腾聚创科技有限公司 Object detection method and device
JP6885721B2 (en) * 2016-12-27 2021-06-16 株式会社デンソー Object detection device, object detection method


Also Published As

Publication number Publication date
EP4123337A1 (en) 2023-01-25
EP4123337A4 (en) 2023-04-12
WO2021189385A1 (en) 2021-09-30
CN112689842B (en) 2022-04-29

Similar Documents

Publication Publication Date Title
CN112213725B (en) Multipath false alarm suppression method and device for vehicle-mounted radar and terminal equipment
US11561295B2 (en) Position detecting method, device and storage medium for vehicle ladar
CN109884639B (en) Obstacle detection method and device for mobile robot
CN109828250B (en) Radar calibration method, calibration device and terminal equipment
CN112689842B (en) Target detection method and device
CN108859952B (en) Vehicle lane change early warning method and device and radar
US11954918B2 (en) Object detection device, object detection method, and storage medium
CN111105465B (en) Camera device calibration method, device, system electronic equipment and storage medium
JP2022087821A (en) Data fusion method and device
CN113741446B (en) Robot autonomous exploration method, terminal equipment and storage medium
WO2023024087A1 (en) Method, apparatus and device for processing laser radar point cloud, and storage medium
CN111722297B (en) Target existence probability calculation method and device, electronic equipment and storage medium
US11020857B2 (en) Robot distance measuring method, apparatus and robot using the same
CN115951336A (en) Method, device and equipment for determining laser radar error and storage medium
CN115728772A (en) Laser scanning point type detection method and device and terminal equipment
CN117677862A (en) Pseudo image point identification method, terminal equipment and computer readable storage medium
US20220012492A1 (en) Object detection device
CN116071421A (en) Image processing method, device and computer readable storage medium
CN111489375B (en) Information detection method, device and equipment
CN113140120B (en) Method and device for determining traffic indication information
CN116189137B (en) Parking space detection method, electronic equipment and computer readable storage medium
CN110471077B (en) Positioning method and device
JP7474689B2 (en) Object detection method and object detection device
CN115963485A (en) Speed detection method, device, equipment and readable storage medium
CN116626630B (en) Object classification method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant