WO2021189385A1 - Target detection method and apparatus

Target detection method and apparatus

Info

Publication number
WO2021189385A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
area
detection error
threshold
sensor
Prior art date
Application number
PCT/CN2020/081506
Other languages
English (en)
French (fr)
Inventor
林永兵
马莎
张慧
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to CN202080004870.XA priority Critical patent/CN112689842B/zh
Priority to EP20926907.5A priority patent/EP4123337A4/en
Priority to PCT/CN2020/081506 priority patent/WO2021189385A1/zh
Publication of WO2021189385A1 publication Critical patent/WO2021189385A1/zh

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06Systems determining position data of a target
    • G01S13/42Simultaneous measurement of distance and other co-ordinates
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867Combination of radar systems with cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/809Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of classification results, e.g. where the classifiers operate on the same input data
    • G06V10/811Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of classification results, e.g. where the classifiers operate on the same input data the classifiers operating on different input data, e.g. multi-modal recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

Definitions

  • the embodiments of the present application relate to the field of automatic driving, and in particular, to a target detection method and device.
  • the realization of autonomous driving environment perception relies on the sensors deployed on the vehicle, where the sensors deployed on the vehicle may include cameras and millimeter wave radars. Multiple sensors work together and complement each other's advantages, which can enhance the perception and reliability of the autonomous driving environment.
  • the camera has the characteristics of passive measurement, non-contact measurement, high resolution, convenient use, and low cost. It is an essential sensor for automatic driving environment perception, target detection, and advanced driver assistance system (ADAS). Compared with cameras, millimeter-wave radar is an active measurement method, with high ranging accuracy, all-weather work, moderate price, and more and more widely used in autonomous driving situations.
  • the detection range of the camera and the millimeter wave radar overlaps, which can realize the simultaneous detection of the same target.
  • the accuracy and reliability of target detection can be improved by fusing the target detected by the camera and the target detected by the millimeter wave radar.
  • due to the detection errors of the sensors, how to effectively fuse the same target detected by different sensors is a fundamental problem in achieving functions such as target tracking and vehicle obstacle avoidance.
  • the embodiments of the present application provide a target detection method and device to improve the accuracy of judging that the target detected by different sensors is the same target.
  • an embodiment of the present application provides a target detection method, including: a first device determines an overlapping area of a first target area and a second target area, where the first target area is determined by the detection error of the first sensor and the position of the first target detected by the first sensor, and the second target area is determined by the detection error of the second sensor and the position of the second target detected by the second sensor.
  • the first target area includes the position of the first target, and the second target area includes the position of the second target.
  • the first device calculates the degree of overlap of the overlapping area in the union of the first target area and the second target area.
  • when the degree of overlap is greater than or equal to a first threshold, the first device determines, based on the relationship between the position of the first target and the position of the second target and the first area, and/or based on the relationship between the distance between the position of the first target and the position of the second target and a second threshold, the confidence that the first target and the second target are the same target, as the target confidence.
  • the first area is determined by the first detection error of the first sensor and the third detection error of the second sensor.
  • the detection error of the first sensor includes a first detection error and a second detection error.
  • the accuracy of the first detection error is higher than the accuracy of the second detection error.
  • the detection error of the second sensor includes a third detection error and a fourth detection error, and the accuracy of the third detection error is higher than that of the fourth detection error.
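  • the two uncertainty regions described above can be sketched as axis-aligned rectangles built from each detected position and the corresponding detection errors. The rectangle shape and the `Rect`/`target_area` names are illustrative assumptions; the application only states that each area is determined by the sensor's detection error and the detected position.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x_min: float
    x_max: float
    y_min: float
    y_max: float

def target_area(x, y, err_x, err_y):
    """Rectangular uncertainty region centred on a detected position.

    (x, y) is the detected target position; err_x / err_y are the
    sensor's detection errors along the X (vehicle width) and
    Y (vehicle length) axes.
    """
    return Rect(x - err_x, x + err_x, y - err_y, y + err_y)

# Example: a radar tends to have a small longitudinal (ranging) error
# and a larger lateral (angle-induced) error; a camera the reverse.
radar_area = target_area(1.0, 20.0, err_x=0.8, err_y=0.2)
camera_area = target_area(1.2, 20.4, err_x=0.3, err_y=1.5)
```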
  • the embodiment of the present application provides a target detection method.
  • the first device determines the degree of overlap between the first target area and the second target area; then, when the degree of overlap is greater than or equal to the first threshold, the first device determines the confidence that the first target and the second target are the same target, as the target confidence, based on the relationship between the position of the first target and the position of the second target and the first area, and/or based on the relationship between the distance between the position of the first target and the position of the second target and the second threshold.
  • performing this further determination only when the degree of overlap is greater than or equal to the first threshold improves the accuracy of the confidence that the first target and the second target are the same target, and reduces misjudgments.
  • the relationship between the position of the first target and the position of the second target and the first area includes: the position of the first target and the position of the second target are both located in the first area; or, at least one of the position of the first target and the position of the second target is located outside the first area.
  • when the degree of overlap is greater than or equal to the first threshold, and the position of the first target and the position of the second target are both located in the first area, the target confidence is the first confidence.
  • otherwise, the target confidence is the second confidence.
  • the first confidence is greater than the second confidence. In other words, when the degree of overlap is greater than or equal to the first threshold, the confidence that a first target and a second target both located in the first area are the same target is greater than the confidence that a first target and a second target located outside the first area are the same target.
  • the relationship between the distance between the position of the first target and the position of the second target and the second threshold includes: the distance between the position of the first target and the position of the second target is less than Or equal to the second threshold.
  • the relationship between the distance between the position of the first target and the position of the second target and the second threshold includes: the distance between the position of the first target and the position of the second target is greater than the second threshold.
  • when the degree of overlap is greater than or equal to the first threshold, and the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the target confidence is the third confidence.
  • otherwise, the target confidence is the fourth confidence.
  • the third confidence is greater than the fourth confidence. That is, when the degree of overlap is greater than or equal to the first threshold, the confidence that two closer targets (for example, where the distance between the position of the first target and the position of the second target is less than or equal to the second threshold) are the same target is higher than the confidence that two farther targets (for example, where the distance is greater than the second threshold) are the same target.
  • the first device determining, based on the relationship between the position of the first target and the position of the second target and the first area, and/or based on the relationship between the distance between the position of the first target and the position of the second target and the second threshold, the confidence that the first target and the second target are the same target as the target confidence includes: when the position of the first target and the position of the second target are both located in the first area, and/or the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the first device determines the confidence as the target confidence, where the target confidence is greater than or equal to the first confidence threshold. This further improves the accuracy of the confidence that the first target and the second target are the same target.
  • the first device determining, based on the relationship between the position of the first target and the position of the second target and the first area, the confidence that the first target and the second target are the same target as the target confidence includes: when the position of the first target and the position of the second target are both located in the first area, the first device determines the confidence as the target confidence, where the target confidence is greater than or equal to the first confidence threshold.
  • the first device determining, based on the relationship between the position of the first target and the position of the second target and the first area, the confidence that the first target and the second target are the same target as the target confidence includes: when at least one of the position of the first target and the position of the second target is located outside the first area, the first device determines the confidence that the first target and the second target are the same target as the target confidence, where the target confidence is less than or equal to the second confidence threshold.
  • the first device determining, based on the relationship between the distance between the position of the first target and the position of the second target and the second threshold, the confidence that the first target and the second target are the same target as the target confidence includes: when the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the first device determines the confidence as the target confidence, where the target confidence is greater than or equal to the first confidence threshold.
  • the first device determining, based on the relationship between the distance between the position of the first target and the position of the second target and the second threshold, the confidence that the first target and the second target are the same target as the target confidence includes: when the distance between the position of the first target and the position of the second target is greater than the second threshold, the first device determines the confidence as the target confidence, where the target confidence is less than or equal to the second confidence threshold. This makes it convenient for the first device to determine, when the degree of overlap is greater than or equal to the first threshold, that two distant targets are likely not the same target.
  • the first device determining, based on both the relationship between the position of the first target and the position of the second target and the first area, and the relationship between the distance between the position of the first target and the position of the second target and the second threshold, the confidence that the first target and the second target are the same target includes: when the degree of overlap is greater than or equal to the first threshold, and the position of the first target and the position of the second target are both located in the first area, and the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the first device determines the confidence that the first target and the second target are the same target as the target confidence, where the target confidence is greater than or equal to the first confidence threshold.
  • adding the second threshold as a judgment metric further increases the reliability of determining that the first target and the second target are the same target.
  • when the degree of overlap is greater than or equal to the first threshold, and the position of the first target and the position of the second target are both located in the first area, and the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the target confidence is the fifth confidence.
  • when the degree of overlap is greater than or equal to the first threshold, and at least one of the position of the first target and the position of the second target is outside the first area, and/or the distance between the position of the first target and the position of the second target is greater than the second threshold, the target confidence is the sixth confidence.
  • the fifth confidence is greater than the sixth confidence.
  • when the target confidence is greater than or equal to the first confidence threshold, the first target and the second target can be considered to be the same target. In the embodiment of the present application, when the target confidence is less than or equal to the second confidence threshold, the first target and the second target can be considered not to be the same target.
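  • the two-stage rule above (overlap check first, then area/distance check) can be sketched as follows; the concrete confidence values, the tuple-based rectangle representation, and the function names are illustrative assumptions, not values specified by the application.

```python
import math

def in_area(rect, p):
    """rect = (x_min, x_max, y_min, y_max); p = (x, y)."""
    x_min, x_max, y_min, y_max = rect
    return x_min <= p[0] <= x_max and y_min <= p[1] <= y_max

def same_target_confidence(overlap, p1, p2, first_area,
                           first_threshold, second_threshold,
                           high=0.9, low=0.1):
    """Return the confidence that two detections are the same target."""
    if overlap < first_threshold:
        return low                      # overlap check failed outright
    both_in = in_area(first_area, p1) and in_area(first_area, p2)
    close = math.dist(p1, p2) <= second_threshold
    # Fifth (high) confidence only when both extra checks pass,
    # sixth (low) confidence otherwise.
    return high if (both_in and close) else low
```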
  • the first threshold is determined by the second detection error, or the first threshold is determined by the fourth detection error.
  • the first threshold value increases as the second detection error increases, or the first threshold value increases as the fourth detection error increases.
  • when the second detection error is the first error, the first threshold is the first value; when the second detection error is the second error, the first threshold is the second value, where the first error is greater than the second error and the first value is greater than the second value. Alternatively, when the fourth detection error is the third error, the first threshold is the third value; when the fourth detection error is the fourth error, the first threshold is the fourth value, where the third error is greater than the fourth error and the third value is greater than the fourth value.
  • the second threshold is determined by the second detection error and the fourth detection error.
  • the second threshold value decreases as the second detection error and the fourth detection error increase.
  • when the second detection error is the fifth error and the fourth detection error is the sixth error, the second threshold is the fifth value; when the second detection error is the seventh error and the fourth detection error is the eighth error, the second threshold is the sixth value, where the fifth value is less than the sixth value, the fifth error is greater than the seventh error, and the sixth error is greater than the eighth error.
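  • a minimal sketch of the monotonic relationships described above: the first threshold grows with the lower-accuracy error, and the second threshold shrinks as the two lower-accuracy errors grow. The linear forms and the constants are illustrative assumptions; the text only fixes the direction of the dependence.

```python
def first_threshold(coarse_error, base=0.3, gain=0.05):
    """Overlap threshold: larger when the second (or fourth)
    detection error is larger."""
    return base + gain * coarse_error

def second_threshold(second_error, fourth_error, base=5.0, gain=0.5):
    """Distance threshold: smaller when the second and fourth
    detection errors are larger (clamped at zero)."""
    return max(0.0, base - gain * (second_error + fourth_error))
```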
  • the degree of overlap is a ratio of the overlapping area of the first target area and the second target area in the union of the first target area and the second target area.
  • alternatively, the step in which the first device calculates the degree of overlap of the overlapping area in the union of the first target area and the second target area can be replaced by the following: the first device calculates the area of the overlapping area, and the degree of overlap above is replaced by the area of the overlapping area.
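  • assuming rectangular target areas, the degree of overlap defined above is intersection-over-union; the area-only variant in the preceding paragraph simply drops the division. A sketch, with rectangles given as `(x_min, x_max, y_min, y_max)` tuples (an assumed representation):

```python
def overlap_degree(a, b):
    """Ratio of the overlapping area to the union of two axis-aligned
    rectangles, each given as (x_min, x_max, y_min, y_max)."""
    w = min(a[1], b[1]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[2], b[2])
    inter = max(0.0, w) * max(0.0, h)   # overlapping area (0 if disjoint)
    union = ((a[1] - a[0]) * (a[3] - a[2])
             + (b[1] - b[0]) * (b[3] - b[2]) - inter)
    return inter / union if union > 0 else 0.0
```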
  • the first sensor is a radar and the second sensor is a camera.
  • the first sensor and the second sensor are both on-board sensors.
  • for example, the first sensor is a vehicle-mounted radar and the second sensor is a vehicle-mounted camera.
  • the first detection error is the distance measurement error of the first sensor in the vertical direction, and the second detection error is the angle measurement error of the first sensor or the distance measurement error of the first sensor in the horizontal direction.
  • the third detection error is the angle measurement error of the second sensor.
  • the fourth detection error is the ranging error of the second sensor.
  • the position of the first target is the position of the first target relative to the reference point on the XY plane, where the X-axis direction of the XY plane is the width direction of the vehicle and the Y-axis direction of the XY plane is the length direction of the vehicle.
  • the position of the second target is the position of the second target relative to the reference point on the XY plane.
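  • for example, a radar range/azimuth measurement might be mapped into this XY vehicle frame as below; the convention that zero azimuth points along the Y (vehicle length) axis is an assumption.

```python
import math

def radar_to_xy(range_m, azimuth_rad):
    """Convert a radar measurement to the vehicle XY frame:
    X along the vehicle width, Y along the vehicle length,
    origin at the reference point."""
    return (range_m * math.sin(azimuth_rad),
            range_m * math.cos(azimuth_rad))
```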
  • the first device may be a vehicle or a server, or a component in the vehicle (for example, a circuit system inside the vehicle) capable of realizing the processing function or communication function of the vehicle, for example, a telematics box (T-BOX).
  • alternatively, the first device may be a chip in the T-BOX.
  • in a second aspect, the embodiments of the present application provide a target detection device, which can implement the method of the first aspect or any possible implementation manner of the first aspect, and can therefore also achieve the beneficial effects of the first aspect or any possible implementation manner of the first aspect.
  • the target detection device may be the first device, or a device that can support the first device in implementing the method of the first aspect or any possible implementation manner of the first aspect, for example, a chip or circuit component applied to the first device.
  • the device can implement the above method by software, by hardware, or by hardware executing corresponding software.
  • a target detection device includes: a first determining module for determining an overlapping area of a first target area and a second target area, where the first target area is determined by the detection error of the first sensor and the position of the first target detected by the first sensor, and the second target area is determined by the detection error of the second sensor and the position of the second target detected by the second sensor.
  • the first target area includes the position of the first target, and the second target area includes the position of the second target.
  • the calculation module is used to calculate the degree of overlap of the overlapping area in the union of the first target area and the second target area.
  • the second determining module is used to determine, based on the relationship between the position of the first target and the position of the second target and the first area, and/or based on the relationship between the distance between the position of the first target and the position of the second target and the second threshold, the confidence that the first target and the second target are the same target, as the target confidence.
  • the first area is determined by the first detection error of the first sensor and the third detection error of the second sensor.
  • the detection error of the first sensor includes the first detection error and the second detection error, and the accuracy of the first detection error is higher than the accuracy of the second detection error.
  • the detection error of the second sensor includes the third detection error and the fourth detection error, and the accuracy of the third detection error is higher than that of the fourth detection error.
  • the relationship between the position of the first target and the position of the second target and the first area includes: the position of the first target and the position of the second target are both located in the first area; or, at least one of the position of the first target and the position of the second target is located outside the first area.
  • when the degree of overlap is greater than or equal to the first threshold, and the position of the first target and the position of the second target are both located in the first area, the target confidence is the first confidence.
  • otherwise, the target confidence is the second confidence.
  • the first confidence is greater than the second confidence.
  • the relationship between the distance between the position of the first target and the position of the second target and the second threshold includes: the distance between the position of the first target and the position of the second target is less than Or equal to the second threshold.
  • the relationship between the distance between the position of the first target and the position of the second target and the second threshold includes: the distance between the position of the first target and the position of the second target is greater than the second threshold.
  • when the degree of overlap is greater than or equal to the first threshold, and the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the target confidence is the third confidence.
  • otherwise, the target confidence is the fourth confidence.
  • the third confidence is greater than the fourth confidence. That is, when the degree of overlap is greater than or equal to the first threshold, the confidence that two closer targets (for example, where the distance between the position of the first target and the position of the second target is less than or equal to the second threshold) are the same target is higher than the confidence that two farther targets (for example, where the distance is greater than the second threshold) are the same target.
  • when the position of the first target and the position of the second target are both located in the first area, and/or the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the second determining module is used to determine the confidence as the target confidence, where the target confidence is greater than or equal to the first confidence threshold. This further improves the accuracy of the confidence that the first target and the second target are the same target.
  • the second determining module being configured to determine, based on the relationship between the position of the first target and the position of the second target and the first area, the confidence that the first target and the second target are the same target as the target confidence includes: when the position of the first target and the position of the second target are both located in the first area, the second determining module is used to determine the confidence as the target confidence, where the target confidence is greater than or equal to the first confidence threshold.
  • the second determining module being configured to determine, based on the relationship between the position of the first target and the position of the second target and the first area, the confidence that the first target and the second target are the same target as the target confidence includes: when at least one of the position of the first target and the position of the second target is located outside the first area, the second determining module is used to determine the confidence that the first target and the second target are the same target as the target confidence, where the target confidence is less than or equal to the second confidence threshold.
  • the second determining module being configured to determine, based on the relationship between the distance between the position of the first target and the position of the second target and the second threshold, the confidence that the first target and the second target are the same target as the target confidence includes: when the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the second determining module is used to determine the confidence as the target confidence, where the target confidence is greater than or equal to the first confidence threshold.
  • the second determining module being configured to determine, based on the relationship between the distance between the position of the first target and the position of the second target and the second threshold, the confidence that the first target and the second target are the same target as the target confidence includes: when the distance between the position of the first target and the position of the second target is greater than the second threshold, the second determining module is used to determine the confidence as the target confidence, where the target confidence is less than or equal to the second confidence threshold. This makes it convenient for the first device to determine, when the degree of overlap is greater than or equal to the first threshold, that two distant targets are not the same target.
  • when the degree of overlap is greater than or equal to the first threshold, the second determining module being used to determine the confidence that the first target and the second target are the same target based on both the relationship between the position of the first target and the position of the second target and the first area, and the relationship between the distance between the position of the first target and the position of the second target and the second threshold includes: when the degree of overlap is greater than or equal to the first threshold, and the position of the first target and the position of the second target are both located in the first area, and the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the second determining module is used to determine the confidence that the first target and the second target are the same target as the target confidence, where the target confidence is greater than or equal to the first confidence threshold.
• when the position of the first target and the position of the second target are both located in the first area, and the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the target confidence is the fifth confidence.
• when the degree of overlap is greater than or equal to the first threshold, and at least one of the position of the first target and the position of the second target is outside the first area, and/or the distance between the position of the first target and the position of the second target is greater than the second threshold, the target confidence is the sixth confidence.
  • the fifth degree of confidence is greater than the sixth degree of confidence.
• in the embodiment of the present application, when the target confidence is greater than or equal to the first confidence threshold, the first target and the second target can be considered to be the same target; when the target confidence is less than or equal to the second confidence threshold, the first target and the second target can be considered not to be the same target.
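The decision rules above can be sketched as a small function. This is a hypothetical illustration, not the claimed implementation: the predicate for the first area, the threshold values, and the fifth/sixth confidence values (0.9 and 0.3 here) are all made-up assumptions.

```python
import math

def target_confidence(p1, p2, in_first_area, overlap, t1, t2,
                      fifth_conf=0.9, sixth_conf=0.3):
    """Confidence that two detections are the same target.

    p1, p2        -- (x, y) positions of the first and second target
    in_first_area -- predicate: does a position lie in the first area?
    overlap       -- degree of overlap of the two error regions
    t1, t2        -- first threshold (overlap) and second threshold (distance)
    """
    if overlap < t1:
        # Low-overlap case is handled by other rules; not modeled here.
        return None
    dist = math.hypot(p1[0] - p2[0], p1[1] - p2[1])
    if in_first_area(p1) and in_first_area(p2) and dist <= t2:
        return fifth_conf   # both positions in the first area and close: high confidence
    return sixth_conf       # otherwise: lower confidence
```

Note that the fifth confidence exceeds the sixth, matching the relationship stated above.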
  • the first threshold is determined by the second detection error, or the first threshold is determined by the fourth detection error.
  • the first threshold value increases as the second detection error increases, or the first threshold value increases as the fourth detection error increases.
• when the second detection error is the first error, the first threshold is the first value; when the second detection error is the second error, the first threshold is the second value, where the first error is greater than the second error, and the first value is greater than the second value. Alternatively, when the fourth detection error is the third error, the first threshold is the third value; when the fourth detection error is the fourth error, the first threshold is the fourth value, where the third error is greater than the fourth error, and the third value is greater than the fourth value.
  • the second threshold is determined by the second detection error and the fourth detection error.
  • the second threshold value decreases as the second detection error and the fourth detection error increase.
• when the second detection error is the fifth error and the fourth detection error is the sixth error, the second threshold is the fifth value; when the second detection error is the seventh error and the fourth detection error is the eighth error, the second threshold is the sixth value, where the fifth value is less than the sixth value, the fifth error is greater than the seventh error, and the sixth error is greater than the eighth error.
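The monotone relationships stated above (the first threshold grows with the detection error, while the second threshold shrinks as the combined errors grow) can be sketched as follows. The base values and scaling constants are illustrative assumptions; the document only constrains the direction of the monotonicity, not the functional form.

```python
def first_threshold(second_detection_error, base=0.3, k=0.05):
    # Larger second (or fourth) detection error -> larger overlap threshold.
    return base + k * second_detection_error

def second_threshold(second_detection_error, fourth_detection_error,
                     base=10.0, k=0.5):
    # Larger combined errors -> smaller allowed center-to-center distance.
    return base / (1.0 + k * (second_detection_error + fourth_detection_error))
```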
• the degree of overlap is the ratio of the overlapping area of the first target area and the second target area to the union of the first target area and the second target area.
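For the case where both target areas are axis-aligned rectangles (as when both are bounded by ranging errors), this intersection-over-union ratio can be computed directly. A minimal sketch, with rectangles given as (xmin, ymin, xmax, ymax):

```python
def rect_iou(a, b):
    """Degree of overlap (intersection / union) of two axis-aligned rectangles."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))  # intersection width
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))  # intersection height
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```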
  • the first sensor is a radar and the second sensor is a camera.
  • the first sensor and the second sensor are both on-board sensors.
  • the first sensor is a vehicle-mounted radar
  • the second sensor is a vehicle-mounted camera.
  • the first detection error is the distance measurement error of the first sensor in the vertical direction
  • the second detection error is the angle measurement error of the first sensor or the distance measurement error of the first sensor in the horizontal direction.
  • the third detection error is the angle measurement error of the second sensor; the fourth detection error is the distance measurement error of the second sensor.
  • the position of the first target is the position of the first target relative to the reference point on the XY plane
  • the X-axis direction of the XY plane is the width direction of the vehicle
  • the Y-axis direction of the XY plane is the length direction of the vehicle.
  • the position of the second target is the position of the second target relative to the reference point on the XY plane.
  • an embodiment of the present application provides a target detection device.
  • the target detection device may be a first device or a chip in the first device.
  • the target detection device may include: a processing unit.
  • the processing unit may be a processor.
  • the processing unit may replace the above-mentioned first determination module, second determination module, and calculation module.
  • the target detection device may further include a communication unit, and the communication unit may be an interface circuit.
  • the target detection device may also include a storage unit.
  • the storage unit may be a memory. The storage unit is used to store computer program code, and the computer program code includes instructions.
  • the processing unit executes the instructions stored in the storage unit, so that the target detection device implements the first aspect or the method described in any one of the possible implementation manners of the first aspect.
• the processing unit may be a processor, and the communication unit may be referred to as a communication interface.
  • the communication interface can be an input/output interface, a pin or a circuit, and so on.
  • the processing unit executes the computer program code stored in the storage unit to enable the first device to implement the method described in the first aspect or any one of the possible implementations of the first aspect.
• the storage unit may be a storage unit in the chip (for example, a register, a cache, etc.), or may be a storage unit (for example, a read-only memory, a random access memory, etc.) located outside the chip in the first device.
  • the processor, the communication interface and the memory are coupled with each other.
• the embodiments of the present application provide a computer-readable storage medium, and a computer program or instruction is stored in the computer-readable storage medium; when the computer program or instruction runs on a computer, the computer executes the target detection method described in the first aspect or its various possible implementations.
  • the embodiments of the present application provide a computer program product including instructions.
• when the instructions run on a computer, the computer executes the target detection method described in the first aspect or the various possible implementations of the first aspect.
  • an embodiment of the present application provides a communication device.
  • the communication device includes a processor and a storage medium.
  • the storage medium stores instructions.
• when the processor runs the instructions, the communication device executes the target detection method described in the first aspect or its various possible implementations.
  • an embodiment of the present application provides a communication device.
  • the communication device includes one or more modules for implementing the method of the first aspect.
• the one or more modules may correspond to the respective steps of the method of the first aspect.
• an embodiment of the present application provides a chip that includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is used to run a computer program or instruction to implement the method described in the first aspect or the various possible implementations of the first aspect.
  • the communication interface is used to communicate with modules other than the chip.
  • the chip provided in the embodiment of the present application further includes a memory for storing computer programs or instructions.
  • an embodiment of the present application provides a communication device.
• the communication device includes a processor, the processor is coupled with a memory, a computer program or instruction is stored in the memory, and the processor is used to run the computer program or instruction stored in the memory to implement the target detection method described in the first aspect or the various possible implementations of the first aspect.
• any device, computer storage medium, computer program product, chip, or communication system provided above is used to execute the corresponding method provided above; therefore, for the beneficial effects it can achieve, refer to the beneficial effects of the corresponding solutions in the corresponding method provided above, which will not be repeated here.
• FIG. 1 is a first schematic diagram of the positions of an image target and a radar target provided by an embodiment of this application;
• FIG. 2 is a second schematic diagram of the positions of an image target and a radar target provided by an embodiment of this application;
• FIG. 3 is a third schematic diagram of the positions of an image target and a radar target provided by an embodiment of this application;
• FIG. 4 is a fourth schematic diagram of the positions of an image target and a radar target provided by an embodiment of this application;
• FIG. 5 is a fifth schematic diagram of the positions of an image target and a radar target provided by an embodiment of this application;
• FIG. 6 is a schematic structural diagram of a communication device provided by an embodiment of this application;
• FIG. 7 is a schematic flowchart of a target detection method provided by an embodiment of this application;
• FIG. 8 is a schematic diagram of a first area provided by an embodiment of this application;
• FIG. 9a is a first schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
• FIG. 9b is a second schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
• FIG. 10a is a third schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
• FIG. 10b is a fourth schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
• FIG. 10c is a fifth schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
• FIG. 11 is a sixth schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
• FIG. 12 is a seventh schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
• FIG. 13 is an eighth schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
• FIG. 14 is a ninth schematic diagram of the positions of a first target and a second target according to an embodiment of this application;
• FIG. 15 is a schematic structural diagram of a target detection device provided by an embodiment of the application;
• FIG. 16 is a schematic structural diagram of another target detection device provided by an embodiment of the application;
• FIG. 17 is a schematic structural diagram of a chip provided by an embodiment of the application.
  • words such as “first” and “second” are used to distinguish the same items or similar items that have substantially the same function and effect.
  • the first sensor and the second sensor are only used to distinguish different sensors, and the order of their order is not limited.
• words such as "first" and "second" do not limit the quantity or the order of execution, nor do they indicate a definite difference.
• "at least one" refers to one or more, and "multiple" refers to two or more.
  • “And/or” describes the association relationship of the associated objects, indicating that there can be three relationships, for example, A and/or B, which can mean: A alone exists, A and B exist at the same time, and B exists alone, where A, B can be singular or plural.
  • the character “/” generally indicates that the associated objects before and after are in an “or” relationship.
• "at least one of the following items (a)" or similar expressions refer to any combination of these items, including any combination of a single item (a) or a plurality of items (a).
• for example, at least one of a, b, or c can mean: a, b, c, ab, ac, bc, or abc, where a, b, and c can be singular or plural.
  • a target detected by an image captured by a monocular camera is called an image target.
• the target detected by the millimeter-wave radar is referred to as a radar target.
• the X axis represents the width direction of the vehicle, and the Y axis represents the length direction of the vehicle, as an example.
• EYr in Figure 1 represents the position error of the radar target in the vertical direction, and EXr represents the position error of the radar target in the horizontal direction.
• Rr represents the error area of the radar target, and Rr is determined by EYr and EXr.
• Eθi represents the angle measurement accuracy of the image target, and EYi represents the distance measurement accuracy of the image target.
• because the detection of the image target is based on a monocular camera, the accuracy of Eθi is higher than that of EYi.
• Ri represents the error area of the image target, and Ri is determined by Eθi and EYi.
• the area of the overlapping region between the error area of the radar target and the error area of the image target (for example, the shaded part Rx in Figure 1) reflects the possibility that the image target and the radar target come from the same object, in other words, the possibility that the image target and the radar target are the same target. When the area of the overlapping region is greater than a certain threshold, it is determined that the radar target and the image target are the same target.
• the difference between Figure 2 and Figure 1 is that the error area of the radar target in Figure 2 is determined by the vertical error EYr and the angle measurement error Eθr.
• the shape of the error region Rr of the radar target in FIG. 2 is similar to a parallelogram (the shape of the error region of the radar target in FIG. 1 is a rectangle).
  • the relationship between the overlap area of the error area of the radar target and the error area of the image target and a certain threshold can still be used to determine whether the image target and the radar target are the same target.
• the above schemes judge whether two targets (for example, an image target and a radar target) are the same target, that is, whether the two targets come from the same object.
• when the overlapping area being greater than a certain threshold is used to determine whether the image target and the radar target are the same target, the following problem remains: even if the overlapping areas are the same size, the positional relationship of the targets may differ, which may lead to misjudging the image target and the radar target as the same target.
• in fact, the possibility that the radar target at position X and the image target are the same target should be different from the possibility that the radar target at position X' and the image target are the same target; for example, the radar target at position X and the image target are the same target, while the radar target at position X' and the image target are not the same target.
• according to the above scheme, the possibility that the image target and the radar target at position Y' are the same target would be judged equal to the possibility that the image target and the radar target at position Y are the same target; that is, the image target and the radar target at position Y' are the same target, and the image target and the radar target at position Y are also the same target. In fact, however, the possibility that the image target and the radar target at position Y' are the same target and the possibility that the image target and the radar target at position Y are the same target are not the same.
• the above scheme is based only on the relationship between the overlapping area and the threshold. Since C and D are both greater than the threshold, it may be judged that the image target and the radar target at position Y' are the same target and that the image target and the radar target at position Y are also the same target, but this conclusion is contrary to the facts; therefore, relying only on the relationship between the overlapping area and the threshold will cause misjudgment.
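The misjudgment described above can be made concrete with a small numeric illustration. The coordinates below are made-up values: two pairs of rectangular error regions have exactly the same overlap area, yet the center-to-center distances differ, so an overlap-area-only rule cannot distinguish them while a distance check can.

```python
import math

def overlap_area(a, b):
    """Overlapping area of two axis-aligned rectangles (xmin, ymin, xmax, ymax)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    return ix * iy

def center_dist(a, b):
    """Distance between the centers of two rectangles."""
    ax, ay = (a[0] + a[2]) / 2, (a[1] + a[3]) / 2
    bx, by = (b[0] + b[2]) / 2, (b[1] + b[3]) / 2
    return math.hypot(ax - bx, ay - by)

# Both pairs overlap with area 2, but the centers are 1 apart vs 4 apart.
near = ((0, 0, 2, 2), (1, 0, 3, 2))   # small region slightly offset
far  = ((0, 0, 2, 2), (1, 0, 9, 2))   # wide region whose center is far away
```

An area threshold of, say, 1.5 accepts both pairs as "the same target", whereas a distance threshold (the second threshold of this application) separates the two cases.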
  • an embodiment of the present application provides a target detection method.
• in this method, in addition to measuring the degree of overlap between the error area of the radar target and the error area of the image target, other metrics are also considered, which improves the accuracy of judging whether the image target and the radar target are the same target.
  • FIG. 6 shows a schematic diagram of the hardware structure of a target detection apparatus in an embodiment of the present application.
  • the target detection device includes a processor 41 and a communication line 43.
  • the target detection device may further include a memory 42.
• the processor 41 can be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of this application.
  • the communication line 43 may include a path to transmit information between the aforementioned components.
• the memory 42 may be a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto.
  • the memory can exist independently and is connected to the processor through the communication line 43. The memory can also be integrated with the processor.
  • the memory 42 is used to store computer-executable instructions for executing the solution of the present application, and the processor 41 controls the execution.
  • the processor 41 is configured to execute computer-executable instructions stored in the memory 42 to implement a target detection method provided in the following embodiments of the present application.
  • the target detection device may further include a first sensor and a second sensor.
  • the steps executed by the first device in the target detection method in the embodiments of the present application can also be executed by the chip applied to the first device.
• the following embodiments are described by taking the first device as the execution subject of the target detection method as an example.
  • FIG. 7 shows an embodiment of a target detection method provided by an embodiment of the present application, and the method includes:
• Step 701: The first device determines the overlapping area of the first target area and the second target area, where the first target area is determined by the detection error of the first sensor and the position of the first target detected by the first sensor, and the second target area is determined by the detection error of the second sensor and the position of the second target detected by the second sensor.
  • the first target area includes the position of the first target, in other words, the position of the first target is located in the first target area.
  • the second target area includes the position of the second target, in other words, the position of the second target is located in the second target area.
  • the overlapping area of the first target area and the second target area is the common area where the first target area and the second target area exist.
  • the overlapping area is the intersection of the first target area and the second target area.
  • the overlapping area belongs to both the first target area and the second target area.
• the target detected by the first sensor may be referred to as the first target, and the target detected by the second sensor may be referred to as the second target.
• the first device in the embodiment of the present application may be a vehicle, a chip applied to a vehicle, or a component applied to a vehicle (for example, an electronic control unit (ECU) or an automatic driving controller), or the first device may be a server or a chip applied to the server.
• the electronic control unit is also known as a "trip computer" or "vehicle computer".
• the first device in the embodiment of the present application may be a component with information processing functions in a vehicle.
  • the detection error of the first sensor is the position error of the first sensor.
  • the detection error of the first sensor includes a first detection error and a second detection error.
  • the first detection error is the distance measurement error of the first sensor in the vertical direction
  • the second detection error is the angle measurement error of the first sensor or the distance measurement error of the first sensor in the horizontal direction.
  • the detection error of the first sensor is predetermined based on the characteristics of the first sensor.
  • the detection error of the second sensor is the position error of the second sensor.
  • the detection error of the second sensor includes a third detection error and a fourth detection error.
  • the third detection error is the angle measurement error of the second sensor
  • the fourth detection error is the distance measurement error of the second sensor.
  • the detection error of the second sensor is determined in advance based on the characteristics of the second sensor.
  • the ranging error of the second sensor may be the ranging error of the second sensor in the horizontal direction or the ranging error in the vertical direction.
  • the "error" of the sensor in the embodiments of the present application can also be replaced with the "accuracy” of the sensor.
  • the angle measurement error can be replaced with the angle measurement accuracy.
  • the detection error in the embodiment of the present application may also be referred to as: measurement error.
• the first target area in the embodiment of the present application is the closed area enclosed by a fifth line and a sixth line, which are parallel lines arranged along the vertical direction, and a seventh line and an eighth line, which are parallel lines arranged along the horizontal direction.
• the fifth line and the sixth line are parallel, the shortest distance between the fifth line and the position of the first target is equal to the shortest distance between the sixth line and the position of the first target, and the shortest distance between the fifth line and the position of the first target is the ranging error of the first sensor in the vertical direction.
• the seventh line and the eighth line are parallel, the shortest distance between the seventh line and the position of the first target is equal to the shortest distance between the eighth line and the position of the first target, and the shortest distance between the seventh line and the position of the first target is the ranging error of the first sensor in the horizontal direction.
• in other words, the first target area is the closed area obtained by extending, from the position of the first target, the ranging error of the first sensor in the vertical direction upward and downward along the vertical direction, and the ranging error of the first sensor in the horizontal direction leftward and rightward along the horizontal direction; that is, the position of the first target is the center of the first target area.
• the first target area may be expressed as Y1−EYr ≤ Yr ≤ Y1+EYr and X1−EXr ≤ Xr ≤ X1+EXr, where Y1 represents the coordinate of the first target in the vertical direction, and X1 represents the coordinate of the first target in the horizontal direction.
• ±EYr represents the ranging error of the first sensor in the vertical direction (Y-axis direction), and ±EXr represents the ranging error of the first sensor in the horizontal direction (X-axis direction).
• Yr represents the coordinate of the first target area in the vertical direction, and Xr represents the coordinate of the first target area in the horizontal direction.
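The rectangular first target area described by these inequalities admits a direct membership test. A minimal sketch, where the function name and parameter names are illustrative:

```python
def in_first_target_area(x, y, x1, y1, exr, eyr):
    """True if (x, y) lies within the rectangular error region
    X1-EXr <= x <= X1+EXr, Y1-EYr <= y <= Y1+EYr around the first target."""
    return abs(x - x1) <= exr and abs(y - y1) <= eyr
```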
• alternatively, the first target area in the embodiment of the present application is the closed area defined by extending, centered on the position of the first target, the ranging error of the first sensor in the vertical direction upward and downward along the vertical direction, and by rotating leftward and rightward by the angle measurement error of the first sensor about the line between the position of the first target and the reference point as the axis.
• the specific shape of the first target area can refer to Rr in FIG. 4.
• in this case, the first target area may be expressed as Y1−EYr ≤ Yr ≤ Y1+EYr and X1−Eθr ≤ Xr ≤ X1+Eθr, where Y1 represents the coordinate of the first target in the vertical direction, and X1 represents the coordinate of the first target in the horizontal direction.
• ±EYr represents the ranging error of the first sensor in the vertical direction (Y-axis direction), and ±Eθr represents the angle measurement error of the first sensor.
• Yr represents the coordinate of the first target area in the vertical direction, and Xr represents the coordinate of the first target area in the horizontal direction.
• the second target area in the embodiment of the present application is the closed area defined by extending, centered on the position of the second target, the ranging error of the second sensor in the vertical direction upward and downward along the vertical direction, and by rotating leftward and rightward by the angle measurement error of the second sensor about the line between the position of the second target and the reference point as the axis.
• the second target area in the embodiment of the present application can be expressed as Y2−EYi ≤ Yi ≤ Y2+EYi and X2−Eθi ≤ Xi ≤ X2+Eθi, where Y2 represents the coordinate of the second target in the vertical direction (Y-axis direction), and X2 represents the coordinate of the second target in the horizontal direction (X-axis direction).
• ±EYi represents the ranging error of the second sensor in the vertical direction (Y-axis direction), and ±Eθi represents the angle measurement error of the second sensor in the horizontal azimuth (X-axis direction).
• Yi represents the coordinate of the second target area in the vertical direction, and Xi represents the coordinate of the second target area in the horizontal direction.
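Because the second target area is bounded by a range error in Y and an angular error about the reference point, its membership test mixes a range check with an azimuth check. A hedged sketch, taking the reference point as the coordinate origin (consistent with the description of FIG. 8 below) and measuring azimuth from the +Y axis; these conventions are assumptions for illustration:

```python
import math

def in_second_target_area(x, y, x2, y2, eyi, e_theta_i_deg):
    """True if (x, y) lies in the second target area around (x2, y2):
    |y - y2| <= EYi and the azimuth (from the origin) within +/-E_theta_i
    of the second target's azimuth."""
    if abs(y - y2) > eyi:                        # range (Y-axis) check
        return False
    az_point = math.degrees(math.atan2(x, y))    # azimuth measured from +Y axis
    az_target = math.degrees(math.atan2(x2, y2))
    return abs(az_point - az_target) <= e_theta_i_deg
```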
• Rir represents the overlapping area of the first target area and the second target area (that is, the shaded part in Fig. 8).
• Rr represents the first target area, that is, the first target area is defined by Xr and Yr; in other words, the width of the coordinate range of the first target area in the vertical direction (Y-axis direction) is 2EYr, and the width of the coordinate range of the first target area in the horizontal direction (X-axis direction) is 2EXr.
• Ri represents the second target area, that is, the second target area is defined by Xi and Yi; in other words, the width of the coordinate range of the second target area in the vertical direction (Y-axis direction) is 2EYi, and the width of the azimuth angle range of the second target area in the horizontal azimuth (X-axis direction) is 2Eθi.
  • the position of the first target is the position of the first target relative to the reference point on the XY plane
  • the X-axis direction of the XY plane is the width direction of the vehicle
  • the Y-axis direction of the XY plane is the length direction of the vehicle.
  • the position of the second target is the position of the second target relative to the reference point on the XY plane.
  • the reference point on the XY plane may be the coordinate origin O shown in FIG. 8, and the coordinate origin O may be the position of the vehicle, and the first sensor and the second sensor are deployed on the vehicle.
  • Step 702 The first device calculates the degree of overlap of the overlapping area in the union of the first target area and the second target area.
  • the degree of overlap is the ratio of the overlapping area of the first target area and the second target area in the union of the first target area and the second target area.
• the degree of overlap = (the intersection of Rr and Ri) / (the union of Rr and Ri), where Rr represents the first target area and Ri represents the second target area.
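When Rr is a rectangle but Ri is bounded by an angular error, their intersection is no longer a simple rectangle. One simple (approximate) way to evaluate the degree of overlap for arbitrarily shaped regions is grid sampling over a box covering both regions; this is an illustrative numerical sketch, not the method claimed by the application, and the grid resolution n is an assumption.

```python
def sampled_overlap_degree(in_r, in_i, xlo, xhi, ylo, yhi, n=200):
    """Estimate |Rr intersect Ri| / |Rr union Ri| by testing the membership
    predicates in_r and in_i on an n x n grid over [xlo, xhi] x [ylo, yhi]."""
    inter = union = 0
    for gx in range(n):
        x = xlo + (xhi - xlo) * (gx + 0.5) / n   # cell-center sample in x
        for gy in range(n):
            y = ylo + (yhi - ylo) * (gy + 0.5) / n
            a, b = in_r(x, y), in_i(x, y)
            if a and b:
                inter += 1
            if a or b:
                union += 1
    return inter / union if union else 0.0
```

For two unit-height rectangles the estimate converges to the exact intersection-over-union ratio as n grows.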
• step 702 in the embodiment of the present application can be replaced by the following step: the first device calculates the overlap area of the overlapping region.
• correspondingly, the first threshold may be a preset ratio value, or the first threshold may be a preset area threshold, which is not limited in the embodiment of the present application.
• Step 703: When the degree of overlap is greater than or equal to the first threshold, the first device determines the confidence that the first target and the second target are the same target as the target confidence based on the relationship between the position of the first target and the position of the second target and the first area, and/or based on the relationship between the distance between the position of the first target and the position of the second target and the second threshold.
  • the first area is determined by the first detection error of the first sensor and the third detection error of the second sensor.
• the detection error of the first sensor includes the first detection error and the second detection error, and the accuracy of the first detection error is higher than the accuracy of the second detection error.
• the detection error of the second sensor includes the third detection error and the fourth detection error, and the accuracy of the third detection error is higher than the accuracy of the fourth detection error.
  • the confidence that the first target and the second target are the same target can be understood as: the possibility that the first target and the second target are the same target, or the first target and the second target originate from the same target Possibility of objects.
  • the first threshold and the second threshold in the embodiment of the present application may be values predefined by the protocol or values determined by the first device.
  • the first area in the embodiment of the present application is determined by the first detection error of the first sensor and the third detection error of the second sensor, which is based on the following facts:
• the measurement accuracy of the radar in the vertical direction is higher than its measurement accuracy in the horizontal direction (in other words, the vehicle-mounted radar ranges accurately in the Y direction, and its accuracy in measuring X/azimuth is lower than its accuracy in the Y direction).
• taking the second sensor as a camera as an example, the azimuth measurement accuracy of the camera is higher than its ranging accuracy (a monocular camera measures azimuth accurately, but measures distance relatively inaccurately).
• the first area is enclosed by parallel lines (line 1 and line 2), whose distance from the position of the first target in the vertical direction is the ranging error of the first sensor in the vertical direction, together with lines determined by the angle measurement error of the second sensor.
  • the embodiment of the present application provides a target detection method.
  • the first device determines the degree of overlap between the first target area and the second target area; then, when the degree of overlap is greater than or equal to the first threshold, the first device determines the confidence that the first target and the second target are the same target (the target confidence) based on the relationship between the positions of the first target and the second target and the first area, and/or based on the relationship between the distance between the position of the first target and the position of the second target and the second threshold.
  • in this way, when the degree of overlap is greater than or equal to the first threshold, these additional relationships further improve the accuracy of the confidence that the first target and the second target are the same target, reducing misjudgments.
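The determination flow described above can be sketched as follows. The function name, default threshold values, and the confidence labels returned are illustrative assumptions; the application itself does not fix concrete numbers.

```python
def association_confidence(overlap_degree, p1, p2, in_first_area,
                           first_threshold=0.5, second_threshold=2.0):
    """Coarse 'same target' decision sketch.

    p1, p2: (x, y) positions of the first and second target.
    in_first_area: callable reporting whether a position lies in the
    first area (the band determined by the sensors' detection errors).
    """
    if overlap_degree < first_threshold:
        return "not_same_target"
    # Euclidean distance between the two detected positions.
    dx, dy = p1[0] - p2[0], p1[1] - p2[1]
    distance = (dx * dx + dy * dy) ** 0.5
    if in_first_area(p1) and in_first_area(p2) and distance <= second_threshold:
        return "high"   # target confidence >= first confidence threshold
    return "low"        # target confidence <= second confidence threshold
```

Here the position and distance checks are combined conjunctively; the text also allows using either check on its own.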
  • the first sensor and the second sensor may detect targets in the same range, in other words, the position of the first target and the position of the second target are located in the same range.
  • once the first device determines the confidence that the first target and the second target are the same target, it is convenient to assist the subsequent target fusion. Since the measurement errors (also called detection errors) of different sensors differ, the first target is detected by the first sensor and the second target by the second sensor. If the first device determines that the first target and the second target are the same target, the position of the first target and the position of the second target can subsequently be merged to obtain the final fused position of the target. However, if the first device determines that the first target and the second target are not the same target, there is no need to merge the two positions.
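The fusion step mentioned above can be sketched as follows. Inverse-variance weighting per axis is one common choice and is an illustrative assumption here, not a method specified by this application:

```python
def fuse_positions(pos_a, err_a, pos_b, err_b):
    """Fuse two (x, y) measurements of the same target.

    Each axis is weighted by the inverse squared detection error of
    the respective sensor, so the more accurate axis dominates.
    """
    fused = []
    for i in range(2):
        wa = 1.0 / (err_a[i] ** 2)
        wb = 1.0 / (err_b[i] ** 2)
        fused.append((wa * pos_a[i] + wb * pos_b[i]) / (wa + wb))
    return tuple(fused)
```

This matches the asymmetry described earlier: a radar with a small Y error pulls the fused Y toward its measurement, while a camera with a small azimuth/X error dominates the fused X.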
  • the first device may determine that the first target and the second target are not the same target.
  • for example, the first sensor in the embodiment of the present application is a radar and the second sensor is a camera; the radar may be a millimeter wave radar, and the camera may be a monocular camera.
  • the millimeter wave radar in the embodiment of the present application is installed in the middle and front part of the vehicle, and is used to detect targets such as other vehicles and pedestrians by using millimeter waves.
  • the millimeter wave radar sends millimeter waves forward from the vehicle while scanning along the horizontal plane and receives the reflected millimeter waves; the transmitted and received data are transmitted to the first device in the form of radar signals, or the millimeter wave radar transmits the data to the vehicle in the form of radar signals and the vehicle transmits them to the server.
  • the monocular camera includes a charge coupled device (CCD) camera and is installed in the middle front of the vehicle. The monocular camera transmits the captured image data to the first device in the form of an image signal, or the monocular camera transmits the captured image data to the vehicle in the form of an image signal, and the vehicle transmits it to the server.
  • the relationship between the position of the first target and the position of the second target and the first area in the embodiment of the present application includes: both the position of the first target and the position of the second target are located in the first area.
  • the relationship between the position of the first target and the position of the second target and the first area includes: one or more of the position of the first target and the position of the second target is located outside the first area.
  • the relationship between the distance between the position of the first target and the position of the second target and the second threshold in the embodiment of the present application includes: the distance between the two positions is greater than the second threshold, or the distance between the two positions is less than or equal to the second threshold.
  • the first device determines, based on the relationship between the positions of the first target and the second target and the first area, the confidence that the first target and the second target are the same target as the target confidence. This includes: when the degree of overlap is greater than or equal to the first threshold, and the position of the first target and the position of the second target are both located in the first area, the first device determines that the confidence that the first target and the second target are the same target is the target confidence, and the target confidence is greater than or equal to the first confidence threshold.
  • the target confidence in the embodiment of the present application is greater than or equal to the first confidence threshold and can be replaced with: the first target and the second target are the same target.
  • the first confidence threshold in the embodiment of the present application can be set as required, which is not limited in the embodiment of the present application.
  • the first device may determine that the confidence that target A and target B are the same target is the target confidence, and that the target confidence is greater than or equal to the first confidence threshold; or the first device determines that target A and target B are the same target.
  • when the degree of overlap is greater than or equal to the first threshold and the position of the first target and the position of the second target are both located in the first area, the target confidence is the first confidence; when the degree of overlap is greater than or equal to the first threshold and at least one of the positions is located outside the first area, the target confidence is the second confidence.
  • that is, when the degree of overlap is greater than or equal to the first threshold, the target confidence when the position of the first target and the position of the second target are both located in the first area is higher than the target confidence when at least one of them is outside the first area.
  • in FIG. 9b, taking the first target as target A and the second target as target B as an example, the figure shows the confidence that target B at different positions and target A are the same target.
  • target B is located at position Q(X21, Y21), or target B is located at position Q'(X22, Y22), where position Q' is outside the first area and position Q is inside the first area.
  • assuming target B is located at position Q, the second target area determined by the position Q of target B and the detection error of the second sensor is Ri.
  • the overlapping area between the first target area Rr and the second target area Ri (the shaded part in FIG. 9b) is Rir.
  • assuming target B is located at position Q', the second target area determined by the position Q' of target B and the detection error of the second sensor is Ri'.
  • the overlapping area between Ri' and the first target area Rr (the shaded part in FIG. 9b) is Rir'. Assume the degree of overlap of Rir and the degree of overlap of Rir' are both greater than or equal to the first threshold; that is, regardless of whether target B is located at position Q or at position Q', the degree of overlap of Ri or Ri' with the first target area is greater than or equal to the first threshold.
  • when target B is at position Q, the position of target B is in the first area, and when target B is at position Q', the position of target B is outside the first area; therefore the confidence that target A and target B at position Q are the same target is higher than the confidence that target A and target B at position Q' are the same target.
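For axis-aligned target areas such as Rr and Ri above, the degree of overlap can be computed as sketched below. Normalizing the intersection by the smaller of the two areas is an illustrative assumption; the application only requires some measure of overlap comparable to the first threshold.

```python
def overlap_degree(rect_a, rect_b):
    """Degree of overlap of two axis-aligned rectangles.

    rect = (x_min, y_min, x_max, y_max).
    Returns intersection area / min(area_a, area_b), in [0, 1].
    """
    ax0, ay0, ax1, ay1 = rect_a
    bx0, by0, bx1, by1 = rect_b
    w = min(ax1, bx1) - max(ax0, bx0)   # overlap width, <= 0 if disjoint
    h = min(ay1, by1) - max(ay0, by0)   # overlap height, <= 0 if disjoint
    if w <= 0 or h <= 0:
        return 0.0
    inter = w * h
    area_a = (ax1 - ax0) * (ay1 - ay0)
    area_b = (bx1 - bx0) * (by1 - by0)
    return inter / min(area_a, area_b)
```

With this normalization, a small camera-derived area fully contained in a large radar-derived area scores 1.0, which fits the saturation discussion later in the text.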
  • the first device determines, based on the relationship between the positions of the first target and the second target and the first area, the confidence that the first target and the second target are the same target as the target confidence. This includes: when the degree of overlap is greater than or equal to the first threshold, if one or more of the position of the first target and the position of the second target are located outside the first area, the first device determines that the confidence that the first target and the second target are the same target is the target confidence, and the target confidence is less than or equal to the second confidence threshold.
  • "the confidence that the first target and the second target are the same target is the target confidence, and the target confidence is less than or equal to the second confidence threshold" can be replaced with: the first target and the second target are not the same target.
  • the first confidence threshold may be the same as or different from the second confidence threshold, which is not limited in the embodiment of the present application.
  • the first device can determine that the confidence that target A and target B are the same target is the target confidence, and the target confidence is less than or equal to the second confidence threshold.
  • the first device determines, based on the relationship between the distance between the position of the first target and the position of the second target and the second threshold, the confidence that the first target and the second target are the same target as the target confidence. This includes: when the degree of overlap is greater than or equal to the first threshold, if the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the first device determines that the confidence that the first target and the second target are the same target is the target confidence, and the target confidence is greater than or equal to the first confidence threshold.
  • in other words, when the degree of overlap between the first target area and the second target area is greater than or equal to the first threshold, and the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the confidence that the first target and the second target are the same target is the target confidence, and the target confidence is greater than or equal to the first confidence threshold.
  • when the degree of overlap is greater than or equal to the first threshold, and the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the confidence that the first target and the second target are the same target is the third confidence.
  • when the degree of overlap is greater than or equal to the first threshold, and the distance between the position of the first target and the position of the second target is greater than the second threshold, the confidence that the first target and the second target are the same target is the fourth confidence.
  • the third degree of confidence is greater than the fourth degree of confidence.
  • the first target is target A
  • the second target is target B
  • FIG. 10b shows the confidence that target A at different positions (for example, position P(X11, Y11) or position P'(X12, Y12)) and the same target B are the same target. Assuming target A is at position P, the first target area determined by the position P of target A and the detection error of the first sensor is Rr.
  • the overlapping area between Rr and the second target area Ri is Rri.
  • the degree of overlap of Rri is S1.
  • the first target area determined by the position P' of target A and the detection error of the first sensor is Rr'.
  • the overlapping area between Rr' and the second target area Ri is Rri'.
  • the degree of overlap of Rri' is S2.
  • assume S1 and S2 are both greater than or equal to the first threshold; that is, the first target area Rr corresponding to target A at position P and the first target area Rr' corresponding to target A at position P' each overlap the second target area with a degree of overlap greater than or equal to the first threshold.
  • if target A is at position P, the distance between the position of target A and the position of target B is L1; if target A is at position P', the distance between target A and target B is L2, and L1 is less than L2. That is, compared with target A at position P', the distance between target B and target A at position P is relatively short. Therefore, the third confidence that target B and target A at position P are the same target is greater than the fourth confidence that target B and target A at position P' are the same target.
  • FIG. 10c shows the confidence that target B at different positions (for example, position Q(X21, Y21) or position Q'(X22, Y22)) and the same target A are the same target. Assuming target B is at position Q, the second target area determined by the position Q of target B and the detection error of the second sensor is Ri.
  • the overlapping area between the first target area Rr and the second target area Ri is Rir.
  • the degree of overlap of Rir is S3.
  • the second target area determined by the position Q' of target B and the detection error of the second sensor is Ri'.
  • the overlapping area between Ri' and the first target area Rr is Rir'.
  • the degree of overlap of Rir' is S4. Assume S3 and S4 are both greater than or equal to the first threshold; that is, the Ri corresponding to target B at position Q and the Ri' corresponding to target B at position Q' each overlap the first target area with a degree of overlap greater than or equal to the first threshold. If target B is at position Q, the distance between the position of target A and position Q of target B is L3; if target B is at position Q', the distance between the position of target A and position Q' of target B is L4, and L3 is less than L4. That is, compared with target B at position Q', the distance between target A and target B at position Q is relatively short. Therefore, the confidence that target A and target B at position Q are the same target is greater than the confidence that target A and target B at position Q' are the same target.
  • FIG. 10b and FIG. 10c each take one target at different positions and calculate the confidence between it and the same fixed target as examples. It can be understood that target B at position Q and target B at position Q' may not be the same target.
  • the first device determines, based on the relationship between the distance between the position of the first target and the position of the second target and the second threshold, the confidence that the first target and the second target are the same target as the target confidence. This includes: when the degree of overlap is greater than or equal to the first threshold, if the distance between the position of the first target and the position of the second target is greater than the second threshold, the first device determines that the confidence that the first target and the second target are the same target is the target confidence, and the target confidence is less than or equal to the second confidence threshold.
  • in FIG. 10c, taking the first target as target A at position P and the second target as target B at position Q' as an example, it can be seen from FIG. 10c that the distance between the position of target A and the position of target B at Q' is L4; if L4 is greater than the second threshold, the confidence that target A and target B are the same target is less than or equal to the second confidence threshold.
  • the first device, based on both the relationship between the positions of the first target and the second target and the first area, and the relationship between the distance between the position of the first target and the position of the second target and the second threshold, determines the target confidence that the first target and the second target are the same target. This can be achieved in the following way: when the degree of overlap is greater than or equal to the first threshold, the position of the first target and the position of the second target are both located in the first area, and the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the first device determines that the confidence that the first target and the second target are the same target is the target confidence, and the target confidence is greater than or equal to the first confidence threshold.
  • the difference between FIG. 11 and FIG. 9a is: in FIG. 11, the position of target A and the position of target B are not only located in the overlapping area, but the distance between them is also less than or equal to the second threshold. Although the position of target A and the position of target B in FIG. 9a are located in the overlapping area, the distance between them is not necessarily less than or equal to the second threshold. Therefore, the confidence that target A and target B are the same target in FIG. 11 is higher than the confidence that target A and target B are the same target in FIG. 9a.
  • in addition, the area of the overlapping area in FIG. 11 is larger than the area of the overlapping area in FIG. 9a. Therefore, comparing FIG. 11 and FIG. 9a, it can be determined that target A and target B in FIG. 11 are more likely to be the same target. This also matches the actual situation shown in FIG. 11: the distance between the position of target A and the position of target B in FIG. 11 is smaller than the distance between the position of target A and the position of target B in FIG. 9a; in other words, in FIG. 11 target A and target B are closer together.
  • suppose target a and target b' are both located in the first area, and target a and target b'' are also both located in the first area. If the area of the overlapping area corresponding to target a and target b' is larger than the area of the overlapping area corresponding to target a and target b'', and the degrees of overlap of both overlapping areas are greater than or equal to the first threshold, then the confidence that target a and target b' are the same target is higher than the confidence that target a and target b'' are the same target.
  • alternatively, suppose the area of the overlapping area corresponding to target a and target b' is larger than the area of the overlapping area corresponding to target a and target b'', and the degrees of overlap of both overlapping areas are greater than or equal to the first threshold. If the distance between target a and target b' is greater than the distance between target a and target b'', then the confidence that target a and target b'' are the same target is higher than the confidence that target a and target b' are the same target.
  • when the degree of overlap is greater than or equal to the first threshold, the position of the first target and the position of the second target are both located in the first area, and the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the confidence that the first target and the second target are the same target is the fifth confidence.
  • when the degree of overlap is greater than or equal to the first threshold, and at least one of the position of the first target and the position of the second target is located outside the first area, and/or the distance between the position of the first target and the position of the second target is greater than the second threshold, the confidence that the first target and the second target are the same target is the sixth confidence.
  • the fifth degree of confidence is greater than the sixth degree of confidence.
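The combined rule above can be sketched as follows. The numeric values assigned to the fifth and sixth confidence are placeholders chosen for illustration, not values taken from the text.

```python
FIFTH_CONFIDENCE = 0.9   # placeholder for the higher confidence level
SIXTH_CONFIDENCE = 0.4   # placeholder for the lower confidence level

def combined_confidence(overlap_ok, a_in_area, b_in_area, distance,
                        second_threshold):
    """Apply the combined position/distance rule.

    overlap_ok: degree of overlap >= first threshold.
    Returns None when the overlap precondition fails.
    """
    if not overlap_ok:
        return None
    # Both positions inside the first area AND distance within the
    # second threshold -> fifth (higher) confidence; otherwise sixth.
    if a_in_area and b_in_area and distance <= second_threshold:
        return FIFTH_CONFIDENCE
    return SIXTH_CONFIDENCE
```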
  • "at least one of the position of the first target and the position of the second target is located outside the first area" can also be replaced by: at most one of the position of the first target and the position of the second target is located in the first area.
  • this description includes the following situations: the position of the first target is located in the first area and the position of the second target is outside the first area; or the position of the second target is located in the first area and the position of the first target is outside the first area; or the positions of the first target and the second target are both located outside the first area.
  • for example, when the degree of overlap is greater than or equal to the first threshold, the position of the first target is located in the first area, the position of the second target is located outside the first area, and the distance between the position of the first target and the position of the second target is greater than the second threshold, the confidence that the first target and the second target are the same target is the sixth confidence. Or, when the degree of overlap is greater than or equal to the first threshold, the position of the first target is located in the first area, and the position of the second target is outside the first area, the confidence that the first target and the second target are the same target is the sixth confidence. Or, when the degree of overlap is greater than or equal to the first threshold, and the distance between the position of the first target and the position of the second target is greater than the second threshold, the confidence that the first target and the second target are the same target is the sixth confidence.
  • similarly, when the degree of overlap is greater than or equal to the first threshold, the position of the second target is located in the first area, the position of the first target is located outside the first area, and the distance between the position of the first target and the position of the second target is greater than the second threshold, the confidence that the first target and the second target are the same target is the sixth confidence. Or, when the degree of overlap is greater than or equal to the first threshold, the position of the second target is located in the first area, and the position of the first target is located outside the first area, the confidence that the first target and the second target are the same target is the sixth confidence.
  • the degree of overlap is greater than or equal to the first threshold, the position of the first target and the position of the second target are both located outside the first area, and the distance between the position of the first target and the position of the second target is greater than the second threshold, then The confidence that the first target and the second target are the same target is the sixth confidence.
  • or, when the degree of overlap is greater than or equal to the first threshold, and the position of the first target and the position of the second target are both outside the first area, the confidence that the first target and the second target are the same target is the sixth confidence.
  • the first threshold may be determined according to the detection error of the first sensor or the detection error of the second sensor.
  • the first threshold in the embodiment of the present application is determined by the second detection error, or the first threshold is determined by the fourth detection error.
  • the first threshold value increases as the second detection error increases.
  • the first threshold value increases as the fourth detection error increases.
  • for example, when the second detection error is the first error, the first threshold is the first value, and when the second detection error is the second error, the first threshold is the second value, where the first error is greater than the second error and the first value is greater than the second value.
  • in FIG. 12, the ranging error (EXr) of the first sensor in the horizontal direction is the first error, and in FIG. 9a the ranging error (EXr) of the first sensor in the horizontal direction is the second error; the comparison shows that the first error is greater than the second error. Comparing FIG. 9a and FIG. 12, it can be found that as the ranging error (EXr) of the first sensor in the horizontal direction increases, the area of the overlapping area of the first target area and the second target area gradually increases (for example, the area of the first target area Rr in FIG. 12 is larger than the area of the first target area Rr in FIG. 9a).
  • if the first threshold is fixed, the accuracy of using the relationship between the degree of overlap of the overlapping area and the first threshold to determine whether the first target and the second target are the same target may be reduced. Based on this, the first threshold needs to be dynamically adjusted according to the ranging error of the first sensor in the horizontal direction.
  • when the ranging error (EXr) of the first sensor in the horizontal direction reaches a certain threshold a, the extent determined by the ranging error (EXr) covers the entire first area in the horizontal direction. Afterwards, as the ranging error (EXr) continues to increase, the first target area continues to grow, but this no longer changes the area of the overlapping area of the first target area and the second target area; that is, even if the horizontal ranging error (EXr) increases beyond the threshold a, the area of the overlapping area does not change, and the overlapping area is in a "saturated" state.
  • the first threshold may be set to the first value, and for the situation in FIG. 9a, the first threshold may be set to the second value to reduce the influence of such "saturation".
  • the first value is the value set when the overlapping area is in the "saturated” state
  • the second value is the value set when the overlapping area is not in the "saturated” state.
  • the first device in the embodiment of the present application has multiple second detection errors and a value corresponding to each second detection error in the multiple second detection errors. In this way, when the second detection error is the first error, the first device can use the value corresponding to the first error as the first threshold.
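The per-error lookup just described can be sketched as a small table. The bounds and threshold values in the table are illustrative assumptions; only the monotonic relationship (larger error, larger first threshold) is taken from the text.

```python
# (second detection error upper bound, first threshold) pairs,
# sorted by increasing error bound.
ERROR_TO_FIRST_THRESHOLD = [
    (0.5, 0.30),
    (1.0, 0.45),
    (2.0, 0.60),   # larger error -> larger first threshold
]

def first_threshold_for(error):
    """Return the first threshold stored for the given detection error."""
    for bound, threshold in ERROR_TO_FIRST_THRESHOLD:
        if error <= bound:
            return threshold
    # Past the last bound the overlap is "saturated"; keep the top value.
    return ERROR_TO_FIRST_THRESHOLD[-1][1]
```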
  • similarly, when the fourth detection error is the third error, the first threshold is the third value; when the fourth detection error is the fourth error, the first threshold is the fourth value, where the third error is greater than the fourth error and the third value is greater than the fourth value.
  • taking the fourth detection error as the ranging error of the second sensor in the vertical direction as an example: in FIG. 13, the ranging error (EYi) of the second sensor in the vertical direction is the third error, and in FIG. 9a the ranging error (EYi) of the second sensor in the vertical direction is the fourth error; the comparison shows that the third error is greater than the fourth error.
  • as the ranging error (EYi) of the second sensor in the vertical direction increases, the area of the overlapping area of the first target area and the second target area also increases (for example, the area of the second target area Ri in FIG. 13 is larger than the area of the second target area Ri in FIG. 9a).
  • if the first threshold is fixed, the accuracy of using the relationship between the degree of overlap of the overlapping area and the first threshold to determine whether the first target and the second target are the same target may be reduced. Based on this, the first threshold needs to be dynamically adjusted according to the vertical ranging error (EYi) of the second sensor.
  • when the ranging error (EYi) of the second sensor in the vertical direction reaches a certain threshold b, the extent determined by the ranging error (EYi) covers the entire first area in the vertical direction. Afterwards, as the ranging error (EYi) continues to increase, the second target area continues to grow, but this no longer affects the area of the overlapping area of the first target area and the second target area; that is, the overlapping area is in a "saturated" state.
  • the first threshold may be set to a third value, and for the situation in FIG. 9a, the first threshold may be set to a fourth value to reduce the influence of such "saturation".
  • the first device in the embodiment of the present application has multiple fourth detection errors and a value corresponding to each fourth detection error in the multiple fourth detection errors.
  • when the fourth detection error is the third error, the first device can use the value corresponding to the third error as the first threshold.
  • for example, the first device has multiple vertical ranging errors (EYi) of the second sensor and a value corresponding to each of the multiple vertical ranging errors (EYi). When the ranging error (EYi) of the second sensor in the vertical direction is the third error, the first device can use the value corresponding to the third error as the first threshold.
  • Fig. 12 takes the increase of the ranging error of the first sensor in the horizontal direction as an example for description
  • Fig. 13 takes the increase of the ranging error of the second sensor in the vertical direction as an example for description.
  • it should be noted that the first threshold can also be determined jointly by the ranging error of the first sensor in the horizontal direction and the ranging error of the second sensor in the vertical direction.
  • for example, the first threshold changes (for example, increases, as shown in FIG. 14) as the ranging error of the first sensor in the horizontal direction and the ranging error of the second sensor in the vertical direction increase.
  • the first threshold value is a, or the first threshold value is b, where a is greater than b, a is a parameter set when the overlapping area covers the first area, and b is a parameter set when the overlapping area does not cover the first area.
  • the first threshold value decreases as the second detection error of the first sensor decreases, and the first threshold value decreases as the fourth detection error of the second sensor decreases.
  • The second threshold in the embodiment of the present application is jointly determined by the second detection error and the fourth detection error.
  • The second threshold value decreases as the second detection error and the fourth detection error increase.
  • Conversely, the second threshold value increases as the second detection error and the fourth detection error decrease.
  • When the second detection error of the first sensor is the fifth error and the fourth detection error is the sixth error, the second threshold is the fifth value.
  • When the second detection error of the first sensor is the seventh error and the fourth detection error is the eighth error, the second threshold is the sixth value, where the fifth value is less than the sixth value, the fifth error is greater than the seventh error, and the sixth error is greater than the eighth error.
  • For example, the second detection error is the ranging error of the first sensor in the horizontal direction, and the fourth detection error is the ranging error of the second sensor in the vertical direction.
  • The ranging error of the first sensor in the horizontal direction (for example, the fifth error) and the ranging error of the second sensor in the vertical direction (for example, the sixth error) are greater than, respectively, the ranging error of the first sensor in the horizontal direction in FIG. 9a (for example, the seventh error) and the ranging error of the second sensor in the vertical direction in FIG. 9a (for example, the eighth error).
  • The overlapping area of the first target area and the second target area (the shaded part in FIG. ) covers the entire first area.
  • The overlapping area reaches a "saturated" state; that is, even if the ranging error of the first sensor in the horizontal direction and the ranging error of the second sensor in the vertical direction continue to increase, the overlapping area will no longer change.
  • The second threshold is determined by the relationship between the overlapping area and the first area. Specifically, the second threshold is the seventh value, or the second threshold is the eighth value, where the seventh value is greater than the eighth value; the eighth value is a parameter set when the overlapping area covers the first area, and the seventh value is a parameter set when the overlapping area does not cover the first area.
  • The description takes, as an example, the case where the second detection error of the first sensor is the ranging error of the first sensor in the horizontal direction.
  • the first device in the embodiment of the present application may determine the confidence that the first target and the second target are the same target, for example, the confidence is 90% or the confidence is 80%.
  • the indication value determined by the first device may also be an indication that the first target and the second target are the same target or not the same target.
  • the first device outputs the first indicator or the second indicator, where the first indicator indicates that the first target and the second target are the same target.
  • the second indicator indicates that the first target and the second target are not the same target.
  • The method provided in the embodiment of the present application may further include: the first device determines, according to the target confidence, whether the first target and the second target are the same target. For example, if the first device determines that the target confidence is greater than or equal to the first confidence threshold, it determines that the first target and the second target are the same target; if the first device determines that the target confidence is less than or equal to the second confidence threshold, it determines that the first target and the second target are not the same target.
  • the first device and the like include hardware structures and/or software modules corresponding to the respective functions.
  • The present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a certain function is executed by hardware or by computer-software-driven hardware depends on the specific application and design constraints of the technical solution. Those skilled in the art may use different methods for each specific application to implement the described functions, but such implementation should not be considered beyond the scope of this application.
  • the first device may be divided into functional units according to the foregoing method example.
  • each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit. It should be noted that the division of units in the embodiments of the present application is illustrative, and is only a logical function division, and there may be other division methods in actual implementation.
  • the method of the embodiment of the present application has been described above in conjunction with FIG. 7 to FIG. 14.
  • the following describes the target detection device provided by the embodiment of the present application for performing the foregoing method.
  • the method and the device can be combined and referenced with each other, and the target detection device provided in the embodiment of the present application can execute the steps performed by the first device in the above-mentioned target detection method.
  • FIG. 15 shows a target detection device involved in the foregoing embodiment.
  • the target detection device may include: a first determination module 1501, a calculation module 1502, and a second determination module 1503.
  • the target detection apparatus is a first device, or a chip applied in the first device.
  • the first determining module 1501 is used to support the target detection apparatus to execute step 701 executed by the first device in the foregoing embodiment.
  • the calculation module 1502 is configured to support the target detection apparatus to execute step 702 executed by the first device in the above-mentioned embodiment.
  • the second determining module 1503 is configured to support the target detection apparatus to execute step 703 executed by the first device in the foregoing embodiment.
  • the target detection device may also include a storage module.
  • The storage module is used to store computer program code, and the computer program code includes instructions. If the target detection device is a chip applied to the first device, the storage module can be a storage module in the chip (for example, a register, a cache, etc.), or a storage module in the first device located outside the chip (for example, a read-only memory, a random access memory, etc.).
  • the first determination module 1501, the calculation module 1502, and the second determination module 1503 may be integrated on a processor or a controller.
  • The processor or the controller may be a central processing unit, a general-purpose processor, a digital signal processor, an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It can implement or execute the various exemplary logical blocks, modules, and circuits described in conjunction with the disclosure of the present invention.
  • the processor may also be a combination of computing functions, for example, a combination of one or more microprocessors, a combination of a digital signal processor and a microprocessor, and so on.
  • the target detection device involved in this application may be the target detection device shown in FIG. 6.
  • the first determination module 1501, the calculation module 1502, and the second determination module 1503 may be integrated on the processor 41 or the processor 45.
  • FIG. 16 shows a schematic diagram of a possible logical structure of the target detection device involved in the foregoing embodiment.
  • the target detection device includes: a processing unit 1601.
  • the processing unit 1601 is used to control and manage the actions of the target detection device.
  • the processing unit 1601 is used to perform information/data processing steps in the target detection device.
  • the target detection device may further include a storage unit 1602 for storing program codes and data that the target detection device can use.
  • the target detection device when the target detection device is a server, the target detection device may further include a communication unit 1603. The communication unit 1603 is used to support the target detection device to send or receive information/data.
  • the target detection apparatus is the first device, or is a chip applied to the first device.
  • the processing unit 1601 is configured to support the target detection apparatus to execute steps 701 to 703 performed by the first device in the foregoing embodiment.
  • FIG. 17 is a schematic structural diagram of a chip 150 provided by an embodiment of the present application.
  • the chip 150 includes one or more (including two) processors 1510 and a communication interface 1530.
  • the chip 150 further includes a memory 1540.
  • the memory 1540 may include a read-only memory and a random access memory, and provides operation instructions and data to the processor 1510.
  • a part of the memory 1540 may also include a non-volatile random access memory (NVRAM).
  • the memory 1540 stores the following elements, execution modules or data structures, or their subsets, or their extended sets.
  • the corresponding operation is executed by calling the operation instruction stored in the memory 1540 (the operation instruction may be stored in the operating system).
  • One possible implementation manner is that the structure of the first device is similar, and different devices can use different chips to implement their respective functions.
  • the processor 1510 controls the processing operations of the first device, and the processor 1510 may also be referred to as a central processing unit (CPU).
  • the memory 1540 may include a read-only memory and a random access memory, and provides instructions and data to the processor 1510.
  • a part of the memory 1540 may also include a non-volatile random access memory (NVRAM).
  • The processor 1510, the communication interface 1530, and the memory 1540 are coupled together through a bus system 1520, where the bus system 1520 may include a power bus, a control bus, and a status signal bus in addition to a data bus.
  • various buses are marked as the bus system 1520 in FIG. 17.
  • the methods disclosed in the foregoing embodiments of the present application may be applied to the processor 1510 or implemented by the processor 1510.
  • the processor 1510 may be an integrated circuit chip with signal processing capabilities. In the implementation process, the steps of the foregoing method can be completed by hardware integrated logic circuits in the processor 1510 or instructions in the form of software.
  • The aforementioned processor 1510 may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like.
  • the steps of the method disclosed in the embodiments of the present application may be directly embodied as being executed and completed by a hardware decoding processor, or executed and completed by a combination of hardware and software modules in the decoding processor.
  • The software module may be located in a mature storage medium in the field, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register.
  • the storage medium is located in the memory 1540, and the processor 1510 reads the information in the memory 1540, and completes the steps of the foregoing method in combination with its hardware.
  • the processor 1510 is configured to execute the steps of the processing executed by the first device in the embodiment shown in FIG. 7.
  • the above communication unit may be an interface circuit or communication interface of the device for receiving signals from other devices.
  • the communication unit is an interface circuit or communication interface used by the chip to receive signals or send signals from other chips or devices.
  • the instructions stored in the memory for execution by the processor may be implemented in the form of a computer program product.
  • the computer program product may be written in the memory in advance, or it may be downloaded and installed in the memory in the form of software.
  • the computer program product includes one or more computer instructions.
  • the computer can be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • Computer instructions may be stored in a computer-readable storage medium, or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • Computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wirelessly (such as by infrared, radio, or microwave).
  • The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, integrating one or more available media.
  • The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid-state disk (SSD)).
  • A computer-readable storage medium is provided, and instructions are stored in the computer-readable storage medium.
  • When the instructions are run, the first device or a chip applied to the first device executes steps 701, 702, and 703 in the embodiment.
  • The aforementioned readable storage medium may include: a USB flash drive, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, an optical disc, or other media that can store program code.
  • A computer program product including instructions is provided, and the computer program product stores the instructions.
  • When the instructions are run, the first device or a chip applied to the first device executes steps 701, 702, and 703 in the embodiment.
  • A chip is provided, which is applied in a first device.
  • The chip includes at least one processor and a communication interface, the communication interface is coupled to the at least one processor, and the processor is used to run instructions to execute steps 701, 702, and 703 in the embodiment.
  • The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
  • When a software program is used, the embodiments may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions.
  • When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are generated in whole or in part.


Abstract

This application provides a target detection method and apparatus, relating to the field of autonomous driving, for improving the accuracy of determining that targets detected by different sensors are the same target. The method includes: determining an overlapping region of a first target region and a second target region, where the first target region is determined by a detection error of a first sensor and a position of a first target detected by the first sensor, and the second target region is determined by a detection error of a second sensor and a position of a second target detected by the second sensor; calculating the degree of overlap of the overlapping region relative to the union of the first target region and the second target region; and when the degree of overlap is greater than or equal to a first threshold, determining, according to the relationship between the position of the first target and the position of the second target and a first region, and/or according to the relationship between the distance between the position of the first target and the position of the second target and a second threshold, that the confidence that the first target and the second target are the same target is a target confidence.

Description

Target Detection Method and Apparatus

Technical Field

The embodiments of this application relate to the field of autonomous driving, and in particular, to a target detection method and apparatus.

Background

The environment perception of autonomous driving relies on the sensors deployed on the vehicle, which may include cameras, millimeter-wave radars, and the like. Multiple sensors working cooperatively, with complementary advantages, can enhance the environment-perception capability and reliability of autonomous driving.

Cameras feature passive measurement, non-contact measurement, high resolution, ease of use, and low cost, and are indispensable sensors for autonomous-driving environment perception, target detection, and advanced driver assistance systems (ADAS). Compared with cameras, millimeter-wave radar is an active measurement method with high ranging accuracy, all-weather operation, and moderate price, and is increasingly widely used in autonomous driving scenarios.

The detection ranges of the camera and the millimeter-wave radar overlap, so the same target can be detected by both at the same time. Fusing the target detected by the camera and the target detected by the millimeter-wave radar can improve the accuracy and reliability of target detection. However, because sensors have detection errors, how to effectively fuse the same target detected by different sensors is a fundamental problem in implementing functions such as target tracking and vehicle obstacle avoidance.

This raises the question of how to determine whether the target detected by the camera and the target detected by the millimeter-wave radar are the same target.
Summary

The embodiments of this application provide a target detection method and apparatus, for improving the accuracy of determining that targets detected by different sensors are the same target.

To achieve the above objective, the embodiments of this application provide the following technical solutions:

According to a first aspect, an embodiment of this application provides a target detection method, including: a first device determines an overlapping region of a first target region and a second target region, where the first target region is determined by a detection error of a first sensor and a position of a first target detected by the first sensor, and the second target region is determined by a detection error of a second sensor and a position of a second target detected by the second sensor. The first target region includes the position of the first target, and the second target region includes the position of the second target. The first device calculates the degree of overlap of the overlapping region relative to the union of the first target region and the second target region. When the degree of overlap is greater than or equal to a first threshold, the first device determines, according to the relationship between the position of the first target and the position of the second target and a first region, and/or according to the relationship between the distance between the position of the first target and the position of the second target and a second threshold, that the confidence that the first target and the second target are the same target is a target confidence. The first region is determined by a first detection error of the first sensor and a third detection error of the second sensor. The detection error of the first sensor includes the first detection error and a second detection error, and the accuracy of the first detection error is higher than the accuracy of the second detection error. The detection error of the second sensor includes the third detection error and a fourth detection error, and the accuracy of the third detection error is higher than the accuracy of the fourth detection error.

An embodiment of this application provides a target detection method in which the first device determines the degree of overlap of the overlapping region of the first target region and the second target region, and then, when the degree of overlap is greater than or equal to the first threshold, determines, according to the relationship between the positions of the first target and the second target and the first region, and/or according to the relationship between the distance between the positions of the first target and the second target and the second threshold, that the confidence that the first target and the second target are the same target is the target confidence. In this way, when the degree of overlap is greater than or equal to the first threshold, the relationship between the target positions and the first region and/or the relationship between the distance between the target positions and the second threshold is used to further establish the accuracy of the confidence that the first target and the second target are the same target, reducing misjudgment.
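The decision flow of the first aspect can be sketched as follows. The axis-aligned rectangle model for the target regions and the first region, and all numeric thresholds and confidence values below, are illustrative assumptions; the patent does not fix concrete geometry or numbers.

```python
from dataclasses import dataclass
from math import hypot


@dataclass
class Rect:
    cx: float  # center x (detected target position, X axis)
    cy: float  # center y (detected target position, Y axis)
    hx: float  # half-width, e.g. horizontal ranging error
    hy: float  # half-height, e.g. vertical ranging error

    def area(self) -> float:
        return 4.0 * self.hx * self.hy

    def contains(self, x: float, y: float) -> bool:
        return abs(x - self.cx) <= self.hx and abs(y - self.cy) <= self.hy


def overlap_area(a: Rect, b: Rect) -> float:
    # Intersection of two axis-aligned rectangles (0 if disjoint).
    w = min(a.cx + a.hx, b.cx + b.hx) - max(a.cx - a.hx, b.cx - b.hx)
    h = min(a.cy + a.hy, b.cy + b.hy) - max(a.cy - a.hy, b.cy - b.hy)
    return max(w, 0.0) * max(h, 0.0)


def target_confidence(r1: Rect, p1, r2: Rect, p2, first_region: Rect,
                      first_threshold: float = 0.3,
                      second_threshold: float = 1.0) -> float:
    """Degree of overlap = intersection over union of the two target regions.
    Only when it passes the first threshold are the position checks applied."""
    inter = overlap_area(r1, r2)
    union = r1.area() + r2.area() - inter
    if union == 0.0 or inter / union < first_threshold:
        return 0.0  # overlap check failed: low confidence
    both_inside = first_region.contains(*p1) and first_region.contains(*p2)
    close = hypot(p1[0] - p2[0], p1[1] - p2[1]) <= second_threshold
    return 0.9 if (both_inside and close) else 0.4  # placeholder confidences
```

The function returns a higher placeholder confidence only when both extra criteria (positions inside the first region, distance within the second threshold) hold, mirroring the fifth/sixth-confidence distinction described later.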
In a possible implementation, the relationship between the position of the first target and the position of the second target and the first region includes: both the position of the first target and the position of the second target are located within the first region; or, the relationship includes: at least one of the position of the first target and the position of the second target is located outside the first region.

In a possible implementation, when the degree of overlap is greater than or equal to the first threshold and both the position of the first target and the position of the second target are located within the first region, the target confidence is a first confidence. When the degree of overlap is greater than or equal to the first threshold and one or more of the position of the first target and the position of the second target are located outside the first region, the target confidence is a second confidence. The first confidence is greater than the second confidence. In other words, when the degree of overlap is greater than or equal to the first threshold, the confidence that a first target and a second target whose positions lie within the first region are the same target is greater than the confidence that a first target and a second target whose positions lie outside the first region are the same target.

In a possible implementation, the relationship between the distance between the position of the first target and the position of the second target and the second threshold includes: the distance between the position of the first target and the position of the second target is less than or equal to the second threshold; or, the relationship includes: the distance between the position of the first target and the position of the second target is greater than the second threshold.

In a possible implementation, when the degree of overlap is greater than or equal to the first threshold and the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the target confidence is a third confidence. When the degree of overlap is greater than or equal to the first threshold and the distance between the position of the first target and the position of the second target is greater than the second threshold, the target confidence is a fourth confidence, and the third confidence is greater than the fourth confidence. That is, when the degree of overlap is greater than or equal to the first threshold, two targets close to each other (for example, with a distance between their positions less than or equal to the second threshold) have a higher confidence of being the same target than two targets far from each other (for example, with a distance between their positions greater than the second threshold).

In a possible implementation, the first device determining, according to the relationship between the positions of the first target and the second target and the first region, and/or according to the relationship between the distance between the positions of the first target and the second target and the second threshold, that the confidence that the first target and the second target are the same target is the target confidence includes: when both the position of the first target and the position of the second target are located within the first region, and/or the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the first device determines that the confidence is the target confidence, where the target confidence is greater than or equal to a first confidence threshold. This further improves the accuracy of determining the confidence that the first target and the second target are the same target.

In a possible implementation, the first device determining, according to the relationship between the positions of the first target and the second target and the first region, that the confidence that the first target and the second target are the same target is the target confidence includes: when both the position of the first target and the position of the second target are located within the first region, the first device determines that the confidence is the target confidence, where the target confidence is greater than or equal to the first confidence threshold.

In a possible implementation, the first device determining, according to the relationship between the positions of the first target and the second target and the first region, that the confidence that the first target and the second target are the same target is the target confidence includes: when at least one of the position of the first target and the position of the second target is located outside the first region, the first device determines that the confidence that the first target and the second target are the same target is the target confidence, where the target confidence is less than or equal to a second confidence threshold.

In a possible implementation, the first device determining, according to the relationship between the distance between the positions of the first target and the second target and the second threshold, that the confidence that the first target and the second target are the same target is the target confidence includes: when the distance between the position of the first target and the position of the second target is less than or equal to the second threshold, the first device determines that the confidence is the target confidence, where the target confidence is greater than or equal to the first confidence threshold.

In a possible implementation, the first device determining, according to the relationship between the distance between the positions of the first target and the second target and the second threshold, that the confidence that the first target and the second target are the same target is the target confidence includes: when the distance between the position of the first target and the position of the second target is greater than or equal to the second threshold, the first device determines that the confidence is the target confidence, where the target confidence is less than or equal to the second confidence threshold. This makes it convenient, when the degree of overlap is greater than or equal to the first threshold, for the first device to determine the likelihood that two targets far from each other are not the same target.

In a possible implementation, when the degree of overlap is greater than or equal to the first threshold, the first device determining, according to the relationship between the positions of the first target and the second target and the first region and the relationship between the distance between the positions of the first target and the second target and the second threshold, the confidence that the first target and the second target are the same target includes: when the degree of overlap is greater than or equal to the first threshold, both the position of the first target and the position of the second target are located within the first region, and the distance between them is less than or equal to the second threshold, the first device determines that the confidence that the first target and the second target are the same target is the target confidence, and the target confidence is greater than or equal to the first confidence threshold. Introducing, when the degree of overlap is greater than or equal to the first threshold, the joint criterion that both positions lie within the first region and that the distance between them is less than or equal to the second threshold can further increase the likelihood that the first target and the second target are indeed the same target.

In a possible implementation, when the degree of overlap is greater than or equal to the first threshold, both the position of the first target and the position of the second target are located within the first region, and the distance between them is less than or equal to the second threshold, the target confidence is a fifth confidence; when the degree of overlap is greater than or equal to the first threshold, at least one of the positions is located outside the first region, and/or the distance between the positions is greater than the second threshold, the target confidence is a sixth confidence. The fifth confidence is greater than the sixth confidence.

In the embodiments of this application, when the target confidence is greater than or equal to the first confidence threshold, the first target and the second target may be considered the same target; when the target confidence is less than or equal to the second confidence threshold, the first target and the second target may be considered not the same target.
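The confidence-threshold decision described in the paragraph above can be sketched as follows; the concrete threshold values are placeholders, since the patent leaves them open.

```python
def same_target_decision(target_confidence: float,
                         first_conf_threshold: float = 0.8,
                         second_conf_threshold: float = 0.2) -> str:
    """Apply the rule above: at or above the first confidence threshold the
    targets are considered the same; at or below the second they are not;
    anything in between is left undetermined."""
    if target_confidence >= first_conf_threshold:
        return "same target"
    if target_confidence <= second_conf_threshold:
        return "not the same target"
    return "undetermined"
```

For example, a target confidence of 90% would be judged "same target" under these placeholder thresholds, while 10% would be judged "not the same target".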
In a possible implementation, the first threshold is determined by the second detection error, or the first threshold is determined by the fourth detection error.

In a possible implementation, the first threshold increases as the second detection error increases, or the first threshold increases as the fourth detection error increases.

In a possible implementation, when the second detection error is a first error, the first threshold is a first value; when the second detection error is a second error, the first threshold is a second value, where the first error is greater than the second error and the first value is greater than the second value. Alternatively, when the fourth detection error is a third error, the first threshold is a third value; when the fourth detection error is a fourth error, the first threshold is a fourth value, where the third error is greater than the fourth error and the third value is greater than the fourth value.
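The monotonic mapping from detection error to first threshold described above can be sketched as a lookup table; all numeric error and threshold values below are placeholders, not values from the patent.

```python
# Placeholder table: a larger second/fourth detection error maps to a larger
# first threshold, matching the monotonic relationship described above.
ERROR_TO_FIRST_THRESHOLD = [
    (0.5, 0.30),  # (detection error in meters, first threshold)
    (1.0, 0.40),
    (2.0, 0.55),
]


def first_threshold(detection_error: float) -> float:
    """Return the first threshold whose stored error is closest to the input."""
    closest = min(ERROR_TO_FIRST_THRESHOLD,
                  key=lambda pair: abs(pair[0] - detection_error))
    return closest[1]
```

A real system would calibrate this table per sensor; the nearest-key lookup is just one simple way to realize "when the error is the third error, use the corresponding value as the first threshold".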
In a possible implementation, the second threshold is determined by the second detection error and the fourth detection error.

In a possible implementation, the second threshold decreases as the second detection error and the fourth detection error increase.

In a possible implementation, when the second detection error is a fifth error and the fourth detection error is a sixth error, the second threshold is a fifth value; when the second detection error is a seventh error and the fourth detection error is an eighth error, the second threshold is a sixth value, where the fifth value is less than the sixth value, the fifth error is greater than the seventh error, and the sixth error is greater than the eighth error.

In a possible implementation, the degree of overlap is the proportion of the overlapping region of the first target region and the second target region in the union of the first target region and the second target region.

In a possible implementation, the first device calculating the degree of overlap of the overlapping region relative to the union of the first target region and the second target region may be replaced by the following: the first device calculates the area of the overlapping region, in which case the degree of overlap above may be replaced by the area of the overlapping region.
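For axis-aligned rectangular regions, both metrics just mentioned (the overlap ratio relative to the union, i.e. intersection over union, and the plain overlap area) can be sketched as below; the (x_min, y_min, x_max, y_max) rectangle representation is an assumption for illustration.

```python
def overlap_metrics(a, b):
    """a, b: rectangles as (x_min, y_min, x_max, y_max).
    Returns (overlap_area, overlap_ratio), where overlap_ratio is the
    intersection over the union of the two regions (IoU)."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    inter = max(w, 0.0) * max(h, 0.0)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter, (inter / union if union else 0.0)
```

The first device can compare either value against the first threshold; the ratio form is scale-invariant, while the raw area form matches the alternative formulation in the paragraph above.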
In a possible implementation, the first sensor is a radar and the second sensor is a camera. For example, both the first sensor and the second sensor are vehicle-mounted sensors; for example, the first sensor is a vehicle-mounted radar and the second sensor is a vehicle-mounted camera.

In a possible implementation, the first detection error is the ranging error of the first sensor in the vertical direction, and the second detection error is the angle-measurement error of the first sensor or the ranging error of the first sensor in the horizontal direction. The third detection error is the angle-measurement error of the second sensor. The fourth detection error is the ranging error of the second sensor.

In a possible implementation, the position of the first target is the position of the first target relative to a reference point on an XY plane, where the X-axis direction of the XY plane is the vehicle width direction and the Y-axis direction of the XY plane is the vehicle length direction. The position of the second target is the position of the second target relative to the reference point on the XY plane.

In a possible implementation, the first device may be a vehicle, a server, or a component in a vehicle capable of implementing the vehicle's processing functions or communication functions (for example, circuitry inside the vehicle), such as a telematics box (T-BOX). Alternatively, the first device may be a chip in the T-BOX.
According to a second aspect, an embodiment of this application provides a target detection apparatus. The target detection apparatus can implement the method in the first aspect or any possible implementation of the first aspect, and can therefore also achieve the beneficial effects of the first aspect or any possible implementation of the first aspect. The target detection apparatus may be the first device, or may be an apparatus that can support the first device in implementing the method in the first aspect or any possible implementation of the first aspect, for example, a chip or a circuit component applied in the first device. The apparatus may implement the above method by software, by hardware, or by hardware executing corresponding software.

As an example, a target detection apparatus includes: a first determination module, configured to determine an overlapping region of a first target region and a second target region, where the first target region is determined by a detection error of a first sensor and a position of a first target detected by the first sensor, the second target region is determined by a detection error of a second sensor and a position of a second target detected by the second sensor, the first target region includes the position of the first target, and the second target region includes the position of the second target; a calculation module, configured to calculate the degree of overlap of the overlapping region relative to the union of the first target region and the second target region; and a second determination module, configured to, when the degree of overlap is greater than or equal to a first threshold, determine, according to the relationship between the position of the first target and the position of the second target and a first region, and/or according to the relationship between the distance between the position of the first target and the position of the second target and a second threshold, that the confidence that the first target and the second target are the same target is a target confidence. The first region is determined by a first detection error of the first sensor and a third detection error of the second sensor; the detection error of the first sensor includes the first detection error and a second detection error, and the accuracy of the first detection error is higher than the accuracy of the second detection error; the detection error of the second sensor includes the third detection error and a fourth detection error, and the accuracy of the third detection error is higher than the accuracy of the fourth detection error.

In a possible implementation, the relationship between the position of the first target and the position of the second target and the first region includes: both the position of the first target and the position of the second target are located within the first region; or, the relationship includes: at least one of the position of the first target and the position of the second target is located outside the first region.

In a possible implementation, when the degree of overlap is greater than or equal to the first threshold and both positions are located within the first region, the target confidence is a first confidence; when the degree of overlap is greater than or equal to the first threshold and one or more of the positions are located outside the first region, the target confidence is a second confidence, and the first confidence is greater than the second confidence. In other words, when the degree of overlap is greater than or equal to the first threshold, the confidence that a first target and a second target whose positions lie within the first region are the same target is greater than the confidence, when the degree of overlap is greater than or equal to the first threshold, that a first target and a second target whose positions lie outside the first region are the same target.

In a possible implementation, the relationship between the distance between the position of the first target and the position of the second target and the second threshold includes: the distance between the position of the first target and the position of the second target is less than or equal to the second threshold; or, the relationship includes: the distance between the position of the first target and the position of the second target is greater than the second threshold.

In a possible implementation, when the degree of overlap is greater than or equal to the first threshold and the distance between the positions is less than or equal to the second threshold, the target confidence is a third confidence; when the degree of overlap is greater than or equal to the first threshold and the distance between the positions is greater than the second threshold, the target confidence is a fourth confidence, and the third confidence is greater than the fourth confidence. That is, when the degree of overlap is greater than or equal to the first threshold, two targets close to each other (for example, with a distance between their positions less than or equal to the second threshold) have a higher confidence of being the same target than two targets far from each other (for example, with a distance between their positions greater than the second threshold).

In a possible implementation, when both positions are located within the first region, and/or the distance between the positions is less than or equal to the second threshold, the second determination module is configured to determine that the confidence is the target confidence, where the target confidence is greater than or equal to a first confidence threshold. This further improves the accuracy of determining the confidence that the first target and the second target are the same target.

In a possible implementation, the second determination module being configured to determine, according to the relationship between the positions of the first target and the second target and the first region, that the confidence that the first target and the second target are the same target is the target confidence includes: when both positions are located within the first region, the second determination module is configured to determine that the confidence is the target confidence, where the target confidence is greater than or equal to the first confidence threshold.

In a possible implementation, the second determination module being configured to determine, according to the relationship between the positions of the first target and the second target and the first region, that the confidence that the first target and the second target are the same target is the target confidence includes: when at least one of the positions is located outside the first region, the second determination module is configured to determine that the confidence that the first target and the second target are the same target is the target confidence, where the target confidence is less than or equal to a second confidence threshold.

In a possible implementation, the second determination module being configured to determine, according to the relationship between the distance between the positions of the first target and the second target and the second threshold, that the confidence that the first target and the second target are the same target is the target confidence includes: when the distance between the positions is less than or equal to the second threshold, the second determination module is configured to determine that the confidence is the target confidence, where the target confidence is greater than or equal to the first confidence threshold.

In a possible implementation, the second determination module being configured to determine, according to the relationship between the distance between the positions of the first target and the second target and the second threshold, that the confidence that the first target and the second target are the same target is the target confidence includes: when the distance between the positions is greater than or equal to the second threshold, the second determination module is configured to determine that the confidence is the target confidence, where the target confidence is less than or equal to the second confidence threshold. This makes it convenient, when the degree of overlap is greater than or equal to the first threshold, for the first device to determine that two targets far from each other are not the same target.

In a possible implementation, when the degree of overlap is greater than or equal to the first threshold, the second determination module being configured to determine the confidence that the first target and the second target are the same target according to the relationship between the positions and the first region and the relationship between the distance between the positions and the second threshold includes: when the degree of overlap is greater than or equal to the first threshold, both positions are located within the first region, and the distance between the positions is less than or equal to the second threshold, the second determination module is configured to determine that the confidence that the first target and the second target are the same target is the target confidence, and the target confidence is greater than or equal to the first confidence threshold.

In a possible implementation, when the degree of overlap is greater than or equal to the first threshold, both positions are located within the first region, and the distance between the positions is less than or equal to the second threshold, the target confidence is a fifth confidence; when the degree of overlap is greater than or equal to the first threshold, at least one of the positions is located outside the first region, and/or the distance between the positions is greater than the second threshold, the target confidence is a sixth confidence. The fifth confidence is greater than the sixth confidence.

In the embodiments of this application, when the target confidence is greater than or equal to the first confidence threshold, the first target and the second target may be considered the same target; when the target confidence is less than or equal to the second confidence threshold, the first target and the second target may be considered not the same target.
In a possible implementation, the first threshold is determined by the second detection error, or the first threshold is determined by the fourth detection error.

In a possible implementation, the first threshold increases as the second detection error increases, or the first threshold increases as the fourth detection error increases.

In a possible implementation, when the second detection error is a first error, the first threshold is a first value; when the second detection error is a second error, the first threshold is a second value, where the first error is greater than the second error and the first value is greater than the second value. Alternatively, when the fourth detection error is a third error, the first threshold is a third value; when the fourth detection error is a fourth error, the first threshold is a fourth value, where the third error is greater than the fourth error and the third value is greater than the fourth value.

In a possible implementation, the second threshold is determined by the second detection error and the fourth detection error.

In a possible implementation, the second threshold decreases as the second detection error and the fourth detection error increase.

In a possible implementation, when the second detection error is a fifth error and the fourth detection error is a sixth error, the second threshold is a fifth value; when the second detection error is a seventh error and the fourth detection error is an eighth error, the second threshold is a sixth value, where the fifth value is less than the sixth value, the fifth error is greater than the seventh error, and the sixth error is greater than the eighth error.

In a possible implementation, the degree of overlap is the proportion of the overlapping region of the first target region and the second target region in the union of the first target region and the second target region.

In a possible implementation, the first sensor is a radar and the second sensor is a camera. For example, both the first sensor and the second sensor are vehicle-mounted sensors; for example, the first sensor is a vehicle-mounted radar and the second sensor is a vehicle-mounted camera.

In a possible implementation, the first detection error is the ranging error of the first sensor in the vertical direction, and the second detection error is the angle-measurement error of the first sensor or the ranging error of the first sensor in the horizontal direction. The third detection error is the angle-measurement error of the second sensor; the fourth detection error is the ranging error of the second sensor.

In a possible implementation, the position of the first target is the position of the first target relative to a reference point on an XY plane, where the X-axis direction of the XY plane is the vehicle width direction and the Y-axis direction of the XY plane is the vehicle length direction; the position of the second target is the position of the second target relative to the reference point on the XY plane.

As another example, an embodiment of this application provides a target detection apparatus, which may be the first device or a chip inside the first device. The target detection apparatus may include a processing unit. When the target detection apparatus is the first device, the processing unit may be a processor, and the processing unit may replace the first determination module, the second determination module, and the calculation module described above. In addition, the target detection apparatus may further include a communication unit, which may be an interface circuit, and may further include a storage unit, which may be a memory. The storage unit is configured to store computer program code, the computer program code including instructions. The processing unit executes the instructions stored in the storage unit, so that the target detection apparatus implements the method described in the first aspect or any possible implementation of the first aspect. When the target detection apparatus is a chip inside the first device, the processing unit may be a processor, and the communication unit may be collectively referred to as a communication interface; for example, the communication interface may be an input/output interface, a pin, or a circuit. The processing unit executes the computer program code stored in the storage unit, so that the first device implements the method described in the first aspect or any possible implementation of the first aspect; the storage unit may be a storage unit inside the chip (for example, a register or a cache), or may be a storage unit in the first device located outside the chip (for example, a read-only memory or a random access memory).
Optionally, the processor, the communication interface, and the memory are coupled to one another.

According to a third aspect, an embodiment of this application provides a computer-readable storage medium. The computer-readable storage medium stores a computer program or instructions that, when run on a computer, cause the computer to execute the target detection method described in the first aspect or any possible implementation of the first aspect.

According to a fourth aspect, an embodiment of this application provides a computer program product including instructions that, when run on a computer, cause the computer to execute the target detection method described in the first aspect or the various possible implementations of the first aspect.

According to a fifth aspect, an embodiment of this application provides a communication apparatus, including a processor and a storage medium, where the storage medium stores instructions that, when run by the processor, implement the target detection method described in the first aspect or the various possible implementations of the first aspect.

According to a sixth aspect, an embodiment of this application provides a communication apparatus, including one or more modules for implementing the method of the first aspect, where the one or more modules may correspond to the respective steps of the method of the first aspect.

According to a seventh aspect, an embodiment of this application provides a chip, including a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to run a computer program or instructions to implement the target detection method described in the first aspect or the various possible implementations of the first aspect. The communication interface is configured to communicate with modules outside the chip.

Specifically, the chip provided in the embodiments of this application further includes a memory for storing the computer program or instructions.

According to an eighth aspect, an embodiment of this application provides a communication apparatus, including a processor, where the processor is coupled to a memory, the memory stores a computer program or instructions, and the processor is configured to run the computer program or instructions stored in the memory to implement the target detection method described in the first aspect or the various possible implementations of the first aspect.

Any of the apparatuses, computer storage media, computer program products, chips, or communication systems provided above is configured to execute the corresponding method provided above. Therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding solutions in the corresponding method provided above; details are not repeated here.
Brief Description of the Drawings

FIG. 1 is a first schematic diagram of the positions of an image target and a radar target according to an embodiment of this application;

FIG. 2 is a second schematic diagram of the positions of an image target and a radar target according to an embodiment of this application;

FIG. 3 is a third schematic diagram of the positions of an image target and a radar target according to an embodiment of this application;

FIG. 4 is a fourth schematic diagram of the positions of an image target and a radar target according to an embodiment of this application;

FIG. 5 is a fifth schematic diagram of the positions of an image target and a radar target according to an embodiment of this application;

FIG. 6 is a schematic structural diagram of a communication device according to an embodiment of this application;

FIG. 7 is a schematic flowchart of a target detection method according to an embodiment of this application;

FIG. 8 is a schematic diagram of a first region according to an embodiment of this application;

FIG. 9a is a first schematic diagram of the positions of a first target and a second target according to an embodiment of this application;

FIG. 9b is a second schematic diagram of the positions of a first target and a second target according to an embodiment of this application;

FIG. 10a is a third schematic diagram of the positions of a first target and a second target according to an embodiment of this application;

FIG. 10b is a fourth schematic diagram of the positions of a first target and a second target according to an embodiment of this application;

FIG. 10c is a fifth schematic diagram of the positions of a first target and a second target according to an embodiment of this application;

FIG. 11 is a sixth schematic diagram of the positions of a first target and a second target according to an embodiment of this application;

FIG. 12 is a seventh schematic diagram of the positions of a first target and a second target according to an embodiment of this application;

FIG. 13 is an eighth schematic diagram of the positions of a first target and a second target according to an embodiment of this application;

FIG. 14 is a ninth schematic diagram of the positions of a first target and a second target according to an embodiment of this application;

FIG. 15 is a schematic structural diagram of a target detection apparatus according to an embodiment of this application;

FIG. 16 is a schematic structural diagram of another target detection apparatus according to an embodiment of this application;

FIG. 17 is a schematic structural diagram of a chip according to an embodiment of this application.
Detailed Description

To clearly describe the technical solutions of the embodiments of this application, in the embodiments of this application words such as "first" and "second" are used to distinguish between identical or similar items whose functions and effects are basically the same. For example, the first sensor and the second sensor are distinguished merely to differentiate different sensors, without limiting their order. Those skilled in the art can understand that words such as "first" and "second" do not limit quantity or execution order, nor do they necessarily indicate difference.

It should be noted that in this application, words such as "exemplary" or "for example" are used to indicate an example, illustration, or explanation. Any embodiment or design scheme described as "exemplary" or "for example" in this application should not be construed as more preferable or advantageous than other embodiments or design schemes. Rather, the use of words such as "exemplary" or "for example" is intended to present related concepts in a concrete manner.

The network architectures and service scenarios described in the embodiments of this application are intended to describe the technical solutions of the embodiments of this application more clearly, and do not constitute a limitation on the technical solutions provided in the embodiments of this application. Those of ordinary skill in the art will know that, as network architectures evolve and new service scenarios emerge, the technical solutions provided in the embodiments of this application are equally applicable to similar technical problems.

In this application, "at least one" means one or more, and "multiple" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist. For example, A and/or B may indicate the cases of A alone, both A and B, and B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects before and after it. "At least one of the following items" or a similar expression refers to any combination of these items, including any combination of a single item or multiple items. For example, at least one of a, b, or c may indicate: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c may each be single or multiple.

In the embodiments of this application, a target detected from an image captured by a monocular camera is referred to as an image target, and a target detected from an image captured by a millimeter-wave radar is referred to as a radar target.
As shown in FIG. 1, assume the vehicle is located at the origin O of the coordinate system, the X-axis represents the vehicle width direction (widthwise direction), and the Y-axis represents the vehicle length direction (lengthwise direction). In FIG. 1, EYr denotes the position error of the radar target in the vertical direction, and EXr denotes the position error of the radar target in the horizontal direction, where EYr is less than EXr; R_r denotes the error region of the radar target, and R_r is determined by EYr and EXr. Eθi denotes the angle-measurement accuracy of the image target, and EYi denotes the ranging accuracy of the image target; because image-target detection is based on a monocular camera, Eθi is higher than EYi. Similarly, R_i denotes the error region of the image target, and R_i is determined by Eθi and EYi. The size of the overlapping area of the radar target's error region and the image target's error region (for example, the shaded part R_x in FIG. 1) reflects the likelihood that the image target and the radar target come from the same object, in other words, the likelihood that the image target and the radar target are the same target. When the overlapping area is greater than a certain threshold, the radar target and the image target are determined to be the same target.

As shown in FIG. 2, the difference between FIG. 2 and FIG. 1 is that in FIG. 2 the error region of the radar target is determined by the vertical-direction error EYr and the angle-measurement error Eθr. The error region R_r of the radar target in FIG. 2 is shaped roughly like a parallelogram (in FIG. 1 the error region of the radar target is a rectangle). In this case, the relationship between the size of the overlapping area of the radar target's error region and the image target's error region and a certain threshold can still be used to determine whether the image target and the radar target are the same target. In the embodiments of this application, determining whether two targets (for example, an image target and a radar target) are the same target may also be understood as determining whether the two targets come from the same object.

Although an overlapping area greater than a certain threshold can be used to determine whether the image target and the radar target are the same target, the following problem remains: even when the overlapping areas are equal in size, the positional relationships of the targets may differ, which may lead to erroneously judging the image target and the radar target to be the same target. As shown in FIG. 3, suppose the overlapping area of the error region of the radar target at position X with the error region of the image target is A, and the overlapping area of the error region of the radar target at position X' with the error region of the image target is B. If A and B are both greater than a certain threshold and A equals B, then in theory, according to the above scheme, the likelihood that the radar target at position X and the image target are the same target should equal the likelihood that the radar target at position X' and the image target are the same target; that is, the radar target at position X and the image target would be the same target, and the radar target at position X' and the image target would also be the same target. In practice, however, the following situation may exist: the radar target at position X and the radar target at position X' are not the same radar target. When they are not the same radar target, the likelihood that the radar target at position X and the image target are the same target should differ from the likelihood that the radar target at position X' and the image target are the same target; for example, the radar target at position X and the image target are the same target, while the radar target at position X' and the image target are not. But judging only by the relationship between overlapping area and threshold, since A and B are both greater than the threshold, it may be concluded that the radar target at position X' and the image target are the same target and that the radar target at position X and the image target are the same target, which contradicts the facts; therefore, relying solely on the relationship between overlapping area and threshold causes misjudgment. Moreover, even if the radar target at position X' and the radar target at position X are the same radar target, the likelihoods that each of them and the same image target are the same target are still not identical.

Furthermore, as shown in FIG. 4, the overlapping area of the error region R_i of the image target at position Y with the error region R_r of the radar target is C, and the overlapping area of the error region R_i of the image target at position Y' with the error region R_r of the radar target is D. Suppose C and D are both greater than a certain threshold and C equals D. In theory, according to the above scheme, the likelihood that the image target at position Y' and the radar target are the same target should equal the likelihood that the image target at position Y and the radar target are the same target; that is, the image target at position Y' and the radar target would be the same target, and the image target at position Y and the radar target would also be the same target. However, if the image targets at the different positions are not the same image target, the likelihood that the image target at position Y' and the radar target are the same target should differ from the likelihood that the image target at position Y and the radar target are the same target. But the above scheme judges only by the relationship between overlapping area and threshold; since C and D are both greater than the threshold, it may be concluded that the image target at Y' and the radar target are the same target and that the image target at position Y and the radar target are the same target, which contradicts the facts; therefore, relying solely on the relationship between overlapping area and threshold causes misjudgment.

As shown in FIG. 5, when the detection errors of the camera and the radar in the horizontal direction keep growing, the overlapping area of the image target's error region and the radar target's error region reaches its maximum. At this point, the image target and the radar target can appear at many more different positions while the overlapping area may remain unchanged. That is, the overlapping area alone cannot accurately determine whether the image target and the radar target are the same target, so the likelihood of misjudgment in this case is greatly increased.
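The "saturation" effect described here can be illustrated numerically: once the growing error region fully covers the other region in one direction, further error growth no longer changes the overlap. The rectangle model and all numbers below are illustrative assumptions, not values from the patent.

```python
def rect_overlap(a, b):
    # a, b: rectangles as (x_min, y_min, x_max, y_max)
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0.0) * max(h, 0.0)


def image_region(ex):
    # Image-target error region centered at (0.5, 0), half-width ex (the
    # growing horizontal error), fixed half-height 1.0.
    return (0.5 - ex, -1.0, 0.5 + ex, 1.0)


RADAR = (-1.0, -1.0, 1.0, 1.0)  # fixed radar-target error region

# As the horizontal error grows, the overlap first increases, then saturates:
areas = [rect_overlap(RADAR, image_region(ex)) for ex in (0.5, 1.5, 3.0, 6.0)]
```

Beyond the point where the image region spans the radar region horizontally, the overlap area stays at its maximum, so the overlap metric alone can no longer discriminate between different target positions.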
It can be seen from the above analysis that judging whether an image target and a radar target are the same target based solely on the metric of the size of the overlapping area of the radar target's error region and the image target's error region may lead to misjudgment. Based on this, an embodiment of this application provides a target detection method that, in addition to the metric of the size of the overlapping area of the radar target's error region and the image target's error region, also considers other metrics to improve the accuracy of judging whether the image target and the radar target are the same target.
As shown in FIG. 6, FIG. 6 is a schematic diagram of the hardware structure of a target detection apparatus in an embodiment of this application. The structure of the first device in the embodiments of this application may refer to the structure shown in FIG. 6. The target detection apparatus includes a processor 41 and a communication line 43.

Optionally, the target detection apparatus may further include a memory 42.

The processor 41 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the programs of the solutions of this application.

The communication line 43 may include a path for transferring information between the above components.

The memory 42 may be a read-only memory (ROM) or another type of static storage device capable of storing static information and instructions, a random access memory (RAM) or another type of dynamic storage device capable of storing information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage, optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, and the like), a magnetic disk storage medium or another magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The memory may exist independently and be connected to the processor through the communication line 43, or the memory may be integrated with the processor.

The memory 42 is configured to store computer-executable instructions for executing the solutions of this application, and the processor 41 controls the execution. The processor 41 is configured to execute the computer-executable instructions stored in the memory 42, thereby implementing the target detection method provided in the following embodiments of this application.

It can be understood that, if the first device is a vehicle, the target detection apparatus may further include the first sensor and the second sensor.

It can be understood that the steps performed by the first device in the target detection method of the embodiments of this application may also be performed by a chip applied in the first device. The following embodiments are described by taking the first device as the execution subject of the target detection method.

The embodiments of this application may draw on or refer to one another; for example, identical or similar steps, and method embodiments, apparatus embodiments, or system embodiments, may be cross-referenced without limitation.
如图7所示,图7示出了本申请实施例提供的一种目标检测方法的实施例,该方法包括:
步骤701、第一设备确定第一目标区域以及第二目标区域的重叠区域,第一目标区域由第一传感器的检测误差以及由第一传感器检测到的第一目标的位置确定,第二目标区域由第二传感器的检测误差以及由第二传感器检测到的第二目标的位置确定。
其中,第一目标区域包括第一目标的位置,换言之,第一目标的位置位于第一目标区域内。第二目标区域包括第二目标的位置,换言之,第二目标的位置位于第二目标区域内。
本申请实施例中第一目标区域和第二目标区域的重叠区域即为第一目标区域和第二目标区域存在的共同区域，换言之，重叠区域为第一目标区域和第二目标区域的交集，或者说重叠区域既属于第一目标区域又属于第二目标区域。
本申请实施例中可以将第一传感器捕捉的图像而检测到的目标称为第一目标,将由第二传感器捕捉的图像而检测到的目标称为第二目标。
本申请实施例中的第一设备可以为车辆，或者应用于车辆中的芯片，或者应用于车辆中的部件(例如，电子控制单元(Electronic Control Unit,ECU)或者自动驾驶控制器)，或者第一设备可以为服务器，或者为应用于服务器的芯片。电子控制单元，又称“行车电脑”、“车载电脑”。具体的，本申请实施例中的第一设备可以为车辆中具有信息处理功能的部件。
示例性的,第一传感器的检测误差为第一传感器的位置误差。例如,第一传感器的检测误差包括第一检测误差和第二检测误差。例如,第一检测误差为第一传感器在垂直方向上的测距误差,第二检测误差为第一传感器的测角误差或者第一传感器在水平方向上的测距误差。其中,第一传感器的检测误差基于第一传感器的特性预先确定。
示例性的,第二传感器的检测误差为第二传感器的位置误差。例如,第二传感器的检测误差包括第三检测误差和第四检测误差。第三检测误差为第二传感器的测角误差,第四检测误差为第二传感器的测距误差。第二传感器的检测误差基于第二传感器的特性预先确定。第二传感器的测距误差可以为第二传感器在水平方向上的测距误差或者垂直方向上的测距误差。
本申请实施例中的传感器的“误差”也可以替换为传感器的“精度”,例如,测角误差可以替换为测角精度。本申请实施例中的检测误差也可以称为:测量误差。
作为一种可能的实现,本申请实施例中第一目标区域由垂直设置的平行线第五线条和第六线条,以及水平设置的平行线第七线条和第八线条围合的闭合区域限定。其中,第五线条和第六线条平行设置,且第五线条与第一目标的位置之间的最短距离和第六线条与第一目标的位置之间的最短距离相等,第五线条与第一目标的位置之间的最短距离为第一传感器在垂直方向上的测距误差。第七线条和第八线条平行设置,且第七线条与第一目标的位置之间的最短距离和第八线条与第一目标的位置之间的最短距离相等,第七线条与第一目标的位置之间的最短距离为第一传感器在水平方向上的测距误差。
换言之，本申请实施例中第一目标区域由以第一目标的位置为中心，沿垂直方向上下各延伸第一传感器在垂直方向上的测距误差，且沿水平方向左右各延伸第一传感器在水平方向上的测距误差所围成的闭合区域限定，也即第一目标的位置为第一目标区域的中心。
作为一种具体实现，本申请实施例中第一目标区域可以表示为Y1-EYr≤Yr≤Y1+EYr和X1-EXr≤Xr≤X1+EXr，其中，Y1表示第一目标在垂直方向上的坐标，X1表示第一目标在水平方向上的坐标。±EYr表示第一传感器在垂直方向(Y轴方向)上的测距误差，±EXr表示第一传感器在水平方向(X轴方向)上的测距误差。Yr表示第一目标区域在垂直方向上的坐标，Xr表示第一目标区域在水平方向上的坐标。
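作为帮助理解的示意，上述矩形形式的目标区域可以用如下Python草图表示(函数名与变量名为本文为说明而假设，并非本申请的限定实现)：

```python
def rect_region(x, y, ex, ey):
    """由目标位置(x, y)与水平/垂直方向测距误差(ex, ey)确定的矩形目标区域。

    返回 (x_min, x_max, y_min, y_max)，
    对应 X1-EXr <= Xr <= X1+EXr 与 Y1-EYr <= Yr <= Y1+EYr。
    """
    return (x - ex, x + ex, y - ey, y + ey)


def in_rect_region(px, py, region):
    """判断点(px, py)是否位于矩形目标区域region内。"""
    x_min, x_max, y_min, y_max = region
    return x_min <= px <= x_max and y_min <= py <= y_max
```

例如，目标位于(2.0, 10.0)、水平测距误差0.5、垂直测距误差1.0时，rect_region返回(1.5, 2.5, 9.0, 11.0)，目标位置本身位于该区域内。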
作为另一种可能的实现，本申请实施例中的第一目标区域由以第一目标的位置为中心，沿垂直方向上下延伸第一传感器在垂直方向上的测距误差，以及以第一目标的位置与基准点之间的连线为轴线，左右旋转第一传感器的测角误差所形成的闭合区域限定。此时，第一目标区域的具体形状可以参考图4中的Rr。
作为一种具体实现，本申请实施例中第一目标区域可以表示为Y1-EYr≤Yr≤Y1+EYr和X1-Eθr≤Xr≤X1+Eθr，其中，Y1表示第一目标在垂直方向上的坐标，X1表示第一目标在水平方向上的坐标。±EYr表示第一传感器在垂直方向(Y轴方向)上的测距误差，±Eθr表示第一传感器的测角误差。Yr表示第一目标区域在垂直方向上的坐标，Xr表示第一目标区域在水平方向上的坐标。
作为一种可能的实现,本申请实施例中的第二目标区域由以第二目标的位置为中心,沿垂直方向上下延伸第二传感器在垂直方向上的测距误差,以及以第二目标的位置与基准点之间的连线为轴线,左右旋转第二传感器的测角误差所形成的闭合区域限定。
作为一种具体实现，本申请实施例中第二目标区域可以表示为Y2-EYi≤Yi≤Y2+EYi和X2-Eθi≤Xi≤X2+Eθi，其中，Y2表示第二目标在垂直方向(Y轴方向)上的坐标，X2表示第二目标在水平方向(X轴方向)上的坐标。±EYi表示第二传感器在垂直方向(Y轴方向)上的测距误差，±Eθi表示第二传感器在水平方位角(X轴方向)上的测角误差。Yi表示第二目标区域在垂直方向上的坐标，Xi表示第二目标区域在水平方向上的坐标。
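由测距误差与测角误差共同限定的第二目标区域，可以理解为一个近似扇环状的区域。下面给出一个判断某点是否落入该区域的Python草图(以坐标原点为基准点，方位角以Y轴即车辆长度方向为参考，用math.atan2计算；该简化模型为本文假设)：

```python
import math


def in_angular_region(px, py, tx, ty, ey, e_theta):
    """判断点(px, py)是否位于以第二目标位置(tx, ty)为中心的第二目标区域内。

    垂直方向上要求 |py - ty| <= ey(对应测距误差EYi)，
    方位角上要求 |atan2(px, py) - atan2(tx, ty)| <= e_theta(对应测角误差Eθi，弧度)。
    """
    if abs(py - ty) > ey:
        return False
    # 以Y轴(车辆长度方向)为参考计算方位角
    az_p = math.atan2(px, py)
    az_t = math.atan2(tx, ty)
    return abs(az_p - az_t) <= e_theta
```

例如，方位角相同、纵向偏差0.5且EYi为1.0的点落入区域内；方位角偏差明显大于Eθi的点则落在区域外。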
如图8所示，Rir表示第一目标区域和第二目标区域的重叠区域(即图8中的阴影部分)，Rr表示第一目标区域，也即第一目标区域由Xr和Yr限定，换言之第一目标区域在垂直方向(Y轴方向)上的坐标范围的宽度为2EYr，在水平方向(X轴方向)上的坐标范围的宽度为2EXr。Ri表示第二目标区域，也即第二目标区域由Xi和Yi限定，换言之第二目标区域在垂直方向(Y轴方向)上的坐标范围的宽度为2EYi，在水平方位角(X轴方向)上的方位角范围宽度为2Eθi。
其中,第一目标的位置为第一目标相对于XY平面上的基准点的位置,XY平面的X轴方向是车辆宽度方向,XY平面的Y轴方向是车辆长度方向。第二目标的位置为第二目标相对于XY平面上的基准点的位置。
可以理解的是,XY平面上的基准点可以为图8所示的坐标原点O,该坐标原点O可以为车辆的位置,第一传感器和第二传感器部署在车辆上。
步骤702、第一设备计算重叠区域占第一目标区域以及第二目标区域并集的重叠度大小。
作为一种具体实现：重叠度大小为第一目标区域和第二目标区域的重叠区域在第一目标区域和第二目标区域的并集中的比例。例如，重叠度大小=(Rr与Ri的交集)/(Rr与Ri的并集)。其中，Rr表示第一目标区域，Ri表示第二目标区域。
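步骤702中“重叠度=交集/并集”的计算可以用下面的Python草图说明(这里将两个目标区域都近似为轴对齐矩形；实际的第二目标区域可能是扇环形，此处仅为理解用的示意)：

```python
def overlap_ratio(r1, r2):
    """计算两个矩形区域的重叠度：交集面积 / 并集面积。

    矩形用 (x_min, x_max, y_min, y_max) 表示。
    """
    # 交集在X、Y方向上的长度(无重叠时取0)
    ix = max(0.0, min(r1[1], r2[1]) - max(r1[0], r2[0]))
    iy = max(0.0, min(r1[3], r2[3]) - max(r1[2], r2[2]))
    inter = ix * iy
    area1 = (r1[1] - r1[0]) * (r1[3] - r1[2])
    area2 = (r2[1] - r2[0]) * (r2[3] - r2[2])
    union = area1 + area2 - inter
    return inter / union if union > 0 else 0.0
```

例如，两个面积均为4、交集面积为2的矩形，其重叠度为2/(4+4-2)=1/3；完全重合的两个区域重叠度为1，完全不相交时为0。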
另一种示例,本申请实施例中的步骤702可以通过以下步骤替换:第一设备计算重叠区域的重叠面积。关于如何计算重叠区域的面积可以参考现有技术中的描述,本申请实施例对此不做限定。
可以理解的是,如果重叠度大小为第一目标区域和第二目标区域的重叠区域在第一目标区域和第二目标区域的并集中的比例,则第一阈值也可以为预设比例值。
如果第一设备计算的是重叠面积,则第一阈值可以为预设面积阈值,本申请实施例对此不做限定。
步骤703、当重叠度大小大于或等于第一阈值时,第一设备根据第一目标的位置和第二目标的位置与第一区域之间的关系,和/或,第一设备根据第一目标的位置和第二目标的位置之间的距离与第二阈值之间的关系,确定第一目标和第二目标为同一个目标的置信度为目标置信度。
其中,第一区域由第一传感器的第一检测误差和第二传感器的第三检测误差确定,第一传感器的检测误差包括第一检测误差和第二检测误差,第一检测误差的精度高于第二检测误差的精度,第二传感器的检测误差包括第三检测误差和第四检测误差,所述第三检测误差的精度高于所述第四检测误差的精度。
本申请实施例中第一目标和第二目标为同一个目标的置信度可以理解为:第一目标和第二目标为同一个目标的可能性,或者第一目标和第二目标来源于同一个物体的可能性。
本申请实施例中的第一阈值和第二阈值可以为协议预定义的值或者第一设备确定的值。
本申请实施例中的第一区域由第一传感器的第一检测误差和第二传感器的第三检测误差确定,这是基于如下事实:
以第一传感器为雷达为例，雷达在垂直方向上的测量精度高于其在水平方向上的测量精度(换言之，雷达在Y方向上测距准，测X方向/方位角的准确度低于其在Y方向上的准确度)。以第二传感器为摄像头为例，摄像头的方位角测量精度高于其测距精度(单目摄像头(camera)测方位角准，测距相对不准)。
示例性的，如图8所示，第一区域为由线条1和线条2与线条3和线条4相交所围成的闭合区域，其中，线条1和线条2为在垂直方向上与第一目标的位置之间的距离为第一传感器在垂直方向上的测距误差的平行线；线条3和线条4相交，且线条3和线条4与经过第二目标的位置的线条之间的夹角均为第二传感器的测角误差。
本申请实施例提供一种目标检测方法，该方法中第一设备确定第一目标区域和第二目标区域的重叠区域的重叠度大小，当重叠度大小大于或等于第一阈值时，第一设备根据第一目标的位置和第二目标的位置与第一区域之间的关系，和/或，根据第一目标的位置和第二目标的位置之间的距离与第二阈值之间的关系，确定第一目标和第二目标为同一个目标的置信度为目标置信度。这样，在重叠度大小大于或等于第一阈值的基础上，进一步利用上述位置关系和/或距离关系确定置信度，可以提高判断第一目标和第二目标是否为同一个目标的准确性，降低误判。
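步骤701~703的判定流程可以用如下Python草图串起来(这里将第一区域也近似为轴对齐矩形，置信度取值0.9/0.3与各阈值均为本文为说明而假设的示例参数，并非本申请的限定)：

```python
import math


def target_confidence(p1, p2, region1, t1_iou, iou, t2_dist,
                      conf_high=0.9, conf_low=0.3):
    """根据重叠度、第一区域包含关系与目标间距离，给出两目标为同一目标的置信度。

    p1/p2: 第一、第二目标的位置 (x, y)；
    region1: 第一区域，近似为矩形 (x_min, x_max, y_min, y_max)；
    iou: 步骤702算出的重叠度；t1_iou: 第一阈值；t2_dist: 第二阈值。
    """
    if iou < t1_iou:
        return 0.0  # 重叠度不足，认为不是同一个目标
    x_min, x_max, y_min, y_max = region1
    both_inside = all(x_min <= x <= x_max and y_min <= y <= y_max
                      for x, y in (p1, p2))
    dist = math.hypot(p1[0] - p2[0], p1[1] - p2[1])
    if both_inside and dist <= t2_dist:
        return conf_high  # 两个位置都在第一区域内且距离近：高置信度
    return conf_low       # 否则：低置信度
```

例如，两目标位置都落在第一区域内且间距小于第二阈值时返回高置信度；重叠度不足第一阈值时直接返回0。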
可选的,第一传感器和第二传感器可以对同一个范围内的目标进行检测,换言之,第一目标的位置和第二目标的位置位于同一个范围。
本申请实施例中第一设备确定第一目标和第二目标为同一个目标的置信度之后，便于辅助后续执行目标融合。由于不同传感器的测量误差(也可以称为：检测误差)有所不同，第一目标由第一传感器检测，而第二目标由第二传感器检测，如果第一设备确定第一目标和第二目标为同一个目标，则后续可以对第一目标的位置和第二目标的位置进行融合，以得到融合后该目标的最终位置。而如果第一设备确定第一目标和第二目标不是同一个目标，则后续无须将第一目标的位置和第二目标的位置进行融合。
本申请实施例中如果重叠度大小小于第一阈值，则第一设备可以确定第一目标和第二目标不是同一个目标。
作为一种可能的示例,本申请实施例中的第一传感器为雷达,第二传感器为摄像头。例如,雷达可以为毫米波雷达。摄像头可以为:单目摄像头。
本申请实施例中的毫米波雷达安装在车辆的中前部,用于通过使用毫米波检测诸如其它车辆和行人的目标。毫米波雷达在沿水平面扫描的同时从车辆向前方发送毫米波,并接收反射回来的毫米波,从而将传输和接收的数据以雷达信号的形式传输至第一设备,或者毫米波雷达将传输和接收的数据以雷达信号的形式传输至车辆,由车辆传输给服务器。单目摄像头包括电荷耦合器件(charge coupled device,CCD)摄像头,并且该单目摄像头安装在车辆的中前部。单目摄像头将捕捉的图像的数据以图像信号的形式传输至第一设备,或者单目摄像头将捕捉的图像的数据以图像信号的形式传输至车辆,由车辆传输至服务器。
本申请实施例中第一目标的位置和第二目标的位置与第一区域之间的关系包括:第一目标的位置和第二目标的位置均位于第一区域内。或者,第一目标的位置和第二目标的位置与第一区域之间的关系包括:第一目标的位置和第二目标的位置中的一个或多个位于第一区域外。
本申请实施例中第一目标的位置和第二目标的位置之间距离与第二阈值之间的关系包括:第一目标的位置和第二目标的位置之间的距离大于或等于第二阈值,和,第一目标的位置和第二目标的位置之间的距离小于或等于第二阈值。
在一种可能的实现方式中,本申请实施例的步骤703中当重叠度大小大于或等于第一阈值时,第一设备根据第一目标的位置和第二目标的位置与第一区域之间的关系,确定第一目标和第二目标为同一个目标的置信度为目标置信度,包括:当重叠度大小大于或等于第一阈值时,且第一目标的位置和第二目标的位置均位于第一区域内,第一设备确定第一目标和第二目标为同一个目标的置信度为目标置信度,该目标置信度大于或等于第一置信度阈值。
可以理解的是,本申请实施例中的目标置信度大于或等于第一置信度阈值可以替换为:第一目标和第二目标为同一个目标。本申请实施例中的第一置信度阈值可以根据需要设置,本申请实施例对此不做限定。
如图9a所示,以第一目标为目标A和第二目标为目标B为例,目标A的位置和目标B的位置均位于第一区域内,且重叠区域(图9a中的阴影部分R ir)的重叠度大小大于或等于第一阈值,此时,第一设备可以确定目标A和目标B为同一个目标的置信度为目标置信度,且目标置信度大于或等于第一置信度阈值。或者第一设备确定目标A和目标B为同一个目标。
作为一种可能的实现方式,当重叠度大小大于或等于第一阈值时,第一目标的位置和第二目标的位置均位于第一区域内时,目标置信度为第一置信度。当重叠度大小大于或等于第一阈值时,第一目标的位置和第二目标的位置中一个或多个位于第一区域外时,目标置信度为第二置信度。换言之,当重叠度大小大于或等于第一阈值时, 第一目标的位置和第二目标的位置均位于第一区域内时的目标置信度高于第一目标的位置和第二目标的位置中至少一个位于第一区域外时的目标置信度。
如图9b所示，以第一目标为目标A，第二目标为目标B为例，示出了目标B位于不同位置时，位于不同位置的目标B和目标A为同一个目标的置信度的情况，例如，目标B位于位置Q(X21, Y21)，或者目标B位于位置Q'(X22, Y22)，位置Q'位于第一区域外，而位置Q位于第一区域内。假设目标B位于位置Q时，由目标B的位置Q和第二传感器的检测误差确定的第二目标区域为Ri，第一目标区域Rr与第二目标区域Ri之间的重叠区域(图9b中的阴影部分)为Rir。假设目标B位于位置Q'时，由目标B的位置Q'和第二传感器的检测误差确定的第二目标区域为Ri'，Ri'与第一目标区域Rr之间的重叠区域(图9b中的阴影部分)为Rir'。如果Rir的重叠度大小和Rir'的重叠度大小均大于或等于第一阈值，也即无论目标B位于位置Q还是位置Q'，Ri和Ri'分别与第一目标区域的重叠区域的重叠度大小均大于或等于第一阈值。但是，目标B在位置Q时，目标B的位置位于第一区域内，而目标B在位置Q'时，目标B的位置位于第一区域外，则目标A与位于位置Q的目标B为同一个目标的目标置信度高于目标A与位于位置Q'的目标B为同一个目标的目标置信度。
在一种可能的实现方式中,本申请实施例的步骤703中当重叠度大小大于或等于第一阈值时,第一设备根据第一目标的位置和第二目标的位置与第一区域之间的关系,确定第一目标和第二目标为同一个目标的置信度为目标置信度,包括:当重叠度大小大于或等于第一阈值时,如果第一目标的位置和第二目标的位置中的一个或多个位于第一区域外,第一设备确定第一目标和第二目标为同一个目标的置信度为目标置信度,且目标置信度小于或等于第二置信度阈值。
本申请实施例中第一目标和第二目标为同一个目标的置信度为目标置信度,且目标置信度小于或等于第二置信度阈值可以替换为:第一目标和第二目标不是同一个目标。第一置信度阈值可以和第二置信度阈值相同或者不同,本申请实施例对此不做限定。
举例说明,如图9b所示,以第一目标为位于位置P的目标A,第二目标为位于位置Q’的目标B为例,从图9b中可以看出目标A的位置P位于第一区域内,而目标B的位置Q’位于第一区域外,则第一设备可以确定目标A和目标B为同一个目标的置信度为目标置信度,且目标置信度小于或等于第二置信度阈值。
在一种可能的实现方式中,本申请实施例的步骤703中当重叠度大小大于或等于第一阈值时,第一设备根据第一目标的位置和第二目标的位置之间距离与第二阈值之间的关系,确定第一目标和第二目标为同一个目标的置信度为目标置信度,包括:当重叠度大小大于或等于第一阈值时,第一目标的位置和第二目标的位置之间距离小于或等于第二阈值,则第一设备确定第一目标和第二目标为同一个目标的置信度为目标置信度,且目标置信度大于或等于第一置信度阈值。
举例说明，如图10a所示，第一目标区域和第二目标区域之间的重叠区域(例如，图10a中的阴影部分Rir)的重叠度大于或等于第一阈值，且第一目标的位置和第二目标的位置之间的距离小于或等于第二阈值，则第一目标和第二目标为同一个目标的置信度为目标置信度，且目标置信度大于或等于第一置信度阈值。
本申请实施例中计算两个目标的位置之间距离的方式可以参考现有技术,此处不再赘述。
在一种可能的实现方式中,当重叠度大小大于或等于第一阈值时,第一目标的位置和第二目标的位置之间距离小于或等于第二阈值时,第一目标和第二目标为同一个目标的置信度为第三置信度。当重叠度大小大于或等于第一阈值时,第一目标的位置和第二目标的位置之间距离大于第二阈值时,第一目标和第二目标为同一个目标的置信度为第四置信度。第三置信度大于第四置信度。
举例说明，如图10b所示，第一目标为目标A，第二目标为目标B，图10b示出了目标A在不同位置(例如，目标A的位置可以为位置P(X11, Y11)，或者位置P'(X12, Y12))时，位于不同位置的目标A和同一个目标B为同一个目标的置信度。假设目标A在位置P时，由目标A的位置P和第一传感器的检测误差确定的第一目标区域为Rr，Rr与第二目标区域Ri之间的重叠区域为Rri，Rri的重叠度大小为S1。假设目标A在位置P'时，由目标A的位置P'和第一传感器的检测误差确定的第一目标区域为Rr'，Rr'与第二目标区域Ri之间的重叠区域为Rri'，Rri'的重叠度大小为S2。假设S1和S2均大于或等于第一阈值，也即位于位置P的目标A对应的第一目标区域Rr和位于位置P'的目标A对应的第一目标区域Rr'分别与第二目标区域的重叠区域的重叠度大小均大于或等于第一阈值。如果目标A在位置P时，目标A的位置与目标B的位置之间的距离为L1；目标A在位置P'时，目标A与目标B之间的距离为L2，L1小于L2，也即相比于位于位置P'的目标A，目标B与位于位置P的目标A之间的距离比较近。因此，目标B与位于位置P的目标A为同一个目标的第三置信度大于目标B与位于位置P'的目标A为同一个目标的第四置信度。
举例说明，如图10c所示，以第二目标为目标B，第一目标为目标A为例，图10c示出了位于不同位置的目标B(例如，目标B位于位置Q(X21, Y21)，或者位于位置Q'(X22, Y22))与同一个目标A为同一个目标的置信度。假设目标B在位置Q时，由目标B的位置Q和第二传感器的检测误差确定的第二目标区域为Ri，第一目标区域Rr与第二目标区域Ri之间的重叠区域为Rir，Rir的重叠度大小为S3。当目标B在位置Q'时，由目标B的位置Q'和第二传感器的检测误差确定的第二目标区域为Ri'，Ri'与第一目标区域Rr之间的重叠区域为Rir'，Rir'的重叠度大小为S4。如果S3、S4均大于或等于第一阈值，也即位于位置Q的目标B对应的Ri和位于位置Q'的目标B对应的Ri'分别与第一目标区域的重叠区域的重叠度大小均大于或等于第一阈值。如果目标B在位置Q时，目标A的位置与位置Q之间的距离为L3；而目标B位于位置Q'时，目标A的位置与位置Q'之间的距离为L4，L3小于L4，也即相比于位于位置Q'的目标B，目标A的位置与位于位置Q的目标B之间的距离比较近。因此，目标A与位于位置Q的目标B为同一个目标的置信度大于目标A与位于位置Q'的目标B为同一个目标的置信度。
由图10b和图10c可以得出当重叠度大小大于或等于第一阈值时,任意两个目标的位置之间的距离越近(例如,小于或等于第二阈值),则该任意两个目标为同一个目标的可能性越高。
需要说明的是，图10b和图10c分别以一个目标位于不同位置，计算同一个目标和位于不同位置的目标之间的置信度为例，可以理解的是，位于位置Q的目标B和位于位置Q'的目标B可能不是同一个目标。
在一种可能的实现方式中，本申请实施例的步骤703中当重叠度大小大于或等于第一阈值时，第一设备根据第一目标的位置和第二目标的位置之间距离与第二阈值之间的关系，确定第一目标和第二目标为同一个目标的置信度为目标置信度，包括：当重叠度大小大于或等于第一阈值时，第一目标的位置和第二目标的位置之间距离大于第二阈值，第一设备确定第一目标和第二目标为同一个目标的置信度为目标置信度，且目标置信度小于或等于第二置信度阈值。
举例说明，如图10c所示，以第一目标为位于位置P的目标A，第二目标为位于位置Q'的目标B为例，从图10c中可以看出目标A的位置和位于位置Q'的目标B的位置之间的距离为L4，L4大于第二阈值，则目标A和目标B为同一个目标的置信度小于或等于第二置信度阈值。
在一种可能的实现方式中,本申请实施例中的步骤703中第一设备根据第一目标的位置和第二目标的位置与第一区域之间的关系,和,根据第一目标的位置和第二目标的位置之间的距离与第二阈值之间的关系,确定第一目标和第二目标为同一个目标的目标置信度可以通过以下方式实现:当重叠度大小大于或等于第一阈值时,第一目标的位置和第二目标的位置均位于第一区域内且第一目标的位置和第二目标的位置之间距离小于或等于第二阈值,则第一设备确定第一目标和第二目标为同一个目标的置信度为目标置信度,且目标置信度大于或等于第一置信度阈值。
如图11所示,图11与图9a的区别在于:在图11中目标A的位置和目标B的位置不仅位于重叠区域内,且目标A的位置和目标B的位置之间的距离小于或等于第二阈值,而在图9a中目标A的位置和目标B的位置虽然位于重叠区域内,但是可能目标A的位置和目标B的位置之间的距离不一定小于或等于第二阈值,因此,图11中目标A和目标B为同一个目标的置信度高于图9a中目标A和目标B为同一个目标的置信度。此外,图11中重叠区域的面积大于图9a中重叠区域的面积,因此,对比图11和图9a可以确定图11中目标A和目标B为同一目标的可能性更高。这也符合图11中所示的实际情况,图11中目标A的位置和目标B的位置之间的距离小于图9a中目标A的位置和目标B的位置之间的距离,换言之图11中目标A的位置和目标B的位置之间的距离相比于图9a中目标A的位置和目标B的位置之间的距离更近。
需要说明的是,以目标b’为位于位置1的目标b,以目标b”为位于位置2的目标b,位置1和位置2不同为例,本申请实施例中在目标a和目标b’均位于第一区域,且目标a的位置和目标b”的位置也位于第一区域的情况下,如果目标a和目标b’之间对应的重叠区域面积大于目标a和目标b”对应的重叠区域面积,目标a和目标b’之间对应的重叠区域的重叠度大小和目标a和目标b”对应的重叠区域的重叠度大小均大于或等于第一阈值,则目标a和目标b’为同一个目标的置信度高于目标a和目标b”为同一个目标的置信度。
需要说明的是，在目标a和目标b’之间对应的重叠区域面积大于目标a和目标b”对应的重叠区域面积，且目标a和目标b’之间对应的重叠区域的重叠度大小和目标a和目标b”对应的重叠区域的重叠度大小均大于或等于第一阈值的情况下，如果目标a和目标b’之间的距离大于目标a和目标b”之间的距离，则目标a和目标b”为同一个目标的置信度高于目标a和目标b’为同一个目标的置信度。
在一种可能的实现方式中,当重叠度大小大于或等于第一阈值时,第一目标的位置和第二目标的位置均位于第一区域内且第一目标的位置和第二目标的位置之间距离小于或等于第二阈值,第一目标和第二目标为同一个目标的置信度为第五置信度。当重叠度大小大于或等于第一阈值时,第一目标的位置和第二目标的位置中至少一个位于第一区域外,和/或,第一目标的位置和第二目标的位置之间距离大于第二阈值,第一目标和所述第二目标为同一个目标的置信度为第六置信度。第五置信度大于第六置信度。
本申请实施例中的第一目标的位置和第二目标的位置中至少一个位于第一区域外也可以替换为:第一目标的位置和第二目标的位置中的最多一个位于第一区域内。该描述包括如下情况:第一目标的位置位于第一区域内,而第二目标的位置位于第一区域外,或者第二目标的位置位于第一区域内,而第一目标的位置位于第一区域外。或者第一目标的位置和第二目标的位置均位于第一区域外。
可以理解的是,当重叠度大小大于或等于第一阈值时,第一目标的位置位于第一区域内,而第二目标的位置位于第一区域外,以及第一目标的位置和第二目标的位置之间距离大于第二阈值,则第一目标和第二目标为同一个目标的置信度为第六置信度。或者,当重叠度大小大于或等于第一阈值时,第一目标的位置位于第一区域内,而第二目标的位置位于第一区域外,则第一目标和第二目标为同一个目标的置信度为第六置信度。或者,当重叠度大小大于或等于第一阈值时,第一目标的位置和第二目标的位置之间距离大于第二阈值则第一目标和所述第二目标为同一个目标的置信度为第六置信度。
或者，当重叠度大小大于或等于第一阈值时，第二目标的位置位于第一区域内，而第一目标的位置位于第一区域外，且第一目标的位置和第二目标的位置之间距离大于第二阈值，则第一目标和第二目标为同一个目标的置信度为第六置信度。或者，当重叠度大小大于或等于第一阈值时，第二目标的位置位于第一区域内，而第一目标的位置位于第一区域外，则第一目标和所述第二目标为同一个目标的置信度为第六置信度。
当重叠度大小大于或等于第一阈值时,第一目标的位置和第二目标的位置均位于第一区域外以及第一目标的位置和第二目标的位置之间距离大于第二阈值,则第一目标和第二目标为同一个目标的置信度为第六置信度。或者,当重叠度大小大于或等于第一阈值时,第一目标的位置和第二目标的位置均位于第一区域外,则第一目标和第二目标为同一个目标的置信度为第六置信度。
在一种可能的实施例中，本申请实施例为了进一步提高判断第一目标和第二目标是否为同一个目标的准确性，可以根据第一传感器的检测误差或第二传感器的检测误差确定第一阈值。
作为一种可能的实施例,本申请实施例中的第一阈值由所述第二检测误差确定,或者所述第一阈值由所述第四检测误差确定。
作为一种示例,第一阈值随着第二检测误差的增大而增大。
作为另一种示例,第一阈值随着第四检测误差的增大而增大。
一种可能的实现,第二检测误差为第一误差时,第一阈值为第一值,第二检测误差为第二误差时,第一阈值为第二值,其中,第一误差大于第二误差,第一值大于第二值。
举例说明，以第二检测误差为第一传感器在水平方向上的测距误差为例，对比图9a和图12，在图12中第一传感器在水平方向上的测距误差(EXr)为第一误差，图9a中第一传感器在水平方向上的测距误差(EXr)为第二误差，第一误差大于第二误差。对比图9a和图12可以发现，当第一传感器在水平方向上的测距误差(EXr)增大时，第一目标区域和第二目标区域的重叠区域的重叠面积将逐渐增大(例如，图12中的第一目标区域Rr的面积大于图9a中第一目标区域Rr的面积)。因此，如果第一阈值固定不变，将可能降低使用重叠区域的重叠度大小与第一阈值之间的关系判断第一目标和第二目标是否为同一个目标的准确度。基于此，需要根据第一传感器在水平方向上的测距误差动态调整第一阈值。
如图12所示，第一传感器在水平方向上的测距误差(EXr)达到一定阈值a时，第一目标区域在水平方向上覆盖了整个第一区域。之后随着第一传感器在水平方向上的测距误差(EXr)继续增大，第一目标区域虽然继续变大，但不会再影响第一目标区域和第二目标区域的重叠区域的重叠面积，也即即使EXr在阈值a的基础上继续增大，重叠区域的重叠面积也不发生变化，重叠面积处于“饱和”状态。此时，对于图12中的情形可以将第一阈值设置为第一值，对于图9a的情形，可以将第一阈值设置为第二值，以减少这种“饱和”带来的影响。
也可以理解的是,第一值为重叠面积处于“饱和”状态时设置的值,而第二值为重叠面积未处于“饱和”状态时设置的值。
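第一阈值随第二检测误差动态调整的思想，可以用一个简单的分段函数草图表示(阈值a与各取值均为本文假设的示意参数)：

```python
def first_threshold(exr, saturate_err, val_saturated, val_normal):
    """根据第一传感器水平方向测距误差EXr选择第一阈值。

    当EXr达到阈值a(saturate_err)、重叠面积进入“饱和”状态时，
    采用较大的第一值val_saturated；否则采用较小的第二值val_normal。
    """
    return val_saturated if exr >= saturate_err else val_normal
```

例如，EXr超过阈值a时取第一值，未超过时取第二值，从而抵消重叠面积“饱和”对判定的影响。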
可以理解的是,本申请实施例中第一设备中具有多个第二检测误差以及多个第二检测误差中每个第二检测误差对应的值。这样在第二检测误差为第一误差时,第一设备便可以将第一误差对应的值作为第一阈值。
作为另一种可能的实现,第四检测误差为第三误差时,第一阈值为第三值,第四检测误差为第四误差时,第一阈值为第四值,其中,第三误差大于第四误差,第三值大于第四值。
举例说明，以第四检测误差为第二传感器在垂直方向上的测距误差为例，对比图9a和图13，在图13中第二传感器在垂直方向上的测距误差(EYi)为第三误差，图9a中第二传感器在垂直方向上的测距误差(EYi)为第四误差，第三误差大于第四误差。对比图9a和图13可以发现，当第二传感器在垂直方向上的测距误差(EYi)增大时，第一目标区域和第二目标区域的重叠区域的重叠面积也将逐渐增大(例如，图13中的第二目标区域Ri的面积大于图9a中第二目标区域Ri的面积)。因此，如果第一阈值固定不变，将可能降低使用重叠区域的重叠度大小与第一阈值之间的关系判断第一目标和第二目标是否为同一个目标的准确度。基于此，需要根据第二传感器在垂直方向上的测距误差(EYi)动态调整第一阈值。
如图13所示，当第二传感器在垂直方向上的测距误差(EYi)达到一定的阈值b时，第二目标区域在垂直方向上覆盖了整个第一区域。之后随着第二传感器在垂直方向上的测距误差(EYi)继续增大，第二目标区域虽然继续变大，但不会再影响第一目标区域和第二目标区域的重叠区域的重叠面积，也即重叠面积处于“饱和”状态。此时，对于图13中的情形可以将第一阈值设置为第三值，对于图9a的情形，可以将第一阈值设置为第四值，以减少这种“饱和”带来的影响。
可以理解的是，本申请实施例中第一设备中具有多个第四检测误差以及每个第四检测误差对应的值，这样在第四检测误差为第三误差时，第一设备便可以将第三误差对应的值作为第一阈值。例如，第一设备中具有多个第二传感器在垂直方向上的测距误差(EYi)以及每个测距误差(EYi)对应的值，这样在第二传感器在垂直方向上的测距误差(EYi)为第三误差时，第一设备便可以将第三误差对应的值作为第一阈值。
需要说明的是,图12以第一传感器在水平方向上的测距误差增大为例进行描述,图13以第二传感器在垂直方向上的测距误差增大为例进行描述,当第一传感器在水平方向上的测距误差和第二传感器在垂直方向上的测距误差均增大时,第一阈值也可以由第一传感器在水平方向上的测距误差和第二传感器在垂直方向上的测距误差共同确定。例如,第一阈值随着第一传感器在水平方向上的测距误差和第二传感器在垂直方向上的测距误差增大而变化(例如,增大,如图14所示)。对比图14和图9a可知,在图14中不仅第一传感器在水平方向上的测距误差大于图9a中第一传感器在水平方向上的测距误差,且图14中第二传感器在垂直方向上的测距误差也大于图9a中第二传感器在垂直方向上的测距误差,因此在图14所示的情况下的第一阈值应该大于图9a的情况对应的第一阈值。或者,当第一传感器在水平方向上的测距误差和第二传感器在垂直方向上的测距误差均增大的情况下,第一阈值为a,或第一阈值为b,其中,a大于b,a为重叠区域覆盖第一区域时设定的参数,b为重叠区域未覆盖第一区域时设定的参数。
当然,也可以理解的是,第一阈值随着第一传感器的第二检测误差减小而降低,第一阈值随着第二传感器的第四检测误差的减小而降低。
在一种可能的实施例中，本申请实施例中的第二阈值由第二检测误差以及第四检测误差共同确定。例如，第二阈值随着第二检测误差以及第四检测误差的增大而减小，或者，第二阈值随着第二检测误差以及第四检测误差的减小而增大。
具体的,第一传感器的第二检测误差为第五误差,第四检测误差为第六误差,第二阈值为第五值,第一传感器的第二检测误差为第七误差,第四检测误差为第八误差,第二阈值为第六值,其中,第五值小于第六值,第五误差大于第七误差,第六误差大于第八误差。
举例说明，如图14所示，以第二检测误差为第一传感器在水平方向上的测距误差，第四检测误差为第二传感器在垂直方向上的测距误差为例，图14中第一传感器在水平方向上的测距误差(例如，第五误差)和第二传感器在垂直方向上的测距误差(例如，第六误差)分别大于图9a中第一传感器在水平方向上的测距误差(例如，第七误差)和第二传感器在垂直方向上的测距误差(例如，第八误差)。图14中，第一目标区域和第二目标区域的重叠区域(图14中的阴影部分)覆盖了整个第一区域，这时重叠区域的重叠面积达到“饱和”状态，也即后续即使第一传感器在水平方向上的测距误差和第二传感器在垂直方向上的测距误差继续增大，重叠区域的重叠面积也不再发生变化。对比图14和图9a可以发现，图14中第一目标的位置和第二目标的位置之间的距离小于图9a中第一目标的位置和第二目标的位置之间的距离。因此，为了降低误判，图9a中第一传感器在水平方向上的测距误差为第七误差且第二传感器在垂直方向上的测距误差为第八误差时对应的第二阈值，应大于图14中第一传感器在水平方向上的测距误差为第五误差且第二传感器在垂直方向上的测距误差为第六误差时对应的第二阈值，以降低这种饱和带来的影响。
换言之,第二阈值由重叠区域和第一区域之间的关系确定。具体的,第二阈值为第七值,或第二阈值为第八值,其中,第七值大于第八值,第八值为重叠区域覆盖第一区域时设定的参数,第七值为重叠区域未覆盖第一区域时设定的参数。
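依据重叠区域是否覆盖第一区域来选取第二阈值的规则，可以用如下Python草图表示(矩形近似与第七值/第八值均为本文假设的示意参数)：

```python
def covers(outer, inner):
    """判断矩形outer是否完全覆盖矩形inner，矩形为(x_min, x_max, y_min, y_max)。"""
    return (outer[0] <= inner[0] and outer[1] >= inner[1]
            and outer[2] <= inner[2] and outer[3] >= inner[3])


def second_threshold(overlap_rect, region1, val7, val8):
    """第二阈值的选取：重叠区域覆盖第一区域(“饱和”)时取较小的第八值val8，
    否则取较大的第七值val7(第七值大于第八值)。"""
    return val8 if covers(overlap_rect, region1) else val7
```

例如，重叠区域(0, 4, 0, 4)完全覆盖第一区域(1, 2, 1, 2)时取第八值；未覆盖时取第七值。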
需要说明的是,图8~图14中均以第一传感器的第二检测误差为第一传感器在水平方向上的测距误差为例。
可以理解的是,本申请实施例中第一设备确定的可以是第一目标和第二目标为同一个目标的置信度,例如,置信度为90%或者置信度为80%。当然,第一设备确定的也可以是第一目标和第二目标为同一个目标或者不是同一个目标的指示值。例如,第一设备输出的是第一指示符或第二指示符,其中,第一指示符表示第一目标和第二目标为同一个目标。第二指示符表示第一目标和第二目标不是同一个目标。
作为一种可能的实现方式，本申请实施例提供的方法还可以包括：第一设备根据目标置信度确定第一目标和第二目标是否为同一个目标。例如，第一设备确定目标置信度大于或等于第一置信度阈值，则确定第一目标和第二目标为同一个目标；第一设备确定目标置信度小于或等于第二置信度阈值，则确定第一目标和第二目标不是同一个目标。
上述主要从第一设备的角度对本申请实施例的方案进行了介绍。可以理解的是,第一设备等为了实现上述功能,其包括了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,本申请能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
本申请实施例可以根据上述方法示例第一设备进行功能单元的划分,例如,可以对应各个功能划分各个功能单元,也可以将两个或两个以上的功能集成在一个处理单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。需要说明的是,本申请实施例中对单元的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
上面结合图7至图14，对本申请实施例的方法进行了说明，下面对本申请实施例提供的执行上述方法的目标检测装置进行描述。本领域技术人员可以理解，方法和装置可以相互结合和引用，本申请实施例提供的一种目标检测装置可以执行上述目标检测方法中由第一设备执行的步骤。
下面以采用对应各个功能划分各个功能模块为例进行说明:
在采用集成的单元的情况下,图15示出了上述实施例中所涉及的一种目标检测装置,该目标检测装置可以包括:第一确定模块1501,计算模块1502以及第二确定模块1503。
一种示例,该目标检测装置为第一设备,或者为应用于第一设备中的芯片。在这种情况下,第一确定模块1501,用于支持该目标检测装置执行上述实施例中由第一设备执行的步骤701。计算模块1502,用于支持目标检测装置执行上述实施例中由第一设备执行的步骤702。第二确定模块1503,用于支持该目标检测装置执行上述实施例中由第一设备执行的步骤703。
该目标检测装置还可以包括存储模块。该存储模块,用于存储计算机程序代码,计算机程序代码包括指令。如果目标检测装置为应用于第一设备中的芯片时,该存储模块可以是该芯片内的存储模块(例如,寄存器、缓存等),也可以是该第一设备内的位于该芯片外部的存储模块(例如,只读存储器、随机存取存储器等)。
其中,第一确定模块1501、计算模块1502和第二确定模块1503可以集成在处理器或控制器上,例如处理器或控制器可以是中央处理器单元,通用处理器,数字信号处理器,专用集成电路,现场可编程门阵列或者其他可编程逻辑器件、晶体管逻辑器件、硬件部件或者其任意组合。其可以实现或执行结合本发明公开内容所描述的各种示例性的逻辑方框,模块和电路。处理器也可以是实现计算功能的组合,例如包含一个或多个微处理器组合,数字信号处理器和微处理器的组合等等。
例如，本申请所涉及的目标检测装置可以为图6所示的目标检测装置。此时，第一确定模块1501、计算模块1502和第二确定模块1503可以集成在处理器41上。
在采用集成的单元的情况下,图16示出了上述实施例中所涉及的目标检测装置的一种可能的逻辑结构示意图。该目标检测装置包括:处理单元1601。处理单元1601用于对目标检测装置的动作进行控制管理,例如,处理单元1601用于执行在目标检测装置中进行信息/数据处理的步骤。
在一种可能的实施例中，目标检测装置还可以包括存储单元1602，用于存储目标检测装置的程序代码和数据。在一种可能的实施例中，当目标检测装置为服务器时，目标检测装置还可以包括通信单元1603。通信单元1603用于支持目标检测装置进行信息/数据发送或者接收的步骤。
示例性的,目标检测装置为第一设备,或者为应用于第一设备中的芯片。在这种情况下,处理单元1601,用于支持目标检测装置执行上述实施例中由第一设备执行的步骤701~步骤703。
图17是本申请实施例提供的芯片150的结构示意图。芯片150包括一个或两个以上(包括两个)处理器1510和通信接口1530。
可选的，该芯片150还包括存储器1540，存储器1540可以包括只读存储器和随机存取存储器，并向处理器1510提供操作指令和数据。存储器1540的一部分还可以包括非易失性随机存取存储器(non-volatile random access memory，NVRAM)。
在一些实施方式中,存储器1540存储了如下的元素,执行模块或者数据结构,或者他们的子集,或者他们的扩展集。
在本申请实施例中,通过调用存储器1540存储的操作指令(该操作指令可存储在操作系统中),执行相应的操作。
一种可能的实现方式中：第一设备的结构与芯片150的结构类似，不同的装置可以使用不同的芯片以实现各自的功能。
处理器1510控制第一设备的处理操作,处理器1510还可以称为中央处理单元(central processing unit,CPU)。
存储器1540可以包括只读存储器和随机存取存储器，并向处理器1510提供指令和数据。存储器1540的一部分还可以包括非易失性随机存取存储器(non-volatile random access memory，NVRAM)。例如应用中处理器1510、通信接口1530以及存储器1540通过总线系统1520耦合在一起，其中总线系统1520除包括数据总线之外，还可以包括电源总线、控制总线和状态信号总线等。但是为了清楚说明起见，在图17中将各种总线都标为总线系统1520。
上述本申请实施例揭示的方法可以应用于处理器1510中,或者由处理器1510实现。处理器1510可能是一种集成电路芯片,具有信号的处理能力。在实现过程中,上述方法的各步骤可以通过处理器1510中的硬件的集成逻辑电路或者软件形式的指令完成。上述的处理器1510可以是通用处理器、数字信号处理器(digital signal processing,DSP)、专用集成电路(application specific integrated circuit,ASIC)、现成可编程门阵列(field-programmable gate array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件。可以实现或者执行本申请实施例中的公开的各方法、步骤及逻辑框图。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。结合本申请实施例所公开的方法的步骤可以直接体现为硬件译码处理器执行完成,或者用译码处理器中的硬件及软件模块组合执行完成。软件模块可以位于随机存储器,闪存、只读存储器,可编程只读存储器或者电可擦写可编程存储器、寄存器等本领域成熟的存储介质中。该存储介质位于存储器1540,处理器1510读取存储器1540中的信息,结合其硬件完成上述方法的步骤。
一种可能的实现方式中,处理器1510用于执行图7所示的实施例中的第一设备执行的处理的步骤。
以上通信单元可以是一种该装置的接口电路或通信接口,用于从其它装置接收信号。例如,当该装置以芯片的方式实现时,该通信单元是该芯片用于从其它芯片或装置接收信号或发送信号的接口电路或通信接口。
在上述实施例中,存储器存储的供处理器执行的指令可以以计算机程序产品的形式实现。计算机程序产品可以是事先写入在存储器中,也可以是以软件形式下载并安装在存储器中。
计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行计算机程序指令时，全部或部分地产生按照本申请实施例的流程或功能。计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。计算机指令可以存储在计算机可读存储介质中，或者从一个计算机可读存储介质向另一计算机可读存储介质传输，例如，计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。计算机可读存储介质可以是计算机能够存储的任何可用介质或者是包括一个或多个可用介质集成的服务器、数据中心等数据存储设备。可用介质可以是磁性介质(例如，软盘、硬盘、磁带)、光介质(例如，DVD)、或者半导体介质(例如固态硬盘(solid state disk，SSD))等。
一方面,提供一种计算机可读存储介质,计算机可读存储介质中存储有指令,当指令被运行时,使得第一设备或者应用于第一设备中的芯片执行实施例中的步骤701、步骤702以及步骤703。
前述的可读存储介质可以包括:U盘、移动硬盘、只读存储器、随机存取存储器、磁碟或者光盘等各种可以存储程序代码的介质。
一方面,提供一种包括指令的计算机程序产品,计算机程序产品中存储有指令,当指令被运行时,使得第一设备或者应用于第一设备中的芯片执行实施例中的步骤701、步骤702以及步骤703。
一方面,提供一种芯片,该芯片应用于第一设备中,芯片包括至少一个处理器和通信接口,通信接口和至少一个处理器耦合,处理器用于运行指令,以执行实施例中的步骤701、步骤702以及步骤703。
在上述实施例中,可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。当使用软件程序实现时,可以全部或部分地以计算机程序产品的形式来实现。该计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行计算机程序指令时,全部或部分地产生按照本申请实施例的流程或功能。计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,计算机指令可以从一个网站站点、计算机、服务器或者数据中心通过有线(例如同轴电缆、光纤、数字用户线(digital subscriber line,简称DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包括一个或多个可以用介质集成的服务器、数据中心等数据存储设备。可用介质可以是磁性介质(例如,软盘、硬盘、磁带),光介质(例如,DVD)、或者半导体介质(例如固态硬盘(solid state disk,简称SSD))等。
尽管在此结合各实施例对本申请进行了描述,然而,在实施所要求保护的本申请过程中,本领域技术人员通过查看附图、公开内容、以及所附权利要求书,可理解并实现公开实施例的其他变化。在权利要求中,“包括”(comprising)一词不排除其他组成部分或步骤,“一”或“一个”不排除多个的情况。单个处理器或其他单元可以实现权利要求中列举的若干项功能。相互不同的从属权利要求中记载了某些措施,但这并不表示这些措施不能组合起来产生良好的效果。
尽管结合具体特征及其实施例对本申请进行了描述，显而易见的，在不脱离本申请的精神和范围的情况下，可对其进行各种修改和组合。相应地，本说明书和附图仅仅是所附权利要求所界定的本申请的示例性说明，且视为已覆盖本申请范围内的任意和所有修改、变化、组合或等同物。显然，本领域的技术人员可以对本申请进行各种改动和变型而不脱离本申请的精神和范围。这样，倘若本申请的这些修改和变型属于本申请权利要求及其等同技术的范围之内，则本申请也意图包括这些改动和变型在内。

Claims (25)

  1. 一种目标检测方法,其特征在于,包括:
    确定第一目标区域以及第二目标区域的重叠区域,所述第一目标区域由第一传感器的检测误差以及由所述第一传感器检测到的第一目标的位置确定,所述第二目标区域由第二传感器的检测误差以及由所述第二传感器检测到的第二目标的位置确定,所述第一目标区域包括所述第一目标的位置,所述第二目标区域包括所述第二目标的位置;
    计算所述重叠区域占所述第一目标区域以及所述第二目标区域并集的重叠度大小;
    当所述重叠度大小大于或等于第一阈值时,根据所述第一目标的位置和所述第二目标的位置与第一区域之间的关系,和/或,根据所述第一目标的位置和所述第二目标的位置之间的距离与第二阈值之间的关系,确定所述第一目标和所述第二目标为同一个目标的目标置信度;
    其中,所述第一区域由所述第一传感器的第一检测误差和所述第二传感器的第三检测误差确定,所述第一传感器的检测误差包括所述第一检测误差和第二检测误差,所述第一检测误差的精度高于所述第二检测误差的精度,所述第二传感器的检测误差包括所述第三检测误差和第四检测误差,所述第三检测误差的精度高于所述第四检测误差的精度。
  2. 根据权利要求1所述的方法,其特征在于,所述第一目标的位置和所述第二目标的位置与第一区域之间的关系包括:
    所述第一目标的位置和所述第二目标的位置均位于所述第一区域内。
  3. 根据权利要求1所述的方法,其特征在于,所述第一目标的位置和所述第二目标的位置之间的距离与第二阈值之间的关系包括:
    所述第一目标的位置和所述第二目标的位置之间的距离小于或等于第二阈值。
  4. 根据权利要求1~3任一项所述的方法,其特征在于,所述根据所述第一目标的位置和所述第二目标的位置与第一区域之间的关系,和/或,根据所述第一目标的位置和所述第二目标的位置之间的距离与第二阈值之间的关系,确定所述第一目标和所述第二目标为同一个目标的目标置信度,包括:
    所述第一目标和所述第二目标均位于所述第一区域内,和/或,所述第一目标和所述第二目标之间距离小于或等于第二阈值,确定所述第一目标和所述第二目标为同一个目标的目标置信度,所述目标置信度大于或者等于第一置信度阈值。
  5. 根据权利要求1~4任一项所述的方法,其特征在于,
    所述第一阈值由所述第二检测误差确定,或者所述第一阈值由所述第四检测误差确定。
  6. 根据权利要求5所述的方法,其特征在于,所述第一阈值随着所述第二检测误差的增大而增大,或者所述第一阈值随着所述第四检测误差的增大而增大。
  7. 根据权利要求1~6任一项所述的方法,其特征在于,
    所述第二阈值由所述第二检测误差以及所述第四检测误差确定。
  8. 根据权利要求7所述的方法,其特征在于,所述第二阈值随着所述第二检测误差以及所述第四检测误差的增大而减小。
  9. 根据权利要求1~8任一项所述的方法,其特征在于,所述第一传感器为雷达,所述第二传感器为摄像头。
  10. 根据权利要求1~9任一项所述的方法,其特征在于,所述第一检测误差为所述第一传感器在垂直方向上的测距误差,所述第二检测误差为所述第一传感器的测角误差或者所述第一传感器在水平方向上的测距误差;
    所述第三检测误差为所述第二传感器的测角误差;所述第四检测误差为所述第二传感器的测距误差。
  11. 根据权利要求1~10任一项所述的方法,其特征在于,所述第一目标的位置为所述第一目标相对于XY平面上的基准点的位置,所述XY平面的X轴方向是车辆宽度方向,所述XY平面的Y轴方向是车辆长度方向;所述第二目标的位置为所述第二目标相对于XY平面上的基准点的位置。
  12. 一种目标检测装置,其特征在于,包括:
    第一确定模块,用于确定第一目标区域以及第二目标区域的重叠区域,所述第一目标区域由第一传感器的检测误差以及由所述第一传感器检测到的第一目标的位置确定,所述第二目标区域由第二传感器的检测误差以及由所述第二传感器检测到的第二目标的位置确定,所述第一目标区域包括所述第一目标的位置,所述第二目标区域包括所述第二目标的位置;
    计算模块,用于计算所述重叠区域占所述第一目标区域以及所述第二目标区域并集的重叠度大小;
    当所述重叠度大小大于或等于第一阈值时，第二确定模块用于根据所述第一目标的位置和所述第二目标的位置与第一区域之间的关系，和/或，所述第二确定模块用于根据所述第一目标的位置和所述第二目标的位置之间的距离与第二阈值之间的关系，确定所述第一目标和所述第二目标为同一个目标的置信度为目标置信度；
    其中,所述第一区域由所述第一传感器的第一检测误差和所述第二传感器的第三检测误差确定,所述第一传感器的检测误差包括所述第一检测误差和第二检测误差,所述第一检测误差的精度高于所述第二检测误差的精度,所述第二传感器的检测误差包括所述第三检测误差和第四检测误差,所述第三检测误差的精度高于所述第四检测误差的精度。
  13. 根据权利要求12所述的装置,其特征在于,所述第一目标的位置和所述第二目标的位置与第一区域之间的关系包括:
    所述第一目标的位置和所述第二目标的位置均位于所述第一区域内。
  14. 根据权利要求12所述的装置,其特征在于,所述第一目标的位置和所述第二目标的位置之间的距离与第二阈值之间的关系包括:
    所述第一目标的位置和所述第二目标的位置之间的距离小于或等于第二阈值。
  15. 根据权利要求12~14任一项所述的装置,其特征在于,
    所述第一目标和所述第二目标均位于所述第一区域内,和/或,所述第一目标和所述第二目标之间距离小于或等于第二阈值,所述第二确定模块,用于确定所述置信度为所述目标置信度,所述目标置信度大于或者等于第一置信度阈值。
  16. 根据权利要求12~15任一项所述的装置,其特征在于,
    所述第一阈值由所述第二检测误差确定,或者所述第一阈值由所述第四检测误差确定。
  17. 根据权利要求16所述的装置,其特征在于,所述第一阈值随着所述第二检测误差的增大而增大,或者所述第一阈值随着所述第四检测误差的增大而增大。
  18. 根据权利要求12~17任一项所述的装置,其特征在于,
    所述第二阈值由所述第二检测误差以及所述第四检测误差确定。
  19. 根据权利要求18所述的装置,其特征在于,所述第二阈值随着所述第二检测误差以及所述第四检测误差的增大而减小。
  20. 根据权利要求12~19任一项所述的装置,其特征在于,所述第一传感器为雷达,所述第二传感器为摄像头。
  21. 根据权利要求12~20任一项所述的装置,其特征在于,所述第一检测误差为所述第一传感器在垂直方向上的测距误差,所述第二检测误差为所述第一传感器的测角误差或者所述第一传感器在水平方向上的测距误差;
    所述第三检测误差为所述第二传感器的测角误差;所述第四检测误差为所述第二传感器的测距误差。
  22. 根据权利要求12~21任一项所述的装置,其特征在于,所述第一目标的位置为所述第一目标相对于XY平面上的基准点的位置,所述XY平面的X轴方向是车辆宽度方向,所述XY平面的Y轴方向是车辆长度方向;所述第二目标的位置为所述第二目标相对于XY平面上的基准点的位置。
  23. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质中存储有指令,当所述指令被运行时,实现上述权利要求1-11任一项所述的目标检测方法。
  24. 一种芯片,其特征在于,所述芯片包括至少一个处理器和通信接口,所述通信接口和所述至少一个处理器耦合,所述至少一个处理器用于运行计算机程序或指令,以实现如权利要求1-11中任一项所述的目标检测方法,所述通信接口用于与所述芯片之外的其它模块进行通信。
  25. 一种目标检测装置,其特征在于,包括:至少一个处理器,所述至少一个处理器和存储器耦合,所述存储器中存储有计算机程序或指令,所述至少一个处理器用于运行所述存储器中存储的所述计算机程序或指令,以实现权利要求1-11中任一项所述的目标检测方法。