WO2021121247A1 - Method and apparatus for determining a target object tracking threshold - Google Patents

Method and apparatus for determining a target object tracking threshold

Info

Publication number
WO2021121247A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
point cloud
frame
threshold
tracking
Prior art date
Application number
PCT/CN2020/136718
Other languages
English (en)
Chinese (zh)
Inventor
崔天翔 (Cui Tianxiang)
刘兴业 (Liu Xingye)
康文武 (Kang Wenwu)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Publication of WO2021121247A1

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 — Systems using the reflection or reradiation of radio waves, e.g. radar systems; analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/02 — Systems using reflection of radio waves, e.g. primary radar systems; analogous systems
    • G01S 13/06 — Systems determining position data of a target
    • G01S 13/66 — Radar-tracking systems; analogous systems
    • G01S 13/72 — Radar-tracking systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S 13/88 — Radar or analogous systems specially adapted for specific applications
    • G01S 13/93 — Radar or analogous systems specially adapted for anti-collision purposes
    • G01S 13/931 — Radar or analogous systems specially adapted for anti-collision purposes of land vehicles

Definitions

  • This application relates to the field of automatic driving technology, and in particular to a method and device for determining a target object tracking threshold.
  • Target classification technology can provide a vehicle with information about surrounding objects and support its subsequent driving decisions. Therefore, target recognition plays a vital role in the vehicle's perception of the surrounding environment.
  • Target recognition, which may also be called target classification, refers to determining that a target object is a certain type of object, that is, distinguishing the target object from other objects.
  • One solution is to use a clustering algorithm to cluster the detection points in the radar point cloud to obtain clusters; the detection points in the same cluster are then classified as belonging to one target object.
  • This solution has low accuracy. In particular, when the detection points in the radar point cloud are relatively sparse (for example, the resolution of millimeter-wave radar is low, so its detection points are relatively sparse), the classification accuracy for target objects is low and there are a large number of false detections.
  • Clustering the detection points in a radar point cloud with a clustering algorithm often results in the detection points corresponding to a single object being split into multiple clusters.
  • Conversely, the detection points corresponding to multiple different objects may be merged into a single cluster.
  • Another solution is to use tracking algorithms such as the Kalman filter or the particle filter to frame the detection points with a fixed threshold frame.
  • The detection points framed by one threshold frame are regarded as one target object for classification. It can be understood that different objects have different sizes, so their detection points are distributed over different ranges.
  • When a single, uniform tracking threshold frame is used, detection points belonging to different objects are often framed into the same threshold frame, which makes the tracks of different objects interfere with each other and produces many false detections.
  • In view of this, the embodiments of the present application provide a method and device for determining the tracking threshold of a target object, which can determine the type of the target object, thereby improving the accuracy of the target object's tracking threshold, effectively reducing the interference between the tracks of different target objects, and reducing the false detection rate.
  • In a first aspect, a method for determining a tracking threshold of a target object is provided, including: determining at least one frame of radar point cloud, where the at least one frame of radar point cloud is a point data set obtained by a radar measuring the target object and includes a first frame of radar point cloud; determining N tracking thresholds corresponding to the first frame of radar point cloud, where the N tracking thresholds include a first threshold; determining K confidence levels according to the point cloud data within the first threshold, where the K confidence levels correspond one-to-one to K target categories (the other frames of the at least one frame of radar point cloud can be processed in the same way as the first frame to obtain their K confidence levels); determining a first target category at least according to the K confidence levels of the first frame of radar point cloud; and determining, according to the first target category, a target threshold for tracking the target object from among the N tracking thresholds.
  • In this way, the solution of this application can determine the confidence that the target object belongs to each of several target categories; determine, according to those confidences, the category of the target object, that is, the first target category; and then determine the tracking threshold suited to that category as the target threshold for tracking the target object. By determining the target object's category comprehensively, the accuracy of classification can be improved, and the determination of the target threshold optimized.
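  • The flow of the first aspect can be illustrated with a minimal sketch in Python (hypothetical names: `thresholds` are framing objects with a `frame` method, `category_to_threshold` maps category indices to threshold indices, and `classify` stands in for the classifier described later in this document; none of these names come from the patent itself):

```python
import numpy as np

def determine_target_threshold(points, thresholds, category_to_threshold, classify):
    """Sketch of the first-aspect method for one frame of radar point cloud.

    points                : (M, d) array of detection points for one target object
    thresholds            : list of N tracking threshold frames
    category_to_threshold : maps each of the K category indices to a threshold
                            index (two or more categories may share one threshold)
    classify              : callable; given the point cloud data framed by one
                            threshold, returns K confidences, one per category
    """
    totals = None
    for th in thresholds:
        conf = np.asarray(classify(th.frame(points)))  # K confidences under this threshold
        totals = conf if totals is None else totals + conf
    first_target_category = int(np.argmax(totals))     # category with the highest confidence
    return thresholds[category_to_threshold[first_target_category]]
```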
  • In one possible design, the K confidence levels of the first frame of radar point cloud include a first confidence level corresponding to the first target category, and the first confidence level is used to characterize the accuracy with which the point cloud data within the first threshold belongs to the first target category.
  • The other confidences among the K confidences are used to characterize the accuracy with which the point cloud data within the first threshold belongs to the target categories corresponding to those confidences.
  • In this way, the accuracy with which the point cloud data within the first threshold belongs to each target category can be determined; the target category with the highest accuracy can then be taken as the category of the target object, and the tracking threshold most suitable for the target object can be determined.
  • In one possible design, the N tracking thresholds are determined according to preset parameter information and are used to delimit the range corresponding to the target object in the first frame of radar point cloud.
  • In this way, N tracking thresholds can be set according to preset parameter information so as to delimit the range corresponding to the target object in the radar point cloud and thereby determine its category.
  • In one possible design, the parameter information includes geometric size information and/or speed information of the preset target category to which the parameter information corresponds.
  • In this way, the geometric size information and/or speed information of each preset target category can be used to set the tracking thresholds, so as to delimit the range of the target object in the radar point cloud and determine both the category of the target object and the target threshold suited to it.
  • In one possible design, the first target category is determined based on K total confidence levels; the K total confidence levels include a first total confidence level, the first total confidence level is the sum of N first confidence levels, and the N first confidence levels correspond one-to-one to the N tracking thresholds.
  • In this way, the K confidence levels of the target object under each of the N tracking thresholds can be determined, and the confidence levels of the same target category under the N tracking thresholds can be added together to obtain the total confidence of that target category.
  • The total confidences of the different target categories can thus be obtained and the first target category determined from them, which improves the accuracy of determining the target object's category.
  • In one possible design, the first target category is determined according to K multi-frame total confidence levels; the K multi-frame total confidence levels include a first multi-frame total confidence level, the first multi-frame total confidence level is the sum of at least one first total confidence level, and the at least one first total confidence level corresponds one-to-one to the at least one frame of radar point cloud.
  • In this way, the first target category can be determined according to the K total confidences across multiple frames of radar point cloud, that is, using the information of multiple frames, which improves the accuracy of determining the target object's category.
  • the K target categories include two or more of pedestrians, automobiles, bicycles, and electric vehicles.
  • In this way, pedestrians, cars, bicycles, and electric vehicles can be distinguished, and each can be tracked with its own tracking threshold.
  • the at least one frame of radar point cloud is a millimeter wave radar point cloud.
  • The resolution of millimeter-wave radar is low. Even so, the point data set of the target object measured by the millimeter-wave radar can be used to determine the target object's category and the threshold for tracking it, which reduces the interference between different tracks when a millimeter-wave radar is used to track target objects and reduces the false detection rate.
  • an embodiment of the present application provides a device for determining a tracking threshold of a target object.
  • The device includes a processor and a transceiver. The transceiver is configured to determine at least one frame of radar point cloud, where the at least one frame of radar point cloud is a point data set obtained by a radar measuring the target object and includes a first frame of radar point cloud. The processor is configured to determine N tracking thresholds corresponding to the first frame of radar point cloud, where the N tracking thresholds include a first threshold, and to determine K confidence levels according to the point cloud data within the first threshold, the K confidence levels corresponding one-to-one to K target categories. The processor is further configured to determine a first target category according to the K confidence levels, and to determine, according to the first target category, a target threshold for tracking the target object from among the N tracking thresholds.
  • The device may be, for example, a radar detection device, or a processing device independent of the radar device.
  • In one possible design, the K confidence levels of the first frame of radar point cloud include a first confidence level corresponding to the first target category, and the first confidence level is used to characterize the accuracy with which the point cloud data within the first threshold belongs to the first target category.
  • The other confidences among the K confidences are used to characterize the accuracy with which the point cloud data within the first threshold belongs to the target categories corresponding to those confidences.
  • In one possible design, the N tracking thresholds are determined according to preset parameter information and are used to delimit the range corresponding to the target object in the first frame of radar point cloud.
  • In one possible design, the parameter information includes geometric size information and/or speed information of the preset target category to which the parameter information corresponds.
  • In one possible design, the first target category is determined based on K total confidence levels; the K total confidence levels include a first total confidence level, the first total confidence level is the sum of N first confidence levels, and the N first confidence levels correspond one-to-one to the N tracking thresholds.
  • In one possible design, the first target category is determined according to K multi-frame total confidence levels; the K multi-frame total confidence levels include a first multi-frame total confidence level, the first multi-frame total confidence level is the sum of at least one first total confidence level, and the at least one first total confidence level corresponds one-to-one to the at least one frame of radar point cloud.
  • the K target categories include two or more of pedestrians, automobiles, bicycles, and electric vehicles.
  • The device for determining the target object tracking threshold provided in the second aspect is used to implement the corresponding method provided in the first aspect; therefore, for the beneficial effects it can achieve, refer to the beneficial effects of the corresponding method provided in the first aspect, which are not repeated here.
  • In a third aspect, an embodiment of the present application provides a device for determining a target object tracking threshold. The device includes a processing unit and a transceiver unit. The transceiver unit is configured to determine at least one frame of radar point cloud, where the at least one frame of radar point cloud is a point data set obtained by a radar measuring the target object and includes a first frame of radar point cloud. The processing unit is configured to determine N tracking thresholds corresponding to the first frame of radar point cloud, where the N tracking thresholds include a first threshold, and to determine K confidence levels according to the point cloud data within the first threshold, the K confidence levels corresponding one-to-one to K target categories. The processing unit is further configured to determine a first target category according to the K confidence levels, and to determine, according to the first target category, a target threshold for tracking the target object from among the N tracking thresholds.
  • The device for determining the tracking threshold of a target object provided in the third aspect is used to execute the corresponding method provided in the first aspect; therefore, for the beneficial effects it can achieve, refer to the beneficial effects of the corresponding method provided in the first aspect, which are not repeated here.
  • In a fourth aspect, an embodiment of the present application provides a computer storage medium. The computer storage medium includes computer instructions that, when run on an electronic device, cause the electronic device to execute the method described in the first aspect.
  • The computer storage medium provided in the fourth aspect is used to execute the corresponding method provided in the first aspect; therefore, for the beneficial effects it can achieve, refer to the beneficial effects of the corresponding method provided in the first aspect, which are not repeated here.
  • In a fifth aspect, the embodiments of the present application provide a computer program product; when the program code included in the computer program product is executed by a processor in an electronic device, the method described in the first aspect is implemented.
  • The computer program product provided in the fifth aspect is used to execute the corresponding method provided in the first aspect; therefore, for the beneficial effects it can achieve, refer to the beneficial effects of the corresponding method provided in the first aspect, which are not repeated here.
  • In a sixth aspect, an embodiment of the present application provides a system for determining a target tracking threshold, composed of a detection device and a processing device. The detection device is configured to determine at least one frame of radar point cloud, where the at least one frame of radar point cloud is a point data set obtained by the detection device measuring the target object and includes a first frame of radar point cloud.
  • The processing device is configured to determine N tracking thresholds corresponding to the first frame of radar point cloud, where the N tracking thresholds include a first threshold, and to determine K confidence levels according to the point cloud data within the first threshold, the K confidence levels corresponding one-to-one to K target categories. The processing device is further configured to determine a first target category according to the K confidence levels, and to determine, according to the first target category, a target threshold for tracking the target object from among the N tracking thresholds.
  • the detection device may be a radar, such as a vehicle-mounted radar.
  • the system for determining the tracking threshold of the target object may be a smart car.
  • an embodiment of the present application provides a chip system, the chip system includes a processor, and the processor is configured to execute instructions so that a device installed with the chip system executes the method provided in the first aspect.
  • The solution provided by the embodiments of the present application can determine the category of the target object and use the tracking threshold corresponding to that category as the target threshold for tracking the target object, so that the detection points framed by the target threshold no longer participate in clustering for other target objects or in establishing the tracks of other target objects. The detection points within the target threshold frame thus do not affect other tracks, which eliminates or reduces the interference between different tracks and reduces the false detection rate.
  • Figure 1A shows an application scenario of target recognition
  • FIG. 1B shows a clustering result of the point cloud data corresponding to the target object in the scene shown in FIG. 1A;
  • Figure 2A shows another application scenario of target recognition
  • FIG. 2B shows the result of using a fixed tracking threshold to frame the point cloud data corresponding to the target object in the scene shown in FIG. 2A;
  • FIG. 3 is a schematic diagram of an application scenario of an embodiment of the application.
  • FIG. 4 is a schematic diagram of the hardware structure of a vehicle provided by an embodiment of the application.
  • FIG. 5 is a schematic diagram of determining an estimated category of a target object according to an embodiment of the application.
  • FIG. 6 is a schematic diagram of determining the confidence sum of a single frame according to an embodiment of the application.
  • FIG. 7 is a schematic diagram of determining an estimated category of a target object according to an embodiment of the application.
  • FIG. 8 is a flowchart of adjusting the estimated category of a target object provided by an embodiment of the application.
  • FIG. 9A is a scene diagram of an actual verification experiment provided by an embodiment of the application.
  • FIG. 9B is a schematic diagram of an actual verification result provided by an embodiment of the application.
  • FIG. 10A is a scene diagram of an actual verification experiment provided by an embodiment of the application.
  • FIG. 10B is a schematic diagram of an actual verification result provided by an embodiment of the application.
  • FIG. 11A is a scene diagram of an actual verification experiment provided by an embodiment of the application.
  • FIG. 11B is a schematic diagram of an actual verification result provided by an embodiment of the application.
  • FIG. 12 is a flowchart of determining a target object tracking threshold provided by an embodiment of the application.
  • FIG. 13 is a schematic structural diagram of a device for determining a target object tracking threshold provided by an embodiment of the application.
  • FIG. 14 is a schematic block diagram of a device for determining a target object tracking threshold provided by an embodiment of the application.
  • The terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly indicating the number of indicated technical features. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more of that feature.
  • The terms "comprising", "including", "having" and their variants all mean "including but not limited to", unless otherwise specifically emphasized.
  • the vehicle 100 may be an automobile, or other forms of motor vehicles.
  • the vehicle may be a vehicle in the form of a car, a bus, a truck, a motorcycle, an agricultural locomotive, a parade float, a game vehicle in an amusement park, and the like.
  • The vehicle 100 may be in an automatic driving state, that is, the vehicle 100 drives completely autonomously, with no driver control or only minimal driver control.
  • the vehicle 100 can track nearby objects, such as the vehicle 210, the pedestrian 220, etc., to provide assistance for the subsequent driving decision of the vehicle 100.
  • the vehicle 100 may interact with the control center 300 to perform automatic driving with the assistance of the control center 300.
  • FIG. 4 shows the hardware structure of the vehicle 100.
  • the vehicle 100 may include a computing system 102, an interactive system 104, a propulsion system 106, a sensor system 108, a control system 110, and a power source 112.
  • the computing system 102 may include a processor 1021, a memory 1022, and the like.
  • the interactive system 104 may include a wireless communication system 1041, a display screen 1042, a microphone 1043, a speaker 1044, and the like.
  • the propulsion system 106 may include a power component 1061, an energy component 1062, a transmission component 1063, an actuation component 1064, and the like.
  • the sensor system 108 may include a positioning component 1081, a camera 1082, an inertial measurement unit 1083, a radar 1084, and the like.
  • The control system 110 may include a steering component 1101, a throttle valve 1102, a brake component 1103, and the like.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the vehicle 100.
  • the vehicle 100 may include more or fewer components than shown, or combine certain components, or disassemble certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • The components of the vehicle 100 can be connected together through a system bus (for example, a controller area network (CAN) bus), a network, and/or other connection mechanisms, so that the components can work in an interconnected manner.
  • the processor 1021 may include one or more processing units.
  • For example, the processor 1021 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • the memory 1022 may be used to store computer executable program code, where the executable program code includes instructions.
  • the memory 1022 may include a program storage area and a data storage area.
  • The program storage area can store information such as a classifier, and can also store an operating system and application programs required for at least one function (such as a sound playback function or an image playback function).
  • the memory 1022 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the processor 1021 can execute various automobile functions and data processing described below by running instructions stored in the memory 1022.
  • In some embodiments, the computing system 102 can be implemented as a vehicle-mounted intelligent system or an automatic driving system, which can realize fully autonomous driving of the vehicle 100 (while running, the vehicle drives completely autonomously, with no or only minimal driver control), semi-autonomous driving of the vehicle 100 (while running, the vehicle is not fully autonomous and requires appropriate driver control), or manual driving (the driver fully controls the vehicle 100).
  • the computing system 102 may include a vehicle controller.
  • the vehicle controller is the core control component of the vehicle.
  • the vehicle controller is configured to complete numerous task coordination while the vehicle is running.
  • the main tasks include: communication with subsystems; collecting driver's operating signals to identify their intentions; monitoring the driving status of the vehicle, detecting and identifying vehicle faults, storing fault information, and ensuring the safe driving of the vehicle.
  • the vehicle controller also contains multiple independent motor control units, and the information exchange between the vehicle controller and the motor control unit is carried out through a bus.
  • the vehicle controller is the controller center of the vehicle. It can communicate with signal sensors, active steering controllers, and electric drive controllers through CAN bus communication to realize signal collection, control strategy decision-making, and drive signal output.
  • The vehicle controller collects and processes signals from sensors (such as the accelerator pedal and brake pedal) and is responsible for the power-on/power-off logic of its own controller and of the motor control unit. It is also responsible for torque calculation: driver demand torque calculation, mechanical and electric brake torque distribution, drive/brake torque borne by the front and rear axles, and four-wheel motor torque distribution. It is also responsible for energy optimization management: charging control, power distribution based on motor operating efficiency, and braking energy recovery control. It is also responsible for vehicle dynamics control: vehicle state recognition, yaw control, anti-skid control, anti-lock control, anti-roll control, and active steering control. It is also responsible for monitoring and diagnosis functions: bus node transmit/receive monitoring, sensor failure diagnosis, torque monitoring, CPU monitoring and diagnosis, fault management, and safety measures upon faults (such as vehicle deceleration and speed-limit processing).
  • the vehicle controller can complete data exchange with other sub-control units (such as motor controllers, power management systems, dashboards, etc.) through CAN network communication.
  • the motor control unit receives the command distributed by the vehicle controller through the CAN bus, converts the chemical energy of the battery pack into the mechanical energy of the motor, and then transmits the power to the wheels through the transmission system to ensure the power of the vehicle.
  • the computing system 102 may also include a body controller.
  • the body controller manages modules in the field of vehicle body electronics and supports multiple functions.
  • A typical body control module consists of a microprocessor and is used to control the body electronic equipment and its functions (power windows, wipers, side mirrors, etc.).
  • ports are provided on the body controller for communication with different body control modules, instrument panels, sensors and actuators, etc.
  • the computing system 102 may include an intelligent driving controller for processing data from various sensors.
  • The wireless communication system 1041 may include one or more antennas, modems, baseband processors, etc., and may communicate with the control center 300, other automobiles, and other communication entities.
  • A wireless communication system can be configured to communicate according to one or more communication technologies, such as mobile communication technologies (2G/3G/4G/5G), wireless local area network (WLAN) technologies such as wireless fidelity (Wi-Fi), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and other wireless communication technologies not listed here.
  • the display screen 1042 is used to display images, videos, and so on.
  • the display screen 1042 includes a display panel.
  • The display panel may use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the display panel may be covered with a touch panel, and when the touch panel detects a touch operation on or near it, the touch operation may be transmitted to the processor 1021 to determine the type of touch event.
  • The visual output related to the touch operation can be provided through the display screen 1042. In other embodiments, the touch panel and the display screen 1042 may be disposed in different positions.
  • The microphone 1043, also called a "mic" or "sound pickup", is used to convert sound signals into electrical signals.
  • When the user wants to control the vehicle 100 by voice, the user can speak near the microphone 1043 to input a voice command.
  • the vehicle 100 may be provided with at least one microphone 1043.
  • In other embodiments, the vehicle 100 may be provided with two microphones 1043, which, in addition to collecting sound signals, can implement a noise reduction function.
  • In other embodiments, the vehicle 100 may be provided with three, four, or more microphones 1043 to collect sound signals, reduce noise, identify sound sources, implement directional recording functions, and so on.
  • The speaker 1044, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • Music or prompt information can be played in the vehicle 100 through the speaker 1044.
  • The power component 1061 may be an engine, which may be any one or a combination of a gasoline engine, an electric motor, a diesel engine, a hybrid engine, or other types of engines.
  • The energy component 1062 may be a source of energy that wholly or partially powers the power component 1061; that is, the power component 1061 may be configured to convert the energy provided by the energy component 1062 into mechanical energy.
  • Energy components 1062 can provide energy including gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electrical power.
  • the energy component 1062 may also include any combination of fuel tanks, batteries, capacitors, and/or flywheels. In some embodiments, the energy component 1062 may also provide energy for other systems of the vehicle 100.
  • the transmission component 1063 may include a gearbox, a clutch, a differential, a transmission shaft, and other components. After being configured, the transmission component 1063 can transmit mechanical energy from the power component 1061 to the actuation component 1064.
  • the actuating member 1064 may include wheels, tires, and the like.
  • the wheels can be configured in various styles, including unicycle, two-wheeler/motorcycle, tricycle, or car/truck four-wheel style.
  • a tire may be attached to a wheel, and the wheel may be attached to the transmission member 1063, and may rotate in response to the mechanical power transmitted by the transmission member 1063 to drive the vehicle 100 to move.
  • the positioning component 1081 may be configured to estimate the position of the vehicle 100.
  • the positioning component 1081 may include a transceiver configured to estimate the position of the vehicle 100 relative to the earth based on satellite positioning data.
  • the computing system 102 may be configured to use the positioning component 1081 in conjunction with map data to estimate the road on which the vehicle 100 may travel and the position of the vehicle 100 on the road.
  • The positioning component 1081 may include a global positioning system (GPS) module, a BeiDou navigation satellite system (BDS) module, a Galileo satellite navigation system module, and so on.
  • the camera 1082 may include an outside camera configured to capture the environment outside the vehicle 100, and may also include an in-vehicle camera configured to capture the environment inside the vehicle 100.
  • the camera 1082 can be a camera that detects visible light, or can detect light from other parts of the spectrum (infrared or ultraviolet, etc.).
  • the camera 1082 is used to capture two-dimensional images, and can also be used to capture depth images.
  • An inertial measurement unit (IMU) 1083 is configured as any combination of sensors that sense changes in the position and orientation of the vehicle 100 based on inertial acceleration.
  • the inertial measurement unit 1083 may include one or more accelerometers and gyroscopes.
  • the radar 1084 may include a sensor configured to use radio waves or sound waves to sense or detect objects in the environment where the vehicle 100 is located.
  • the radar 1084 may include laser radar, millimeter wave radar, or ultrasonic radar.
  • The radar 1084 may include a waveform generator, a transmitting antenna, a receiving antenna, and a signal processor. In each scan, the waveform generator generates a waveform signal that is transmitted through the transmitting antenna; after being reflected by objects in the environment where the vehicle 100 is located, the waveform signal is received by the receiving antenna. By comparing the transmitted signal with the received signal, the raw detection data can be obtained.
  • The signal processor of the radar 1084 can perform constant false-alarm rate (CFAR) detection, peak grouping, and direction-of-arrival (DOA) estimation on the raw detection data to obtain detection points.
  • the detection points obtained by the radar 1084 in one scan form a frame of radar point cloud.
  • the detection points scanned by the radar 1084 may also be referred to as a point data set.
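  • As an illustration of the detection step, below is a minimal sketch of one common CFAR variant, cell-averaging CFAR over a one-dimensional power profile (the patent does not specify which CFAR variant the radar 1084 uses; the window sizes and scaling factor here are illustrative):

```python
import numpy as np

def ca_cfar(power, num_train=8, num_guard=2, scale=4.0):
    """Cell-averaging CFAR over a 1-D power profile.

    power     : array of received power per range cell
    num_train : training cells on each side, used to estimate the noise level
    num_guard : guard cells on each side, excluded from the noise estimate
    scale     : threshold multiplier controlling the false-alarm rate
    Returns the indices of cells declared to be detections.
    """
    n = len(power)
    half = num_train + num_guard
    detections = []
    for i in range(half, n - half):
        # noise estimate from the training cells on both sides of the cell under test
        left = power[i - half : i - num_guard]
        right = power[i + num_guard + 1 : i + half + 1]
        noise = np.mean(np.concatenate([left, right]))
        if power[i] > scale * noise:
            detections.append(i)
    return detections
```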
  • In some embodiments, the radar 1084 is a millimeter-wave radar. The resolution of millimeter-wave radar is low, and the detection points in its radar point cloud are relatively sparse; therefore, the radar point cloud of a millimeter-wave radar can be called a sparse point cloud.
  • the radar 1084 may transmit the original detection data to the computing system 102, and the computing system 102 determines to obtain the radar point cloud according to the original detection data.
  • the radar 1084 may send the original detection data to the control center 300 through the wireless communication system 1041, and the control center determines to obtain the radar point cloud based on the original detection data.
  • Each detection point in the radar point cloud corresponds to a reflection point that reflected the received signal.
  • one object corresponds to multiple reflection points, that is, one object can correspond to multiple detection points on the radar point cloud.
  • The signal processor can obtain information such as the position and speed of the reflection point corresponding to each detection point based on the time difference between the received signal and the transmitted signal and on the Doppler frequency shift; that is, each detection point in the radar point cloud carries information such as position and speed.
  • the signal processor of the radar 1084 may execute the method provided in the embodiments of the present application.
  • the radar 1084 may transmit the radar point cloud to the computing system 102 so that the computing system 102 can execute the method provided in the embodiments of the present application.
  • the radar 1084 can send the radar point cloud to the control center 300 through the wireless communication system 1041, so that the control center executes the method provided in the embodiment of this application and processes The result is fed back to the vehicle 100.
  • The steering component 1101 may be a component configured to adjust the direction of movement of the vehicle 100 in response to a driver's operation or a computer instruction.
  • the throttle valve 1102 may be a component configured to control the operating speed and acceleration of the power component 1061 and thereby control the speed and acceleration of the vehicle 100.
  • the brake component 1103 may be a component configured to reduce the moving speed of the vehicle 100.
  • the brake component 1103 may use friction to slow the rotation speed of the wheels in the actuating component 1064.
  • the power supply 112 may be configured to provide power to a part or all of the components of the vehicle 100.
  • the power source 112 may include a lithium ion battery or a lead storage battery that can be recharged and discharged.
  • the power source 112 may include one or more battery packs.
  • In some embodiments, the power supply 112 and the energy component 1062 can be implemented together; the chemical energy provided by the power supply 112 is converted into the mechanical energy of the motor through the power component 1061 and transmitted through the transmission component 1063 to the actuating component 1064 to move the vehicle 100.
  • the method for determining the tracking threshold of the target object may be applied to scenarios such as automatic driving, automatic parking, or automatic cruise of the vehicle 100.
  • The method can determine the target category of a target object detected by the on-board radar of the vehicle 100 and track the target object using the tracking threshold corresponding to that category, which can reduce the interference of the target object's detection points with the tracking of other target objects, thereby eliminating or reducing the interference between tracks when the vehicle 100 tracks multiple target objects and reducing the false detection rate.
  • the method may be implemented by a detection device, such as a radar device.
  • the radar device may be a vehicle-mounted radar device, such as the radar 1084 shown in FIG. 4.
  • This method can also be implemented by a processing device integrated in the detection device, such as the signal processor in the radar 1084 shown in FIG. 4.
  • the method can also be implemented by a processing device independent of the detection device (for example, the control center 300, the computing system 102, etc.), and the processing result is fed back to the detection device.
  • The target category, which can be referred to as a category for short, can be a preset object category; for example, multiple target categories such as pedestrians, cars, bicycles, and electric vehicles can be set.
  • The tracking threshold, also known as the tracking threshold frame, refers to a bounding range set according to the parameter information of a target category.
  • the tracking threshold of the target category can be set according to the size information and/or speed information of the target category.
  • the size in the size information may be the upper limit of the size.
  • the speed in the speed information can be the upper speed limit or the lower speed limit.
  • the tracking threshold corresponding to the target category of pedestrians can be set as size (1.5-2)m*(2.5-3)m and speed 4m/s.
  • the speed of 4m/s can be the upper limit of speed.
  • the tracking threshold corresponding to the target category of pedestrians can be set to a size of 1.5m*2.5m and a speed of 4m/s.
  • the tracking threshold corresponding to the target category of pedestrians can be set to a size of 2m*3m and a speed of 4m/s.
  • The tracking threshold corresponding to the target category of pedestrians can be set to a size of 1.8m*2.6m and a speed of 4m/s, and so on; the tracking threshold corresponding to the target category of pedestrians can be set based on experience or experiment.
  • The tracking threshold corresponding to the target category of cars can be set to a size of (3-5)m*(5-7)m and a speed of 10m/s.
  • the speed of 10m/s can be the lower limit of speed.
  • the tracking threshold corresponding to the target category of cars can be set to a size of 3m*5m and a speed of 10m/s.
  • the tracking threshold corresponding to the target category of cars can be set to a size of 4m*6m and a speed of 10m/s.
  • The tracking threshold corresponding to the target category of cars can be set to a size of 5m*7m and a speed of 10m/s, and so on.
  • the tracking threshold corresponding to the target category of automobiles can be set based on experience or experiments.
  • the tracking threshold corresponding to the target category of bicycles can be set as 3m*4m in size and 7m/s in speed. Among them, the speed of 7m/s can be the upper limit of speed.
  • the tracking threshold corresponding to the target category of bicycles can be set based on experience or experiment.
  • Take the target category of electric vehicles as an example. It can be understood that, generally speaking, the size of an electric vehicle and its rider is similar to that of a bicycle and its rider, and their speeds are also similar.
  • Therefore, the tracking threshold corresponding to the target category of electric vehicles can be the same as the tracking threshold corresponding to bicycles.
  • the tracking threshold can also be set independently for the electric vehicle, for example, it can be 3m*4m in size and 8m/s in speed. Among them, the speed of 8m/s can be the upper limit of speed.
  • the tracking threshold corresponding to the target category of electric vehicles can be set based on experience or experiments.
  • K target categories and N tracking thresholds can be set, where K and N are both positive integers greater than 1 and K ≥ N.
  • Each target category may correspond to a tracking threshold, wherein there may be two or more target categories that jointly correspond to a tracking threshold, and different tracking thresholds correspond to different target categories.
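  • The category/threshold tables described above might be represented as follows (a minimal sketch; the `Threshold` class and its fields are illustrative, and the numbers follow the examples given earlier in this document):

```python
from dataclasses import dataclass

@dataclass
class Threshold:
    width_m: float        # width of the threshold frame
    length_m: float       # length of the threshold frame
    speed_mps: float      # speed bound associated with the category
    speed_is_upper: bool  # True: upper speed limit; False: lower speed limit

# N = 3 tracking thresholds, using the example values above
THRESHOLDS = [
    Threshold(1.5, 2.5, 4.0, True),    # pedestrians: at most 4 m/s
    Threshold(3.0, 5.0, 10.0, False),  # cars: at least 10 m/s
    Threshold(3.0, 4.0, 7.0, True),    # bicycles: at most 7 m/s
]

# K = 4 target categories; electric vehicles share the bicycle threshold, so K >= N
CATEGORY_TO_THRESHOLD = {
    "pedestrian": 0,
    "car": 1,
    "bicycle": 2,
    "electric_vehicle": 2,
}
```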
  • In the embodiments of the present application, a machine learning algorithm can be used to train on the detection points (also referred to as point data sets) in multiple frames of radar point cloud pre-labeled with target categories, as training samples, to obtain a classifier for the K target categories (the classifier can also be called a recognition model).
  • For example, the machine learning algorithm used may be the XGBoost algorithm. In the training process of the classifier, 70% of the training samples may be taken as the training set and 30% of the training samples as the test set.
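  • A minimal sketch of such a training procedure, assuming the `xgboost` and `scikit-learn` Python packages (the `extract_features` step is a hypothetical placeholder, since the patent does not specify which features are derived from the framed point cloud data):

```python
import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split

def extract_features(point_set):
    """Hypothetical summary features of one framed point data set."""
    pts = np.asarray(point_set)                    # rows: x, y, speed
    return np.array([
        len(pts),                                  # number of detection points
        np.ptp(pts[:, 0]), np.ptp(pts[:, 1]),      # spatial extent in x and y
        pts[:, 2].mean(), pts[:, 2].std(),         # speed statistics
    ])

def train_classifier(point_sets, labels, num_categories):
    """Train a K-category classifier with a 70%/30% train/test split."""
    X = np.stack([extract_features(p) for p in point_sets])
    y = np.asarray(labels)                         # integer labels 0..K-1
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3)
    clf = xgb.XGBClassifier(objective="multi:softprob", num_class=num_categories)
    clf.fit(X_tr, y_tr)
    print("test accuracy:", clf.score(X_te, y_te))
    return clf                                     # predict_proba yields K confidences
```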
  • the tracking process of each target object can be divided into a track establishment stage and a track tracking stage.
  • the tracking threshold for tracking the target object can be determined in the stage of establishing the track of the target object.
  • the tracking threshold used to track the target object may also be referred to as the target threshold of the target object.
  • the target threshold of the target object can be used to track the target object.
  • A detection point that falls within the target threshold of the target object is regarded as a detection point belonging to the target object; it is used only to determine the track of the target object and no longer participates in determining any track other than that of the target object.
  • a clustering algorithm can be used to cluster the detection points on the radar point cloud 1 to roughly classify the target objects detected by the radar point cloud.
  • the clustering algorithm can be K-means.
  • the clustering algorithm may be DBSCAN. Other clustering algorithms can also be used, which will not be listed here.
  • Before clustering, the detection points in the radar point cloud 1 may be filtered to exclude the stationary detection points; that is, only the moving detection points in the radar point cloud 1 are clustered.
  • Each cluster obtained by clustering can be regarded as one target object.
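  • A minimal sketch of this clustering step, assuming the `scikit-learn` Python package (the speed column and the DBSCAN parameters are illustrative):

```python
import numpy as np
from sklearn.cluster import DBSCAN

def cluster_moving_points(point_cloud, speed_floor=0.5, eps=1.5, min_samples=2):
    """Cluster the moving detection points of one frame of radar point cloud.

    point_cloud : (M, 3) array with columns x, y, radial speed
    speed_floor : points slower than this (m/s) are treated as stationary
    Returns a list of clusters, each an array of detection points.
    """
    moving = point_cloud[np.abs(point_cloud[:, 2]) > speed_floor]  # drop stationary points
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(moving[:, :2])
    # label -1 marks noise points that belong to no cluster
    return [moving[labels == k] for k in set(labels) if k != -1]
```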
  • Take any target object, for example, the target object S1.
  • The detection points of the target object S1 are framed by the tracking threshold 1, framed by the tracking threshold 2, ..., and framed by the tracking threshold N; that is, the detection points of the target object S1 can be framed separately by each of the N tracking thresholds.
  • Any detection point framed by a tracking threshold can be referred to as a detection point within the tracking threshold, or as point cloud data within the tracking threshold.
  • Each cluster obtained has a cluster center.
  • Where a tracking threshold includes size information, the detection point at which the cluster center of the target object is located can be used as the center of the frame to frame the detection points (point cloud data) within each tracking threshold.
  • Where the tracking threshold also includes speed information, the detection points within the tracking threshold can be screened according to the speed information. Taking the tracking threshold corresponding to the target category of pedestrians as an example, where the speed information is 4m/s, the detection points within the frame whose speed is greater than 4m/s can be eliminated.
  • Similarly, for the tracking threshold corresponding to the target category of cars, where the speed information is 10m/s, the detection points within the frame whose speed is less than 10m/s can be excluded.
  • In this way, the point cloud data within each of the N tracking thresholds corresponding to the target object S1 can be obtained.
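  • A minimal sketch of this framing step, reusing the illustrative `Threshold` representation from the earlier sketch (the frame is centered on the cluster center and the points are screened by the speed bound):

```python
import numpy as np

def frame_points(cluster, threshold):
    """Return the cluster's detection points that fall within one tracking threshold.

    cluster   : (M, 3) array with columns x, y, speed
    threshold : Threshold(width_m, length_m, speed_mps, speed_is_upper)
    """
    center = cluster[:, :2].mean(axis=0)                 # cluster center as frame center
    half = np.array([threshold.width_m, threshold.length_m]) / 2.0
    inside = np.all(np.abs(cluster[:, :2] - center) <= half, axis=1)
    speed = np.abs(cluster[:, 2])
    if threshold.speed_is_upper:                         # e.g. pedestrians: at most 4 m/s
        inside &= speed <= threshold.speed_mps
    else:                                                # e.g. cars: at least 10 m/s
        inside &= speed >= threshold.speed_mps
    return cluster[inside]
```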
  • the point cloud data in each of the N tracking thresholds corresponding to the target object S1 is input into the above-mentioned classifier, and K confidence levels corresponding to the K target categories can be obtained.
  • Each confidence level is used to characterize the accuracy with which the target object belongs to a certain target category.
  • the point cloud data within the tracking threshold 1 corresponding to the target object S1 can be input into the above-mentioned classifier to obtain K confidence levels corresponding to the target object S1 under the tracking threshold 1.
  • the point cloud data corresponding to the tracking threshold 1 is processed to obtain the confidence of each target category, and a total of K confidences are obtained.
  • the first confidence level in the K confidence levels represents the accuracy of the target object S1 belonging to the first target category corresponding to the first confidence level.
  • Similarly, the point cloud data within the tracking threshold 2 corresponding to the target object S1 can be input into the aforementioned classifier to obtain the K confidence levels corresponding to the target object S1 under the tracking threshold 2, which likewise include a first confidence level.
  • The first confidence level here corresponds to the tracking threshold 2; it is obtained by processing the point cloud data within the tracking threshold 2 and is used to characterize the accuracy with which the target object S1 belongs to the first target category.
  • the above-mentioned processing can be performed on all N tracking thresholds, and then K confidence levels corresponding to each tracking threshold can be obtained, that is, N*K confidence levels can be obtained.
  • The confidence levels of the target object S1 corresponding to the same target category under the N tracking thresholds can be added, and the resulting sum can be used as the single-frame confidence sum of that target category.
  • For example, the confidence of category 1 for the target object S1 under the tracking threshold 1, the confidence of category 1 under the tracking threshold 2, ..., and the confidence of category 1 under the tracking threshold N are added; that is, the N first confidences are added, and the resulting sum can be used as the single-frame confidence sum of category 1.
  • Similarly, the single-frame confidence sums of category 2, ..., and category K can be obtained.
  • The single-frame confidence sums of the target categories for the target object S1 may be compared to obtain the target category with the highest single-frame confidence sum.
  • The target category with the highest single-frame confidence sum can be used as the estimated category of the target object S1.
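  • A minimal sketch of this single-frame estimate, combining the earlier sketches (the classifier is assumed to expose `predict_proba` and `n_classes_` as in the XGBoost sketch above):

```python
import numpy as np

def estimate_category_single_frame(cluster, thresholds, classifier):
    """Estimate the category of target object S1 from one frame of radar point cloud.

    Returns (estimated_category_index, single_frame_confidence_sums).
    """
    sums = np.zeros(classifier.n_classes_)              # one sum per target category
    for th in thresholds:
        framed = frame_points(cluster, th)              # point cloud data within this threshold
        feats = extract_features(framed)                # same features as used in training
        conf = classifier.predict_proba(feats[None, :])[0]  # K confidences
        sums += conf                                    # accumulate per-category confidences
    return int(np.argmax(sums)), sums                   # highest single-frame sum wins
```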
  • The tracking threshold corresponding to the estimated category of the target object S1 is used as the tracking threshold for tracking the target object S1 in its track-tracking phase, that is, as the target threshold of the target object S1.
  • The target threshold of the target object S1 can be used to frame the detection point cluster corresponding to the target object S1, so that the detection points within the target threshold of the target object S1 can be used to determine the track of the target object S1.
  • The detection points within the target threshold of the target object S1 no longer participate in clustering for, or establishing the tracks of, other target objects, thereby reducing mutual interference between tracks and reducing the false detection rate.
  • The detection points that are not framed by any target threshold can be clustered again to obtain new clusters, and new target objects can be identified or new tracks established according to the clusters obtained by clustering again.
  • Using the target threshold of the target object S1 to frame the detection point cluster corresponding to the target object S1 may specifically mean framing with the detection point at which the cluster center of that detection point cluster is located as the center of the frame.
  • For a multi-frame case, a clustering algorithm can be used to cluster the detection points in each frame of the radar point clouds to obtain clusters. Then, a tracking algorithm is used to associate the clusters of two adjacent frames of radar point cloud, so as to obtain the clusters of the two adjacent frames that can be regarded as corresponding to the same object, that is, the clusters of the two adjacent frames that can be regarded as corresponding to the target object S1.
  • The tracking algorithm may be a Kalman filter algorithm, a particle filter algorithm, etc.
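  • A minimal sketch of associating clusters across two adjacent frames (a simple nearest-neighbor gate is used here in place of a full Kalman or particle filter, which the patent names but does not detail):

```python
import numpy as np

def associate_clusters(prev_clusters, curr_clusters, gate_m=2.0):
    """Greedy nearest-neighbor association of clusters between two adjacent frames.

    prev_clusters, curr_clusters : lists of (M_i, 3) arrays (x, y, speed)
    gate_m : maximum center-to-center distance for a valid association
    Returns a list of (prev_index, curr_index) pairs regarded as the same object.
    """
    prev_centers = [c[:, :2].mean(axis=0) for c in prev_clusters]
    curr_centers = [c[:, :2].mean(axis=0) for c in curr_clusters]
    pairs, used = [], set()
    for i, pc in enumerate(prev_centers):
        dists = [np.linalg.norm(pc - cc) for cc in curr_centers]
        if not dists:
            break
        j = int(np.argmin(dists))
        if dists[j] <= gate_m and j not in used:
            pairs.append((i, j))                        # same object in both frames
            used.add(j)
    return pairs
```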
  • the single-frame confidence sum of the target object S1 in each frame of the radar point cloud in the multi-frame radar point cloud can be calculated.
  • the calculation method of the single-frame confidence sum of each target category under each frame of radar point cloud can refer to the above introduction.
  • the single-frame confidence sum corresponding to the same target category under each frame of the radar point cloud in the multi-frame radar point cloud can be added to obtain the multi-frame confidence sum of the target category.
  • For example, the single-frame confidence sum of category 1 for the target object S1 under radar point cloud 1, ..., under radar point cloud P, and under radar point cloud P+1 can be added to obtain the multi-frame confidence sum of category 1 corresponding to the target object S1.
  • the multi-frame confidence sum can be a simple addition of the single-frame confidence sums.
  • it may also be a weighted addition of the single-frame confidence sums. For example, weights can be assigned to the single-frame confidence sums of the frames according to the order of the acquisition times of the frames in the multi-frame radar point cloud.
  • Taking radar point cloud L1, radar point cloud L2, and radar point cloud L3 as an example, suppose the acquisition time of radar point cloud L1 is earlier than that of radar point cloud L2, and the acquisition time of radar point cloud L2 is earlier than that of radar point cloud L3. Then the weight of the single-frame confidence sum corresponding to radar point cloud L1 < the weight corresponding to radar point cloud L2 < the weight corresponding to radar point cloud L3; for example, the weight corresponding to radar point cloud L1 can be 0.1, the weight corresponding to radar point cloud L2 can be 0.3, and the weight corresponding to radar point cloud L3 can be 0.6, as in the sketch below.
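A sketch of the simple and weighted multi-frame additions, reusing the illustrative L1/L2/L3 weights from the text (the confidence values themselves are made up for the example):

```python
import numpy as np

def multi_frame_confidence_sums(single_frame_sums, weights=None):
    """single_frame_sums: shape (F, K) -- the K single-frame confidence sums
    for each of F frames, ordered by acquisition time.
    weights: optional length-F weights (later frames weighted higher);
    if omitted, a simple (unweighted) addition is performed."""
    s = np.asarray(single_frame_sums, dtype=float)
    if weights is None:
        return s.sum(axis=0)                  # simple addition
    w = np.asarray(weights, dtype=float).reshape(-1, 1)
    return (w * s).sum(axis=0)                # weighted addition

# The L1/L2/L3 example: later acquisition time -> larger weight
sums_L1, sums_L2, sums_L3 = [0.9, 0.4], [0.8, 0.5], [0.7, 0.6]
multi = multi_frame_confidence_sums([sums_L1, sums_L2, sums_L3],
                                    weights=[0.1, 0.3, 0.6])
# estimated category = index of the largest multi-frame confidence sum
print(int(multi.argmax()))   # 0 (since 0.75 > 0.55)
```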
  • the multi-frame confidence sums of the target categories for the target object S1 can then be compared to obtain the target category with the highest multi-frame confidence sum.
  • the target category with the highest multi-frame confidence sum may be used as the estimated category of the target object S1.
  • the tracking threshold corresponding to the estimated category of the target object S1 is used as the tracking threshold for tracking the target object S1 in the track tracking phase, that is, as the target threshold of the target object S1.
  • the target threshold of the target object S1 can be used to frame the detection point cluster corresponding to the target object S1, so that the detection points within the target threshold of the target object S1 can be used to determine the track of the target object S1.
  • the detection points within the target threshold of the target object S1 no longer participate in the clustering or track establishment of other target objects, thereby reducing mutual interference between tracks and reducing the false detection rate.
  • the confidence threshold can be a threshold preset based on experience or experiments; for example, it can be set to 80%, 85%, 95%, etc., which will not be listed one by one here.
  • the confidence that the target object S1 belongs to its estimated category can be obtained as follows: the detection points (point cloud data) framed by the tracking threshold corresponding to the estimated category of the target object S1 (that is, the target threshold) are input into the above-mentioned classifier, and the confidence corresponding to the estimated category among the K confidence levels output by the classifier is taken.
  • the confidence that the target object S1 belongs to its estimated category under its target threshold may be a simple average of the confidences corresponding to multiple frames of radar point clouds.
  • the confidence that the target object S1 corresponding to each frame of the radar point cloud in the multi-frame radar point cloud belongs to the estimated category under the target threshold can be calculated.
  • the specific calculation method can refer to the introduction above, and will not be repeated here.
  • the confidences that the target object S1 belongs to the estimated category under the target threshold, one for each frame of the multi-frame radar point cloud, can be added to obtain the multi-frame confidence of the target object S1 belonging to the estimated category under the target threshold.
  • dividing this multi-frame confidence by the number of frames gives the simple average of the confidences corresponding to the multi-frame radar point cloud, and this average can be used as the confidence that the target object S1 belongs to its estimated category under its target threshold.
  • the confidence that the target object S1 belongs to its estimated category under its target threshold may be a weighted average of the confidences corresponding to multiple frames of radar point clouds.
  • the confidence that the target object S1 corresponding to each frame of the radar point cloud in the multi-frame radar point cloud belongs to the estimated category under the target threshold can be calculated.
  • the specific calculation method can refer to the introduction above, and will not be repeated here.
  • the confidence that the target object S1 belongs to the estimated category under the target threshold for each frame of the multi-frame radar point cloud can be multiplied by the corresponding weight and the products added; the resulting weighted average can be used as the confidence that the target object S1 belongs to its estimated category under its target threshold.
  • the weights of the frames in the multi-frame radar point cloud can be assigned according to the order of the acquisition times of the frames; for details, refer to the introduction above, which will not be repeated here. A sketch of both averaging schemes follows below.
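The two averaging schemes can be sketched as follows, with an assumed confidence threshold of 0.85 (the text's 85% example); all names are illustrative:

```python
import numpy as np

def estimated_category_confidence(per_frame_conf, weights=None):
    """per_frame_conf: the confidence that S1 belongs to its estimated
    category under the target threshold, one value per frame, ordered by
    acquisition time. Returns the simple average, or the weighted average
    when weights are given (weights assumed to sum to 1)."""
    c = np.asarray(per_frame_conf, dtype=float)
    if weights is None:
        return float(c.mean())              # simple average
    w = np.asarray(weights, dtype=float)
    return float((c * w).sum())             # weighted average

CONFIDENCE_THRESHOLD = 0.85                 # e.g. one of the preset values
conf = estimated_category_confidence([0.90, 0.88, 0.93], weights=[0.1, 0.3, 0.6])
lock_category = conf > CONFIDENCE_THRESHOLD   # True here (conf = 0.912)
```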
  • If the confidence that the target object S1 belongs to its estimated category is greater than the confidence threshold, the estimated category of the target object S1 is locked, and the target object S1 is no longer classified in the subsequently collected radar point clouds.
  • when the confidence that the target object S1 belongs to its estimated category is greater than the confidence threshold, locking the estimated category of the target object S1 reduces the computational overhead of invoking the classifier and improves detection efficiency.
  • Otherwise, in the track tracking stage of the target object S1, the estimated category of the target object S1 can be re-determined according to the radar point cloud newly acquired by the detection device.
  • after re-determining the estimated category of the target object S1, it can be determined whether the confidence that the target object S1 belongs to the re-determined estimated category is greater than the confidence threshold.
  • For example, suppose the confidence level of the estimated category of the target object S1 determined during the track establishment phase is less than or equal to the confidence threshold.
  • Then the radar point cloud P+2 can be used to re-determine the estimated category of the target object S1. If the confidence level of the estimated category re-determined from the radar point cloud P+2 is still less than or equal to the confidence threshold, the radar point cloud P+3 (not shown), whose acquisition time is after that of the radar point cloud P+2, can be used to re-determine the estimated category again. The foregoing process is repeated until the confidence of the newly determined estimated category of the target object S1 is greater than the confidence threshold or the track tracking of the target object S1 ends.
  • each time the estimated category is re-determined, the tracking threshold corresponding to the newly determined estimated category of the target object S1 can be used to frame that frame of radar point cloud; the detection points within the threshold are used to determine the track of the target object S1 and are kept from participating in the track establishment of other target objects.
  • For example, suppose again that the confidence level of the estimated category of the target object S1 determined in the track establishment phase is less than or equal to the confidence threshold.
  • Then the tracking threshold corresponding to the estimated category determined in the track establishment stage can be used for framing on the radar point cloud P+2.
  • Next, the tracking threshold corresponding to the estimated category re-determined from the radar point cloud P+2 can be used for framing on the radar point cloud P+3, and so on. A sketch of this lock-or-re-determine loop follows below.
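A sketch of this logic, under the assumption that `estimate_category` wraps the N-threshold / K-category estimation described earlier; both names and the default threshold are hypothetical:

```python
def track_target(frames, estimate_category, thresholds, confidence_threshold=0.85):
    """Sketch of the lock-or-re-determine logic (illustrative names only).

    estimate_category(frame) is assumed to run the N-threshold / K-category
    estimation described above and return (category, confidence);
    thresholds[category] is the preset tracking threshold for that category.
    """
    category, locked = None, False
    for frame in frames:                 # frames ordered by acquisition time
        if not locked:
            # re-determine the estimated category from the newly acquired frame
            category, confidence = estimate_category(frame)
            locked = confidence > confidence_threshold   # lock once confident
        target_threshold = thresholds[category]
        # ... frame the detection points of `frame` with target_threshold,
        # update the track of S1, and withhold those points from other tracks
    return category
```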
  • In summary, the method for determining the tracking threshold of a target object provided by the embodiments of the present application can determine the category of the target object and use the tracking threshold corresponding to that category as the target threshold for tracking the target object, so that the detection points framed by the target threshold no longer participate in the clustering of other target objects or the establishment of their tracks. In this way, detection points within the target threshold do not affect other tracks, interference between different tracks is eliminated or reduced, and the false detection rate is reduced.
  • FIGS. 9A and 9B show an actual verification result of the method for determining the tracking threshold of a target object provided by an embodiment of the present application.
  • the target threshold of the pedestrian is determined to be the threshold 920 by the method for determining the target object tracking threshold provided in the embodiment of the present application.
  • the threshold 920 is a preset tracking threshold corresponding to pedestrians in the embodiment of the application.
  • FIGS. 10A and 10B show another actual verification result of the method for determining the tracking threshold of a target object provided by an embodiment of the present application.
  • the target threshold of the bicycle is determined to be the threshold 1020 by the method for determining the target object tracking threshold provided by the embodiment of the present application.
  • the threshold 1020 is the preset tracking threshold corresponding to the bicycle in the embodiment of the application.
  • Figures 11A and 11B show another actual verification result of the method for determining the target object tracking threshold provided by an embodiment of the present application.
  • the target threshold of the car is determined to be the threshold 1120 through the method for determining the target object tracking threshold provided in the embodiment of the present application.
  • the threshold 1120 is the preset tracking threshold corresponding to the car in the embodiment of the application.
  • the three actual verification results shown above indicate that the method for determining the target object tracking threshold provided by the embodiments of the present application can accurately determine the category of the target object and use the tracking threshold corresponding to that category to track the target object.
  • an embodiment of the present application provides a method for determining a target object tracking threshold.
  • the method can be implemented by a detection device, such as a radar device.
  • the radar device may be a vehicle-mounted radar device, such as the radar 1084 shown in FIG. 4.
  • This method can also be implemented by a processing device integrated in the detection device, such as the signal processor in the radar 1084 shown in FIG. 4.
  • the method can also be implemented by a processing device independent of the detection device (for example, the control center 300, the computing system 102, etc.), and the processing result is fed back to the detection device.
  • the method may include the following steps.
  • Step 1201 Determine at least one frame of radar point cloud, where the at least one frame of radar point cloud is a point data set obtained by measuring a target object, and the at least one frame of radar point cloud includes the first frame of radar point cloud.
  • the detection device or other processing device may determine one or more frames of radar point cloud based on the original detection data collected by the detection device in one scan of the target object, or in multiple scans.
  • one frame of radar point cloud corresponds to one scan of the detection device.
  • Step 1203 Determine N tracking thresholds corresponding to the first frame of radar point cloud, where the N tracking thresholds include a first threshold, and determine K confidence levels based on the point cloud data in the first threshold, the K confidence levels correspond to K target categories one-to-one.
  • N tracking thresholds can be preset and used to respectively frame point cloud data (detection points) on the first frame of radar point cloud, thereby obtaining the N tracking thresholds corresponding to the first frame of radar point cloud.
  • the point cloud data in any one of the N tracking thresholds corresponding to the first frame of radar point cloud, for example the point cloud data in the first threshold (that is, the point cloud data framed by the first threshold), can be input into the K-class classifier to obtain K confidence levels.
  • for the other frames of radar point cloud, the above method can be repeated to obtain the K confidence levels corresponding to each of the N tracking thresholds of each frame.
  • Step 1205 Determine the first target category according to the K confidence levels.
  • Step 1207 Determine a target threshold for tracking the target object from the N tracking thresholds according to the first target category.
  • the tracking threshold corresponding to the first target category may be determined as the target threshold for tracking the target object.
  • the point cloud data used to establish the track of the target object can be framed by the target threshold, and the point cloud data framed by the target threshold does not participate in the track establishment of other target objects. A sketch of steps 1201 to 1207 for one frame follows below.
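The following sketch strings steps 1201 to 1207 together for a single frame, assuming the tracking thresholds are callables that frame point cloud data and the classifier returns K confidences; the signatures are illustrative, not the patent's API:

```python
import numpy as np

def determine_target_threshold(frame_point_cloud, thresholds, classifier):
    """End-to-end sketch of steps 1201-1207 for one frame of radar point
    cloud. `thresholds` is a list of N preset tracking thresholds, each a
    callable that frames (selects) point cloud data; `classifier` returns a
    length-K list of confidences for the framed data."""
    # Step 1203: frame the point cloud with each of the N tracking
    # thresholds and classify the framed data -> an (N, K) confidence array.
    conf = np.array([classifier(gate(frame_point_cloud)) for gate in thresholds])
    # Step 1205: determine the first target category from the K total
    # confidences (sum over the N thresholds, then take the largest).
    first_target_category = int(conf.sum(axis=0).argmax())
    # Step 1207: the tracking threshold corresponding to that category
    # becomes the target threshold used to track the target object.
    return thresholds[first_target_category]
```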
  • the K confidence levels include a first confidence level corresponding to the first target category, and the first confidence level is used to characterize the accuracy with which the point cloud data in the first threshold belongs to the first target category.
  • any one of the K confidence levels corresponding to threshold 1 in the N tracking thresholds is used to characterize the accuracy with which the point cloud data in threshold 1 belongs to the target category corresponding to that confidence level.
  • For example, confidence level 1 represents the accuracy with which the point cloud data in threshold 1 belongs to the target category corresponding to confidence level 1,
  • confidence level 2 represents the accuracy with which the point cloud data in threshold 1 belongs to the target category corresponding to confidence level 2,
  • ..., and confidence level K represents the accuracy with which the point cloud data in threshold 1 belongs to the target category corresponding to confidence level K.
  • similarly, any one of the K confidence levels corresponding to threshold 2 in the N tracking thresholds is used to characterize the accuracy with which the point cloud data in threshold 2 belongs to the target category corresponding to that confidence level, and the same holds for threshold 3, ..., and threshold N.
  • the N tracking thresholds are determined according to preset parameter information, and are used to define a range corresponding to the target object in the radar point cloud of the first frame.
  • Different tracking thresholds may be determined according to preset parameter information corresponding to different target categories. It can be understood that target objects belonging to the same target category have similar measurement ranges or point data ranges.
  • the parameter information can be set according to the measurement range or the point data range of the target object under the same target category, and then the corresponding tracking threshold can be determined according to the parameter information.
  • the range of the target object defined by the tracking threshold may be the range of the point data of the target object or the measurement range of the target object. Defining the point data range or measurement range of the target object by the tracking threshold makes it easier for the classifier to classify the target object.
  • the parameter information includes geometric size information of a preset target category and/or speed information of a preset target category. Target objects belonging to the same target category have similar geometric dimensions and speeds. Through the geometric size information and/or speed information of a target category, the tracking threshold corresponding to that target category can be determined, as in the sketch below.
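As an illustration, preset parameter information can be represented as per-category size and speed limits; the numeric values below are assumptions for the sketch, not values from the patent:

```python
from dataclasses import dataclass

@dataclass
class TrackingThreshold:
    """A tracking threshold built from preset parameter information
    (geometric size and speed of a target category)."""
    category: str
    max_extent_m: float      # geometric size information
    max_speed_mps: float     # speed information

    def contains(self, point, center, speed) -> bool:
        """True if a detection point falls within this threshold when the
        threshold is centered on the cluster center point."""
        dx, dy = point[0] - center[0], point[1] - center[1]
        within_size = (dx * dx + dy * dy) ** 0.5 <= self.max_extent_m
        return within_size and speed <= self.max_speed_mps

# Illustrative presets (assumed sizes/speeds, one threshold per category)
PRESET_THRESHOLDS = [
    TrackingThreshold("pedestrian", 1.0, 3.0),
    TrackingThreshold("bicycle",    2.0, 10.0),
    TrackingThreshold("car",        5.0, 40.0),
]
```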
  • In a possible implementation, if the confidence level corresponding to the first target category among the K target categories is the largest and is greater than a preset first confidence threshold (for example, 95%), the first target category can be determined as the target category of the target object.
  • In a possible implementation, the first target category is determined according to K total confidence levels. The K total confidence levels include a first total confidence level, which is the sum of N first confidence levels, and the N first confidence levels correspond one-to-one to the N tracking thresholds.
  • the point cloud data in each of the N tracking thresholds is input into the K-class classifier, so that each tracking threshold corresponds to K confidence levels.
  • the sum of the first confidence levels across the N tracking thresholds can be obtained; similarly, the sum of the second confidence levels, ..., and the sum of the K-th confidence levels can be obtained.
  • the first target category can then be determined according to these K sums.
  • the K sums may be compared, and the target category corresponding to the largest sum may be used as the first target category.
  • In a possible implementation, the first target category is determined according to K multi-frame total confidence levels. The K multi-frame total confidence levels include a first multi-frame total confidence level, which is the sum of at least one first total confidence level, and the at least one first total confidence level corresponds one-to-one to the at least one frame of radar point cloud. The first total confidence level corresponding to the first frame of radar point cloud is the sum of the N first confidence levels, which correspond one-to-one to the N tracking thresholds.
  • for the first frame of radar point cloud, the sum of each of the K confidence levels across the N tracking thresholds can be determined to obtain K total confidence levels.
  • K total confidences corresponding to each frame of radar point cloud in at least one frame of radar point cloud can be obtained.
  • the first total confidence levels corresponding to the frames of the at least one frame of radar point cloud are added to obtain the first multi-frame total confidence level.
  • the multi-frame total confidence level corresponding to each of the K total confidence levels can be obtained, that is, the K multi-frame total confidence levels can be obtained.
  • the target category corresponding to the largest multi-frame total confidence among the K multi-frame total confidences may be determined as the first target category.
  • the K target categories include two or more of the following: pedestrians, cars, bicycles, and electric vehicles.
  • the at least one frame of radar point cloud is a millimeter wave radar point cloud.
  • the method for determining the tracking threshold of a target object provided by the embodiments of the present application can determine the category of the target object and use the tracking threshold corresponding to that category as the target threshold for tracking the target object, so that the detection points framed by the target threshold no longer participate in the clustering of other target objects or the establishment of their tracks; detection points within the target threshold thus do not affect other tracks, interference between different tracks is eliminated or reduced, and the false detection rate is reduced.
  • an embodiment of the present application provides an apparatus for determining a tracking threshold of a target object.
  • the device may include a processor 1310 and a transceiver 1320.
  • the processor 1310 executes computer instructions to make the device execute the method shown in FIG. 12.
  • the processor 1310 may determine at least one frame of radar point cloud, where the at least one frame of radar point cloud is a point data set obtained by measuring a target object, and the at least one frame of radar point cloud includes the first frame of radar point cloud;
  • the processor 1310 may determine N tracking thresholds corresponding to the first frame of radar point cloud, where the N tracking thresholds include a first threshold, and determine K confidence levels based on the point cloud data in the first threshold, so The K confidence levels correspond one-to-one with the K target categories.
  • the processor 1310 may determine the first target category according to the K confidence levels.
  • the processor 1310 may determine a target threshold for tracking the target object from the N tracking thresholds according to the first target category.
  • the device further includes a memory 1330.
  • the memory 1330 may be used to store the foregoing computer instructions, and may also be used to store a classifier and the like.
  • the device further includes a communication bus 1340, through which the processor 1310 can be connected to the transceiver 1320 and the memory 1330, so that the processor 1310 can execute the computer instructions stored in the memory 1330 and correspondingly control the transceiver 1320 and other components.
  • the category of the target object can be determined, and the tracking threshold corresponding to that category can be used as the target threshold for tracking the target object, so that the detection points framed by the target threshold no longer participate in the clustering of other target objects or the establishment of their tracks; detection points within the target threshold thus do not affect other tracks, interference between different tracks is eliminated or reduced, and the false detection rate is reduced.
  • the processor in the embodiments of the present application may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof.
  • the general-purpose processor may be a microprocessor or any conventional processor.
  • an embodiment of the present application provides an apparatus 1400 for determining a target object tracking threshold.
  • the device 1400 includes a processing unit 1410 and a transceiver unit 1420.
  • the transceiver unit 1420 is configured to determine at least one frame of radar point cloud, the at least one frame of radar point cloud is a point data set obtained by measuring a target object, and the at least one frame of radar point cloud includes the first frame of radar point cloud;
  • the processing unit 1410 is configured to determine N tracking thresholds corresponding to the first frame of radar point cloud, the N tracking thresholds include a first threshold, and K confidence levels are determined according to the point cloud data in the first threshold, and the K confidence levels One-to-one correspondence with K target categories;
  • the processing unit 1410 is further configured to determine the first target category according to the K confidence levels
  • the processing unit 1410 is further configured to determine a target threshold for tracking the target object from the N tracking thresholds according to the first target category.
  • the apparatus for determining the tracking threshold of a target object provided by the embodiment of the present application can determine the category of the target object and use the tracking threshold corresponding to that category as the target threshold for tracking the target object, so that the detection points framed by the target threshold no longer participate in the clustering of other target objects or the establishment of their tracks; detection points within the target threshold thus do not affect other tracks, interference between different tracks is eliminated or reduced, and the false detection rate is reduced.
  • the method steps in the embodiments of the present application can be implemented by hardware, and can also be implemented by a processor executing software instructions.
  • Software instructions can be composed of corresponding software modules, which can be stored in random access memory (RAM), flash memory, read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium well known in the art.
  • An exemplary storage medium is coupled to the processor, so that the processor can read information from the storage medium and write information to the storage medium.
  • the storage medium may also be an integral part of the processor.
  • the processor and the storage medium may be located in the ASIC.
  • All or part of the above embodiments may be implemented by software, hardware, firmware, or any combination thereof.
  • When software is used, the embodiments may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted through the computer-readable storage medium.
  • the computer instructions can be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (such as infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or a data center integrated with one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).

Abstract

The present invention relates to a method and apparatus for determining a target object tracking threshold. The method comprises: determining at least one frame of radar point cloud, the at least one frame of radar point cloud being a point data set obtained by measuring a target object and comprising a first frame of radar point cloud (1201); determining N tracking thresholds corresponding to the first frame of radar point cloud, the N tracking thresholds comprising a first threshold, and determining K confidence levels according to the point cloud data in the first threshold, the K confidence levels being in one-to-one correspondence with K target categories (1203); determining a first target category according to the K confidence levels (1205); and determining, from the N tracking thresholds and according to the first target category, a target threshold for tracking the target object (1207). By means of the method and apparatus, the category of a target object can be determined and the target object can then be tracked with the tracking threshold of that category, so that interference between the tracks of different target objects can be effectively reduced and the false detection rate can be reduced.
PCT/CN2020/136718 2019-12-16 2020-12-16 Method and apparatus for determining a target object tracking threshold WO2021121247A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911294731.0A CN113064153B (zh) 2019-12-16 2019-12-16 Method and apparatus for determining a target object tracking threshold
CN201911294731.0 2019-12-16

Publications (1)

Publication Number Publication Date
WO2021121247A1 true WO2021121247A1 (fr) 2021-06-24

Family

ID=76477088

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/136718 WO2021121247A1 (fr) 2019-12-16 2020-12-16 Method and apparatus for determining a target object tracking threshold

Country Status (2)

Country Link
CN (1) CN113064153B (fr)
WO (1) WO2021121247A1 (fr)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7003136B1 (en) * 2002-04-26 2006-02-21 Hewlett-Packard Development Company, L.P. Plan-view projections of depth image data for object tracking
CN105957106A (zh) * 2016-04-26 2016-09-21 湖南拓视觉信息技术有限公司 三维目标跟踪的方法和装置
CN108427112A (zh) * 2018-01-22 2018-08-21 南京理工大学 一种改进的多扩展目标跟踪方法
CN110488815A (zh) * 2019-08-01 2019-11-22 广州小鹏汽车科技有限公司 一种车辆的路径跟踪方法及路径跟踪系统
CN110488273A (zh) * 2019-08-30 2019-11-22 成都纳雷科技有限公司 一种基于雷达的车辆跟踪检测方法及装置

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113734046A (zh) * 2021-08-17 2021-12-03 厦门星图安达科技有限公司 基于雷达的车内位置分区人员检测方法、装置以及设备
CN113734046B (zh) * 2021-08-17 2023-09-19 江苏星图智能科技有限公司 基于雷达的车内位置分区人员检测方法、装置以及设备
CN114509720A (zh) * 2022-01-18 2022-05-17 国网河北省电力有限公司信息通信分公司 一种电网设备室内定位方法、装置及终端设备

Also Published As

Publication number Publication date
CN113064153A (zh) 2021-07-02
CN113064153B (zh) 2024-01-02


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20903508

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20903508

Country of ref document: EP

Kind code of ref document: A1