WO2021164514A1 - Target recognition method and device - Google Patents

Target recognition method and device

Info

Publication number
WO2021164514A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
target recognition
vehicle
target
radar
Prior art date
Application number
PCT/CN2021/073984
Other languages
English (en)
French (fr)
Inventor
康文武
崔天翔
刘兴业
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2021164514A1 publication Critical patent/WO2021164514A1/zh

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50 Systems of measurement based on relative movement of target
    • G01S13/52 Discriminating between fixed and moving objects or between objects moving at different speeds
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/415 Identification of targets based on measurements of movement associated with the target
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/12 Target-seeking control

Definitions

  • This application relates to the field of communications, and in particular to a target recognition method and device.
  • target recognition can provide information about targets around the vehicle as a reference for driving decisions, such as acceleration or deceleration, lane changes, steering, and driving uphill or downhill. Target recognition is therefore a core issue in autonomous driving scenarios, and it plays a vital role in the safety and driving experience of autonomous driving.
  • target recognition can be achieved through the following scheme: the raw data collected by the vehicle-mounted radar is subjected to a two-dimensional discrete fast Fourier transform (2DFFT) and non-coherent integration (NCI) to obtain range Doppler map (RD-Map) data; the RD-Map data then undergoes signal processing operations such as constant false-alarm rate (CFAR) detection, peak grouping, and direction-of-arrival (DOA) estimation to obtain sparse point cloud (SPC) data; and targets are identified based on the sparse point cloud data.
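As a rough illustration of the conventional chain just described, the sketch below runs a 2DFFT over simulated raw radar data, applies non-coherent integration across receive channels, and performs a simplified cell-averaging CFAR detection. All array sizes, the CFAR window parameters, and the threshold scale are illustrative assumptions, not values from this application.

```python
import numpy as np

# Simulated raw radar cube: (channels, chirps, samples) -- illustrative sizes.
rng = np.random.default_rng(0)
raw = rng.normal(size=(4, 64, 128)) + 1j * rng.normal(size=(4, 64, 128))

# 2DFFT: range FFT over samples (fast time), Doppler FFT over chirps (slow time).
rd = np.fft.fft(np.fft.fft(raw, axis=2), axis=1)

# Non-coherent integration (NCI): accumulate magnitudes across channels.
rd_map = np.abs(rd).sum(axis=0)          # range Doppler map, shape (64, 128)

def ca_cfar(rd_map, guard=2, train=4, scale=3.0):
    """Simplified cell-averaging CFAR: compare each cell against the mean of
    its training cells (guard cells and the cell under test are excluded)."""
    det = np.zeros_like(rd_map, dtype=bool)
    w = guard + train
    for i in range(w, rd_map.shape[0] - w):
        for j in range(w, rd_map.shape[1] - w):
            window = rd_map[i - w:i + w + 1, j - w:j + w + 1].copy()
            # zero out the central (guard + CUT) region before averaging
            window[train:train + 2 * guard + 1, train:train + 2 * guard + 1] = 0
            noise = window.sum() / (window.size - (2 * guard + 1) ** 2)
            det[i, j] = rd_map[i, j] > scale * noise
    return det

detections = ca_cfar(rd_map)
```

Peak grouping and DOA estimation would then turn the surviving cells into the sparse point cloud; those steps are omitted here.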
  • the above-mentioned vehicle-mounted radars used for target recognition are usually low-resolution millimeter wave radars, and the amount of sparse point cloud data obtained is insufficient, resulting in a high false detection rate.
  • for example, a bicycle may be misidentified as a car, resulting in poor target recognition accuracy.
  • the embodiments of the present application provide a target recognition method and device, which can solve the problem of insufficient amount of sparse point cloud data caused by the low resolution of the vehicle-mounted millimeter wave radar, and can improve the accuracy and success rate of target recognition.
  • in a first aspect, a target recognition method includes: obtaining second data according to first data.
  • the first data is the range Doppler map data obtained according to the radar measurement data
  • the second data is the sparse point cloud data of the radar.
  • the third data is determined based on the second data.
  • the third data and the second data are used for target recognition.
  • in this way, the sparse point cloud data of the radar can be obtained from the first data, namely the range Doppler map data, and then both the third data and the second data, that is, more data, can be used for target recognition, which can improve the accuracy and success rate of target recognition.
  • determining the third data according to the second data may include: identifying moving target point data in the second data, where the moving target point data is sparse point cloud data whose speed is greater than a speed threshold; then obtaining first range Doppler map data from the first data according to the moving target point data; then obtaining fourth data from the first range Doppler map data according to a first data selection pattern; and finally determining the third data, which is the sparse point cloud data corresponding to the fourth data. In this way, more sparse point cloud data can be used for target recognition to improve the accuracy and success rate of target recognition.
  • the first range Doppler map data is: range Doppler map data whose amplitude difference from the range Doppler map data corresponding to the moving target point data is less than or equal to an amplitude difference threshold.
  • in this way, from the area where the moving target point is located, the range Doppler map data whose energy is second only to (closest to) that of the moving target point data can be selected, and the corresponding sparse point cloud data can be used for target recognition, to further improve the accuracy and success rate of target recognition.
  • the above-mentioned obtaining of the fourth data from the first range Doppler map data according to the first data selection pattern may include: obtaining at least one set of fifth data from the first range Doppler map data according to multiple data selection patterns, where the fourth data is the set with the largest entropy value among the at least one set of fifth data, and the first data selection pattern is the data selection pattern corresponding to the fourth data.
  • in this way, the group of range Doppler data with the most uniform energy distribution can be extracted as the fourth data, to further improve the accuracy and success rate of target recognition.
  • X_new is the third data
  • X_rnd is the second data
  • X_new and X_rnd correspond to the same target point
  • X_mean is the data average of all target points in the second data
  • rand(0,1) is a random number generating function
  • in a second aspect, a target recognition device includes: an acquisition unit and a determination unit.
  • the acquiring unit is configured to acquire second data according to the first data; the first data is range Doppler map data obtained according to radar measurement data, and the second data is sparse point cloud data of the radar.
  • the determining unit is used for determining the third data according to the second data; the third data and the second data are used for target identification.
  • the target recognition device provided in the second aspect may further include: a recognition unit.
  • the identification unit is used to identify the moving target point data in the second data; the moving target point data is sparse point cloud data with a speed greater than a speed threshold.
  • the acquiring unit is further configured to acquire first range Doppler map data from the first data according to the moving target point data.
  • the acquiring unit is further configured to acquire fourth data from the first range Doppler map data according to a first data selection pattern.
  • the determining unit is further configured to determine third data, where the third data is the sparse point cloud data corresponding to the fourth data.
  • the first range Doppler map data is: range Doppler map data whose amplitude difference from the range Doppler map data corresponding to the moving target point data is less than or equal to an amplitude difference threshold.
  • the acquiring unit is further configured to acquire at least one set of fifth data from the first range Doppler map data according to multiple data selection patterns, where the fourth data is the set with the largest entropy value among the at least one set of fifth data,
  • the first data selection pattern is the selection pattern corresponding to the fourth data.
  • the target recognition device provided by the second aspect may further include: a generating unit.
  • X_new is the third data
  • X_rnd is the second data
  • X_new and X_rnd correspond to the same target point
  • X_mean is the data average of all target points in the second data
  • rand(0,1) is a random number generating function
  • acquisition unit, determination unit, recognition unit, and generation unit may also be integrated into one or more processing units, and the one or more processing units are used to execute the target recognition method provided in the first aspect.
  • the target recognition device provided in the second aspect may further include a storage unit, and the storage unit stores a program or an instruction.
  • the processing unit executes the program or instruction
  • the target recognition device provided by the second aspect can execute the target recognition method provided by the first aspect.
  • the target recognition device provided in the second aspect may further include a transceiver unit.
  • the transceiver unit may include an input/output port, and the input/output port may be used to communicate with various vehicle-mounted sensors, such as receiving raw data collected by the vehicle-mounted sensor, sending data collection instructions, and so on.
  • the target recognition device provided by the second aspect can be a vehicle-mounted terminal or a vehicle-mounted radar with a target recognition function, or a chip set in a vehicle-mounted terminal or a vehicle-mounted radar with a target recognition function, or other chips with a target recognition function. This application does not limit the components.
  • in a third aspect, a target recognition device includes a processor coupled with a memory; the memory is used to store a computer program, and the processor is used to execute the computer program stored in the memory, so that the target recognition device executes the target recognition method provided in the first aspect.
  • the target recognition device provided in the third aspect may further include a transceiver.
  • the transceiver can be a transceiver circuit or an input/output port.
  • the transceiver can be used for the target identification device to communicate with other devices.
  • the target recognition device is a vehicle-mounted terminal, and the transceiver can be used for communication between the vehicle-mounted terminal and the vehicle-mounted radar.
  • the target recognition device provided in the third aspect may be a vehicle-mounted terminal or vehicle-mounted radar, or a chip or other component with a target recognition function provided in the vehicle-mounted terminal or vehicle-mounted radar.
  • in a fourth aspect, a vehicle-mounted radar includes the target recognition device provided by the second aspect or the third aspect.
  • in a fifth aspect, a vehicle-mounted terminal includes the target recognition device provided by the second aspect or the third aspect.
  • the vehicle-mounted terminal may be a vehicle-mounted controller.
  • in a sixth aspect, a target recognition system includes the target recognition device provided by the second aspect or the third aspect.
  • the target recognition system may also include various vehicle-mounted sensors, such as vehicle-mounted radar, vehicle-mounted camera, and infrared imaging equipment.
  • in a seventh aspect, a vehicle includes the vehicle-mounted radar provided in the fourth aspect or the vehicle-mounted terminal provided in the fifth aspect, or the vehicle includes the target recognition system provided in the sixth aspect.
  • a computer-readable storage medium stores computer instructions; when the computer instructions run on a computer, the computer is caused to execute the target recognition method provided in the first aspect.
  • a computer program product includes a computer program or instructions; when the computer program or instructions run on a computer, the computer executes the target recognition method provided in the first aspect.
  • FIG. 1 is a schematic diagram of the architecture of a target recognition system provided by an embodiment of the application
  • FIG. 2 is a first structural diagram of a target recognition device provided by an embodiment of this application.
  • FIG. 3 is a schematic flowchart of a target recognition method provided by an embodiment of the application.
  • FIG. 4 is a schematic diagram 1 of a target recognition scenario provided by an embodiment of the application.
  • FIG. 5 is a second schematic diagram of a target recognition scenario provided by an embodiment of this application.
  • FIG. 6 is a second structural diagram of the target recognition device provided by an embodiment of the application.
  • the target recognition method provided in the embodiments of this application can be applied to a vehicle-mounted target recognition system, such as a vehicle-mounted radar (system), a vehicle control system containing the vehicle-mounted radar (system), or a vehicle containing the vehicle-mounted radar (system) or vehicle control system.
  • FIG. 1 is a schematic structural diagram of a target recognition system to which the target recognition method provided in an embodiment of the application is applicable.
  • the target recognition system shown in FIG. 1 is taken as an example to describe in detail the target recognition system applicable to the embodiments of the present application.
  • the solutions in the embodiments of the present application can also be applied to other target recognition systems, and the corresponding names can also be replaced with the names of corresponding functions in other target recognition systems.
  • the target recognition system includes a target recognition device.
  • the target recognition device may be a vehicle-mounted radar or camera with a target recognition processing function, or a device that is coupled with the vehicle-mounted radar or camera and can perform target recognition based on data collected by the vehicle-mounted radar or camera.
  • the above-mentioned target recognition device is a device that can be installed in a vehicle and used to identify targets around the vehicle, or a chip or other component that can be installed in the device.
  • the target recognition device includes, but is not limited to: a vehicle-mounted terminal, vehicle-mounted controller, vehicle-mounted module, vehicle-mounted component, vehicle-mounted chip, vehicle-mounted unit, or a vehicle-mounted radar or camera with a target recognition processing function, and the vehicle can implement the target recognition method provided in this application through any of these vehicle-mounted devices.
  • target recognition method provided in the embodiments of the present application can also be applied to scenes other than vehicles that require target recognition, such as toll stations, parking lots, and so on.
  • the above-mentioned vehicle-mounted sensor is used to collect the identification information of the object to be identified, such as collecting information of pedestrians, bicycles, motorcycles, and other vehicles, so that the object identification device can identify the object to be identified.
  • the above-mentioned on-board sensors can also be used to collect the vehicle's own driving status, such as speed, position, uphill or downhill travel, and steering angle, so that the target recognition device controls the actions of the vehicle accordingly, such as starting or stopping, accelerating or decelerating, and steering.
  • Vehicle-mounted sensors include but are not limited to: vehicle-mounted millimeter wave radar, vehicle-mounted infrared radar, gyroscope, speed sensor, camera, etc.
  • FIG. 1 is only a simplified schematic diagram of an example for ease of understanding, and the target recognition system may also include other devices and/or other sensors, which are not shown in FIG. 1.
  • FIG. 2 is a schematic structural diagram of a target recognition device 200 that can be used to implement the target recognition method provided by the embodiment of the present application.
  • the target recognition apparatus 200 may include a processor 201, a memory 202, and a transceiver 203.
  • the processor 201 is coupled with the memory 202 and the transceiver 203, for example, can be connected through a communication bus.
  • each component of the target recognition device 200 will be specifically introduced with reference to FIG. 2.
  • the processor 201 is the control center of the target recognition device 200, and may be a processor or a collective name for multiple processing elements.
  • the processor 201 may be one or more central processing units (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application, for example: one or more microprocessors (digital signal processors, DSP), or one or more field programmable gate arrays (FPGA).
  • the processor 201 can execute various functions of the target recognition apparatus 200 by running or executing a software program stored in the memory 202 and calling data stored in the memory 202.
  • the processor 201 may include one or more CPUs, such as CPU0 and CPU1 shown in FIG. 2.
  • the target recognition apparatus 200 may also include multiple processors, such as the processor 201 and the processor 204 shown in FIG. 2. Each of these processors can be a single-core processor (single-CPU) or a multi-core processor (multi-CPU).
  • the processor here may refer to one or more communication devices, circuits, and/or processing cores for processing data (for example, computer program instructions).
  • the memory 202 can be a read-only memory (ROM) or another type of static storage device that can store static information and instructions, a random access memory (RAM) or another type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited to these.
  • the memory 202 can be integrated with the processor 201, or can exist independently and be coupled with the processor 201 through the input/output port (not shown in FIG. 2) of the target recognition device 200; this is not specifically limited in the embodiment of the present application.
  • the memory 202 is used to store a software program for executing the solution of the present application, and the processor 201 controls the execution.
  • the transceiver 203 is used for communication with other target recognition devices.
  • the target recognition apparatus 200 is a terminal device, and the transceiver 203 may be used to communicate with a network device or to communicate with another terminal device.
  • the target recognition apparatus 200 is a network device, and the transceiver 203 may be used to communicate with a terminal device or to communicate with another network device.
  • the transceiver 203 may include a receiver and a transmitter (not separately shown in FIG. 2). Among them, the receiver is used to realize the receiving function, and the transmitter is used to realize the sending function.
  • the transceiver 203 can be integrated with the processor 201, or can exist independently and be coupled with the processor 201 through the input/output port (not shown in FIG. 2) of the target recognition device 200; this is not specifically limited in this embodiment of the application.
  • the target recognition device may be a vehicle-mounted radar or camera with a target recognition function.
  • the target recognition device may also be a vehicle-mounted terminal, and the transceiver may be used for the vehicle-mounted terminal to communicate with a vehicle-mounted radar or camera, such as sending data collection instructions and receiving collected data as follows.
  • the structure of the target recognition device 200 shown in FIG. 2 does not constitute a limitation on the target recognition device.
  • an actual target recognition device may include more or fewer components than shown in the figure, may combine some components, or may arrange the components differently.
  • FIG. 3 is a first schematic flowchart of a target recognition method provided by an embodiment of this application.
  • the target recognition method can be applied to the target recognition system shown in FIG. 1.
  • the target recognition method includes the following steps:
  • S301: Acquire second data according to the first data.
  • the first data is the range Doppler map data obtained according to the radar measurement data
  • the second data is the sparse point cloud data of the radar.
  • the radar measurement data may include measurement data of multiple dimensions, such as horizontal and vertical coordinates, velocity, and radar cross-section (RCS).
  • the measurement data of the multiple dimensions may be arranged in a preset manner, for example, as a two-dimensional matrix.
  • the two-dimensional matrix can be a 3×20 matrix; that is, the three row vectors of the two-dimensional matrix correspond one-to-one to the three dimensions, and each row includes the radar measurement data of 20 identification points in the same dimension.
  • the two-dimensional matrix can also be a 20×3 matrix; that is, the three column vectors of the two-dimensional matrix correspond one-to-one to the three dimensions, and each column includes the measurement data of 20 identification points in the same dimension.
  • the embodiment of the present application does not specifically limit the manner in which the radar measurement data is expressed.
  • the radar measurement data is the raw data collected by the vehicle-mounted radar, and one or more signal processing operations, such as 2DFFT and NCI, are required to obtain the first data. Then, CFAR detection, peak extraction, and angle-of-arrival estimation are performed on the first data to obtain the second data. Because some target points are filtered out in this process, the amount of second data will be less than the amount of radar measurement data; that is, the number of target points corresponding to the second data will be less than the number of target points corresponding to the radar measurement data. Taking the above-mentioned 3×20 two-dimensional matrix as an example, if 5 target points correspond to the second data, the second data is a 3×5 two-dimensional matrix.
  • the amount of second data is usually smaller than the amount of first data and radar measurement data, and the target recognition result is obtained based on the second data. Therefore, an insufficient amount of second data reduces the accuracy of target recognition, especially when the vehicle-mounted radar is a low-resolution millimeter-wave radar, since the amount of radar measurement data collected is then inherently small, making this problem particularly prominent.
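The data shrinkage described above can be illustrated with the 3×20 example: a minimal sketch, in which the surviving target-point indices are arbitrary placeholders for whatever CFAR detection and peak grouping keep.

```python
import numpy as np

rng = np.random.default_rng(1)

# Radar measurement data: 3 dimensions x 20 identification points.
radar_meas = rng.normal(size=(3, 20))

# Suppose CFAR detection / peak grouping keep only 5 target points
# (indices chosen arbitrarily for illustration):
kept = np.array([2, 5, 9, 14, 17])
second_data = radar_meas[:, kept]   # sparse point cloud: a 3 x 5 matrix
```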
  • S302: Determine third data according to the second data.
  • the above S302, determining the third data according to the second data may include the following steps 1 to 4:
  • Step 1: Identify the moving target point data in the second data.
  • the moving target point data is sparse point cloud data with a speed greater than the speed threshold, and the moving target point data is peak data.
  • the moving target point can be identified according to the first speed or the second speed of the identified point in the second data.
  • the first speed can be the vehicle's traveling speed
  • the vehicle's traveling speed can be the absolute traveling speed of the vehicle, such as 100 kilometers per hour (km/h), or the traveling speed of the vehicle relative to a reference object.
  • the second speed may be the speed between different parts of the vehicle, such as the speed between different wheels, the speed between the wheels and the vehicle body, and so on.
  • different speed thresholds can be set for the above-mentioned first speed and second speed, and the moving target points in the second data can be identified according to these thresholds, so as to obtain the moving target point data, that is, the data of the moving targets.
  • the moving target point data is usually the maximum value among the data of all identification points in the area where the moving target point is located; this area can be determined according to preset conditions.
  • the area where the moving target point is located can be a circular, elliptical, rectangular, triangular, or other geometric area with the moving target point as its geometric center.
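Step 1 can be sketched as a simple threshold filter over the speed dimension of the sparse point cloud. The (x, y, speed) row layout and the threshold value below are assumptions for illustration, not values from this application.

```python
import numpy as np

# Sparse point cloud (second data): columns are target points,
# rows assumed to be (x, y, speed) for illustration.
second_data = np.array([
    [1.0, 2.0, 3.0, 4.0, 5.0],     # x coordinate
    [0.5, 0.7, 0.9, 1.1, 1.3],     # y coordinate
    [0.2, 8.0, 0.1, 12.5, 0.3],    # speed (m/s)
])

SPEED_THRESHOLD = 5.0   # illustrative; the text sets thresholds per speed type

# Moving target point data: points whose speed exceeds the threshold.
moving_mask = second_data[2] > SPEED_THRESHOLD
moving_points = second_data[:, moving_mask]
```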
  • Step 2: Acquire first range Doppler map data from the first data according to the moving target point data.
  • the first range Doppler map data may be: range Doppler map data whose amplitude difference from the range Doppler map data corresponding to the moving target point data is less than or equal to the amplitude difference threshold.
  • FIG. 4 is a schematic diagram 1 of a target recognition scene provided by an embodiment of this application.
  • in the left image, a square area is centered on the range Doppler map data corresponding to one piece of moving target point data and includes 5 identification points in each of the abscissa and ordinate directions; that is, the square area is a 5×5 square with a total of 25 identification points.
  • within the square area, the identification points whose signal amplitude differs from the signal amplitude of the moving target point by less than or equal to the amplitude difference threshold may be determined as the first range Doppler map data.
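The amplitude-difference selection of step 2, over the 5×5 square area of FIG. 4, might look like the following sketch; the map size, peak location, and threshold are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
rd_map = rng.uniform(0.0, 1.0, size=(64, 128))   # simulated RD map amplitudes

# Moving target point at (Doppler, range) cell (30, 60); make it the local peak.
peak = (30, 60)
rd_map[peak] = 5.0

AMP_DIFF_THRESHOLD = 4.5   # illustrative amplitude difference threshold

# 5x5 square area centered on the moving target point (as in FIG. 4).
i, j = peak
window = rd_map[i - 2:i + 3, j - 2:j + 3]

# First range Doppler map data: cells whose amplitude differs from the
# peak amplitude by no more than the threshold (the peak itself qualifies).
mask = (rd_map[peak] - window) <= AMP_DIFF_THRESHOLD
first_rd_cells = window[mask]
```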
  • Step 3: Acquire fourth data from the first range Doppler map data according to a first data selection pattern.
  • acquiring the fourth data from the first range Doppler map data according to the first data selection pattern may include:
  • acquiring at least one set of fifth data from the first range Doppler map data according to multiple data selection patterns, where the fourth data is the set with the largest entropy value among the at least one set of fifth data, and the first data selection pattern is the data selection pattern corresponding to the fourth data.
  • a group of range Doppler data can be selected as fifth data according to each of data selection pattern 1 to data selection pattern 3 shown in the figure. Then, the entropy value of each group of fifth data is calculated separately, the group of fifth data with the largest entropy value is selected as the fourth data, and the data selection pattern corresponding to that group is called the first data selection pattern. For example, assuming that the group of range Doppler data corresponding to data selection pattern 2 shown in FIG. 4 has the largest entropy value, data selection pattern 2 is the first data selection pattern. For the calculation of the entropy value, reference may be made to the prior art; details are not repeated in the embodiment of the present application.
  • in this way, when there are multiple groups of first range Doppler map data, the group whose energy is second only to (closest to) that of the moving target point data, that is, the group with the largest entropy value and the most uniform energy distribution, can be selected from the area where the moving target point is located and used for target recognition, to further improve the accuracy and success rate of target recognition.
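The entropy-based choice in step 3 can be sketched as follows. The text's criterion is the largest entropy value (most uniform energy distribution); the specific normalization used here, and the candidate amplitude values, are assumptions for illustration.

```python
import numpy as np

def entropy(group):
    """Shannon entropy of a group of RD amplitudes treated as a distribution.
    (Entropy as the uniformity measure follows the text; this particular
    normalization is an assumed implementation choice.)"""
    p = np.asarray(group, dtype=float)
    p = p / p.sum()
    return float(-(p * np.log(p + 1e-12)).sum())

# Three candidate data selection patterns, each picking a different group of
# fifth data from the first range Doppler map data (values illustrative).
candidates = {
    "pattern 1": [5.0, 0.1, 0.1, 0.1],   # dominated by one cell -> low entropy
    "pattern 2": [1.2, 1.1, 1.0, 1.3],   # most uniform -> largest entropy
    "pattern 3": [3.0, 0.5, 0.5, 1.0],
}

best = max(candidates, key=lambda k: entropy(candidates[k]))
fourth_data = candidates[best]   # the fifth-data group with the largest entropy
# `best` names the first data selection pattern
```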
  • Step 4: Determine the third data.
  • the fourth data is the set of range Doppler data with the largest entropy value selected in step 3 above, that is, the set of range Doppler data corresponding to the first data selection pattern.
  • the third data is sparse point cloud data corresponding to the fourth data.
  • angle-of-arrival estimation can be performed on the set of range Doppler map data corresponding to data selection pattern 2, so as to obtain the third data.
  • The third data can also be generated according to the formula X_new = X_rnd + rand(0,1) * (X_mean - X_rnd), where:
  • X_new is the third data;
  • X_rnd is the second data, and X_new and X_rnd correspond to the same target point;
  • X_mean is the data average of all target points in the second data;
  • rand(0,1) is a random number generation function, with 0 < rand(0,1) < 1.
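A minimal Python sketch of the formula above; the 3×n point layout (rows x, y, speed) and the per-point random factor are assumptions for illustration. Each generated point lies on the segment between an existing target point and the centroid of all target points, so the synthetic points stay inside the existing point cloud:

```python
import numpy as np

def generate_third_data(second_data, rng=None):
    """Generate synthetic sparse point cloud data (the "third data") from the
    existing sparse point cloud (the "second data") using
    X_new = X_rnd + rand(0,1) * (X_mean - X_rnd), 0 < rand(0,1) < 1.

    second_data: (n_dims, n_points) array, e.g. rows = x, y, speed."""
    rng = np.random.default_rng() if rng is None else rng
    x_mean = second_data.mean(axis=1, keepdims=True)   # centroid of all points
    r = rng.uniform(0.0, 1.0, size=(1, second_data.shape[1]))
    return second_data + r * (x_mean - second_data)

# 3 target points (A, B, C) with dimensions x, y, speed
second = np.array([[0.0, 4.0, 2.0],
                   [0.0, 0.0, 3.0],
                   [5.0, 5.0, 5.0]])
third = generate_third_data(second, np.random.default_rng(42))
```

Combining the columns of `second` and `third` then gives twice as many target points for the recognition step.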
  • FIG. 5 is a second schematic diagram of a target recognition scene provided by an embodiment of this application.
  • The second data can include the target points A to C shown in the figure, that is, the sparse point cloud data of a total of 3 target points, which can be connected end to end to form the first contour of the target to be recognized, shown by the dashed line in FIG. 5.
  • According to the above formula, the target points A1, A2, B1, B2, and C1 can be generated,
  • that is, the sparse point cloud data of a total of 5 target points, which is the third data.
  • part or all of the target points in the second data and the third data are connected to form a second contour of the target to be recognized, as shown by the solid line in FIG. 5.
  • Compared with the first contour, the second contour has more line segments and richer connection relationships, so it is closer to the actual contour of the target to be recognized; that is, the accuracy and success rate of recognizing the target using the second contour are higher.
  • the target points A1, A2, B1, B2, and C1 generated in FIG. 5 are all located within a dashed circle centered on one of the target points A to C corresponding to the second data.
  • the third data may also be generated in other ways, which is not specifically limited in the embodiment of the present application.
  • For example, the target point corresponding to the second data may be taken as the center of an ellipse whose major axis is aligned with the driving direction of the vehicle, and the target point data on the circumference of that ellipse may be used as the third data.
  • The second contour shown in FIG. 5 is only an example. In practical applications, other methods can be used to construct the second contour of the target to be recognized, such as a convex hull, a bounding rectangle, or a confidence ellipse, which is not specifically limited in the embodiments of the present application.
  • The two methods of determining the third data may be implemented independently or in combination. For example, a part of the third data, such as third data A, may be determined according to the method shown in FIG. 4, and then another part of the third data, such as third data B, may be generated according to the method shown in FIG. 5.
  • The third data B may be generated based only on the second data, or based on the second data and the third data A, which is not specifically limited in the embodiment of the present application.
  • The number of third data can be determined according to the number of configured target points. For example, if the number of configured target points is 50, and the number of target points corresponding to the second data obtained when S301 is executed is 25, then the third data for another 25 target points can be obtained according to the method shown in FIG. 4 and/or FIG. 5.
  • The number of configured target points can be determined according to factors such as the target recognition algorithm and the target recognition history; for the specific implementation, reference may be made to existing implementations, which will not be repeated in this embodiment of the application.
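The point-count bookkeeping described above (a configured budget of target points, partly filled by the second data and topped up with generated third data) can be sketched as follows; the function name and the clamping at zero are assumptions:

```python
def num_third_data_points(configured_points, second_data_points):
    """Number of additional target points (third data) needed to reach the
    configured target-point budget; zero if the budget is already met."""
    return max(configured_points - second_data_points, 0)

# the example from the text: 50 configured, 25 obtained in S301
print(num_third_data_points(50, 25))  # -> 25
```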
  • S303: complete target recognition according to the third data and the second data.
  • a target recognition algorithm can be used to process the third data and the second data to determine the target to be recognized.
  • The target recognition algorithm can include one or more of the following: a region-based convolutional neural network (RCNN) algorithm, a deep convolutional neural network (DCNN) algorithm, a transfer learning algorithm, a deep learning algorithm, etc.
  • The target recognition algorithm can be implemented in a vehicle-mounted radar, such as a millimeter-wave radar, or in a vehicle-mounted terminal such as a vehicle controller, and a part of its functions can also be implemented in each of the vehicle-mounted radar and the vehicle-mounted terminal, which is not specifically limited in the embodiment of this application.
  • After that, the target recognition result can be output to the planning and control module, which is used to realize vehicle control and/or target tracking, such as automatic driving, intelligent driving, and intelligent navigation.
  • The control device can obtain more data, such as the third data, from the first data, namely the range-Doppler map data, according to the second data, namely the sparse point cloud data of the radar. Then the third data and the second data are used together, that is, more data is used for target recognition, which can improve the accuracy and success rate of target recognition.
  • the target recognition method provided by the embodiment of the present application is described in detail above with reference to FIGS. 3 to 5.
  • another target recognition device provided by an embodiment of the present application will be described in detail with reference to FIG. 6.
  • FIG. 6 is a second structural diagram of the target recognition device provided by an embodiment of the present application.
  • the target recognition device can be applied to the target recognition system shown in FIG. 1 to implement the target recognition method shown in FIG. 3.
  • FIG. 6 only shows the main components of the target recognition device.
  • the target recognition device 600 includes: an acquisition unit 601 and a determination unit 602.
  • the acquiring unit 601 is configured to acquire second data according to the first data; the first data is range Doppler map data obtained according to radar measurement data, and the second data is sparse point cloud data of the radar.
  • the determining unit 602 is configured to determine the third data according to the second data; the third data and the second data are used for target identification.
  • The target recognition device 600 may further include: a recognition unit 603. The recognition unit 603 is used to identify the moving target point data in the second data; the moving target point data is sparse point cloud data with a speed greater than a speed threshold.
  • the obtaining unit 601 is further configured to obtain first range Doppler image data from the first data according to the moving target point data.
  • the acquiring unit 601 is further configured to acquire fourth data from the first distance Doppler image data according to the first data selection pattern.
  • the determining unit 602 is further configured to determine third data, and the third data is sparse point cloud data corresponding to the fourth data.
  • The first range-Doppler map data is: range-Doppler map data whose amplitude difference from the range-Doppler map data corresponding to the moving target point data is less than or equal to the amplitude difference threshold.
  • the acquiring unit 601 is further configured to acquire at least one set of fifth data from the first range Doppler map data according to a variety of data selection patterns, and the fourth data is the one with the largest entropy value among the at least one set of fifth data.
  • the first data selection pattern is the selection pattern corresponding to the fourth data.
  • The target recognition device 600 may further include: a generating unit 604, which is used to generate the third data according to the formula X_new = X_rnd + rand(0,1) * (X_mean - X_rnd), where:
  • X_new is the third data;
  • X_rnd is the second data, and X_new and X_rnd correspond to the same target point;
  • X_mean is the data average of all target points in the second data;
  • rand(0,1) is a random number generation function, with 0 < rand(0,1) < 1.
  • The acquisition unit 601, the determination unit 602, the recognition unit 603, and the generation unit 604 may also be integrated into one or more processing units, and the one or more processing units are used to execute the target recognition method described in the above method embodiments.
  • the target recognition device 600 may further include a storage unit (not shown in FIG. 6), and the storage unit stores programs or instructions.
  • the processing unit executes the program or instruction
  • the target recognition device 600 can execute the target recognition method shown in FIG. 3.
  • the target recognition device 600 may further include a transceiving unit (not shown in FIG. 6).
  • the transceiver unit may include an input/output port, and the input/output port may be used to communicate with various vehicle-mounted sensors, such as receiving raw data collected by the vehicle-mounted sensor, sending data collection instructions, and so on.
  • the transceiver unit can also be divided into a receiving unit and a sending unit.
  • the receiving unit is used to receive data from various vehicle-mounted sensors, other vehicle-mounted terminals, handheld terminals, and wireless networks.
  • the sending unit is used to send data to various vehicle-mounted sensors, other vehicle-mounted terminals, handheld terminals, and wireless networks.
  • The target recognition device 600 may be the target recognition device 200 shown in FIG. 2, a vehicle-mounted terminal, or a vehicle-mounted radar with a target recognition function, or may be a chip or other component with a target recognition function installed in the target recognition device 200 shown in FIG. 2, the vehicle-mounted terminal, or the vehicle-mounted radar, which is not limited in this application.
  • the technical effect of the target recognition device 600 can refer to the technical effect of the above-mentioned target recognition method, which will not be repeated here.
  • the embodiment of the application provides a vehicle-mounted radar.
  • the vehicle-mounted radar may include the target recognition device shown in FIG. 2 or FIG. 6.
  • the embodiment of the present application provides a vehicle-mounted terminal.
  • the vehicle-mounted terminal may include the target recognition device shown in FIG. 2 or FIG. 6.
  • the vehicle-mounted terminal may be a vehicle-mounted controller.
  • the on-board controller can also be used to control the driving state of the vehicle, such as controlling the vehicle to start or stop, accelerate or decelerate, turn, and change lanes.
  • the embodiment of the present application provides a target recognition system.
  • the target recognition system includes the above-mentioned vehicle-mounted terminal.
  • the target recognition system may also include various vehicle-mounted sensors, such as vehicle-mounted radar, vehicle-mounted camera, and infrared imaging equipment.
  • vehicle-mounted sensors such as vehicle-mounted radar, vehicle-mounted camera, and infrared imaging equipment.
  • the embodiment of the present application provides a vehicle.
  • the vehicle includes the above-mentioned vehicle-mounted radar or vehicle-mounted terminal.
  • the vehicle includes the above-mentioned target recognition system.
  • the embodiment of the present application provides a computer-readable storage medium, including: the computer-readable storage medium stores computer instructions; when the computer instructions are run on a computer, the computer is caused to execute the target recognition described in the foregoing method embodiment method.
  • The embodiment of the present application provides a computer program product containing instructions, including a computer program or instructions; when the computer program or instructions run on a computer, the computer is caused to execute the target recognition method described in the above method embodiment.
  • The processor in the embodiments of the present application may be a central processing unit (CPU), and the processor may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like.
  • the memory in the embodiments of the present application may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory.
  • The non-volatile memory can be read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), or flash memory.
  • the volatile memory may be random access memory (RAM), which is used as an external cache.
  • By way of example and not limitation, many forms of random access memory (RAM) are available, such as static random access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), synchlink dynamic random access memory (SLDRAM), and direct rambus random access memory (DR RAM).
  • the foregoing embodiments may be implemented in whole or in part by software, hardware (such as circuits), firmware, or any other combination.
  • the above-mentioned embodiments may be implemented in the form of a computer program product in whole or in part.
  • The computer program product includes one or more computer instructions or computer programs.
  • When the computer instructions or computer programs are loaded or executed on a computer, the processes or functions described in the embodiments of the present application are generated in whole or in part.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • For example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired or wireless (for example, infrared or microwave) manner.
  • The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center that integrates one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium.
  • the semiconductor medium may be a solid state drive.
  • "At least one" refers to one or more, and "multiple" refers to two or more.
  • "At least one of the following" or similar expressions refer to any combination of these items, including any combination of a single item or a plurality of items.
  • For example, at least one of a, b, or c can mean: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c can each be single or multiple.
  • The size of the sequence numbers of the above-mentioned processes does not mean the order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
  • the disclosed system, device, and method can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • The division of the units is only a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • If the functions are implemented in the form of a software functional unit and sold or used as an independent product, they can be stored in a computer-readable storage medium.
  • The technical solution of the present application, in essence, or the part that contributes to the existing technology, or a part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions used to make a computer device (which may be a personal computer, a server, or a network device, etc.) execute all or part of the steps of the methods described in the various embodiments of the present application.
  • The aforementioned storage media include: USB flash drives, removable hard disks, read-only memory (ROM), random access memory (RAM), magnetic disks, optical discs, and other media that can store program codes.

Abstract

A target recognition method and apparatus. The method includes: acquiring second data according to first data (S301), where the first data is range-Doppler map data obtained according to radar measurement data, and the second data is sparse point cloud data of the radar; determining third data according to the second data (S302); and completing target recognition according to the third data and the second data (S303). A vehicle-mounted radar, a vehicle-mounted terminal, a vehicle, a computer-readable storage medium, and a computer program product are also provided. The method can solve the problem of an insufficient quantity of sparse point cloud data caused by the low resolution of vehicle-mounted millimeter-wave radar, can improve the accuracy and success rate of target recognition, and can be applied to automated driving systems, intelligent driving systems, and navigation systems.

Description

Target recognition method and apparatus
This application claims priority to Chinese Patent Application No. 202010108659.4, entitled "Target recognition method and apparatus", filed with the China National Intellectual Property Administration on February 21, 2020, which is incorporated herein by reference in its entirety.
Technical field
This application relates to the communications field, and in particular, to a target recognition method and apparatus.
Background
In automated driving scenarios, target recognition can provide information about the targets around a vehicle as a reference for driving decisions, such as accelerating or decelerating, changing lanes, steering, and driving uphill or downhill. Target recognition is therefore a core problem in the field of automated driving and plays a crucial role in the safety and driving experience of automated driving.
At present, target recognition can be implemented as follows: raw data collected by a vehicle radar is processed by a two-dimensional discrete fast Fourier transform (2DFFT) and non-coherent accumulation (NCI) to obtain range-Doppler map (RD-Map) data; then, sparse point cloud (SPC) data is obtained through signal processing operations such as constant false-alarm rate (CFAR) detection, peak grouping, and direction-of-arrival (DOA) estimation, and the target is recognized based on the sparse point cloud data.
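The 2DFFT and NCI front end of the pipeline described above can be sketched in a few lines of Python; the synthetic chirp data, the array shapes, and the magnitude-sum form of the non-coherent accumulation are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def range_doppler_map(frames):
    """Form an RD-map from raw radar data by the 2D discrete FFT step of the
    pipeline above: an FFT over fast time (range) and an FFT over slow time
    (Doppler), followed by non-coherent accumulation over antenna channels.

    frames: complex array of shape (n_channels, n_chirps, n_samples)."""
    rd = np.fft.fft(frames, axis=2)                        # range FFT (fast time)
    rd = np.fft.fftshift(np.fft.fft(rd, axis=1), axes=1)   # Doppler FFT (slow time)
    return np.abs(rd).sum(axis=0)                          # non-coherent accumulation (NCI)

# synthetic single-target example: beat frequency -> range bin 10,
# chirp-to-chirp phase ramp -> Doppler bin 4 (bin 20 after fftshift)
n_ch, n_chirps, n_samp = 2, 32, 64
n = np.arange(n_samp)
m = np.arange(n_chirps)
sig = np.exp(2j * np.pi * (10 * n[None, :] / n_samp)) \
    * np.exp(2j * np.pi * (4 * m[:, None] / n_chirps))
frames = np.broadcast_to(sig, (n_ch, n_chirps, n_samp))
rdm = range_doppler_map(frames)
peak = np.unravel_index(np.argmax(rdm), rdm.shape)  # (Doppler bin, range bin)
```

The peak of `rdm` corresponds to the single simulated target; CFAR detection, peak grouping, and DOA estimation would then operate on this map.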
However, the vehicle-mounted radar used for target recognition is usually a millimeter-wave radar with relatively low resolution, and the quantity of sparse point cloud data it obtains is insufficient, which leads to a high false-detection rate (for example, a pedestrian may be misrecognized as a bicycle, or a bicycle as a car), resulting in poor target recognition accuracy.
Summary
The embodiments of this application provide a target recognition method and apparatus, which can solve the problem of an insufficient quantity of sparse point cloud data caused by the low resolution of vehicle-mounted millimeter-wave radar, and can improve the accuracy and success rate of target recognition.
To achieve the foregoing objective, this application adopts the following technical solutions:
According to a first aspect, a target recognition method is provided. The method includes: acquiring second data according to first data, where the first data is range-Doppler map data obtained according to radar measurement data, and the second data is sparse point cloud data of the radar; and then determining third data according to the second data, where the third data and the second data are used for target recognition.
Based on the target recognition method provided in the first aspect, more data, such as the third data, can be acquired from the first data, namely the range-Doppler map data, according to the second data, namely the sparse point cloud data of the radar. Then the third data and the second data are used together, that is, more data is used for target recognition, which can improve the accuracy and success rate of target recognition.
In a possible design method, determining the third data according to the second data may include: identifying moving target point data in the second data, where the moving target point data is sparse point cloud data whose speed is greater than a speed threshold; acquiring first range-Doppler map data from the first data according to the moving target point data; acquiring fourth data from the first range-Doppler map data according to a first data selection pattern; and determining the third data, where the third data is the sparse point cloud data corresponding to the fourth data. In this way, more sparse point cloud data can be used for target recognition to improve its accuracy and success rate.
The first range-Doppler map data is: range-Doppler map data whose amplitude difference from the range-Doppler map data corresponding to the moving target point data is less than or equal to an amplitude difference threshold. That is, from the region where the moving target point is located, the sparse point cloud data corresponding to the range-Doppler map data whose amplitude is second only to that of the moving target point data, that is, whose energy is closest to that of the moving target point data (most uniform energy distribution), can be selected for target recognition to further improve the accuracy and success rate of target recognition.
Optionally, acquiring the fourth data from the first range-Doppler map data according to the first data selection pattern may include: acquiring at least one group of fifth data from the first range-Doppler map data according to multiple data selection patterns, where the fourth data is the group with the largest entropy value among the at least one group of fifth data, and the first data selection pattern is the selection pattern corresponding to the fourth data. In other words, the group of range-Doppler map data with the most uniform energy distribution can be extracted as the fourth data to further improve the accuracy and success rate of target recognition.
In another possible design method, determining the third data according to the second data may include generating the third data according to the following formula: X_new = X_rnd + rand(0,1) * (X_mean - X_rnd), where X_new is the third data, X_rnd is the second data, X_new and X_rnd correspond to the same target point, X_mean is the data average of all target points in the second data, and rand(0,1) is a random number generation function with 0 < rand(0,1) < 1. In this way, more sparse point cloud data can also be acquired to improve the accuracy and success rate of target recognition.
It should be noted that the foregoing two methods of acquiring more sparse point cloud data for target recognition can also be used in combination to further improve the accuracy and success rate of target recognition.
According to a second aspect, a target recognition apparatus is provided. The apparatus includes an acquiring unit and a determining unit. The acquiring unit is configured to acquire second data according to first data, where the first data is range-Doppler map data obtained according to radar measurement data, and the second data is sparse point cloud data of the radar. The determining unit is configured to determine third data according to the second data, where the third data and the second data are used for target recognition.
In a possible design, the target recognition apparatus provided in the second aspect may further include a recognition unit. The recognition unit is configured to identify moving target point data in the second data, where the moving target point data is sparse point cloud data whose speed is greater than a speed threshold. The acquiring unit is further configured to acquire first range-Doppler map data from the first data according to the moving target point data, and to acquire fourth data from the first range-Doppler map data according to a first data selection pattern. The determining unit is further configured to determine the third data, where the third data is the sparse point cloud data corresponding to the fourth data.
The first range-Doppler map data is: range-Doppler map data whose amplitude difference from the range-Doppler map data corresponding to the moving target point data is less than or equal to an amplitude difference threshold.
Optionally, the acquiring unit is further configured to acquire at least one group of fifth data from the first range-Doppler map data according to multiple data selection patterns, where the fourth data is the group with the largest entropy value among the at least one group of fifth data, and the first data selection pattern is the selection pattern corresponding to the fourth data.
In another possible design, the target recognition apparatus provided in the second aspect may further include a generating unit configured to generate the third data according to the following formula: X_new = X_rnd + rand(0,1) * (X_mean - X_rnd), where X_new is the third data, X_rnd is the second data, X_new and X_rnd correspond to the same target point, X_mean is the data average of all target points in the second data, and rand(0,1) is a random number generation function with 0 < rand(0,1) < 1.
It should be noted that the acquiring unit, the determining unit, the recognition unit, and the generating unit may also be integrated into one or more processing units, and the one or more processing units are configured to perform the target recognition method provided in the first aspect.
Optionally, the target recognition apparatus provided in the second aspect may further include a storage unit storing programs or instructions. When the processing unit executes the programs or instructions, the target recognition apparatus provided in the second aspect can perform the target recognition method provided in the first aspect.
Optionally, the target recognition apparatus provided in the second aspect may further include a transceiver unit. The transceiver unit may include an input/output port, which may be used to communicate with various vehicle-mounted sensors, for example, to receive raw data collected by the vehicle-mounted sensors and to send data collection instructions.
It should be noted that the target recognition apparatus provided in the second aspect may be a vehicle-mounted terminal or a vehicle-mounted radar with a target recognition function, or may be a chip or other component with a target recognition function disposed in such a vehicle-mounted terminal or vehicle-mounted radar; this application does not limit this.
For the technical effects of the target recognition apparatus provided in the second aspect, reference may be made to those of the target recognition method provided in the first aspect, which are not repeated here.
According to a third aspect, a target recognition apparatus is provided. The target recognition apparatus includes a processor coupled to a memory; the memory is configured to store a computer program, and the processor is configured to execute the computer program stored in the memory, so that the target recognition apparatus performs the target recognition method provided in the first aspect.
In a possible design, the target recognition apparatus provided in the third aspect may further include a transceiver. The transceiver may be a transceiver circuit or an input/output port and may be used for communication between the target recognition apparatus and other apparatuses. For example, if the target recognition apparatus is a vehicle-mounted terminal, the transceiver may be used for communication between the vehicle-mounted terminal and a vehicle-mounted radar.
In this application, the target recognition apparatus provided in the third aspect may be a vehicle-mounted terminal or a vehicle-mounted radar, or a chip or other component with a target recognition function disposed inside a vehicle-mounted terminal or vehicle-mounted radar.
For the technical effects of the target recognition apparatus provided in the third aspect, reference may be made to those of the target recognition method provided in the first aspect, which are not repeated here.
According to a fourth aspect, a vehicle-mounted radar is provided. The vehicle-mounted radar includes the target recognition apparatus provided in the second or third aspect.
According to a fifth aspect, a vehicle-mounted terminal is provided. The vehicle-mounted terminal includes the target recognition apparatus provided in the second or third aspect.
Optionally, the vehicle-mounted terminal may be a vehicle-mounted controller.
According to a sixth aspect, a target recognition system is provided. The target recognition system includes the target recognition apparatus provided in the second or third aspect. Optionally, the target recognition system may further include various vehicle-mounted sensors, such as a vehicle-mounted radar, a vehicle-mounted camera, and an infrared imaging device.
According to a seventh aspect, a vehicle is provided. The vehicle includes the vehicle-mounted radar provided in the fourth aspect or the vehicle-mounted terminal provided in the fifth aspect, or the vehicle includes the target recognition system provided in the sixth aspect.
According to an eighth aspect, a computer-readable storage medium is provided. The computer-readable storage medium stores computer instructions; when the computer instructions are run on a computer, the computer is caused to perform the target recognition method provided in the first aspect.
According to a ninth aspect, a computer program product containing instructions is provided, including a computer program or instructions; when the computer program or instructions are run on a computer, the computer is caused to perform the target recognition method provided in the first aspect.
Brief description of the drawings
FIG. 1 is a schematic architectural diagram of a target recognition system according to an embodiment of this application;
FIG. 2 is a first schematic structural diagram of a target recognition apparatus according to an embodiment of this application;
FIG. 3 is a schematic flowchart of a target recognition method according to an embodiment of this application;
FIG. 4 is a first schematic diagram of a target recognition scenario according to an embodiment of this application;
FIG. 5 is a second schematic diagram of a target recognition scenario according to an embodiment of this application;
FIG. 6 is a second schematic structural diagram of a target recognition apparatus according to an embodiment of this application.
Detailed description
The technical solutions in this application are described below with reference to the accompanying drawings.
The technical solutions of the embodiments of this application can be applied to vehicle-mounted target recognition systems, for example, a vehicle-mounted radar (system), a vehicle control system including the vehicle-mounted radar (system), and a vehicle including the vehicle-mounted radar (system) or the vehicle control system.
This application presents various aspects, embodiments, or features around a system that may include multiple devices, components, modules, and the like. It should be understood that each system may include additional devices, components, modules, and so on, and/or may not include all of the devices, components, modules, and so on discussed with reference to the accompanying drawings. In addition, combinations of these solutions may also be used.
In the embodiments of this application, words such as "exemplarily" and "for example" are used to mean serving as an example, illustration, or explanation. Any embodiment or design described as an "example" in this application should not be construed as being preferred over or more advantageous than other embodiments or designs. Rather, the word "example" is intended to present a concept in a concrete manner.
In the embodiments of this application, "of", "relevant", and "corresponding" may sometimes be used interchangeably. It should be noted that when their distinction is not emphasized, the meanings they express are consistent.
The network architecture and service scenarios described in the embodiments of this application are intended to illustrate the technical solutions of the embodiments more clearly and do not constitute a limitation on them. A person of ordinary skill in the art will know that, as target recognition system architectures evolve and new service scenarios emerge, the technical solutions provided in the embodiments of this application are equally applicable to similar technical problems.
Exemplarily, FIG. 1 is a schematic architectural diagram of a target recognition system to which the target recognition method provided in the embodiments of this application is applicable. To facilitate understanding of the embodiments of this application, the target recognition system shown in FIG. 1 is first described in detail as an example. It should be noted that the solutions in the embodiments of this application may also be applied to other target recognition systems, and the corresponding names may be replaced with the names of the corresponding functions in those systems.
As shown in FIG. 1, the target recognition system includes a target recognition apparatus. The target recognition apparatus may be a vehicle-mounted radar or camera with a target recognition processing function, or an apparatus that is coupled to a vehicle-mounted radar or camera and can perform target recognition based on the data they collect.
The target recognition apparatus is an apparatus that can be disposed in a vehicle and used to recognize targets around the vehicle, or a chip or other component that can be disposed in such a device. The target recognition apparatus includes but is not limited to: a vehicle-mounted terminal, a vehicle-mounted controller, a vehicle-mounted module, a vehicle-mounted component, a vehicle-mounted chip, a vehicle-mounted unit, or a vehicle-mounted radar or camera with a target recognition processing function. A vehicle can implement the target recognition method provided in this application through any of the foregoing.
It should be noted that the target recognition method provided in the embodiments of this application may also be applied to scenarios outside vehicles that require target recognition, such as toll stations and parking lots.
The vehicle-mounted sensors are used to collect identification information of targets to be recognized, such as information about pedestrians, bicycles, motorcycles, and other vehicles, so that the target recognition apparatus can recognize the targets. Optionally, the vehicle-mounted sensors may also be used to collect the driving state of the vehicle itself, such as its speed, position, uphill or downhill state, and steering angle, so that the target recognition apparatus can control the vehicle accordingly, for example, to start or stop, accelerate or decelerate, or steer. Vehicle-mounted sensors include but are not limited to: vehicle-mounted millimeter-wave radar, vehicle-mounted infrared radar, gyroscopes, speed sensors, and cameras.
It should be understood that FIG. 1 is a simplified schematic diagram for ease of understanding; the target recognition system may also include other devices and/or other sensors not shown in FIG. 1.
FIG. 2 is a schematic structural diagram of a target recognition apparatus 200 that can be used to perform the target recognition method provided in the embodiments of this application. As shown in FIG. 2, the target recognition apparatus 200 may include a processor 201, a memory 202, and a transceiver 203. The processor 201 is coupled to the memory 202 and the transceiver 203, for example, via a communication bus.
The components of the target recognition apparatus 200 are described in detail below with reference to FIG. 2.
The processor 201 is the control center of the target recognition apparatus 200 and may be a single processor or a collective term for multiple processing elements. For example, the processor 201 may be one or more central processing units (CPUs), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of this application, for example, one or more digital signal processors (DSPs) or one or more field-programmable gate arrays (FPGAs).
The processor 201 may perform various functions of the target recognition apparatus 200 by running or executing software programs stored in the memory 202 and invoking data stored in the memory 202.
In a specific implementation, in an embodiment, the processor 201 may include one or more CPUs, such as CPU0 and CPU1 shown in FIG. 2.
In a specific implementation, in an embodiment, the target recognition apparatus 200 may also include multiple processors, such as the processor 201 and the processor 204 shown in FIG. 2. Each of these processors may be a single-core processor (single-CPU) or a multi-core processor (multi-CPU). A processor here may refer to one or more communication devices, circuits, and/or processing cores for processing data (for example, computer program instructions).
The memory 202 may be a read-only memory (ROM) or another type of static storage communication device capable of storing static information and instructions, a random access memory (RAM) or another type of dynamic storage communication device capable of storing information and instructions, or an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage, optical disc storage (including compressed optical discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage communication devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto. The memory 202 may be integrated with the processor 201 or may exist independently and be coupled to the processor 201 through an input/output port of the target recognition apparatus 200 (not shown in FIG. 2); this is not specifically limited in the embodiments of this application.
The memory 202 is configured to store the software programs for executing the solutions of this application, and the execution is controlled by the processor 201. For the specific implementation, reference may be made to the following method embodiments, which are not repeated here.
The transceiver 203 is used for communication with other target recognition apparatuses. For example, if the target recognition apparatus 200 is a terminal device, the transceiver 203 may be used to communicate with a network device or with another terminal device. For another example, if the target recognition apparatus 200 is a network device, the transceiver 203 may be used to communicate with a terminal device or with another network device. In addition, the transceiver 203 may include a receiver and a transmitter (not shown separately in FIG. 2). The receiver is configured to implement the receiving function, and the transmitter is configured to implement the sending function. The transceiver 203 may be integrated with the processor 201 or may exist independently and be coupled to the processor 201 through an input/output port of the target recognition apparatus 200 (not shown in FIG. 2); this is not specifically limited in the embodiments of this application.
For example, the target recognition apparatus may be a vehicle-mounted radar or camera with a target recognition function. For another example, the target recognition apparatus may be a vehicle-mounted terminal, and the transceiver may be used for communication between the vehicle-mounted terminal and a vehicle-mounted radar or camera, for example, to send data collection instructions and receive collected data.
It should be noted that the structure of the target recognition apparatus 200 shown in FIG. 2 does not constitute a limitation on the target recognition apparatus; an actual target recognition apparatus may include more or fewer components than shown, combine certain components, or have a different component arrangement.
The target recognition method provided in the embodiments of this application is described in detail below with reference to FIG. 3 to FIG. 5.
FIG. 3 is a schematic flowchart of the target recognition method provided in an embodiment of this application. The target recognition method is applicable to the target recognition system shown in FIG. 1.
As shown in FIG. 3, the target recognition method includes the following steps:
S301: acquire second data according to first data.
The first data is range-Doppler map data obtained according to radar measurement data, and the second data is sparse point cloud data of the radar.
Exemplarily, the radar measurement data may include measurement data in multiple dimensions, such as horizontal and vertical coordinates, speed, and radar cross-section (RCS); the measurement data of the multiple dimensions may be arranged or combined in a preset manner, for example, as a two-dimensional matrix. For example, assume that the radar measurement data includes the data of 20 detection points in the same recognition scenario, and that the data of each of the 20 detection points includes three dimensions: horizontal and vertical coordinates, speed, and RCS. The two-dimensional matrix may then be a 3×20 matrix, that is, its three row vectors correspond one-to-one to the three dimensions, and each row includes the radar measurement data of the 20 detection points in the same dimension. It should be understood that a 20×3 matrix may also be used, that is, its three column vectors correspond one-to-one to the three dimensions, and each column includes the radar measurement data of the 20 detection points in the same dimension. The embodiments of this application do not specifically limit the representation of the radar measurement data.
It should be noted that the radar measurement data is the raw data collected by the vehicle-mounted radar; one or more signal processing operations, such as 2DFFT and NCI, must be performed on it to obtain the first data. Then, CFAR detection, peak extraction, and direction-of-arrival estimation are performed on the first data to obtain the second data. In this process, because some target points are filtered out, the quantity of second data is smaller than that of the radar measurement data, that is, the number of target points corresponding to the second data is smaller than the number corresponding to the radar measurement data. Taking the foregoing 3×20 two-dimensional matrix as an example, assuming the second data corresponds to 5 target points, the second data is a 3×5 two-dimensional matrix.
In other words, the quantity of second data is usually smaller than that of the first data and the radar measurement data, while the target recognition result is made based on the second data. Therefore, an insufficient quantity of second data reduces the accuracy of target recognition; this problem is especially prominent when the vehicle-mounted radar is a millimeter-wave radar with relatively low resolution, because the radar measurement data it collects is already small in quantity.
S302: determine third data according to the second data.
In a possible design method, S302, determining the third data according to the second data, may include the following step 1 to step 4:
Step 1: identify moving target point data in the second data.
The moving target point data is sparse point cloud data whose speed is greater than a speed threshold, and the moving target point data is peak data.
Specifically, moving target points can be identified according to a first speed or a second speed of the detection points in the second data. The first speed may be the vehicle driving speed, which may be the absolute driving speed of the vehicle, for example, 100 kilometers per hour (km/h), or the driving speed of the vehicle relative to a reference object, such as another vehicle, a bicycle, or a pedestrian. The second speed may be a speed between different parts of the vehicle itself, such as between different wheels, or between a wheel and the vehicle body.
It should be noted that different speed thresholds can be set for the first speed and the second speed, and the moving target points in the second data are identified according to these thresholds, thereby obtaining the moving target point data, that is, the sparse point cloud data corresponding to the moving target points. It should be understood that, since the speed of a moving target point is greater than the speed threshold, that is, its signal characteristics vary greatly, the vehicle-mounted radar is more sensitive to moving target points; therefore, the moving target point data is usually the maximum of all detection point data in the region where the moving target point is located. The region where the moving target point is located can be determined according to preset conditions; for example, in the coordinate system corresponding to the foregoing horizontal and vertical coordinates, it can be a circular region, an elliptical region, a rectangular region, a triangular region, or a region of another geometric shape centered on the moving target point.
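Step 1 above, thresholding the speed dimension of the sparse point cloud to separate moving target points, can be sketched as follows; the row layout, the use of absolute speed, and the threshold value are illustrative assumptions (the text notes that different thresholds may apply to the first and second speeds):

```python
import numpy as np

def moving_target_points(points, speed_row=2, speed_threshold=0.5):
    """Split sparse point cloud data into moving and static target points by
    comparing the speed dimension against a threshold.

    points: (n_dims, n_points) array; speed_row indexes the speed dimension."""
    mask = np.abs(points[speed_row]) > speed_threshold
    return points[:, mask], points[:, ~mask]

# columns = target points; rows = x, y, speed (hypothetical values)
pts = np.array([[0.0, 1.0, 2.0, 3.0],
                [0.0, 1.0, 1.0, 0.0],
                [0.1, 2.5, 0.0, -1.2]])
moving, static = moving_target_points(pts)
```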
Step 2: acquire first range-Doppler map data from the first data according to the moving target point data.
The first range-Doppler map data may be: range-Doppler map data whose amplitude difference from the range-Doppler map data corresponding to the moving target point data is less than or equal to an amplitude difference threshold.
Exemplarily, FIG. 4 is a first schematic diagram of a target recognition scenario provided in an embodiment of this application. As shown in FIG. 4, the left figure shows a square region centered on the range-Doppler map data corresponding to one moving target point and including 5 detection points in each of the horizontal and vertical coordinate directions, that is, a 5×5 square with a total of 25 detection points.
Then, the detection points whose signal amplitude differs from that of the moving target point by no more than the amplitude difference threshold can be determined as the first range-Doppler map data in the square region.
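Step 2 above, keeping the cells in the 5×5 window whose amplitude is within the amplitude difference threshold of the moving target point, can be sketched as follows; treating the moving target point as the window's peak cell and excluding it from the result are assumptions for illustration:

```python
import numpy as np

def first_rd_data_mask(rd_window, amp_diff_threshold):
    """Boolean mask of the cells in a window centered on a moving target point
    whose amplitude differs from the peak cell by at most amp_diff_threshold.
    The peak cell itself is excluded (an assumption), so the mask returns only
    the additional candidate cells."""
    peak_idx = np.unravel_index(np.argmax(rd_window), rd_window.shape)
    mask = (rd_window[peak_idx] - rd_window) <= amp_diff_threshold
    mask[peak_idx] = False
    return mask

# toy 5x5 window: peak 10 at the center, two near-peak cells, background 3
win = np.full((5, 5), 3.0)
win[2, 2] = 10.0   # moving target point (peak)
win[2, 3] = 9.0
win[1, 2] = 8.5
mask = first_rd_data_mask(win, amp_diff_threshold=2.0)
```

With these values, only the two near-peak cells survive the threshold; the data selection patterns of step 3 would then be drawn from such cells.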
Step 3: acquire fourth data from the first range-Doppler map data according to a first data selection pattern.
Optionally, step 3, acquiring the fourth data from the first range-Doppler map data according to the first data selection pattern, may include:
acquiring at least one group of fifth data from the first range-Doppler map data according to multiple data selection patterns, where the fourth data is the group with the largest entropy value among the at least one group of fifth data, and the first data selection pattern is the selection pattern corresponding to the fourth data.
Exemplarily, as shown in FIG. 4, a group of range-Doppler map data can be selected as fifth data according to each of data selection patterns 1 to 3 shown in the figure. The entropy value of each group of fifth data is then calculated separately, the group with the largest entropy value is selected as the fourth data, and the data selection pattern corresponding to that group is called the first data selection pattern. For example, assuming that the group of range-Doppler map data corresponding to data selection pattern 2 shown in FIG. 4 has the largest entropy value, data selection pattern 2 is the first data selection pattern. For the implementation of the entropy value, reference may be made to the prior art, which is not repeated in the embodiments of this application.
In other words, there may be multiple groups of first range-Doppler map data, and from the region where the moving target point is located, the range-Doppler map data whose amplitude is second only to that of the moving target point data, that is, whose energy is closest to that of the moving target point data (largest entropy value, meaning the most uniform energy distribution), can be selected for target recognition to further improve the accuracy and success rate of target recognition.
Step 4: determine the third data.
Specifically, direction-of-arrival estimation can be performed on the fourth data, that is, the group of range-Doppler map data with the largest entropy value selected in step 3 above (the group corresponding to the first data selection pattern), to obtain the third data. The third data is the sparse point cloud data corresponding to the fourth data. Then, S303 below can be performed. In this way, more sparse point cloud data can be used for target recognition to improve its accuracy and success rate.
Exemplarily, as shown in FIG. 4, direction-of-arrival estimation can be performed on the group of range-Doppler map data corresponding to data selection pattern 2 to obtain the third data.
In another possible design method, S302, determining the third data according to the second data, may include the following step: generating the third data according to the following formula: X_new = X_rnd + rand(0,1) * (X_mean - X_rnd).
Here, X_new is the third data, X_rnd is the second data, X_new and X_rnd correspond to the same target point, X_mean is the data average of all target points in the second data, and rand(0,1) is a random number generation function with 0 < rand(0,1) < 1.
In this way, more sparse point cloud data can also be acquired to improve the accuracy and success rate of target recognition.
Exemplarily, FIG. 5 is a second schematic diagram of a target recognition scenario provided in an embodiment of this application. As shown in FIG. 5, the second data may include target points A to C shown in the figure, that is, the sparse point cloud data of a total of 3 target points; connected end to end, they form the first contour of the target to be recognized, that is, the contour shown by the dashed line in FIG. 5. Assume that, according to the foregoing formula, target points A1, A2, B1, B2, and C1 can be generated, that is, the sparse point cloud data of a total of 5 target points, which is the third data. Then, some or all of the target points in the second data and the third data are connected to form a second contour of the target to be recognized, the contour shown by the solid line in FIG. 5. Comparing the second contour with the first shows that the second contour has more line segments and richer connection relationships and is therefore closer to the actual contour of the target to be recognized; that is, the accuracy and success rate of recognizing the target using the second contour are higher.
It should be noted that the target points A1, A2, B1, B2, and C1 generated in FIG. 5 are all located within a dashed circle centered on one of the target points A to C corresponding to the second data. In practical applications, the third data may also be generated in other ways, which is not specifically limited in the embodiments of this application. For example, the target point corresponding to the second data may be taken as the center of an ellipse whose major axis is aligned with the driving direction of the vehicle, and the target point data on the circumference of that ellipse may be used as the third data.
It should be understood that the second contour shown in FIG. 5 is merely an example. In practical applications, other ways of constructing the second contour of the target to be recognized can be used, such as a convex hull, a bounding rectangle, or a confidence ellipse; the embodiments of this application do not specifically limit this.
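As an illustration of one of the alternative contour constructions named above, the following pure-Python sketch builds a convex-hull contour (Andrew's monotone chain) from a combined set of target points; the point coordinates are hypothetical stand-ins for the x-y positions of points A to C and the generated points:

```python
def convex_hull(points):
    """Andrew's monotone chain convex hull of 2D points; returns the hull
    vertices in counterclockwise order. One way to build the "second contour"
    from the combined second and third target points."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a counterclockwise turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

# hypothetical x-y positions: second data (A..C) plus generated third data
pts = [(0, 0), (4, 0), (2, 3),
       (0.5, 0.4), (1.0, 0.2), (3.4, 0.3), (3.0, 1.0), (2.1, 2.4)]
hull = convex_hull(pts)
```

Generated points that fall inside the original triangle do not change the hull; they only enrich the data available to the recognition algorithm.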
It should be noted that the two methods of determining the third data shown in FIG. 5 and FIG. 4 can be implemented independently or in combination. For example, a part of the third data, such as third data A, can first be determined according to the method shown in FIG. 4, and then another part of the third data, such as third data B, can be generated according to the method shown in FIG. 5. The third data B can be generated based only on the second data, or based on the second data and the third data A; the embodiments of this application do not specifically limit this.
Further, the quantity of third data can be determined according to the configured number of target points. For example, if the configured number of target points is 50 and the second data acquired when performing S301 corresponds to 25 target points, the third data for another 25 target points can be acquired according to the methods shown in FIG. 4 and/or FIG. 5. The configured number of target points can be determined according to factors such as the target recognition algorithm and the target recognition history; for the specific implementation, reference may be made to existing implementations, which are not repeated in the embodiments of this application.
S303: complete target recognition according to the third data and the second data.
Specifically, a target recognition algorithm can be used to process the third data and the second data to determine the target to be recognized. The target recognition algorithm may include one or more of the following: a region-based convolutional neural network (RCNN) algorithm, a deep convolutional neural network (DCNN) algorithm, a transfer learning algorithm, a deep learning algorithm, and so on. The target recognition algorithm can be implemented in a vehicle-mounted radar, such as a millimeter-wave radar, or in a vehicle-mounted terminal such as a vehicle controller, or a part of its functions can be implemented in each of the two; the embodiments of this application do not specifically limit this.
Thereafter, the target recognition result can be output to the planning and control module to implement vehicle control and/or target tracking, such as automatic driving, intelligent driving, and intelligent navigation.
Based on the target recognition method provided in the first aspect, the control device can acquire more data, such as the third data, from the first data, namely the range-Doppler map data, according to the second data, namely the sparse point cloud data of the radar. Then the third data and the second data are used together, that is, more data is used for target recognition, which can improve the accuracy and success rate of target recognition.
The target recognition method provided in the embodiments of this application has been described in detail above with reference to FIG. 3 to FIG. 5. Another target recognition apparatus provided in an embodiment of this application is described in detail below with reference to FIG. 6.
FIG. 6 is a second schematic structural diagram of the target recognition apparatus provided in an embodiment of this application. The target recognition apparatus is applicable to the target recognition system shown in FIG. 1 and performs the target recognition method shown in FIG. 3. For ease of description, FIG. 6 shows only the main components of the target recognition apparatus.
As shown in FIG. 6, the target recognition apparatus 600 includes: an acquiring unit 601 and a determining unit 602.
The acquiring unit 601 is configured to acquire second data according to first data; the first data is range-Doppler map data obtained according to radar measurement data, and the second data is sparse point cloud data of the radar.
The determining unit 602 is configured to determine third data according to the second data; the third data and the second data are used for target recognition.
In a possible design, the target recognition apparatus 600 may further include: a recognition unit 603. The recognition unit 603 is configured to identify moving target point data in the second data; the moving target point data is sparse point cloud data whose speed is greater than a speed threshold.
The acquiring unit 601 is further configured to acquire first range-Doppler map data from the first data according to the moving target point data.
The acquiring unit 601 is further configured to acquire fourth data from the first range-Doppler map data according to a first data selection pattern.
The determining unit 602 is further configured to determine the third data, where the third data is the sparse point cloud data corresponding to the fourth data.
The first range-Doppler map data is: range-Doppler map data whose amplitude difference from the range-Doppler map data corresponding to the moving target point data is less than or equal to an amplitude difference threshold.
Optionally, the acquiring unit 601 is further configured to acquire at least one group of fifth data from the first range-Doppler map data according to multiple data selection patterns, where the fourth data is the group with the largest entropy value among the at least one group of fifth data, and the first data selection pattern is the selection pattern corresponding to the fourth data.
In another possible design, the target recognition apparatus 600 may further include: a generating unit 604, configured to generate the third data according to the following formula: X_new = X_rnd + rand(0,1) * (X_mean - X_rnd), where X_new is the third data, X_rnd is the second data, X_new and X_rnd correspond to the same target point, X_mean is the data average of all target points in the second data, and rand(0,1) is a random number generation function with 0 < rand(0,1) < 1.
It should be noted that the acquiring unit 601, the determining unit 602, the recognition unit 603, and the generating unit 604 may also be integrated into one or more processing units, and the one or more processing units are configured to perform the target recognition method described in the foregoing method embodiments.
Optionally, the target recognition apparatus 600 may further include a storage unit (not shown in FIG. 6) storing programs or instructions. When the processing unit executes the programs or instructions, the target recognition apparatus 600 can perform the target recognition method shown in FIG. 3.
Optionally, the target recognition apparatus 600 may further include a transceiver unit (not shown in FIG. 6). Optionally, the transceiver unit may include an input/output port, which can be used to communicate with various vehicle-mounted sensors, for example, to receive raw data collected by the vehicle-mounted sensors and to send data collection instructions.
Further, the transceiver unit may also be divided into a receiving unit and a sending unit. The receiving unit is configured to receive data from the vehicle-mounted sensors, other vehicle-mounted terminals, handheld terminals, and wireless networks. The sending unit is configured to send data to the vehicle-mounted sensors, other vehicle-mounted terminals, handheld terminals, and wireless networks.
It should be noted that the target recognition apparatus 600 may be the target recognition apparatus 200 shown in FIG. 2, a vehicle-mounted terminal, or a vehicle-mounted radar with a target recognition function, or may be a chip or other component with a target recognition function disposed in the target recognition apparatus 200 shown in FIG. 2, the vehicle-mounted terminal, or the vehicle-mounted radar; this application does not limit this.
In addition, for the technical effects of the target recognition apparatus 600, reference may be made to those of the foregoing target recognition method, which are not repeated here.
An embodiment of this application provides a vehicle-mounted radar. The vehicle-mounted radar may include the target recognition apparatus shown in FIG. 2 or FIG. 6.
An embodiment of this application provides a vehicle-mounted terminal. The vehicle-mounted terminal may include the target recognition apparatus shown in FIG. 2 or FIG. 6. Optionally, the vehicle-mounted terminal may be a vehicle-mounted controller. In addition to target recognition, the vehicle-mounted controller can also be used to control the driving state of the vehicle, for example, to control the vehicle to start or stop, accelerate or decelerate, steer, or change lanes.
An embodiment of this application provides a target recognition system. The target recognition system includes the foregoing vehicle-mounted terminal.
Optionally, the target recognition system may further include various vehicle-mounted sensors, such as a vehicle-mounted radar, a vehicle-mounted camera, and an infrared imaging device.
An embodiment of this application provides a vehicle. The vehicle includes the foregoing vehicle-mounted radar or vehicle-mounted terminal, or the vehicle includes the foregoing target recognition system.
An embodiment of this application provides a computer-readable storage medium. The computer-readable storage medium stores computer instructions; when the computer instructions are run on a computer, the computer is caused to perform the target recognition method described in the foregoing method embodiments.
An embodiment of this application provides a computer program product containing instructions, including a computer program or instructions; when the computer program or instructions are run on a computer, the computer is caused to perform the target recognition method described in the foregoing method embodiments.
It should be understood that the processor in the embodiments of this application may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
It should also be understood that the memory in the embodiments of this application may be volatile memory or non-volatile memory, or may include both. The non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which is used as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
The foregoing embodiments may be implemented in whole or in part by software, hardware (such as circuits), firmware, or any combination thereof. When implemented by software, the foregoing embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions or computer programs. When the computer instructions or computer programs are loaded or executed on a computer, the processes or functions described in the embodiments of this application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired or wireless (for example, infrared or microwave) manner. The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or data center integrating one or more available media. The available media may be magnetic media (for example, floppy disks, hard disks, or magnetic tapes), optical media (for example, DVDs), or semiconductor media. The semiconductor media may be solid-state drives.
It should be understood that the term "and/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist. For example, "A and/or B" may represent three cases: only A exists, both A and B exist, and only B exists, where A and B may be singular or plural. In addition, the character "/" herein generally indicates an "or" relationship between the associated objects, but may also indicate an "and/or" relationship; the specific meaning can be understood from the context.
In this application, "at least one" means one or more, and "a plurality of" means two or more. "At least one of the following items" or a similar expression means any combination of these items, including a single item or any combination of plural items. For example, "at least one of a, b, or c" may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where each of a, b, and c may be singular or plural.
It should be understood that, in the various embodiments of this application, the sequence numbers of the foregoing processes do not imply an execution order. The execution order of the processes should be determined by their functions and internal logic, and should not constitute any limitation on the implementation of the embodiments of this application.
A person of ordinary skill in the art may be aware that the units and algorithm steps of the examples described with reference to the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed by hardware or software depends on the particular application and design constraints of the technical solution. A skilled person may use different methods to implement the described functions for each particular application, but such implementation should not be considered to be beyond the scope of this application.
A person skilled in the art can clearly understand that, for convenience and brevity of description, for the specific working processes of the systems, apparatuses, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative. For example, the division into units is merely a logical function division; in actual implementation there may be other divisions, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be implemented through some interfaces; the indirect couplings or communication connections between apparatuses or units may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of this application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit.
When the functions are implemented in the form of a software functional unit and sold or used as an independent product, they may be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of this application essentially, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of this application. The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing is merely specific implementations of this application, but the protection scope of this application is not limited thereto. Any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed in this application shall fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (16)

  1. A target recognition method, comprising:
    obtaining second data according to first data, wherein the first data is range-Doppler map data obtained from radar measurement data, and the second data is sparse point-cloud data of the radar;
    determining third data according to the second data, wherein the third data and the second data are used for target recognition.
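The first step of claim 1 — turning a range-Doppler map ("first data") into sparse point-cloud data ("second data") — is left unspecified by the claim. As a hedged illustration only, a simple fixed-threshold detector is sketched below; the patent does not name a detector, and in practice a CFAR detector would be more typical. The array layout and the `snr_db` parameter are assumptions of this sketch.

```python
import numpy as np

def rd_map_to_sparse_points(rd_map, snr_db=12.0):
    """Illustrative detector: keep range-Doppler cells whose magnitude
    exceeds a fixed SNR threshold over the map's mean magnitude.

    rd_map: (R, D) non-negative magnitude array (the "first data").
    Returns an (N, 3) array of [range_bin, doppler_bin, magnitude]
    rows — a stand-in for the "second data" sparse point cloud.
    """
    thresh = np.mean(rd_map) * 10.0 ** (snr_db / 20.0)
    r_idx, d_idx = np.nonzero(rd_map > thresh)
    return np.column_stack([r_idx, d_idx, rd_map[r_idx, d_idx]])
```

For example, a map that is zero everywhere except two strong cells yields a two-row point cloud, one row per detected cell.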
  2. The method according to claim 1, wherein the determining third data according to the second data comprises:
    identifying moving-target point data in the second data, wherein the moving-target point data is sparse point-cloud data whose velocity is greater than a velocity threshold;
    obtaining first range-Doppler map data from the first data according to the moving-target point data;
    obtaining fourth data from the first range-Doppler map data according to a first data selection pattern;
    determining the third data, wherein the third data is sparse point-cloud data corresponding to the fourth data.
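The first step of claim 2 — keeping only the sparse point-cloud returns whose velocity exceeds a threshold — can be sketched as follows. This is not the patented implementation; the data layout (one point per row, radial velocity in the last column) and the use of the absolute velocity are assumptions made for the example.

```python
import numpy as np

def moving_target_points(point_cloud, v_thresh):
    """Return the rows of a sparse radar point cloud whose radial
    velocity magnitude exceeds v_thresh (claim 2's "moving-target
    point data"). Absolute value keeps both approaching and receding
    targets — an interpretation assumed by this sketch.

    point_cloud: (N, D) array; the last column is assumed to hold the
    measured radial velocity.
    """
    velocity = point_cloud[:, -1]
    return point_cloud[np.abs(velocity) > v_thresh]
```

With a 1 m/s threshold, a point moving at 0.1 m/s is discarded as clutter while points at 3.5 m/s and -4.2 m/s are kept.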
  3. The method according to claim 2, wherein the first range-Doppler map data is range-Doppler map data whose amplitude difference from the range-Doppler map data corresponding to the moving-target point data is less than or equal to an amplitude-difference threshold.
  4. The method according to claim 2 or 3, wherein the obtaining fourth data from the first range-Doppler map data according to a first data selection pattern comprises:
    obtaining at least one group of fifth data from the first range-Doppler map data according to a plurality of data selection patterns, wherein the fourth data is the data with the largest entropy value among the at least one group of fifth data, and the first data selection pattern is the selection pattern corresponding to the fourth data.
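The maximum-entropy selection of claim 4 can be illustrated with a small sketch. Nothing here comes from the patent's implementation: the candidate selection patterns are modeled as boolean masks over a range-Doppler patch, and the Shannon entropy of a magnitude histogram stands in for the claim's unspecified entropy value.

```python
import numpy as np

def patch_entropy(patch, bins=32):
    """Shannon entropy (bits) of the magnitude distribution of the
    selected range-Doppler cells — one possible entropy measure,
    assumed for this sketch."""
    hist, _ = np.histogram(np.abs(patch), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def select_by_max_entropy(rd_patch, patterns):
    """Apply each candidate boolean selection pattern to the patch and
    return (selected_data, winning_pattern) for the group with the
    largest entropy — the claim's "fourth data" and "first data
    selection pattern"."""
    groups = [(rd_patch[mask], mask) for mask in patterns]
    return max(groups, key=lambda g: patch_entropy(g[0]))
```

A pattern selecting varied magnitudes wins over one selecting a flat region, since a constant selection has zero histogram entropy.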
  5. The method according to claim 1, wherein the determining third data according to the second data comprises:
    generating the third data according to the following formula:
    X_new = X_rnd + rand(0,1) * (X_mean - X_rnd),
    wherein X_new is the third data, X_rnd is the second data, X_new and X_rnd correspond to the same target point, X_mean is the average of the data of all target points in the second data, and rand(0,1) is a random-number generation function with 0 < rand(0,1) < 1.
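The interpolation of claim 5 generates each synthetic point on the segment between a measured point and the mean of all target points. A minimal sketch, assuming an (N, D) feature layout for the point cloud; note that NumPy's uniform draw covers [0, 1) rather than the claim's open interval (0, 1), the endpoint having probability zero either way.

```python
import numpy as np

def augment_point_cloud(points, rng=None):
    """Generate one synthetic sample per measured target point via
    X_new = X_rnd + rand(0,1) * (X_mean - X_rnd).

    points: (N, D) array of sparse point-cloud features ("second data").
    Returns an (N, D) array of synthetic points ("third data").
    """
    rng = np.random.default_rng() if rng is None else rng
    x_mean = points.mean(axis=0)                          # X_mean over all target points
    r = rng.uniform(0.0, 1.0, size=(points.shape[0], 1))  # one rand(0,1) per point
    return points + r * (x_mean - points)                 # X_new, same target point as X_rnd
```

Because 0 <= r < 1, every synthetic point lies component-wise between its source point X_rnd and the mean X_mean, which densifies the sparse cloud without leaving the measured data's convex span toward the mean.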
  6. A target recognition apparatus, comprising an obtaining unit and a determining unit, wherein:
    the obtaining unit is configured to obtain second data according to first data, wherein the first data is range-Doppler map data obtained from radar measurement data, and the second data is sparse point-cloud data of the radar;
    the determining unit is configured to determine third data according to the second data, wherein the third data and the second data are used for target recognition.
  7. The apparatus according to claim 6, further comprising an identifying unit, wherein:
    the identifying unit is configured to identify moving-target point data in the second data, wherein the moving-target point data is sparse point-cloud data whose velocity is greater than a velocity threshold;
    the obtaining unit is further configured to obtain first range-Doppler map data from the first data according to the moving-target point data;
    the obtaining unit is further configured to obtain fourth data from the first range-Doppler map data according to a first data selection pattern;
    the determining unit is further configured to determine the third data, wherein the third data is sparse point-cloud data corresponding to the fourth data.
  8. The apparatus according to claim 7, wherein the first range-Doppler map data is range-Doppler map data whose amplitude difference from the range-Doppler map data corresponding to the moving-target point data is less than or equal to an amplitude-difference threshold.
  9. The apparatus according to claim 7 or 8, wherein
    the obtaining unit is further configured to obtain at least one group of fifth data from the first range-Doppler map data according to a plurality of data selection patterns, wherein the fourth data is the data with the largest entropy value among the at least one group of fifth data, and the first data selection pattern is the selection pattern corresponding to the fourth data.
  10. The apparatus according to claim 6, further comprising a generating unit, wherein:
    the generating unit is configured to generate the third data according to the following formula:
    X_new = X_rnd + rand(0,1) * (X_mean - X_rnd),
    wherein X_new is the third data, X_rnd is the second data, X_new and X_rnd correspond to the same target point, X_mean is the average of the data of all target points in the second data, and rand(0,1) is a random-number generation function with 0 < rand(0,1) < 1.
  11. A target recognition apparatus, comprising a processor coupled to a memory, wherein:
    the memory is configured to store a computer program;
    the processor is configured to execute the computer program stored in the memory, so that the target recognition apparatus performs the target recognition method according to any one of claims 1 to 5.
  12. An on-board radar, comprising the target recognition apparatus according to any one of claims 6 to 11.
  13. An on-board terminal, comprising the target recognition apparatus according to any one of claims 6 to 11.
  14. A vehicle, comprising the on-board radar according to claim 12 and/or the on-board terminal according to claim 13.
  15. A computer-readable storage medium, comprising a program or instructions, wherein when the program or instructions are run on a computer, the computer is caused to perform the target recognition method according to any one of claims 1 to 5.
  16. A computer program product, comprising computer program code, wherein when the computer program code is run on a computer, the computer is caused to perform the target recognition method according to any one of claims 1 to 5.
PCT/CN2021/073984 2020-02-21 2021-01-27 Target recognition method and apparatus WO2021164514A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010108659.4A CN113296086A (zh) 2020-02-21 2020-02-21 目标识别方法及装置
CN202010108659.4 2020-02-21

Publications (1)

Publication Number Publication Date
WO2021164514A1 true WO2021164514A1 (zh) 2021-08-26

Family

ID=77318470

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/073984 WO2021164514A1 (zh) 2020-02-21 2021-01-27 目标识别方法及装置

Country Status (2)

Country Link
CN (1) CN113296086A (zh)
WO (1) WO2021164514A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2790143C1 (ru) * 2022-05-18 2023-02-14 Акционерное общество "Научно-исследовательский институт Приборостроения имени В.В. Тихомирова" Способ распознавания типа воздушного объекта по турбинному эффекту

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN117169888A (zh) * 2022-05-25 2023-12-05 华为技术有限公司 数据处理方法与处理装置

Citations (5)

Publication number Priority date Publication date Assignee Title
CN109448111A (zh) * 2018-10-25 2019-03-08 山东鲁能软件技术有限公司 一种图像三维曲面模型优化构建方法及装置
CN110208793A (zh) * 2019-04-26 2019-09-06 纵目科技(上海)股份有限公司 基于毫米波雷达的辅助驾驶系统、方法、终端和介质
CN110427986A (zh) * 2019-07-16 2019-11-08 浙江大学 一种基于毫米波雷达点云特征的核支持向量机目标分类方法
CN110751731A (zh) * 2019-10-16 2020-02-04 光沦科技(深圳)有限公司 一种用于结构光的3d重建方法和系统
US20200058987A1 (en) * 2018-08-17 2020-02-20 Metawave Corporation Multi-layer, multi-steering antenna system for autonomous vehicles

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
US11029403B2 (en) * 2017-12-18 2021-06-08 The Government Of The United States Of America, As Represented By The Secretary Of The Navy Millimeter-wave airborne radar for 3-Dimensional imaging of moving and stationary targets
US11133577B2 (en) * 2018-05-24 2021-09-28 Metawave Corporation Intelligent meta-structure antennas with targeted polarization for object identification
CN110133610B (zh) * 2019-05-14 2020-12-15 浙江大学 基于时变距离-多普勒图的超宽带雷达动作识别方法
CN110450784A (zh) * 2019-07-30 2019-11-15 深圳普捷利科技有限公司 一种基于fmcw雷达的驾驶员状态监视方法及系统
CN110466562B (zh) * 2019-08-14 2020-12-08 南京慧尔视智能科技有限公司 一种基于红外激光和微波的平交道口存在检测装置与方法
CN110609262B (zh) * 2019-08-27 2023-05-05 南京理工大学 一种场面监视雷达的三维恒虚警检测方法


Also Published As

Publication number Publication date
CN113296086A (zh) 2021-08-24

Similar Documents

Publication Publication Date Title
CN106255899B (zh) 用于将对象用信号通知给配备有此装置的车辆的导航模块的装置
CN108688660B (zh) 运行范围确定装置
US10860028B2 (en) Vehicle control apparatus, vehicle control method, and program
US9892329B2 (en) Animal type determination device
US10890658B2 (en) Vehicle group control device
Rawashdeh et al. Collaborative automated driving: A machine learning-based method to enhance the accuracy of shared information
US9669838B2 (en) Method and system for information use
US20200104614A1 (en) Method and device for positioning vehicle, device, and computer readable storage medium
CN113743171A (zh) 目标检测方法及装置
Mammeri et al. Extending the detection range of vision-based vehicular instrumentation
WO2021164514A1 (zh) 目标识别方法及装置
CN114550142A (zh) 基于4d毫米波雷达和图像识别融合的车位检测方法
US20200103918A1 (en) Method for detecting caller by autonomous vehicle
US11640172B2 (en) Vehicle controls based on reliability values calculated from infrastructure information
Liu et al. Vision‐based inter‐vehicle distance estimation for driver alarm system
CN115685224A (zh) 一种激光雷达点云聚类方法、装置、激光雷达及车辆
US20210354634A1 (en) Electronic device for vehicle and method of operating electronic device for vehicle
CN110023781B (zh) 用于根据车辆周围环境的雷达签名确定车辆的准确位置的方法和设备
US11919528B2 (en) Vehicle control system and vehicle control method
US20230358547A1 (en) Map matching method and apparatus, electronic device and storage medium
CN111723866A (zh) 点云聚类的方法及装置、无人车、可读存储介质
CN114694375B (zh) 交通监视系统、交通监视方法及存储介质
CN113312403B (zh) 地图获取方法、装置、电子设备及存储介质
WO2023036032A1 (zh) 一种车道线检测方法及装置
US20230147434A1 (en) System for localizing three-dimensional objects

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21757789

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21757789

Country of ref document: EP

Kind code of ref document: A1