CN116626630B - Object classification method and device, electronic equipment and storage medium - Google Patents

Object classification method and device, electronic equipment and storage medium

Info

Publication number
CN116626630B
CN116626630B (application CN202310913111.0A)
Authority
CN
China
Prior art keywords
distance
target object
probability
gaussian distribution
angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310913111.0A
Other languages
Chinese (zh)
Other versions
CN116626630A (en)
Inventor
胡大林
杨振兴
杨强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Saimu Technology Co ltd
Original Assignee
Beijing Saimu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Saimu Technology Co ltd filed Critical Beijing Saimu Technology Co ltd
Priority to CN202310913111.0A priority Critical patent/CN116626630B/en
Publication of CN116626630A publication Critical patent/CN116626630A/en
Application granted granted Critical
Publication of CN116626630B publication Critical patent/CN116626630B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40Means for monitoring or calibrating
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Probability & Statistics with Applications (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The application provides an object classification method and device, an electronic device, and a storage medium, relating to the technical field of radar sensors. The method comprises: identifying a plurality of objects through a hybrid radar sensor; determining a first polar coordinate corresponding to a target object and a second polar coordinate corresponding to its nearest neighbor object; calculating a distance difference and an angle difference between the target object and the nearest neighbor object from the first polar coordinate and the second polar coordinate; obtaining a Gaussian distribution function of the target object in the polar coordinate system from a preset Gaussian distribution variance; calculating, from the distance difference, the angle difference, the radar resolution and the Gaussian distribution function, a probability result that the target object and the nearest neighbor object are identified as the same object; and determining a classification result of the target object according to the probability result. By outputting the object classification probability in the hybrid radar sensor through the Gaussian variance, the method preserves computational efficiency while broadening the radar's applicable scenarios.

Description

Object classification method and device, electronic equipment and storage medium
Technical Field
The present application relates to the field of radar sensors, and in particular, to an object classification method, an object classification device, an electronic device, and a storage medium.
Background
In autonomous driving, vehicle-mounted millimeter-wave radar sensors are used to detect target obstacles. When designing autonomous-driving simulation software, the detection data of the millimeter-wave radar must be simulated so that the perception algorithms of an autonomous vehicle can be exercised against it, and the simulated millimeter-wave radar output must reproduce the characteristics of a real millimeter-wave radar. One such characteristic is resolution: two objects that are too close to each other in range, or in angle relative to the sensor, cannot be separated by the millimeter-wave radar and are classified as a single object. Because of the accuracy and error of the millimeter-wave radar, this phenomenon occurs probabilistically. In the prior art, detection simulation of target obstacles is usually performed with either a hybrid sensor model or a physical sensor model.
Because the output of a physical sensor model is computed frame by frame through a physical pipeline, an identification probability cannot be produced in real time; it can only be estimated by measuring and counting statistics over many frames of simulation data. A hybrid model is computationally more efficient, but it does not support computing the probability, or likelihood, that a target obstacle is correctly classified.
Disclosure of Invention
In view of the above, an object of the present application is to provide at least an object classification method, an object classification device, an electronic device, and a storage medium that output the object classification probability in a hybrid radar sensor by means of a Gaussian variance, thereby broadening radar application scenarios while maintaining computational efficiency.
The application mainly comprises the following aspects:
in a first aspect, an embodiment of the present application provides a method for classifying objects, including:
in a preset simulation environment, identifying a plurality of objects through a hybrid radar sensor; determining the nearest neighbor object closest to a target object in the preset simulation environment; determining, in a polar coordinate system with the hybrid radar sensor as the pole, a first polar coordinate corresponding to the target object and a second polar coordinate corresponding to the nearest neighbor object; calculating a distance difference and an angle difference between the target object and the nearest neighbor object from the first polar coordinate and the second polar coordinate; obtaining a Gaussian distribution function of the target object in the polar coordinate system from a preset Gaussian distribution variance; calculating, from the distance difference, the angle difference, the radar resolution and the Gaussian distribution function, a probability result that the target object and the nearest neighbor object are identified as the same object; and determining a classification result of the target object according to the probability result.
In one possible embodiment, the first polar coordinate comprises a first distance value and a first angle value, and the second polar coordinate comprises a second distance value and a second angle value, wherein the distance difference and the angle difference between the target object and the nearest neighbor object are determined by: determining a difference between the first distance value and the second distance value as a distance difference between the target object and the nearest neighbor object; the difference between the first angle value and the second angle value is determined as the angle difference between the target object and the nearest neighbor object.
In one possible implementation, the preset Gaussian distribution variance includes a distance Gaussian distribution variance and an angular Gaussian distribution variance, and the Gaussian distribution function includes a distance Gaussian distribution function and an angular Gaussian distribution function, where the Gaussian distribution function of the target object in the polar coordinate system is obtained as follows: adding the distance Gaussian distribution variance to the first distance value to obtain the distance Gaussian distribution function, wherein the horizontal axis of the distance Gaussian distribution function represents the predicted distance values the target object may take under the influence of the distance Gaussian distribution variance, and the vertical axis represents the occurrence probability of each predicted distance value; and adding the angular Gaussian distribution variance to the first angle value to obtain the angular Gaussian distribution function, wherein the horizontal axis of the angular Gaussian distribution function represents the predicted angle values the target object may take under the influence of the angular Gaussian distribution variance, and the vertical axis represents the occurrence probability of each predicted angle value.
In one possible embodiment, the radar resolution includes a distance resolution and an angular resolution, wherein the probability result is determined by: determining, according to the distance difference and the distance resolution, a distance range within which the target object and the nearest neighbor object cannot be correctly classified; calculating, according to the distance Gaussian distribution variance, the distance range and the distance resolution, a first probability that the predicted distance of the target object falls within the distance range, the first probability representing the likelihood that the target object and the nearest neighbor object cannot be correctly classified in the distance dimension; determining, based on the angle difference and the angular resolution, an angle range within which the target object and the nearest neighbor object cannot be correctly classified; calculating, according to the angular Gaussian distribution variance, the angle range and the angular resolution, a second probability that the predicted angle of the target object falls within the angle range, the second probability representing the likelihood that the target object and the nearest neighbor object cannot be correctly classified in the angle dimension; and calculating, according to the first probability and the second probability, the probability result that the target object and the nearest neighbor object are identified as the same object.
In one possible implementation, the first probability is determined by the following formula:

$$P_1=\int_{R_{\min}}^{R_{\max}}\frac{1}{\sqrt{2\pi}\,\sigma_R}\exp\!\left(-\frac{x^{2}}{2\sigma_R^{2}}\right)dx,\qquad R_{\max}=\Delta R+\mathrm{res}_R,\quad R_{\min}=\Delta R-\mathrm{res}_R$$

where $P_1$ denotes the first probability, $\Delta R$ the distance difference between the target object and the nearest neighbor object, $\sigma_R$ the distance Gaussian distribution variance, $R_{\max}$ and $R_{\min}$ the upper and lower limits of the distance range, and $\mathrm{res}_R$ the distance resolution.
The second probability is determined by the following formula:

$$P_2=\int_{\theta_{\min}}^{\theta_{\max}}\frac{1}{\sqrt{2\pi}\,\sigma_\theta}\exp\!\left(-\frac{x^{2}}{2\sigma_\theta^{2}}\right)dx,\qquad \theta_{\max}=\Delta\theta+\mathrm{res}_\theta,\quad \theta_{\min}=\Delta\theta-\mathrm{res}_\theta$$

where $P_2$ denotes the second probability, $\Delta\theta$ the angle difference between the target object and the nearest neighbor object, $\sigma_\theta$ the angular Gaussian distribution variance, $\theta_{\max}$ and $\theta_{\min}$ the upper and lower limits of the angle range, and $\mathrm{res}_\theta$ the angular resolution.
In one possible embodiment, the step of calculating the probability result that the target object and the nearest neighbor object are identified as the same object based on the first probability and the second probability includes: calculating the product of the first probability and the second probability; and determining the difference between 1 and the product as the probability result that the target object and the nearest neighbor object are identified as the same object.
In one possible embodiment, the step of determining the classification result of the target object according to the probability result includes: comparing the probability result with a first preset threshold, and if the probability result is greater than or equal to the first preset threshold, determining that the radar sensor classifies the target object correctly; if the probability result is smaller than the first preset threshold, judging whether the probability result is smaller than a second preset threshold; if the probability result is smaller than the second preset threshold, determining that the radar sensor classifies the target object incorrectly; and if the probability result is greater than or equal to the second preset threshold, determining that the radar sensor's classification of the target object is inaccurate.
In a second aspect, an embodiment of the present application further provides an object classification apparatus, where the apparatus includes:
the identification module is configured to identify a plurality of objects through the hybrid radar sensor in a preset simulation environment; the first determining module is configured to determine the nearest neighbor object closest to the target object in the preset simulation environment, and to determine, in a polar coordinate system with the hybrid radar sensor as the pole, a first polar coordinate corresponding to the target object and a second polar coordinate corresponding to the nearest neighbor object; the first calculating module is configured to calculate a distance difference and an angle difference between the target object and the nearest neighbor object from the first polar coordinate and the second polar coordinate; the second determining module is configured to obtain a Gaussian distribution function of the target object in the polar coordinate system from a preset Gaussian distribution variance; the second calculating module is configured to calculate, from the distance difference, the angle difference, the radar resolution and the Gaussian distribution function, a probability result that the target object and the nearest neighbor object are identified as the same object; and the third determining module is configured to determine a classification result of the target object according to the probability result.
In a third aspect, an embodiment of the present application further provides an electronic device, including: a processor, a memory and a bus, the memory storing machine readable instructions executable by the processor, the processor and the memory in communication via the bus when the electronic device is running, the machine readable instructions when executed by the processor performing the steps of the method of object classification in the first aspect or any of the possible implementation manners of the first aspect.
In a fourth aspect, the embodiment of the present application further provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor performs the steps of the method for classifying objects in the first aspect or any of the possible implementation manners of the first aspect.
Embodiments of the present application provide an object classification method and device, an electronic device, and a storage medium. The method comprises: identifying a plurality of objects through a hybrid radar sensor; determining a first polar coordinate corresponding to a target object and a second polar coordinate corresponding to its nearest neighbor object; calculating a distance difference and an angle difference between the target object and the nearest neighbor object from the first polar coordinate and the second polar coordinate; obtaining a Gaussian distribution function of the target object in the polar coordinate system from a preset Gaussian distribution variance; calculating, from the distance difference, the angle difference, the radar resolution and the Gaussian distribution function, a probability result that the target object and the nearest neighbor object are identified as the same object; and determining a classification result of the target object according to the probability result. By outputting the object classification probability in the hybrid radar sensor through the Gaussian variance, the method preserves computational efficiency while broadening the radar's applicable scenarios.
The application has the advantages that:
1. The method outputs the probability that an object is correctly classified while bypassing cumbersome physical measurement and simulation processes; it can be applied directly in an efficient hybrid sensor model and avoids the impact that a non-real-time physical sensor simulation would have on verification of the simulation technology.
2. The method simplifies the target classification flow so that data is output faster, quantifies the radar's recognition and classification capability with the concept of a correct-classification probability, and can be applied to the simulation verification of more autonomous-driving algorithms.
In order to make the above objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for classifying objects according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a distance Gaussian distribution function according to an embodiment of the application;
fig. 3 is a schematic structural diagram of an object classification device according to an embodiment of the present application;
fig. 4 shows a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described with reference to the accompanying drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are for the purpose of illustration and description only and are not intended to limit the scope of the present application. In addition, it should be understood that the schematic drawings are not drawn to scale. A flowchart, as used in this disclosure, illustrates operations implemented according to some embodiments of the present application. It should be appreciated that the operations of the flow diagrams may be implemented out of order and that steps without logical context may be performed in reverse order or concurrently. Moreover, one or more other operations may be added to or removed from the flow diagrams by those skilled in the art under the direction of the present disclosure.
In addition, the described embodiments are only some, but not all, embodiments of the application. The components of the embodiments of the present application generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the application, as presented in the figures, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by a person skilled in the art based on embodiments of the application without making any inventive effort, fall within the scope of the application.
In the prior art, physical sensors and hybrid sensors are commonly used for measuring obstacles in a simulation environment, and specifically:
Compared with the physical sensor, which is more accurate and relies more directly on millimeter-wave principles, the hybrid radar sensor approximates several millimeter-wave characteristics and adds them directly to the data. Its advantage is high computation speed, but it does not provide the ability to output probability parameters.
The physical sensor needs to identify objects or measure obstacles using ray-tracing methods, so the signal emission and reception of the sensor and many additional sensor parameters must be considered. Reproducing the operating principle of the sensor requires a large amount of computation and modeling of environmental materials, which makes the classification efficiency of the physical sensor low.
Moreover, because the output of the physical sensor is computed frame by frame through a physical process, probability parameters cannot be output in real time. For example, if the probability that two target objects are correctly classified is 90%, the physical sensor expresses this as follows: out of 100 frames of simulation data, 90 frames identify two target objects and 10 frames identify a single object. In other words, a physical sensor cannot output the probability that a target is correctly classified within a single real-time frame.
Based on this, an embodiment of the present application provides an object classification method that enables a hybrid radar sensor to output, in real time, the likelihood that targets are correctly classified, broadening radar application scenarios while maintaining computational efficiency. The method specifically includes the following steps:
referring to fig. 1, fig. 1 is a flowchart illustrating an object classification method according to an embodiment of the application. As shown in fig. 1, the object classification method provided by the embodiment of the application includes the following steps:
s100, in a preset simulation environment, a plurality of objects are obtained through recognition of the hybrid radar sensor.
In a specific implementation, the hybrid radar sensor may be a hybrid millimeter-wave radar sensor, and the preset simulation environment may be a pre-built autonomous-driving simulation environment. The hybrid radar sensor is mounted on the host vehicle in the simulation environment, and the plurality of objects may be vehicles or other obstacles. The hybrid radar sensor is controlled to scan the simulation environment, and the objects located within its detection range are determined from its detection data.
S200, determining the nearest object closest to the target object in a preset simulation environment, and respectively determining a first polar coordinate corresponding to the target object and a second polar coordinate corresponding to the nearest object under a polar coordinate system taking the hybrid radar sensor as a pole.
In the present application, any one of the plurality of objects can be selected as the target object. From the detection data output by the radar sensor, the position coordinate of each object in a rectangular coordinate system with the radar sensor at the origin can be determined; from these position coordinates, the distance between the target object and every other object is calculated, and the object with the minimum distance is taken as the nearest neighbor object of the target object.
In the application, the position coordinates corresponding to the target object and the nearest object are subjected to polar coordinate conversion respectively, so that a first polar coordinate corresponding to the target object and a second polar coordinate corresponding to the nearest object can be obtained, wherein the first polar coordinate comprises a first distance value and a first angle value, and the second polar coordinate comprises a second distance value and a second angle value.
In particular, the first polar coordinate may be expressed as $(R_1,\theta_1)$, where $R_1$ represents the first distance value and $\theta_1$ represents the first angle value; the second polar coordinate may be expressed as $(R_2,\theta_2)$, where $R_2$ represents the second distance value and $\theta_2$ represents the second angle value.
S300, calculating a distance difference value and an angle difference value between the target object and the nearest neighboring object according to the first polar coordinate and the second polar coordinate.
In a preferred embodiment, the distance and angle differences between the target object and the nearest neighbor object are determined by:
the difference between the first distance value and the second distance value is determined as the distance difference between the target object and the nearest object, and the difference between the first angle value and the second angle value is determined as the angle difference between the target object and the nearest object.
In particular, the distance difference is $\Delta R = R_1 - R_2$ and the angle difference is $\Delta\theta = \theta_1 - \theta_2$.
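A minimal Python sketch of steps S200 and S300 follows; it assumes the detections are given as (x, y) positions in a Cartesian frame with the hybrid radar sensor at the origin, and the function name, data layout and sample coordinates are illustrative assumptions rather than part of this application.

```python
import math

def polar_differences(detections, target_idx):
    """Pick the nearest neighbor of a target detection and return the
    distance/angle differences in the sensor-centered polar frame.
    detections: list of (x, y) positions with the radar sensor at the origin."""
    tx, ty = detections[target_idx]

    # Nearest neighbor by Euclidean distance, excluding the target itself.
    others = [(i, p) for i, p in enumerate(detections) if i != target_idx]
    n_idx, (nx, ny) = min(others, key=lambda ip: math.dist((tx, ty), ip[1]))

    # Cartesian -> polar conversion with the sensor as the pole.
    r1, theta1 = math.hypot(tx, ty), math.atan2(ty, tx)   # first polar coordinate
    r2, theta2 = math.hypot(nx, ny), math.atan2(ny, nx)   # second polar coordinate

    delta_r = r1 - r2              # distance difference
    delta_theta = theta1 - theta2  # angle difference
    return n_idx, delta_r, delta_theta

# Illustrative example: a target vehicle and two other obstacles.
objects = [(30.0, 1.0), (31.0, 1.5), (60.0, -10.0)]
print(polar_differences(objects, target_idx=0))
```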
S400, obtaining a Gaussian distribution function of the target object under a polar coordinate system by presetting the Gaussian distribution variance.
In the present application, the measurement error that may occur for the target object is considered, while the measurement errors of the other objects are not (i.e., their measurement errors are treated as zero). Therefore, the preset Gaussian distribution variance is introduced only for the target object. Covering the two error dimensions of distance and angle, the preset Gaussian distribution variance includes a distance Gaussian distribution variance and an angular Gaussian distribution variance, and the Gaussian distribution function accordingly includes a distance Gaussian distribution function and an angular Gaussian distribution function.
In a preferred embodiment, the gaussian distribution function of the target object in the polar coordinate system is obtained by:
Adding the distance Gaussian distribution variance to the first distance value yields the distance Gaussian distribution function, whose horizontal axis represents the predicted distance values the target object may take under the influence of the distance Gaussian distribution variance and whose vertical axis represents the occurrence probability of each predicted distance value. Adding the angular Gaussian distribution variance to the first angle value yields the angular Gaussian distribution function, whose horizontal axis represents the predicted angle values the target object may take under the influence of the angular Gaussian distribution variance and whose vertical axis represents the occurrence probability of each predicted angle value.
Referring to fig. 2, fig. 2 is a schematic diagram illustrating a distance Gaussian distribution function according to an embodiment of the present application. As shown in FIG. 2, the normal distribution curve L corresponds to a distance Gaussian distribution function with distance Gaussian distribution variance $\sigma_R$ and mathematical expectation $\mu$ equal to the first distance value $R_1$. The horizontal axis R represents the predicted distance value obtained under the influence of the distance Gaussian distribution variance, and the vertical axis P represents the occurrence probability of that predicted distance value. The target Object1 is taken as the center point, i.e., Object1 is located at the position shown in FIG. 2, and the nearest neighbor Object2 is located at the position shown in FIG. 2 relative to Object1; because of the symmetry of the Gaussian distribution, the relative distance is taken as positive.
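For illustration, a minimal sketch of the distance Gaussian distribution function of step S400 is given below; it treats the preset distance Gaussian distribution variance as a spread parameter sigma_r (used as a standard deviation), and the numeric values are placeholders rather than parameters from this application.

```python
import math

def distance_gaussian_pdf(predicted_r, r1, sigma_r):
    """Density of the predicted distance of the target object, modeled as a
    normal distribution centered on the measured first distance value r1
    with spread sigma_r (the preset distance Gaussian distribution variance)."""
    coeff = 1.0 / (math.sqrt(2.0 * math.pi) * sigma_r)
    return coeff * math.exp(-((predicted_r - r1) ** 2) / (2.0 * sigma_r ** 2))

# Probability density of predicting 30.5 m when the measured range is 30 m
# and sigma_r = 0.4 m (illustrative numbers only).
print(distance_gaussian_pdf(30.5, r1=30.0, sigma_r=0.4))
```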
S500, calculating a probability result that the target object and the nearest neighbor object are identified as the same object according to the distance difference value, the angle difference value, the radar resolution and the Gaussian distribution function.
Wherein the radar resolution includes a range resolution and an angle resolution.
In a preferred embodiment, the probability result is determined as follows:
and determining a distance range in which the target object and the nearest object cannot be correctly classified according to the distance difference value and the distance resolution, and calculating a first probability that the predicted distance of the target object is in the distance range according to the distance Gaussian distribution variance, the distance range and the distance resolution, wherein the first probability represents the possibility that the target object and the nearest object cannot be correctly classified in the distance dimension.
As shown in FIG. 2, the width between the broken lines LL1 and LL2 is twice the distance resolution $\mathrm{res}_R$, and the resulting distance range is $(\Delta R-\mathrm{res}_R,\ \Delta R+\mathrm{res}_R)$. Thus, given the randomness of the predicted distance of Object1, the probability that Object1 cannot be correctly classified, i.e. the first probability, is obtained by calculating the probability that the predicted distance of Object1 falls within this distance range.
If the predicted distance value of the target object falls within the distance range bounded by the broken lines LL1 and LL2, it is determined that the target object cannot be correctly classified, that is, the target Object1 and the nearest neighbor Object2 are recognized as the same object by the hybrid millimeter-wave radar.
Similarly, an angle range within which the target object and the nearest neighbor object cannot be correctly classified is determined from the angle difference and the angular resolution, and a second probability that the predicted angle of the target object falls within this angle range is calculated from the angular Gaussian distribution variance, the angle range and the angular resolution; the second probability represents the likelihood that the target object and the nearest neighbor object cannot be correctly classified in the angle dimension.
The second probability is determined in the same manner as the first probability and is not described again here.
A probability result that the target object and the nearest neighbor object are identified as the same object is then calculated from the first probability and the second probability.
In another preferred embodiment, the first probability is determined by the following formula:

$$P_1=\int_{R_{\min}}^{R_{\max}}\frac{1}{\sqrt{2\pi}\,\sigma_R}\exp\!\left(-\frac{x^{2}}{2\sigma_R^{2}}\right)dx,\qquad R_{\max}=\Delta R+\mathrm{res}_R,\quad R_{\min}=\Delta R-\mathrm{res}_R$$

where $P_1$ denotes the first probability, $\Delta R$ the distance difference between the target object and the nearest neighbor object, $\sigma_R$ the distance Gaussian distribution variance, $R_{\max}$ and $R_{\min}$ the upper and lower limits of the distance range, and $\mathrm{res}_R$ the distance resolution.
The second probability is determined by the following formula:

$$P_2=\int_{\theta_{\min}}^{\theta_{\max}}\frac{1}{\sqrt{2\pi}\,\sigma_\theta}\exp\!\left(-\frac{x^{2}}{2\sigma_\theta^{2}}\right)dx,\qquad \theta_{\max}=\Delta\theta+\mathrm{res}_\theta,\quad \theta_{\min}=\Delta\theta-\mathrm{res}_\theta$$

where $P_2$ denotes the second probability, $\Delta\theta$ the angle difference between the target object and the nearest neighbor object, $\sigma_\theta$ the angular Gaussian distribution variance, $\theta_{\max}$ and $\theta_{\min}$ the upper and lower limits of the angle range, and $\mathrm{res}_\theta$ the angular resolution.
In a specific embodiment, the step of calculating the probability result that the target object and the nearest neighbor object are identified as the same object according to the first probability and the second probability comprises:
The product of the first probability and the second probability is calculated, and the difference between 1 and the product is determined as the probability result that the target object and the nearest neighbor object are identified as the same object.
Specifically, the calculation can be performed by the following formula:

$$P = 1 - P_1 \times P_2$$

where $P$ represents the probability result that the target object and the nearest neighbor object are identified as the same object, and $P_1$ and $P_2$ are the first and second probabilities.
In the present application, the above process is repeated for every object within the detection range of the hybrid millimeter-wave sensor, so that the output of the probability that the targets are correctly classified (namely, the probability result) can be attached to each object, thereby giving the hybrid millimeter-wave sensor the ability to output probability-type parameters.
S600, determining a classification result of the target object according to the probability result.
In a preferred embodiment, step S600 includes:
The probability result is compared with a first preset threshold; if the probability result is greater than or equal to the first preset threshold, it is determined that the radar sensor classifies the target object correctly. If the probability result is smaller than the first preset threshold, it is judged whether the probability result is smaller than a second preset threshold; if so, it is determined that the radar sensor classifies the target object incorrectly, and if the probability result is greater than or equal to the second preset threshold, it is determined that the radar sensor's classification of the target object is inaccurate.
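A minimal sketch of this two-threshold decision is shown below; the threshold values 0.9 and 0.5 are placeholders rather than values specified by this application.

```python
def classify(prob_result, first_threshold=0.9, second_threshold=0.5):
    """Map the probability result to a classification outcome using two
    preset thresholds (first_threshold > second_threshold)."""
    if prob_result >= first_threshold:
        return "classified correctly"
    if prob_result < second_threshold:
        return "classified incorrectly"
    return "classification inaccurate"

print(classify(0.95), classify(0.30), classify(0.70))
```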
Based on the same application conception, the embodiment of the application also provides an object classification device corresponding to the object classification method provided by the embodiment, and because the principle of solving the problem of the device in the embodiment of the application is similar to that of the object classification method of the embodiment of the application, the implementation of the device can refer to the implementation of the method, and the repetition is omitted.
Referring to fig. 3, fig. 3 is a schematic structural diagram of an object classification device according to an embodiment of the application. As shown in fig. 3, the apparatus includes:
the recognition module 700 is configured to recognize a plurality of objects through the hybrid radar sensor in a preset simulation environment.
The first determining module 710 is configured to determine a nearest object closest to the target object in the preset simulation environment, and determine a first polar coordinate corresponding to the target object and a second polar coordinate corresponding to the nearest object in a polar coordinate system using the hybrid radar sensor as a pole, respectively.
The first calculating module 720 is configured to calculate a distance difference and an angle difference between the target object and the nearest neighboring object according to the first polar coordinate and the second polar coordinate.
The second determining module 730 is configured to obtain a gaussian distribution function of the target object in the polar coordinate system by presetting a gaussian distribution variance.
The second calculation module 740 is configured to calculate a probability result that the target object and the nearest neighboring object are identified as the same object according to the distance difference, the angle difference, the radar resolution and the gaussian distribution function.
The third determining module 750 is configured to determine a classification result of the target object according to the probability result.
Based on the same application concept, please refer to fig. 4, which shows a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 4, the electronic device 800 includes a processor 810, a memory 820 and a bus 830. The memory 820 stores machine-readable instructions executable by the processor 810. When the electronic device 800 is running, the processor 810 and the memory 820 communicate via the bus 830, and the machine-readable instructions, when executed by the processor 810, perform the steps of the object classification method provided in any of the embodiments described above.
Based on the same application concept, the embodiment of the present application further provides a computer readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the steps of the object classification method provided in the above embodiment are performed.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described system and apparatus may refer to corresponding procedures in the foregoing method embodiments, which are not described herein again. In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. The above-described apparatus embodiments are merely illustrative, for example, the division of the units is merely a logical function division, and there may be other manners of division in actual implementation, and for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some communication interface, device or unit indirect coupling or communication connection, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer readable storage medium executable by a processor. Based on this understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The foregoing is merely illustrative of the present application, and the present application is not limited thereto, and any person skilled in the art will readily appreciate variations or alternatives within the scope of the present application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (9)

1. A method of classifying objects, the method comprising:
in a preset simulation environment, a plurality of objects are obtained through the identification of the hybrid radar sensor;
determining a nearest object closest to a target object in a preset simulation environment, and respectively determining a first polar coordinate corresponding to the target object and a second polar coordinate corresponding to the nearest object under a polar coordinate system taking a hybrid radar sensor as a pole;
calculating a distance difference value and an angle difference value between the target object and the nearest object according to the first polar coordinate and the second polar coordinate;
obtaining a Gaussian distribution function of the target object under the polar coordinate system by presetting Gaussian distribution variance;
calculating a probability result that the target object and the nearest object are identified as the same object according to the distance difference value, the angle difference value, the radar resolution and the Gaussian distribution function;
determining a classification result of the target object according to the probability result;
the radar resolution includes a distance resolution and an angle resolution,
wherein the probability result is determined by:
determining a distance range in which the target object and the nearest object cannot be correctly classified according to the distance difference value and the distance resolution;
calculating a first probability that the predicted distance of the target object is within the distance range according to the distance Gaussian distribution variance, the distance range and the distance resolution, wherein the first probability represents the possibility that the target object and the nearest object cannot be correctly classified in the distance dimension;
determining an angular range in which the target object and the nearest neighbor object cannot be correctly classified based on the angle difference value and the angle resolution;
calculating a second probability that the predicted angle of the target object is in the angle range according to the angular Gaussian distribution variance, the angle range and the angle resolution, wherein the second probability represents the possibility that the target object and the nearest object cannot be correctly classified in the angle dimension;
and calculating a probability result that the target object and the nearest object are identified as the same object according to the first probability and the second probability.
2. The method of claim 1, wherein the first polar coordinate comprises a first distance value and a first angle value, the second polar coordinate comprises a second distance value and a second angle value,
wherein the distance difference and the angle difference between the target object and the nearest object are determined by:
determining a difference between the first distance value and the second distance value as a distance difference between the target object and the nearest object;
a difference between the first angle value and the second angle value is determined as an angle difference between the target object and the nearest neighbor object.
3. The method of claim 2, wherein the predetermined gaussian distribution variance comprises a distance gaussian distribution variance and an angular gaussian distribution variance, the gaussian distribution function comprises a distance gaussian distribution function and an angular gaussian distribution function,
the Gaussian distribution function of the target object under the polar coordinate system is obtained by the following steps:
adding the distance Gaussian distribution variance to the first distance value to obtain a distance Gaussian distribution function, wherein the horizontal axis of the distance Gaussian distribution function represents a predicted distance value of the target object under the influence of the distance Gaussian distribution variance, and the vertical axis of the distance Gaussian distribution function represents the occurrence probability of the predicted distance value;
and adding the angular Gaussian distribution variance to the first angle value to obtain an angular Gaussian distribution function, wherein the horizontal axis of the angular Gaussian distribution function represents a plurality of predicted angle values obtained by the target object under the influence of the angular Gaussian distribution variance, and the vertical axis of the angular Gaussian distribution function represents the occurrence probability of the predicted angle.
4. A method according to claim 3, wherein the first probability is determined by the following formula:

$$P_1=\int_{R_{\min}}^{R_{\max}}\frac{1}{\sqrt{2\pi}\,\sigma_R}\exp\!\left(-\frac{x^{2}}{2\sigma_R^{2}}\right)dx,\qquad R_{\max}=\Delta R+\mathrm{res}_R,\quad R_{\min}=\Delta R-\mathrm{res}_R$$

where $P_1$ denotes the first probability, $\Delta R$ the distance difference between the target object and the nearest object, $\sigma_R$ the distance Gaussian distribution variance, $R_{\max}$ and $R_{\min}$ the upper and lower limits of the distance range, and $\mathrm{res}_R$ the distance resolution;
the second probability is determined by the following formula:

$$P_2=\int_{\theta_{\min}}^{\theta_{\max}}\frac{1}{\sqrt{2\pi}\,\sigma_\theta}\exp\!\left(-\frac{x^{2}}{2\sigma_\theta^{2}}\right)dx,\qquad \theta_{\max}=\Delta\theta+\mathrm{res}_\theta,\quad \theta_{\min}=\Delta\theta-\mathrm{res}_\theta$$

where $P_2$ denotes the second probability, $\Delta\theta$ the angle difference between the target object and the nearest object, $\sigma_\theta$ the angular Gaussian distribution variance, $\theta_{\max}$ and $\theta_{\min}$ the upper and lower limits of the angle range, and $\mathrm{res}_\theta$ the angular resolution.
5. The method of claim 4, wherein calculating a probability result that the target object is identified as the same object as the nearest object based on the first probability and the second probability comprises:
calculating a product between the first probability and the second probability;
determining the difference between 1 and the product as the probability result that the target object and the nearest object are identified as the same object.
6. The method of claim 1, wherein determining a classification result of the target object based on the probability result comprises:
comparing the probability result with a first preset threshold, and if the probability result is greater than or equal to the first preset threshold, determining that the radar sensor classifies the target object correctly;
if the probability result is smaller than the first preset threshold, judging whether the probability result is smaller than a second preset threshold;
if the probability result is smaller than the second preset threshold, determining that the radar sensor classifies the target object incorrectly;
and if the probability result is greater than or equal to the second preset threshold, determining that the radar sensor's classification of the target object is inaccurate.
7. An object classification device, the device comprising:
the identification module is used for identifying a plurality of objects through the hybrid radar sensor in a preset simulation environment;
the first determining module is used for determining the nearest object closest to the target object in a preset simulation environment and respectively determining a first polar coordinate corresponding to the target object and a second polar coordinate corresponding to the nearest object under a polar coordinate system taking the hybrid radar sensor as a pole;
the first calculation module is used for calculating a distance difference value and an angle difference value between the target object and the nearest object according to the first polar coordinate and the second polar coordinate;
the second determining module is used for obtaining a Gaussian distribution function of the target object under the polar coordinate system by presetting a Gaussian distribution variance;
the second calculation module is used for calculating a probability result that the target object and the nearest object are identified as the same object according to the distance difference value, the angle difference value, the radar resolution and the Gaussian distribution function;
a third determining module, configured to determine a classification result of the target object according to the probability result;
the radar resolution includes a distance resolution and an angle resolution,
wherein the second calculation module is further configured to:
determining a distance range in which the target object and the nearest object cannot be correctly classified according to the distance difference value and the distance resolution;
calculating a first probability that the predicted distance of the target object is within the distance range according to the distance Gaussian distribution variance, the distance range and the distance resolution, wherein the first probability represents the possibility that the target object and the nearest object cannot be correctly classified in the distance dimension;
determining an angular range in which the target object and the nearest neighbor object cannot be correctly classified based on the angle difference value and the angle resolution;
calculating a second probability that the predicted angle of the target object is in the angle range according to the angular Gaussian distribution variance, the angle range and the angle resolution, wherein the second probability represents the possibility that the target object and the nearest object cannot be correctly classified in the angle dimension;
and calculating a probability result that the target object and the nearest object are identified as the same object according to the first probability and the second probability.
8. An electronic device, comprising: a processor, a memory and a bus, said memory storing machine readable instructions executable by said processor, said processor and said memory communicating via said bus when the electronic device is running, said machine readable instructions when executed by said processor performing the steps of the object classification method according to any of claims 1 to 6.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a computer program which, when executed by a processor, performs the steps of the object classification method according to any of claims 1 to 6.
CN202310913111.0A 2023-07-25 2023-07-25 Object classification method and device, electronic equipment and storage medium Active CN116626630B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310913111.0A CN116626630B (en) 2023-07-25 2023-07-25 Object classification method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310913111.0A CN116626630B (en) 2023-07-25 2023-07-25 Object classification method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN116626630A CN116626630A (en) 2023-08-22
CN116626630B (en) 2023-09-29

Family

ID=87610326

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310913111.0A Active CN116626630B (en) 2023-07-25 2023-07-25 Object classification method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116626630B (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004111938A1 (en) * 2003-05-30 2004-12-23 Robert Bosch Gmbh Method and device for object recognition in driver aid systems for motor vehicles
JP2014115119A (en) * 2012-12-06 2014-06-26 Toyota Motor Corp Object detector
CN105572663A (en) * 2014-09-19 2016-05-11 通用汽车环球科技运作有限责任公司 Detection of a distributed radar target based on an auxiliary sensor
WO2018234130A1 (en) * 2017-06-21 2018-12-27 Valeo Schalter Und Sensoren Gmbh Classification and localization of an object by a lidar sensor apparatus of a motor vehicle
CN110146865A (en) * 2019-05-31 2019-08-20 阿里巴巴集团控股有限公司 Target identification method and device for radar image
CN115272416A (en) * 2022-08-16 2022-11-01 太原理工大学 Vehicle and pedestrian detection tracking method and system based on multi-source sensor fusion

Also Published As

Publication number Publication date
CN116626630A (en) 2023-08-22


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant