CN115079155A - Target detection method and device and vehicle - Google Patents

Target detection method and device and vehicle

Info

Publication number
CN115079155A
Authority
CN
China
Prior art keywords
target
sensor
sensing data
determining
difference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210588133.XA
Other languages
Chinese (zh)
Inventor
关瀛洲
王祎男
刘汉旭
陈伟轩
Current Assignee
FAW Group Corp
Original Assignee
FAW Group Corp
Priority date
Filing date
Publication date
Application filed by FAW Group Corp
Priority to CN202210588133.XA
Publication of CN115079155A
Legal status: Pending



Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06 Systems determining position data of a target
    • G01S13/08 Systems for measuring distance only
    • G01S13/50 Systems of measurement based on relative movement of target
    • G01S13/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 Combination of radar systems with cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a target detection method, a target detection device, and a vehicle. The method comprises: acquiring a perception data set of a current vehicle, wherein the perception data set comprises first perception data sensed by a first sensor, second perception data sensed by a second sensor, and the vehicle speed; determining, based on the perception data set, a target judgment threshold corresponding to the current vehicle, wherein the target judgment threshold is used to judge whether a first target perceived by the first sensor and a second target perceived by the second sensor are the same; and comparing the first perception data and the second perception data with the target judgment threshold to obtain the detection results of the first sensor and the second sensor. The invention solves the technical problem of low target detection accuracy in the related art.

Description

Target detection method and device and vehicle
Technical Field
The invention relates to the field of driver assistance, and in particular to a target detection method, a target detection device, and a vehicle.
Background
Driver assistance technology is developing rapidly, and driver assistance functions are gradually becoming standard on mid- to high-end vehicle models at home and abroad. Such vehicles mainly carry an intelligent monocular camera and an intelligent millimeter-wave radar to detect targets ahead, and keep the vehicle driving safely through target data association, fusion processing, path planning, and vehicle control. However, the prior art mostly uses a fixed distance threshold for target association judgment, which cannot reasonably reflect how sensor error changes under real environmental conditions and across different scenarios.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the invention provide a target detection method, a target detection device, and a vehicle, which at least solve the technical problem of low target detection accuracy in the related art.
According to one aspect of the embodiments of the present invention, there is provided a target detection method, including: acquiring a perception data set of a current vehicle, wherein the perception data set comprises first perception data sensed by a first sensor, second perception data sensed by a second sensor, and the vehicle speed; determining, based on the perception data set, a target judgment threshold corresponding to the current vehicle, wherein the target judgment threshold is used to judge whether a first target perceived by the first sensor and a second target perceived by the second sensor are the same; and comparing the first perception data and the second perception data with the target judgment threshold to obtain the detection results of the first sensor and the second sensor.
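The claimed three-step flow can be sketched as a small pipeline. This is a minimal, non-normative illustration: the function names, dictionary keys, and the toy threshold and comparison below are assumptions of this sketch, not the patent's implementation.

```python
def detect_target(first_data, second_data, vehicle_speed,
                  determine_threshold, compare):
    """Sketch of the claimed method: (1) acquire the perception data set,
    (2) derive the target judgment threshold from it, (3) compare the two
    sensors' data against that threshold."""
    perception_set = {
        "first": first_data,      # first perception data (e.g. camera)
        "second": second_data,    # second perception data (e.g. radar)
        "speed": vehicle_speed,   # real-time speed of the current vehicle
    }
    threshold = determine_threshold(perception_set)      # step S104
    return compare(first_data, second_data, threshold)   # step S106

# Toy usage: a constant threshold and a longitudinal-distance gate.
result = detect_target(
    {"x": 50.0}, {"x": 50.4}, 60.0,
    determine_threshold=lambda s: 1.0,
    compare=lambda a, b, t: abs(a["x"] - b["x"]) <= t,
)
```

In the patent, `determine_threshold` is the dynamic, mapping-based step that distinguishes the method from a fixed-threshold scheme.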
Optionally, determining the target judgment threshold corresponding to the current vehicle based on the perception data set includes: determining a target detection condition corresponding to the current vehicle based on the perception data set; acquiring, from a preset mapping relationship, the target data distribution corresponding to the target detection condition, wherein the preset mapping relationship represents the data distributions corresponding to different detection conditions; and determining the target judgment threshold based on the target data distribution.
Optionally, determining the target detection condition corresponding to the current vehicle based on the perception data set includes: determining a target class of the first target and a target longitudinal distance between the current vehicle and the first target based on the first perception data; and determining the target detection condition based on the target class, the vehicle speed, and the target longitudinal distance.
Optionally, the method further comprises: acquiring third perception data and fourth perception data obtained by the first sensor and the second sensor respectively sensing the same third target under different detection conditions; taking the difference between the third perception data and the fourth perception data to obtain first differences; determining the data distribution of the first differences under the different detection conditions; and generating the preset mapping relationship from the different detection conditions and the data distributions corresponding to them.
Optionally, determining the target judgment threshold based on the target data distribution includes: in response to the first sensor and the second sensor performing detection for the first time in the current period, determining the target judgment threshold to be the sum of the mean and the mean square error of the target data distribution; in response to the detection not being the first in the current period and the first target being the same as the second target in the period preceding the current period, determining the target judgment threshold to be the sum of the mean and a first product, where the first product is the product of the mean square error and a first preset value; and in response to the detection not being the first in the current period and the first target being the same as the second target in the two periods preceding the current period, determining the target judgment threshold to be the sum of the mean and a second product, where the second product is the product of the mean square error and a second preset value.
Optionally, comparing the first perception data and the second perception data with the target judgment threshold to obtain the detection results of the first sensor and the second sensor includes: taking the difference between the first perception data and the second perception data to obtain a second difference; and comparing the second difference with the target judgment threshold to obtain the detection result.
Optionally, comparing the second difference with the target judgment threshold to obtain the detection result includes: in response to the second difference being less than or equal to the target judgment threshold, determining that the detection result is that the first target and the second target are the same; and in response to the second difference being greater than the target judgment threshold, determining that the detection result is that the first target and the second target are different.
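The comparison rule above reduces to a single inequality; a minimal sketch follows, where the function name is illustrative rather than from the patent.

```python
def same_target(second_difference: float, threshold: float) -> bool:
    """Return True when the detection result is 'the first target and the
    second target are the same': the second difference must be less than
    or equal to the target judgment threshold."""
    return second_difference <= threshold
```

Note that the boundary case (difference exactly equal to the threshold) counts as the same target, per the "less than or equal to" wording.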
Optionally, the target determination threshold includes at least one of: a longitudinal distance threshold, a lateral distance threshold, a longitudinal speed threshold, and a lateral speed threshold.
According to another aspect of the embodiments of the present invention, there is also provided a target detection apparatus, including: an acquisition module configured to acquire a perception data set of a current vehicle, wherein the perception data set comprises first perception data sensed by a first sensor, second perception data sensed by a second sensor, and the vehicle speed; a determination module configured to determine, based on the perception data set, a target judgment threshold corresponding to the current vehicle, wherein the target judgment threshold is used to judge whether a first target perceived by the first sensor and a second target perceived by the second sensor are the same; and a judgment module configured to compare the first perception data and the second perception data with the target judgment threshold to obtain the detection results of the first sensor and the second sensor, the detection results indicating whether the association between the first sensor and the second sensor succeeds.
According to another aspect of the embodiments of the present invention, there is also provided a target vehicle, including: one or more processors; and a storage device for storing one or more programs, wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to perform any of the target detection methods described above.
According to another aspect of the embodiments of the present invention, there is also provided a non-volatile storage medium comprising a stored program, wherein, when the program runs, a processor of the device in which the storage medium is located is controlled to perform any one of the target detection methods described above.
In the embodiments of the present invention, a perception data set of the current vehicle is acquired, the perception data set comprising first perception data sensed by a first sensor, second perception data sensed by a second sensor, and the vehicle speed; a target judgment threshold corresponding to the current vehicle is determined based on the perception data set, the threshold being used to judge whether a first target perceived by the first sensor and a second target perceived by the second sensor are the same; and the first perception data and the second perception data are compared with the target judgment threshold to obtain the detection results of the two sensors. Notably, the target judgment threshold is determined from the perception data set acquired in real time: it is not a fixed distance threshold but a dynamically changing one, so it can reflect how sensor error changes under real environmental conditions and across different scenarios. This achieves accurate target detection, improves target detection accuracy, and thereby solves the technical problem of low target detection accuracy in the related art.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a flow chart of a method of target detection according to an embodiment of the present invention;
FIG. 2 is a flow diagram of an alternative target detection method according to an embodiment of the invention;
FIG. 3 is a schematic structural diagram of an alternative target detection apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
In accordance with an embodiment of the present invention, a method embodiment of target detection is provided. It should be noted that the steps illustrated in the flowchart of the figures may be performed in a computer system, such as one executing a set of computer-executable instructions, and that, although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from the one here.
Fig. 1 is a flowchart of an object detection method according to an embodiment of the present invention, as shown in fig. 1, the method includes the following steps:
step S102, acquiring a perception data set of the current vehicle, wherein the perception data set comprises: first perception data perceived by the first sensor, second perception data perceived by the second sensor, and vehicle speed.
The first sensor may be any intelligent camera sensor on a vehicle with a driver assistance function; in this embodiment an intelligent monocular camera sensor is taken as an example. The second sensor may be any intelligent radar sensor on such a vehicle; in this embodiment an intelligent millimeter-wave radar sensor is taken as an example. The first perception data may be data such as the class of the first target and the longitudinal distance, lateral distance, longitudinal speed, and lateral speed between the current vehicle and the first target, detected by the first sensor in real time. The second perception data may be the corresponding data for the second target detected by the second sensor in real time. The vehicle speed may be the real-time speed of the current vehicle.
In an alternative embodiment, a perception data set of the current vehicle may be obtained first, the perception data set comprising: the first perception data sensed by the first sensor, for example the longitudinal distance, lateral distance, longitudinal speed, and lateral speed relative to the target; the second perception data sensed by the second sensor, covering the same kinds of data; and the current real-time speed of the vehicle.
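The perception data set described above can be modeled as a small data structure. The field names below are illustrative assumptions of this sketch, chosen to match the quantities the embodiment lists (target class, longitudinal/lateral distance, longitudinal/lateral speed, and vehicle speed).

```python
from dataclasses import dataclass

@dataclass
class PerceptionData:
    """One sensor's real-time measurement of a perceived target."""
    target_class: str             # e.g. "pedestrian", "small_vehicle", "large_vehicle"
    longitudinal_distance: float  # m, from the current vehicle to the target
    lateral_distance: float       # m
    longitudinal_speed: float     # m/s
    lateral_speed: float          # m/s

@dataclass
class PerceptionDataSet:
    """The perception data set acquired in step S102."""
    first: PerceptionData   # from the first sensor (monocular camera)
    second: PerceptionData  # from the second sensor (millimeter-wave radar)
    vehicle_speed: float    # km/h, real-time speed of the current vehicle
```

The rest of the method consumes exactly this triple: the two per-sensor measurements and the ego speed.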
And step S104, determining a target judgment threshold corresponding to the current vehicle based on the sensing data set, wherein the target judgment threshold is used for judging whether the first target sensed by the first sensor is the same as the second target sensed by the second sensor.
The target judgment threshold may be a threshold, set in advance by the user based on the preset mapping relationship, for judging whether the targets detected by the first sensor and the second sensor are the same target. It includes at least one of a longitudinal distance threshold, a lateral distance threshold, a longitudinal speed threshold, and a lateral speed threshold, each obtained from the mean and mean square error of the Gaussian distribution of the differences between the third perception data and the fourth perception data together with a preset value. The first target may be the target perceived by the first sensor, and the second target the target perceived by the second sensor.
The preset mapping relationship may be a data set, established in advance by the user, that reflects the correspondence between the target class, the vehicle speed, the longitudinal distance in the third perception data, and the longitudinal distance difference, lateral distance difference, longitudinal speed difference, and lateral speed difference between the third perception data and the fourth perception data; it includes, but is not limited to, such data.
In an alternative embodiment, the specific condition in the preset mapping relationship that the current vehicle falls under may be determined from the target class, the longitudinal distance in the first perception data, and the current vehicle speed; the judgment threshold corresponding to that condition may then be obtained as the target judgment threshold.
And S106, comparing the first sensing data and the second sensing data with a target judgment threshold to obtain detection results of the first sensor and the second sensor.
In an alternative embodiment, the differences between the first perception data and the second perception data in the perception data set may be obtained and fitted to yield the mean and mean square error of a Gaussian distribution; the real-time longitudinal distance difference, lateral distance difference, longitudinal speed difference, and lateral speed difference between the first perception data and the second perception data are then obtained; finally, these real-time differences are compared with the target judgment threshold to determine whether the first target and the second target are the same.
In the embodiments of the present invention, a perception data set of the current vehicle is acquired, the perception data set comprising first perception data sensed by a first sensor, second perception data sensed by a second sensor, and the vehicle speed; a target judgment threshold corresponding to the current vehicle is determined based on the perception data set, the threshold being used to judge whether a first target perceived by the first sensor and a second target perceived by the second sensor are the same; and the first perception data and the second perception data are compared with the target judgment threshold to obtain the detection results of the two sensors. Notably, the target judgment threshold is determined from the perception data set acquired in real time: it is not a fixed distance threshold but a dynamically changing one, so it can reflect how sensor error changes under real environmental conditions and across different scenarios. This achieves accurate target detection, improves target detection accuracy, and thereby solves the technical problem of low target detection accuracy in the related art.
Optionally, determining the target judgment threshold corresponding to the current vehicle based on the perception data set includes: determining a target detection condition corresponding to the current vehicle based on the perception data set; acquiring, from a preset mapping relationship, the target data distribution corresponding to the target detection condition, wherein the preset mapping relationship represents the data distributions corresponding to different detection conditions; and determining the target judgment threshold based on the target data distribution.
The target detection condition may be the condition in the preset mapping relationship that matches the current vehicle's target, vehicle speed, and the longitudinal distance in the first perception data: for example, the condition whose preset target matches the current target of the current vehicle, whose preset vehicle speed matches the vehicle speed of the current vehicle, and whose longitudinal distance in the third perception data matches the longitudinal distance in the first perception data.
The target data distribution may be any distribution capable of representing the relationships among the longitudinal distance differences, lateral distance differences, longitudinal speed differences, and lateral speed differences between the third perception data and the fourth perception data; each target detection condition corresponds to one target data distribution, and this is not specifically limited.
In an alternative embodiment, determining the target judgment threshold corresponding to the current vehicle based on the perception data set may proceed as follows: the matching condition in the preset mapping relationship (i.e., the target detection condition) is determined from the target class and vehicle speed in the perception data set and the longitudinal distance in the first perception data; the third perception data and fourth perception data corresponding to the target detection condition are obtained from the preset mapping relationship; the mean and mean square error of the Gaussian distribution of their difference (i.e., the target data distribution) are obtained; and the target judgment threshold is obtained from the mean, the mean square error, and a preset value.
Optionally, determining the target detection condition corresponding to the current vehicle based on the perception data set includes: determining a target class of the first target and a target longitudinal distance between the current vehicle and the first target based on the first perception data; and determining the target detection condition based on the target class, the vehicle speed, and the target longitudinal distance.
In an alternative embodiment, the first perception data may include the class of the first target and the longitudinal distance, lateral distance, longitudinal speed, and lateral speed between the current vehicle and the first target, detected by the first sensor in real time. The target class of the first target and the target longitudinal distance between the current vehicle and the first target may be determined from the first perception data, and the condition in the preset mapping relationship corresponding to that target class, the real-time vehicle speed of the current vehicle, and the real-time target longitudinal distance, namely the target detection condition, may then be determined.
Optionally, the method further comprises: acquiring third perception data and fourth perception data which are obtained by respectively sensing the same third target by a first sensor and a second sensor under different detection conditions; obtaining a difference value of the third sensing data and the fourth sensing data to obtain a first difference value; determining data distribution of the first difference values under different detection conditions; and generating a preset mapping relation based on different detection conditions and data distribution corresponding to the different detection conditions.
The third target may be any target, perceived by both the first sensor and the second sensor, selected by the user when establishing the preset mapping relationship; it is not specifically limited. In this embodiment three target classes are taken as an example: pedestrians (including two-wheeled vehicles and bicycles), small vehicles, and large vehicles.
The third perception data may be the data perceived by the first sensor for the third target, and may include the class of the third target and the longitudinal distance, lateral distance, longitudinal speed, and lateral speed between the vehicle and the third target as detected by the first sensor; the fourth perception data may be the corresponding data perceived by the second sensor for the third target.
The different detection conditions may be the 192 conditions generated from the different target classes, the different vehicle speed ranges, and the different ranges of longitudinal distance between the vehicle and the third target in the third perception data. The vehicle speed ranges are not specifically limited; in this embodiment the 4 ranges 0-40 km/h, 40-80 km/h, 80-120 km/h, and above 120 km/h are taken as an example. The ranges of longitudinal distance between the vehicle and the third target are likewise not specifically limited; in this embodiment the 16 ranges 0-10 m, 10-20 m, 20-30 m, 30-40 m, 40-50 m, 50-60 m, 60-70 m, 70-80 m, 80-90 m, 90-100 m, 100-110 m, 110-120 m, 120-130 m, 130-140 m, 140-150 m, and above 150 m are taken as an example, giving 3 × 4 × 16 = 192 conditions in total.
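Using the example bins above (3 target classes × 4 speed ranges × 16 distance ranges = 192 conditions), a detection condition can be computed as an index triple. The boundary handling at bin edges is an assumption of this sketch, since the embodiment does not specify whether the intervals are open or closed.

```python
import bisect

# Example bins from the embodiment: 3 classes x 4 speed ranges x 16
# longitudinal-distance ranges = 192 detection conditions.
TARGET_CLASSES = ["pedestrian", "small_vehicle", "large_vehicle"]
SPEED_EDGES = [40, 80, 120]                      # km/h; above 120 is the last bin
DISTANCE_EDGES = [10 * i for i in range(1, 16)]  # 10..150 m; above 150 is the last bin

def detection_condition(target_class: str, speed: float, distance: float):
    """Map (target class, vehicle speed, longitudinal distance) to one of
    the 192 conditions as a (class_index, speed_bin, distance_bin) key."""
    return (TARGET_CLASSES.index(target_class),
            bisect.bisect_right(SPEED_EDGES, speed),
            bisect.bisect_right(DISTANCE_EDGES, distance))

n_conditions = (len(TARGET_CLASSES)
                * (len(SPEED_EDGES) + 1)
                * (len(DISTANCE_EDGES) + 1))  # 3 * 4 * 16 = 192
```

The preset mapping relationship can then be a dictionary keyed by such triples, with one data distribution per key.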
The first difference value may be a longitudinal distance difference, a lateral distance difference, a longitudinal velocity difference, or a lateral velocity difference between the third perceptual data and the fourth perceptual data.
The data distribution may be obtained by fitting the first differences: the mean and variance of the Gaussian distribution of the longitudinal distance differences, the mean and variance for the lateral distance differences, the mean and variance for the longitudinal speed differences, and the mean and variance for the lateral speed differences. Because the first differences change with the target class, the vehicle speed, and the longitudinal distance between the vehicle and the third target, each detection condition corresponds to its own data distribution.
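Fitting the first differences to a Gaussian amounts to computing the sample mean and standard deviation per detection condition; a minimal sketch using the standard library follows (the function name is illustrative).

```python
import statistics

def fit_difference_distribution(first_differences):
    """Fit the per-condition sensor differences (e.g. the longitudinal
    distance differences between camera and radar) to a Gaussian by
    computing its mean (mu) and standard deviation (sigma)."""
    mu = statistics.fmean(first_differences)
    sigma = statistics.pstdev(first_differences)  # population standard deviation
    return mu, sigma

# Example: differences recorded under one detection condition.
mu, sigma = fit_difference_distribution([0.4, 0.6, 0.5, 0.5])
```

In practice one such (mu, sigma) pair would be stored per difference channel (longitudinal/lateral distance and speed) under each of the 192 conditions.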
In an alternative embodiment, the third perception data and fourth perception data obtained by the first sensor and the second sensor respectively sensing the third target under different detection conditions may first be acquired; the difference between the third perception data and the fourth perception data is then taken to obtain the first differences; the Gaussian mean and variance of the first differences (i.e., the data distribution) are obtained by fitting; and finally the preset mapping relationship is generated from the different detection conditions and the data distributions corresponding to them.
Optionally, determining the target judgment threshold based on the target data distribution includes: in response to the first sensor and the second sensor performing detection for the first time in the current period, determining the target judgment threshold to be the sum of the mean and the mean square error of the target data distribution; in response to the detection not being the first in the current period and the first target being the same as the second target in the period preceding the current period, determining the target judgment threshold to be the sum of the mean and a first product, where the first product is the product of the mean square error and a first preset value; and in response to the detection not being the first in the current period and the first target being the same as the second target in the two periods preceding the current period, determining the target judgment threshold to be the sum of the mean and a second product, where the second product is the product of the mean square error and a second preset value.
The period may be a judgment period set in advance by the user; its specific value may be set according to the user's needs and is not specifically limited in this embodiment. The mean may be denoted μ and the mean square error σ. The first preset value may be a user-set constant for enlarging the judgment threshold; its specific value is not limited, and 2 is taken as an example in this embodiment, so the first product may be 2σ. The second preset value may likewise be a user-set constant for enlarging the judgment threshold; 3 is taken as an example in this embodiment, so the second product may be 3σ.
In an alternative embodiment, if the first sensor and the second sensor detect for the first time in the current period, the target judgment threshold is determined to be μ+σ (i.e., the sum of the mean and the mean square error of the target data distribution); if they do not detect for the first time in the current period and the first target was the same as the second target in the previous period, the target judgment threshold is determined to be μ+2σ (i.e., the sum of the mean and the first product); and if they do not detect for the first time in the current period and the first target was the same as the second target in the previous two periods, the target judgment threshold is determined to be μ+3σ (i.e., the sum of the mean and the second product).
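The threshold schedule above (μ+σ on first detection, widening to μ+2σ and then μ+3σ as the association persists) can be sketched as follows. This is an illustrative reading of the embodiment; the function name and the boolean flags are assumptions, not taken from the patent text.

```python
# Hedged sketch of the dynamic threshold selection described above.
def judgment_threshold(mu: float, sigma: float,
                       first_detection: bool,
                       associated_prev: bool,
                       associated_prev_two: bool) -> float:
    """Return mu + k*sigma with k in {1, 2, 3} per the embodiment's rules."""
    if first_detection:
        return mu + sigma            # first detection in the current period
    if associated_prev_two:
        return mu + 3.0 * sigma      # same target over the previous two periods
    if associated_prev:
        return mu + 2.0 * sigma      # same target in the previous period
    return mu + sigma                # fall back to the tightest threshold
```

Widening the threshold for targets that have already been associated makes the association sticky: a pair that has matched for several periods is allowed more measurement noise before the association is dropped.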
Optionally, comparing the first sensing data and the second sensing data with the target judgment threshold to obtain the detection results of the first sensor and the second sensor includes: taking the difference between the first sensing data and the second sensing data to obtain a second difference; and comparing the second difference with the target judgment threshold to obtain the detection result.
The second difference may comprise the component-wise differences between the first sensing data and the second sensing data, namely: a longitudinal distance difference XD, a lateral distance difference YD, a longitudinal speed difference XV, and a lateral speed difference YV.
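As a minimal sketch of forming the second difference, assume the perception data are dictionaries with longitudinal/lateral distance (`x`, `y`) and speed (`vx`, `vy`) fields; the field names and camera-minus-radar sign convention are illustrative assumptions.

```python
# Hedged sketch: form the four component differences between the camera's and
# the radar's perception of a candidate target pair. Field names are assumed.
def component_diffs(cam: dict, radar: dict) -> dict:
    """cam/radar: dicts with 'x', 'y' (longitudinal/lateral distance) and
    'vx', 'vy' (longitudinal/lateral speed). Returns XD, YD, XV, YV."""
    return {"xd": cam["x"] - radar["x"], "yd": cam["y"] - radar["y"],
            "xv": cam["vx"] - radar["vx"], "yv": cam["vy"] - radar["vy"]}
```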
In an alternative embodiment, if the first sensor and the second sensor detect for the first time in the current period, the longitudinal distance difference XD, the lateral distance difference YD, the longitudinal speed difference XV, and the lateral speed difference YV (i.e., the second difference) are each compared with the target judgment threshold μ+σ to obtain the detection result.
In another optional embodiment, if the first sensor and the second sensor do not detect for the first time in the current period, and the first target was the same as the second target in the previous period, the longitudinal distance difference XD, the lateral distance difference YD, the longitudinal speed difference XV, and the lateral speed difference YV (i.e., the second difference) are each compared with the target judgment threshold μ+2σ to obtain the detection result.
In yet another alternative embodiment, if the first sensor and the second sensor do not detect for the first time in the current period, and the first target was the same as the second target in the previous two periods, the longitudinal distance difference XD, the lateral distance difference YD, the longitudinal speed difference XV, and the lateral speed difference YV (i.e., the second difference) are each compared with the target judgment threshold μ+3σ to obtain the detection result.
Optionally, comparing the second difference with the target judgment threshold to obtain the detection result includes: determining that the detection result is that the first target is the same as the second target in response to the second difference being less than or equal to the target judgment threshold; and determining that the detection result is that the first target is different from the second target in response to the second difference being greater than the target judgment threshold.
Optionally, the target determination threshold includes at least one of: a longitudinal distance threshold, a transverse distance threshold, a longitudinal velocity threshold, and a transverse velocity threshold.
In an alternative embodiment, when the longitudinal distance difference XD is less than or equal to the longitudinal distance threshold, the lateral distance difference YD is less than or equal to the lateral distance threshold, the longitudinal speed difference XV is less than or equal to the longitudinal speed threshold, and the lateral speed difference YV is less than or equal to the lateral speed threshold, the detection result is determined to be that the first target is the same as the second target.
In another alternative embodiment, when any of the longitudinal distance difference XD, the lateral distance difference YD, the longitudinal speed difference XV, or the lateral speed difference YV exceeds its corresponding threshold, the detection result is determined to be that the first target is different from the second target.
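The two comparison rules above reduce to a single predicate: the targets are judged the same only when all four differences fall within their thresholds, and different as soon as any one exceeds its threshold. A minimal sketch, with dictionary keys as assumptions:

```python
# Hedged sketch: associate the camera target and the radar target only when
# every one of the four differences is within its own judgment threshold.
def targets_match(diffs: dict, thresholds: dict) -> bool:
    """diffs/thresholds keyed by 'xd', 'yd', 'xv', 'yv'.
    Differences are compared by absolute value."""
    return all(abs(diffs[k]) <= thresholds[k] for k in ("xd", "yd", "xv", "yv"))
```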
This embodiment is further described below with reference to fig. 2.
The invention aims to overcome the defects and shortcomings of the prior art, and provides a data association method for a driver-assistance intelligent monocular camera and an intelligent millimeter-wave radar, suitable for assisted-driving vehicles on structured roads such as highways and urban expressways.
The data association method comprises two links: offline sensor difference statistics and online target association. The offline sensor difference statistics link counts the sensing differences between the intelligent monocular camera and the intelligent millimeter-wave radar under different conditions; the different conditions are formed by combining the target type, the own-vehicle speed range, and the longitudinal distance range of the camera-detected target.
In the offline sensor difference statistics link, under each of the different conditions, the sensing results of the camera and the millimeter-wave radar for the same target are tested and recorded multiple times, and Gaussian distributions of the longitudinal distance difference, lateral distance difference, longitudinal speed difference, and lateral speed difference between the two sensors are fitted for each condition, yielding the mean and variance of each Gaussian distribution.
In the online target association link, it is judged whether a perception target of the intelligent monocular camera and a perception target of the intelligent millimeter-wave radar are the same target. The judgment method is to check whether the longitudinal distance difference, lateral distance difference, longitudinal speed difference, and lateral speed difference of the two sensors' perception of the target are within judgment thresholds. To do so, it is first judged which condition of the offline sensor difference statistics link the current state falls under; the corresponding judgment thresholds are then set according to the means and variances of the Gaussian distributions of the longitudinal distance difference, lateral distance difference, longitudinal speed difference, and lateral speed difference tested and fitted under that condition; and the judgment thresholds are dynamically adjusted according to the number of associations.
The overall steps of the invention comprise two links, namely off-line sensor difference statistics and on-line target association.
The off-line sensor difference statistics implementation steps are as follows:
First, as shown in Table 1, 192 different conditions are formed by combining the target type, the own-vehicle speed range, and the longitudinal distance range of the camera-detected target. The camera-detected target is subdivided into 3 types: pedestrians (including two-wheelers and bicycles), small vehicles, and large vehicles. The longitudinal distance range of the camera-detected target is subdivided into 16 bands: 0-10m, 10-20m, 20-30m, 30-40m, 40-50m, 50-60m, 60-70m, 70-80m, 80-90m, 90-100m, 100-110m, 110-120m, 120-130m, 130-140m, 140-150m, and more than 150m. The own-vehicle speed range is subdivided into 4 bands: 0-40km/h, 40-80km/h, 80-120km/h, and more than 120km/h.
TABLE 1 schematic table of results of off-line sensor difference statistical links
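The 192-condition partition above (3 target types × 4 own-vehicle speed bands × 16 longitudinal distance bands) can be sketched as a simple bin lookup. The bin ordering and the integer type encoding are assumptions, since the patent does not fix an index scheme:

```python
# Hedged sketch of mapping the current state to one of the 192 conditions.
def condition_index(target_type: int, speed_kmh: float, long_dist_m: float) -> int:
    """target_type: 0 = pedestrian/two-wheeler, 1 = small vehicle,
    2 = large vehicle (encoding is an assumption). Returns an index in 0..191."""
    speed_bin = min(int(speed_kmh // 40), 3)     # 0-40, 40-80, 80-120, >120 km/h
    dist_bin = min(int(long_dist_m // 10), 15)   # 0-10m ... 140-150m, >150m
    return target_type * 64 + speed_bin * 16 + dist_bin  # 3 * 4 * 16 = 192 bins
```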
Second, the acquisition of perception data of the intelligent monocular camera and the intelligent millimeter-wave radar for the same target is completed under the 192 different conditions in turn. The perception data comprise longitudinal distance, lateral distance, longitudinal speed, and lateral speed, and each condition is acquired 80-100 times.
Third, Gaussian distributions of the longitudinal distance difference, lateral distance difference, longitudinal speed difference, and lateral speed difference over the 80-100 acquisition results of the two sensors are fitted.
Finally, for the 192 different conditions, the following are obtained: longitudinal distance difference Gaussian means μxd,1-μxd,192 and variances σxd,1-σxd,192; lateral distance difference Gaussian means μyd,1-μyd,192 and variances σyd,1-σyd,192; longitudinal speed difference Gaussian means μxv,1-μxv,192 and variances σxv,1-σxv,192; and lateral speed difference Gaussian means μyv,1-μyv,192 and variances σyv,1-σyv,192.
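Fitting a Gaussian to each per-condition difference series reduces to computing a sample mean and standard deviation over the 80-100 paired acquisitions. A minimal sketch using Python's standard `statistics` module, assuming a camera-minus-radar sign convention:

```python
import statistics

# Hedged sketch: fit the Gaussian of one per-condition difference series
# (e.g. longitudinal distance) by its sample mean and standard deviation.
def fit_condition_stats(cam_samples, radar_samples):
    """Return (mu, sigma) of the per-sample camera-minus-radar difference."""
    diffs = [c - r for c, r in zip(cam_samples, radar_samples)]
    return statistics.fmean(diffs), statistics.pstdev(diffs)
```

Running this once per condition and per dimension (XD, YD, XV, YV) would yield the full table of 192 × 4 mean/variance pairs described above.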
The online data association implementation steps are as follows:
First, the sensing data of the intelligent monocular camera, the intelligent millimeter-wave radar, and the own vehicle for the current period are obtained.
Second, according to the target type and longitudinal distance in the camera sensing data and the vehicle speed in the own-vehicle sensing data, it is judged which condition of the offline sensor difference statistics the current state falls under, thereby obtaining the longitudinal distance difference Gaussian mean μxd and variance σxd, the lateral distance difference Gaussian mean μyd and variance σyd, the longitudinal speed difference Gaussian mean μxv and variance σxv, and the lateral speed difference Gaussian mean μyv and variance σyv tested and fitted under that condition.
Third, the judgment thresholds for this period are set. There are 4 judgment thresholds in total: a longitudinal distance difference threshold, a lateral distance difference threshold, a longitudinal speed difference threshold, and a lateral speed difference threshold. For the first association judgment, the longitudinal distance difference threshold is "μxd+σxd", the lateral distance difference threshold is "μyd+σyd", the longitudinal speed difference threshold is "μxv+σxv", and the lateral speed difference threshold is "μyv+σyv".
Next, the longitudinal distance difference XD, lateral distance difference YD, longitudinal speed difference XV, and lateral speed difference YV between the camera and the millimeter-wave radar perception targets in the current period are calculated. If the longitudinal distance difference XD is smaller than the longitudinal distance difference threshold "μxd+σxd", the lateral distance difference YD is smaller than the lateral distance difference threshold "μyd+σyd", the longitudinal speed difference XV is smaller than the longitudinal speed difference threshold "μxv+σxv", and the lateral speed difference YV is smaller than the lateral speed difference threshold "μyv+σyv", the target association in this period is successful.
If this is not the first association judgment and the perception targets of the two sensors were successfully associated in the previous period, the 4 judgment thresholds are each increased to μ+2σ.
If this is not the first association judgment and the perception targets of the two sensors were successfully associated in the previous two periods, the 4 judgment thresholds are each increased to μ+3σ.
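One online association period, combining the condition lookup, the dynamically widened thresholds, and the four-way comparison, might be sketched as follows. The shape of the `stats` table (per-condition, per-dimension (μ, σ) pairs) and the consecutive-association counter are assumptions used only for illustration:

```python
# Hedged end-to-end sketch of one online association period, assuming a
# `stats` table keyed by condition index, each entry mapping a dimension
# ('xd', 'yd', 'xv', 'yv') to its fitted (mu, sigma).
def associate(stats: dict, cond: int, diffs: dict, assoc_count: int) -> bool:
    """assoc_count: consecutive successful associations so far
    (0 = first judgment -> mu+sigma; 1 -> mu+2*sigma; >=2 -> mu+3*sigma)."""
    k = 1 + min(assoc_count, 2)  # widen the threshold as association persists
    return all(abs(diffs[dim]) <= mu + k * sigma
               for dim, (mu, sigma) in stats[cond].items())
```

A caller would increment `assoc_count` after each successful period and reset it to zero when the association fails, reproducing the μ+σ / μ+2σ / μ+3σ schedule described above.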
Fig. 2 is a flow chart of an alternative target detection method according to an embodiment of the present invention, as shown in fig. 2, the method includes the following steps:
step S201: dividing 192 different conditions according to the target category, the own-vehicle speed range, and the longitudinal distance range of the camera-detected target;
step S202: the method comprises the following steps of sequentially completing sensing data acquisition of two sensors under 192 different conditions for the same target;
step S203: respectively fitting Gaussian distributions of longitudinal distance difference, transverse distance difference, longitudinal speed difference and transverse speed difference under each condition;
step S204: obtaining the mean value mu and the variance sigma of the Gaussian distribution in the step S203, and constructing a preset mapping relation;
step S205: acquiring first perception data, second perception data and a current vehicle speed of a current vehicle;
step S206: judging a target detection condition corresponding to the current state based on a preset mapping relation;
step S207: judging whether the first sensing data and the second sensing data are detected for the first time, if so, entering step S208, and if not, entering step S209;
step S208: the target judgment threshold is mu + sigma;
step S209: judging whether the first target and the second target in the first two periods are the same, if so, entering step S210, otherwise, entering step S211;
step S210: increasing the target judgment threshold to be mu +3 sigma;
step S211: increasing the target judgment threshold to be mu +2 sigma;
step S212: judging whether the second difference is less than or equal to the target judgment threshold, and if so, entering step S213;
step S213: the first target is the same as the second target, and the step is ended.
Example 2
According to another aspect of the embodiments of the present invention, a target detection device is further provided. The device may perform the target detection method provided in Embodiment 1; its specific implementation and preferred application scenarios are the same as those in Embodiment 1 and are not repeated here.
Fig. 3 is a schematic structural diagram of an object detection apparatus according to an embodiment of the present invention, as shown in fig. 3, the apparatus includes: a first obtaining module 30, configured to obtain a perception data set of a current vehicle, where the perception data set includes: first sensing data sensed by the first sensor, second sensing data sensed by the second sensor, and vehicle speed; the first determining module 32 is configured to determine, based on the sensing data set, a target judgment threshold corresponding to the current vehicle, where the target judgment threshold is used to judge whether a first target sensed by the first sensor is the same as a second target sensed by the second sensor; the determining module 34 is configured to compare the first sensing data and the second sensing data with a target determination threshold to obtain detection results of the first sensor and the second sensor, where the detection results are used to represent whether the first sensor and the second sensor are successfully associated.
Optionally, the determining module includes: the first determining unit is used for determining a target detection condition corresponding to the current vehicle based on the perception data set; the acquisition unit is used for acquiring a target data distribution corresponding to the target detection condition from a preset mapping relation, wherein the preset mapping relation is used for representing data distributions corresponding to different detection conditions; and the second determining unit is used for determining the target judgment threshold based on the target data distribution.
Optionally, the first determination unit includes: the first determining subunit is used for determining a target class of the first target and a target longitudinal distance between the current vehicle and the first target based on the first perception data; and the second determining subunit is used for determining the target detection condition based on the target category, the vehicle speed and the target longitudinal distance.
Optionally, the apparatus further comprises: the second acquisition module is used for acquiring third perception data and fourth perception data which are obtained by respectively sensing the same third target by the first sensor and the second sensor under different detection conditions; the processing module is used for acquiring a difference value of the third sensing data and the fourth sensing data to obtain a first difference value; the second determining module is used for determining the data distribution of the first difference under different detection conditions; and the generation module is used for generating a preset mapping relation based on different detection conditions and data distribution corresponding to the different detection conditions.
Optionally, the second determination unit includes: the first determining subunit is configured to, in response to the first sensor and the second sensor detecting for the first time in the current period, determine the target judgment threshold to be the sum of the mean and the mean square error of the target data distribution; the second determining subunit is configured to, in response to the first sensor and the second sensor not detecting for the first time in the current period, and the first target being the same as the second target in the period preceding the current period, determine the target judgment threshold to be the sum of the mean and a first product, where the first product is the product of the mean square error and a first preset value; and the third determining subunit is configured to, in response to the first sensor and the second sensor not detecting for the first time in the current period, and the first target being the same as the second target in the two periods preceding the current period, determine the target judgment threshold to be the sum of the mean and a second product, where the second product is the product of the mean square error and a second preset value.
Optionally, the determining module includes: the acquisition unit is used for acquiring a difference value of the first sensing data and the second sensing data to obtain a second difference value; and the detection unit is used for comparing the second difference value with the target judgment threshold to obtain a detection result.
Optionally, the detection unit comprises: the first determining subunit is configured to determine that the detection result is that the first target is the same as the second target in response to the second difference being less than or equal to the target determination threshold; and the second determining subunit is used for responding to the second difference value being greater than the target judgment threshold, and determining that the detection result is that the first target is different from the second target.
Optionally, the second determining unit is configured to determine a target judgment threshold based on the target data distribution, where the target judgment threshold includes at least one of: a longitudinal distance threshold, a transverse distance threshold, a longitudinal velocity threshold, and a transverse velocity threshold.
Example 3
According to another aspect of the embodiments of the present invention, there is also provided a target vehicle including: one or more processors; storage means for storing one or more programs; when the one or more programs are executed by the one or more processors, the one or more processors are caused to perform the object detection method described in the above embodiments.
Example 4
According to another aspect of the embodiments of the present invention, there is also provided a non-volatile storage medium comprising a stored program, wherein, when the program runs, a processor of the device in which the storage medium is located is controlled to execute the target detection method described in the above embodiments.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the description of each embodiment has its own emphasis, and reference may be made to the related description of other embodiments for parts that are not described in detail in a certain embodiment.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and improvements without departing from the principle of the present invention, and such modifications and improvements should also be regarded as falling within the protection scope of the present invention.

Claims (10)

1. A method of object detection, comprising:
acquiring a perception data set of a current vehicle, wherein the perception data set comprises: first sensing data sensed by the first sensor, second sensing data sensed by the second sensor, and vehicle speed;
determining a target judgment threshold corresponding to the current vehicle based on the sensing data set, wherein the target judgment threshold is used for judging whether a first target sensed by the first sensor is the same as a second target sensed by the second sensor;
and comparing the first sensing data and the second sensing data with the target judgment threshold to obtain the detection results of the first sensor and the second sensor.
2. The method of claim 1, wherein determining a target decision threshold for the current vehicle based on the set of perceptual data comprises:
determining a target detection condition corresponding to the current vehicle based on the perception data set;
acquiring a target data distribution corresponding to the target detection condition from a preset mapping relation, wherein the preset mapping relation is used for representing data distributions corresponding to different detection conditions;
and determining the target judgment threshold based on the target data distribution.
3. The method of claim 2, wherein determining the target detection condition for the current vehicle based on the set of perceptual data comprises:
determining a target class of the first target and a target longitudinal distance between the current vehicle and the first target based on the first perception data;
determining the object detection condition based on the object class, the vehicle speed, and the object longitudinal distance.
4. The method of claim 2, further comprising:
acquiring third perception data and fourth perception data obtained by the first sensor and the second sensor respectively sensing the same third target under different detection conditions;
obtaining a difference value of the third sensing data and the fourth sensing data to obtain a first difference value;
determining a data distribution of the first difference values under the different detection conditions;
and generating the preset mapping relation based on the different detection conditions and the data distribution corresponding to the different detection conditions.
5. The method of claim 2, wherein determining the target decision threshold based on the target data distribution comprises:
responding to the first sensor and the second sensor for the first time in the current period, and determining that the target judgment threshold is the sum of the mean value and the mean square error of the target data distribution;
responding to the non-first detection of the first sensor and the second sensor in the current period, and determining that the first target is the same as the second target in the previous period of the current period, and determining that the target judgment threshold is the sum of the mean value and a first product, wherein the first product is the product of the mean square error and a first preset value;
and responding to the non-first detection of the first sensor and the second sensor in the current period, and determining that the first target is the same as the second target in the first two periods of the current period, and determining that the target judgment threshold is the sum of the mean value and a second product, wherein the second product is the product of the mean square error and a second preset value.
6. The method of claim 1, wherein comparing the first sensing data and the second sensing data with the target determination threshold to obtain the detection results of the first sensor and the second sensor comprises:
obtaining a difference value between the first sensing data and the second sensing data to obtain a second difference value;
and comparing the second difference with the target judgment threshold to obtain the detection result.
7. The method of claim 6, wherein comparing the second difference with the target decision threshold to obtain the detection result comprises:
in response to the second difference being less than or equal to the target judgment threshold, determining that the detection result is that the first target is the same as the second target;
and determining that the detection result is that the first target is different from the second target in response to the second difference being greater than the target judgment threshold.
8. The method according to any one of claims 1 to 7, wherein the target decision threshold comprises at least one of: a longitudinal distance threshold, a transverse distance threshold, a longitudinal velocity threshold, and a transverse velocity threshold.
9. An object detection device, comprising:
the system comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring a perception data set of a current vehicle, and the perception data set comprises: first sensing data sensed by the first sensor, second sensing data sensed by the second sensor, and vehicle speed;
a first determining module, configured to determine, based on the sensing data set, a target determination threshold corresponding to the current vehicle, where the target determination threshold is used to determine whether a first target sensed by the first sensor is the same as a second target sensed by the second sensor;
and the judging module is used for comparing the first sensing data and the second sensing data with the target judgment threshold to obtain the detection results of the first sensor and the second sensor, and the detection results are used for representing whether the first sensor and the second sensor are successfully associated or not.
10. A vehicle, characterized by comprising:
one or more processors;
storage means for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors are caused to perform the object detection method of any one of claims 1-8.
CN202210588133.XA 2022-05-27 2022-05-27 Target detection method and device and vehicle Pending CN115079155A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210588133.XA CN115079155A (en) 2022-05-27 2022-05-27 Target detection method and device and vehicle

Publications (1)

Publication Number Publication Date
CN115079155A true CN115079155A (en) 2022-09-20

Family

ID=83248500

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210588133.XA Pending CN115079155A (en) 2022-05-27 2022-05-27 Target detection method and device and vehicle

Country Status (1)

Country Link
CN (1) CN115079155A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109886308A (en) * 2019-01-25 2019-06-14 中国汽车技术研究中心有限公司 One kind being based on the other dual sensor data fusion method of target level and device
CN111090095A (en) * 2019-12-24 2020-05-01 联创汽车电子有限公司 Information fusion environment perception system and perception method thereof
CN112540365A (en) * 2020-12-10 2021-03-23 中国第一汽车股份有限公司 Evaluation method, device, equipment and storage medium
CN113093178A (en) * 2021-04-21 2021-07-09 中国第一汽车股份有限公司 Obstacle target detection method and device, domain controller and vehicle
CN113611112A (en) * 2021-07-29 2021-11-05 中国第一汽车股份有限公司 Target association method, device, equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
席轶敏, 刘渝, 靖晟 (Xi Yimin, Liu Yu, Jing Sheng): "Real-time detection algorithm for electronic reconnaissance signals and its performance analysis", Journal of Nanjing University of Aeronautics & Astronautics, no. 03, 30 July 2001 (2001-07-30) *
郝重阳, 唐文彬 (Hao Chongyang, Tang Wenbin): "Target recognition via radar and infrared imaging dual-sensor information fusion", Acta Aeronautica et Astronautica Sinica, no. 06, 25 June 1998 (1998-06-25) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116859380A (en) * 2023-09-05 2023-10-10 长沙隼眼软件科技有限公司 Method and device for measuring target track, electronic equipment and storage medium
CN116859380B (en) * 2023-09-05 2023-11-21 长沙隼眼软件科技有限公司 Method and device for measuring target track, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN109145680B (en) Method, device and equipment for acquiring obstacle information and computer storage medium
CN110263652B (en) Laser point cloud data identification method and device
US20130163869A1 (en) Apparatus and method for extracting edge in image
JP2021056608A (en) Occupancy grid map generation device, occupancy grid map generation system, occupancy grid map generation method, and program
CN115079155A (en) Target detection method and device and vehicle
CN106056034A (en) Pressure sensor-based object identification method and object identification method
CN107240104A (en) Point cloud data segmentation method and terminal
CN117197796A (en) Vehicle shielding recognition method and related device
CN112466147A (en) Multi-sensor-based library position detection method and related device
CN104680194A (en) On-line target tracking method based on random fern cluster and random projection
CN110163032B (en) Face detection method and device
CN116311144A (en) Method and device for predicting vehicle steering and computer readable storage medium
CN115792945A (en) Floating obstacle detection method and device, electronic equipment and storage medium
US20200258379A1 (en) Determination of movement information with surroundings sensors
CN112291193B (en) LDoS attack detection method based on NCS-SVM
CN115236672A (en) Obstacle information generation method, device, equipment and computer readable storage medium
KR101972095B1 (en) Method and Apparatus of adding artificial object for improving performance in detecting object
CN107809430B (en) Network intrusion detection method based on extreme point classification
CN115331447B (en) Data association method and device based on sensor fusion
CN112686155A (en) Image recognition method, image recognition device, computer-readable storage medium and processor
CN111354019A (en) Visual tracking failure detection system based on neural network and training method thereof
CN109795464B (en) Braking method, braking device and storage medium
EP4092565A1 (en) Device and method to speed up annotation quality check process
CN115284809B (en) Intelligent internet fleet active suspension control method and system and computer equipment
CN111127814B (en) Fire alarm identification method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination