CN109635868B - Method and device for determining obstacle type, electronic device and storage medium - Google Patents
- Publication number
- CN109635868B (application number CN201811506234.8A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/25—Fusion techniques
- G06F18/254—Fusion techniques of classification results, e.g. of results related to same input data
Abstract
The embodiments of the invention disclose a method and device for determining the category of an obstacle, an electronic device, and a storage medium. The method comprises the following steps: receiving the category output result of the current time corresponding to the target obstacle sent by each sensor; calculating the category fusion result of the current time corresponding to the target obstacle according to a predetermined category fusion result of the previous time corresponding to the target obstacle and the category output result of the current time corresponding to the target obstacle; and determining the final category of the target obstacle according to the category fusion result of the current time corresponding to the target obstacle. In this way, the final category of the target obstacle can be determined more accurately.
Description
Technical Field
The embodiment of the invention relates to the technical field of data processing, in particular to a method and a device for determining obstacle types, electronic equipment and a storage medium.
Background
An autonomous vehicle, also referred to as an unmanned vehicle, senses its surroundings through various sensors and controls the steering and speed of the vehicle according to the sensed road, vehicle position, obstacle information, and the like, so that it can travel safely and reliably on the road. An autonomous vehicle must therefore know the driving environment around it in real time while traveling. The acquisition of this environmental information relies on the various sensors installed on the vehicle; typical sensors on current autonomous vehicles include lidar, cameras, and millimeter-wave radar. Each sensor outputs a category for a target obstacle, such as pedestrian or vehicle, and the output results of the sensors may conflict: for a certain obstacle, for example, the lidar may recognize a pedestrian while the camera recognizes a vehicle. To improve the recognition of target obstacles, multi-sensor data fusion technology is used, making full use of the complementarity of the sensors to form a relatively consistent perceptual description of the environment.
Existing methods for determining the obstacle category adopt one of two schemes. In the first, the output category of one of the sensors is determined as the final category of the target obstacle. Which sensor is selected is generally based on the classification performance of each sensor as the evaluation index; since the accuracy with which a camera classifies an obstacle is generally higher than that of a laser sensor, when both the camera and the lidar detect the same obstacle, the output result of the camera is determined as the final category of the target obstacle. In the second, the output results of the sensors are weighted and fused using prior knowledge. For example, assume that an obstacle may have the following four categories: pedestrian, bicycle, vehicle, and unknown obstacle. The output result of the lidar for this obstacle is P_lidar = [p1, p2, p3, p4], the vector representing the probabilities of belonging to the four categories; correspondingly, the output result of the camera for the obstacle is P_camera = [a1, a2, a3, a4]. The weighted fusion model is P_fusion[i] = Σ_k M_k · P_k[i], where i denotes the i-th category, M_k represents the weighting coefficient of the k-th sensor, P_k[i] is the probability that the k-th sensor assigns to the i-th category, and P_fusion[i] represents the probability that the weighted-fused target obstacle belongs to the i-th category.
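For illustration only, the prior-art weighted fusion scheme described above can be sketched as follows. The category list, weighting coefficients, and probability vectors are hypothetical values, not taken from the patent.

```python
# Hedged sketch of the prior-art weighted fusion scheme: each sensor k
# outputs a probability vector P_k over the candidate categories, and the
# fused probability is P_fusion[i] = sum_k M_k * P_k[i].
# The weights and probabilities below are illustrative assumptions.

CATEGORIES = ["pedestrian", "bicycle", "vehicle", "unknown"]

def weighted_fusion(outputs, weights):
    """Fuse per-sensor probability vectors with fixed weighting coefficients."""
    assert len(outputs) == len(weights)
    fused = [0.0] * len(outputs[0])
    for p_k, m_k in zip(outputs, weights):
        for i, p in enumerate(p_k):
            fused[i] += m_k * p
    return fused

p_lidar = [0.6, 0.1, 0.2, 0.1]   # P_lidar = [p1, p2, p3, p4]
p_camera = [0.2, 0.1, 0.6, 0.1]  # P_camera = [a1, a2, a3, a4]
fused = weighted_fusion([p_lidar, p_camera], weights=[0.4, 0.6])
print(dict(zip(CATEGORIES, fused)))
```

With these illustrative weights the camera dominates, so the conflicting "vehicle" reading wins; this is exactly why the choice of weighting coefficients affects the accuracy of the final category.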
In the process of implementing the invention, the inventor finds that at least the following problems exist in the prior art:
in the first existing method for determining the obstacle category, the output result of a single sensor is directly determined as the final result for the target obstacle; because this method does not comprehensively consider the output results of all sensors, the determined final category of the target obstacle may be inaccurate. In the second existing method, the weighting coefficients are set from prior knowledge, and the accuracy of those coefficients likewise affects the accuracy of the final category of the target obstacle.
Disclosure of Invention
In view of this, embodiments of the present invention provide a method, an apparatus, an electronic device, and a storage medium for determining a type of an obstacle, which can more accurately determine a final type of a target obstacle.
In a first aspect, an embodiment of the present invention provides a method for determining an obstacle category, where the method includes:
receiving the type output result of the target obstacle at the current moment, which is sent by each sensor;
calculating a category fusion result of the current moment corresponding to the target obstacle according to a predetermined category fusion result of the previous moment corresponding to the target obstacle and a predetermined category output result of the current moment corresponding to the target obstacle;
and determining the final type of the target obstacle according to the type fusion result of the current time corresponding to the target obstacle.
In the above embodiment, the calculating a category fusion result at the current time corresponding to the target obstacle according to a predetermined category fusion result at the previous time corresponding to the target obstacle and a predetermined category output result at the current time corresponding to the target obstacle includes:
acquiring the confidence fusion value of the previous time corresponding to each candidate category from the category fusion result of the previous time corresponding to the target obstacle;
obtaining the confidence output value of the current time corresponding to each candidate category from the category output result of the current time corresponding to the target obstacle;
and calculating a category fusion result of the current time corresponding to the target obstacle according to the confidence fusion value of the previous time corresponding to each candidate category and the confidence output value of the current time corresponding to each candidate category.
In the above embodiment, the calculating a category fusion result of the current time corresponding to the target obstacle according to the confidence fusion value of the previous time corresponding to each candidate category and the confidence output value of the current time corresponding to each candidate category includes:
calculating the confidence fusion value of the current time corresponding to each candidate category according to the confidence fusion value of the previous time corresponding to each candidate category and the confidence output value of the current time corresponding to each candidate category;
and determining a category fusion result of the current time corresponding to the target obstacle according to the confidence fusion value of the current time corresponding to each candidate category.
In the above embodiment, the calculating a confidence fusion value of a current time corresponding to each candidate category according to a confidence fusion value of a previous time corresponding to each candidate category and a confidence output value of a current time corresponding to each candidate category includes:
determining the mutually associated candidate categories and mutually exclusive candidate categories corresponding to the candidate categories;
acquiring a confidence level fusion value at a previous moment corresponding to the mutually-associated candidate categories and a confidence level fusion value at a previous moment corresponding to the mutually-exclusive candidate categories from a category fusion result at a previous moment corresponding to the target obstacle;
obtaining a confidence output value of the current time corresponding to the mutually-associated candidate categories and a confidence output value of the current time corresponding to the mutually-exclusive candidate categories from a category output result of the current time corresponding to the target obstacle;
and calculating the confidence fusion value of the current time corresponding to each candidate category according to the confidence fusion value of the previous time corresponding to the mutually associated candidate categories, the confidence fusion value of the previous time corresponding to the mutually exclusive candidate categories, the confidence output value of the current time corresponding to the mutually associated candidate categories, and the confidence output value of the current time corresponding to the mutually exclusive candidate categories.
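The patent does not give the combination formula at this point. As one plausible realization, the description's use of confidence values over mutually associated (identical) and mutually exclusive (distinct) candidate categories resembles Dempster's rule of combination from evidence theory. The sketch below is an assumption, not the patent's exact rule, and treats all four candidate categories, including the unknown category, as pairwise mutually exclusive singletons.

```python
def dempster_combine(m_prev, m_curr):
    """Combine two confidence (mass) assignments over mutually exclusive
    singleton candidate categories with a Dempster-style rule.
    An assumed realization, not the patent's exact formula."""
    categories = m_prev.keys()
    # Only identical (mutually associated) categories reinforce each other.
    combined = {c: m_prev[c] * m_curr[c] for c in categories}
    # Conflict: total mass assigned to mutually exclusive category pairs.
    conflict = 1.0 - sum(combined.values())
    if conflict >= 1.0:
        raise ValueError("total conflict; evidence cannot be combined")
    # Renormalize the agreeing masses by the non-conflicting fraction.
    return {c: v / (1.0 - conflict) for c, v in combined.items()}

# Illustrative confidence values (Ped = pedestrian, Bic = bicycle,
# Car = motor vehicle, Unk = unknown); not taken from the patent.
m_prev = {"Ped": 0.5, "Bic": 0.2, "Car": 0.2, "Unk": 0.1}
m_curr = {"Ped": 0.6, "Bic": 0.1, "Car": 0.2, "Unk": 0.1}
fused = dempster_combine(m_prev, m_curr)
```

In this sketch the product terms over identical categories play the role of the mutually associated contributions, while the discarded cross terms between distinct categories form the conflict that the normalization removes.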
In the above embodiment, the determining the final category of the target obstacle according to the category fusion result of the current time corresponding to the target obstacle includes:
sorting all candidate categories in descending order of the confidence fusion value of the current time corresponding to each candidate category in the category fusion result of the current time corresponding to the target obstacle;
and determining the candidate class with the maximum confidence fusion value as the final class of the target obstacle.
In a second aspect, an embodiment of the present invention provides an apparatus for determining an obstacle category, where the apparatus includes: the device comprises a receiving module, a calculating module and a determining module; wherein,
the receiving module is used for receiving the type output result of the current moment corresponding to the target obstacle sent by each sensor;
the calculation module is used for calculating a category fusion result of the current moment corresponding to the target obstacle according to a predetermined category fusion result of the previous moment corresponding to the target obstacle and a predetermined category output result of the current moment corresponding to the target obstacle;
the determining module is used for determining the final type of the target obstacle according to the type fusion result of the current time corresponding to the target obstacle.
In the above embodiment, the calculation module includes: an acquisition submodule and a calculation submodule; wherein,
the obtaining sub-module is used for acquiring the confidence fusion value of the previous time corresponding to each candidate category from the category fusion result of the previous time corresponding to the target obstacle, and obtaining the confidence output value of the current time corresponding to each candidate category from the category output result of the current time corresponding to the target obstacle;
and the calculation sub-module is used for calculating the category fusion result of the current time corresponding to the target obstacle according to the confidence fusion value of the last time corresponding to each candidate category and the confidence output value of the current time corresponding to each candidate category.
In the above embodiment, the calculating sub-module is configured to calculate, according to the previous-time confidence fusion value corresponding to each candidate category and the current-time confidence output value corresponding to each candidate category, a confidence fusion value corresponding to each candidate category at the current time; and determining a category fusion result of the current time corresponding to the target obstacle according to the confidence fusion value of the current time corresponding to each candidate category.
In the above embodiment, the computing sub-module is specifically configured to determine the mutually associated candidate categories and the mutually exclusive candidate categories corresponding to the respective candidate categories; acquire, from the category fusion result of the previous time corresponding to the target obstacle, the confidence fusion value of the previous time corresponding to the mutually associated candidate categories and the confidence fusion value of the previous time corresponding to the mutually exclusive candidate categories; obtain, from the category output result of the current time corresponding to the target obstacle, the confidence output value of the current time corresponding to the mutually associated candidate categories and the confidence output value of the current time corresponding to the mutually exclusive candidate categories; and calculate the confidence fusion value of the current time corresponding to each candidate category according to the confidence fusion values of the previous time and the confidence output values of the current time corresponding to the mutually associated and mutually exclusive candidate categories.
In the foregoing embodiment, the determining module is specifically configured to sort all candidate categories in descending order of the confidence fusion value of the current time corresponding to each candidate category in the category fusion result of the current time corresponding to the target obstacle, and determine the candidate category with the maximum confidence fusion value as the final category of the target obstacle.
In a third aspect, an embodiment of the present invention provides an electronic device, including:
one or more processors;
a memory for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement the method for determining the obstacle category according to any embodiment of the present invention.
In a fourth aspect, an embodiment of the present invention provides a storage medium, on which a computer program is stored, which when executed by a processor, implements the method for determining the obstacle category according to any embodiment of the present invention.
The embodiments of the present invention provide a method and device for determining the category of an obstacle, an electronic device, and a storage medium. The method first receives the category output result of the current time corresponding to the target obstacle sent by each sensor; then calculates the category fusion result of the current time corresponding to the target obstacle according to a predetermined category fusion result of the previous time corresponding to the target obstacle and the category output result of the current time corresponding to the target obstacle; and finally determines the final category of the target obstacle according to the category fusion result of the current time corresponding to the target obstacle. In the conventional methods for determining the category of an obstacle, by contrast, either the output category of a certain sensor is determined as the final category of the target obstacle, or the output results of the sensors are weighted and fused using prior knowledge. Therefore, compared with the prior art, the method, device, electronic device, and storage medium provided by the embodiments of the present invention can determine the final category of the target obstacle more accurately; moreover, the technical solution of the embodiments of the present invention is simple to implement, easy to popularize, and broad in application range.
Drawings
Fig. 1 is a schematic flowchart of a method for determining a type of an obstacle according to an embodiment of the present invention;
fig. 2 is a flowchart illustrating a method for determining a type of an obstacle according to a second embodiment of the present invention;
fig. 3 is a flowchart illustrating a method for determining a type of an obstacle according to a third embodiment of the present invention;
fig. 4 is a first structural diagram of an obstacle category determination apparatus according to a fourth embodiment of the present invention;
fig. 5 is a second schematic structural diagram of an obstacle category determination apparatus according to a fourth embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some but not all of the relevant aspects of the present invention are shown in the drawings.
Example one
Fig. 1 is a flowchart of a method for determining an obstacle category according to an embodiment of the present invention, where the method may be executed by an apparatus or an electronic device for determining an obstacle category, where the apparatus or the electronic device may be implemented by software and/or hardware, and the apparatus or the electronic device may be integrated in any intelligent device with a network communication function. As shown in fig. 1, the method for determining the obstacle category may include the steps of:
and S101, receiving the type output result of the current time corresponding to the target obstacle sent by each sensor.
In an embodiment of the present invention, the electronic device may receive the category output result of the current time corresponding to the target obstacle sent by each sensor. For example, assume that M sensors may be provided on an unmanned vehicle, respectively: sensor 1, sensor 2, …, sensor M; the sensor 1 may send a category output result 1 of the current time corresponding to the target obstacle to the electronic device; the sensor 2 may send a category output result 2 of the current time corresponding to the target obstacle to the electronic device; …; the sensor M may send a category output result M of the current time corresponding to the target obstacle to the electronic device. Accordingly, the electronic device may receive the category output result 1 sent by the sensor 1, the category output result 2 sent by the sensor 2, …, and the category output result M sent by the sensor M, each being the category output result of the current time corresponding to the target obstacle; wherein M is a natural number greater than 1.
S102, calculating the category fusion result of the current time corresponding to the target obstacle according to a predetermined category fusion result of the previous time corresponding to the target obstacle and the category output result of the current time corresponding to the target obstacle.
In an embodiment of the present invention, the electronic device may calculate the category fusion result of the current time corresponding to the target obstacle according to a predetermined category fusion result of the previous time corresponding to the target obstacle and a predetermined category output result of the current time corresponding to the target obstacle. Specifically, the electronic device may first obtain a confidence fusion value of a previous time corresponding to each candidate category from a category fusion result of the previous time corresponding to the target obstacle; then obtaining a confidence output value of the current time corresponding to each candidate type from the type output result of the current time corresponding to the target obstacle; and calculating a category fusion result of the current time corresponding to the target obstacle according to the confidence fusion value of the previous time corresponding to each candidate category and the confidence output value of the current time corresponding to each candidate category.
S103, determining the final type of the target obstacle according to the type fusion result of the current time corresponding to the target obstacle.
In a specific embodiment of the present invention, the electronic device may determine the final category of the target obstacle according to the category fusion result of the current time corresponding to the target obstacle. Specifically, the electronic device may sort all candidate categories in descending order of the confidence fusion value of the current time corresponding to each candidate category in the category fusion result of the current time corresponding to the target obstacle, and then determine the candidate category with the maximum confidence fusion value as the final category of the target obstacle.
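Step S103 reduces to selecting the candidate category with the maximum confidence fusion value. A minimal sketch follows; the dictionary of fusion values is an illustrative assumption.

```python
def final_category(fusion_result):
    """Sort candidate categories in descending order of their current-time
    confidence fusion values and return the category with the maximum value."""
    ranked = sorted(fusion_result.items(), key=lambda kv: kv[1], reverse=True)
    return ranked[0][0]

# Illustrative category fusion result at the current time (not from the patent).
fusion_result = {"Ped": 0.7, "Bic": 0.1, "Car": 0.15, "Unk": 0.05}
print(final_category(fusion_result))  # → Ped
```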
The method for determining the category of an obstacle provided by the embodiment of the present invention first receives the category output result of the current time corresponding to the target obstacle sent by each sensor; then calculates the category fusion result of the current time corresponding to the target obstacle according to a predetermined category fusion result of the previous time corresponding to the target obstacle and the category output result of the current time corresponding to the target obstacle; and finally determines the final category of the target obstacle according to the category fusion result of the current time corresponding to the target obstacle. In the conventional methods for determining the category of an obstacle, by contrast, either the output category of a certain sensor is determined as the final category of the target obstacle, or the output results of the sensors are weighted and fused using prior knowledge. Therefore, compared with the prior art, the method provided by the embodiment of the present invention can determine the final category of the target obstacle more accurately; moreover, the technical solution of the embodiment of the present invention is simple to implement, easy to popularize, and broad in application range.
Example two
Fig. 2 is a flowchart illustrating a method for determining an obstacle category according to a second embodiment of the present invention. As shown in fig. 2, the method for determining the obstacle category may include the steps of:
and S201, receiving the type output result of the current time corresponding to the target obstacle sent by each sensor.
In an embodiment of the present invention, the electronic device may receive the category output result of the current time corresponding to the target obstacle sent by each sensor. For example, assume that M sensors may be provided on an unmanned vehicle, respectively: sensor 1, sensor 2, …, sensor M; the sensor 1 may send a category output result 1 of the current time corresponding to the target obstacle to the electronic device; the sensor 2 may send a category output result 2 of the current time corresponding to the target obstacle to the electronic device; …; the sensor M may send a category output result M of the current time corresponding to the target obstacle to the electronic device. Accordingly, the electronic device may receive the category output result 1 sent by the sensor 1, the category output result 2 sent by the sensor 2, …, and the category output result M sent by the sensor M, each being the category output result of the current time corresponding to the target obstacle; wherein M is a natural number greater than 1.
S202, a confidence fusion value at the previous time corresponding to each candidate category is obtained from the category fusion result at the previous time corresponding to the target obstacle.
In an embodiment of the present invention, the electronic device may obtain, from the category fusion result of the previous time corresponding to the target obstacle, the confidence fusion value of the previous time corresponding to each candidate category. Specifically, the category fusion result of the previous time corresponding to the target obstacle may include the confidence fusion values of the previous time corresponding to the N candidate categories, which are: the confidence fusion value 1 of the previous time corresponding to candidate category 1; the confidence fusion value 2 of the previous time corresponding to candidate category 2; …; and the confidence fusion value N of the previous time corresponding to candidate category N. For example, assume that the category fusion result of the previous time corresponding to the target obstacle includes the confidence fusion values of the previous time corresponding to four candidate categories, where Ped represents a pedestrian, Bic represents a bicycle, Car represents a motor vehicle, and Unk represents an unknown category; namely: m_{K-1} = [m_{K-1}(Ped), m_{K-1}(Bic), m_{K-1}(Car), m_{K-1}(Unk)], where m_{K-1} represents the category fusion result of the previous time corresponding to the target obstacle; m_{K-1}(Ped) represents the confidence fusion value of the previous time corresponding to the pedestrian; m_{K-1}(Bic) represents the confidence fusion value of the previous time corresponding to the bicycle; m_{K-1}(Car) represents the confidence fusion value of the previous time corresponding to the motor vehicle; and m_{K-1}(Unk) represents the confidence fusion value of the previous time corresponding to the unknown category. Therefore, in this step, the electronic device can acquire the confidence fusion value of the previous time corresponding to each candidate category from the category fusion result of the previous time corresponding to the target obstacle.
S203, obtaining the confidence output value of the current time corresponding to each candidate category from the category output result of the current time corresponding to the target obstacle.
In a specific embodiment of the present invention, the electronic device may obtain, from the class output result at the current time corresponding to the target obstacle, a confidence level output value at the current time corresponding to each candidate class. Specifically, eachThe category output result of the current time corresponding to the target obstacle output by each sensor may include confidence output values of the current time corresponding to N candidate categories, which are: a reliability output value 1 of the current time corresponding to the candidate type 1 and a reliability output value 2 of the current time corresponding to the candidate type 2; …, respectively; and outputting a current time reliability output value N corresponding to the candidate type N. For example, it is assumed that the category output result at the current time corresponding to the target obstacle output by the first sensor may include confidence output values at the current time corresponding to four candidate categories, and Ped represents a pedestrian; bic represents a bicycle; car represents a motor vehicle; unk represents an unknown class; namely: m is1=[a1(Ped),a1(Bic),a1(Car),a1(Unk)](ii) a Wherein m is1A category output result indicating a current time corresponding to the target obstacle output by the first sensor; a is1(Ped) represents a reliability output value of the pedestrian output by the first sensor at the current time; a is1(Bic) represents a reliability output value of the bicycle at the current time, which is output by the first sensor; a is1(Car) indicating a reliability output value of the first sensor at the current time corresponding to the motor vehicle; a is1(Unk) represents a confidence output value at the current time corresponding to the unknown class of the first sensor output. 
For another example, assume that the category output result at the current time corresponding to the target obstacle output by the second sensor may include confidence level output values at the current time corresponding to four candidate categories, and Ped represents a pedestrian; bic represents a bicycle; car represents a motor vehicle; unk represents an unknown class; namely: m is2=[a2(Ped),a2(Bic),a2(Car),a2(Unk)](ii) a Wherein m is2A category output result indicating a current time corresponding to the target obstacle output by the second sensor; a is2(Ped) represents a reliability output value of the pedestrian output by the second sensor at the current time; a is2(Bic) represents a reliability output value of the bicycle at the current time outputted by the second sensor; a is2(Car) represents a reliability output value of the vehicle output by the second sensor at the current time; a is2(Unk) represents a current time corresponding to the unknown class of the second sensor outputAnd outputting the value of the reliability of the moment. And so on. 
For another example, assume that the category output result at the current time corresponding to the target obstacle output by the M-th sensor includes confidence output values at the current time corresponding to four candidate categories, where Ped represents a pedestrian, Bic represents a bicycle, Car represents a motor vehicle, and Unk represents an unknown category, namely: mM = [aM(Ped), aM(Bic), aM(Car), aM(Unk)]; wherein mM represents the category output result at the current time corresponding to the target obstacle output by the M-th sensor; aM(Ped) represents the confidence output value at the current time corresponding to the pedestrian category output by the M-th sensor; aM(Bic) represents the confidence output value at the current time corresponding to the bicycle category; aM(Car) represents the confidence output value at the current time corresponding to the motor vehicle category; and aM(Unk) represents the confidence output value at the current time corresponding to the unknown category. Therefore, in this step, the electronic device may obtain the confidence output value at the current time corresponding to each candidate category from the category output result at the current time corresponding to the target obstacle.
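The per-sensor vectors above can be represented directly in code. The following sketch is illustrative only; the dictionary layout and the helper name `make_output` are assumptions for the example, not part of the patent:

```python
# Candidate categories named in the specification: pedestrian, bicycle,
# motor vehicle, and unknown.
CANDIDATES = ("Ped", "Bic", "Car", "Unk")

def make_output(ped, bic, car, unk):
    """Pack one sensor's current-time confidence output values into a
    category output vector m_i = [a_i(Ped), a_i(Bic), a_i(Car), a_i(Unk)]."""
    return dict(zip(CANDIDATES, (ped, bic, car, unk)))

# One category output vector per sensor for the same target obstacle.
m_1 = make_output(0.7, 0.1, 0.1, 0.1)  # first sensor
m_2 = make_output(0.6, 0.2, 0.1, 0.1)  # second sensor
```

Each vector maps a candidate category to its confidence output value at the current time; M sensors yield M such vectors for one target obstacle.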
And S204, calculating a category fusion result of the current time corresponding to the target obstacle according to the confidence fusion value of the previous time corresponding to each candidate category and the confidence output value of the current time corresponding to each candidate category.
In an embodiment of the present invention, the electronic device may calculate a category fusion result of the current time corresponding to the target obstacle according to the previous-time confidence fusion value corresponding to each candidate category and the current-time confidence output value corresponding to each candidate category. Specifically, the electronic device may calculate a confidence fusion value of the current time corresponding to each candidate category according to the confidence fusion value of the previous time corresponding to each candidate category and the confidence output value of the current time corresponding to each candidate category; and determining a category fusion result of the current time corresponding to the target obstacle according to the confidence fusion value of the current time corresponding to each candidate category.
And S205, determining the final type of the target obstacle according to the type fusion result of the current time corresponding to the target obstacle.
In a specific embodiment of the present invention, the electronic device may determine the final category of the target obstacle according to the category fusion result at the current time corresponding to the target obstacle. Specifically, the electronic device may sort all candidate categories by their confidence fusion values at the current time in the category fusion result, and then determine the candidate category with the largest confidence fusion value as the final category of the target obstacle.
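Selecting the final category as described above reduces to taking the candidate with the largest current-time confidence fusion value. A minimal sketch follows; the function and variable names are hypothetical:

```python
def final_category(fusion_result):
    """Return the candidate category whose current-time confidence
    fusion value is largest, i.e. the final category of the obstacle."""
    return max(fusion_result, key=fusion_result.get)

# Example fused values for the four candidate categories.
fused = {"Ped": 0.62, "Bic": 0.20, "Car": 0.10, "Unk": 0.08}
final_category(fused)  # -> "Ped"
```

A full sort is unnecessary for picking the maximum, but sorting (as the text describes) yields the same final category.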
In the method for determining the type of an obstacle provided by the embodiment of the invention, the electronic device first receives the category output result at the current time corresponding to the target obstacle sent by each sensor; then calculates the category fusion result at the current time corresponding to the target obstacle according to the predetermined category fusion result at the previous time corresponding to the target obstacle and the category output result at the current time corresponding to the target obstacle; and finally determines the final category of the target obstacle according to the category fusion result at the current time corresponding to the target obstacle. In the conventional methods for determining the type of an obstacle, either the output category of one particular sensor is taken as the final category of the target obstacle, or the output results of the sensors are weighted and fused using prior knowledge. Compared with the prior art, the method provided by the embodiment of the invention can therefore determine the final category of the target obstacle more accurately; moreover, the technical scheme of the embodiment of the invention is simple to implement, easy to popularize, and applicable to a wider range of scenarios.
EXAMPLE III
Fig. 3 is a flowchart illustrating a method for determining an obstacle category according to a third embodiment of the present invention. As shown in fig. 3, the method for determining the obstacle category may include the steps of:
and S301, receiving the type output result of the current time corresponding to the target obstacle sent by each sensor.
In an embodiment of the present invention, the electronic device may receive the category output result at the current time corresponding to the target obstacle sent by each sensor. For example, assume that M sensors are provided on an unmanned vehicle, namely: sensor 1, sensor 2, ..., sensor M. Sensor 1 may send category output result 1 at the current time corresponding to the target obstacle to the electronic device; sensor 2 may send category output result 2 at the current time corresponding to the target obstacle to the electronic device; ...; and sensor M may send category output result M at the current time corresponding to the target obstacle to the electronic device. Accordingly, the electronic device can receive category output result 1 sent by sensor 1, category output result 2 sent by sensor 2, ..., and category output result M sent by sensor M, each at the current time corresponding to the target obstacle; wherein M is a natural number greater than 1.
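The collection step of S301 can be sketched as follows. `DummySensor` and `collect_outputs` are hypothetical stand-ins, since the patent does not prescribe a sensor interface:

```python
class DummySensor:
    """Hypothetical stand-in for one sensor channel on the vehicle."""
    def __init__(self, output):
        self._output = output

    def current_output(self):
        """Category output result at the current time for the target obstacle."""
        return self._output

def collect_outputs(sensors):
    """Gather [m_1, ..., m_M], one category output vector per sensor."""
    return [s.current_output() for s in sensors]

sensors = [
    DummySensor({"Ped": 0.7, "Bic": 0.1, "Car": 0.1, "Unk": 0.1}),
    DummySensor({"Ped": 0.6, "Bic": 0.2, "Car": 0.1, "Unk": 0.1}),
]
outputs = collect_outputs(sensors)  # M = 2 results for one obstacle
```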
And S302, acquiring a confidence fusion value of the previous time corresponding to each candidate type from the type fusion result of the previous time corresponding to the target obstacle.
In an embodiment of the present invention, the electronic device may obtain, from the category fusion result at the previous time corresponding to the target obstacle, a confidence fusion value at the previous time corresponding to each candidate category. Specifically, the category fusion result at the previous time corresponding to the target obstacle may include confidence fusion values at the previous time corresponding to N candidate categories, namely: confidence fusion value 1 at the previous time corresponding to candidate category 1; confidence fusion value 2 at the previous time corresponding to candidate category 2; ...; and confidence fusion value N at the previous time corresponding to candidate category N. For example, assume that the category fusion result at the previous time corresponding to the target obstacle includes confidence fusion values at the previous time corresponding to four candidate categories, where Ped represents a pedestrian, Bic represents a bicycle, Car represents a motor vehicle, and Unk represents an unknown category, namely: m^(K-1) = [mK-1(Ped), mK-1(Bic), mK-1(Car), mK-1(Unk)]; wherein m^(K-1) represents the category fusion result at the previous time corresponding to the target obstacle; mK-1(Ped) represents the confidence fusion value at the previous time corresponding to the pedestrian category; mK-1(Bic) represents the confidence fusion value at the previous time corresponding to the bicycle category; mK-1(Car) represents the confidence fusion value at the previous time corresponding to the motor vehicle category; and mK-1(Unk) represents the confidence fusion value at the previous time corresponding to the unknown category. Therefore, in this step, the electronic device can obtain the confidence fusion value at the previous time corresponding to each candidate category from the category fusion result at the previous time corresponding to the target obstacle.
And S303, acquiring a reliability output value of the current time corresponding to each candidate type from the type output result of the current time corresponding to the target obstacle.
In a specific embodiment of the present invention, the electronic device may obtain, from the category output result at the current time corresponding to the target obstacle, a confidence output value at the current time corresponding to each candidate category. Specifically, the category output result at the current time corresponding to the target obstacle output by each sensor may include confidence output values at the current time corresponding to N candidate categories, namely: confidence output value 1 at the current time corresponding to candidate category 1; confidence output value 2 at the current time corresponding to candidate category 2; ...; and confidence output value N at the current time corresponding to candidate category N. For example, assume that the category output result at the current time corresponding to the target obstacle output by the first sensor includes confidence output values at the current time corresponding to four candidate categories, where Ped represents a pedestrian, Bic represents a bicycle, Car represents a motor vehicle, and Unk represents an unknown category, namely: m1 = [a1(Ped), a1(Bic), a1(Car), a1(Unk)]; wherein m1 represents the category output result at the current time corresponding to the target obstacle output by the first sensor; a1(Ped) represents the confidence output value at the current time corresponding to the pedestrian category output by the first sensor; a1(Bic) represents the confidence output value at the current time corresponding to the bicycle category; a1(Car) represents the confidence output value at the current time corresponding to the motor vehicle category; and a1(Unk) represents the confidence output value at the current time corresponding to the unknown category.
For another example, assume that the category output result at the current time corresponding to the target obstacle output by the second sensor includes confidence output values at the current time corresponding to four candidate categories, where Ped represents a pedestrian, Bic represents a bicycle, Car represents a motor vehicle, and Unk represents an unknown category, namely: m2 = [a2(Ped), a2(Bic), a2(Car), a2(Unk)]; wherein m2 represents the category output result at the current time corresponding to the target obstacle output by the second sensor; a2(Ped) represents the confidence output value at the current time corresponding to the pedestrian category output by the second sensor; a2(Bic) represents the confidence output value at the current time corresponding to the bicycle category; a2(Car) represents the confidence output value at the current time corresponding to the motor vehicle category; and a2(Unk) represents the confidence output value at the current time corresponding to the unknown category. And so on.
For another example, assume that the category output result at the current time corresponding to the target obstacle output by the M-th sensor includes confidence output values at the current time corresponding to four candidate categories, where Ped represents a pedestrian, Bic represents a bicycle, Car represents a motor vehicle, and Unk represents an unknown category, namely: mM = [aM(Ped), aM(Bic), aM(Car), aM(Unk)]; wherein mM represents the category output result at the current time corresponding to the target obstacle output by the M-th sensor; aM(Ped) represents the confidence output value at the current time corresponding to the pedestrian category output by the M-th sensor; aM(Bic) represents the confidence output value at the current time corresponding to the bicycle category; aM(Car) represents the confidence output value at the current time corresponding to the motor vehicle category; and aM(Unk) represents the confidence output value at the current time corresponding to the unknown category. Therefore, in this step, the electronic device may obtain the confidence output value at the current time corresponding to each candidate category from the category output result at the current time corresponding to the target obstacle.
And S304, calculating the reliability fusion value of the current time corresponding to each candidate category according to the reliability fusion value of the previous time corresponding to each candidate category and the reliability output value of the current time corresponding to each candidate category.
In an embodiment of the present invention, the electronic device may calculate the confidence fusion value at the current time corresponding to each candidate category according to the confidence fusion value at the previous time corresponding to each candidate category and the confidence output value at the current time corresponding to each candidate category. Specifically, the electronic device may first determine the mutually associated candidate categories and the mutually exclusive candidate categories corresponding to each candidate category; then obtain, from the category fusion result at the previous time corresponding to the target obstacle, the confidence fusion values at the previous time corresponding to the mutually associated candidate categories and to the mutually exclusive candidate categories; obtain, from the category output result at the current time corresponding to the target obstacle, the confidence output values at the current time corresponding to the mutually associated candidate categories and to the mutually exclusive candidate categories; and calculate the confidence fusion value at the current time corresponding to each candidate category from these four quantities. Specifically, the electronic device may calculate the confidence fusion value at the current time corresponding to each candidate category A according to a combination formula in which A ∈ {Ped, Bic, Car, Unk} and the result is scaled by a normalization constant.
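The patent's exact update formula is not reproduced in this excerpt, so the following is only one plausible illustration of the step above: a Dempster-style combination over {Ped, Bic, Car} in which Unk is treated as the ignorance (whole-frame) mass, a category is mutually associated with itself and with Unk, and the other singleton categories are mutually exclusive with it. All names here are assumptions for the sketch:

```python
SINGLETONS = ("Ped", "Bic", "Car")

def dempster_combine(m_prev, a_cur):
    """Fuse the previous-time confidence fusion values m_{K-1}(A) with one
    current-time confidence output a(A), returning normalized m_K(A)."""
    fused = {}
    for cls in SINGLETONS:
        # Mutually associated terms: the class agrees with itself or with
        # the unknown class; products of mutually exclusive singletons
        # are conflict mass and are dropped.
        fused[cls] = (m_prev[cls] * a_cur[cls]
                      + m_prev[cls] * a_cur["Unk"]
                      + m_prev["Unk"] * a_cur[cls])
    fused["Unk"] = m_prev["Unk"] * a_cur["Unk"]
    z = sum(fused.values())  # normalization constant
    return {cls: v / z for cls, v in fused.items()}

m_prev = {"Ped": 0.5, "Bic": 0.2, "Car": 0.2, "Unk": 0.1}
a_cur = {"Ped": 0.6, "Bic": 0.1, "Car": 0.2, "Unk": 0.1}
m_cur = dempster_combine(m_prev, a_cur)
```

Dividing by the normalization constant z discards the conflict mass contributed by mutually exclusive category pairs, so the fused values again form a distribution over the four candidate categories.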
And S305, determining a category fusion result of the current time corresponding to the target obstacle according to the confidence fusion value of the current time corresponding to each candidate category.
In a specific embodiment of the present invention, the electronic device may determine the category fusion result at the current time corresponding to the target obstacle according to the confidence fusion value at the current time corresponding to each candidate category. Specifically, assume that the category fusion result at the current time corresponding to the target obstacle includes confidence fusion values at the current time corresponding to four candidate categories, where Ped represents a pedestrian, Bic represents a bicycle, Car represents a motor vehicle, and Unk represents an unknown category, namely: m^K = [mK(Ped), mK(Bic), mK(Car), mK(Unk)]; wherein m^K represents the category fusion result at the current time corresponding to the target obstacle; mK(Ped) represents the confidence fusion value at the current time corresponding to the pedestrian category; mK(Bic) represents the confidence fusion value at the current time corresponding to the bicycle category; mK(Car) represents the confidence fusion value at the current time corresponding to the motor vehicle category; and mK(Unk) represents the confidence fusion value at the current time corresponding to the unknown category. Therefore, in this step, the electronic device may determine the category fusion result at the current time corresponding to the target obstacle from the confidence fusion value at the current time corresponding to each candidate category.
And S306, determining the final type of the target obstacle according to the type fusion result of the target obstacle at the current moment.
In a specific embodiment of the present invention, the electronic device may determine the final category of the target obstacle according to the category fusion result at the current time corresponding to the target obstacle. Specifically, the electronic device may sort all candidate categories by their confidence fusion values at the current time in the category fusion result, and then determine the candidate category with the largest confidence fusion value as the final category of the target obstacle.
In the method for determining the type of an obstacle provided by the embodiment of the invention, the electronic device first receives the category output result at the current time corresponding to the target obstacle sent by each sensor; then calculates the category fusion result at the current time corresponding to the target obstacle according to the predetermined category fusion result at the previous time corresponding to the target obstacle and the category output result at the current time corresponding to the target obstacle; and finally determines the final category of the target obstacle according to the category fusion result at the current time corresponding to the target obstacle. In the conventional methods for determining the type of an obstacle, either the output category of one particular sensor is taken as the final category of the target obstacle, or the output results of the sensors are weighted and fused using prior knowledge. Compared with the prior art, the method provided by the embodiment of the invention can therefore determine the final category of the target obstacle more accurately; moreover, the technical scheme of the embodiment of the invention is simple to implement, easy to popularize, and applicable to a wider range of scenarios.
EXAMPLE IV
Fig. 4 is a first structural diagram of an obstacle category determination apparatus according to a fourth embodiment of the present invention. As shown in fig. 4, the apparatus for determining the obstacle category according to the embodiment of the present invention may include: a receiving module 401, a calculating module 402 and a determining module 403; wherein,
the receiving module 401 is configured to receive a category output result of the current time corresponding to the target obstacle sent by each sensor;
the calculating module 402 is configured to calculate a category fusion result of the current time corresponding to the target obstacle according to a predetermined category fusion result of the previous time corresponding to the target obstacle and a predetermined category output result of the current time corresponding to the target obstacle;
the determining module 403 is configured to determine a final category of the target obstacle according to a category fusion result of the current time corresponding to the target obstacle.
Fig. 5 is a second structural diagram of an obstacle category determination apparatus according to a fourth embodiment of the present invention. As shown in fig. 5, the calculation module 402 includes: an acquisition sub-module 4021 and a calculation sub-module 4022; wherein,
the obtaining sub-module 4021 is configured to obtain a confidence fusion value at a previous time corresponding to each candidate category from a category fusion result at the previous time corresponding to the target obstacle; obtaining a confidence output value of the current time corresponding to each candidate type from the type output result of the current time corresponding to the target obstacle;
the calculating sub-module 4022 is configured to calculate a category fusion result of the current time corresponding to the target obstacle according to the confidence fusion value of the previous time corresponding to each candidate category and the confidence output value of the current time corresponding to each candidate category.
Further, the calculating sub-module 4022 is configured to calculate a confidence fusion value of the current time corresponding to each candidate category according to the confidence fusion value of the previous time corresponding to each candidate category and the confidence output value of the current time corresponding to each candidate category; and determining a category fusion result of the current time corresponding to the target obstacle according to the confidence fusion value of the current time corresponding to each candidate category.
Further, the calculating sub-module 4022 is specifically configured to determine the mutually associated candidate categories and the mutually exclusive candidate categories corresponding to each candidate category; acquire a confidence fusion value at the previous time corresponding to the mutually associated candidate categories and a confidence fusion value at the previous time corresponding to the mutually exclusive candidate categories from the category fusion result at the previous time corresponding to the target obstacle; obtain a confidence output value at the current time corresponding to the mutually associated candidate categories and a confidence output value at the current time corresponding to the mutually exclusive candidate categories from the category output result at the current time corresponding to the target obstacle; and calculate the confidence fusion value at the current time corresponding to each candidate category according to the confidence fusion value at the previous time corresponding to the mutually associated candidate categories, the confidence fusion value at the previous time corresponding to the mutually exclusive candidate categories, the confidence output value at the current time corresponding to the mutually associated candidate categories, and the confidence output value at the current time corresponding to the mutually exclusive candidate categories.
Further, the determining module 403 is specifically configured to sort all candidate categories according to the category fusion result of the current time corresponding to the target obstacle, according to the confidence fusion value of the current time corresponding to the candidate categories; and determining the candidate class with the maximum confidence fusion value as the final class of the target obstacle.
The device for determining the obstacle category can execute the method provided by any embodiment of the present invention, and has the functional modules and beneficial effects corresponding to the executed method. For technical details not described in detail in this embodiment, reference may be made to the method for determining the obstacle category provided in any embodiment of the present invention.
EXAMPLE V
Fig. 6 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present invention. FIG. 6 illustrates a block diagram of an exemplary electronic device suitable for use in implementing embodiments of the present invention. The electronic device 12 shown in fig. 6 is only an example and should not bring any limitation to the function and the scope of use of the embodiment of the present invention.
As shown in FIG. 6, electronic device 12 is embodied in the form of a general purpose computing device. The components of electronic device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) 30 and/or cache memory 32. The electronic device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 6, and commonly referred to as a "hard drive"). Although not shown in FIG. 6, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
The processing unit 16 executes various functional applications and data processing by executing programs stored in the system memory 28, for example, to implement the method for determining the obstacle category provided by the embodiment of the present invention.
EXAMPLE VI
The sixth embodiment of the invention provides a computer storage medium.
The computer-readable storage media of embodiments of the invention may take any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.
Claims (6)
1. A method of determining a type of obstacle, the method comprising:
receiving the type output result of the target obstacle at the current moment, which is sent by each sensor;
calculating a category fusion result of the current moment corresponding to the target obstacle according to a predetermined category fusion result of the previous moment corresponding to the target obstacle and a predetermined category output result of the current moment corresponding to the target obstacle;
determining the final type of the target obstacle according to the type fusion result of the target obstacle at the current moment;
wherein the calculating a category fusion result of the current time corresponding to the target obstacle according to a predetermined category fusion result of a previous time corresponding to the target obstacle and a predetermined category output result of the current time corresponding to the target obstacle includes:
acquiring a confidence level fusion value of the last time corresponding to each candidate category from the category fusion result of the last time corresponding to the target obstacle;
obtaining a confidence output value of the current time corresponding to each candidate type from the type output result of the current time corresponding to the target obstacle;
calculating a category fusion result of the current time corresponding to the target obstacle according to the confidence fusion value of the previous time corresponding to each candidate category and the confidence output value of the current time corresponding to each candidate category;
wherein the calculating a category fusion result of the current time corresponding to the target obstacle according to the confidence fusion value of the previous time corresponding to each candidate category and the confidence output value of the current time corresponding to each candidate category includes:
calculating the confidence fusion value of the current time corresponding to each candidate category according to the confidence fusion value of the previous time corresponding to each candidate category and the confidence output value of the current time corresponding to each candidate category;
determining a category fusion result of the current time corresponding to the target obstacle according to the confidence fusion value of the current time corresponding to each candidate category;
wherein the calculating the confidence fusion value of the current time corresponding to each candidate category according to the confidence fusion value of the previous time corresponding to each candidate category and the confidence output value of the current time corresponding to each candidate category comprises:
determining the mutually associated candidate categories and the mutually exclusive candidate categories corresponding to each candidate category;
acquiring the confidence fusion value of the previous time corresponding to the mutually associated candidate categories and the confidence fusion value of the previous time corresponding to the mutually exclusive candidate categories from the category fusion result of the previous time corresponding to the target obstacle;
obtaining the confidence output value of the current time corresponding to the mutually associated candidate categories and the confidence output value of the current time corresponding to the mutually exclusive candidate categories from the category output result of the current time corresponding to the target obstacle;
and calculating the confidence fusion value of the current time corresponding to each candidate category according to the confidence fusion value of the previous time corresponding to the mutually associated candidate categories, the confidence fusion value of the previous time corresponding to the mutually exclusive candidate categories, the confidence output value of the current time corresponding to the mutually associated candidate categories, and the confidence output value of the current time corresponding to the mutually exclusive candidate categories.
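Claim 1 specifies which quantities enter the time-sequential fusion but leaves the combination formula open. Below is a minimal Python sketch of one plausible scheme, in which associated categories contribute supporting evidence and mutually exclusive categories contribute opposing evidence; the function name, dictionary layout, and blending weight `alpha` are all illustrative assumptions, not the patented formula:

```python
def fuse_confidences(prev_fusion, current_output, associated, exclusive, alpha=0.5):
    """Fuse previous-time confidence fusion values with current-time
    confidence output values, per candidate category.

    The +/- treatment of associated vs. mutually exclusive categories and
    the weight `alpha` are assumptions; the claim fixes no concrete formula.
    """
    fused = {}
    for cat, prev in prev_fusion.items():
        # Current evidence for `cat`: its own output value, plus the output
        # of mutually associated categories, minus the output assigned to
        # mutually exclusive categories.
        evidence = current_output.get(cat, 0.0)
        evidence += sum(current_output.get(a, 0.0) for a in associated.get(cat, ()))
        evidence -= sum(current_output.get(e, 0.0) for e in exclusive.get(cat, ()))
        # Blend the historical fusion value with the current evidence,
        # clamping at zero so confidences stay non-negative.
        fused[cat] = max(0.0, alpha * prev + (1.0 - alpha) * evidence)
    return fused
```

Running one step with two mutually exclusive candidates, e.g. `fuse_confidences({"car": 0.6, "pedestrian": 0.2}, {"car": 0.8, "pedestrian": 0.1}, {}, {"car": ["pedestrian"], "pedestrian": ["car"]})`, raises the fused confidence of the consistently detected category and suppresses its exclusive rival.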
2. The method according to claim 1, wherein the determining the final category of the target obstacle according to the category fusion result of the current time corresponding to the target obstacle comprises:
sorting all the candidate categories according to the confidence fusion value of the current time corresponding to each candidate category in the category fusion result of the current time corresponding to the target obstacle;
and determining the candidate category with the maximum confidence fusion value as the final category of the target obstacle.
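The selection step of claim 2 is a sort-and-take-maximum over the fused confidences; a minimal sketch, where the function name and dictionary layout are hypothetical:

```python
def final_category(fusion_result):
    """Pick the final obstacle category: sort candidate categories by their
    current-time confidence fusion value and return the maximum."""
    ranked = sorted(fusion_result.items(), key=lambda kv: kv[1], reverse=True)
    return ranked[0][0]
```

For example, `final_category({"car": 0.65, "bicycle": 0.2, "pedestrian": 0.0})` returns `"car"`.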
3. An apparatus for determining a type of an obstacle, the apparatus comprising: the device comprises a receiving module, a calculating module and a determining module; wherein,
the receiving module is used for receiving the category output result of the current time corresponding to the target obstacle sent by each sensor;
the calculation module is used for calculating a category fusion result of the current moment corresponding to the target obstacle according to a predetermined category fusion result of the previous moment corresponding to the target obstacle and a predetermined category output result of the current moment corresponding to the target obstacle;
the determining module is used for determining the final category of the target obstacle according to the category fusion result of the current time corresponding to the target obstacle;
wherein the calculation module comprises: an acquisition submodule and a calculation submodule; wherein,
the obtaining submodule is used for obtaining the confidence fusion value of the previous time corresponding to each candidate category from the category fusion result of the previous time corresponding to the target obstacle; and obtaining the confidence output value of the current time corresponding to each candidate category from the category output result of the current time corresponding to the target obstacle;
the calculation submodule is used for calculating the category fusion result of the current time corresponding to the target obstacle according to the confidence fusion value of the previous time corresponding to each candidate category and the confidence output value of the current time corresponding to each candidate category;
the calculation submodule is used for calculating the confidence fusion value of the current time corresponding to each candidate category according to the confidence fusion value of the previous time corresponding to each candidate category and the confidence output value of the current time corresponding to each candidate category; and determining the category fusion result of the current time corresponding to the target obstacle according to the confidence fusion value of the current time corresponding to each candidate category;
the calculation submodule is specifically configured to determine the mutually associated candidate categories and the mutually exclusive candidate categories corresponding to each candidate category; acquire the confidence fusion value of the previous time corresponding to the mutually associated candidate categories and the confidence fusion value of the previous time corresponding to the mutually exclusive candidate categories from the category fusion result of the previous time corresponding to the target obstacle; obtain the confidence output value of the current time corresponding to the mutually associated candidate categories and the confidence output value of the current time corresponding to the mutually exclusive candidate categories from the category output result of the current time corresponding to the target obstacle; and calculate the confidence fusion value of the current time corresponding to each candidate category according to the confidence fusion value of the previous time corresponding to the mutually associated candidate categories, the confidence fusion value of the previous time corresponding to the mutually exclusive candidate categories, the confidence output value of the current time corresponding to the mutually associated candidate categories, and the confidence output value of the current time corresponding to the mutually exclusive candidate categories.
4. The apparatus of claim 3, wherein:
the determining module is specifically configured to sort all the candidate categories according to the confidence fusion value of the current time corresponding to each candidate category in the category fusion result of the current time corresponding to the target obstacle; and determine the candidate category with the maximum confidence fusion value as the final category of the target obstacle.
5. An electronic device, comprising:
one or more processors;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of determining the category of obstacles of any one of claims 1-2.
6. A storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method of determining the obstacle category according to any one of claims 1-2.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811506234.8A CN109635868B (en) | 2018-12-10 | 2018-12-10 | Method and device for determining obstacle type, electronic device and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109635868A CN109635868A (en) | 2019-04-16 |
CN109635868B true CN109635868B (en) | 2021-07-20 |
Family
ID=66072547
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811506234.8A Active CN109635868B (en) | 2018-12-10 | 2018-12-10 | Method and device for determining obstacle type, electronic device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109635868B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110843792B (en) * | 2019-11-29 | 2021-05-25 | 北京百度网讯科技有限公司 | Method and apparatus for outputting information |
CN111582173A (en) * | 2020-05-08 | 2020-08-25 | 东软睿驰汽车技术(沈阳)有限公司 | Automatic driving method and system |
CN114386481A (en) * | 2021-12-14 | 2022-04-22 | 京东鲲鹏(江苏)科技有限公司 | Vehicle perception information fusion method, device, equipment and storage medium |
CN114858200B (en) * | 2022-04-19 | 2023-06-27 | 合众新能源汽车股份有限公司 | Method and device for evaluating quality of object detected by vehicle sensor |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106707293A (en) * | 2016-12-01 | 2017-05-24 | 百度在线网络技术(北京)有限公司 | Obstacle recognition method and device for vehicles |
CN108573271A (en) * | 2017-12-15 | 2018-09-25 | 蔚来汽车有限公司 | Optimization method and device for multi-sensor target information fusion, computer equipment, and recording medium |
CN108664989A (en) * | 2018-03-27 | 2018-10-16 | 北京达佳互联信息技术有限公司 | Image tag determines method, apparatus and terminal |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8666111B2 (en) * | 2011-05-24 | 2014-03-04 | Tata Consultancy Services Limited | System and method for detecting the watermark using decision fusion |
JP5597322B1 (en) * | 2012-11-05 | 2014-10-01 | パナソニック株式会社 | RUNNING INFORMATION GENERATION DEVICE, METHOD, AND PROGRAM FOR AUTONOMOUS TRAVEL DEVICE |
US10034066B2 (en) * | 2016-05-02 | 2018-07-24 | Bao Tran | Smart device |
CN107966700A (en) * | 2017-11-20 | 2018-04-27 | 天津大学 | A kind of front obstacle detecting system and method for pilotless automobile |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106707293A (en) * | 2016-12-01 | 2017-05-24 | 百度在线网络技术(北京)有限公司 | Obstacle recognition method and device for vehicles |
CN108573271A (en) * | 2017-12-15 | 2018-09-25 | 蔚来汽车有限公司 | Optimization method and device for multi-sensor target information fusion, computer equipment, and recording medium |
CN108664989A (en) * | 2018-03-27 | 2018-10-16 | 北京达佳互联信息技术有限公司 | Image tag determines method, apparatus and terminal |
Non-Patent Citations (2)
Title |
---|
An improved classification method of concealed obstacles using UWB radar and stereo cameras; Dong Won Yang et al.; IEEE; 2011-11-28; pp. 1-4 *
Research on a neural network model for decision-level spatio-temporal information fusion (决策层时空信息融合的神经网络模型研究); Zhu Yupeng et al.; Systems Engineering and Electronics (《系统工程与电子技术》); 2008-06-30; Vol. 30, No. 6; pp. 1098-1102 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109059902B (en) | Relative pose determination method, device, equipment and medium | |
EP3627180B1 (en) | Sensor calibration method and device, computer device, medium, and vehicle | |
CN109188457B (en) | Object detection frame generation method, device, equipment, storage medium and vehicle | |
CN109188438B (en) | Yaw angle determination method, device, equipment and medium | |
CN109635868B (en) | Method and device for determining obstacle type, electronic device and storage medium | |
CN109284348B (en) | Electronic map updating method, device, equipment and storage medium | |
CN110095752B (en) | Positioning method, apparatus, device and medium | |
CN109435955B (en) | Performance evaluation method, device and equipment for automatic driving system and storage medium | |
CN109870698B (en) | Ultrasonic array obstacle detection result processing method and system | |
CN110134126B (en) | Track matching method, device, equipment and medium | |
CN109635861B (en) | Data fusion method and device, electronic equipment and storage medium | |
CN113537362A (en) | Perception fusion method, device, equipment and medium based on vehicle-road cooperation | |
CN109118797B (en) | Information sharing method, device, equipment and storage medium | |
CN112100565A (en) | Road curvature determination method, device, equipment and storage medium | |
CN112712036A (en) | Traffic sign recognition method and device, electronic equipment and computer storage medium | |
CN114490910A (en) | Map generation method and device, electronic equipment and storage medium | |
JP2023038164A (en) | Obstacle detection method, device, automatic driving vehicle, apparatus, and storage medium | |
WO2024051344A1 (en) | Map creation method and apparatus | |
CN109270566B (en) | Navigation method, navigation effect testing method, device, equipment and medium | |
WO2023051398A1 (en) | Security compensation method and apparatus, and storage medium and electronic device | |
CN114578401B (en) | Method and device for generating lane track points, electronic equipment and storage medium | |
CN113111692A (en) | Target detection method and device, computer readable storage medium and electronic equipment | |
CN114429631B (en) | Three-dimensional object detection method, device, equipment and storage medium | |
CN108872999B (en) | Object identification method, device, identification equipment and storage medium | |
CN113721240A (en) | Target association method and device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||