CN113119988B - Unmanned driving decision generation method and device, storage medium and computer equipment - Google Patents


Info

Publication number
CN113119988B
CN113119988B (application CN201911417516.5A)
Authority
CN
China
Prior art keywords
target information
included angle
cosines
probability distribution
distribution function
Prior art date
Legal status
Active
Application number
CN201911417516.5A
Other languages
Chinese (zh)
Other versions
CN113119988A
Inventor
任大凯
程婕
蔡嘉
刘涛
张胜
Current Assignee
China Mobile Communications Group Co Ltd
China Mobile Shanghai ICT Co Ltd
CM Intelligent Mobility Network Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
China Mobile Shanghai ICT Co Ltd
CM Intelligent Mobility Network Co Ltd
Priority date
Filing date
Publication date
Application filed by China Mobile Communications Group Co Ltd, China Mobile Shanghai ICT Co Ltd, CM Intelligent Mobility Network Co Ltd filed Critical China Mobile Communications Group Co Ltd
Priority to CN201911417516.5A priority Critical patent/CN113119988B/en
Publication of CN113119988A publication Critical patent/CN113119988A/en
Application granted granted Critical
Publication of CN113119988B publication Critical patent/CN113119988B/en

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60W — CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 50/00 — Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W 2050/0001 — Details of the control system
    • B60W 2050/0043 — Signal treatments, identification of variables or parameters, parameter estimation or state estimation

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

In the unmanned driving decision generation method, apparatus, storage medium, and computer equipment provided by the embodiments of the invention, the sum of the cosines of a first included angle is calculated from the obtained scale vectors of a plurality of pieces of target information. If this cosine sum is judged not to be greater than a preset threshold, the weight of the initial probability distribution function of the sensor corresponding to the first target information is adjusted to generate an adjusted probability distribution function. A fusion rule function is then generated from the adjusted probability distribution function and a set conflict factor, and an unmanned driving decision result is calculated from the fusion rule function and a preset judgment rule. By fusing only valid information, the reliability of the automatic driving technique is effectively improved.

Description

Unmanned driving decision generation method and device, storage medium and computer equipment
[ technical field ]
The invention relates to the field of Internet of Things services, and in particular to a method and apparatus for generating an unmanned driving decision, a storage medium, and computer equipment.
[ background of the invention ]
In the related art, multi-source sensor data fusion methods usually apply Bayesian statistical theory to compute the probability of correctly identifying a traffic speed-limit sign. However, if such a sign is partially blocked, the recognition rate drops sharply, so the reliability of automatic driving cannot be guaranteed. These methods also lack perception of the overall road situation: in the networked automatic driving scenario, global road-information perception and the construction of a driving-space perception have not been studied, which is insufficient to support the vehicle and road perception requirements of future 5G networked automatic driving. The related art therefore lacks a multi-source sensor data fusion scheme that can accurately identify the type of target information, filter out invalid information, and fuse valid information so as to improve the reliability of the automatic driving technique.
[ summary of the invention ]
In view of the above, the invention provides a method, an apparatus, a storage medium and a computer device for generating an unmanned driving decision, which improve the reliability of an automatic driving technique by accurately identifying the type of target information, filtering out invalid information and fusing valid information.
In one aspect, an embodiment of the present invention provides a method for generating an unmanned driving decision, including:
calculating the cosine sum of a first included angle according to the obtained scale vectors of the plurality of target information;
if the sum of the cosines of the first included angle is judged to be not greater than the preset threshold value, the weight of the initial probability distribution function of the sensor corresponding to the first target information is adjusted, and the adjusted probability distribution function is generated;
generating a fusion rule function according to the adjusted probability distribution function and the set conflict factor;
and calculating a decision judgment result of the unmanned vehicle according to the fusion rule function and a preset judgment rule.
Optionally, before the calculating a sum of cosines of the first included angle according to the obtained scale vectors of the plurality of target information, the method further includes:
calculating, according to a Euclidean distance formula, the Euclidean distance between the scale vector of the target information and a reference scale vector;
judging whether the Euclidean distance is greater than a preset distance;
if the Euclidean distance is judged to be not greater than the preset distance, filtering perception information of a vehicle-mounted sensor corresponding to the target information;
and if the Euclidean distance is judged to be larger than the preset distance, continuing to execute the step of calculating the sum of the cosines of the first included angle according to the obtained scale vectors of the plurality of target information.
Optionally, the calculating a sum of cosines of the first included angle according to the obtained scale vectors of the plurality of target information includes:
calculating the cosine of an included angle between the scale vectors of any two pieces of target information according to the scale vectors of the target information;
selecting a plurality of first included angle cosines from the calculated included angle cosines, wherein the plurality of first included angle cosines comprise included angle cosines between a scale vector of first target information in the included angle cosines and scale vectors of other target information;
and adding the cosines of the plurality of first included angles to calculate the sum of the cosines of the first included angles.
Optionally, the calculating, according to the obtained scale vectors of the plurality of pieces of target information, of the cosine of the included angle between the scale vectors of any two pieces of target information includes:
by formula two:

c_{ij} = \frac{\sum_{k=1}^{p} m_i(A_k)\, m_j(A_k)}{\sqrt{\sum_{k=1}^{p} m_i(A_k)^2}\; \sqrt{\sum_{k=1}^{p} m_j(A_k)^2}},

calculating the cosine of the included angle between the scale vectors of any two pieces of target information, where m_i(A_k) denotes the probability assigned by on-board sensor i to the occurrence of target information A_k, and m_j(A_k) denotes the probability assigned by on-board sensor j to the occurrence of target information A_k.
Optionally, the identification framework of the target information comprises Θ = {A_1, A_2, A_3, ...}, wherein A_1, A_2, A_3, ... denote pieces of target information;
if the sum of the cosines of the first included angle is judged to be not greater than the preset threshold value, the weight of the initial probability distribution function of the sensor corresponding to the first target information is adjusted, and the adjusted probability distribution function is generated, and the method comprises the following steps:
if the sum of the cosines of the first included angle is judged not to be greater than the preset threshold, calculating the weight α(m_i) of the initial probability distribution function according to formula four:

[formula four: rendered only as an image in the source]

and the sum of the cosines of the first included angle Sum(m_i);
determining, according to an evidence theory algorithm, the probability distribution function m_i(A_i) of the sensor corresponding to the first target information;
generating the adjusted probability distribution function according to the weight and the probability distribution function:

m'_i(A_k) = α(m_i) · m_i(A_k), for A_k ⊂ Θ,

wherein m'_i(Θ) = α(m_i) · m_i(Θ) − α(m_i) + 1.
Optionally, the generating a fusion rule function according to the adjusted probability distribution function and the set conflict factor includes:
obtaining the conflict factor

K = \sum_{A_1 \cap A_2 \cap \cdots \cap A_m = \emptyset} \prod_{i=1}^{m} m'_i(A_i);

generating the fusion rule function according to the adjusted probability distribution functions m'_i and the conflict factor K:

m(A) = \frac{1}{1 - K} \sum_{A_1 \cap \cdots \cap A_m = A} \prod_{i=1}^{m} m'_i(A_i).
Optionally, the preset judgment rule comprises:

[judgment-rule formula: rendered only as an image in the source]
The calculating of the unmanned driving decision result according to the fusion rule function and the preset judgment rule comprises:
acquiring the number n of pieces of target information;
according to the fusion rule function:

m(A) = \frac{1}{1 - K} \sum_{A_1 \cap \cdots \cap A_m = A} \prod_{i=1}^{m} m'_i(A_i),

performing n − 1 fusions on the n pieces of target information to obtain the decision result, wherein m(A)_last denotes the result of the last fusion in the fusion process.
In another aspect, an embodiment of the present invention provides an apparatus for generating an unmanned driving decision, where the apparatus includes:
the first calculation module is used for calculating the cosine sum of a first included angle according to the acquired scale vectors of the plurality of target information;
the first generation module is used for adjusting the weight of an initial probability distribution function of the sensor corresponding to the first target information and generating an adjusted probability distribution function if the sum of the cosines of the first included angle is judged to be not greater than a preset threshold;
the second generation module is used for generating a fusion rule function according to the adjusted probability distribution function and the set conflict factor;
and the second calculation module is used for calculating a decision judgment result of the unmanned vehicle according to the fusion rule function and a preset judgment rule.
In another aspect, an embodiment of the present invention provides a storage medium, where the storage medium includes a stored program, where when the program runs, a device where the storage medium is located is controlled to execute the above-mentioned unmanned driving decision generation method.
In another aspect, an embodiment of the present invention provides a computer device, including a memory for storing information including program instructions and a processor for controlling execution of the program instructions, the program instructions being loaded by the processor and executing the steps of the above-mentioned unmanned decision making method.
In the technical scheme provided by the embodiment of the invention, the sum of the cosines of a first included angle is calculated from the obtained scale vectors of a plurality of pieces of target information. If this cosine sum is judged not to be greater than a preset threshold, the weight of the initial probability distribution function of the sensor corresponding to the first target information is adjusted to generate an adjusted probability distribution function. A fusion rule function is generated from the adjusted probability distribution function and a set conflict factor, and an unmanned driving decision result is calculated from the fusion rule function and a preset judgment rule. By fusing only valid information, the reliability of the automatic driving technique is effectively improved.
[ description of the drawings ]
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required by the embodiments are briefly described below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a flow chart of a method for generating an unmanned driving decision according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method for generating an unmanned driving decision according to yet another embodiment of the present invention;
fig. 3 is a schematic structural diagram of an apparatus for generating an unmanned decision according to an embodiment of the present invention;
fig. 4 is a schematic diagram of a computer device according to an embodiment of the present invention.
[ detailed description ]
For better understanding of the technical solutions of the present invention, the following detailed descriptions of the embodiments of the present invention are provided with reference to the accompanying drawings.
It should be understood that the described embodiments are only some embodiments of the invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the examples of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that the term "and/or" as used herein merely describes an association between related objects and indicates that three relationships may exist; e.g., "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
Fig. 1 is a flowchart of a method for generating an unmanned driving decision according to an embodiment of the present invention, as shown in fig. 1, the method includes:
step 101, calculating the sum of cosines of a first included angle according to the obtained scale vectors of the plurality of target information.
And 102, if the sum of the cosines of the first included angle is judged to be not greater than the preset threshold value, adjusting the weight of the initial probability distribution function of the sensor corresponding to the first target information, and generating an adjusted probability distribution function.
And 103, generating a fusion rule function according to the adjusted probability distribution function and the set conflict factor.
And step 104, calculating a decision judgment result of the unmanned vehicle according to the fusion rule function and a preset judgment rule.
In the technical scheme provided by the embodiment of the invention, the sum of the cosines of a first included angle is calculated from the obtained scale vectors of a plurality of pieces of target information. If this cosine sum is judged not to be greater than a preset threshold, the weight of the initial probability distribution function of the sensor corresponding to the first target information is adjusted to generate an adjusted probability distribution function. A fusion rule function is generated from the adjusted probability distribution function and a set conflict factor, and an unmanned driving decision result is calculated from the fusion rule function and a preset judgment rule. By fusing only valid information, the reliability of the automatic driving technique is effectively improved.
Fig. 2 is a flowchart of a method for generating an unmanned driving decision according to another embodiment of the present invention, as shown in fig. 2, the method includes:
step 201, converting the acquired perception information of the plurality of vehicle-mounted sensors into a plurality of corresponding target information.
In the embodiment of the invention, the on-board sensors may comprise a binocular high-definition video camera, an infrared high-definition camera, a lidar, a GPS positioning sensor, and other sensors. The perception information of an on-board sensor is the information collected by that sensor. The target information may also represent a target recognition result, and may include: an obstacle exists within 0-20 m, an obstacle exists within 20-50 m, an obstacle exists within 50-100 m, an obstacle exists within 100-200 m, or no obstacle exists within 200 m. The perception information of one on-board sensor can be converted into at least one corresponding piece of target information.
In the embodiment of the invention, the aim of adopting a plurality of vehicle-mounted sensors is to effectively avoid the limitation of a single sensor, so that the corresponding target information can be accurately converted, and the reliability of the automatic driving technology is improved.
In the embodiment of the present invention, step 201 specifically includes:
in step 2011, feature information is extracted from the acquired sensing information of the plurality of vehicle-mounted sensors.
In the embodiment of the invention, for example, when the vehicle-mounted sensor is a binocular high-definition video camera, the feature information extracted from the perception information is image parameter information and the like. For example, when the in-vehicle sensor is a GPS positioning, the feature information extracted from the sensing information is position parameter information or the like.
Step 2012, according to the characteristic information, inquiring target information corresponding to the characteristic information from a pre-established characteristic database.
In the embodiment of the invention, for example, according to the position parameter information, the target information corresponding to the position parameter information is inquired from a pre-established characteristic database and is 0-20m of the existing obstacle.
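A minimal sketch of step 2012, looking up target information from a pre-built feature database. The band edges follow the five classes named in the description; the database contents and function names are hypothetical illustrations, not values from the patent.

```python
# Distance bands used as target-information classes (metres), following
# the five classes named in the description. Labels are illustrative.
TARGET_BANDS = [
    (0, 20, "obstacle within 0-20 m"),
    (20, 50, "obstacle within 20-50 m"),
    (50, 100, "obstacle within 50-100 m"),
    (100, 200, "obstacle within 100-200 m"),
]
NO_OBSTACLE = "no obstacle within 200 m"

def lookup_target_info(obstacle_distance_m):
    """Map an extracted feature (estimated obstacle distance) to a
    target-information class; None means no detection."""
    if obstacle_distance_m is None or obstacle_distance_m >= 200:
        return NO_OBSTACLE
    for lo, hi, label in TARGET_BANDS:
        if lo <= obstacle_distance_m < hi:
            return label
    return NO_OBSTACLE

print(lookup_target_info(35.0))   # falls in the 20-50 m band
print(lookup_target_info(None))   # no detection -> no obstacle
```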
Step 202, forming an m × p probability distribution matrix according to the number m of acquired on-board sensors and the number p of pieces of target information.
In the embodiment of the invention, for example, the vehicle-mounted sensor may include 4 sensors of a binocular high-definition video camera, an infrared high-definition camera, a laser radar and a GPS. The target information may include 5 kinds of target information, i.e., the presence of obstacles in 0-20m, the presence of obstacles in 20-50m, the presence of obstacles in 50-100m, the presence of obstacles in 100-200m, or the absence of obstacles in 200m, thereby forming a 4 x 5 probability distribution matrix. For example, a probability distribution matrix of 4 x 5 is formed as shown in table 1 below:
TABLE 1

          A1        A2        A3        A4        A5
m1     m1(A1)    m1(A2)    m1(A3)    m1(A4)    m1(A5)
m2     m2(A1)    m2(A2)    m2(A3)    m2(A4)    m2(A5)
m3     m3(A1)    m3(A2)    m3(A3)    m3(A4)    m3(A5)
m4     m4(A1)    m4(A2)    m4(A3)    m4(A4)    m4(A5)
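A toy instance of the probability distribution matrix of step 202 (one row per sensor, one column per target-information class). The numeric values are made up for illustration; each row is a sensor's probability assignment over the five classes and sums to 1.

```python
# Toy 4x5 probability distribution matrix: m = 4 sensors, p = 5 classes.
M = [
    [0.60, 0.20, 0.10, 0.05, 0.05],  # m1: binocular HD camera
    [0.55, 0.25, 0.10, 0.05, 0.05],  # m2: infrared HD camera
    [0.65, 0.15, 0.10, 0.05, 0.05],  # m3: lidar
    [0.10, 0.10, 0.20, 0.30, 0.30],  # m4: GPS (disagrees with the rest)
]

# Each row must be a valid probability assignment.
for row in M:
    assert abs(sum(row) - 1.0) < 1e-9
print(len(M), "sensors x", len(M[0]), "classes")
```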
Step 203, defining, according to the probability distribution matrix, the p-dimensional vector

μ = (1/p, 1/p, ..., 1/p),

and using this p-dimensional vector as the reference scale vector of the target information.
In the embodiment of the invention, p denotes the number of pieces of target information, so the p-dimensional vector is μ = (1/p, ..., 1/p). Since the 5 kinds of target information are assumed to occur with equal probability, each component equals 1/p, and this p-dimensional vector can be used as the reference scale vector of the target information.
Step 204, taking the p-dimensional row vectors m_i of the probability distribution matrix as the scale vectors of the target information, where i = 1, 2, ..., m and m denotes the number of sensors.
In the embodiment of the invention, the vector m_i comprises the probability values of a plurality of pieces of target information. For example, as shown in Table 1 above, m_1 comprises m_1(A_1), m_1(A_2), m_1(A_3), m_1(A_4), and m_1(A_5), i.e., the probability values assigned by on-board sensor 1 to the plurality of pieces of target information.
Step 205, according to formula one:

d_i = \sqrt{\sum_{j=1}^{p} \left( m_{ij} - \tfrac{1}{p} \right)^2},

calculating the Euclidean distance between the scale vector of the target information and the reference scale vector, where m_{ij} denotes the element at row i, column j of the probability distribution matrix.
In the embodiment of the invention, the larger the calculated Euclidean distance d_i, the stronger the directivity of the sensor's perception information and the greater its fusion value. Conversely, the smaller d_i, the smaller the fusion value of the on-board sensor's perception information after it is converted into target information. Therefore, calculating the Euclidean distance d_i between the scale vector of the target information and the reference scale vector reflects the value of the perception information of the plurality of sensors.
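A sketch of step 205 under the reconstruction above: the Euclidean distance of a sensor's scale vector from the uniform reference vector (1/p, ..., 1/p). A row close to uniform carries little directional information and is a candidate for filtering. Example rows are illustrative.

```python
import math

def euclidean_distance_to_uniform(row):
    """Formula one (as reconstructed): distance of scale vector `row`
    from the uniform reference vector (1/p, ..., 1/p)."""
    p = len(row)
    return math.sqrt(sum((v - 1.0 / p) ** 2 for v in row))

uniform = [0.2] * 5                       # carries no directional information
peaked = [0.60, 0.20, 0.10, 0.05, 0.05]   # strongly directional

print(euclidean_distance_to_uniform(uniform))  # 0.0
print(euclidean_distance_to_uniform(peaked) > euclidean_distance_to_uniform(uniform))
```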
Step 206, judging whether the Euclidean distance is larger than a preset distance, if so, executing step 207; if not, go to step 207'.
In the embodiment of the invention, the preset distance is the threshold λ, where λ is the Euclidean distance between a preset scale vector and the reference scale vector μ.
In the embodiment of the invention, the preset scale vector is set according to the probability of historical data or according to requirements; it is used only for determining the threshold λ and is distinct from the scale vectors of the target information. When d_i ≤ λ, the fusion value of the on-board sensor's perception information is low, so that sensor's perception information is filtered out and the information of the remaining on-board sensors is fused. When d_i > λ, the fusion value of the sensor's perception information is high, and the perception information of that on-board sensor is retained.
And step 207, calculating the cosine of an included angle between the scale vectors of any two pieces of target information according to the obtained scale vectors of the plurality of pieces of target information.
In the embodiment of the present invention, step 207 specifically includes:
by formula two:

c_{ij} = \frac{\sum_{k=1}^{p} m_i(A_k)\, m_j(A_k)}{\sqrt{\sum_{k=1}^{p} m_i(A_k)^2}\; \sqrt{\sum_{k=1}^{p} m_j(A_k)^2}},

calculating the cosine of the included angle between the scale vectors of any two pieces of target information, where m_i(A_k) denotes the probability assigned by on-board sensor i to the occurrence of target information A_k, and m_j(A_k) denotes the probability assigned by on-board sensor j to the occurrence of target information A_k.
In the embodiment of the invention, the cosine of the included angle between the scale vectors of two pieces of target information is calculated from the probability values assigned by different on-board sensors to the same target information. The smaller the included angle between the two scale vectors, the closer the cosine value c_ij is to 1, the higher the direction similarity of the two scale vectors, and the smaller the conflict between the target information of the two on-board sensors. The larger the included angle, the closer c_ij is to 0 and the lower the direction similarity, indicating a larger conflict between the target information of the two on-board sensors. The degree of conflict between two pieces of target information can therefore be determined according to step 207.
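A sketch of formula two as reconstructed above: the cosine of the angle between two sensors' scale vectors. Values near 1 mean the sensors point at the same target-information class (low conflict); values near 0 mean high conflict. Example rows are illustrative.

```python
import math

def angle_cosine(mi, mj):
    """Formula two (as reconstructed): cosine similarity of two
    scale vectors."""
    dot = sum(a * b for a, b in zip(mi, mj))
    ni = math.sqrt(sum(a * a for a in mi))
    nj = math.sqrt(sum(b * b for b in mj))
    return dot / (ni * nj)

agree_a = [0.60, 0.20, 0.10, 0.05, 0.05]
agree_b = [0.55, 0.25, 0.10, 0.05, 0.05]
dissent = [0.05, 0.05, 0.10, 0.30, 0.50]

print(round(angle_cosine(agree_a, agree_b), 3))  # close to 1: low conflict
print(angle_cosine(agree_a, dissent) < angle_cosine(agree_a, agree_b))
```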
In the embodiment of the present invention, further, the method further includes: and generating a cosine similarity matrix by calculating the cosine of an included angle between the scale vectors of any two pieces of target information.
And step 207', filtering the perception information of the vehicle-mounted sensor corresponding to the target information.
In the embodiment of the invention, after the perception information of on-board sensors with low fusion value is filtered out, the problem of the large computation load of D-S multi-evidence fusion is alleviated, so that targets can be identified quickly and the reliability of the automatic driving technique is improved.
And 208, selecting a plurality of first included angle cosines from the calculated plurality of included angle cosines, wherein the plurality of first included angle cosines comprise included angle cosines between a scale vector of first target information in the plurality of included angle cosines and scale vectors of other target information.
In the embodiment of the present invention, for example, the first included angle cosines are the cosines of the included angles between the scale vector of target information A_1 and the scale vectors of the other target information A_2, A_3, A_4, A_5.
Step 209, add the cosines of the first angles to calculate the sum of the cosines of the first angles.
In the embodiment of the present invention, step 209 specifically includes:
according to formula three:

Sum(m_i) = \sum_{j=1,\, j \neq i}^{m} c_{ij},

calculating the sum of the cosines of the first included angle, where c_ij denotes a first included angle cosine. In the embodiment of the invention, when the first included angle cosines are those between the scale vector of target information A_1 and the scale vectors of the other target information A_2, A_3, A_4, A_5, the first included angle cosine is written c_1j.
In the embodiment of the invention, Sum(m_i) can be regarded as a conflict coefficient between the scale vector of the first target information and the scale vectors of the other target information: the larger the value of Sum(m_i), the lower the conflict, and the smaller the value, the higher the conflict. By adjusting the weight of the initial probability distribution function of the sensor corresponding to the first target information, the conflict between the scale vector of the first target information and the scale vectors of the other target information can be mitigated.
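A sketch of steps 208-209 under the reconstruction above: for sensor i, sum the angle cosines between its scale vector and every other sensor's scale vector. The sensor with the smallest sum is the one whose evidence conflicts most with the rest. The 4x5 matrix is illustrative.

```python
import math

def angle_cosine(mi, mj):
    # Cosine of the included angle between two scale vectors (formula two).
    dot = sum(a * b for a, b in zip(mi, mj))
    return dot / (math.sqrt(sum(a * a for a in mi)) *
                  math.sqrt(sum(b * b for b in mj)))

def cosine_sum(M, i):
    # Formula three (as reconstructed): Sum(m_i) over all j != i.
    return sum(angle_cosine(M[i], M[j]) for j in range(len(M)) if j != i)

M = [
    [0.60, 0.20, 0.10, 0.05, 0.05],
    [0.55, 0.25, 0.10, 0.05, 0.05],
    [0.65, 0.15, 0.10, 0.05, 0.05],
    [0.05, 0.05, 0.10, 0.30, 0.50],  # the conflicting sensor
]

sums = [cosine_sum(M, i) for i in range(len(M))]
print(sums.index(min(sums)))  # sensor 4 (index 3) conflicts most with the rest
```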
Step 210, if it is determined that the sum of the cosines of the first included angle is not greater than the predetermined threshold, adjusting the weight of the initial probability distribution function of the sensor corresponding to the first target information, and generating an adjusted probability distribution function.
In the embodiment of the present invention, the preset threshold may be set according to historical experience, which is not limited in the present invention. Through steps 207 to 209, when it is determined that the conflict between the target information of the plurality of in-vehicle sensors is greater than the preset threshold, the adjusted probability distribution function is generated by adjusting the weight of the initial probability distribution function of the sensor corresponding to the first target information.
In the embodiment of the present invention, further, the method further includes:
and if the first included angle cosine sum is judged to be larger than the preset threshold value first included angle cosine sum, generating a fusion rule function according to the weight of the initial probability distribution function and the set conflict factor.
By setting a preset threshold, high-conflict target information can be selected for fusion, so that the fusion efficiency of the target information and the fusion accuracy of the target information can be improved, and the calculation complexity can be reduced.
In the embodiment of the present invention, step 210 specifically includes:
Step 2101, according to formula four:

[formula four: rendered only as an image in the source]

and the sum of the cosines of the first included angle Sum(m_i), calculating the weight α(m_i) of the probability distribution function, and adjusting the weight of the initial probability distribution function of the sensor corresponding to the first target information to α(m_i). In the embodiment of the invention, the value range of the sum of the cosines of the first included angle is [0, 1]; as the sum approaches 1, the conflict between the scale vector of the first target information and the scale vectors of the other target information becomes smaller. Thus, the larger the calculated weight α(m_i), the smaller this conflict; the smaller α(m_i), the larger the conflict. Using α(m_i) as the weight of the probability distribution function effectively corrects the deviation of the fusion and judgment result caused by an excessive conflict coefficient, and at the same time enhances the robustness of the fusion rule function.
Step 2102, according to an evidence theory algorithm, determining a probability distribution function m_i(A_i) of the sensor corresponding to the first target information.
In the embodiment of the present invention, the identification frame of the target information is Θ = {A_1, A_2, A_3, A_4, A_5}, where A_1, A_2, A_3, A_4, A_5 are subsets of the identification frame, each expressed as target information. Let the function m: 2^Θ → [0,1] satisfy
m(∅) = 0
Σ_{A⊆Θ} m(A) = 1
where 2^Θ is expressed as the set of subsets of the identification frame, so m(A_i) is the basic probability assignment of A_i. The probability distribution function m_i(A_i) is thereby determined, expressed as the probability that vehicle-mounted sensor i generates the target information A_i, where i = 1, 2, 3, ..., m, and m is the number of sensors.
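As an illustration of the two conditions above, the following sketch checks that a mass function over a hypothetical five-element identification frame assigns zero to the empty set and that its masses on non-empty subsets sum to one. The frame members and the example sensor values are assumptions for illustration, not values from the patent.

```python
# Hypothetical identification frame of five target classes (A1..A5).
FRAME = frozenset(["A1", "A2", "A3", "A4", "A5"])

def is_valid_bpa(m):
    """Check the two basic probability assignment conditions:
    m(empty set) = 0 and the masses over all non-empty subsets sum to 1."""
    if m.get(frozenset(), 0.0) != 0.0:
        return False
    total = sum(v for k, v in m.items() if k)  # masses on non-empty subsets
    return abs(total - 1.0) < 1e-9

# Example BPA of one vehicle-mounted sensor: most mass on A1,
# with residual uncertainty assigned to the whole frame.
m1 = {frozenset(["A1"]): 0.7, frozenset(["A2"]): 0.1, FRAME: 0.2}
print(is_valid_bpa(m1))  # True
```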
Step 2103, generating an adjusted probability distribution function according to the weight and the probability distribution function
m'_i(A_i) = α(m_i)·m_i(A_i), for A_i ≠ Θ
wherein m'_i(Θ) = α(m_i)·m_i(Θ) - α(m_i) + 1.
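Step 2103 is a discounting operation: every focal mass is scaled by the weight α(m_i), and the removed mass is moved onto the whole frame Θ, which yields exactly m'_i(Θ) = α(m_i)·m_i(Θ) - α(m_i) + 1. A minimal Python sketch, with an assumed three-element frame and example values:

```python
def discount_bpa(m, alpha, frame):
    """Scale each focal mass by alpha and move the removed mass onto the
    whole frame, so that m'(Theta) = alpha*m(Theta) - alpha + 1."""
    out = {}
    for subset, mass in m.items():
        if subset != frame:
            out[subset] = alpha * mass
    out[frame] = alpha * m.get(frame, 0.0) - alpha + 1.0
    return out

FRAME = frozenset(["A1", "A2", "A3"])
m = {frozenset(["A1"]): 0.6, frozenset(["A2"]): 0.3, FRAME: 0.1}
m_adj = discount_bpa(m, alpha=0.8, frame=FRAME)
# masses still sum to 1: 0.48 + 0.24 + (0.8*0.1 - 0.8 + 1 = 0.28)
```

Note that the adjusted masses remain a valid probability distribution function: the mass removed from each focal element is exactly the mass added to Θ.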
And step 211, generating a fusion rule function according to the adjusted probability distribution function and the set conflict factor.
In this embodiment, step 211 specifically includes:
step 2111, obtaining conflict factor
K = Σ_{A_1∩A_2∩⋯∩A_m=∅} ∏_{i=1}^{m} m_i(A_i)
In the embodiment of the invention, K is the conflict factor, representing the degree of conflict among m_1, m_2, ..., m_m, where m represents the number of sensors.
Step 2112, distributing function according to adjusted probability
Figure GDA0002442363820000115
Generating a fusion rule function by summing the collision factor K
Figure GDA0002442363820000116
In the embodiment of the invention, the steps preceding step 211 reduce the conflict among the target information of the plurality of sensors and address the problem of an incomplete identification frame. In the prior-art D-S evidence theory, during fusion of the target information of the vehicle-mounted sensors, the target information of multiple sensors may conflict because of complex road conditions and external environments, or the fusion result may deviate because the recognition framework is incomplete; the embodiment of the invention therefore improves the combination rule of D-S evidence theory. The improved D-S combination rule distributes local conflicts to the corresponding propositions according to their confidence levels instead of ignoring the information hidden in conflicting evidence, thereby improving the reliability and reasonableness of the fusion result.
And step 212, calculating a decision judgment result of the unmanned vehicle according to the fusion rule function and a preset judgment rule.
In the embodiment of the invention, the preset judgment rule comprises
m(A_1) - m(A_2) > ε_1
m(Θ) < ε_2
m(A_1) > m(Θ)
wherein ε_1 and ε_2 are preset thresholds, which can be set according to the mean values obtained by probability statistics.
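A sketch of how the judgment rule above might be applied to a fused probability distribution function. The "Theta" key, the class names, and the threshold values eps1 = 0.3 and eps2 = 0.2 are assumptions for illustration; the patent only says the thresholds come from probability statistics.

```python
def decide(m, eps1=0.3, eps2=0.2):
    """Apply the preset judgment rule: accept the top hypothesis A1 only if
    m(A1) - m(A2) > eps1, m(Theta) < eps2, and m(A1) > m(Theta)."""
    singles = {k: v for k, v in m.items() if k != "Theta"}
    ranked = sorted(singles.items(), key=lambda kv: kv[1], reverse=True)
    (a1, m_a1), (_, m_a2) = ranked[0], ranked[1]
    m_theta = m.get("Theta", 0.0)
    if m_a1 - m_a2 > eps1 and m_theta < eps2 and m_a1 > m_theta:
        return a1
    return None  # evidence too ambiguous for a confident decision

fused = {"pedestrian": 0.75, "vehicle": 0.10, "Theta": 0.15}
print(decide(fused))  # 'pedestrian'
```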
In an embodiment of the present invention, step 212 specifically includes:
2121, acquiring the number n of target information;
in the embodiment of the invention, the number n of target information is equal to the number m of sensors. For example, when detecting a target, 4 sensors that are turned on can perform observation monitoring on the target, thereby generating 4 items of target information.
Step 2122, according to the fusion rule function:
Figure GDA0002442363820000122
fusing the n items of target information n-1 times to obtain a decision judgment result, wherein m(A)_last is the result of the last fusion in the fusion process.
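The n-1 successive fusions can be sketched as a left fold over the n probability distribution functions. Plain Dempster combination is used below as a stand-in for the patent's improved fusion rule function, whose exact form is given only as a figure; all example values are assumed.

```python
from functools import reduce

def ds_combine(m1, m2):
    """Plain Dempster combination, used as a stand-in for the improved rule:
    products on intersecting focal sets, renormalised by 1 - K."""
    K, out = 0.0, {}
    for B, mb in m1.items():
        for C, mc in m2.items():
            inter = B & C
            if inter:
                out[inter] = out.get(inter, 0.0) + mb * mc
            else:
                K += mb * mc
    return {k: v / (1.0 - K) for k, v in out.items()}

def fuse_all(bpas):
    """Fuse n items of target information with n-1 pairwise combinations."""
    return reduce(ds_combine, bpas)

A1, A2 = frozenset(["A1"]), frozenset(["A2"])
bpas = [{A1: 0.6, A2: 0.4}, {A1: 0.7, A2: 0.3}, {A1: 0.8, A2: 0.2}]
result = fuse_all(bpas)
print(result[A1] > result[A2])  # True: agreement on A1 is reinforced
```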
In the embodiment of the invention, the fusion of the vehicle-mounted sensors and the decision judgment of unmanned driving are carried out by the improved D-S evidence theory, which solves the problems of a large calculation amount and weak fusion of highly conflicting perception information sets, meets the requirements of actual unmanned-driving information fusion scenarios, increases the accuracy of target detection in automatic driving, avoids the false detection of a single sensor, and improves the reliability of the automatic driving technology.
In the technical scheme provided by the embodiment of the invention, the cosine sum of a first included angle is calculated according to the obtained scale vectors of a plurality of target information, if the cosine sum of the first included angle is judged to be not more than a preset threshold value, the weight of an initial probability distribution function of a sensor corresponding to the first target information is adjusted to generate an adjusted probability distribution function, a fusion rule function is generated according to the adjusted probability distribution function and a set conflict factor, an unmanned decision judgment result is calculated according to the fusion rule function and a preset judgment rule, and the reliability of the automatic driving technology is effectively improved by fusing effective information.
Fig. 3 is a schematic structural diagram of an apparatus for generating an unmanned decision according to an embodiment of the present invention, as shown in fig. 3, the apparatus includes: a first calculation module 11, a first generation module 12, a second generation module 13 and a second calculation module 14.
The first calculating module 11 is configured to calculate a sum of cosines of the first included angle according to the obtained scale vectors of the plurality of target information.
The first generating module 12 is configured to adjust a weight of an initial probability distribution function of the sensor corresponding to the first target information if it is determined that the sum of the cosines of the first included angle is not greater than a predetermined threshold, and generate an adjusted probability distribution function.
And a second generating module 13, configured to generate a fusion rule function according to the adjusted probability distribution function and the set collision factor.
And the second calculating module 14 is configured to calculate a decision-making judgment result of the unmanned vehicle according to the fusion rule function and a preset judgment rule.
In the embodiment of the present invention, the apparatus further includes: a judging module 15 and a filtering module 16.
The first calculating module 11 is further configured to calculate the Euclidean distance between the scale vectors of the target information according to the Euclidean distance formula.
The judging module 15 is configured to judge whether the Euclidean distance is greater than a preset distance, and if the Euclidean distance is judged to be greater than the preset distance, to trigger the first calculating module 11 to continue executing the step of calculating the first included angle cosine sum according to the obtained scale vectors of the plurality of target information.
The filtering module 16 is configured to filter the sensing information of the vehicle-mounted sensor corresponding to the target information if the determining module 15 determines that the euclidean distance is not greater than the preset distance.
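The distance-based filtering performed by modules 11, 15, and 16 can be sketched as follows. The patent text leaves the exact pairing of scale vectors ambiguous, so a reference scale vector and the threshold value are assumptions here: each sensor whose scale vector lies within the preset distance of the reference is filtered out, and only the rest proceed to the cosine-sum step.

```python
import math

def euclidean(u, v):
    """Euclidean distance between two scale vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def keep_sensors(scale_vectors, ref, min_dist=0.05):
    """Return indices of sensors whose scale vector is farther than the
    preset distance from the (assumed) reference vector; the others have
    their perception information filtered out, as in the patent text."""
    return [i for i, vec in enumerate(scale_vectors)
            if euclidean(vec, ref) > min_dist]

vectors = [[0.70, 0.20, 0.10], [0.69, 0.21, 0.10], [0.20, 0.70, 0.10]]
kept = keep_sensors(vectors, ref=[0.70, 0.20, 0.10])
print(kept)  # [2]
```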
In this embodiment of the present invention, the first computing module 11 of the apparatus specifically includes: a calculation submodule 111 and a selection submodule 112.
The calculating submodule 111 is configured to calculate a cosine of an included angle between the scale vectors of any two pieces of target information according to the scale vectors of the pieces of target information.
The selecting submodule 112 is configured to select a plurality of first included angle cosines from the plurality of calculated included angle cosines, where the plurality of first included angle cosines include the included angle cosines between the scale vector of the first target information and the scale vectors of the other target information.
The calculating submodule 111 is further configured to add the cosines of the plurality of first included angles, and calculate a sum of the cosines of the first included angles.
In the embodiment of the present invention, the calculation sub-module 111 of the apparatus specifically includes:
by the formula two:
cos⟨M_i, M_j⟩ = (Σ_k m_i(A_k)·m_j(A_k)) / (√(Σ_k m_i(A_k)²) · √(Σ_k m_j(A_k)²))
calculating the cosine of the included angle between the scale vectors of any two pieces of target information, wherein m_i(A_k) denotes the probability value with which vehicle-mounted sensor i observes target information A_k, and m_j(A_k) denotes the probability value with which vehicle-mounted sensor j observes target information A_k.
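Formula two is the cosine similarity between two scale vectors built from the sensors' probability values, and the first included angle cosine sum then adds up the cosines between the first target information and all others. A sketch under those assumptions (the example BPA values are illustrative only):

```python
import math

def cosine(mi, mj):
    """Cosine of the included angle between two scale vectors (formula two):
    dot product of the probability values divided by the vector norms."""
    keys = sorted(set(mi) | set(mj))
    u = [mi.get(k, 0.0) for k in keys]
    v = [mj.get(k, 0.0) for k in keys]
    dot = sum(a * b for a, b in zip(u, v))
    norm = lambda w: math.sqrt(sum(x * x for x in w))
    return dot / (norm(u) * norm(v))

def first_angle_cosine_sum(bpas, first=0):
    """Sum of cosines between the first target information and all others."""
    return sum(cosine(bpas[first], bpas[j])
               for j in range(len(bpas)) if j != first)

bpas = [{"A1": 0.8, "A2": 0.2}, {"A1": 0.7, "A2": 0.3}, {"A1": 0.1, "A2": 0.9}]
s = first_angle_cosine_sum(bpas)
```

Because all mass values are non-negative, each cosine lies in [0,1], matching the range stated for the (normalised) first included angle cosine sum.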
In the embodiment of the present invention, the identification framework of the target information includes Θ = {A_1, A_2, A_3, ...}, where A_1, A_2, A_3, ... are expressed as target information.
The first generating module 12 of the device specifically comprises: a calculation sub-module 121, a determination sub-module 122, and a generation sub-module 123.
The calculating submodule 121 is configured to, if it is determined that the sum of the cosines of the first included angle is not greater than the predetermined threshold, according to a formula four:
Figure GDA0002442363820000142
and the sum of the cosines of the first included angle
Figure GDA0002442363820000143
The weight of the initial probability distribution function is calculated.
The determining submodule 122 is configured to determine, according to an evidence theory algorithm, the probability distribution function m_i(A_i) of the sensor corresponding to the first target information.
The generating submodule 123 is configured to generate an adjusted probability distribution function according to the weight and the probability distribution function
m'_i(A_i) = α(m_i)·m_i(A_i), for A_i ≠ Θ
wherein m'_i(Θ) = α(m_i)·m_i(Θ) - α(m_i) + 1.
In this embodiment of the present invention, the second generating module 13 of the apparatus specifically includes: an acquisition submodule 131 and a generation submodule 132.
The obtaining submodule 131 is used for obtaining the conflict factor
K = Σ_{A_1∩A_2∩⋯∩A_m=∅} ∏_{i=1}^{m} m_i(A_i)
The generation submodule 132 is further configured to assign functions according to the adjusted probabilities
Figure GDA0002442363820000151
Generating a fusion rule function by summing the conflict factor K
Figure GDA0002442363820000152
In the embodiment of the invention, the preset judgment rule comprises
m(A_1) - m(A_2) > ε_1
m(Θ) < ε_2
m(A_1) > m(Θ)
The second computing module 14 of the apparatus further comprises: an acquisition submodule 141 and a fusion submodule 142.
The obtaining submodule 141 is configured to obtain the number n of target information.
The fusion submodule 142 is configured to:
Figure GDA0002442363820000154
fusing the n items of target information n-1 times to obtain the decision judgment result, wherein m(A)_last is the result of the last fusion in the fusion process.
In the technical scheme provided by the embodiment of the invention, the acquired sensing information of a plurality of vehicle-mounted sensors is converted into a plurality of corresponding target information, the cosine sum of a first included angle is calculated according to the scale vectors of the plurality of acquired target information, if the cosine sum of the first included angle is judged to be not more than a preset threshold value, the weight of an initial probability distribution function of the sensor corresponding to the first target information is adjusted to generate an adjusted probability distribution function, a fusion rule function is generated according to the adjusted probability distribution function and a set conflict factor, a decision judgment result of unmanned driving is calculated according to the fusion rule function and a preset judgment rule, and the reliability of the automatic driving technology is effectively improved by fusing effective information.
An embodiment of the present invention provides a storage medium, where the storage medium includes a stored program, where, when the program runs, a device on which the storage medium is located is controlled to execute each step of the above-mentioned method for generating an unmanned decision, and for a specific description, reference may be made to the above-mentioned method for generating an unmanned decision.
An embodiment of the present invention provides a computer device, including a memory and a processor, where the memory is used to store information including program instructions, and the processor is used to control execution of the program instructions, and the program instructions are loaded by the processor and executed to implement the steps of the above-mentioned unmanned driving decision generation method. For a detailed description, reference may be made to the above-described embodiments of the method for generating an unmanned decision.
Fig. 4 is a schematic diagram of a computer device according to an embodiment of the present invention. As shown in fig. 4, the computer device 4 of this embodiment includes: a processor 41, a memory 42, and a computer program 43 stored in the memory 42 and executable on the processor 41. When executed by the processor 41, the computer program 43 implements the method for generating an unmanned driving decision in the embodiment; to avoid repetition, details are not repeated here. Alternatively, when executed by the processor 41, the computer program implements the functions of the modules/units of the apparatus for generating an unmanned driving decision in the embodiment; to avoid repetition, details are not repeated here.
The computer device 4 includes, but is not limited to, the processor 41 and the memory 42. Those skilled in the art will appreciate that fig. 4 is merely an example of the computer device 4 and does not limit it; the computer device 4 may include more or fewer components than those shown, combine some components, or use different components. For example, the computer device 4 may also include input/output devices, network access devices, buses, and the like.
The Processor 41 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The storage 42 may be an internal storage unit of the computer device 4, such as a hard disk or a memory of the computer device 4. The memory 42 may also be an external storage device of the computer device 4, such as a plug-in hard disk provided on the computer device 4, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, the memory 42 may also include both internal storage units of the computer device 4 and external storage devices. The memory 42 is used for storing computer programs and other programs and data required by the computer device 4. The memory 42 may also be used to temporarily store data that has been output or is to be output.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present invention, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is only one type of logical functional division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a Processor (Processor) to execute some steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (9)

1. A method for generating an unmanned driving decision, wherein a plurality of target information includes first target information, comprising:
calculating the cosine sum of a first included angle according to the obtained scale vectors of the plurality of target information;
if the sum of the cosines of the first included angle is judged to be not greater than a preset threshold value, adjusting the weight of the initial probability distribution function of the sensor corresponding to the first target information and generating an adjusted probability distribution function;
generating a fusion rule function according to the adjusted probability distribution function and the set conflict factor;
calculating a decision judgment result of unmanned driving according to the fusion rule function and a preset judgment rule;
the calculating the sum of the cosines of the first included angle according to the obtained scale vectors of the plurality of target information includes:
calculating the cosine of an included angle between the scale vectors of any two pieces of target information according to the scale vectors of the target information;
selecting a plurality of first included angle cosines from the calculated included angle cosines, wherein the plurality of first included angle cosines comprise included angle cosines between a scale vector of first target information in the included angle cosines and scale vectors of other target information;
and adding the cosines of the plurality of first included angles to calculate the sum of the cosines of the first included angles.
2. The method according to claim 1, before calculating a sum of cosines of the first included angle according to the obtained scale vectors of the plurality of target information, further comprising:
calculating the Euclidean distance between the scale vectors of the target information according to a Euclidean distance formula;
judging whether the Euclidean distance is greater than a preset distance;
if the Euclidean distance is judged to be not greater than the preset distance, filtering perception information of a vehicle-mounted sensor corresponding to the target information;
and if the Euclidean distance is judged to be larger than the preset distance, continuing to execute the step of calculating the sum of the cosines of the first included angle according to the obtained scale vectors of the plurality of target information.
3. The method according to claim 1, wherein the calculating a cosine of an included angle between the scale vectors of any two pieces of target information according to the obtained scale vectors of the plurality of pieces of target information includes:
by the formula two:
cos⟨M_i, M_j⟩ = (Σ_k m_i(A_k)·m_j(A_k)) / (√(Σ_k m_i(A_k)²) · √(Σ_k m_j(A_k)²))
calculating the cosine of the included angle between the scale vectors of any two pieces of target information, wherein m_i(A_k) denotes the probability value with which vehicle-mounted sensor i observes target information A_k, and m_j(A_k) denotes the probability value with which vehicle-mounted sensor j observes target information A_k.
4. The method of claim 1, wherein the identification framework of the target information comprises Θ = {A_1, A_2, A_3, ...}, wherein A_1, A_2, A_3, ... are expressed as target information;
if the sum of the cosines of the first included angle is judged to be not greater than the preset threshold value, the weight of the initial probability distribution function of the sensor corresponding to the first target information is adjusted, and the adjusted probability distribution function is generated, and the method comprises the following steps:
if the sum of the cosines of the first included angle is judged not to be greater than the preset threshold value, according to a formula four:
Figure FDA0003636211540000022
and the sum of the cosines of the first included angle
Figure FDA0003636211540000023
calculating the weight α(m_i) of the initial probability distribution function;
according to an evidence theory algorithm, determining a probability distribution function m_i(A_i) of the sensor corresponding to the first target information;
Generating an adjusted probability distribution function according to the weight and the probability distribution function
m'_i(A_i) = α(m_i)·m_i(A_i), for A_i ≠ Θ
wherein m'_i(Θ) = α(m_i)·m_i(Θ) - α(m_i) + 1.
5. The method according to claim 1, wherein the generating a fusion rule function according to the adjusted probability distribution function and the set collision factor comprises:
obtaining a collision factor
K = Σ_{A_1∩A_2∩⋯∩A_m=∅} ∏_{i=1}^{m} m_i(A_i)
According to the adjusted probability distribution function
Figure FDA0003636211540000032
Generating a fusion rule function by summing the conflict factor K
Figure FDA0003636211540000033
6. The method of claim 1, wherein the predetermined decision rule comprises
m(A_1) - m(A_2) > ε_1;
m(Θ) < ε_2;
m(A_1) > m(Θ); wherein ε_1 and ε_2 are preset thresholds, m(A_1) is the probability distribution function value of A_1, m(A_2) is the probability distribution function value of A_2, and m(Θ) is the probability distribution function value of Θ;
the calculating of the decision judgment result of the unmanned vehicle according to the fusion rule function and the preset judgment rule comprises the following steps:
acquiring the number n of target information;
according to the fusion rule function:
Figure FDA0003636211540000034
fusing the n items of target information n-1 times to obtain the decision judgment result, wherein m(A)_last is the result of the last fusion in the fusion process.
7. An apparatus for generating an unmanned driving decision, wherein a plurality of target information includes first target information, the apparatus comprising:
the first calculation module is used for calculating the cosine sum of the first included angle according to the acquired scale vectors of the plurality of target information;
the first generation module is used for adjusting the weight of an initial probability distribution function of the sensor corresponding to the first target information and generating an adjusted probability distribution function if the sum of the cosines of the first included angle is judged to be not greater than a preset threshold;
the second generation module is used for generating a fusion rule function according to the adjusted probability distribution function and the set conflict factor;
the second calculation module is used for calculating a decision judgment result of the unmanned vehicle according to the fusion rule function and a preset judgment rule;
the first calculation module specifically includes: a calculation submodule and a selection submodule;
the calculation submodule is used for calculating the cosine of an included angle between the scale vectors of any two pieces of target information according to the scale vectors of the pieces of target information;
the selecting submodule is used for selecting a plurality of first included angle cosines from the calculated included angle cosines, and the plurality of first included angle cosines comprise included angle cosines between a scale vector of first target information in the included angle cosines and scale vectors of other target information;
the calculating submodule is further configured to add cosines of the plurality of first included angles, and calculate a sum of the cosines of the first included angles.
8. A storage medium, characterized in that the storage medium comprises a stored program, wherein the program, when executed, controls a device on which the storage medium is located to perform the method of generating an unmanned driving decision according to any one of claims 1 to 6.
9. A computer device comprising a memory for storing information comprising program instructions and a processor for controlling the execution of the program instructions, characterized in that the program instructions are loaded and executed by the processor to implement the steps of the method of generating an unmanned decision according to any of claims 1 to 6.
CN201911417516.5A 2019-12-31 2019-12-31 Unmanned driving decision generation method and device, storage medium and computer equipment Active CN113119988B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911417516.5A CN113119988B (en) 2019-12-31 2019-12-31 Unmanned driving decision generation method and device, storage medium and computer equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911417516.5A CN113119988B (en) 2019-12-31 2019-12-31 Unmanned driving decision generation method and device, storage medium and computer equipment

Publications (2)

Publication Number Publication Date
CN113119988A CN113119988A (en) 2021-07-16
CN113119988B true CN113119988B (en) 2022-07-12

Family

ID=76770851

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911417516.5A Active CN113119988B (en) 2019-12-31 2019-12-31 Unmanned driving decision generation method and device, storage medium and computer equipment

Country Status (1)

Country Link
CN (1) CN113119988B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108960083A (en) * 2018-06-15 2018-12-07 北京邮电大学 Based on automatic Pilot objective classification method combined of multi-sensor information and system
CN110297484A (en) * 2018-03-23 2019-10-01 广州汽车集团股份有限公司 Unmanned control method, device, computer equipment and storage medium
CN110555193A (en) * 2019-08-14 2019-12-10 北京市天元网络技术股份有限公司 Modified cosine similarity-based conflict measurement method and device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9969389B2 (en) * 2016-05-03 2018-05-15 Ford Global Technologies, Llc Enhanced vehicle operation
US10613489B2 (en) * 2017-06-20 2020-04-07 Baidu Usa Llc Method and system for determining optimal coefficients of controllers for autonomous driving vehicles
US11561541B2 (en) * 2018-04-09 2023-01-24 SafeAI, Inc. Dynamically controlling sensor behavior

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110297484A (en) * 2018-03-23 2019-10-01 广州汽车集团股份有限公司 Unmanned control method, device, computer equipment and storage medium
CN108960083A (en) * 2018-06-15 2018-12-07 北京邮电大学 Based on automatic Pilot objective classification method combined of multi-sensor information and system
CN110555193A (en) * 2019-08-14 2019-12-10 北京市天元网络技术股份有限公司 Modified cosine similarity-based conflict measurement method and device

Also Published As

Publication number Publication date
CN113119988A (en) 2021-07-16

Similar Documents

Publication Publication Date Title
CN112292711B (en) Associating LIDAR data and image data
Simon et al. Complexer-yolo: Real-time 3d object detection and tracking on semantic point clouds
US10035508B2 (en) Device for signalling objects to a navigation module of a vehicle equipped with this device
CN107038723B (en) Method and system for estimating rod-shaped pixels
US9824586B2 (en) Moving object recognition systems, moving object recognition programs, and moving object recognition methods
EP3007099A1 (en) Image recognition system for a vehicle and corresponding method
CN112930554A (en) Electronic device, system and method for determining a semantic grid of a vehicle environment
US10726275B2 (en) System and method for correlating vehicular sensor data
US20160217335A1 (en) Stixel estimation and road scene segmentation using deep learning
CN111753623B (en) Method, device, equipment and storage medium for detecting moving object
EP3899778A1 (en) A method for multi-modal sensor fusion using object trajectories for cross-domain correspondence
EP3703008A1 (en) Object detection and 3d box fitting
CN112528781B (en) Obstacle detection method, device, equipment and computer readable storage medium
CN112509032A (en) Design method of front sensing module based on automobile distributed sensing platform
CN114730472A (en) Calibration method for external parameters of vehicle-mounted camera and related device
CN114241448A (en) Method and device for obtaining heading angle of obstacle, electronic equipment and vehicle
CN113119988B (en) Unmanned driving decision generation method and device, storage medium and computer equipment
CN112529011A (en) Target detection method and related device
CN112241963A (en) Lane line identification method and system based on vehicle-mounted video and electronic equipment
Rezaei et al. Multisensor data fusion strategies for advanced driver assistance systems
CN112949374A (en) Method for ambient environment detection and data processing unit
EP4336467A1 (en) Method and apparatus for modeling object, storage medium, and vehicle control method
CN113963027B (en) Uncertainty detection model training method and device, and uncertainty detection method and device
US20230410490A1 (en) Deep Association for Sensor Fusion
CN118172743A (en) Space recognition method and space recognition system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant