CN111832154A - Identification test method for space decision consistency under large-scale data drive

Info

Publication number: CN111832154A (application CN202010532090.4A)
Authority: CN (China)
Prior art keywords: time, spatial, score, data, calculation
Other languages: Chinese (zh)
Other versions: CN111832154B (granted)
Inventors: 陈彦君, 任建军, 王磊, 张昊
Assignee: BEIJING HUARU TECHNOLOGY CO LTD
Application filed by BEIJING HUARU TECHNOLOGY CO LTD; filing and priority date: 2020-06-11
Publication date: 2020-10-27 (CN111832154A); grant date: 2021-05-18 (CN111832154B)
Legal status: Granted; Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 - Computer-aided design [CAD]
    • G06F30/20 - Design optimisation, verification or simulation

Abstract

The invention discloses an identification test method for spatial decision consistency under large-scale data driving, which comprises the following steps: collecting spatial scene data; generating spatial scene CSV data; spatial target detection calculation; spatial target detection score calculation; environment modeling and calculation; and environment modeling total score calculation. The beneficial effects of the invention are as follows: by adopting a multi-factor-strategy identification test method for spatial decision consistency under large-scale data driving, the 'curse of dimensionality' and 'curse of history' problems encountered when solving large-scale data exactly within the POMDP algorithmic framework are avoided. The method first completes the multi-factor-strategy classification for spatial target detection with logarithmic time complexity, then completes environment modeling and calculation of the same order of magnitude, and finally reduces the time complexity of the spatial decision consistency identification test from exponential order to logarithmic order.

Description

Identification test method for space decision consistency under large-scale data drive
Technical Field
The invention belongs to the field of identification test methods, and particularly relates to an identification test method for spatial decision consistency.
Background
An identification (qualification) test is a test performed by the ordering party on a representative product under prescribed conditions to confirm that the product conforms to its design requirements, and it serves as the basis for design-finalization approval. The identification test method for spatial decision consistency under data driving tests, as an engineering activity, the ability to complete specified functions under specified conditions within a specified time. Its purpose is to verify the reliability of spatial decision consistency in use.
The identification test of spatial decision consistency under data driving has long attracted attention. At present, methods for this identification test mainly include rule-based decision methods and learning-algorithm-based methods; the latter is a particularly active research direction, with a large number of results emerging in recent years. Many learning algorithms, such as deep learning and POMDP (Partially Observable Markov Decision Process), learn autonomously from environmental samples, build a behavior rule base in a data-driven way, perform behavior matching directly against different environmental information using different learning methods and network structures, and output decision-behavior determination results. The POMDP framework is general enough to model many different real-world sequential processes, and has great advantages in (industrial, scientific, commercial, military and social) applications such as unmanned-vehicle behavior decision systems, UAV behavior decision systems, robot navigation, and planning under uncertainty.
The identification test of spatial decision consistency under data driving judges spatial decisions across different sensors, scenes or targets (for example, finding the one-to-one correspondence features between a vehicle or system and the target scene under various road conditions, climates, times and regions) and performs multi-factor-strategy consistency judgment on the spatial decisions in the process. Multi-factor classification groups the factors into major categories such as target hit factors, time factors, position factors, definition (clarity) factors and statistical factors, according to their style or meaning. Data-driven spatial decision-making can provide higher reliability and safety for spatial decisions, and is therefore widely applied: in simulation tests, closed-road tests and open-road tests before mass-produced autonomous vehicles reach the market, and in automatic cruise tests of UAVs. The identification test of spatial decision consistency under data driving is the key to reliable and safe spatial decisions.
At present, however, POMDPs are often computationally intractable in practice; POMDP is only a general framework for handling decision problems under uncertainty, and most POMDP-based applications have not been realized, mainly for lack of effective algorithmic support. Identification tests of sequential spatial decision consistency under data driving built on the general POMDP framework can therefore only use approximate algorithms rather than exact ones, even though an exact algorithm could in theory obtain the optimal solution.
In the identification test of spatial decision consistency under large-scale data, solving the POMDP exactly, as a model for sequential decision-making in a dynamic, uncertain environment, usually runs into the 'curse of dimensionality' and the 'curse of history', which make the time complexity of the identification test exponential. In real large-scale data scenarios such as automated unmanned-vehicle spatial behavior decision identification, UAV cruise-route judgment and robot automatic navigation decision systems, a POMDP-based identification test of spatial decision consistency encounters these two problems, and its exponential-order time complexity makes it inapplicable in practice.
Disclosure of Invention
The invention aims to provide an identification test method for spatial decision consistency under large-scale data driving, addressing the current lack of an effective such method in the field of data-driven identification tests of spatial decision consistency.
The technical scheme of the invention is as follows: the identification test method of the space decision consistency under the drive of large-scale data comprises the following steps:
step S1: collecting spatial scene data;
step S2: generating spatial scene CSV data;
step S3: performing spatial target detection calculation;
step S4: calculating the spatial target detection score;
step S5: performing environment modeling and calculation;
step S6: calculating the environment modeling total score.
In the step S1, scene data is collected by a monocular camera, a CANBUS gear, a CANBUS encoder, a CANBUS steering wheel, a geographic information positioning device, an inertial navigation system, a 32-line laser radar, and other sensing and collecting devices.
The CSV in step S2 is a plain-text file: a sequence of characters separated by commas or tab characters, used to generate the spatial-scene CSV format. The information in each line includes a sequence number, timestamp 1, timestamp 2, associated frame type, target type, bounding box, target position, target sequence number, start time T_S and end time T_E.
The spatial target detection calculation in step S3 specifically includes effectiveness evaluation calculation, spatial accuracy calculation, time score calculation and position score calculation. The specific calculation process is as follows:
step S31: effectiveness evaluation calculation
Retrieve from the CSV file the earliest start time T_S and the latest end time T_E of the object's associated timestamps, recorded as the target retrievable interval [T_S, T_E];

When the frame type is an image device, the validity threshold is Threshold, the bounding-box area provided by the spatial decision consistency system is S_B, and the bounding-box area of the spatial data CSV is S_A; then, if and only if timestamp 1 lies within [T_S, T_E], the validity Valid of the spatial detection target is:

Valid = area(S_B ∩ S_A) / area(S_B ∪ S_A)

If Valid ≥ Threshold, the result is considered valid; otherwise it is recorded as invalid;

When the frame type is a laser device, the validity threshold is Threshold, the bounding-box area provided by the spatial decision consistency system is S_B, and the bounding-box area of the spatial data CSV is S_A; then, if and only if timestamp 1 and timestamp 2 both lie within [T_S, T_E], the validity Valid of the spatial detection target is:

Valid = area(S_B ∩ S_A) / area(S_B ∪ S_A)

If Valid ≥ Threshold, the result is considered valid; otherwise it is recorded as invalid;
step S32: spatial accuracy calculation
For the kth object, the spatial decision consistency system returns results a total of M times within the target retrievable interval, of which M_k results are valid; the accuracy score of the kth class of object is then:

S_C = M_k / M
step S33: time score calculation
For the kth object, record the data transmission timestamp of the first transmitted frame associated with it, denoted T_0, and the timestamp of the first valid receipt, T_R.

The time accuracy S_T of the object is:

[the S_T formula appears only as an image in the original document]
step S34: position score calculation
For the kth object, the frame result in the spatial decision consistency system closest to the spatial data is denoted (x_R, y_R) and the spatial data result is (x_A, y_A); the distance L between the two can then be calculated as

L = sqrt((x_R - x_A)^2 + (y_R - y_A)^2)

With distance thresholds L_l/L_h set, the position score is:

S_P = 1 if L < L_l; (L_h - L)/(L_h - L_l) if L_l ≤ L ≤ L_h; 0 if L > L_h
the step S4 includes the steps of,
k objects need to be detected, wherein k' tasks are completed, and the tasks of the objects are divided into:
Figure BDA0002535719260000051
the spatial target detection score calculates the total formula: with K objects, the target detection is generally divided into:
Figure BDA0002535719260000052
wherein S is the final score, WMScoring task weight, default 100, WAThe scored portion is weighted, with 400 being the default,
Figure BDA0002535719260000053
for the exact weighting of the mth class, default 0.25,
Figure BDA0002535719260000054
time division of the m-th classThe weight, default 0.25,
Figure BDA0002535719260000055
for the clear weight of class m, default 0.25,
Figure BDA0002535719260000056
the location of class m is weighted, defaulted to 0.25,
the above weights need to satisfy the relationship:
Figure BDA0002535719260000057
Figure BDA0002535719260000058
the step S5 includes platform positioning calculation, environment calculation, and modeling calculation, and the specific calculation process is as follows:
step S51: platform positioning calculation
The data returned by the nth sampling point is (x_R, y_R, θ_R) and the spatial data is (x_A, y_A, θ_A); its Euclidean distance is

L_d = sqrt((x_R - x_A)^2 + (y_R - y_A)^2)

and the angular difference is L_θ = |θ_R - θ_A|. With minimum distance error L_dl, maximum distance error L_dh, minimum angle error L_θl and maximum angle error L_θh, the positioning score of the sampling point is calculated as:

S_d = 1 if L_d < L_dl; (L_dh - L_d)/(L_dh - L_dl) if L_dl ≤ L_d ≤ L_dh; 0 if L_d > L_dh

S_θ = 1 if L_θ < L_θl; (L_θh - L_θ)/(L_θh - L_θl) if L_θl ≤ L_θ ≤ L_θh; 0 if L_θ > L_θh

S_L = α·S_d + β·S_θ

wherein α = 0.3 and β = 0.7.

Record the sending timestamp of the sampling-point positioning data as T_0 and the receipt time of the sampling-point result as T_R, and record the maximum allowable time as T_max; the sampling-point positioning time score is then:

S_LT = 1 - (T_R - T_0)/T_max, taken as 0 when T_R - T_0 exceeds T_max
step S52: scenario understanding score calculation
The data returned by the nth sampling point is a matrix X_{r,c} and the spatial matrix is Y_{r,c}; the CSV provides K_n scoring points, the kth scoring point having coordinates (i_k, j_k), with the value at the corresponding position of the CSV file denoted Y_{ij}. The score for this item is calculated as:

S_U = (number of scoring points at which X_{i_k,j_k} = Y_{i_k,j_k}) / K_n

i.e., the match count divided by the total number of scoring points gives the match rate;

The scene-understanding time score S_UT is computed in the same way as the positioning time score, with its own maximum allowable time threshold added.
Step S53: computation of environmental modeling
For the nth reference point, the recorded returned result is (x_R, y_R) and the truth-file result is (x_A, y_A); the distance can then be calculated as

L = sqrt((x_R - x_A)^2 + (y_R - y_A)^2)

With distance thresholds L_l/L_h set, the position score is:

S_G = 1 if L < L_l; (L_h - L)/(L_h - L_l) if L_l ≤ L ≤ L_h; 0 if L > L_h

Record the timestamp at which the last frame of data is sent as T_0 and the map-collection completion time as T_R, with maximum allowable time T_max; the modeling time score is then:

S_GT = 1 - (T_R - T_0)/T_max, taken as 0 when T_R - T_0 exceeds T_max
The step S6 comprises the following:

S = W_L·S_L + W_LT·S_LT + W_U·S_U + W_UT·S_UT + W_G·S_G + W_GT·S_GT

wherein S is the final score; W_L is the sampling-point positioning weight, selected as 100; W_LT is the sampling-point positioning time weight, selected as 50; W_U is the local scene-understanding weight, selected as 100; W_UT is the local scene-understanding time weight, selected as 50; W_G is the global-control-point modeling weight, selected as 120; and W_GT is the global-control-point modeling time weight, selected as 80;
the above weights need to satisfy the relationship:
WL+WLT=150
WU+WUT=150
WG+WGT=200。
the invention has the beneficial effects that: the identification test method of the space decision consistency under the large-scale data driving of the multi-factor strategy is adopted, and the problems of dimension disaster and historical disaster which are involved in the large-scale data accurate algorithm solving by applying the POMDP algorithm frame are solved. The identification test method for space decision consistency under large-scale data driving of a multi-factor strategy is adopted, firstly, the space target detection multi-factor strategy with time complexity of logarithmic level is classified, then, environment modeling and calculation with the same order of magnitude are completed, and finally, the time complexity of the identification test for space decision consistency is reduced from exponential order to logarithmic order. The time complexity of the logarithmic order has good efficiency in practical scene applications.
Drawings
FIG. 1 is a schematic flow chart of a method for evaluating consistency of spatial decisions under large-scale data driving according to the present invention;
FIG. 2 is a graph of the curve trends of algorithms with different time complexity.
Detailed Description
The invention is described in further detail below with reference to the figures and the embodiments.
In the comprehensive identification-test evaluation of the automated unmanned-vehicle spatial behavior decisions submitted by each participating team under complex environments, as the complexity of the environment, road conditions, climate, time and region increases, the data coming from devices such as cameras, the CANBUS bus, GPS/BeiDou positioning, inertial navigation systems, multi-line lidar and single-line lidar grows rapidly.
POMDP is an ideal model for sequential decision-making in a dynamic, uncertain environment. A POMDP is described by a six-tuple <S, A, T, R, Z, O> (S is the state set, A the action set, T the state-transition function, R the reward function, Z the observation set, and O the observation function). Its exact solution has exponential time complexity, on the order of O(|S|^2·|A|·|Z|·|Γ_{t-1}|^{|Z|}), where Γ_{t-1} is the set of policy vectors retained at step t-1 and T is the length of the horizon from time 0 to time T. Time complexity of this exponential order immediately runs into the 'curse of dimensionality' and 'curse of history' problems.
The invention discloses an identification test method of spatial decision consistency under multi-factor-strategy data driving, which decomposes the process of finding one-to-one correspondence features in the large-scale integrated data of devices such as cameras, the CANBUS bus, GPS/BeiDou positioning, inertial navigation systems, multi-line lidar and single-line lidar into different multi-factor classifications: the factors are divided, according to style or meaning, into a target hit factor V, a time factor T, a position factor L, a definition factor M, a statistical factor D, and so on, and the time complexity of the adopted algorithm is logarithmic, O(log(|V|·|T|·|L|·|M|·|D|)).
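As a purely illustrative aside (the set sizes below are hypothetical and not taken from the embodiment), the gap between the two complexity orders can be made concrete by evaluating both bounds as the horizon grows:

```python
import math

# Hypothetical operand sizes, chosen only to illustrate the growth trend.
S, A, Z = 10, 4, 5          # |S|, |A|, |Z| of the POMDP
V = T = L = M = D = 1000    # sizes of the five factor sets

gamma = 1                   # |Gamma_0|: one policy vector at horizon 0
for t in range(1, 4):
    # Cost of one exact dynamic-programming update: O(|S|^2 |A| |Z| |Gamma_{t-1}|^|Z|)
    pomdp_ops = S * S * A * Z * gamma ** Z
    gamma = A * gamma ** Z  # |Gamma_t| = |A| * |Gamma_{t-1}|^|Z| (doubly exponential)
    multi_ops = math.log(V * T * L * M * D)  # multi-factor bound: O(log(|V||T||L||M||D|))
    print(f"t={t}: exact POMDP update ~{float(pomdp_ops):.2e} ops; "
          f"multi-factor ~{multi_ops:.1f} ops")
```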
According to the identification test method for spatial decision consistency under large-scale data driving, the data acquired by the spatial system is compared and analyzed against the real scene conditions, multi-factor evaluation is applied to the spatial decision consistency identification test, objective and representative analysis results are obtained, and the level of identification capability of the spatial decision consistency test is evaluated. The invention tests spatial decision consistency from two aspects: target detection and environment modeling.
As shown in fig. 1, the method for identifying and testing the consistency of spatial decision under large-scale data driving comprises the following steps:
step S1: collecting spatial scene data
Scene data is collected mainly through sensing and collecting devices of monocular cameras (Camera), CANBUS gears, CANBUS encoders, CANBUS steering wheels, geographic information positioning devices, inertial navigation systems, 32-line laser radars and the like.
Step S2: generating spatial scene CSV data
CSV is a plain-text file of a special format: a sequence of characters separated by commas or tab characters. The spatial-scene CSV data may be generated by editing the spatial scene data acquired in step S1 in an Excel spreadsheet on a computer and then saving it from Excel in CSV format.
In the generated spatial-scene CSV format, the information in each line includes a sequence number, timestamp 1, timestamp 2, associated frame type, target type, bounding box, target position, target sequence number, start time T_S and end time T_E.
The serial number is incremented from 1 and used as a unique identifier.
Timestamp 1 is the timestamp of the original associated data frame. Timestamps are identical when frames of different types (image or laser) share the same timestamp, or when multiple objects to be detected appear in the same frame. When the frame is generated by a laser device, timestamp 1 is the start timestamp of the frame.

Timestamp 2 is likewise a timestamp of the original associated data frame. This field is valid only when the frame is generated by a laser device, in which case timestamp 2 is the end timestamp of the frame.
The associated frame type is a corresponding usage sensor type, such as: marked with C (video) or L (laser radar).
The target type is the kind of the detected target, such as: labeled H (human), V (vehicle), B (entity), O (obstacle).
The bounding box is the coordinates of the four vertices of a rectangular box, arranged clockwise. For example: for an image coordinate system, the origin of coordinates is the upper left corner of the image; for a laser coordinate system, the origin of coordinates is the laser origin.
The target position is the accurate position of the target in the global coordinate system, which is consistent with the GPS coordinate system: latitude and longitude in the WGS84 coordinate system, in the format 'latitude,longitude'.
The target serial number is the unique identification code of the detected target, i.e., the same object always corresponds to the same target serial number.
Start time T_S: the timestamp at which the target is fully visible (detectable) for the first time (participants do not need to send this field; it is used during scoring).

End time T_E: the timestamp at which the target is fully visible (detectable) for the last time (participants do not need to send this field; it is used during scoring).
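A minimal sketch of reading rows in this layout with Python's csv module; the column order, field names and types below are assumptions inferred from the field list above, not a format fixed by the method:

```python
import csv
from dataclasses import dataclass

@dataclass
class SceneRow:
    seq: int          # serial number, incremented from 1, unique identifier
    ts1: float        # timestamp 1: associated data frame (laser: frame start)
    ts2: float        # timestamp 2: laser frame end (image frames: unused)
    frame_type: str   # associated frame type: 'C' (camera) or 'L' (lidar)
    target_type: str  # 'H' (human), 'V' (vehicle), 'B' (entity), 'O' (obstacle)
    bbox: str         # four clockwise vertices of the bounding box, raw field
    position: str     # "latitude,longitude" in the WGS84 coordinate system
    target_id: int    # unique identification code of the detected target
    t_start: float    # T_S: first timestamp at which the target is fully visible
    t_end: float      # T_E: last timestamp at which the target is fully visible

def load_scene(path: str) -> list[SceneRow]:
    # One SceneRow per CSV line, in the column order listed above.
    with open(path, newline="") as f:
        return [SceneRow(int(r[0]), float(r[1]), float(r[2]), r[3], r[4],
                         r[5], r[6], int(r[7]), float(r[8]), float(r[9]))
                for r in csv.reader(f)]
```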
Step S3: spatial object detection computation
The spatial-target-detection recognition capability of the spatial decision consistency algorithm under identification test is evaluated to obtain a quantitative measure of the algorithm's recognition capability for spatial target detection, against which the algorithm can be iteratively improved and optimized.
The space target detection calculation specifically comprises 4 parts of effectiveness evaluation calculation, space accuracy calculation, time score calculation and position score calculation, and the specific calculation process is as follows:
step S31: effectiveness evaluation calculation
First, the earliest start time T_S and the latest end time T_E of the object's associated timestamps are retrieved from the CSV file and recorded as the target retrievable interval [T_S, T_E].
When the frame type is an image device, the validity threshold is Threshold (the threshold ranges from 0 to 1; the larger the value, the stricter the validity requirement; 0.8 is selected in this embodiment of the invention), the bounding-box area provided by the spatial decision consistency algorithm under identification test is S_B, and the bounding-box area of the spatial data CSV is S_A; then, if and only if timestamp 1 lies within [T_S, T_E], the validity Valid of the spatial detection target is:

Valid = area(S_B ∩ S_A) / area(S_B ∪ S_A)

If Valid ≥ Threshold, the result is considered valid; otherwise it is recorded as invalid.
When the frame type is a laser device, the validity threshold is Threshold (the threshold ranges from 0 to 1; the larger the value, the stricter the validity requirement; 0.8 is selected in this embodiment of the invention), the bounding-box area provided by the spatial decision consistency algorithm under identification test is S_B, and the bounding-box area of the spatial data CSV is S_A; then, if and only if timestamp 1 and timestamp 2 both lie within [T_S, T_E], the validity Valid of the spatial detection target is:

Valid = area(S_B ∩ S_A) / area(S_B ∪ S_A)

If Valid ≥ Threshold, the result is considered valid; otherwise it is recorded as invalid.
In particular, if no target object exists in the frame corresponding to an answer returned by the spatial decision consistency algorithm under identification test, the answer is marked invalid; and if the bounding box returned by the algorithm under identification test is larger than the region, it is marked invalid.
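The validity test of step S31 is an intersection-over-union comparison (the worked example in the embodiment below computes Valid = S_i/S_u, the intersection area over the union area). A minimal sketch, simplified to axis-aligned boxes; the method's bounding boxes are four clockwise vertices, so treating them as axis-aligned rectangles is an assumption made here for brevity:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))  # overlap width
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))  # overlap height
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union > 0 else 0.0

def is_valid(box_sys, box_csv, ts, t_start, t_end, threshold=0.8):
    """Valid iff the timestamp lies in [T_S, T_E] and IoU >= Threshold (0.8)."""
    return t_start <= ts <= t_end and iou(box_sys, box_csv) >= threshold
```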
Step S32: spatial accuracy calculation
For the kth object (e.g., a person), the spatial decision consistency system returns results a total of M times within the target retrievable interval, of which M_k results are valid; the accuracy score of the kth class of object is then:

S_C = M_k / M
In particular, if there is not a single frame of valid data, the score is 0.
Step S33: time score calculation
For the kth object, record the data transmission timestamp of the first transmitted frame associated with it (the earlier one if there are several), denoted T_0, and the timestamp of the first valid receipt, T_R.

The time accuracy S_T of the object is:

[the S_T formula appears only as an image in the original document]
In particular, if there is not a single frame of valid data, the score is 0.
Step S34: position score calculation
In the global coordinate system, for the kth object, the frame result in the spatial decision consistency algorithm under identification test closest to the target position in the spatial data CSV is denoted (x_R, y_R), and the target position in the CSV spatial data is denoted (x_A, y_A); the distance L between the two can then be calculated as

L = sqrt((x_R - x_A)^2 + (y_R - y_A)^2)

With distance thresholds L_l/L_h set, the position score is:

S_P = 1 if L < L_l; (L_h - L)/(L_h - L_l) if L_l ≤ L ≤ L_h; 0 if L > L_h

L_l is the minimum distance threshold; a default of 0.1 m is selected in this embodiment of the invention. L_h is the maximum distance threshold; a default of 0.5 m is selected in this embodiment of the invention.
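A minimal sketch of the step-S34 position score under the stated thresholds; note that the interpolation must fall from 1 to 0 as the distance grows across the band:

```python
import math

def position_score(xr, yr, xa, ya, l_low=0.1, l_high=0.5):
    """Position score S_P: 1 inside l_low, 0 beyond l_high, linear in between.
    Defaults are the embodiment's thresholds (0.1 m and 0.5 m)."""
    dist = math.hypot(xr - xa, yr - ya)  # Euclidean distance L
    if dist < l_low:
        return 1.0
    if dist > l_high:
        return 0.0
    return (l_high - dist) / (l_high - l_low)
```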
Step S4: spatial target detection score calculation
If K objects need to be detected and k' tasks are completed (i.e., all four component scores exist), the task score S_M is:

S_M = k'/K

The spatial target detection total score: with K objects, the total target detection score S is:

S = W_M·(k'/K) + (W_A/K)·Σ_{k=1..K} (w_m^a·S_C^k + w_m^t·S_T^k + w_m^m·S_D^k + w_m^l·S_P^k)

In the formula, W_M is the scored-task weight (100 is selected) and W_A is the scored-part weight (400 is selected); S_C^k is the accuracy score of the kth object, S_T^k its time score, S_D^k its definition (clarity) score, and S_P^k its position score; w_m^a, w_m^t, w_m^m and w_m^l are respectively the accuracy, time, clarity and position weights of the kth object's class m (0.25 is selected for each).

Note that the above weights need to satisfy the relationships:

w_m^a + w_m^t + w_m^m + w_m^l = 1
W_M + W_A = 500
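Putting step S4 together, a sketch of the total target-detection score under the default weights; the per-object score tuples are assumed to have been computed by steps S31 to S34:

```python
def detection_total(scores, total_targets, w_task=100.0, w_part=400.0,
                    w_acc=0.25, w_time=0.25, w_clear=0.25, w_pos=0.25):
    """scores: one (S_C, S_T, S_D, S_P) tuple per completed object; an object
    counts as completed only when all four component scores exist.
    Defaults follow the text: W_M=100, W_A=400, class weights 0.25 each."""
    completed = len(scores)                      # k'
    s_task = w_task * completed / total_targets  # S_1: task completion part
    t = sum(w_acc * sc + w_time * st + w_clear * sd + w_pos * sp
            for sc, st, sd, sp in scores)
    s_part = w_part * t / total_targets          # S_2: weighted component part
    return s_task + s_part                       # S = S_1 + S_2
```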
Step S5: environmental modeling and computing
The environment modeling and calculation recognition capability of the spatial decision consistency algorithm under identification test is evaluated to obtain a quantitative measure of the algorithm's environment modeling and calculation capability, against which the algorithm can be iteratively improved and optimized.
The environment modeling and calculation specifically comprises 3 parts: platform positioning calculation, scene understanding calculation and environment modeling calculation. The specific calculation process is as follows:
step S51: platform positioning calculation
Let the data returned by the nth sampling point be (x_R, y_R, θ_R) and the spatial data be (x_A, y_A, θ_A); its Euclidean distance is

L_d = sqrt((x_R - x_A)^2 + (y_R - y_A)^2)

and the angular difference is L_θ = |θ_R - θ_A|. With minimum distance error L_dl, maximum distance error L_dh, minimum angle error L_θl and maximum angle error L_θh, the positioning score of the sampling point is calculated as:

S_d = 1 if L_d < L_dl; (L_dh - L_d)/(L_dh - L_dl) if L_dl ≤ L_d ≤ L_dh; 0 if L_d > L_dh

S_θ = 1 if L_θ < L_θl; (L_θh - L_θ)/(L_θh - L_θl) if L_θl ≤ L_θ ≤ L_θh; 0 if L_θ > L_θh

S_L = α·S_d + β·S_θ

wherein α = 0.3 and β = 0.7; S_L is the positioning score, S_d the position score and S_θ the angle score.

Record the sending timestamp of the sampling-point positioning data as T_0 and the receipt time of the sampling-point result as T_R, and record the maximum allowable time as T_max; the sampling-point positioning time score S_LT is then:

S_LT = 1 - (T_R - T_0)/T_max, taken as 0 when T_R - T_0 exceeds T_max
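A minimal sketch of the step-S51 sampling-point scores; the distance and angle error thresholds below are illustrative placeholders, since the text names them (L_dl, L_dh, L_θl, L_θh) without fixing values:

```python
import math

def band_score(err, lo, hi):
    """1 below the minimum error, 0 above the maximum, linear in between."""
    if err < lo:
        return 1.0
    if err > hi:
        return 0.0
    return (hi - err) / (hi - lo)

def positioning_scores(ret, truth, t_send, t_recv, t_max,
                       d_lo=0.1, d_hi=0.5, a_lo=0.5, a_hi=2.0):
    """ret and truth are (x, y, theta) tuples; the thresholds are assumptions."""
    s_d = band_score(math.hypot(ret[0] - truth[0], ret[1] - truth[1]), d_lo, d_hi)
    s_theta = band_score(abs(ret[2] - truth[2]), a_lo, a_hi)
    s_l = 0.3 * s_d + 0.7 * s_theta                   # S_L = alpha*S_d + beta*S_theta
    s_lt = max(0.0, 1.0 - (t_recv - t_send) / t_max)  # positioning time score S_LT
    return s_l, s_lt
```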
step S52: scenario understanding score calculation
Let the data returned by the nth sampling point be a matrix X_{r,c} and the spatial matrix be Y_{r,c}; the CSV provides K_n scoring points, the kth scoring point having coordinates (i_k, j_k); the value at the corresponding position of the CSV file is denoted Y_{ij} and the corresponding returned value is denoted X_{ij}. The score S_U for this item is calculated as:

S_U = (number of scoring points at which X_{i_k,j_k} = Y_{i_k,j_k}) / K_n

i.e., the match count divided by the total number of scoring points gives the match rate.

The scene-understanding time score S_UT is computed in the same way as the positioning time score, with its own maximum allowable time threshold added.
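The step-S52 score is a per-cell match rate over the K_n scoring points; a minimal sketch:

```python
def scene_score(returned, truth, score_points):
    """returned and truth: 2-D grids of type values (e.g. 81 x 81 lists).
    score_points: the K_n scoring coordinates (i, j) taken from the CSV.
    Returns matches / K_n, the match rate S_U."""
    matches = sum(1 for i, j in score_points if returned[i][j] == truth[i][j])
    return matches / len(score_points)
```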
Step S53: computation of environmental modeling
For the nth reference point, the recorded returned result is (x_R, y_R) and the CSV data is (x_A, y_A); the distance can then be calculated as

L = sqrt((x_R - x_A)^2 + (y_R - y_A)^2)

With distance thresholds L_l/L_h set, the position score is:

S_G = 1 if L < L_l; (L_h - L)/(L_h - L_l) if L_l ≤ L ≤ L_h; 0 if L > L_h

Record the timestamp at which the last frame of data is sent as T_0 and the map-collection completion time as T_R, with maximum allowable time T_max; the modeling time score S_GT is then:

S_GT = 1 - (T_R - T_0)/T_max, taken as 0 when T_R - T_0 exceeds T_max
Step S6: total score calculation for environmental modeling
S = W_L·S_L + W_LT·S_LT + W_U·S_U + W_UT·S_UT + W_G·S_G + W_GT·S_GT

wherein S is the final score; W_L is the sampling-point positioning weight, selected as 100; W_LT is the sampling-point positioning time weight, selected as 50; W_U is the local scene-understanding weight, selected as 100; W_UT is the local scene-understanding time weight, selected as 50; W_G is the global-control-point modeling weight, selected as 120; and W_GT is the global-control-point modeling time weight, selected as 80;

S_L is the positioning position score, S_LT the positioning time score, S_U the scene-understanding position score, S_UT the scene-understanding time score, S_G the modeling position score, and S_GT the modeling time score.
The above weights need to satisfy the relationship:
WL+WLT=150
WU+WUT=150
WG+WGT=200
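A minimal sketch of the step-S6 aggregate under the stated weights; the six component scores are assumed to be the per-point averages produced by steps S51 to S53:

```python
def environment_total(s_l, s_lt, s_u, s_ut, s_g, s_gt,
                      weights=(100, 50, 100, 50, 120, 80)):
    """Weighted sum of the six component scores. The weights satisfy
    W_L+W_LT=150, W_U+W_UT=150, W_G+W_GT=200 (500 points in total)."""
    parts = (s_l, s_lt, s_u, s_ut, s_g, s_gt)
    return sum(w * s for w, s in zip(weights, parts))
```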
the practical effects of using the POMDP algorithm and using the multi-factor strategy algorithm are described in detail below in one of the typical application scenarios for automated unmanned vehicle spatial behavior decision making qualification.
The specific method is as follows:
step S1: collecting spatial scene data
Scene data is collected mainly through the communication protocols of sensing and acquisition devices such as a monocular camera (Camera), CANBUS gear, CANBUS encoder, CANBUS steering wheel, geographic information positioning device, inertial navigation system and 32-line lidar.
Step S2: generating spatial scene CSV data
Examples of data file content are as follows:
[example rows of the data file are shown only as images in the original document]
step S3: spatial object detection computation
The method specifically comprises effectiveness evaluation calculation, space accuracy calculation, time score calculation and position score calculation.
Step S31: effectiveness evaluation calculation
Example at a certain time point:

The target bounding-box vertex coordinates of the CSV data are: [-2.8, 50, -2.8, 50.7, -2.1, 50.7, -2.1, 50]

The target bounding-box vertex coordinates provided by the system under identification test are: [-2.84018, 50.0957, -2.84017, 50.7957, -2.14017, 50.7957, -2.14018, 50.0957], with target position [39.7863350667, 116.0028280400].

S_i is the area of intersection of the two boxes;
S_u is the area of union of the two boxes;
Threshold = 0.8 (the default threshold).

If S_i/S_u > Threshold, the result at the current time point is valid; otherwise it is invalid.
Step S32: spatial accuracy calculation
Example at a certain time point:

The target number provided by the system under identification test is: 1.

From the validity results, accumulate the valid count V_a and the invalid count U_v of the target; then

M_k = V_a, M = V_a + U_v

and the accuracy score is S_C = M_k/M = V_a/(V_a + U_v).
Step S33: time score calculation
T_S: the earliest start time of the object's associated timestamps retrieved from the CSV file;
T_E: the latest end time of the object's associated timestamps retrieved from the CSV file;
T_0: the transmission time of the first frame sent for the object, provided by the system under identification test;
T_R: the time of the first valid receipt for the object, provided by the system under identification test;
time score S_T:

[the S_T formula appears only as an image in the original document]
Step S34: position score calculation
Example at a certain time point:

The target latitude value tlat is: 39.7863350667; the target longitude value tlng is: 116.0028280400; the target latitude value alat provided by the system under identification test is: 39.7863350000; the target longitude value alng provided by the system under identification test is: 116.0028280400.

Step S341: convert the latitude/longitude values tlat, tlng, alat and alng to a plane coordinate system according to the WGS84 standard, obtaining X_R, Y_R, X_A, Y_A.

Step S342: calculate the distance between the two points by the distance formula

L = sqrt((X_R - X_A)^2 + (Y_R - Y_A)^2)

Step S343: let L_l be the minimum distance threshold (default 0.1 m) and L_h the maximum distance threshold (default 0.5 m); the position score is then:

if the distance L is less than the minimum threshold L_l, the value is 1;
if the distance L is between the minimum threshold L_l and the maximum threshold L_h, the value is (L_h - L)/(L_h - L_l);
if the distance L is greater than the maximum threshold L_h, the value is 0.
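The worked example converts WGS84 latitude/longitude to plane coordinates before measuring distance, but does not name a projection; the local tangent-plane (equirectangular) approximation below is therefore an assumption, adequate only over the short distances scored here:

```python
import math

EARTH_R = 6378137.0  # WGS84 equatorial radius in metres

def wgs84_to_local(lat, lng, lat0, lng0):
    """Approximate planar (x, y) offsets in metres from a reference point
    (lat0, lng0); a stand-in for the unspecified WGS84 plane conversion."""
    x = math.radians(lng - lng0) * EARTH_R * math.cos(math.radians(lat0))
    y = math.radians(lat - lat0) * EARTH_R
    return x, y
```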
Step S4: spatial target detection score calculation
The method comprises the following steps:

Step S41: loop over the targets provided by the system under identification test; a detected target counts toward the spatial-target-detection score accumulation only if it has an effectiveness evaluation score, a spatial accuracy score, a time score and a position score; accumulate the number of such completed targets to obtain k'.

Step S42: with total target count K, compute S_1 = (k'/K)·100, where 100 is the scored-task weight.

Step S43: loop over all targets and accumulate t += S_c·0.25 + S_t·0.25 + S_d·0.25 + S_p·0.25, where S_c is the accuracy score, S_t the time score, S_d the clarity score and S_p the position score; the accuracy, time, clarity and position weights are each 0.25.

Step S44: divide the accumulated value by the total target count to obtain the process value S_2 = (t/K)·400, where 400 is the scored-part weight.

Step S45: the final target detection score is S = S_1 + S_2.
Step S5: environmental modeling and computing
The method specifically comprises platform positioning calculation, scene understanding calculation and environment modeling calculation.
Step S51: location calculation
Example at a certain time point:

The target latitude value tlat of the CSV data is: 39.7863350667;
the target longitude value tlng of the CSV data is: 116.0028280400;
the target heading value thed of the CSV data is: 253.685;
the target latitude value alat provided by the system under identification test is: 39.7863350000;
the target longitude value alng provided by the system under identification test is: 116.0028280400;
the target heading value ahed provided by the system under identification test is: 253.00.

Step S511: convert the latitude/longitude values tlat, tlng, alat and alng to a plane coordinate system according to the WGS84 standard, obtaining X_R, Y_R, X_A, Y_A.

Step S512: calculate the distance between the two points by the distance formula

L_d = sqrt((X_R - X_A)^2 + (Y_R - Y_A)^2)

Step S513: calculate the angular difference L_θ:

L_θ = |thed - ahed| = |253.685 - 253.00| = 0.685

Step S514: calculate the position score S_d:

if the distance L_d is less than the minimum threshold L_dl, the value is 1;
if L_d is between the minimum threshold L_dl and the maximum threshold L_dh, the value is (L_dh - L_d)/(L_dh - L_dl);
if L_d is greater than the maximum threshold L_dh, the value is 0.

Step S515: calculate the angle score S_θ:

if the angular difference L_θ is less than the minimum threshold L_θl, the value is 1;
if L_θ is between the minimum threshold L_θl and the maximum threshold L_θh, the value is (L_θh - L_θ)/(L_θh - L_θl);
if L_θ is greater than the maximum threshold L_θh, the value is 0.

Step S516: calculate the positioning score S_L:

S_L = 0.3·S_d + 0.7·S_θ

Step S517: calculate the time score S_LT:

S_LT = 1 - (t_r - t_0)/1000 = 1 - (36389852 - 36389652)/1000 = 0.8
Step S52: scenario understanding score calculation
Step S521: obtain the data of the scene CSV data; the data format is as follows:

[the example scene data is shown only as an image in the original document]

Step S522: obtain the scene data of the system under identification test; the data format is an 81×81 byte array, whose subscripts correspond to the rows and columns of the CSV data and whose values correspond to the understood type values;

Step S523: loop through the CSV data and match the type values at the corresponding array positions; the number of traversed points is k_n; whenever a value equals the CSV data value, accumulate the count to obtain the accumulated value n;

Step S524: the scene understanding score is S_U = n/k_n.
Step S53: the environment modeling score is calculated as follows:

t_0 is the timestamp at which the positioning data of the sampling point is sent, provided by the system under evaluation;
t_r is the time at which the sampling-point result is received, provided by the system under identification test.

Step S531: obtain the environment modeling time score S_gt:

if t_r - t_0 ≥ 1000 (the environment modeling time threshold), the time score S_gt is 0;
if t_r - t_0 < 1000, then S_gt = 1 - (t_r - t_0)/1000.
Step S532: obtaining environmental modeling position score Sg
Convert the latitude/longitude values tlat, tlng, alat and alng to a plane coordinate system according to the WGS84 standard, obtaining X_R, Y_R, X_A, Y_A.

Calculate the distance between the two points by the distance formula

L_d = sqrt((X_R - X_A)^2 + (Y_R - Y_A)^2)

If the distance L_d is less than the minimum threshold L_dl, the value is 1;
if L_d is between the minimum threshold L_dl and the maximum threshold L_dh, the value is (L_dh - L_d)/(L_dh - L_dl);
if L_d is greater than the maximum threshold L_dh, the value is 0.
Step S6: total score calculation for environmental modeling
The method comprises the following steps:

Step S61: loop through the environment-modeling answer records and accumulate:

S_la += S_l·100 (positioning position weight) + S_lt·50 (positioning time weight)
S_ua += S_u·100 (scene-understanding position weight) + S_ut·50 (scene-understanding time weight)
S_ga += S_g·120 (modeling position weight) + S_gt·80 (modeling time weight)

Step S62: the environment modeling total score S is:

S = S_la + S_ua + S_ga
As shown in FIG. 2, the curve trends of the algorithms with different time complexity show that, when the multi-factor-strategy identification test method for spatial decision consistency is applied to the identification of concrete automated unmanned-vehicle spatial behavior decisions, the speed and efficiency of the data-driven identification test of spatial decision consistency improve greatly as the algorithm's computation time falls, achieving good results in practical application.

Claims (7)

1. An identification test method for spatial decision consistency under large-scale data driving, characterized by comprising the following steps:
step S1: collecting spatial scene data;
step S2: generating spatial scene CSV data;
step S3: detecting and calculating a space target;
step S4: calculating a space target detection score;
step S5: modeling and calculating the environment;
step S6: calculating the environment modeling total score.
2. The identification test method for spatial decision consistency under large-scale data driving of claim 1, characterized in that: in the step S1, scene data is collected by a monocular camera, a CANBUS gear, a CANBUS encoder, a CANBUS steering wheel, a geographic information positioning device, an inertial navigation system, a 32-line laser radar, and other sensing and collecting devices.
3. The identification test method for spatial decision consistency under large-scale data driving of claim 1, characterized in that: the CSV in step S2 is a plain-text file, a sequence of characters separated by commas or tab characters, used to generate the spatial-scene CSV format; the information in each line includes a sequence number, timestamp 1, timestamp 2, associated frame type, target type, bounding box, target position, target sequence number, start time T_S and end time T_E.
4. The identification test method for spatial decision consistency under large-scale data driving of claim 1, characterized in that: the spatial target detection calculation in step S3 specifically includes effectiveness evaluation calculation, spatial accuracy calculation, time score calculation and position score calculation,

the specific calculation process being as follows:

step S31: effectiveness evaluation calculation

retrieve from the CSV file the earliest start time T_S and the latest end time T_E of the object's associated timestamps, recorded as the target retrievable interval [T_S, T_E];

when the frame type is an image device, the validity threshold is Threshold, the bounding-box area provided by the spatial decision consistency system is S_B, and the bounding-box area of the spatial data CSV is S_A; then, if and only if timestamp 1 lies within [T_S, T_E], the validity Valid of the spatial detection target is:

Valid = area(S_B ∩ S_A) / area(S_B ∪ S_A)

if Valid ≥ Threshold, the result is considered valid, otherwise it is recorded as invalid;

when the frame type is a laser device, the validity threshold is Threshold, the bounding-box area provided by the spatial decision consistency system is S_B, and the bounding-box area of the spatial data CSV is S_A; then, if and only if timestamp 1 and timestamp 2 both lie within [T_S, T_E], the validity Valid of the spatial detection target is:

Valid = area(S_B ∩ S_A) / area(S_B ∪ S_A)

if Valid ≥ Threshold, the result is considered valid, otherwise it is recorded as invalid;

step S32: spatial accuracy calculation

for the kth object, the spatial decision consistency system returns results a total of M times within the target retrievable interval, of which M_k results are valid; the accuracy score of the kth class of object is then:

S_C = M_k / M

step S33: time score calculation

for the kth object, record the data transmission timestamp of the first transmitted frame associated with it, denoted T_0, and the timestamp of the first valid receipt, T_R;

the time accuracy S_T of the object is:

[the S_T formula appears only as an image in the original document]

step S34: position score calculation

for the kth object, the frame result in the spatial decision consistency system closest to the spatial data is denoted (x_R, y_R) and the spatial data result is (x_A, y_A); the distance L between the two can then be calculated as

L = sqrt((x_R - x_A)^2 + (y_R - y_A)^2)

with distance thresholds L_l/L_h set, the position score is:

S_P = 1 if L < L_l; (L_h - L)/(L_h - L_l) if L_l ≤ L ≤ L_h; 0 if L > L_h.
5. The identification test method for spatial decision consistency under large-scale data driving of claim 1, characterized in that: the step S4 comprises the following:

K objects need to be detected, of which k' complete the task; the task score is:

S_M = k'/K

the spatial target detection total score: with K objects, the total target detection score is:

S = W_M·(k'/K) + (W_A/K)·Σ_{k=1..K} (w_m^a·S_C^k + w_m^t·S_T^k + w_m^m·S_D^k + w_m^l·S_P^k)

wherein S is the final score, W_M the scored-task weight, W_A the scored-part weight, w_m^a the accuracy weight of class m, w_m^t the time weight of class m, w_m^m the definition weight of class m, and w_m^l the position weight of class m,

the above weights needing to satisfy the relationships:

W_M + W_A = 500 (highest score)

w_m^a + w_m^t + w_m^m + w_m^l = 1.
6. The identification test method for spatial decision consistency under large-scale data driving of claim 1, characterized in that: the step S5 includes platform positioning calculation, scene understanding calculation and environment modeling calculation, the specific calculation process being as follows:

step S51: platform positioning calculation

the data returned by the nth sampling point is (x_R, y_R, θ_R) and the spatial data is (x_A, y_A, θ_A); its Euclidean distance is

L_d = sqrt((x_R - x_A)^2 + (y_R - y_A)^2)

and the angular difference is L_θ = |θ_R - θ_A|; with minimum distance error L_dl, maximum distance error L_dh, minimum angle error L_θl and maximum angle error L_θh, the positioning score of the sampling point is calculated as:

S_d = 1 if L_d < L_dl; (L_dh - L_d)/(L_dh - L_dl) if L_dl ≤ L_d ≤ L_dh; 0 if L_d > L_dh

S_θ = 1 if L_θ < L_θl; (L_θh - L_θ)/(L_θh - L_θl) if L_θl ≤ L_θ ≤ L_θh; 0 if L_θ > L_θh

S_L = α·S_d + β·S_θ

wherein α = 0.3 and β = 0.7;

recording the sending timestamp of the sampling-point positioning data as T_0, the receipt time of the sampling-point result as T_R, and the maximum allowable time as T_max, the sampling-point positioning time score being:

S_LT = 1 - (T_R - T_0)/T_max, taken as 0 when T_R - T_0 exceeds T_max

step S52: scene understanding score calculation

the data returned by the nth sampling point is a matrix X_{r,c} and the spatial matrix is Y_{r,c}; the CSV provides K_n scoring points, the kth scoring point having coordinates (i_k, j_k), with the value at the corresponding position of the CSV file denoted Y_{ij}; the score for this item is calculated as:

S_U = (number of scoring points at which X_{i_k,j_k} = Y_{i_k,j_k}) / K_n

i.e., the match count divided by the total number of scoring points gives the match rate;

the scene-understanding time score S_UT is computed in the same way as the positioning time score, with its own maximum allowable time threshold added;

step S53: environment modeling calculation

for the nth reference point, the recorded returned result is (x_R, y_R) and the truth-file result is (x_A, y_A); the distance can then be calculated as

L = sqrt((x_R - x_A)^2 + (y_R - y_A)^2)

with distance thresholds L_l/L_h set, the position score being:

S_G = 1 if L < L_l; (L_h - L)/(L_h - L_l) if L_l ≤ L ≤ L_h; 0 if L > L_h

recording the timestamp at which the last frame of data is sent as T_0, the map-collection completion time as T_R, and the maximum allowable time as T_max, the modeling time score being:

S_GT = 1 - (T_R - T_0)/T_max, taken as 0 when T_R - T_0 exceeds T_max.
7. The identification test method for spatial decision consistency under large-scale data driving of claim 1, characterized in that: the step S6 comprises the following:

S = W_L·S_L + W_LT·S_LT + W_U·S_U + W_UT·S_UT + W_G·S_G + W_GT·S_GT

wherein S is the final score; W_L is the sampling-point positioning weight, selected as 100; W_LT is the sampling-point positioning time weight, selected as 50; W_U is the local scene-understanding weight, selected as 100; W_UT is the local scene-understanding time weight, selected as 50; W_G is the global-control-point modeling weight, selected as 120; and W_GT is the global-control-point modeling time weight, selected as 80;
the above weights need to satisfy the relationship:
WL+WLT=150
WU+WUT=150
WG+WGT=200。


