CN105109484B - Target obstacle determination method and device - Google Patents
Target obstacle determination method and device
- Publication number
- CN105109484B CN105109484B CN201510526498.XA CN201510526498A CN105109484B CN 105109484 B CN105109484 B CN 105109484B CN 201510526498 A CN201510526498 A CN 201510526498A CN 105109484 B CN105109484 B CN 105109484B
- Authority
- CN
- China
- Prior art keywords
- obstructing objects
- sensor
- vehicle
- data
- measurement data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 38
- 238000005259 measurement Methods 0.000 claims abstract description 130
- 230000004927 fusion Effects 0.000 claims abstract description 38
- 230000004438 eyesight Effects 0.000 claims description 31
- 238000012545 processing Methods 0.000 claims description 7
- 238000012216 screening Methods 0.000 claims description 7
- 238000001514 detection method Methods 0.000 abstract description 15
- 230000004888 barrier function Effects 0.000 description 15
- 238000010586 diagram Methods 0.000 description 4
- 230000008569 process Effects 0.000 description 3
- 230000001133 acceleration Effects 0.000 description 2
- 230000003247 decreasing effect Effects 0.000 description 2
- 230000001360 synchronised effect Effects 0.000 description 2
- 230000009286 beneficial effect Effects 0.000 description 1
- 230000008859 change Effects 0.000 description 1
- 230000000295 complement effect Effects 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 239000011159 matrix material Substances 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 230000008520 organization Effects 0.000 description 1
- 238000012163 sequencing technique Methods 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
- 230000017105 transposition Effects 0.000 description 1
- 230000000007 visual effect Effects 0.000 description 1
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
- B60W30/08—Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
- B60W30/095—Predicting travel path or likelihood of collision
- B60W30/0956—Predicting travel path or likelihood of collision the prediction being responsive to traffic or environmental parameters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2554/00—Input parameters relating to objects
Abstract
The invention discloses a target obstacle determination method and device, belonging to the field of intelligent transportation. The method includes: obtaining at least n measurement data for obstacles collected at the current time by n sensors mounted on a vehicle, where n is greater than or equal to 1; performing fusion processing on the at least n measurement data to obtain a fused data group, the fused data group recording the fused data corresponding to each obstacle detected by the n sensors; determining the distance between the vehicle and each obstacle recorded in the fused data group; and determining the obstacles whose distance is less than a preset distance threshold as target obstacles. By determining target obstacles according to their distance from the vehicle, the invention improves the reliability of the vehicle's detection of the surrounding road environment.
Description
Technical field
The present invention relates to the field of intelligent transportation, and in particular to a target obstacle determination method and device.
Background technology
To detect the road environment around a vehicle, multiple sensors such as vision sensors, millimeter-wave radar sensors and laser radar sensors are typically mounted on the vehicle. The vehicle can obtain the measurement data collected by these sensors and perform fusion processing on it, i.e., combine the measurement data of the multiple sensors in a complementary and optimal way to generate more reliable and more accurate information, thereby improving the vehicle's ability to detect the surrounding road environment.
In the related art, a vehicle can perform fusion processing on the data collected by multiple sensors. The fusion process mainly includes: first obtaining the measurement data of the target obstacles detected by each sensor; then performing spatio-temporal alignment on the measurement data of the target obstacles, i.e., transforming the measurement data of each sensor into the same coordinate system and synchronizing the measurement data of each sensor to the same moment; then associating the spatio-temporally aligned measurement data of the multiple sensors to obtain the measurement data belonging to the same target; finally, an information fusion device merges, through a certain algorithm, the measurement data of the different sensors belonging to the same target and outputs the result to the vehicle.
However, in the related art, when the sensors detect many target obstacles, the information fusion device in the vehicle needs to associate and fuse the measurement data of every target obstacle and output the fused information of all the target obstacles to the vehicle. Some of these target obstacles may have no effect on the vehicle at all; therefore, the reliability of the vehicle's detection of the surrounding road environment is relatively low.
Summary of the invention
To solve the problems in the prior art, the present invention provides a target obstacle determination method and device. The technical schemes are as follows:
In one aspect, a target obstacle determination method is provided. The method includes:
obtaining at least n measurement data for obstacles collected at the current time by n sensors mounted on a vehicle, where n is greater than or equal to 1;
performing fusion processing on the at least n measurement data to obtain a fused data group, the fused data group recording the fused data corresponding to each obstacle detected by the n sensors;
determining the distance between the vehicle and each obstacle recorded in the fused data group;
determining the obstacles whose distance is less than a preset distance threshold as target obstacles.
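The steps above amount to a fuse-then-filter pipeline: fuse the per-sensor measurements, then keep only the nearby obstacles. As an illustrative sketch only (not the claimed implementation), assuming each fused record carries a position in the vehicle's coordinate frame and the names below are hypothetical, the distance screening could look like:

```python
import math

def select_target_obstacles(fused_group, distance_threshold_m=50.0):
    """Keep only obstacles closer to the vehicle than the preset threshold.

    fused_group: list of dicts with 'id', 'x', 'y' (meters, vehicle frame).
    Returns the obstacles treated as target obstacles.
    """
    targets = []
    for obstacle in fused_group:
        # Euclidean distance from the vehicle origin to the obstacle.
        distance = math.hypot(obstacle["x"], obstacle["y"])
        if distance < distance_threshold_m:
            targets.append(obstacle)
    return targets

fused = [
    {"id": 1, "x": 3.0, "y": 4.0},    # 5 m away
    {"id": 2, "x": 60.0, "y": 80.0},  # 100 m away
]
print(select_target_obstacles(fused))  # only obstacle 1 remains
```

The threshold value itself is a free parameter of the method; the patent only requires that obstacles beyond it are excluded from the output.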
Optionally, after the obstacles are determined as target obstacles, the method further includes:
obtaining the history fused data of the target obstacle, the history fused data being fused data stored in a preset target database before the current time;
determining the motion state of the target obstacle according to the fused data of the target obstacle in the fused data group and the history fused data;
storing the motion state of the target obstacle into the target database.
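The motion-state update above can be sketched as follows. The finite-difference velocity estimate and the field names are illustrative assumptions; the patent does not fix a particular estimation algorithm:

```python
def estimate_motion_state(current, history, dt):
    """Estimate a target obstacle's motion state from the current fused
    data and the most recent history fused data from the target database.

    current / history: dicts with 'x', 'y' positions (meters).
    dt: time elapsed between the two fused records (seconds).
    """
    # Finite-difference velocity between the two fused positions.
    vx = (current["x"] - history["x"]) / dt
    vy = (current["y"] - history["y"]) / dt
    return {"x": current["x"], "y": current["y"], "vx": vx, "vy": vy}

state = estimate_motion_state({"x": 12.0, "y": 0.0}, {"x": 10.0, "y": 0.0}, dt=0.1)
print(state["vx"])  # roughly 20 m/s along x
```

The resulting state dict is what would then be written back into the target database for use at the next time step.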
Optionally, the n sensors are arranged around the vehicle, and n is greater than or equal to 14.
Optionally, the n sensors include: 6 millimeter-wave radar sensors, 6 laser radar sensors and 2 vision sensors;
the 2 vision sensors are arranged on the front windshield of the vehicle, the 6 millimeter-wave radar sensors are evenly distributed around the vehicle, and the 6 laser radar sensors are evenly distributed around the vehicle.
Optionally, performing fusion processing on the at least n measurement data for obstacles includes:
establishing an observation model for each of the n sensors, the observation model being used to obtain, from the sensor's measurement data for an obstacle at the current time, the prediction data for that obstacle at the next moment, the next moment differing from the current time by t, where t is greater than 0;
obtaining, according to the observation models, the prediction data of each of the n sensors for the at least one obstacle at the current time;
screening a fusible measurement data set from the at least n measurement data, where the error between each measurement data in the fusible measurement data set and its corresponding prediction data is less than a preset threshold;
performing fusion processing on the measurement data in the fusible measurement data set.
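The screening step above gates each measurement by its error against the model's prediction. A minimal sketch, assuming (as the embodiment later suggests) that the comparison is done on position and that the data layout below is hypothetical:

```python
def screen_fusible(measurements, predictions, threshold):
    """Keep measurements whose error against the matching prediction is
    below the preset threshold; the rest are excluded from fusion.

    measurements / predictions: dicts mapping obstacle id -> (x, y).
    """
    fusible = {}
    for obstacle_id, (mx, my) in measurements.items():
        if obstacle_id not in predictions:
            continue  # no prediction to compare against
        px, py = predictions[obstacle_id]
        # Euclidean position error between measurement and prediction.
        error = ((mx - px) ** 2 + (my - py) ** 2) ** 0.5
        if error < threshold:
            fusible[obstacle_id] = (mx, my)
    return fusible

meas = {"A": (10.0, 5.0), "B": (30.0, 0.0)}
pred = {"A": (10.2, 5.1), "B": (25.0, 0.0)}
print(screen_fusible(meas, pred, threshold=1.0))  # only "A" passes the gate
```

Only the entries that pass this gate would then be handed to the fusion algorithm proper.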
In another aspect, a target obstacle determination device is provided. The device includes:
a first acquisition unit, configured to obtain at least n measurement data for obstacles collected at the current time by n sensors mounted on a vehicle, where n is greater than or equal to 1;
a processing unit, configured to perform fusion processing on the at least n measurement data to obtain a fused data group, the fused data group recording the fused data corresponding to each obstacle detected by the n sensors;
a first determining unit, configured to determine the distance between the vehicle and each obstacle recorded in the fused data group;
a second determining unit, configured to determine the obstacles whose distance is less than a preset distance threshold as target obstacles.
Optionally, the device further includes:
a second acquisition unit, configured to obtain the history fused data of the target obstacle, the history fused data being fused data stored in a preset target database before the current time;
a third determining unit, configured to determine the motion state of the target obstacle according to the fused data of the target obstacle in the fused data group and the history fused data;
a storage unit, configured to store the motion state of the target obstacle into the target database.
Optionally, the n sensors are arranged around the vehicle, and n is greater than or equal to 14.
Optionally, the n sensors include: 6 millimeter-wave radar sensors, 6 laser radar sensors and 2 vision sensors;
the 2 vision sensors are arranged on the front windshield of the vehicle, the 6 millimeter-wave radar sensors are evenly distributed around the vehicle, and the 6 laser radar sensors are evenly distributed around the vehicle.
Optionally, the processing unit is further configured to:
establish an observation model for each of the n sensors, the observation model being used to obtain, from the sensor's measurement data for an obstacle at the current time, the prediction data for that obstacle at the next moment, the next moment differing from the current time by t, where t is greater than 0;
obtain, according to the observation models, the prediction data of each of the n sensors for the at least one obstacle at the current time;
screen a fusible measurement data set from the at least n measurement data, where the error between each measurement data in the fusible measurement data set and its corresponding prediction data is less than a preset threshold;
perform fusion processing on the measurement data in the fusible measurement data set.
The beneficial effects brought by the technical schemes provided in the embodiments of the present invention are as follows:
With the target obstacle determination method and device provided in the embodiments of the present invention, a vehicle can obtain at least n measurement data for obstacles collected at the current time by n sensors mounted on the vehicle, and perform fusion processing on the at least n measurement data to obtain a fused data group. The vehicle can then determine the distance between the vehicle and each obstacle recorded in the fused data group, and determine the obstacles whose distance is less than a preset distance threshold as target obstacles; for an obstacle whose distance is greater than the preset distance threshold, the vehicle determines that it will not affect the traveling of the vehicle. This improves the reliability of the vehicle's detection of the surrounding road environment.
Brief description of the drawings
To describe the technical schemes in the embodiments of the present invention more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Apparently, the accompanying drawings in the following description are only some embodiments of the present invention, and a person of ordinary skill in the art can derive other drawings from these accompanying drawings without creative effort.
Fig. 1 is a flowchart of a target obstacle determination method provided in an embodiment of the present invention;
Fig. 2-1 is a flowchart of another target obstacle determination method provided in an embodiment of the present invention;
Fig. 2-2 is a schematic diagram of sensor mounting positions provided in an embodiment of the present invention;
Fig. 2-3 is a flowchart of a method for performing fusion processing on the measurement data of obstacles provided in an embodiment of the present invention;
Fig. 3-1 is a schematic structural diagram of a target obstacle determination device provided in an embodiment of the present invention;
Fig. 3-2 is a schematic structural diagram of another target obstacle determination device provided in an embodiment of the present invention.
Embodiment
To make the object, technical solutions and advantages of the present invention clearer, below in conjunction with accompanying drawing to embodiment party of the present invention
Formula is described in further detail.
An embodiment of the present invention provides a target obstacle determination method. Referring to Fig. 1, the method includes:
Step 101: obtain at least n measurement data for obstacles collected at the current time by n sensors mounted on a vehicle, where n is greater than or equal to 1.
Step 102: perform fusion processing on the at least n measurement data to obtain a fused data group, the fused data group recording the fused data corresponding to each obstacle detected by the n sensors.
Step 103: determine the distance between the vehicle and each obstacle recorded in the fused data group.
Step 104: determine the obstacles whose distance is less than a preset distance threshold as target obstacles.
In summary, with the target obstacle determination method provided in this embodiment of the present invention, a vehicle can obtain at least n measurement data for obstacles collected at the current time by n sensors mounted on the vehicle, and perform fusion processing on the at least n measurement data to obtain a fused data group. The vehicle can then determine the distance between the vehicle and each obstacle recorded in the fused data group, and determine the obstacles whose distance is less than a preset distance threshold as target obstacles; for an obstacle whose distance is greater than the preset distance threshold, the vehicle determines that it will not affect the traveling of the vehicle. This improves the reliability of the vehicle's detection of the surrounding road environment.
Optionally, after the obstacles are determined as target obstacles, the method further includes:
obtaining the history fused data of the target obstacle, the history fused data being fused data stored in a preset target database before the current time;
determining the motion state of the target obstacle according to the fused data of the target obstacle in the fused data group and the history fused data;
storing the motion state of the target obstacle into the target database.
Optionally, the n sensors are arranged around the vehicle, and n is greater than or equal to 14.
Optionally, the n sensors include: 6 millimeter-wave radar sensors, 6 laser radar sensors and 2 vision sensors;
the 2 vision sensors are arranged on the front windshield of the vehicle, the 6 millimeter-wave radar sensors are evenly distributed around the vehicle, and the 6 laser radar sensors are evenly distributed around the vehicle.
Optionally, performing fusion processing on the at least n measurement data for obstacles includes:
establishing an observation model for each of the n sensors, the observation model being used to obtain, from the sensor's measurement data for an obstacle at the current time, the prediction data for that obstacle at the next moment, the next moment differing from the current time by t, where t is greater than 0;
obtaining, according to the observation models, the prediction data of each of the n sensors for the at least one obstacle at the current time;
screening a fusible measurement data set from the at least n measurement data, where the error between each measurement data in the fusible measurement data set and its corresponding prediction data is less than a preset threshold;
performing fusion processing on the measurement data in the fusible measurement data set.
In summary, with the target obstacle determination method provided in this embodiment of the present invention, a vehicle can obtain at least n measurement data for obstacles collected at the current time by n sensors mounted on the vehicle, and perform fusion processing on the at least n measurement data to obtain a fused data group. The vehicle can then determine the distance between the vehicle and each obstacle recorded in the fused data group, and determine the obstacles whose distance is less than a preset distance threshold as target obstacles; for an obstacle whose distance is greater than the preset distance threshold, the vehicle determines that it will not affect the traveling of the vehicle. This improves the reliability of the vehicle's detection of the surrounding road environment.
An embodiment of the present invention provides another target obstacle determination method. As shown in Fig. 2-1, the method includes:
Step 201: obtain at least n measurement data for obstacles collected at the current time by n sensors mounted on a vehicle, where n is greater than or equal to 1.
In this embodiment of the present invention, the n sensors can be arranged around the vehicle, and n is greater than or equal to 14. For example, the n sensors can include: 6 millimeter-wave radar sensors, 6 laser radar sensors and 2 vision sensors. The 2 vision sensors are arranged on the front windshield of the vehicle, the 6 millimeter-wave radar sensors are evenly distributed around the vehicle, and the 6 laser radar sensors are evenly distributed around the vehicle. Fig. 2-2 is a schematic diagram of sensor mounting positions provided in this embodiment of the present invention. As shown in Fig. 2-2, the 2 vision sensors 01 and 02 can be symmetrically arranged on the front windshield 00 of the vehicle. The 6 millimeter-wave radar sensors can be evenly distributed around the vehicle: millimeter-wave radar sensors 11 and 12 can be arranged at the front lights of the vehicle, millimeter-wave radar sensor 13 can be arranged below the front license plate of the vehicle, millimeter-wave radar sensor 14 can be arranged at the left rear door of the vehicle, and the other two millimeter-wave radar sensors can be arranged at the right rear door of the vehicle and at the rear license plate of the vehicle (not marked in the figure). The 6 laser radar sensors can be evenly distributed around the vehicle: laser radar sensors 21 and 22 can be arranged at the front lights of the vehicle, laser radar sensor 23 can be arranged on the front windshield 00 of the vehicle, laser radar sensor 24 can be arranged at the left rear door of the vehicle, and the other two laser radar sensors can be arranged at the right rear door of the vehicle and at the rear license plate of the vehicle (not marked in the figure).
In practical applications, laser radar sensors and millimeter-wave radar sensors are usually arranged in pairs. The detection range of the millimeter-wave radar sensors arranged at the front of the vehicle can be 150 meters (m), and the detection range of the millimeter-wave radar sensors arranged at the rear sides of the vehicle can be 50 m. The detection range of the laser radar sensors is generally 60 m. The positioning distance of the vision sensors is 80 m. The multiple sensors arranged around the vehicle can achieve 360-degree omnidirectional detection of targets around the vehicle, improving the vehicle's ability to detect the surrounding road environment. The number of sensors mounted on the vehicle can be increased or decreased according to the actual situation of the vehicle, and the position of each sensor can also be configured according to the actual situation; this embodiment of the present invention does not limit this.
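The layout and detection ranges above can be captured as a simple configuration table. The numeric values come from the text; the structure and names below are illustrative, not part of the patent:

```python
# Detection ranges quoted in the text, in meters; counts follow the
# 14-sensor example layout. Keys and structure are illustrative only.
SENSOR_CONFIG = {
    "mmwave_front": {"count": 3, "range_m": 150.0},  # front lights + front plate
    "mmwave_rear":  {"count": 3, "range_m": 50.0},   # rear doors + rear plate
    "lidar":        {"count": 6, "range_m": 60.0},
    "vision":       {"count": 2, "range_m": 80.0},
}

total_sensors = sum(cfg["count"] for cfg in SENSOR_CONFIG.values())
print(total_sensors)  # 14 sensors in this 360-degree example layout
```

A real system would extend each entry with a mounting pose so that measurements can be transformed into the common vehicle frame during spatio-temporal alignment.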
In this embodiment of the present invention, the vehicle can obtain at least n measurement data for obstacles collected at the current time by the n sensors. When the sensor is a millimeter-wave radar sensor, the measurement data for obstacles obtained by the vehicle from the millimeter-wave radar sensor at the current time t can be expressed as:
Z_R(t) = {r_1, r_2, ..., r_p}
where r_i = [(x_i, y_i), (ẋ_i, ẏ_i)]^T, i = 1, ..., p; p represents the number of obstacles detected by the millimeter-wave radar sensor at the current time; T denotes matrix transposition; r_i is the measurement data of the millimeter-wave radar sensor for the i-th obstacle at the current time, in which (x_i, y_i) is the coordinate of the i-th obstacle in the coordinate system of the millimeter-wave radar sensor and (ẋ_i, ẏ_i) is the speed of the i-th obstacle at the current time.
When the sensor is a laser radar sensor, the measurement data for obstacles obtained by the vehicle from the laser radar sensor at the current time t can be expressed as:
Z_L(t) = {l_1, l_2, ..., l_q}
where l_i = [(x_i, y_i), φ_i, (ẋ_i, ẏ_i), (w_i, l_i)]^T, i = 1, ..., q; q represents the number of obstacles detected by the laser radar sensor at the current time; l_i is the measurement data of the laser radar sensor for the i-th obstacle at the current time, in which (x_i, y_i) is the coordinate of the i-th obstacle in the coordinate system of the laser radar sensor, φ_i is the yaw angle of the i-th obstacle in the coordinate system of the laser radar sensor, (ẋ_i, ẏ_i) is the speed of the i-th obstacle at the current time, and (w_i, l_i) are the width and length of the i-th obstacle.
When the sensor is a vision sensor, the measurement data for obstacles obtained by the vehicle from the vision sensor at the current time t can be expressed as:
Z_C(t) = {c_1, c_2, ..., c_o}
where c_i = [(x_i1, y_i1), (x_i2, y_i2), class]^T, i = 1, ..., o; o represents the number of obstacles detected by the vision sensor at the current time; c_i is the measurement data of the vision sensor for the i-th obstacle at the current time, in which (x_i1, y_i1) is the coordinate of the upper-left corner of the bounding box of the i-th obstacle detected by the vision sensor, (x_i2, y_i2) is the coordinate of the lower-right corner of the bounding box of the i-th obstacle detected by the vision sensor, and class is the type of the i-th obstacle determined by the vision sensor. The type of an obstacle can include: pedestrian, bicycle, vehicle, etc. When determining the type of an obstacle, the vision sensor can compare the image of the detected obstacle with the type templates stored in a database, and determine the type with the highest matching degree as the type of the obstacle.
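The three measurement formats above can be written down as plain data structures. This is a direct transcription of the tuples r_i, l_i and c_i, with hypothetical class and field names:

```python
from dataclasses import dataclass

@dataclass
class RadarMeasurement:   # r_i: position and speed in radar coordinates
    x: float
    y: float
    vx: float
    vy: float

@dataclass
class LidarMeasurement:   # l_i: adds yaw angle and width/length
    x: float
    y: float
    yaw: float
    vx: float
    vy: float
    width: float
    length: float

@dataclass
class VisionMeasurement:  # c_i: bounding-box corners and object type
    x1: float             # upper-left corner
    y1: float
    x2: float             # lower-right corner
    y2: float
    obj_class: str        # e.g. "pedestrian", "bicycle", "vehicle"

m = VisionMeasurement(100.0, 50.0, 180.0, 210.0, "pedestrian")
print(m.x2 - m.x1)  # bounding-box width in pixels: 80.0
```

Typed records like these make the later alignment and association steps explicit about which fields each sensor actually contributes.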
Step 202: perform fusion processing on the at least n measurement data for obstacles to obtain a fused data group.
The fused data group records the fused data corresponding to each obstacle detected by the n sensors. Depending on the kinds of sensors mounted on the vehicle, the fused data can include different types of information. For example, assuming the sensors mounted on the vehicle include vision sensors, laser radar sensors and millimeter-wave radar sensors, the fused data corresponding to each obstacle recorded in the fused data group can include the position information, speed information, width and height, and type of the obstacle, etc.
Fig. 2-3 is a flowchart of a method for performing fusion processing on the measurement data of obstacles provided in this embodiment of the present invention. As shown in Fig. 2-3, the method includes:
Step 2021: establish an observation model for each of the n sensors.
The observation model is used to obtain, from the sensor's measurement data for an obstacle at the current time, the prediction data for that obstacle at the next moment. The next moment differs from the current time by t, where t is greater than 0. In this embodiment of the present invention, the vehicle can establish an observation model for each sensor according to its kind. For example, for a millimeter-wave radar sensor, the observation model established by the vehicle can be:
r(t+1) = [(x(t), y(t)), (ẋ(t), ẏ(t))]^T + v_r(t)
where r(t+1) represents the prediction data of the millimeter-wave radar sensor for an obstacle at time t+1, (x(t), y(t)) is the coordinate of the obstacle at time t in the coordinate system of the millimeter-wave radar sensor, (ẋ(t), ẏ(t)) is the speed of the obstacle at time t, and v_r(t) represents the measurement noise of the millimeter-wave radar sensor at time t.
For a laser radar sensor, the observation model established by the vehicle can be:
l(t+1) = [(x(t), y(t)), φ(t), v(t)cos(φ(t)), v(t)sin(φ(t)), w(t), l(t)]^T + v_l(t)
where l(t+1) represents the prediction data of the laser radar sensor for an obstacle at time t+1, (x(t), y(t)) is the coordinate of the obstacle at time t in the coordinate system of the laser radar sensor, φ(t) is the yaw angle of the obstacle in the coordinate system of the laser radar sensor, v(t) is the speed of the obstacle at time t, w(t) and l(t) are the width and length of the obstacle at time t, and v_l(t) represents the measurement noise of the laser radar sensor at time t.
For a vision sensor, the observation model established by the vehicle can be:
c(t+1) = [f_p·w(t)/d, f_p·l(t)/d]^T + v_c(t)
where x_2(t+1) − x_1(t+1) = f_p·w(t)/d represents the width of the predicted obstacle at time t+1 in the image, and y_2(t+1) − y_1(t+1) = f_p·l(t)/d represents the length of the predicted obstacle at time t+1 in the image; w(t) and l(t) are the width and length of the obstacle at time t, where w(t) and l(t) can be measurement data of the obstacle obtained by the vision sensor directly from the laser radar sensor, or the vision sensor's own measurement data for the obstacle at time t, i.e., w(t) = x_2(t) − x_1(t), l(t) = y_2(t) − y_1(t); f_p is the focal length of the vision sensor; d is the distance between the obstacle and the vehicle; and v_c(t) represents the measurement noise of the vision sensor at time t. The vision sensor can calculate the distance d between the obstacle and the vehicle from its measurement data for the obstacle, i.e., the coordinates of the obstacle, and the vision sensor can also obtain the measurement data of the millimeter-wave radar sensor or the laser radar sensor for the obstacle, thereby obtaining a more accurate distance d between the obstacle and the vehicle.
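To make the role of an observation model concrete, here is a sketch of the radar prediction step under a constant-velocity assumption, with the noise term omitted; this is an illustration, not the patent's exact equations:

```python
def radar_predict(measurement, dt):
    """Predict the next radar observation from the current one, assuming
    the obstacle moves at constant velocity over dt.

    measurement: (x, y, vx, vy) in the radar coordinate system.
    """
    x, y, vx, vy = measurement
    # Propagate position by velocity; velocity is assumed unchanged.
    return (x + vx * dt, y + vy * dt, vx, vy)

# An obstacle at (10 m, 2 m) moving at 5 m/s along x, predicted 0.1 s ahead:
print(radar_predict((10.0, 2.0, 5.0, 0.0), dt=0.1))  # (10.5, 2.0, 5.0, 0.0)
```

The prediction produced this way is what each incoming measurement is later compared against in the screening step.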
In addition to establishing an observation model for each sensor, the vehicle can also establish a motion model for each obstacle. The motion model is used to characterize the motion state of each obstacle at the current time. For an obstacle detected by a millimeter-wave radar sensor, the motion model established by the vehicle for the obstacle can be:
r(t) = [(x(t), y(t)), (ẋ(t), ẏ(t)), (ẍ(t), ÿ(t))]^T
where (x(t), y(t)), (ẋ(t), ẏ(t)) and (ẍ(t), ÿ(t)) respectively represent the position information, velocity information and acceleration information of the obstacle at the current time.
For an obstacle detected by a laser radar sensor or a vision sensor, the motion model established by the vehicle for the obstacle can be:
l(t) = [(x(t), y(t)), φ(t), v(t), φ′(t), a(t), w(t), l(t), h(t)]^T
where (x(t), y(t)) is the center coordinate of the obstacle; φ(t), v(t), φ′(t) and a(t) are respectively the yaw angle of the obstacle in the sensor coordinate system, the moving speed of the obstacle, the yaw rate, and the acceleration; and w(t), l(t) and h(t) are the width, length and height of the obstacle.
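A motion state like l(t) above can be held in a small record and advanced in time. The update rule below (constant yaw rate and acceleration over dt) is an illustrative kinematic assumption, not one the patent prescribes:

```python
import math

def advance_motion_state(state, dt):
    """Advance a lidar/vision-style motion state by dt, assuming constant
    yaw rate and acceleration over the interval (illustrative only)."""
    x, y, yaw, v, yaw_rate, accel = state
    new_yaw = yaw + yaw_rate * dt   # heading changes at the yaw rate
    new_v = v + accel * dt          # speed changes at the acceleration
    return (x + v * math.cos(yaw) * dt,
            y + v * math.sin(yaw) * dt,
            new_yaw, new_v, yaw_rate, accel)

# Straight-line case: heading 0 rad, 10 m/s, no turn, no acceleration.
print(advance_motion_state((0.0, 0.0, 0.0, 10.0, 0.0, 0.0), dt=0.5))
```

Here the state omits the static extent fields (w, l, h), since they do not change under the motion update.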
Step 2022: according to the observation models, obtain the prediction data of each of the n sensors for the at least one obstacle at the current time.
In this embodiment of the present invention, after the vehicle obtains the at least n measurement data for obstacles collected by the n sensors at the current time, the vehicle can calculate, from each sensor's measurement data for each obstacle at the previous moment and through the observation model of that sensor, the prediction data of the sensor for each obstacle at the current time, thereby obtaining the prediction data of the n sensors for the at least one obstacle at the current time.
Step 2023: screen a fusable measurement data set from the at least n measurement data for the obstructing objects, wherein the error between each measurement data in the fusable measurement data set and the corresponding prediction data is less than a preset threshold.
In the embodiment of the present invention, for each of the n sensors, the vehicle can calculate the error between each measurement data of the sensor for an obstructing object and the corresponding prediction data, and judge whether the error is less than the preset threshold. If the error is less than the preset threshold, the vehicle can determine that the measurement data and the prediction data of the sensor for the obstructing object match, and determine the measurement data as fusable measurement data. For example, suppose the measurement data of the laser radar sensor for obstructing object A detected at the current time t is lA(t), and the prediction data for obstructing object A at the current time, obtained from the observation model of the laser radar sensor, is l'A(t). The vehicle can then calculate the error between the measurement data lA(t) and the prediction data l'A(t) of the laser radar sensor for obstructing object A, and determine the measurement data lA(t) as fusable measurement data when the error is less than the preset threshold. In practical applications, when comparing a sensor's measurement data and prediction data for each obstructing object, the vehicle usually compares the position information in the measurement data with that in the prediction data; when the two position information match, the measurement data can be determined as fusable measurement data, or the prediction data can also be determined as fusable measurement data.
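The screening in step 2023 amounts to a gating test: keep a measurement only when its error against the matching prediction is under the preset threshold. A sketch under the assumption of a Euclidean position-error metric (the container shapes and names are illustrative, not from the patent):

```python
def screen_fusable(measurements, predictions, threshold):
    """Keep only measurements whose position error with respect to the
    matching prediction is below the preset threshold.
    measurements/predictions: {(sensor, object_id): (x, y)}."""
    fusable = []
    for key, z in measurements.items():
        pred = predictions.get(key)
        if pred is None:
            continue  # no prediction to gate against
        err = ((z[0] - pred[0]) ** 2 + (z[1] - pred[1]) ** 2) ** 0.5
        if err < threshold:
            fusable.append((key, z))
    return fusable
```

With a 2 m threshold, a lidar return 0.7 m from its prediction survives the gate while a radar return 10 m away is dropped.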
Step 2024: perform fusion processing on the fusable measurement data in the fusable measurement data set.
In the embodiment of the present invention, performing fusion processing on the fusable measurement data in the fusable measurement data set may include: first, transforming the measurement data of each of the n sensors for the obstructing objects into the vehicle body coordinate system, and synchronizing the measurement data of each sensor to the same time; then associating the spatio-temporally aligned measurement data of the n sensors, obtaining from the measurement data of the n sensors for the obstructing objects the measurement data for the same obstructing object; and finally merging, by a certain algorithm, the measurement data detected by the different sensors for the same obstructing object, so as to obtain the fused data of each obstructing object. For the detailed process of performing fusion processing on measurement data, reference may be made to the related art, and details are not described here in the embodiment of the present invention.
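The three stages of step 2024 (frame transform, association, merge) can be sketched as follows. A plain translation and a simple average stand in for the unspecified extrinsic calibration and the "certain algorithm"; both are assumptions for illustration only:

```python
def to_body_frame(point, sensor_offset):
    # Spatial alignment: translate a sensor-frame position into the
    # vehicle body frame (a real mounting would also need a rotation).
    return (point[0] + sensor_offset[0], point[1] + sensor_offset[1])

def fuse(positions):
    # Merge the associated measurements of one obstructing object;
    # a simple average stands in for the unspecified merging algorithm.
    n = len(positions)
    return (sum(p[0] for p in positions) / n,
            sum(p[1] for p in positions) / n)
```

Two associated returns at (29, 40) and (31, 40) would fuse to the single body-frame position (30, 40).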
Step 203: determine the distance between each obstructing object recorded in the fused data group and the vehicle.
In the embodiment of the present invention, the vehicle can calculate, according to the position information of each obstructing object in the fused data corresponding to that obstructing object in the fused data group, the distance between the obstructing object and the vehicle at the current time. For example, suppose the fused data of obstructing object A recorded in the fused data group is {(xA,yA), vA, (wA,lA), pedestrian}. From the fused data it can be known that the coordinate of obstructing object A in the vehicle body coordinate system is (xA,yA), the moving speed of obstructing object A at the current time is vA, the width and length of obstructing object A are (wA,lA), and obstructing object A is a pedestrian. The vehicle can then calculate, according to the position information of obstructing object A in the fused data, i.e. the coordinate (xA,yA) of obstructing object A in the vehicle body coordinate system, the distance s between obstructing object A and the vehicle: s=√(xA²+yA²).
Step 204: determine an obstructing object whose distance is less than a preset distance threshold as a target obstructing object.
In the embodiment of the present invention, the preset distance threshold can be preset by the vehicle. When the distance between an obstructing object and the vehicle is less than the preset distance threshold, the vehicle determines that the obstructing object may affect the traveling of the vehicle, and determines the obstructing object as a target obstructing object. Afterwards, the vehicle can store the fused data of the obstructing object into a target database, and inform the driver of the data stored in the target database through a display device arranged in the vehicle cab or through a voice system, so that the driver can adjust the traveling state of the vehicle in time according to the fused data. When the distance between an obstructing object and the vehicle is greater than the preset distance threshold, the vehicle can determine that the obstructing object will not affect the traveling of the vehicle, and then does not store the fused data of the obstructing object into the target database, thereby improving the reliability of the detection result of the vehicle for the surrounding road environment. For example, suppose the preset distance threshold in the vehicle is 100 m, and the coordinate of obstructing object A in the vehicle body coordinate system is (xA,yA)=(30,40). The vehicle then calculates the distance s of obstructing object A from the vehicle: s=√(30²+40²)=50 m. Since the distance of obstructing object A from the vehicle, 50 m, is less than the preset distance threshold of 100 m, the vehicle can determine obstructing object A as a target obstructing object.
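Steps 203 and 204 together reduce to one range test per fused object. A minimal sketch reproducing the 50 m worked example (the function name is illustrative):

```python
import math

def is_target(coord_body, threshold_m=100.0):
    # Range s from the body-frame origin; an obstructing object inside
    # the preset distance threshold becomes a target obstructing object.
    s = math.hypot(coord_body[0], coord_body[1])
    return s < threshold_m, s
```

For the coordinate (30, 40) this returns a range of 50 m, below the 100 m threshold, so the object is promoted to a target obstructing object.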
Step 205: obtain the history fused data of the target obstructing object, the history fused data being the fused data stored in a preset target database before the current time.
In the embodiment of the present invention, after the vehicle determines a target obstructing object, it can also obtain the history fused data of the target obstructing object from the target database, i.e. the fused data of the target obstructing object before the current time. For example, suppose the history fused data of target obstructing object A obtained by the vehicle from the target database includes the fused data of obstructing object A at the previous time; the history fused data can be: {(40,40), 1 m/s, (0.5,1.7), pedestrian}.
Step 206: determine the motion state of the target obstructing object according to the fused data of the target obstructing object in the fused data group and the history fused data.
In the embodiment of the present invention, the vehicle can determine the motion state of the target obstructing object according to the fused data of the target obstructing object in the fused data group and the history fused data. The vehicle can judge, according to the position information in the fused data and the history fused data of the target obstructing object, whether the target obstructing object is in a moving state; if so, it can further judge whether the target obstructing object is moving away from the vehicle or approaching the vehicle. The vehicle can also judge, according to the velocity information in the fused data and the history fused data of the target obstructing object, whether the target obstructing object is accelerating or decelerating, etc.
For example, suppose the fused data of target obstructing object A in the fused data group is {(30,40), 1 m/s, (0.5,1.7), pedestrian}, and the history fused data of target obstructing object A is {(40,40), 3.3 m/s, (0.5,1.7), pedestrian}. The vehicle can then determine, from the change between the position information in the history fused data, (40,40), and the position information in the fused data, (30,40), that target obstructing object A is in a moving state. Further, according to the position information in the history fused data, (40,40), the vehicle can determine the distance between target obstructing object A and the vehicle at the previous time: s1=√(40²+40²)≈56.6 m; and according to the position information in the fused data, (30,40), the vehicle can determine the distance between target obstructing object A and the vehicle at the current time: s2=√(30²+40²)=50 m. The vehicle can calculate the difference between the distance s1 of target obstructing object A from the vehicle at the previous time and the distance s2 at the current time: Δs=s1-s2=6.6 m. The vehicle detects that its current traveling speed is 10 m/s, and the detection period of the sensors is 0.5 s, i.e. the difference between the previous time and the current time is 0.5 s, so the distance the vehicle moved forward is S=10 m/s × 0.5 s=5 m. Since the difference Δs=6.6 m between the distances of target obstructing object A from the vehicle at the previous time and at the current time is greater than the distance S=5 m moved by the vehicle, the vehicle can determine that the motion state of target obstructing object A is: approaching the vehicle, and can store the motion state of target obstructing object A, approaching the vehicle, into the target database and inform the driver.
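The comparison in the example above, the range change Δs against the ego advance S, can be written out directly. "Approaching" follows the patent's decision rule; every other case is conservatively labeled "not approaching", a simplification of the fuller classification the text describes:

```python
import math

def motion_state(prev_xy, cur_xy, ego_speed_mps, period_s):
    # Compare the range change with the distance the ego vehicle itself
    # covered during one sensor period, as in the worked example above.
    s1 = math.hypot(prev_xy[0], prev_xy[1])  # range at the previous time
    s2 = math.hypot(cur_xy[0], cur_xy[1])    # range at the current time
    ego_advance = ego_speed_mps * period_s   # S = v * T
    return "approaching" if s1 - s2 > ego_advance else "not approaching"
```

With the example values, Δs ≈ 6.6 m exceeds S = 5 m, so object A is classified as approaching.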
It should be noted that, in practical applications, the vehicle also needs to combine various parameters such as the traveling speed and traveling direction of the vehicle itself to judge the motion state of the target obstructing object more accurately.
Step 207: store the motion state of the target obstructing object into the target database.
The vehicle can store the motion state of the target obstructing object into the target database, and inform the driver of the motion state stored in the target database through a display device arranged in the vehicle cab or through a voice system, so that the driver can adjust the traveling state of the vehicle according to the fused data and the motion state of the target obstructing object. The vehicle can also judge its own traveling state according to the fused data and the motion state of the target obstructing object; when it judges that the traveling state of the vehicle is an emergency state, for example when the target obstructing object is close to the vehicle and the traveling speed of the vehicle is high, the vehicle can automatically intervene in the traveling state of the vehicle.
It should be noted that the vehicle can update the information stored in the target database in real time. When the distance between a target obstructing object and the vehicle becomes greater than the preset distance threshold, the vehicle can delete the fused data and the motion state of the target obstructing object from the target database, so as to avoid false alarms.
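The real-time add/evict behaviour of the target database can be sketched as a small keyed store. The class and field names are hypothetical; the patent does not specify a storage scheme:

```python
class TargetDatabase:
    """Minimal sketch of the real-time target store described above."""

    def __init__(self, threshold_m=100.0):
        self.threshold_m = threshold_m
        self.targets = {}  # object id -> (fused data, motion state)

    def update(self, obj_id, fused, state, distance_m):
        if distance_m < self.threshold_m:
            # Within range: keep the latest fused data and motion state.
            self.targets[obj_id] = (fused, state)
        else:
            # Out of range again: drop the entry to avoid a false alarm.
            self.targets.pop(obj_id, None)
```

An object is stored while inside the threshold and silently evicted once its range exceeds it.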
It should also be noted that the order of the steps of the target obstructing object determining method provided in the embodiment of the present invention can be appropriately adjusted, and steps can also be correspondingly added or removed according to circumstances. Any variation readily conceivable by a person skilled in the art within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention, and is therefore not described again.
In summary, in the target obstructing object determining method provided in the embodiment of the present invention, the vehicle can obtain at least n measurement data of the n sensors arranged on the vehicle for obstructing objects at the current time, and perform fusion processing on the at least n measurement data for the obstructing objects to obtain a fused data group. Afterwards, the vehicle can determine the distance between each obstructing object recorded in the fused data group and the vehicle, determine an obstructing object whose distance is less than the preset distance threshold as a target obstructing object, and for an obstructing object whose distance is greater than the preset distance threshold, determine that the obstructing object will not affect the traveling of the vehicle, thereby improving the reliability of the detection result of the vehicle for the surrounding road environment.
The embodiment of the present invention provides a target obstructing object determining device. The target obstructing object determining device can be a vehicle or a part of a vehicle. As shown in FIG. 3-1, the target obstructing object determining device 300 includes:
a first acquisition unit 301, configured to obtain at least n measurement data of the n sensors arranged on the vehicle for obstructing objects at the current time, n being greater than or equal to 1;
a processing unit 302, configured to perform fusion processing on the at least n measurement data for the obstructing objects to obtain a fused data group, the fused data group recording the fused data corresponding to each obstructing object among the obstructing objects detected by the n sensors;
a first determining unit 303, configured to determine the distance between each obstructing object recorded in the fused data group and the vehicle;
a second determining unit 304, configured to determine an obstructing object whose distance is less than a preset distance threshold as a target obstructing object.
In summary, with the target obstructing object determining device provided in the embodiment of the present invention, the vehicle can obtain at least n measurement data of the n sensors arranged on the vehicle for obstructing objects at the current time, and perform fusion processing on the at least n measurement data for the obstructing objects to obtain a fused data group. Afterwards, the vehicle can determine the distance between each obstructing object recorded in the fused data group and the vehicle, determine an obstructing object whose distance is less than the preset distance threshold as a target obstructing object, and for an obstructing object whose distance is greater than the preset distance threshold, determine that the obstructing object will not affect the traveling of the vehicle, thereby improving the reliability of the detection result of the vehicle for the surrounding road environment.
The embodiment of the present invention provides another target obstructing object determining device. As shown in FIG. 3-2, the target obstructing object determining device 300 includes:
a first acquisition unit 301, configured to obtain at least n measurement data of the n sensors arranged on the vehicle for obstructing objects at the current time, n being greater than or equal to 1;
a processing unit 302, configured to perform fusion processing on the at least n measurement data for the obstructing objects to obtain a fused data group, the fused data group recording the fused data corresponding to each obstructing object among the obstructing objects detected by the n sensors;
a first determining unit 303, configured to determine the distance between each obstructing object recorded in the fused data group and the vehicle;
a second determining unit 304, configured to determine an obstructing object whose distance is less than a preset distance threshold as a target obstructing object;
a second acquisition unit 305, configured to obtain the history fused data of the target obstructing object, the history fused data being the fused data stored in a preset target database before the current time;
a third determining unit 306, configured to determine the motion state of the target obstructing object according to the fused data of the target obstructing object in the fused data group and the history fused data;
a storage unit 307, configured to store the motion state of the target obstructing object into the target database.
Optionally, the n sensors are arranged around the vehicle, and n is greater than or equal to 14.
Optionally, the n sensors include: 6 millimeter-wave radar sensors, 6 laser radar sensors and 2 vision sensors;
the 2 vision sensors are arranged on the front windshield of the vehicle, the 6 millimeter-wave radar sensors are evenly distributed around the vehicle, and the 6 laser radar sensors are evenly distributed around the vehicle.
Optionally, the processing unit 302 is further configured to:
establish an observation model for each of the n sensors, the observation model being used to obtain, according to the measurement data of the sensor for an obstructing object at the current time, the prediction data for the obstructing object at the next time, the next time differing from the current time by t, t being greater than 0;
obtain, according to the observation models, the prediction data of each of the n sensors for the at least one obstructing object at the current time;
screen a fusable measurement data set from the at least n measurement data for the obstructing objects, wherein the error between each measurement data in the fusable measurement data set and the corresponding prediction data is less than a preset threshold;
perform fusion processing on the fusable measurement data in the fusable measurement data set.
In summary, with the target obstructing object determining device provided in the embodiment of the present invention, the vehicle can obtain at least n measurement data of the n sensors arranged on the vehicle for obstructing objects at the current time, and perform fusion processing on the at least n measurement data for the obstructing objects to obtain a fused data group. Afterwards, the vehicle can determine the distance between each obstructing object recorded in the fused data group and the vehicle, determine an obstructing object whose distance is less than the preset distance threshold as a target obstructing object, and for an obstructing object whose distance is greater than the preset distance threshold, determine that the obstructing object will not affect the traveling of the vehicle, thereby improving the reliability of the detection result of the vehicle for the surrounding road environment.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, for the specific working processes of the device and the units described above, reference may be made to the corresponding processes in the foregoing method embodiment, and details are not described here again.
The foregoing is only preferred embodiments of the present invention and is not intended to limit the present invention. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present invention shall be included in the protection scope of the present invention.
Claims (8)
1. A target obstructing object determining method, characterized in that the method comprises:
obtaining at least n measurement data of n sensors arranged on a vehicle for obstructing objects at a current time, n being greater than or equal to 1;
performing fusion processing on the at least n measurement data for the obstructing objects to obtain a fused data group, the fused data group recording fused data corresponding to each obstructing object among the obstructing objects detected by the n sensors;
determining a distance between each obstructing object recorded in the fused data group and the vehicle; and
determining an obstructing object whose distance is less than a preset distance threshold as a target obstructing object;
wherein the performing fusion processing on the at least n measurement data for the obstructing objects comprises:
establishing an observation model for each of the n sensors, the observation model being used to obtain, according to measurement data of the sensor for an obstructing object at the current time, prediction data for the obstructing object at a next time, the next time differing from the current time by t, t being greater than 0;
obtaining, according to the observation models, prediction data of each of the n sensors for the at least one obstructing object at the current time;
screening a fusable measurement data set from the at least n measurement data for the obstructing objects, wherein an error between each measurement data in the fusable measurement data set and corresponding prediction data is less than a preset threshold; and
performing fusion processing on the fusable measurement data in the fusable measurement data set.
2. The method according to claim 1, characterized in that after the determining the obstructing object as the target obstructing object, the method further comprises:
obtaining history fused data of the target obstructing object, the history fused data being fused data stored in a preset target database before the current time;
determining a motion state of the target obstructing object according to the fused data of the target obstructing object in the fused data group and the history fused data; and
storing the motion state of the target obstructing object into the target database.
3. The method according to claim 1, characterized in that the n sensors are arranged around the vehicle, and n is greater than or equal to 14.
4. The method according to any one of claims 1 to 3, characterized in that the n sensors comprise: 6 millimeter-wave radar sensors, 6 laser radar sensors and 2 vision sensors;
the 2 vision sensors are arranged on a front windshield of the vehicle, the 6 millimeter-wave radar sensors are evenly distributed around the vehicle, and the 6 laser radar sensors are evenly distributed around the vehicle.
5. A target obstructing object determining device, characterized in that the device comprises:
a first acquisition unit, configured to obtain at least n measurement data of n sensors arranged on a vehicle for obstructing objects at a current time, n being greater than or equal to 1;
a processing unit, configured to perform fusion processing on the at least n measurement data for the obstructing objects to obtain a fused data group, the fused data group recording fused data corresponding to each obstructing object among the obstructing objects detected by the n sensors;
a first determining unit, configured to determine a distance between each obstructing object recorded in the fused data group and the vehicle; and
a second determining unit, configured to determine an obstructing object whose distance is less than a preset distance threshold as a target obstructing object;
wherein the processing unit is specifically configured to:
establish an observation model for each of the n sensors, the observation model being used to obtain, according to measurement data of the sensor for an obstructing object at the current time, prediction data for the obstructing object at a next time, the next time differing from the current time by t, t being greater than 0;
obtain, according to the observation models, prediction data of each of the n sensors for the at least one obstructing object at the current time;
screen a fusable measurement data set from the at least n measurement data for the obstructing objects, wherein an error between each measurement data in the fusable measurement data set and corresponding prediction data is less than a preset threshold; and
perform fusion processing on the fusable measurement data in the fusable measurement data set.
6. The device according to claim 5, characterized in that the device further comprises:
a second acquisition unit, configured to obtain history fused data of the target obstructing object, the history fused data being fused data stored in a preset target database before the current time;
a third determining unit, configured to determine a motion state of the target obstructing object according to the fused data of the target obstructing object in the fused data group and the history fused data; and
a storage unit, configured to store the motion state of the target obstructing object into the target database.
7. The device according to claim 5, characterized in that the n sensors are arranged around the vehicle, and n is greater than or equal to 14.
8. The device according to any one of claims 5 to 7, characterized in that the n sensors comprise: 6 millimeter-wave radar sensors, 6 laser radar sensors and 2 vision sensors;
the 2 vision sensors are arranged on a front windshield of the vehicle, the 6 millimeter-wave radar sensors are evenly distributed around the vehicle, and the 6 laser radar sensors are evenly distributed around the vehicle.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510526498.XA CN105109484B (en) | 2015-08-21 | 2015-08-21 | Target disorders object determines method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510526498.XA CN105109484B (en) | 2015-08-21 | 2015-08-21 | Target disorders object determines method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105109484A CN105109484A (en) | 2015-12-02 |
CN105109484B true CN105109484B (en) | 2017-11-14 |
Family
ID=54657680
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510526498.XA Active CN105109484B (en) | 2015-08-21 | 2015-08-21 | Target disorders object determines method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105109484B (en) |
Families Citing this family (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6796653B2 (en) * | 2016-09-15 | 2020-12-09 | 株式会社小糸製作所 | Sensor system |
CN110023783B (en) * | 2016-12-06 | 2023-01-17 | 本田技研工业株式会社 | Vehicle surrounding information acquisition device and vehicle |
CN106585623B (en) * | 2016-12-21 | 2023-12-01 | 驭势科技(北京)有限公司 | Detection system for detecting objects around vehicle and application thereof |
CN106767852B (en) * | 2016-12-30 | 2019-10-11 | 东软集团股份有限公司 | A kind of method, apparatus and equipment generating detection target information |
DE102017203838B4 (en) * | 2017-03-08 | 2022-03-17 | Audi Ag | Process and system for detecting the surroundings |
CN107092252A (en) * | 2017-04-11 | 2017-08-25 | 杭州光珀智能科技有限公司 | A kind of robot automatic obstacle avoidance method and its device based on machine vision |
CN107330925B (en) * | 2017-05-11 | 2020-05-22 | 北京交通大学 | Multi-obstacle detection and tracking method based on laser radar depth image |
CN113466822A (en) * | 2017-07-04 | 2021-10-01 | 百度在线网络技术(北京)有限公司 | Method and apparatus for detecting obstacles |
CN107578427B (en) * | 2017-07-31 | 2021-05-18 | 深圳市易成自动驾驶技术有限公司 | Method and device for detecting dynamic obstacle and computer readable storage medium |
CN107918386B (en) * | 2017-10-25 | 2021-01-01 | 北京汽车集团有限公司 | Multi-sensor data fusion method and device for vehicle and vehicle |
CN108458745A (en) * | 2017-12-23 | 2018-08-28 | 天津国科嘉业医疗科技发展有限公司 | A kind of environment perception method based on intelligent detection equipment |
CN108458746A (en) * | 2017-12-23 | 2018-08-28 | 天津国科嘉业医疗科技发展有限公司 | One kind being based on sensor method for self-adaption amalgamation |
GB201803433D0 (en) * | 2018-03-02 | 2018-04-18 | Secr Defence | Dual polarised antenna |
CN108646739A (en) * | 2018-05-14 | 2018-10-12 | 北京智行者科技有限公司 | A kind of sensor information fusion method |
CN111316126A (en) * | 2018-12-28 | 2020-06-19 | 深圳市大疆创新科技有限公司 | Target detection method, radar, vehicle, and computer-readable storage medium |
CN109649408B (en) * | 2019-01-21 | 2020-09-25 | 厦门理工学院 | Vehicle peripheral obstacle detection method and vehicle |
DE102019102923B4 (en) * | 2019-02-06 | 2022-12-01 | Bayerische Motoren Werke Aktiengesellschaft | Method and device for sensor data fusion for a vehicle |
CN110018470A (en) * | 2019-03-01 | 2019-07-16 | 北京纵目安驰智能科技有限公司 | Based on example mask method, model, terminal and the storage medium merged before multisensor |
CN109885056A (en) * | 2019-03-07 | 2019-06-14 | 格陆博科技有限公司 | A kind of more scene selection methods merged based on monocular cam and millimetre-wave radar |
CN111923898B (en) * | 2019-05-13 | 2022-05-06 | 广州汽车集团股份有限公司 | Obstacle detection method and device |
CN110597269B (en) * | 2019-09-30 | 2023-06-02 | 潍柴动力股份有限公司 | Autonomous obstacle avoidance method and autonomous obstacle avoidance system for vehicle |
CN110703732B (en) * | 2019-10-21 | 2021-04-13 | 北京百度网讯科技有限公司 | Correlation detection method, device, equipment and computer readable storage medium |
CN110884489B (en) * | 2019-12-18 | 2021-07-06 | 常州市武进悦达电声器材有限公司 | New energy automobile is with chassis obstacle detection early warning device all around |
CN111231982B (en) * | 2020-01-08 | 2021-05-04 | 中国第一汽车股份有限公司 | Obstacle identification method and device for intelligent driving, vehicle and storage medium |
CN111273279B (en) * | 2020-02-18 | 2022-05-10 | 中国科学院合肥物质科学研究院 | Multi-radar data processing method based on acceleration noise parameters |
CN112924960B (en) * | 2021-01-29 | 2023-07-18 | 重庆长安汽车股份有限公司 | Target size real-time detection method, system, vehicle and storage medium |
CN113325826B (en) * | 2021-06-08 | 2022-08-30 | 矿冶科技集团有限公司 | Underground vehicle control method and device, electronic equipment and storage medium |
CN113753076B (en) * | 2021-08-06 | 2023-04-28 | 北京百度网讯科技有限公司 | Method and device for judging effective obstacle, electronic equipment and automatic driving vehicle |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102385055A (en) * | 2010-08-03 | 2012-03-21 | 株式会社电装 | Vehicle-use obstacle detection apparatus |
CN104477167A (en) * | 2014-11-26 | 2015-04-01 | 浙江大学 | Intelligent driving system and control method thereof |
CN104798124A (en) * | 2012-11-21 | 2015-07-22 | 丰田自动车株式会社 | Driving-assistance device and driving-assistance method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8060306B2 (en) * | 2007-10-04 | 2011-11-15 | Deere & Company | Method and system for obstacle avoidance for a vehicle |
-
2015
- 2015-08-21 CN CN201510526498.XA patent/CN105109484B/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102385055A (en) * | 2010-08-03 | 2012-03-21 | 株式会社电装 | Vehicle-use obstacle detection apparatus |
CN104798124A (en) * | 2012-11-21 | 2015-07-22 | 丰田自动车株式会社 | Driving-assistance device and driving-assistance method |
CN104477167A (en) * | 2014-11-26 | 2015-04-01 | 浙江大学 | Intelligent driving system and control method thereof |
Also Published As
Publication number | Publication date |
---|---|
CN105109484A (en) | 2015-12-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105109484B (en) | Target disorders object determines method and device | |
Liu et al. | Robust target recognition and tracking of self-driving cars with radar and camera information fusion under severe weather conditions | |
US10809364B2 (en) | Determining relative velocity using co-located pixels | |
CN106842188B (en) | A kind of object detection fusing device and method based on multisensor | |
CN106808482B (en) | A kind of crusing robot multisensor syste and method for inspecting | |
CN110239535A (en) | A kind of bend active collision avoidance control method based on Multi-sensor Fusion | |
CN1963867A (en) | Monitoring apparatus | |
CN105404844A (en) | Road boundary detection method based on multi-line laser radar | |
CN107797118A (en) | Trailer identification and blind area adjustment based on camera | |
CN109064495A (en) | A kind of bridge floor vehicle space time information acquisition methods based on Faster R-CNN and video technique | |
CN103448724A (en) | Lane departure early warning method and device | |
CN102288121A (en) | Method for measuring and pre-warning lane departure distance based on monocular vision | |
CN206734295U (en) | Detection system for detecting vehicle targets and application thereof |
CN105930787A (en) | Vehicle door opening early-warning method | |
CN110378202A (en) | Fisheye-camera-based omnidirectional pedestrian collision early-warning method |
CN109212377B (en) | High-voltage line obstacle identification method and device and inspection robot | |
CN109747643A (en) | Information fusion method for an intelligent vehicle perception system |
CN104464305A (en) | Intelligent vehicle wrong-way driving detection device and method |
CN105160356A (en) | Method and system for fusing sensor data of vehicle active safety system | |
Binelli et al. | A modular tracking system for far infrared pedestrian recognition | |
WO2023274177A1 (en) | Map construction method and apparatus, device, warehousing system, and storage medium | |
CN116310679A (en) | Multi-sensor fusion target detection method, system, medium, equipment and terminal | |
CN115985122A (en) | Unmanned system sensing method | |
CN110422173A (en) | Environment recognition method |
CN106408593A (en) | Video-based vehicle tracking method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right |
Effective date of registration: 2022-02-18
Address after: 241006 Anshan South Road, Wuhu Economic and Technological Development Zone, Anhui Province
Patentee after: Wuhu Sambalion auto technology Co.,Ltd.
Address before: 8 Changchun Road, Wuhu Economic and Technological Development Zone, Wuhu, Anhui, 241006
Patentee before: CHERY AUTOMOBILE Co.,Ltd.