CN105109484A - Target-barrier determining method and device - Google Patents

Target-barrier determining method and device Download PDF

Info

Publication number
CN105109484A
CN105109484A (application number CN201510526498.XA)
Authority
CN
China
Prior art keywords
data
obstacle
vehicle
sensor
measurement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510526498.XA
Other languages
Chinese (zh)
Other versions
CN105109484B (en)
Inventor
谷明琴
张绍勇
杜金枝
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhu Lion Automotive Technologies Co Ltd
Original Assignee
Chery Automobile Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chery Automobile Co Ltd filed Critical Chery Automobile Co Ltd
Priority to CN201510526498.XA priority Critical patent/CN105109484B/en
Publication of CN105109484A publication Critical patent/CN105109484A/en
Application granted granted Critical
Publication of CN105109484B publication Critical patent/CN105109484B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00: Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08: Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095: Predicting travel path or likelihood of collision
    • B60W30/0956: Predicting travel path or likelihood of collision, the prediction being responsive to traffic or environmental parameters
    • B60W40/00: Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub-unit, e.g. by using mathematical models
    • B60W40/02: Estimation or calculation of such parameters related to ambient conditions
    • B60W2554/00: Input parameters relating to objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a target obstacle determination method and device, belonging to the field of intelligent transportation. The method comprises the following steps: obtaining at least n measurement data for obstacles at the current moment from n sensors arranged on a vehicle, where n is greater than or equal to 1; performing fusion processing on the at least n measurement data for obstacles to obtain a fused data set, where the fused data set records the fused data corresponding to each obstacle detected by the sensors; determining the distance between each obstacle recorded in the fused data set and the vehicle; and determining any obstacle whose distance to the vehicle is smaller than a preset distance threshold as a target obstacle. Because the target obstacle is determined according to the distance between the obstacle and the vehicle, the reliability of the vehicle's detection of the surrounding road environment is improved.

Description

Target obstacle determination method and device
Technical field
The present invention relates to the field of intelligent transportation, and in particular to a target obstacle determination method and device.
Background technology
In order to detect the road environment around a vehicle, a vehicle is generally provided with multiple sensors such as vision sensors, millimeter-wave radar sensors and laser radar (lidar) sensors. The vehicle can obtain the measurement data collected by these sensors and perform fusion processing on it, that is, combine the sensors' measurement data in a complementary and optimal way to generate more reliable and more accurate information, thereby improving the vehicle's ability to detect the surrounding road environment.
In the related art, the vehicle performs fusion processing on the data collected by the multiple sensors roughly as follows: first, obtain the measurement data of the target obstacles detected by each sensor; then perform spatio-temporal alignment on this measurement data, transforming each sensor's measurement data into the same coordinate system and synchronizing it to the same moment; next, associate the spatio-temporally aligned measurement data of the sensors to obtain the measurement data belonging to the same target; finally, an information fusion device merges, by some algorithm, the measurement data that different sensors produced for the same target and outputs the result to the vehicle.
However, in the related art, when the sensors detect many target obstacles, the information fusion device in the vehicle needs to associate and fuse the measurement data of every target obstacle and output the fused information of all of them to the vehicle, even though some of these target obstacles may have no effect on the vehicle. The reliability of the vehicle's detection of the surrounding road environment is therefore low.
Summary of the invention
In order to solve the above problem in the prior art, the invention provides a target obstacle determination method and device. The technical solution is as follows:
In one aspect, a target obstacle determination method is provided, the method comprising:
obtaining at least n measurement data for obstacles at the current moment from n sensors arranged on a vehicle, where n is greater than or equal to 1;
performing fusion processing on the at least n measurement data for obstacles to obtain a fused data set, where the fused data set records the fused data corresponding to each obstacle detected by the n sensors;
determining the distance between each obstacle recorded in the fused data set and the vehicle; and
determining an obstacle whose distance is smaller than a preset distance threshold as a target obstacle.
Optionally, after the obstacle is determined as a target obstacle, the method further comprises:
obtaining historical fused data of the target obstacle, where the historical fused data is fused data stored in a target database before the current moment;
determining the motion state of the target obstacle according to the fused data of the target obstacle in the fused data set and the historical fused data; and
storing the motion state of the target obstacle in the target database.
Optionally, the n sensors are arranged around the vehicle, and n is greater than or equal to 14.
Optionally, the n sensors comprise 6 millimeter-wave radar sensors, 6 lidar sensors and 2 vision sensors;
the 2 vision sensors are arranged on the front windshield of the vehicle, the 6 millimeter-wave radar sensors are evenly distributed around the vehicle, and the 6 lidar sensors are evenly distributed around the vehicle.
Optionally, the performing fusion processing on the at least n measurement data for obstacles comprises:
establishing an observation model for each of the n sensors, where the observation model is used to obtain, from a sensor's measurement data for an obstacle at the current moment, predicted data for the obstacle at the next moment, the next moment differing from the current moment by t, t being greater than 0;
obtaining, according to the observation model, the predicted data of each of the n sensors for the at least one obstacle at the current moment;
screening a fusable measurement set out of the at least n measurement data for obstacles, where the error between each measurement datum in the fusable measurement set and its corresponding predicted data is smaller than a predetermined threshold; and
performing fusion processing on the measurement data in the fusable measurement set.
In another aspect, a target obstacle determination device is provided, the device comprising:
a first obtaining unit, configured to obtain at least n measurement data for obstacles at the current moment from n sensors arranged on a vehicle, where n is greater than or equal to 1;
a processing unit, configured to perform fusion processing on the at least n measurement data for obstacles to obtain a fused data set, where the fused data set records the fused data corresponding to each obstacle detected by the n sensors;
a first determining unit, configured to determine the distance between each obstacle recorded in the fused data set and the vehicle; and
a second determining unit, configured to determine an obstacle whose distance is smaller than a preset distance threshold as a target obstacle.
Optionally, the device further comprises:
a second obtaining unit, configured to obtain historical fused data of the target obstacle, where the historical fused data is fused data stored in a target database before the current moment;
a third determining unit, configured to determine the motion state of the target obstacle according to the fused data of the target obstacle in the fused data set and the historical fused data; and
a storage unit, configured to store the motion state of the target obstacle in the target database.
Optionally, the n sensors are arranged around the vehicle, and n is greater than or equal to 14.
Optionally, the n sensors comprise 6 millimeter-wave radar sensors, 6 lidar sensors and 2 vision sensors;
the 2 vision sensors are arranged on the front windshield of the vehicle, the 6 millimeter-wave radar sensors are evenly distributed around the vehicle, and the 6 lidar sensors are evenly distributed around the vehicle.
Optionally, the processing unit is further configured to:
establish an observation model for each of the n sensors, where the observation model is used to obtain, from a sensor's measurement data for an obstacle at the current moment, predicted data for the obstacle at the next moment, the next moment differing from the current moment by t, t being greater than 0;
obtain, according to the observation model, the predicted data of each of the n sensors for the at least one obstacle at the current moment;
screen a fusable measurement set out of the at least n measurement data for obstacles, where the error between each measurement datum in the fusable measurement set and its corresponding predicted data is smaller than a predetermined threshold; and
perform fusion processing on the measurement data in the fusable measurement set.
The technical solution provided by the embodiments of the present invention brings the following beneficial effects:
With the target obstacle determination method and device provided by the embodiments of the present invention, the vehicle can obtain at least n measurement data for obstacles at the current moment from the n sensors arranged on the vehicle and perform fusion processing on them to obtain a fused data set. The vehicle can then determine the distance between each obstacle recorded in the fused data set and the vehicle, determine obstacles whose distance is smaller than a preset distance threshold as target obstacles, and determine that obstacles whose distance is greater than the preset distance threshold will not affect the travel of the vehicle. The reliability of the vehicle's detection of the surrounding road environment is thereby improved.
Brief description of the drawings
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a target obstacle determination method provided by an embodiment of the present invention;
Fig. 2-1 is a flowchart of another target obstacle determination method provided by an embodiment of the present invention;
Fig. 2-2 is a schematic diagram of sensor mounting positions provided by an embodiment of the present invention;
Fig. 2-3 is a flowchart of a method for performing fusion processing on measurement data of obstacles provided by an embodiment of the present invention;
Fig. 3-1 is a schematic structural diagram of a target obstacle determination device provided by an embodiment of the present invention;
Fig. 3-2 is a schematic structural diagram of another target obstacle determination device provided by an embodiment of the present invention.
Detailed description of the invention
To make the objects, technical solutions and advantages of the present invention clearer, the embodiments of the present invention are described in further detail below with reference to the drawings.
An embodiment of the present invention provides a target obstacle determination method. Referring to Fig. 1, the method comprises:
Step 101: obtain at least n measurement data for obstacles at the current moment from n sensors arranged on a vehicle, where n is greater than or equal to 1.
Step 102: perform fusion processing on the at least n measurement data for obstacles to obtain a fused data set, where the fused data set records the fused data corresponding to each obstacle detected by the n sensors.
Step 103: determine the distance between each obstacle recorded in the fused data set and the vehicle.
Step 104: determine an obstacle whose distance is smaller than a preset distance threshold as a target obstacle.
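Steps 101 to 104 can be sketched in code as follows. This is a minimal illustration only: the averaging "fusion", the tuple-based obstacle representation, and the 5 m threshold are assumptions for the example, not values fixed by the patent.

```python
# Illustrative sketch of steps 101-104: gather per-obstacle measurements,
# fuse them, then keep only obstacles closer than a preset distance threshold.
import math

DIST_THRESHOLD = 5.0  # preset distance threshold, metres (assumed value)

def fuse_measurements(measurements):
    """Toy fusion: average the (x, y) readings that refer to one obstacle."""
    fused = []
    for group in measurements:  # each group: several sensors' readings of one obstacle
        xs = [m[0] for m in group]
        ys = [m[1] for m in group]
        fused.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return fused

def target_obstacles(measurements):
    """Steps 102-104: fuse, compute distance to the vehicle origin, threshold."""
    fused = fuse_measurements(measurements)
    return [p for p in fused if math.hypot(p[0], p[1]) < DIST_THRESHOLD]

# One obstacle seen by two sensors, another far away seen by one sensor:
targets = target_obstacles([[(1.0, 2.0), (1.2, 1.8)], [(40.0, 30.0)]])
print(targets)  # only the nearby obstacle survives the threshold
```

The far obstacle is dropped before any further processing, which is the point of the method: downstream components only ever see obstacles that can actually affect the vehicle.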
In summary, with the target obstacle determination method provided by this embodiment of the present invention, the target obstacle is determined according to the distance between each fused obstacle and the vehicle, and obstacles whose distance exceeds the preset distance threshold are determined not to affect the travel of the vehicle. The reliability of the vehicle's detection of the surrounding road environment is thereby improved.
Optionally, after the obstacle is determined as a target obstacle, the method further comprises:
obtaining historical fused data of the target obstacle, where the historical fused data is fused data stored in the target database before the current moment;
determining the motion state of the target obstacle according to the fused data of the target obstacle in the fused data set and the historical fused data; and
storing the motion state of the target obstacle in the target database.
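The motion-state update described in these optional steps can be sketched as follows. The dictionary-backed "target database", the obstacle identifier, and the fixed cycle time dt are illustrative assumptions; the patent only specifies that the motion state is derived from the current fused data plus the stored historical fused data.

```python
# Hedged sketch: estimating a target obstacle's motion state from its current
# fused position and the previous fused position stored in the target database.

def motion_state(db, obstacle_id, fused_xy, dt=0.1):
    """Return (x, y, vx, vy); velocity is a finite difference vs. history."""
    prev = db.get(obstacle_id)
    if prev is None:
        vx = vy = 0.0                      # no history yet: treat as at rest
    else:
        vx = (fused_xy[0] - prev[0]) / dt  # backward difference over one cycle
        vy = (fused_xy[1] - prev[1]) / dt
    db[obstacle_id] = fused_xy             # store for the next cycle (step 3)
    return (fused_xy[0], fused_xy[1], vx, vy)

db = {}
motion_state(db, "obs1", (10.0, 0.0))          # first sighting, velocity 0
state = motion_state(db, "obs1", (10.5, 0.2))  # one 0.1 s cycle later
print(state)  # (10.5, 0.2, 5.0, 2.0)
```

A real implementation would likely smooth this estimate (e.g. with a filter) rather than use a raw two-point difference, but the data flow, current fused data in, motion state out and stored, matches the optional steps above.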
Optionally, the n sensors are arranged around the vehicle, and n is greater than or equal to 14.
Optionally, the n sensors comprise 6 millimeter-wave radar sensors, 6 lidar sensors and 2 vision sensors;
the 2 vision sensors are arranged on the front windshield of the vehicle, the 6 millimeter-wave radar sensors are evenly distributed around the vehicle, and the 6 lidar sensors are evenly distributed around the vehicle.
Optionally, performing fusion processing on the at least n measurement data for obstacles comprises:
establishing an observation model for each of the n sensors, where the observation model is used to obtain, from a sensor's measurement data for an obstacle at the current moment, predicted data for the obstacle at the next moment, the next moment differing from the current moment by t, t being greater than 0;
obtaining, according to the observation model, the predicted data of each of the n sensors for the at least one obstacle at the current moment;
screening a fusable measurement set out of the at least n measurement data for obstacles, where the error between each measurement datum in the fusable measurement set and its corresponding predicted data is smaller than a predetermined threshold; and
performing fusion processing on the measurement data in the fusable measurement set.
In summary, with the target obstacle determination method provided by this embodiment of the present invention, the target obstacle is determined according to the distance between each fused obstacle and the vehicle, and obstacles whose distance exceeds the preset distance threshold are determined not to affect the travel of the vehicle. The reliability of the vehicle's detection of the surrounding road environment is thereby improved.
An embodiment of the present invention provides another target obstacle determination method. As shown in Fig. 2-1, the method comprises:
Step 201: obtain at least n measurement data for obstacles at the current moment from n sensors arranged on a vehicle, where n is greater than or equal to 1.
In this embodiment of the present invention, the n sensors may be arranged around the vehicle, with n greater than or equal to 14. For example, the n sensors may comprise 6 millimeter-wave radar sensors, 6 lidar sensors and 2 vision sensors, with the 2 vision sensors arranged on the front windshield of the vehicle and the 6 millimeter-wave radar sensors and 6 lidar sensors evenly distributed around the vehicle. Fig. 2-2 is a schematic diagram of sensor mounting positions provided by an embodiment of the present invention. As shown in Fig. 2-2, the 2 vision sensors 01 and 02 may be symmetrically arranged on the front windshield 00 of the vehicle. Of the 6 millimeter-wave radar sensors, sensors 11 and 12 may be arranged at the front headlights, sensor 13 below the front license plate, sensor 14 at the left rear door, and the remaining two at the right rear door and at the rear license plate (not marked in the figure). Of the 6 lidar sensors, sensors 21 and 22 may be arranged at the front headlights, sensor 23 on the front windshield 00, sensor 24 at the left rear door, and the remaining two at the right rear door and at the rear license plate (not marked in the figure).
In practical applications, lidar sensors and millimeter-wave radar sensors are generally arranged in pairs. The detection range of a millimeter-wave radar sensor mounted at the front of the vehicle may be 150 meters (m), while that of one mounted at the side rear of the vehicle may be 50 m; the detection range of a lidar sensor is generally 60 m, and that of a vision sensor is 80 m. The multiple sensors arranged around the vehicle achieve 360-degree omnidirectional detection of targets around the vehicle, improving the vehicle's ability to detect the surrounding road environment. The number of sensors arranged on the vehicle may be increased or decreased according to the actual situation of the vehicle, and the position of each sensor may likewise be set according to the actual situation; this embodiment of the present invention does not limit this.
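The sensor suite and detection ranges described above can be captured in a small configuration table. This is a hypothetical rendering for illustration: the dict layout and key names are assumptions, while the counts and ranges come from the text.

```python
# Hypothetical configuration mirroring the sensor suite described above:
# 2 vision sensors, 6 millimetre-wave radars (3 front-facing, 3 rear/side),
# and 6 lidars, with the detection ranges quoted in the text.
SENSOR_SUITE = {
    "vision":       {"count": 2, "range_m": 80,  "mount": "front windshield"},
    "mmwave_front": {"count": 3, "range_m": 150, "mount": "headlights, front plate"},
    "mmwave_rear":  {"count": 3, "range_m": 50,  "mount": "rear doors, rear plate"},
    "lidar":        {"count": 6, "range_m": 60,  "mount": "distributed around vehicle"},
}

total = sum(s["count"] for s in SENSOR_SUITE.values())
print(total)  # 14 sensors in all, matching n >= 14
```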
In this embodiment of the present invention, the vehicle can obtain at least n measurement data for obstacles from the n sensors at the current moment. When a sensor is a millimeter-wave radar sensor, the measurement data for obstacles that the vehicle obtains from it at the current moment t can be expressed as:
Z_R(t) = {r_1, r_2, ..., r_p}
where r_i = [x_i, y_i, vx_i, vy_i]^T, i = 1, ..., p; p is the number of obstacles detected by the millimeter-wave radar sensor at the current moment; T denotes matrix transposition; r_i is the measurement datum of the millimeter-wave radar sensor for the i-th obstacle at the current moment, in which (x_i, y_i) is the coordinate of the i-th obstacle in the coordinate system of the millimeter-wave radar sensor and (vx_i, vy_i) is the velocity of the i-th obstacle at the current moment.
When a sensor is a lidar sensor, the measurement data for obstacles that the vehicle obtains from it at the current moment t can be expressed as:
Z_L(t) = {l_1, l_2, ..., l_q}
where l_i = [x_i, y_i, φ_i, v_i, w_i, l_i]^T, i = 1, ..., q; q is the number of obstacles detected by the lidar sensor at the current moment; l_i is the measurement datum of the lidar sensor for the i-th obstacle at the current moment, in which (x_i, y_i) is the coordinate of the i-th obstacle in the coordinate system of the lidar sensor, φ_i is its yaw angle in that coordinate system, v_i is its speed at the current moment, and (w_i, l_i) are its width and length.
When a sensor is a vision sensor, the measurement data for obstacles that the vehicle obtains from it at the current moment t can be expressed as:
Z_C(t) = {c_1, c_2, ..., c_o}
where c_i = [(x_i1, y_i1), (x_i2, y_i2), class]^T, i = 1, ..., o; o is the number of obstacles detected by the vision sensor at the current moment; c_i is the measurement datum of the vision sensor for the i-th obstacle at the current moment, in which (x_i1, y_i1) is the coordinate of the upper-left corner of the bounding box of the i-th obstacle detected by the vision sensor, (x_i2, y_i2) is the coordinate of the lower-right corner of that bounding box, and class is the type of the i-th obstacle determined by the vision sensor, where the types of obstacles may include pedestrian, bicycle, vehicle and so on. When determining the type of an obstacle, the vision sensor may compare the image of the detected obstacle with the template types stored in a database and take the type with the highest matching degree as the type of the obstacle.
Step 202: perform fusion processing on the at least n measurement data for obstacles to obtain a fused data set.
The fused data set records the fused data corresponding to each obstacle detected by the n sensors. Depending on the kinds of sensors arranged on the vehicle, the fused data can comprise different types of information. For example, if the sensors arranged on the vehicle comprise vision sensors, lidar sensors and millimeter-wave radar sensors, the fused data corresponding to each obstacle recorded in the fused data set can comprise the position information and velocity information of the obstacle, its width and height, its type, and so on.
Fig. 2-3 is a flowchart of a method, provided by an embodiment of the present invention, for performing fusion processing on measurement data of obstacles. As shown in Fig. 2-3, the method comprises:
Step 2021: establish an observation model for each of the n sensors.
The observation model is used to obtain, from a sensor's measurement data for an obstacle at the current moment, predicted data for the obstacle at the next moment; the next moment differs from the current moment by t, where t is greater than 0. In this embodiment of the present invention, the vehicle can establish an observation model for each sensor according to the sensor's kind. For example, for a millimeter-wave radar sensor, the observation model established by the vehicle can be:
r(t+1) = [x(t), y(t), vx(t), vy(t)]^T + v_R(t)
where r(t+1) denotes the predicted data of the millimeter-wave radar sensor for the obstacle at moment t+1, (x(t), y(t)) is the coordinate of the obstacle at moment t in the coordinate system of the millimeter-wave radar sensor, (vx(t), vy(t)) is the velocity of the obstacle at moment t, and v_R(t) denotes the measurement noise of the millimeter-wave radar sensor at moment t.
For a lidar sensor, the observation model established by the vehicle can be:
l(t+1) = [(x(t), y(t)), φ(t), v(t)cos(φ(t)), v(t)sin(φ(t)), w(t), l(t)]^T + v_l(t)
where l(t+1) denotes the predicted data of the lidar sensor for the obstacle at moment t+1, (x(t), y(t)) is the coordinate of the obstacle at moment t in the coordinate system of the lidar sensor, φ(t) is the yaw angle of the obstacle in the coordinate system of the lidar sensor, w(t) and l(t) are the width and length of the obstacle at moment t, and v_l(t) denotes the measurement noise of the lidar sensor at moment t.
For a vision sensor, the observation model established by the vehicle can be:
[x_2(t+1) - x_1(t+1)]   [w(t)·f_p / d]
[y_2(t+1) - y_1(t+1)] = [l(t)·f_p / d] + v_c(t)
where x_2(t+1) - x_1(t+1) denotes the width of the obstacle at moment t+1 predicted by the vision sensor, y_2(t+1) - y_1(t+1) denotes its predicted length at moment t+1, and w(t) and l(t) are the width and length of the obstacle at moment t. Here w(t) and l(t) may be measurement data of the obstacle that the vision sensor obtains directly from the lidar sensor, or may be the vision sensor's own measurement data for the obstacle at moment t, i.e. w(t) = x_2(t) - x_1(t) and l(t) = y_2(t) - y_1(t). f_p is the focal length of the vision sensor, d is the distance between the obstacle and the vehicle, and v_c(t) denotes the measurement noise of the vision sensor at moment t. The vision sensor can calculate the distance d between the obstacle and the vehicle from the coordinates of the obstacle in its measurement data, or it can obtain the measurement data of the millimeter-wave radar sensor or the lidar sensor for the obstacle and thereby obtain a more accurate distance d.
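The vision observation model above is a pinhole projection: an obstacle of physical width w and length l at distance d projects to a box of roughly w·f_p/d by l·f_p/d pixels. A minimal sketch, with the vehicle dimensions, focal length and distance chosen purely for illustration:

```python
# Sketch of the vision observation model: predicted bounding-box size from
# physical size, focal length f_p (in pixels) and distance d, noise ignored.

def predict_box(width_m, length_m, f_p, d_m):
    """Predicted bounding box (width_px, height_px), omitting the noise v_c(t)."""
    return (width_m * f_p / d_m, length_m * f_p / d_m)

# A 1.8 m-wide, 4.5 m-long vehicle at 30 m with an 800-pixel focal length:
print(predict_box(1.8, 4.5, 800.0, 30.0))  # (48.0, 120.0)
```

This also shows why an externally supplied distance d (from radar or lidar, as the text notes) matters: the predicted box size is inversely proportional to d, so an error in d propagates directly into the prediction.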
Besides establishing an observation model for each sensor, the vehicle can also establish a kinematic model for each obstacle; the kinematic model characterizes the motion state of the obstacle at the current moment. For an obstacle detected by the millimeter-wave radar sensor, the kinematic model established by the vehicle can be:
r(t) = [(x(t), y(t)), (vx(t), vy(t)), (ax(t), ay(t))]^T
where (x(t), y(t)), (vx(t), vy(t)) and (ax(t), ay(t)) denote the position information, velocity information and acceleration information, respectively, of the obstacle at the current moment.
For an obstacle detected by the lidar sensor or the vision sensor, the kinematic model established by the vehicle can be:
l(t) = [(x(t), y(t)), φ(t), v(t), φ'(t), a(t), w(t), l(t), h(t)]^T
where (x(t), y(t)) is the center coordinate of the obstacle; φ(t), v(t), φ'(t) and a(t) are respectively the yaw angle of the obstacle in the sensor coordinate system, its moving speed, its yaw rate and its acceleration; and w(t), l(t) and h(t) are the width, length and height of the obstacle.
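The purpose of such a kinematic model is to propagate the obstacle's state one cycle forward. A hedged sketch, using the radar-style state [x, y, vx, vy, ax, ay] and a constant-acceleration update; the flat-tuple layout and the update law are assumptions, as the patent does not spell out the propagation equations:

```python
# Constant-acceleration propagation of a radar-style kinematic state
# over one cycle of duration t.

def predict_state(state, t):
    x, y, vx, vy, ax, ay = state
    return (
        x + vx * t + 0.5 * ax * t * t,   # position update
        y + vy * t + 0.5 * ay * t * t,
        vx + ax * t,                     # velocity update
        vy + ay * t,
        ax, ay,                          # acceleration held constant
    )

s = predict_state((0.0, 0.0, 10.0, 0.0, 2.0, 0.0), 1.0)
print(s)  # (11.0, 0.0, 12.0, 0.0, 2.0, 0.0)
```

This predicted state is exactly what step 2023 below compares each new measurement against.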
Step 2022: obtain, according to the observation model, the predicted data of each of the n sensors for the at least one obstacle at the current moment.
In this embodiment of the present invention, after the vehicle obtains the at least n measurement data for obstacles from the n sensors at the current moment, the vehicle can calculate, from each sensor's measurement data for each obstacle at the previous moment and via that sensor's observation model, the sensor's predicted data for each obstacle at the current moment, and thereby obtain the predicted data of the n sensors for the at least one obstacle at the current moment.
Step 2023: from the at least n pieces of measurement data for obstacles, screen out a fusible measurement data set, in which the error between each piece of measurement data and the corresponding predicted data is less than a preset threshold.
In the embodiments of the present invention, for each of the n sensors, the vehicle can calculate the error between the sensor's measurement data and predicted data for each obstacle, and judge whether the error is less than the preset threshold. If the error is less than the preset threshold, the vehicle determines that the sensor's measurement data and predicted data for the obstacle match, and marks the measurement data as fusible. For example, suppose the laser radar sensor's measurement data for obstacle A detected at the current time t is l_A(t), and the predicted data for obstacle A at the current time obtained from the laser radar sensor's observation model is l'_A(t); the vehicle then calculates the error between the measurement data l_A(t) and the predicted data l'_A(t), and when the error is less than the preset threshold determines l_A(t) to be fusible. In practice, when comparing a sensor's measurement data against its predicted data for an obstacle, the vehicle usually compares the position information in the two; when the position information in the measurement data matches that in the predicted data, the measurement data can be determined to be fusible, or alternatively the predicted data can be determined to be fusible.
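A minimal sketch of this screening step, assuming the threshold test is applied to the position information as the text describes (the threshold value and function name are hypothetical):

```python
import math

def is_fusible(measured_xy, predicted_xy, threshold=1.0):
    """Gate a measurement: it is fusible when it lies within
    `threshold` meters of the sensor's prediction (positions
    compared, as in the text)."""
    dx = measured_xy[0] - predicted_xy[0]
    dy = measured_xy[1] - predicted_xy[1]
    return math.hypot(dx, dy) < threshold

# A measurement 0.5 m from its prediction passes a 1 m gate; 3 m does not.
print(is_fusible((30.0, 40.0), (30.3, 40.4)))  # True
print(is_fusible((30.0, 40.0), (33.0, 40.0)))  # False
```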
Step 2024: perform fusion processing on the measurement data in the fusible measurement data set.
In the embodiments of the present invention, the vehicle's fusion processing of the measurement data in the fusible measurement data set can comprise: first, transforming each of the n sensors' obstacle measurement data into the vehicle body coordinate system, and synchronizing the measurement data of all the sensors to the same moment; then, after this spatio-temporal alignment of the n sensors' measurement data, associating the pieces of measurement data that belong to the same obstacle; and finally, fusing the measurement data that different sensors detected for the same obstacle by a certain algorithm, thereby obtaining the fused data of each obstacle. The detailed process of fusing the measurement data can be found in the related art and is not repeated in the embodiments of the present invention.
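The association-and-fusion step could be sketched as follows. The patent leaves the fusion algorithm unspecified ("a certain algorithm"), so plain averaging of associated positions stands in for it here, and the gating radius is a hypothetical parameter; the inputs are assumed to already be in the body frame and time-synchronized:

```python
def fuse_measurements(measurements, radius=1.0):
    """Greedily group (x, y) measurements that lie within `radius`
    meters of a group's first member, then fuse each group by
    averaging its positions (a stand-in for the real algorithm)."""
    groups = []
    for x, y in measurements:
        for g in groups:
            gx, gy = g[0]
            if (x - gx) ** 2 + (y - gy) ** 2 < radius ** 2:
                g.append((x, y))
                break
        else:  # no existing group is close enough: start a new one
            groups.append([(x, y)])
    fused = []
    for g in groups:
        fx = sum(p[0] for p in g) / len(g)
        fy = sum(p[1] for p in g) / len(g)
        fused.append((fx, fy))
    return fused

# Two sensors see the same obstacle near (30, 40); a third sees another.
fused = fuse_measurements([(30.0, 40.0), (30.2, 40.2), (80.0, 10.0)])
```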
Step 203: determine the distance between the vehicle and each obstacle recorded in the fused data group.
In the embodiments of the present invention, the vehicle can calculate the distance between each obstacle and the vehicle at the current time from the position information in the obstacle's fused data in the fused data group. For example, suppose the fused data of obstacle A recorded in the fused data group is {(x_A, y_A), v_A, (w_A, l_A), pedestrian}. From this fused data it is known that the coordinate of obstacle A in the vehicle body coordinate system is (x_A, y_A), the current moving speed of obstacle A is v_A, the width and length of obstacle A are (w_A, l_A), and obstacle A is a pedestrian. The vehicle can then use the position information of obstacle A in the fused data, i.e. its coordinate (x_A, y_A) in the vehicle body coordinate system, to calculate the distance s from obstacle A to the vehicle as: s = √(x_A² + y_A²).
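The distance formula above can be computed directly; a one-line sketch (the function name is illustrative):

```python
import math

def distance_to_vehicle(x, y):
    """Distance from an obstacle at body-frame coordinate (x, y)
    to the vehicle origin: s = sqrt(x^2 + y^2)."""
    return math.hypot(x, y)

print(distance_to_vehicle(30.0, 40.0))  # 50.0
```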
Step 204: determine the obstacles whose distance is less than a preset distance threshold to be target obstacles.
In the embodiments of the present invention, the preset distance threshold can be set in advance for the vehicle. When the distance between an obstacle and the vehicle is less than this threshold, the vehicle determines that the obstacle may affect the travel of the vehicle and determines the obstacle to be a target obstacle. Afterwards, the vehicle can store the fused data of the obstacle in the target database, and convey the data stored in the target database to the driver through a display device arranged in the vehicle cabin or through a voice system, so that the driver can adjust the driving state of the vehicle in time according to the fused data. When the distance between an obstacle and the vehicle is greater than the threshold, the vehicle can determine that the obstacle will not affect the travel of the vehicle, and need not store its fused data in the target database, thereby improving the reliability of the vehicle's detection of the surrounding road environment. For example, suppose the preset distance threshold in the vehicle is 100 m and the coordinate of obstacle A in the vehicle body coordinate system is (x_A, y_A) = (30, 40); the vehicle then calculates the distance from obstacle A to the vehicle as s = √(x_A² + y_A²) = √(30² + 40²) = 50 (m). Since the 50 m distance from obstacle A to the vehicle is less than the 100 m threshold, the vehicle determines obstacle A to be a target obstacle.
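The threshold screening of step 204 might look like this sketch (the obstacle ids and the dict layout are hypothetical):

```python
import math

def select_targets(fused, threshold=100.0):
    """Keep only the obstacles closer than `threshold` meters;
    `fused` maps an obstacle id to its body-frame (x, y)."""
    return {
        oid: (x, y)
        for oid, (x, y) in fused.items()
        if math.hypot(x, y) < threshold
    }

# A is 50 m away (kept); B is 150 m away (dropped).
targets = select_targets({"A": (30.0, 40.0), "B": (120.0, 90.0)})
print(sorted(targets))  # ['A']
```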
Step 205: obtain the history fused data of the target obstacle, the history fused data being the fused data stored in a preset target database before the current time.
In the embodiments of the present invention, after the vehicle determines a target obstacle, it can also obtain the history fused data of the target obstacle from the target database, i.e. the fused data of the target obstacle before the current time. For example, suppose the history fused data of target obstacle A that the vehicle obtains from the target database comprises the fused data of obstacle A at the previous moment; this history fused data can be: {(40, 40), 1 m/s, (0.5, 1.7), pedestrian}.
Step 206: determine the motion state of the target obstacle according to the fused data of the target obstacle in the fused data group and the history fused data.
In the embodiments of the present invention, the vehicle can determine the motion state of the target obstacle according to the fused data of the target obstacle in the fused data group and its history fused data. From the position information in the fused data and the history fused data, the vehicle can judge whether the target obstacle is in motion and, if it is, further judge whether the target obstacle is moving away from or toward the vehicle; from the velocity information in the fused data and the history fused data, the vehicle can also judge whether the target obstacle is accelerating or decelerating, and so on.
For example, suppose the fused data of target obstacle A in the fused data group is: {(30, 40), 1 m/s, (0.5, 1.7), pedestrian}, and the history fused data of target obstacle A is: {(40, 40), 3.3 m/s, (0.5, 1.7), pedestrian}. Since the position information in the fused data, (30, 40), differs from the position information in the history fused data, (40, 40), the vehicle determines that target obstacle A is in motion. Further, from the position information in the history fused data, (40, 40), the vehicle can determine the distance s1 between target obstacle A and the vehicle at the previous moment, and from the position information in the fused data, (30, 40), the distance s2 between target obstacle A and the vehicle at the current time. The difference between the two distances is Δs = s1 - s2 = 6.6 m. The vehicle detects that its own current moving speed is 10 m/s and that the sensor detection cycle is 0.5 s, i.e. the interval between the previous moment and the current time is 0.5 s, so the distance the vehicle has moved forward is S = 10 m/s × 0.5 s = 5 m. Since Δs = 6.6 m is greater than the distance S = 5 m travelled by the vehicle, the vehicle can determine that the motion state of target obstacle A is: approaching the vehicle, and can store this motion state of target obstacle A, "approaching the vehicle", in the target database and inform the driver.
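The comparison in the example above can be sketched as a rough heuristic (the function name and the two-way "approaching"/"receding" split are illustrative assumptions; as the next paragraph notes, a real system must also consider travel direction and other parameters):

```python
import math

def motion_state(prev_xy, curr_xy, ego_speed, cycle):
    """Classify a target obstacle as approaching or receding by
    comparing the drop in its distance against the distance the
    ego vehicle travelled during one detection cycle."""
    s1 = math.hypot(*prev_xy)       # distance at the previous moment
    s2 = math.hypot(*curr_xy)       # distance at the current time
    ego_travel = ego_speed * cycle  # S = v * t
    return "approaching" if (s1 - s2) > ego_travel else "receding"

# Obstacle A moves (40, 40) -> (30, 40); ego at 10 m/s, 0.5 s cycle.
print(motion_state((40.0, 40.0), (30.0, 40.0), 10.0, 0.5))  # approaching
```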
It should be noted that, in practice, the vehicle also needs to combine parameters such as its own moving speed and direction of travel to judge the motion state of the target obstacle more accurately.
Step 207: store the motion state of the target obstacle in the target database.
The vehicle can store the motion state of the target obstacle in the target database, and convey the motion states stored in the target database to the driver through a display device arranged in the vehicle cabin or through a voice system, so that the driver can adjust the driving state of the vehicle according to the fused data and motion state of the target obstacle. The vehicle can also judge its own driving state from the fused data and motion state of the target obstacle; when it judges the driving state to be an emergency, for example when the target obstacle is close to the vehicle and the vehicle's moving speed is high, the vehicle can automatically intervene in the driving state of the vehicle.
It should be noted that the vehicle can update the information stored in the target database in real time: once the distance between a target obstacle and the vehicle becomes greater than the preset distance threshold, the vehicle can delete the fused data and motion state of the obstacle from the target database, avoiding false alarms.
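The real-time pruning described here might be sketched as follows (the database-as-dict layout is a hypothetical simplification):

```python
import math

def prune_targets(db, threshold=100.0):
    """Remove from the target database (a dict keyed by obstacle id)
    every entry whose stored position is farther than `threshold`
    meters from the vehicle, avoiding stale false alarms."""
    for oid in list(db):  # copy the keys so we can delete while iterating
        x, y = db[oid]["position"]
        if math.hypot(x, y) > threshold:
            del db[oid]
    return db

db = {"A": {"position": (30.0, 40.0), "state": "approaching"},
      "B": {"position": (120.0, 90.0), "state": "receding"}}
print(sorted(prune_targets(db)))  # ['A']
```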
It should also be noted that the order of the steps of the target obstacle determination method provided by the embodiments of the present invention can be adjusted appropriately, and steps can be added or removed as appropriate. Any variation of the method that a person familiar with the art can readily conceive within the technical scope disclosed by the present invention shall be covered by the protection scope of the present invention, and is therefore not described further.
In summary, with the target obstacle determination method provided by the embodiments of the present invention, the vehicle can obtain at least n pieces of measurement data for obstacles at the current time from the n sensors arranged on the vehicle, perform fusion processing on the at least n pieces of measurement data for obstacles to obtain a fused data group, then determine the distance between the vehicle and each obstacle recorded in the fused data group, and determine the obstacles whose distance is less than the preset distance threshold to be target obstacles; for an obstacle whose distance is greater than the preset distance threshold, the vehicle determines that the obstacle will not affect its travel. The reliability of the vehicle's detection of the surrounding road environment is thereby improved.
The embodiments of the present invention provide a target obstacle determination device, which can be a vehicle or a part of a vehicle. As shown in FIG. 3-1, the target obstacle determination device 300 comprises:
First acquiring unit 301, configured to obtain, from n sensors arranged on the vehicle, at least n pieces of measurement data for obstacles at the current time, n being greater than or equal to 1.
Processing unit 302, configured to perform fusion processing on the at least n pieces of measurement data for obstacles to obtain a fused data group, the fused data group recording the fused data corresponding to each of the obstacles detected by the n sensors.
First determining unit 303, configured to determine the distance between the vehicle and each obstacle recorded in the fused data group.
Second determining unit 304, configured to determine the obstacles whose distance is less than a preset distance threshold to be target obstacles.
In summary, with the target obstacle determination device provided by the embodiments of the present invention, the vehicle can obtain at least n pieces of measurement data for obstacles at the current time from the n sensors arranged on the vehicle, perform fusion processing on the at least n pieces of measurement data for obstacles to obtain a fused data group, then determine the distance between the vehicle and each obstacle recorded in the fused data group, and determine the obstacles whose distance is less than the preset distance threshold to be target obstacles; for an obstacle whose distance is greater than the preset distance threshold, the vehicle determines that the obstacle will not affect its travel. The reliability of the vehicle's detection of the surrounding road environment is thereby improved.
The embodiments of the present invention provide another target obstacle determination device. As shown in FIG. 3-2, the target obstacle determination device 300 comprises:
First acquiring unit 301, configured to obtain, from n sensors arranged on the vehicle, at least n pieces of measurement data for obstacles at the current time, n being greater than or equal to 1.
Processing unit 302, configured to perform fusion processing on the at least n pieces of measurement data for obstacles to obtain a fused data group, the fused data group recording the fused data corresponding to each of the obstacles detected by the n sensors.
First determining unit 303, configured to determine the distance between the vehicle and each obstacle recorded in the fused data group.
Second determining unit 304, configured to determine the obstacles whose distance is less than a preset distance threshold to be target obstacles.
Second acquiring unit 305, configured to obtain the history fused data of the target obstacle, the history fused data being the fused data stored in a preset target database before the current time.
Third determining unit 306, configured to determine the motion state of the target obstacle according to the fused data of the target obstacle in the fused data group and the history fused data.
Storage unit 307, configured to store the motion state of the target obstacle in the target database.
Optionally, the n sensors are arranged around the vehicle, and n is greater than or equal to 14.
Optionally, the n sensors comprise: 6 millimeter-wave radar sensors, 6 laser radar sensors, and 2 vision sensors;
the 2 vision sensors are arranged on the front windshield of the vehicle, the 6 millimeter-wave radar sensors are evenly distributed around the vehicle, and the 6 laser radar sensors are evenly distributed around the vehicle.
Optionally, the processing unit 302 is further configured to:
establish an observation model for each of the n sensors, the observation model being used to obtain, from the sensor's measurement data for an obstacle at the current time, predicted data for the obstacle at the next moment, the next moment differing from the current time by t, t being greater than 0;
obtain, according to the observation models, each of the n sensors' predicted data for the at least one obstacle at the current time;
screen out a fusible measurement data set from the at least n pieces of measurement data for obstacles, the error between each piece of measurement data in the fusible measurement data set and the corresponding predicted data being less than a preset threshold;
perform fusion processing on the measurement data in the fusible measurement data set.
In summary, with the target obstacle determination device provided by the embodiments of the present invention, the vehicle can obtain at least n pieces of measurement data for obstacles at the current time from the n sensors arranged on the vehicle, perform fusion processing on the at least n pieces of measurement data for obstacles to obtain a fused data group, then determine the distance between the vehicle and each obstacle recorded in the fused data group, and determine the obstacles whose distance is less than the preset distance threshold to be target obstacles; for an obstacle whose distance is greater than the preset distance threshold, the vehicle determines that the obstacle will not affect its travel. The reliability of the vehicle's detection of the surrounding road environment is thereby improved.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the device and units described above can refer to the corresponding processes in the foregoing method embodiments, and are not repeated here.
The foregoing are merely preferred embodiments of the present invention and are not intended to limit the present invention. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (10)

1. A target obstacle determination method, characterized in that the method comprises:
obtaining, from n sensors arranged on a vehicle, at least n pieces of measurement data for obstacles at a current time, n being greater than or equal to 1;
performing fusion processing on the at least n pieces of measurement data for obstacles to obtain a fused data group, the fused data group recording fused data corresponding to each of the obstacles detected by the n sensors;
determining a distance between the vehicle and each obstacle recorded in the fused data group;
determining obstacles whose distance is less than a preset distance threshold to be target obstacles.
2. The method according to claim 1, characterized in that, after the obstacle is determined to be a target obstacle, the method further comprises:
obtaining history fused data of the target obstacle, the history fused data being fused data stored in a preset target database before the current time;
determining a motion state of the target obstacle according to the fused data of the target obstacle in the fused data group and the history fused data;
storing the motion state of the target obstacle in the target database.
3. The method according to claim 1, characterized in that the n sensors are arranged around the vehicle, and n is greater than or equal to 14.
4. The method according to any one of claims 1 to 3, characterized in that the n sensors comprise: 6 millimeter-wave radar sensors, 6 laser radar sensors, and 2 vision sensors;
the 2 vision sensors are arranged on the front windshield of the vehicle, the 6 millimeter-wave radar sensors are evenly distributed around the vehicle, and the 6 laser radar sensors are evenly distributed around the vehicle.
5. The method according to any one of claims 1 to 3, characterized in that the performing of fusion processing on the at least n pieces of measurement data for obstacles comprises:
establishing an observation model for each of the n sensors, the observation model being used to obtain, from the sensor's measurement data for an obstacle at the current time, predicted data for the obstacle at the next moment, the next moment differing from the current time by t, t being greater than 0;
obtaining, according to the observation models, each of the n sensors' predicted data for the at least one obstacle at the current time;
screening out a fusible measurement data set from the at least n pieces of measurement data for obstacles, the error between each piece of measurement data in the fusible measurement data set and the corresponding predicted data being less than a preset threshold;
performing fusion processing on the measurement data in the fusible measurement data set.
6. A target obstacle determination device, characterized in that the device comprises:
a first acquiring unit, configured to obtain, from n sensors arranged on a vehicle, at least n pieces of measurement data for obstacles at a current time, n being greater than or equal to 1;
a processing unit, configured to perform fusion processing on the at least n pieces of measurement data for obstacles to obtain a fused data group, the fused data group recording fused data corresponding to each of the obstacles detected by the n sensors;
a first determining unit, configured to determine a distance between the vehicle and each obstacle recorded in the fused data group;
a second determining unit, configured to determine obstacles whose distance is less than a preset distance threshold to be target obstacles.
7. The device according to claim 6, characterized in that the device further comprises:
a second acquiring unit, configured to obtain history fused data of the target obstacle, the history fused data being fused data stored in a preset target database before the current time;
a third determining unit, configured to determine a motion state of the target obstacle according to the fused data of the target obstacle in the fused data group and the history fused data;
a storage unit, configured to store the motion state of the target obstacle in the target database.
8. The device according to claim 6, characterized in that the n sensors are arranged around the vehicle, and n is greater than or equal to 14.
9. The device according to any one of claims 6 to 8, characterized in that the n sensors comprise: 6 millimeter-wave radar sensors, 6 laser radar sensors, and 2 vision sensors;
the 2 vision sensors are arranged on the front windshield of the vehicle, the 6 millimeter-wave radar sensors are evenly distributed around the vehicle, and the 6 laser radar sensors are evenly distributed around the vehicle.
10. The device according to any one of claims 6 to 8, characterized in that the processing unit is further configured to:
establish an observation model for each of the n sensors, the observation model being used to obtain, from the sensor's measurement data for an obstacle at the current time, predicted data for the obstacle at the next moment, the next moment differing from the current time by t, t being greater than 0;
obtain, according to the observation models, each of the n sensors' predicted data for the at least one obstacle at the current time;
screen out a fusible measurement data set from the at least n pieces of measurement data for obstacles, the error between each piece of measurement data in the fusible measurement data set and the corresponding predicted data being less than a preset threshold;
perform fusion processing on the measurement data in the fusible measurement data set.
CN201510526498.XA 2015-08-21 2015-08-21 Target obstacle determination method and device Active CN105109484B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510526498.XA CN105109484B (en) Target obstacle determination method and device


Publications (2)

Publication Number Publication Date
CN105109484A true CN105109484A (en) 2015-12-02
CN105109484B CN105109484B (en) 2017-11-14

Family

ID=54657680

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510526498.XA Active CN105109484B (en) Target obstacle determination method and device

Country Status (1)

Country Link
CN (1) CN105109484B (en)

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106585623A (en) * 2016-12-21 2017-04-26 驭势科技(北京)有限公司 Detection system for detecting targets around vehicle and application of detection system
CN106767852A (en) * 2016-12-30 2017-05-31 东软集团股份有限公司 A kind of method for generating detection target information, device and equipment
CN107092252A (en) * 2017-04-11 2017-08-25 杭州光珀智能科技有限公司 A kind of robot automatic obstacle avoidance method and its device based on machine vision
CN107330925A (en) * 2017-05-11 2017-11-07 北京交通大学 A kind of multi-obstacle avoidance detect and track method based on laser radar depth image
CN107578427A (en) * 2017-07-31 2018-01-12 深圳市易成自动驾驶技术有限公司 Detection method, device and the computer-readable recording medium of dynamic barrier
CN107918386A (en) * 2017-10-25 2018-04-17 北京汽车集团有限公司 Multi-Sensor Information Fusion Approach, device and vehicle for vehicle
CN108458746A (en) * 2017-12-23 2018-08-28 天津国科嘉业医疗科技发展有限公司 One kind being based on sensor method for self-adaption amalgamation
CN108458745A (en) * 2017-12-23 2018-08-28 天津国科嘉业医疗科技发展有限公司 A kind of environment perception method based on intelligent detection equipment
CN108569295A (en) * 2017-03-08 2018-09-25 奥迪股份公司 Method and system for environment measuring
CN108646739A (en) * 2018-05-14 2018-10-12 北京智行者科技有限公司 A kind of sensor information fusion method
CN109212532A (en) * 2017-07-04 2019-01-15 百度在线网络技术(北京)有限公司 Method and apparatus for detecting barrier
CN109649408A (en) * 2019-01-21 2019-04-19 厦门理工学院 A kind of vehicle-surroundings disorder detection method and vehicle
CN109690349A (en) * 2016-09-15 2019-04-26 株式会社小糸制作所 Sensing system
CN109885056A (en) * 2019-03-07 2019-06-14 格陆博科技有限公司 A kind of more scene selection methods merged based on monocular cam and millimetre-wave radar
CN110018470A (en) * 2019-03-01 2019-07-16 北京纵目安驰智能科技有限公司 Based on example mask method, model, terminal and the storage medium merged before multisensor
CN110036308A (en) * 2016-12-06 2019-07-19 本田技研工业株式会社 Vehicle-surroundings information acquisition device and vehicle
CN110597269A (en) * 2019-09-30 2019-12-20 潍柴动力股份有限公司 Vehicle autonomous obstacle avoidance method and vehicle autonomous obstacle avoidance system
CN110703732A (en) * 2019-10-21 2020-01-17 北京百度网讯科技有限公司 Correlation detection method, device, equipment and computer readable storage medium
CN110884489A (en) * 2019-12-18 2020-03-17 新昌县三维精工机械有限公司 New energy automobile is with chassis obstacle detection early warning device all around
CN111231982A (en) * 2020-01-08 2020-06-05 中国第一汽车股份有限公司 Obstacle identification method and device for intelligent driving, vehicle and storage medium
CN111273279A (en) * 2020-02-18 2020-06-12 中国科学院合肥物质科学研究院 Multi-radar data processing method based on acceleration noise parameters
WO2020133223A1 (en) * 2018-12-28 2020-07-02 深圳市大疆创新科技有限公司 Target detection method, radar, vehicle and computer-readable storage medium
CN111923898A (en) * 2019-05-13 2020-11-13 广州汽车集团股份有限公司 Obstacle detection method and device
CN112924960A (en) * 2021-01-29 2021-06-08 重庆长安汽车股份有限公司 Target size real-time detection method, system, vehicle and storage medium
JP2021516518A (en) * 2018-03-02 2021-07-01 イギリス国The Secretary Of State For Defence In Her Britannic Majesty’S Government Of The Uneted Kingdom Of Great Britain And Northern Ireland Bipolarized omnidirectional antenna device
CN113227834A (en) * 2019-02-06 2021-08-06 宝马股份公司 Method and device for sensor data fusion of a vehicle
CN113325826A (en) * 2021-06-08 2021-08-31 矿冶科技集团有限公司 Underground vehicle control method and device, electronic equipment and storage medium
CN113753076A (en) * 2021-08-06 2021-12-07 北京百度网讯科技有限公司 Method and device for judging effective barrier, electronic equipment and automatic driving vehicle

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090093960A1 (en) * 2007-10-04 2009-04-09 Jeffrey Scott Puhalla Method and system for obstacle avoidance for a vehicle
CN102385055A (en) * 2010-08-03 2012-03-21 株式会社电装 Vehicle-use obstacle detection apparatus
CN104477167A (en) * 2014-11-26 2015-04-01 浙江大学 Intelligent driving system and control method thereof
CN104798124A (en) * 2012-11-21 2015-07-22 丰田自动车株式会社 Driving-assistance device and driving-assistance method


Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109690349B (en) * 2016-09-15 2023-10-31 株式会社小糸制作所 sensor system
CN109690349A (en) * 2016-09-15 2019-04-26 株式会社小糸制作所 Sensing system
CN110036308A (en) * 2016-12-06 2019-07-19 本田技研工业株式会社 Vehicle-surroundings information acquisition device and vehicle
CN106585623B (en) * 2016-12-21 2023-12-01 驭势科技(北京)有限公司 Detection system for detecting objects around vehicle and application thereof
CN106585623A (en) * 2016-12-21 2017-04-26 驭势科技(北京)有限公司 Detection system for detecting targets around vehicle and application of detection system
CN106767852B (en) * 2016-12-30 2019-10-11 东软集团股份有限公司 A kind of method, apparatus and equipment generating detection target information
CN106767852A (en) * 2016-12-30 2017-05-31 东软集团股份有限公司 A kind of method for generating detection target information, device and equipment
CN108569295B (en) * 2017-03-08 2021-11-23 奥迪股份公司 Method and system for environmental detection
US11009602B2 (en) 2017-03-08 2021-05-18 Audi Ag Method and system for environment detection
CN108569295A (en) * 2017-03-08 2018-09-25 奥迪股份公司 Method and system for environment measuring
CN107092252A (en) * 2017-04-11 2017-08-25 杭州光珀智能科技有限公司 A kind of robot automatic obstacle avoidance method and its device based on machine vision
CN107330925A (en) * 2017-05-11 2017-11-07 北京交通大学 Multi-obstacle detection and tracking method based on laser radar depth image
CN107330925B (en) * 2017-05-11 2020-05-22 北京交通大学 Multi-obstacle detection and tracking method based on laser radar depth image
CN109212532A (en) * 2017-07-04 2019-01-15 百度在线网络技术(北京)有限公司 Method and apparatus for detecting obstacles
CN113466822A (en) * 2017-07-04 2021-10-01 百度在线网络技术(北京)有限公司 Method and apparatus for detecting obstacles
CN109212532B (en) * 2017-07-04 2021-08-20 百度在线网络技术(北京)有限公司 Method and apparatus for detecting obstacles
CN107578427A (en) * 2017-07-31 2018-01-12 深圳市易成自动驾驶技术有限公司 Dynamic obstacle detection method and device, and computer-readable storage medium
CN107918386B (en) * 2017-10-25 2021-01-01 北京汽车集团有限公司 Multi-sensor data fusion method and device for vehicle and vehicle
CN107918386A (en) * 2017-10-25 2018-04-17 北京汽车集团有限公司 Multi-sensor data fusion method and device for vehicle, and vehicle
CN108458745A (en) * 2017-12-23 2018-08-28 天津国科嘉业医疗科技发展有限公司 Environment perception method based on intelligent detection equipment
CN108458746A (en) * 2017-12-23 2018-08-28 天津国科嘉业医疗科技发展有限公司 Sensor-based adaptive fusion method
JP2021516518A (en) * 2018-03-02 2021-07-01 イギリス国The Secretary Of State For Defence In Her Britannic Majesty’S Government Of The Uneted Kingdom Of Great Britain And Northern Ireland Bipolarized omnidirectional antenna device
JP7366939B2 (en) 2018-03-02 2023-10-23 イギリス国 Dual polarization omnidirectional antenna device
CN108646739A (en) * 2018-05-14 2018-10-12 北京智行者科技有限公司 Sensor information fusion method
WO2020133223A1 (en) * 2018-12-28 2020-07-02 深圳市大疆创新科技有限公司 Target detection method, radar, vehicle and computer-readable storage medium
CN109649408A (en) * 2019-01-21 2019-04-19 厦门理工学院 Vehicle peripheral obstacle detection method and vehicle
CN109649408B (en) * 2019-01-21 2020-09-25 厦门理工学院 Vehicle peripheral obstacle detection method and vehicle
CN113227834A (en) * 2019-02-06 2021-08-06 宝马股份公司 Method and device for sensor data fusion of a vehicle
CN110018470A (en) * 2019-03-01 2019-07-16 北京纵目安驰智能科技有限公司 Instance annotation method, model, terminal and storage medium based on multi-sensor pre-fusion
CN109885056A (en) * 2019-03-07 2019-06-14 格陆博科技有限公司 Multi-scene selection method based on fusion of monocular camera and millimeter-wave radar
CN111923898A (en) * 2019-05-13 2020-11-13 广州汽车集团股份有限公司 Obstacle detection method and device
CN110597269A (en) * 2019-09-30 2019-12-20 潍柴动力股份有限公司 Vehicle autonomous obstacle avoidance method and vehicle autonomous obstacle avoidance system
CN110703732A (en) * 2019-10-21 2020-01-17 北京百度网讯科技有限公司 Correlation detection method, device, equipment and computer readable storage medium
CN110884489B (en) * 2019-12-18 2021-07-06 常州市武进悦达电声器材有限公司 Surround chassis obstacle detection and early-warning device for new energy vehicle
CN110884489A (en) * 2019-12-18 2020-03-17 新昌县三维精工机械有限公司 Surround chassis obstacle detection and early-warning device for new energy vehicle
CN111231982B (en) * 2020-01-08 2021-05-04 中国第一汽车股份有限公司 Obstacle identification method and device for intelligent driving, vehicle and storage medium
CN111231982A (en) * 2020-01-08 2020-06-05 中国第一汽车股份有限公司 Obstacle identification method and device for intelligent driving, vehicle and storage medium
CN111273279B (en) * 2020-02-18 2022-05-10 中国科学院合肥物质科学研究院 Multi-radar data processing method based on acceleration noise parameters
CN111273279A (en) * 2020-02-18 2020-06-12 中国科学院合肥物质科学研究院 Multi-radar data processing method based on acceleration noise parameters
CN112924960A (en) * 2021-01-29 2021-06-08 重庆长安汽车股份有限公司 Target size real-time detection method, system, vehicle and storage medium
CN112924960B (en) * 2021-01-29 2023-07-18 重庆长安汽车股份有限公司 Target size real-time detection method, system, vehicle and storage medium
CN113325826A (en) * 2021-06-08 2021-08-31 矿冶科技集团有限公司 Underground vehicle control method and device, electronic equipment and storage medium
CN113753076A (en) * 2021-08-06 2021-12-07 北京百度网讯科技有限公司 Method and device for judging effective barrier, electronic equipment and automatic driving vehicle

Also Published As

Publication number Publication date
CN105109484B (en) 2017-11-14

Similar Documents

Publication Publication Date Title
CN105109484A (en) Target-barrier determining method and device
CN110239535B (en) Curve active collision avoidance control method based on multi-sensor fusion
CN106240458B (en) Vehicle frontal collision early-warning method based on vehicle-mounted binocular camera
JP5939357B2 (en) Moving track prediction apparatus and moving track prediction method
US9359009B2 (en) Object detection during vehicle parking
CN102288121B (en) Method for measuring and pre-warning lane departure distance based on monocular vision
CN105160356B (en) Sensor data fusion method and system for vehicle active safety system
CN112562405A (en) Radar video intelligent fusion and early warning method and system
CN105404844A (en) Road boundary detection method based on multi-line laser radar
CN107563256A (en) Driving assistance information generation method and device, and driver assistance system
CN112693466A (en) System and method for evaluating performance of vehicle environment perception sensor
WO2018142527A1 (en) Travel history storage method, method for producing travel path model, method for estimating local position, and travel history storage device
CN105922990A (en) Vehicle environment perceiving and controlling method based on cloud machine learning
CN104181534A (en) Probabilistic target selection and threat assessment method and application to intersection collision alert system
CN104573646A (en) Detection method and system for pedestrians in front of vehicle based on laser radar and binocular camera
CN102673560A (en) Method for recognizing turn-off maneuver and driver assistance system
CN103661375A (en) Lane departure warning method and system taking driver distraction state into account
CN112379674B (en) Automatic driving equipment and system
CN104192063B (en) Vehicle safe driving warning system and corresponding warning method
CN109878530B (en) Method and system for identifying lateral driving condition of vehicle
CN112116031A (en) Target fusion method and system based on road side equipment, vehicle and storage medium
Woo et al. Dynamic potential-model-based feature for lane change prediction
CN107200016A (en) Road-adaptive prediction method and vehicle system using the same
CN106428003A (en) Lane departure early-warning device and method for vehicles on highways in adverse weather
CN114396958B (en) Lane positioning method and system based on multiple lanes and multiple sensors and vehicle

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220218

Address after: 241006 Anshan South Road, Wuhu Economic and Technological Development Zone, Anhui Province

Patentee after: Wuhu Sambalion auto technology Co.,Ltd.

Address before: 241006 Changchun Road, Wuhu economic and Technological Development Zone, Wuhu, Anhui, 8

Patentee before: CHERY AUTOMOBILE Co.,Ltd.
