CN106428000A - Vehicle speed control device and method - Google Patents
- Publication number: CN106428000A
- Application number: CN201610806022.6A
- Authority: CN (China)
- Legal status: Granted
Classifications
- B60W30/0956: Predicting travel path or likelihood of collision, the prediction being responsive to traffic or environmental parameters
- B60W40/04: Estimation or calculation of non-directly measurable driving parameters related to ambient conditions; traffic conditions
- B60W2420/403: Image sensing, e.g. optical camera
- B60W2420/408: Radar; laser, e.g. lidar
- B60W2554/00: Input parameters relating to objects
Abstract
The invention discloses a vehicle speed control device and method. The device comprises an environment sensing and fusion module, a pedestrian crossing-behavior cognition module and an automatic driving decision-making module. The environment sensing and fusion module detects information on pedestrians crossing the road ahead of the moving vehicle and outputs the crossing-pedestrian information. The behavior cognition module receives this information and, combining it with dynamic time-series information and the mutual interaction between vehicle and pedestrian, outputs pedestrian behavior cognition information. The decision-making module receives the environment information and the behavior cognition parameters, determines the vehicle's current passage speed, and outputs it. With the disclosed device and method, a vehicle in motion can accurately recognize pedestrian targets ahead, predict pedestrian intent, judge in real time and make reasonable decisions, realizing autonomous recognition and decision-making of an autonomous vehicle when a pedestrian crosses the road, so that the vehicle safely and smoothly yields to the crossing pedestrian.
Description
Technical field
The present invention relates to the field of automatic driving, and more particularly to a vehicle speed control device and method.
Background art
An automatic driving vehicle realizes automatic driving through technologies such as environment sensing, cognition, decision-making and full-vehicle control. Automatic driving vehicles play a positive role in traffic safety, traffic efficiency, environmental protection and energy saving. Governments around the world and road operators are increasingly interested in autonomous vehicles, hoping they can make road traffic safer, more efficient and more environmentally friendly. By autonomously perceiving the environment, cognizing, deciding and controlling, an automatic driving vehicle improves its response speed to the traffic environment and can rapidly perform the corresponding operations, such as braking and steering, in different scenes including dangerous ones, thereby improving traffic safety. Research on automatic driving technology helps raise the level of vehicle intelligence and is significant for the development of the intelligent-vehicle industry. Pedestrians are major and vulnerable road users; pedestrian recognition, pedestrian behavior cognition and pedestrian avoidance are therefore indispensable key technologies for automatic driving vehicles. Pedestrian recognition, as an important component of automatic driving environment perception technology, is a research emphasis of frontal collision early-warning systems and pedestrian safety protection systems. Pedestrian behavior cognition helps improve the intelligence level of the system and has important guiding significance for the decision-making of an automated driving system. Pedestrian avoidance, as a key technology of automatic driving vehicle decision-making, also has reference significance for avoiding other targets during traveling. With the development of the automobile industry and road traffic, the probability and severity of traffic accidents are increasing. Research on pedestrian recognition and behavior cognition methods can therefore provide early-warning information for automatic driving vehicles and allow them to take corresponding avoidance measures, preventing pedestrian collision accidents.
At present there is considerable research on pedestrian recognition and pedestrian behavior cognition methods. Pedestrian recognition mainly uses sensors such as monocular or binocular cameras and multi-line laser radar: features are extracted and used as classifier inputs, and a supervised learning algorithm performs classification learning. The classification capability of such methods is good, but their ability to identify kinematic parameters is poor. Pedestrian behavior cognition methods mainly build cognitive models with Markov theory; they focus on the pedestrian alone and do not account for the mutual influence between vehicle and pedestrian. Research on pedestrian protection systems is directed primarily at pedestrian recognition methods; it lacks research on pedestrian behavior cognition methods, does not consider pedestrian recognition and behavior cognition jointly, and does not further consider the impact of recognition and cognition results on vehicle decision-making.
At this stage, automatic driving technology, pedestrian recognition and behavior cognition face the following problems: 1) systems cannot autonomously learn and predict pedestrian crossing behavior, so the intelligence level of the decision system is relatively low; 2) the mutual influence and interaction between pedestrians and vehicles in the travel environment are not considered; 3) when a pedestrian is crossing the road or intends to cross, making decisions simply by a set distance threshold cannot meet the demands of a complicated traffic environment; 4) a single camera can hardly provide position and motion information while accurately detecting pedestrians: binocular vision can recognize target classes and provide target depth information, but its computational complexity is too high to meet the real-time recognition requirement of automatic driving, while monocular vision cannot output target depth information, which is unfavorable for automatic driving decision-making; 5) pedestrian recognition and judgement with sensors such as multi-line laser radar are costly, making industrialization and popularization of automatic driving difficult.
Thus, it is desirable to have a technical scheme that overcomes or at least mitigates at least one of the above drawbacks of the prior art.
Summary of the invention
It is an object of the present invention to provide a vehicle speed control device and method that overcome or at least mitigate at least one of the above drawbacks of the prior art.
To achieve the above object, the present invention provides a vehicle speed control device comprising: an environment sensing and fusion module for detecting information on pedestrians crossing the road ahead of the traveling vehicle and outputting the crossing-pedestrian information; a pedestrian crossing-behavior cognition module for receiving the crossing-pedestrian information, cognizing pedestrian behavior from the received information combined with dynamic time-series information and the mutual influence of vehicle and pedestrian, and outputting pedestrian behavior cognition information; and an automatic driving decision-making module for receiving the environment information and the pedestrian behavior cognition parameters, predicting and judging the pedestrian's intent from the crossing-pedestrian information and the behavior cognition information, determining the vehicle's current passage speed, and outputting it.
Further, the environment sensing and fusion module includes: an information collecting unit for collecting data on the road ahead of the traveling vehicle; a data pre-processing unit for filtering the data collected by the information collecting unit; a data fusion unit for performing joint calibration, data synchronization and K-means clustering on the pre-processed data to obtain the local obstacle area distribution; a pedestrian detection unit for detecting pedestrian targets in the local obstacle area distribution map established by the data fusion unit; and a pedestrian tracking unit for establishing a linear motion model of the pedestrian target and tracking the detected targets across successive frames with a linear target tracking algorithm, obtaining stable crossing-pedestrian information.
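The "linear motion model" plus successive-frame tracking described above can be sketched as a constant-velocity track with nearest-neighbour association. This is a minimal illustration under stated assumptions (state layout, gating threshold and all names are ours, not the patent's):

```python
import numpy as np

class LinearPedestrianTrack:
    """Constant-velocity motion model for one pedestrian target.

    Sketch of the linear motion model the tracking unit maintains;
    the state is [x, y, vx, vy] in the vehicle frame. The gating
    threshold and field names are illustrative assumptions.
    """

    def __init__(self, x, y, dt=0.1):
        self.state = np.array([x, y, 0.0, 0.0])
        self.dt = dt

    def predict(self):
        # propagate position one frame ahead under constant velocity
        x, y, vx, vy = self.state
        return np.array([x + vx * self.dt, y + vy * self.dt])

    def update(self, detections, gate=1.5):
        """Associate the nearest detection within `gate` metres of the
        prediction and refresh position and velocity; return False if
        no detection matches (track lost)."""
        pred = self.predict()
        if len(detections) == 0:
            return False
        d = np.linalg.norm(np.asarray(detections)[:, :2] - pred, axis=1)
        j = int(np.argmin(d))
        if d[j] > gate:
            return False
        new_xy = np.asarray(detections[j][:2], dtype=float)
        vel = (new_xy - self.state[:2]) / self.dt
        self.state = np.concatenate([new_xy, vel])
        return True
```

Running one track over per-frame detection lists in this way yields the "stable crossing-pedestrian information" (smoothed position plus relative velocity) that downstream modules consume.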
Further, the information collecting unit includes a single-line laser radar and a monocular camera: the single-line laser radar collects radar information on obstacles ahead of the traveling vehicle, and the monocular camera collects video of pedestrians ahead of the vehicle.
The working process of the data fusion unit is as follows:
31) Joint calibration: a perspective transformation maps the radar coordinate system [X_lidar, Y_lidar, 1] and the image coordinate system [X_camera, Y_camera, 1] into the vehicle coordinate system [X_vehicle, Y_vehicle, 1] centered on the vehicle's front bumper. Using 12 pairs of local coordinates in the laser radar, image and vehicle coordinate systems, the transformation matrix parameters of perspective transformation models (2) and (3) are obtained.
32) Data synchronization: the acquisition frequencies of the monocular camera and the single-line laser radar are set, and the data collected by the single-line laser radar are resampled in real time against the acquisition frequency of the monocular camera, realizing data synchronization between the two sensors.
33) K-means clustering to obtain the local obstacle area distribution, specifically:
331) A geometric constraint rule limits the clustering region to the intersection of the field of view of the single-line laser radar, View(Lidar), and the field of view of the monocular camera, View(Camera), i.e. {View(Lidar) ∩ View(Camera)}: an obstacle target must lie in this intersection region,
L_{ΔT,i} ∈ {View(Lidar) ∩ View(Camera)} (4)
332) A clustering algorithm establishes the local obstacle area distribution within the intersection region limited in 331).
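Steps 331) and 332) can be sketched as follows: radar returns are masked to the field-of-view intersection of constraint (4), then grouped into local obstacle regions with plain K-means. The cluster count, iteration limit and names are illustrative assumptions, not values from the patent:

```python
import numpy as np

def cluster_obstacles(points, fov_mask, k=3, iters=20, seed=0):
    """Restrict radar returns to View(Lidar) ∩ View(Camera), then
    group them into local obstacle regions with K-means.

    `points` is an (N, 2) array of [d*sin(theta), d*cos(theta)] feature
    vectors; `fov_mask` is a boolean array marking returns inside the
    field-of-view intersection (geometric constraint (4))."""
    pts = np.asarray(points, dtype=float)[np.asarray(fov_mask)]
    rng = np.random.default_rng(seed)
    # initialise centres from randomly chosen returns
    centers = pts[rng.choice(len(pts), size=k, replace=False)]
    for _ in range(iters):
        # assign each return to its nearest cluster centre
        labels = np.argmin(
            np.linalg.norm(pts[:, None, :] - centers[None, :, :], axis=2),
            axis=1)
        # move each centre to the mean of its members
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pts[labels == j].mean(axis=0)
    return centers, labels
```

The resulting cluster centres and memberships play the role of the "local obstacle area distribution" in which the pedestrian detection unit then searches for pedestrian targets.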
Further, the pedestrian crossing-behavior cognition module includes: a crossing-behavior database construction unit which, on the basis of the crossing-pedestrian information output by the environment sensing and fusion module, collects real road data of pedestrians crossing, defines the crossing behaviors, calibrates each pedestrian's crossing behavior by a calibration method, and stores the calibrated behaviors as crossing examples forming the pedestrian crossing-behavior database; and a crossing-behavior cognition network model construction unit which defines the parameters and structure of the pedestrian crossing-behavior cognition network, optimizes and learns the network parameters based on the database given by the database construction unit, and obtains the pedestrian behavior cognition information.
Further, the automatic driving decision-making module includes: an environment information and cognition parameter unit for inputting environment information and cognition parameters; a fuzzy operation unit for fuzzifying the environment information and cognition parameters, building expert decision rules and obtaining a fuzzy result; and a defuzzification unit which defuzzifies the fuzzy result output by the fuzzy operation unit using the weighted-average method, obtains the actual vehicle passage speed, and sends it through the vehicle passage speed output submodule to the bottom-layer actuators of the automatic driving vehicle.
The present invention also provides a vehicle speed control method comprising the steps of: S100, detecting information on pedestrians crossing the road ahead of the traveling vehicle and outputting the crossing-pedestrian information; S200, cognizing pedestrian behavior from the crossing-pedestrian information combined with dynamic time-series information and the mutual influence of vehicle and pedestrian, and outputting pedestrian behavior cognition information; and S300, predicting and judging the pedestrian's intent from the crossing-pedestrian information and the behavior cognition information, determining the vehicle's current passage speed, and outputting it.
Further, S100 specifically includes: S110, the single-line laser radar collects radar information on obstacles ahead of the traveling vehicle and the monocular camera collects video of pedestrians ahead; S120, the radar information and pedestrian video collected in step S110 are filtered; S130, joint calibration, data synchronization and K-means clustering are performed on the radar information and pedestrian video pre-processed in step S120 to obtain the local obstacle area distribution; S140, pedestrian targets are detected in the local obstacle area distribution map established in step S130; and S150, a linear motion model of the pedestrian target is established and the targets detected in step S140 are tracked across successive frames with a linear target tracking algorithm, obtaining stable crossing-pedestrian information.
Further, S200 specifically includes: S210, on the basis of the crossing-pedestrian information, collecting real road data of pedestrians crossing, defining the crossing behaviors, calibrating each pedestrian's crossing behavior by a calibration method, and storing the calibrated behaviors as crossing examples forming the pedestrian crossing-behavior database; and S220, defining the parameters and structure of the pedestrian crossing-behavior cognition network, and optimizing and learning the network parameters based on the crossing-behavior database to obtain the pedestrian behavior cognition information.
Further, step S210 includes:
S211, collecting real road data: based on the crossing-pedestrian information obtained by the environment sensing and fusion module, real road data of pedestrians crossing are collected and expressed as a vector D = {v_c, P, x_r, y_r}, where v_c is the travel speed of the vehicle, P indicates whether a pedestrian is present, x_r is the longitudinal relative position of vehicle and pedestrian, and y_r is the lateral relative position of vehicle and pedestrian;
S212, defining the crossing behaviors: the pedestrian crossing behavior is expressed as a vector B = {wait, cross straight fast, cross straight slowly}, where wait denotes waiting, cross straight fast denotes crossing quickly, and cross straight slowly denotes crossing slowly;
S213, calibrating the crossing behaviors: according to the real road data collected in step S211, the behavior of each crossing pedestrian is classified and calibrated by the calibration method against the crossing behaviors defined in step S212;
S214, storing the crossing-behavior data as the pedestrian crossing-behavior database: the behaviors calibrated in step S213 are stored as multiple crossing examples, forming the pedestrian crossing-behavior database.
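A record of the database built in steps S211 to S214 can be sketched as one calibrated example pairing the feature vector D = {v_c, P, x_r, y_r} with a behavior label from B. The class and storage layout are illustrative assumptions; the patent does not specify a concrete format:

```python
from dataclasses import dataclass

# The three crossing behaviors defined in step S212.
BEHAVIORS = ("wait", "cross_straight_fast", "cross_straight_slowly")

@dataclass
class CrossingExample:
    """One calibrated example for the crossing-behavior database.

    Field names mirror the vector D = {v_c, P, x_r, y_r}; the label
    comes from the behavior vector B of step S212."""
    vc: float      # travel speed of the vehicle
    p: bool        # whether a pedestrian is present
    xr: float      # longitudinal relative position of vehicle and pedestrian
    yr: float      # lateral relative position of vehicle and pedestrian
    behavior: str  # calibrated label, one of BEHAVIORS

    def __post_init__(self):
        if self.behavior not in BEHAVIORS:
            raise ValueError(f"unknown behavior: {self.behavior}")

# A toy two-example database (values are made up for illustration).
database = [
    CrossingExample(vc=8.3, p=True, xr=14.0, yr=3.2, behavior="wait"),
    CrossingExample(vc=5.1, p=True, xr=9.5, yr=1.1, behavior="cross_straight_fast"),
]
```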
Further, step S220 includes:
S221, defining the parameters of the pedestrian crossing-behavior cognition network: the cognition network is based on a dynamic Bayesian network comprising a prior network B_1 and a transfer network B_→. The prior network B_1 is defined as the conditional probability distribution of the initial state and the relations among the variables; the transfer network B_→ defines the relation between time t-1 and time t with conditional probability distributions; and the pedestrian behavior cognition information satisfies the first-order Markov assumption;
S222, the pedestrian behavior cognition network is a directed acyclic graph with a time dimension: variables are represented by nodes, the mutual relations between variables are represented by directed arrows and conditional probabilities, and the conditional probability distribution function is defined for three cases:
if a node and its parent node are both discrete variables, the conditional probability is represented as P(Z = i | Pa(Z) = j) = P(i, j);
if a node is a discrete variable and its parent node is a continuous variable, the conditional probability is represented as follows:
if a node is a continuous variable and its parent node is also a continuous variable, the conditional probability is as follows:
The variables included in the pedestrian crossing-behavior cognition network model are the travel speed of the vehicle v_c, the relative position information of vehicle and pedestrian (x_r, y_r), the lateral position and motion information of the pedestrian (y_p, v_y), and the crossing-behavior intent information of the pedestrian (P_w, P_cf, P_cl);
S223, defining the structure of the pedestrian crossing-behavior cognition network: the structure is defined from expert knowledge and driving experience, and the mutually influencing relations between the variables of the network are represented by directed arrows.
The nodes of the network structure include the vehicle motion state node (V), the variable observation node (O), the pedestrian motion state node (P), the pedestrian-vehicle relative motion state node (R), and the pedestrian crossing-behavior cognition node (B). The specific node information is as follows:
vehicle motion state node: V = [v_c];
variable observation node: O;
pedestrian motion state node: P = [y_p, v_y];
pedestrian-vehicle relative motion state node: R = [x_r, y_r];
pedestrian crossing-behavior cognition node: B = [P_w, P_cf, P_cl];
From the pedestrian crossing-behavior cognition network structure, the state transition between variables can be expressed as:
P(B_t, R_t, P_t | B_{t-1}, R_{t-1}, P_{t-1}, V_t) = P(B_t | B_{t-1}, R_{t-1}) P(R_t | B_t, R_{t-1}, V_t) P(P_t | B_t, R_t, P_{t-1});
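The factored transition above can be sketched directly: each factor is a conditional probability table (here toy lambdas over binary states, purely illustrative assumptions), and the joint transition probability is their product:

```python
def transition_prob(bt, rt, pt, prev, vt, cpt_b, cpt_r, cpt_p):
    """Factored state transition of the crossing-behavior cognition network:

      P(B_t, R_t, P_t | B_{t-1}, R_{t-1}, P_{t-1}, V_t)
        = P(B_t | B_{t-1}, R_{t-1})
        * P(R_t | B_t, R_{t-1}, V_t)
        * P(P_t | B_t, R_t, P_{t-1})

    States are discretized; the CPTs are supplied as callables."""
    b0, r0, p0 = prev
    return (cpt_b(bt, b0, r0)
            * cpt_r(rt, bt, r0, vt)
            * cpt_p(pt, bt, rt, p0))

# Toy conditional probabilities over binary states (0/1); in each factor
# the state simply tends to persist. These tables are assumptions for
# illustration, not the patent's learned parameters.
cpt_b = lambda bt, b0, r0: 0.8 if bt == b0 else 0.2
cpt_r = lambda rt, bt, r0, vt: 0.7 if rt == r0 else 0.3
cpt_p = lambda pt, bt, rt, p0: 0.9 if pt == p0 else 0.1
```

Because each factor is a proper conditional distribution, the product sums to one over all next states (B_t, R_t, P_t), which is a quick sanity check on any learned tables.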
S224, learning the parameters of the pedestrian crossing-behavior cognition network: the network parameters are learned from the crossing-behavior database obtained in step S214 by the maximum likelihood estimation (MLE) method.
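For discrete nodes with complete data, the MLE of step S224 reduces to normalized co-occurrence counts. A minimal sketch (the patent does not spell out its estimator, so this counting form is an assumption):

```python
from collections import Counter

def mle_cpt(samples):
    """Maximum-likelihood estimate of a discrete conditional probability
    table P(Z = z | Pa(Z) = pa) from (z, pa) samples.

    With complete data the MLE is count(z, pa) / count(pa)."""
    joint = Counter(samples)                   # counts of (z, pa) pairs
    marg = Counter(pa for _, pa in samples)    # counts of parent configs
    return {(z, pa): n / marg[pa] for (z, pa), n in joint.items()}
```

Applied per node of the cognition network, such tables would fill the prior network B_1 and transfer network B_→ from the calibrated crossing examples.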
Further, S300 specifically includes: inputting the environment information and cognition parameters; fuzzifying the environment information and cognition parameters, building expert decision rules, and obtaining a fuzzy result; and defuzzifying the fuzzy result with the weighted-average method to obtain the actual vehicle passage speed, which is sent through the vehicle passage speed output submodule to the bottom-layer actuators of the automatic driving vehicle.
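The fuzzify / expert-rules / weighted-average-defuzzify pipeline of S300 can be sketched end to end. The membership functions, rule base and output speeds below are illustrative assumptions, not the patent's actual expert tables:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def decide_speed(dist, p_cross):
    """Fuzzify the longitudinal distance to the pedestrian (m) and the
    crossing-intent probability, fire a small expert rule base, and
    defuzzify with the weighted-average method to get a passage speed
    in m/s."""
    near, far = tri(dist, -1, 0, 20), tri(dist, 10, 30, 1e9)
    low, high = tri(p_cross, -1, 0, 0.7), tri(p_cross, 0.3, 1, 2)
    # (rule firing strength, crisp speed recommended by that rule)
    rules = [(min(near, high), 0.0),    # near and likely to cross: stop
             (min(near, low), 4.0),     # near but unlikely to cross: creep
             (min(far, high), 6.0),     # far but likely to cross: slow down
             (min(far, low), 12.0)]     # far and unlikely: keep speed
    num = sum(w * v for w, v in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0
```

The weighted-average (centroid-of-singletons) step is the "anti-fuzzy" operation the patent names; the returned crisp speed is what would be handed to the bottom-layer actuators.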
In the course of travel, the present invention can accurately recognize the pedestrian targets ahead of the vehicle, predict pedestrian intent, judge in real time and make reasonable decisions, realizing autonomous recognition and decision-making of the automatic driving vehicle when a pedestrian crosses the road, so that the vehicle safely and smoothly yields to the crossing pedestrian.
Description of the drawings
Fig. 1 is a schematic diagram of the pedestrian road-crossing scene and the sensor configuration scheme of the present invention.
Fig. 2 is a schematic block diagram of the vehicle speed control device according to a preferred embodiment of the present invention.
Fig. 3 is a schematic block diagram of the environment sensing and fusion module of a preferred embodiment in Fig. 1.
Fig. 4 is a schematic diagram describing the definition of pedestrian crossing behaviors of the present invention.
Fig. 5 is a schematic diagram of the pedestrian crossing-behavior cognition network structure of the present invention.
Specific embodiment
In the accompanying drawings, the same or similar labels represent the same or similar elements or elements with the same or similar functions. The embodiments of the invention are described in detail below with reference to the drawings.
In the description of the invention, terms indicating orientation or position such as "center", "longitudinal", "lateral", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner" and "outer" are based on the orientations or positions shown in the drawings. They are used only to facilitate and simplify the description of the invention, and do not indicate or imply that the referenced device or element must have a specific orientation or be constructed and operated in a specific orientation; they therefore should not be construed as limiting the scope of the invention.
As shown in Fig. 1, the vehicle speed control device and method provided by the present embodiment are designed for the scenario in which a pedestrian crosses the road in the running environment of an automatic driving vehicle (hereinafter simply "vehicle"). The pedestrian crossing scene includes an automatic driving vehicle equipped with an LMS511 2D single-line laser radar sensor 1a and an AVT Mako G-125C monocular camera sensor 1b, and a pedestrian M crossing the road. The pedestrian crossing scene can occur on any section the automatic driving vehicle travels under road conditions.
As shown in Fig. 2, the vehicle speed control device provided by the present embodiment includes an environment sensing and fusion module, a pedestrian crossing-behavior cognition module and an automatic driving decision-making module, wherein:
The environment sensing and fusion module detects information on pedestrians crossing the road ahead of the traveling vehicle and outputs the crossing-pedestrian information. The "crossing-pedestrian information" includes the position information and motion information of the crossing pedestrian, the "motion information" including the speed of the pedestrian relative to the vehicle.
In one embodiment, as shown in Figs. 2 and 3, the environment sensing and fusion module specifically collects data on the road ahead of the traveling vehicle, filters the collected data, performs joint calibration and data synchronization to obtain the distribution areas of local obstacles, then performs multi-pedestrian target detection in those areas to screen out pedestrian targets, and finally tracks the detected pedestrian targets across successive frames to obtain stable crossing-pedestrian information, which it sends to the pedestrian crossing-behavior cognition module.
Preferably, the environment sensing and fusion module includes an information collecting unit 1, a data pre-processing unit 2, a data fusion unit 3, a pedestrian detection unit 4 and a pedestrian tracking unit 5, wherein:
The information collecting unit 1 collects data on the road ahead of the traveling vehicle and includes the single-line laser radar 1a and the monocular camera 1b. The single-line laser radar 1a is mounted at the vehicle's front bumper and collects radar information on obstacles ahead, primarily to provide candidate pedestrian regions. The monocular camera 1b can be fixed at the upper right of the windshield inside the vehicle and collects video of pedestrians ahead of the vehicle.
The data preprocessing unit 2 filters the data collected by the information acquisition unit 1. Its specific working process is:
21) For the radar data collected by the single-line lidar 1a: filter out null targets and reject bounce targets in the radar data, and assemble the values of each group of filtered lidar data points into a vector, the 2D feature vector characterizing each forward obstacle detected by the single-line lidar 1a:

L_{ΔT,i} = [d_i·sinθ_i, d_i·cosθ_i]^T  (1)

In the above formula, ΔT is the data acquisition period, i is the ID of an obstacle after filtering, d_i is the straight-line distance of the obstacle relative to the lidar, and θ_i is the angle of the obstacle relative to the lidar.
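As a concrete illustration of formula (1), the 2D feature vector for one lidar return can be computed as below; the function name and degree-valued bearing are illustrative conventions, not from the patent:

```python
import math

def lidar_feature(d_i, theta_i_deg):
    """2D feature vector L_{dT,i} = [d_i*sin(theta_i), d_i*cos(theta_i)]^T
    for one filtered obstacle i: d_i is the straight-line range (m) and
    theta_i the bearing relative to the lidar (here taken in degrees)."""
    th = math.radians(theta_i_deg)
    return [d_i * math.sin(th), d_i * math.cos(th)]
```

For a return at 10 m and 90°, this yields a point 10 m to the side and 0 m ahead in the lidar frame.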
22) For the video stream data collected by the monocular camera 1b: apply median-filter noise reduction and image compression, e.g. compressing the resolution from 1292×964 to 720×480.
The data fusion unit 3 performs joint calibration, data synchronization, and K-means clustering in sequence on the radar data and images preprocessed by the data preprocessing unit 2, to obtain the local obstacle area distribution. The working process of the data fusion unit 3 is as follows:
31) Joint calibration: using a perspective transform, the lidar coordinate system [X_lidar, Y_lidar, 1] and the image coordinate system [X_camera, Y_camera, 1] are transformed into the vehicle coordinate system [X_vehicle, Y_vehicle, 1] centered on the front bumper; the perspective transformation models are given by formulas (2) and (3).
Since each perspective transformation model (2) and (3) contains 12 variables (9 rotation variables and 3 translation variables), 12 pairs of local coordinates in the three coordinate systems (lidar, image, and vehicle) are needed to obtain the transformation matrix parameters (a_ij, b_ij, c_i, l_i, i = 1, 2, 3, j = 1, 2, 3) of models (2) and (3).
32) Data synchronization: the acquisition frequency of the monocular camera 1b is set, e.g. to 15 Hz, and the acquisition frequency of the single-line lidar 1a, e.g. to 25 Hz. Taking the camera's acquisition frequency as the reference, the lidar data are resampled in real time (about every 66 ms), realizing the data synchronization of the monocular camera 1b and the single-line lidar 1a.
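Step 32) can be sketched as nearest-neighbour resampling of the 25 Hz lidar stream onto the 15 Hz camera time base; this is one plausible reading of the synchronization step, with hypothetical timestamp lists:

```python
def synchronize(lidar_ts, camera_ts):
    """For each camera frame timestamp (15 Hz reference), return the index
    of the nearest lidar scan (25 Hz stream) -- nearest-neighbour
    resampling onto the camera time base."""
    pairs = []
    for tc in camera_ts:
        j = min(range(len(lidar_ts)), key=lambda k: abs(lidar_ts[k] - tc))
        pairs.append(j)
    return pairs
```

With a camera frame at t = 0.066 s and lidar scans every 0.04 s, the scan at 0.08 s is the closest match.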
33) K-means clustering to obtain the local obstacle area distribution, specifically including:
331) A geometric-constraint rule limits the clustering region, the region being the intersection of the viewing angle View(Lidar) of the single-line lidar 1a and the viewing angle View(Camera) of the monocular camera 1b, {View(Lidar) ∩ View(Camera)}. Specifically:
The geometric-constraint rule is that an obstacle target must lie in the intersection of the viewing angles of the single-line lidar 1a and the monocular camera 1b:

L_{ΔT,i} ∈ {View(Lidar) ∩ View(Camera)}  (4)

For example, the single-line lidar 1a has a viewing angle of 180 ± 5° and an angular resolution of 0.5°, so each scan yields 361 obstacle points. Because an obstacle target must lie within the common viewing angle of View(Lidar) and View(Camera), the points to be clustered are restricted to the closed region {View(Lidar) ∩ View(Camera)}.
332) A clustering algorithm is applied in the intersection region limited in 331) to establish the local obstacle area distribution.
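The geometric constraint (4) amounts to keeping only returns whose bearing lies inside both fields of view before clustering. A minimal sketch, with illustrative FOV limits rather than the patent's exact sensor values:

```python
import math

def in_common_fov(points, lidar_fov=(-90.0, 90.0), cam_fov=(-30.0, 30.0)):
    """Keep only (x, y) points whose bearing (deg, 0 = straight ahead)
    lies in the intersection of the lidar and camera fields of view --
    the geometric-constraint rule (4). FOV limits here are illustrative."""
    lo = max(lidar_fov[0], cam_fov[0])
    hi = min(lidar_fov[1], cam_fov[1])
    out = []
    for x, y in points:
        bearing = math.degrees(math.atan2(x, y))  # x lateral, y longitudinal
        if lo <= bearing <= hi:
            out.append((x, y))
    return out
```

A point directly ahead or slightly off-axis is kept; a point at 90° (visible to the lidar only) is discarded before K-means runs.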
The pedestrian detection unit 4 detects pedestrian targets in the local obstacle area distribution map established by the data fusion unit 3. Its working process is exemplified as follows:
41) Since the data preprocessing unit 2 compressed the images captured by the monocular camera 1b from 1292×964 to 720×480 in the preprocessing stage, the single region of interest for a pedestrian target is set as a rectangle whose side length is given by the distance from the cluster center point to the horizontally and vertically farthest point in that class; several such rectangles make up the local obstacle area distribution map. The pedestrian detection unit 4 uses a HOG feature descriptor and a pre-trained SVM classifier, and then a multi-scale detector, to detect pedestrian targets. The cell size is 8×8 with 9 direction bins, the detection window size is 48×96, the search step is 8×8, the image-border padding parameter is 16×16, the scaling factor is 1.15, and the distance threshold between the feature vector (the HOG feature vector of the rectangular region to be detected) and the SVM separating hyperplane is 0.9. Since at most 6 search targets can be arranged, the maximum number of classes of the K-means clustering algorithm is set to K_max = 6. Cell and bin are parameters of the image HOG feature: a cell is a small unit composed of several pixels, and a bin is a direction block, e.g. 180° divided into 9 direction blocks of 20° each.
42) Using the improved K-means clustering algorithm, the fused data points are divided into at most K_max classes. For each class, a rectangle is extended around the cluster center point with side lengths given by the distance to the horizontally and vertically farthest point in that class, establishing at most K_max regions of interest for pedestrian targets.
The width w_c and height h_c of the rectangle extended for each class c after clustering are respectively:

w_c = 2·max_i |l_c(i)_x − μ_c_x|,  h_c = 2·max_i |l_c(i)_y − μ_c_y|

In the above formulas, μ_c_x is the abscissa of the cluster center point of class c, μ_c_y is the ordinate of the cluster center point of class c, l_c(i)_x is the abscissa of the i-th sample point in class c, and l_c(i)_y is the ordinate of the i-th sample point in class c.
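The per-cluster region of interest can be sketched as follows; the factor of two (a rectangle spanning the farthest offset on both sides of the centre) is an assumption, since the original w_c/h_c formulas are given only as figures:

```python
def cluster_roi(points):
    """Rectangle for one cluster c: centred on the cluster mean
    (mu_c_x, mu_c_y), with width/height twice the largest horizontal /
    vertical offset of any sample l_c(i) from the mean (assumed reading
    of w_c, h_c)."""
    n = len(points)
    mu_x = sum(p[0] for p in points) / n
    mu_y = sum(p[1] for p in points) / n
    w = 2 * max(abs(p[0] - mu_x) for p in points)
    h = 2 * max(abs(p[1] - mu_y) for p in points)
    return (mu_x, mu_y, w, h)
```

Each resulting rectangle is then handed to the HOG+SVM detector of step 41) as one search region.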
43) Using the pedestrian detection algorithm of step 41), a target search is carried out in the regions of interest determined in step 42) to detect pedestrian targets.
The pedestrian tracking unit 5 establishes a linear motion model G(t) of the pedestrian target and uses a linear target tracking algorithm, such as the Kalman algorithm, to track the pedestrian targets detected by the pedestrian detection unit 4 over successive frames, obtaining stable jaywalking-pedestrian information. The "jaywalking-pedestrian information" includes the pedestrian's relative position (x_r, y_r) and relative velocity (v_rx, v_ry), and is supplied to the pedestrian jaywalking behavior cognition module. The working process of the pedestrian tracking unit 5 specifically includes the following steps:
51) Establish the linear motion model G(t) of the pedestrian target. Assuming the jaywalking pedestrian moves at constant velocity and ignoring process noise, the linear motion model is:

G(t) = [X(t), Y(t), X′(t), Y′(t)]^T
G(t) = A·G(t−1)  (5)

In the above formulas, A is the constant-velocity state-transition matrix

A = [[1, 0, ΔT, 0], [0, 1, 0, ΔT], [0, 0, 1, 0], [0, 0, 0, 1]],

X = {x_1, x_2, …, x_N} (time variable t omitted) is the set of lateral positions of the N target center points, Y = {y_1, y_2, …, y_N} (time variable t omitted) is the set of longitudinal positions of the N target center points, X′ = {x′_1, x′_2, …, x′_N} is the set of lateral velocities of the N target center points, Y′ = {y′_1, y′_2, …, y′_N} is the set of longitudinal velocities of the N target center points, and ΔT is the data acquisition period.
52) Carry out successive-frame target tracking with the Kalman algorithm:
521) Use the fused monocular vision information from the data fusion unit 3 to obtain the position [X_z(t), Y_z(t)] of the pedestrian target center point at time t, and the single-line lidar 1a to obtain the relative velocity [X′_z(t), Y′_z(t)] of the pedestrian target center point; the combined [X_z(t), Y_z(t), X′_z(t), Y′_z(t)]^T serves as the observation vector of the system.
Likewise, X_z(t) is the set of lateral positions of the N observed target center points, Y_z(t) the set of their longitudinal positions, X′_z(t) the set of their lateral velocities, and Y′_z(t) the set of their longitudinal velocities.
522) Update the matrices using the observation matrix and state, and estimate the position at the next moment. Around the position predicted by the Kalman algorithm, a region of interest is set and an SVM is used to search for a match; if a target is detected in this region and exceeds the matching threshold, tracking continues; otherwise the target is treated as a newly appearing pedestrian target.
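Steps 51)–52) describe a standard constant-velocity Kalman filter with a full-state observation. A minimal single-target sketch, where the noise levels q and r are illustrative placeholders rather than learned values:

```python
import numpy as np

def kalman_cv_step(g, P, z, dt=0.1, q=1e-3, r=0.05):
    """One predict/update cycle of a constant-velocity Kalman filter on
    state G = [X, Y, X', Y']^T with full-state observation
    [Xz, Yz, Xz', Yz']^T (observation matrix H = I)."""
    A = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], float)
    Q = q * np.eye(4)                        # process noise (illustrative)
    R = r * np.eye(4)                        # measurement noise (illustrative)
    g_pred = A @ g                           # predict state
    P_pred = A @ P @ A.T + Q                 # predict covariance
    K = P_pred @ np.linalg.inv(P_pred + R)   # Kalman gain
    g_new = g_pred + K @ (z - g_pred)        # correct with measurement
    P_new = (np.eye(4) - K) @ P_pred
    return g_new, P_new
```

The predicted position g_pred is where step 522) would place the SVM re-matching region of interest.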
As shown in Fig. 1, the pedestrian jaywalking behavior cognition module receives the jaywalking-pedestrian information and, according to the received information and in combination with dynamic time-series information and the mutual influence of vehicle and pedestrian, carries out cognition of the pedestrian behavior. "Cognition" includes definition, learning, and prediction, and outputs pedestrian behavior cognitive information. For pedestrian jaywalking, the "pedestrian behavior cognitive information" covers the behaviors of crossing quickly, crossing slowly, and waiting.
In one embodiment, as shown in Figures 2 and 4, for the jaywalking scenario and on the basis of the environment sensing and fusion module, the pedestrian jaywalking behavior cognition module builds a pedestrian jaywalking behavior database and a pedestrian jaywalking behavior cognition network model, and then uses the historical information of a certain period to estimate and predict the state of the currently jaywalking pedestrian and output it. The historical information includes the vehicle speed information and the relative information between pedestrian and vehicle.
Preferably, the pedestrian jaywalking behavior cognition module includes a pedestrian jaywalking behavior database construction unit 6 and a pedestrian jaywalking behavior cognition network model construction unit 7, wherein:
The pedestrian jaywalking behavior database construction unit 6, on the basis of the jaywalking-pedestrian information output by the environment sensing and fusion module, collects real-road data of jaywalking pedestrians, defines the jaywalking behaviors, calibrates each pedestrian's jaywalking behavior by a calibration method, and stores the result as jaywalking instances, each instance including the relevant variable information over a time series, to serve as the pedestrian jaywalking behavior database. The specific working process is:
61) Collect real-road data: based on the jaywalking-pedestrian information obtained by the environment sensing and fusion module, collect real-road data of jaywalking pedestrians; the data concern only jaywalking scenarios. The real-road jaywalking data include the vehicle travel speed v_c, whether a pedestrian P is present, the longitudinal relative position x_r of vehicle and pedestrian, and the lateral relative position y_r of vehicle and pedestrian, and can be expressed as a vector D, D = {v_c, P, x_r, y_r}.
62) Define pedestrian jaywalking behaviors: as shown in the behavior description schematic of Fig. 4, according to prior knowledge such as driving experience or a driver's accumulated cognition of the driving environment, the jaywalking behaviors include waiting (wait), crossing quickly (cross straight fast), and crossing slowly (cross straight slowly), which can be expressed as a vector B, B = {wait, cross straight fast, cross straight slowly}.
63) Calibrate pedestrian jaywalking behaviors: according to the real-road jaywalking data collected in step 61), classify and calibrate each pedestrian's jaywalking behavior by the calibration method according to the behaviors defined in step 62).
64) Store the pedestrian jaywalking behavior data as the pedestrian jaywalking behavior database: the jaywalking behaviors calibrated in step 63) are stored as multiple jaywalking instances, forming the pedestrian jaywalking behavior database.
One jaywalking instance includes the relevant variable information over a time series (1 to t), expressed as the information set D_1:t, where t is the duration of the pedestrian's crossing and D is the variable information.
As shown in Fig. 2, the pedestrian jaywalking behavior cognition network model construction unit 7 defines the parameters and structure of the pedestrian jaywalking behavior cognition network and finally, based on the pedestrian jaywalking behavior database provided by the construction unit 6, optimizes and learns the network parameters to obtain the pedestrian jaywalking behavior cognition network model. The specific working process is as follows:
71) Define the parameters of the pedestrian jaywalking behavior cognition network: as shown in Fig. 5, the cognition network is based on a dynamic Bayesian network and includes an initial network B_1 and a transition network B_→. The initial network B_1 is defined as the conditional probability distribution of the initial state and the relations among the variables. The transition network B_→ is defined as the relations between time t−1 and time t together with their conditional probability distribution, and the pedestrian jaywalking behavior cognition network model satisfies the first-order Markov assumption.
The pedestrian jaywalking behavior cognition network model is a directed acyclic graph with a time dimension; variables are represented by nodes, and the mutual relations between variables are represented by directed arrows and conditional probabilities. The conditional probability distribution function is defined for the following three cases:
If a node and its parent nodes are all discrete variables, the conditional probability is expressed as:

P(Z = i | Pa(Z) = j) = P(i, j)  (6)

If a node is a discrete variable and its parent nodes are continuous variables, the conditional probability is given by formula (7); if a node and its parent nodes are all continuous variables, the conditional probability is given by formula (8).
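Formulas (7) and (8) appear only as images in the published text. In dynamic-Bayesian-network practice these two cases are most often written as a softmax (discrete node, continuous parents) and a linear-Gaussian model (continuous node, continuous parents); the forms below are an illustrative reconstruction under that assumption, not the patent's exact formulas:

```latex
% Discrete node Z with continuous parents u = Pa(Z) (assumed softmax form):
P(Z = i \mid Pa(Z) = u) = \frac{\exp\!\left(w_i^{T} u + b_i\right)}
                               {\sum_{k}\exp\!\left(w_k^{T} u + b_k\right)}

% Continuous node Z with continuous parents u (assumed linear-Gaussian form):
p(z \mid Pa(Z) = u) = \mathcal{N}\!\left(z;\; W u + \mu,\; \Sigma\right)
```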
The variables included in the pedestrian jaywalking behavior cognition network model are the vehicle travel speed v_c, the relative position (x_r, y_r) of vehicle and pedestrian, the lateral position and motion information (y_p, v_y) of the pedestrian, and the pedestrian's jaywalking behavior intention information (P_w, P_cf, P_cl).
72) Define the structure of the pedestrian jaywalking behavior cognition network: the structure is defined, by expert knowledge and driving experience, as the mutual-influence relations between the variables in the network, with directed arrows representing those relations.
The nodes in the network structure include the vehicle motion state node (V), the variable observation node (O), the pedestrian motion state node (P), the pedestrian-vehicle relative motion state node (R), and the pedestrian jaywalking behavior cognition node (B). The specific node information is as follows:
Vehicle motion state node: V = [v_c].
Variable observation node: O.
Pedestrian motion state node: P = [y_p, v_y].
Pedestrian-vehicle relative motion state node: R = [x_r, y_r].
Pedestrian jaywalking behavior cognition node: B = [P_w, P_cf, P_cl].
Through the pedestrian jaywalking behavior cognition network structure, the state transition between variables can be expressed by the following formula:

P(B_t, R_t, P_t | B_{t−1}, R_{t−1}, P_{t−1}, V_t) = P(B_t | B_{t−1}, R_{t−1}) · P(R_t | B_t, R_{t−1}, V_t) · P(P_t | B_t, R_t, P_{t−1})  (9)
73) Learn the parameters of the pedestrian jaywalking behavior cognition network: as shown in Fig. 2, the network parameters are learned from the pedestrian jaywalking behavior database obtained in step 64), based on the maximum likelihood estimation (MLE) method.
For example: on a real road, estimating and predicting the current pedestrian's jaywalking behavior in real time from the historical information of a certain period means inputting the current environment information and that historical information into the learned pedestrian jaywalking behavior cognition network model. Through the experience and knowledge acquired from the behavior database obtained in step 64), the model estimates and predicts the state of the currently jaywalking pedestrian and outputs it. The state of the currently jaywalking pedestrian includes the probability that the pedestrian waits (P_wait), the probability of crossing quickly (P_cross_fast), and the probability of crossing slowly (P_cross_slowly), abbreviated P_w, P_cf, and P_cl respectively.
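The recursive estimation described here can be illustrated with a single forward-filtering step over the three behavior classes; the transition matrix and observation likelihoods below are placeholders, not parameters learned from the patent's database:

```python
def behavior_filter(prior, trans, lik):
    """One forward-filtering step over (wait, cross-fast, cross-slow):
    P(B_t | D_1:t) is proportional to lik[j] * sum_i trans[i][j] * prior[i],
    normalized to sum to one. trans and lik are illustrative placeholders
    standing in for the learned DBN parameters."""
    post = []
    for j in range(3):
        pred = sum(trans[i][j] * prior[i] for i in range(3))
        post.append(lik[j] * pred)
    s = sum(post)
    return [p / s for p in post]
```

Iterating this step over the time series D_1:t yields the output probabilities (P_w, P_cf, P_cl) for the current pedestrian.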
As shown in Fig. 2, the automatic driving decision module receives the environment information and the pedestrian behavior cognition parameters and, according to the jaywalking-pedestrian information and the pedestrian behavior cognitive information, predicts and judges the pedestrian's intention, determines the vehicle's current passage speed, and outputs it.
Preferably, the automatic driving decision module includes an environment information and cognitive parameter unit 8, a fuzzy operation unit 9, and a defuzzification operation unit 10, wherein:
The environment information and cognitive parameter unit 8 inputs the environment information and cognitive parameters. The environment information parameters come from the pedestrian tracking unit 5 of the environment sensing and fusion module and include the vehicle travel speed (v_c), the pedestrian's lateral position on the road (y_p), the relative position (x_r, y_r) and relative velocity (v_rx, v_ry) of pedestrian and vehicle, and the pedestrian's lateral velocity (v_y). The cognitive parameters come from step 71) and include the pedestrian's jaywalking behavior intention information (P_w, P_cf, P_cl).
The fuzzy operation unit 9 fuzzifies the environment information and cognitive parameters, builds expert decision rules, and obtains a fuzzy result. Its working process is as follows:
91) Fuzzification: by defining membership functions, the environment information and cognitive parameters input by the environment information and cognitive parameter unit 8 are fuzzified and converted into the corresponding fuzzy sets. The fuzzy language of a piece of information is denoted T; for example, for the vehicle travel speed:

T(v_c) = {Z, S, M, H}

where Z means the speed approaches zero, S means the speed is low, M means the speed is medium, and H means the speed is high. It should be noted that the speed fuzzification here refers to fuzzification under urban road conditions, since the presence of pedestrians is not considered on expressways. The membership functions are as follows:
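The membership functions themselves appear only as an image in the published text. A triangular form, common in Mamdani-style fuzzy controllers, is sketched below purely as an illustration of what the function for the label S (low speed) might look like; the breakpoints a, b, c are hypothetical, not the patent's values:

```latex
\mu_{S}(v_c) =
\begin{cases}
0, & v_c \le a \\[4pt]
\dfrac{v_c - a}{b - a}, & a < v_c \le b \\[6pt]
\dfrac{c - v_c}{c - b}, & b < v_c < c \\[4pt]
0, & v_c \ge c
\end{cases}
```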
92) Build expert decision rules: according to the expert knowledge base, inference operations are carried out on the fuzzy variables in the fuzzy sets obtained in step 91) to obtain the fuzzy result.
The expert knowledge base is as follows, where P indicates whether a pedestrian is present (F: absent, T: present):
If P = F, then v_o = H; (when there is no pedestrian ahead, the vehicle can pass at speed)
If P = T, P_w = (Z or S), P_cf = (Z or S), and x_r = (Z or S), then v_o = Z;
If P = T, P_w = (Z or S), P_cf = (M or H), and x_r = (Z or S), then v_o = Z;
If P = T, P_w = (Z or S), P_cf = (Z or S), and x_r = M, then v_o = S;
If P = T, P_w = (Z or S), P_cf = (M or H), and x_r = M, then v_o = Z;
If P = T, P_w = (Z or S), P_cf = (Z or S), and x_r = H, then v_o = M;
If P = T, P_w = (Z or S), P_cf = (M or H), and x_r = H, then v_o = MS;
If P = T, P_w = M, P_cf = (Z or S), and x_r = H, then v_o = H;
If P = T, P_w = M, P_cf = (M or H), and x_r = H, then v_o = MH;
If P = T, P_w = M, P_cf = (Z or S), and x_r = M, then v_o = M;
If P = T, P_w = M, P_cf = (M or H), and x_r = M, then v_o = MS;
If P = T, P_w = M, P_cf = (Z or S), and x_r = (S or Z), then v_o = S;
If P = T, P_w = M, P_cf = (M or H), and x_r = (S or Z), then v_o = Z;
If P = T, P_w = H, and x_r = (M or H), then v_o = H;
If P = T, P_w = H, and x_r = (Z or S), then v_o = M.
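The rule table above can be rendered as a plain decision function over crisp labels; it encodes the listed rules over Z/S/M/H (and the intermediate outputs MS/MH) but performs no actual fuzzy inference, so it is only a readable restatement of the knowledge base:

```python
def decide_speed(P, Pw, Pcf, xr):
    """Crisp-label restatement of the expert rule base: P is the
    pedestrian-present flag; Pw, Pcf, xr are the fuzzy labels for the
    waiting probability, fast-crossing probability, and longitudinal
    relative position; returns the output speed label v_o."""
    if not P:
        return 'H'                               # no pedestrian: pass at speed
    slowish = Pcf in ('Z', 'S')                  # fast crossing unlikely
    if Pw in ('Z', 'S'):                         # pedestrian unlikely to wait
        if xr in ('Z', 'S'):
            return 'Z'                           # pedestrian close: stop
        if xr == 'M':
            return 'S' if slowish else 'Z'
        return 'M' if slowish else 'MS'          # xr == 'H'
    if Pw == 'M':
        if xr == 'H':
            return 'H' if slowish else 'MH'
        if xr == 'M':
            return 'M' if slowish else 'MS'
        return 'S' if slowish else 'Z'           # xr in ('S', 'Z')
    return 'H' if xr in ('M', 'H') else 'M'      # Pw == 'H': likely to wait
```

In the actual controller each rule fires with a degree of membership rather than a crisp label; the defuzzification unit then combines the fired outputs.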
The fuzzy variables in the fuzzy sets include the vehicle travel speed fuzzy variable, the fuzzy variable of the pedestrian's lateral position on the road, the pedestrian-vehicle relative position fuzzy variable, the pedestrian lateral velocity fuzzy variable, and the pedestrian behavior intention fuzzy variable.
The fuzzy result includes the fuzzy vehicle passage speed information and its degree of fuzziness.
The defuzzification operation unit 10 defuzzifies the fuzzy result output by the fuzzy operation unit 9 using the weighted-average method to obtain the actual vehicle passage speed, which is sent to the bottom-level actuators of the autonomous vehicle by the vehicle passage speed output submodule.
The defuzzification is given by the following formula:

v_o = Σ_i λ_i·v_i / Σ_i λ_i

where λ_i is the weight of a fuzzy output variable and v_i is the model output speed according to the membership function.
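The weighted-average defuzzification can be sketched directly; λ_i and v_i are the rule weights and member output speeds as defined above:

```python
def defuzzify(weights, speeds):
    """Weighted-average defuzzification: v_o = sum(lam_i * v_i) / sum(lam_i),
    with lam_i the firing weight of each fired rule and v_i the crisp
    speed associated with its output membership function."""
    num = sum(lam * v for lam, v in zip(weights, speeds))
    den = sum(weights)
    return num / den
```

For instance, a rule recommending 0 m/s fired at weight 0.2 and a rule recommending 10 m/s fired at weight 0.8 yield a crisp command of 8 m/s.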
The present invention also provides a vehicle speed control method, which comprises the following steps:
S100: detect jaywalking pedestrians ahead of the traveling vehicle, and output jaywalking-pedestrian information;
S200: according to the jaywalking-pedestrian information, and in combination with dynamic time-series information and the mutual influence of vehicle and pedestrian, carry out cognition of the pedestrian behavior, and output pedestrian behavior cognitive information; and
S300: according to the jaywalking-pedestrian information and the pedestrian behavior cognitive information, predict and judge the pedestrian's intention, determine the vehicle's current passage speed, and output it.
This embodiment can accurately identify pedestrian targets ahead of the vehicle while it is traveling, predict pedestrian intentions, make judgments in real time, and take reasonable decisions, realizing autonomous recognition and decision-making of the autonomous vehicle when a pedestrian jaywalks, so that the vehicle safely and smoothly avoids the jaywalking pedestrian.
In one embodiment, S100 specifically includes:
S110: the single-line lidar collects radar information on obstacles ahead of the traveling vehicle, and the monocular camera collects video of pedestrians ahead of the vehicle;
S120: filter the obstacle radar information and pedestrian video information collected in step S110;
S130: carry out joint calibration, data synchronization, and K-means clustering on the obstacle radar information and pedestrian video information preprocessed in step S120, to obtain the local obstacle area distribution;
S140: detect pedestrian targets in the local obstacle area distribution map established in step S130; and
S150: establish a linear motion model of the pedestrian target and, using a linear target tracking algorithm, track the pedestrian targets detected in step S140 over successive frames, obtaining stable jaywalking-pedestrian information.
In one embodiment, S200 specifically includes:
S210: on the basis of the jaywalking-pedestrian information, collect real-road data of jaywalking pedestrians, define the jaywalking behaviors, calibrate each pedestrian's jaywalking behavior by a calibration method, and store the result as jaywalking instances, forming the pedestrian jaywalking behavior database; and
S220: define the parameters and structure of the pedestrian jaywalking behavior cognition network and, based on the pedestrian jaywalking behavior database, optimize and learn the network parameters to obtain the pedestrian behavior cognitive information.
Preferably, step S210 includes:
S211: collect real-road data: based on the jaywalking-pedestrian information obtained by the environment sensing and fusion module, collect real-road data of jaywalking pedestrians, expressed as a vector D, D = {v_c, P, x_r, y_r}, where v_c is the vehicle travel speed, P indicates whether a pedestrian is present, x_r is the longitudinal relative position of vehicle and pedestrian, and y_r is the lateral relative position of vehicle and pedestrian;
S212: define pedestrian jaywalking behaviors: the jaywalking behaviors are expressed as a vector B = {wait, cross straight fast, cross straight slowly}, where wait denotes the waiting behavior, cross straight fast denotes crossing quickly, and cross straight slowly denotes crossing slowly;
S213: calibrate pedestrian jaywalking behaviors: according to the real-road jaywalking data collected in step S211, classify and calibrate each pedestrian's jaywalking behavior by the calibration method according to the behaviors defined in step S212;
S214: store the pedestrian jaywalking behavior data as the pedestrian jaywalking behavior database: the jaywalking behaviors calibrated in step S213 are stored as multiple jaywalking instances, forming the pedestrian jaywalking behavior database.
Preferably, step S220 includes:
S221: define the parameters of the pedestrian jaywalking behavior cognition network: the cognition network is based on a dynamic Bayesian network and includes an initial network B_1 and a transition network B_→; the initial network B_1 is defined as the conditional probability distribution of the initial state and the relations among the variables, the transition network B_→ is defined as the relations between time t−1 and time t with their conditional probability distribution, and the pedestrian behavior cognition model satisfies the first-order Markov assumption;
S222: the pedestrian behavior cognition model is a directed acyclic graph with a time dimension; variables are represented by nodes, the mutual relations between variables by directed arrows and conditional probabilities, and the conditional probability distribution function is defined for the following three cases:
If a node and its parent nodes are all discrete variables, the conditional probability is expressed as:

P(Z = i | Pa(Z) = j) = P(i, j)

If a node is a discrete variable and its parent nodes are continuous variables, the conditional probability is given by formula (7); if a node and its parent nodes are all continuous variables, the conditional probability is given by formula (8).
The variables included in the pedestrian jaywalking behavior cognition network model are the vehicle travel speed v_c, the relative position (x_r, y_r) of vehicle and pedestrian, the lateral position and motion information (y_p, v_y) of the pedestrian, and the pedestrian's jaywalking behavior intention information (P_w, P_cf, P_cl);
S223: define the structure of the pedestrian jaywalking behavior cognition network: the structure is defined, by expert knowledge and driving experience, as the mutual-influence relations between the variables in the network, with directed arrows representing those relations;
The nodes in the network structure include the vehicle motion state node (V), the variable observation node (O), the pedestrian motion state node (P), the pedestrian-vehicle relative motion state node (R), and the pedestrian jaywalking behavior cognition node (B); the specific node information is as follows:
Vehicle motion state node: V = [v_c];
Variable observation node: O;
Pedestrian motion state node: P = [y_p, v_y];
Pedestrian-vehicle relative motion state node: R = [x_r, y_r];
Pedestrian jaywalking behavior cognition node: B = [P_w, P_cf, P_cl];
Through the pedestrian jaywalking behavior cognition network structure, the state transition between variables can be expressed by the following formula:

P(B_t, R_t, P_t | B_{t−1}, R_{t−1}, P_{t−1}, V_t) = P(B_t | B_{t−1}, R_{t−1}) · P(R_t | B_t, R_{t−1}, V_t) · P(P_t | B_t, R_t, P_{t−1})
S224: learn the parameters of the pedestrian jaywalking behavior cognition network: the network parameters are learned from the pedestrian jaywalking behavior database obtained in step S214, based on the MLE method.
For example: on a real road, estimating and predicting the current pedestrian's jaywalking behavior in real time from the historical information of a certain period means inputting the current environment information and that historical information into the learned pedestrian behavior cognition model; through the experience and knowledge acquired from the pedestrian jaywalking behavior database, the model estimates and predicts the state of the currently jaywalking pedestrian and outputs it; that state includes the probability that the pedestrian waits (P_wait), the probability of crossing quickly (P_cross_fast), and the probability of crossing slowly (P_cross_slowly).
In one embodiment, step S300 specifically includes:
inputting the environment information and cognitive parameters;
fuzzifying the environment information and cognitive parameters, building expert decision rules, and obtaining a fuzzy result; and defuzzifying the fuzzy result by the weighted-average method to obtain the actual vehicle passage speed, which is sent to the bottom-level actuators of the autonomous vehicle by the vehicle passage speed output submodule.
The specific implementation of each step has been described in detail above and is not repeated here.
Finally, it should be noted that the above embodiments are intended only to illustrate, not to limit, the technical solutions of the present invention. Those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may be modified, or some of their technical features replaced by equivalents; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention.
Claims (11)
1. A vehicle speed control device, characterized in that it comprises:
an environment sensing and fusion module for detecting jaywalking pedestrians ahead of the traveling vehicle and outputting jaywalking-pedestrian information;
a pedestrian jaywalking behavior cognition module for receiving the jaywalking-pedestrian information and, according to the received jaywalking-pedestrian information and in combination with dynamic time-series information and the mutual influence of vehicle and pedestrian, carrying out cognition of the pedestrian behavior and outputting pedestrian behavior cognitive information; and
an automatic driving decision module for receiving the environment information and the pedestrian behavior cognition parameters and, according to the jaywalking-pedestrian information and the pedestrian behavior cognitive information, predicting and judging the pedestrian's intention, determining the vehicle's current passage speed, and outputting it.
2. The vehicle speed control device as claimed in claim 1, characterized in that the environment sensing and fusion module comprises:
an information acquisition unit (1), for collecting data relating to the road ahead of the traveling vehicle;
a data preprocessing unit (2), for filtering the data collected by the information acquisition unit (1);
a data fusion unit (3), for performing joint calibration, data synchronization and K-means clustering on the data preprocessed by the data preprocessing unit (2), to obtain the local obstacle area distribution;
a pedestrian detection unit (4), for detecting pedestrian targets in the local obstacle area distribution map established by the data fusion unit (3); and
a pedestrian tracking unit (5), for establishing a linear motion model of the pedestrian target and performing frame-by-frame target tracking of the pedestrian targets detected by the pedestrian detection unit (4) using a linear target tracking algorithm, to obtain stable crossing-pedestrian information.
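The "linear motion model" and "linear target tracking algorithm" of the pedestrian tracking unit (5) could be realized, for example, by a constant-velocity Kalman filter; the claim does not name a specific filter, and the frame period and noise settings below are assumptions:

```python
import numpy as np

# Constant-velocity model: state [x, y, vx, vy] in the vehicle frame,
# measurements are per-frame pedestrian detections (x, y).
dt = 0.1  # assumed camera frame period (10 Hz)

F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], float)   # linear motion model
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], float)   # position-only observation
Q = np.eye(4) * 0.01                  # process noise (assumed)
R = np.eye(2) * 0.05                  # measurement noise (assumed)

def track(detections):
    """Filter a sequence of (x, y) detections frame by frame;
    returns the smoothed state after each update."""
    x = np.array([*detections[0], 0.0, 0.0])
    P = np.eye(4)
    states = []
    for z in detections:
        x = F @ x
        P = F @ P @ F.T + Q                    # predict
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
        x = x + K @ (np.asarray(z) - H @ x)    # update with detection
        P = (np.eye(4) - K @ H) @ P
        states.append(x.copy())
    return states
```

Feeding the filter detections of a pedestrian walking at roughly 1 m/s yields a stable position and a velocity estimate that can serve as the "stable crossing-pedestrian information" downstream.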
3. The vehicle speed control device as claimed in claim 2, characterized in that:
the information acquisition unit (1) comprises a single-line laser radar (1a) and a monocular camera (1b); the single-line laser radar (1a) is used to collect radar information of obstacles ahead of the traveling vehicle, and the monocular camera (1b) is used to collect video information of pedestrians ahead of the traveling vehicle;
the working process of the data fusion unit (3) is specifically as follows:
31) joint calibration: using a perspective transform, the radar coordinate system [X_lidar, Y_lidar, 1] and the image coordinate system [X_camera, Y_camera, 1] are transformed into the vehicle coordinate system [X_vehicle, Y_vehicle, 1] centered on the front bumper of the vehicle; the perspective transformation model is as follows:
the transformation matrix parameters of perspective transformation models (2) and (3) are obtained using 12 pairs of local coordinates under the three coordinate systems of the laser radar, the image and the vehicle;
32) data synchronization: the acquisition frequencies of the monocular camera (1b) and the single-line laser radar (1a) are set and, taking the acquisition frequency of the monocular camera (1b) as the reference, the data collected by the single-line laser radar (1a) are sampled in real time, thereby synchronizing the data of the monocular camera (1b) and the single-line laser radar (1a);
33) K-means clustering to obtain the local obstacle area distribution, specifically comprising:
331) constraining the clustering region by a geometric constraint rule, the region being the intersection {View(Lidar) ∩ View(Camera)} of the field of view View(Lidar) of the single-line laser radar (1a) and the field of view View(Camera) of the monocular camera (1b); the geometric constraint rule is that an obstacle target must lie within this intersection:
L_{ΔT,i} ∈ {View(Lidar) ∩ View(Camera)} (4)
332) establishing the local obstacle area distribution by applying a clustering algorithm within the intersection region constrained in 331).
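The joint calibration of step 31) amounts to fitting a perspective (homography) transform from point correspondences. The patent's equations (2) and (3) are not reproduced in this text, so the following is only a generic least-squares (DLT) sketch; the 12 point pairs become any n ≥ 4 correspondences:

```python
import numpy as np

def fit_perspective(src_pts, dst_pts):
    """Estimate the 3x3 perspective matrix mapping homogeneous
    [X, Y, 1] source coordinates to target coordinates by linear
    least squares (direct linear transform)."""
    A = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The solution is the right singular vector of the smallest
    # singular value of A.
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_perspective(H, pt):
    """Map a point through H with homogeneous normalization."""
    u, v, w = H @ np.array([pt[0], pt[1], 1.0])
    return u / w, v / w
```

In the patent's setup this fit would be run twice: lidar coordinates to vehicle coordinates and image coordinates to vehicle coordinates, each from the calibrated local coordinate pairs.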
4. The vehicle speed control device as claimed in claim 1, 2 or 3, characterized in that the pedestrian crossing behavior cognition module comprises:
a pedestrian crossing behavior database construction unit (6), for, on the basis of the crossing-pedestrian information output by the environment sensing and fusion module, collecting real-road data of pedestrians crossing the road, defining the pedestrian crossing behaviors, calibrating each pedestrian crossing behavior by a calibration method, and storing them as pedestrian crossing instances to serve as the pedestrian crossing behavior database; and
a pedestrian crossing behavior cognition network model construction unit (7), for defining the parameters and the structure of the pedestrian crossing behavior cognition network and, based on the pedestrian crossing behavior database given by the pedestrian crossing behavior database construction unit (6), optimizing and learning the network parameters to obtain the pedestrian behavior cognitive information.
5. The vehicle speed control device as claimed in claim 4, characterized in that the automatic driving decision module comprises:
an environment information and cognitive parameter input unit (8), for inputting environment information and cognitive parameters;
a fuzzy operation unit (9), for fuzzifying the environment information and cognitive parameters, building expert decision rules, and obtaining a fuzzy result; and
a defuzzification unit (10), for defuzzifying, by a weighting method, the fuzzy result output by the fuzzy operation unit (9) to obtain the actual vehicle passing speed, which is sent to the low-level actuators of the autonomous vehicle by the vehicle passing speed output submodule.
6. A vehicle speed control method, characterized in that it comprises the following steps:
S100, detecting pedestrians crossing the road ahead of the traveling vehicle and outputting crossing-pedestrian information;
S200, according to the crossing-pedestrian information, and in combination with dynamic time-series information and the mutual influence between the vehicle and the pedestrian, performing cognition on the pedestrian behavior and outputting pedestrian behavior cognitive information; and
S300, predicting and judging the pedestrian's intention according to the crossing-pedestrian information and the pedestrian behavior cognitive information, and giving and outputting the current passing speed of the vehicle.
7. The vehicle speed control method as claimed in claim 6, characterized in that step S100 specifically comprises:
S110, collecting radar information of obstacles ahead of the traveling vehicle with a single-line laser radar, and collecting video information of pedestrians ahead of the traveling vehicle with a monocular camera;
S120, filtering the radar information of obstacles ahead and the pedestrian video information collected in step S110;
S130, performing joint calibration, data synchronization and K-means clustering on the radar information and pedestrian video information preprocessed in step S120, to obtain the local obstacle area distribution;
S140, detecting pedestrian targets in the local obstacle area distribution map established in step S130; and
S150, establishing a linear motion model of the pedestrian target and performing frame-by-frame target tracking of the pedestrian targets detected in step S140 using a linear target tracking algorithm, to obtain stable crossing-pedestrian information.
8. The vehicle speed control method as claimed in claim 6 or 7, characterized in that step S200 specifically comprises:
S210, on the basis of the crossing-pedestrian information, collecting real-road data of pedestrians crossing the road, defining the pedestrian crossing behaviors, calibrating each pedestrian crossing behavior by a calibration method, and storing them as pedestrian crossing instances to serve as the pedestrian crossing behavior database; and
S220, defining the parameters and the structure of the pedestrian crossing behavior cognition network and, based on the pedestrian crossing behavior database, optimizing and learning the network parameters to obtain the pedestrian behavior cognitive information.
9. The vehicle speed control method as claimed in claim 8, characterized in that step S210 comprises:
S211, collecting real-road data: based on the crossing-pedestrian information obtained by the environment sensing and fusion module, collecting real-road data of pedestrians crossing the road; the real-road data of a pedestrian crossing the road are expressed as a vector D, D = {v_c, P, x_r, y_r}, wherein v_c denotes the travel speed of the vehicle, P indicates whether there is a pedestrian, x_r denotes the longitudinal relative position of the vehicle and the pedestrian, and y_r denotes the lateral relative position of the vehicle and the pedestrian;
S212, defining the pedestrian crossing behaviors: the pedestrian crossing behaviors are expressed as a vector B = {wait, cross straight fast, cross straight slowly}, wherein wait denotes the waiting behavior, cross straight fast denotes crossing quickly, and cross straight slowly denotes crossing slowly;
S213, calibrating the pedestrian crossing behaviors: according to the real-road pedestrian crossing data collected in step S211, classifying and calibrating, by the calibration method, the behavior of each crossing pedestrian against the pedestrian crossing behaviors defined in step S212; and
S214, storing the pedestrian crossing behavior data as the pedestrian crossing behavior database: storing the pedestrian crossing behaviors calibrated in step S213 as multiple pedestrian crossing instances, forming the pedestrian crossing behavior database.
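Steps S211 to S214 can be sketched as building a list of labeled instances. The field names, class, and sample values below are hypothetical, chosen only to mirror the vectors D and B defined in the claim:

```python
from dataclasses import dataclass

# Behaviors defined in S212 (vector B).
BEHAVIORS = ("wait", "cross_straight_fast", "cross_straight_slowly")

@dataclass
class CrossingInstance:
    """One pedestrian crossing instance: sample D plus its label."""
    v_c: float        # vehicle travel speed
    pedestrian: bool  # P: whether a pedestrian is present
    x_r: float        # longitudinal relative position
    y_r: float        # lateral relative position
    behavior: str     # calibrated label from BEHAVIORS

def calibrate(sample, behavior, database):
    """S213/S214: attach a defined behavior label to a sample D
    and store the instance in the behavior database."""
    if behavior not in BEHAVIORS:
        raise ValueError(f"undefined behavior: {behavior}")
    database.append(CrossingInstance(*sample, behavior=behavior))

db = []  # the pedestrian crossing behavior database
calibrate((8.3, True, 14.0, 2.5), "wait", db)
calibrate((6.1, True, 10.5, 1.2), "cross_straight_fast", db)
```

Rejecting labels outside the defined set keeps the database consistent with the behavior definitions of S212, which matters later when the network parameters are learned from these instances.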
10. The vehicle speed control method as claimed in claim 9, characterized in that step S220 comprises:
S221, defining the parameters of the pedestrian crossing behavior cognition network: the cognition network is based on a dynamic Bayesian network, which comprises a prior network B_1 and a transfer network B_→; the prior network B_1 is defined as the conditional probability distribution of the initial state and the relations among the variables; the transfer network B_→ defines, by conditional probability distributions, the relation between time t-1 and time t; and the pedestrian behavior cognitive information satisfies the first-order Markov assumption;
S222, the pedestrian behavior cognitive information is a directed acyclic graph with a time dimension, in which variables are represented by nodes and the mutual relations between variables are represented by directed arrows and conditional probabilities; the conditional probability distribution function is defined for the following three cases:
if a node and its parent nodes are all discrete variables, the conditional probability is expressed as follows:
P(Z = i | Pa(Z) = j) = P(i, j)
if a node is a discrete variable and its parent nodes are continuous variables, the conditional probability is expressed as follows:
if a node is a continuous variable and its parent nodes are also continuous variables, the conditional probability is as follows:
the variables included in the pedestrian crossing behavior cognition network model are the travel speed v_c of the vehicle, the relative position information (x_r, y_r) of the vehicle and the pedestrian, the lateral position and motion information (y_p, v_y) of the pedestrian, and the pedestrian crossing behavior intention information (P_w, P_cf, P_cl);
S223, defining the structure of the pedestrian crossing behavior cognition network: the structure is defined through expert knowledge and driving experience, which specify the mutual influence relations among the variables in the network, represented by directed arrows between variables;
the nodes in the pedestrian crossing behavior cognition network structure include a vehicle motion state node (V), variable observation nodes O, a pedestrian motion state node (P), a pedestrian-vehicle relative motion state node (R) and a pedestrian crossing behavior cognition node (B); the specific node information is as follows:
vehicle motion state node: V = [v_c];
variable observation nodes:
pedestrian motion state node:
pedestrian-vehicle relative motion state node:
pedestrian crossing behavior cognition node: B = [P_w, P_cf, P_cl];
from the pedestrian crossing behavior cognition network structure, the state transition between the variables can be expressed by the following formula:
P(B_t, R_t, P_t | B_{t-1}, R_{t-1}, P_{t-1}, V_t) = P(B_t | B_{t-1}, R_{t-1}) P(R_t | B_t, R_{t-1}, V_t) P(P_t | B_t, R_t, P_{t-1});
S224, learning the parameters of the pedestrian crossing behavior cognition network: the network parameters are learned by the MLE method from the pedestrian crossing behavior database obtained in step S214.
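For the discrete nodes of such a network, the MLE learning of step S224 reduces to normalized counts over the database instances. A sketch for a single transition table such as P(B_t | B_{t-1}), with illustrative state names and sequences (the real database and conditioning variables differ):

```python
from collections import Counter

def mle_transition(sequences, states):
    """Maximum-likelihood estimate of a discrete transition table
    from labeled behavior sequences: normalized transition counts."""
    counts = {s: Counter() for s in states}
    for seq in sequences:
        for prev, cur in zip(seq, seq[1:]):
            counts[prev][cur] += 1
    cpt = {}
    for prev in states:
        total = sum(counts[prev].values())
        # Fall back to a uniform row when a state was never observed.
        cpt[prev] = {s: (counts[prev][s] / total if total
                         else 1.0 / len(states))
                     for s in states}
    return cpt

states = ("wait", "cross_fast", "cross_slow")
seqs = [("wait", "wait", "cross_fast", "cross_fast"),
        ("wait", "cross_slow", "cross_slow")]
cpt = mle_transition(seqs, states)
```

In the full network the counts would additionally be conditioned on the other parents named in the factorization (e.g. R_{t-1} for B_t), but the counting-and-normalizing principle is the same.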
11. The vehicle speed control method as claimed in claim 10, characterized in that step S300 specifically comprises:
inputting environment information and cognitive parameters;
fuzzifying the environment information and cognitive parameters, building expert decision rules, and obtaining a fuzzy result; and
defuzzifying the fuzzy result by a weighting method to obtain the actual vehicle passing speed, which is sent to the low-level actuators of the autonomous vehicle by the vehicle passing speed output submodule.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610806022.6A CN106428000B (en) | 2016-09-07 | 2016-09-07 | A kind of vehicle speed control device and method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106428000A true CN106428000A (en) | 2017-02-22 |
CN106428000B CN106428000B (en) | 2018-12-21 |
Family
ID=58164773
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610806022.6A Active CN106428000B (en) | 2016-09-07 | 2016-09-07 | A kind of vehicle speed control device and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106428000B (en) |
Cited By (51)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108256487A (en) * | 2018-01-19 | 2018-07-06 | 北京工业大学 | A kind of driving state detection device and method based on reversed binocular |
CN108733042A (en) * | 2017-04-19 | 2018-11-02 | 上海汽车集团股份有限公司 | The method for tracking target and device of automatic driving vehicle |
CN108764373A (en) * | 2018-06-08 | 2018-11-06 | 北京领骏科技有限公司 | A kind of sensing data filtering and fusion method in automatic Pilot |
CN108803626A (en) * | 2018-08-16 | 2018-11-13 | 大连民族大学 | The system of Autonomous Vehicle or DAS (Driver Assistant System) programme path |
WO2018205245A1 (en) * | 2017-05-12 | 2018-11-15 | 中国科学院深圳先进技术研究院 | Strategy network model generation method and apparatus for automatic vehicle driving |
CN108961838A (en) * | 2018-08-16 | 2018-12-07 | 大连民族大学 | Road pedestrian categorizing system |
CN108974006A (en) * | 2017-06-01 | 2018-12-11 | 本田技研工业株式会社 | Prediction meanss, vehicle, prediction technique and storage medium |
CN109034120A (en) * | 2018-08-27 | 2018-12-18 | 合肥工业大学 | Scene understanding method towards smart machine independent behaviour |
CN109063642A (en) * | 2018-08-01 | 2018-12-21 | 广州大学 | The prediction technique and system that pedestrian based on HMM algorithm goes across the road |
CN109147389A (en) * | 2018-08-16 | 2019-01-04 | 大连民族大学 | The method of Autonomous Vehicle or DAS (Driver Assistant System) programme path |
CN109165591A (en) * | 2018-08-16 | 2019-01-08 | 大连民族大学 | road pedestrian classification method |
CN109407660A (en) * | 2017-08-18 | 2019-03-01 | 通用汽车环球科技运作有限责任公司 | It is controlled using strategy triggering and the independent behaviour executed |
CN109532826A (en) * | 2017-09-21 | 2019-03-29 | 天津所托瑞安汽车科技有限公司 | A kind of radar anticollision method for early warning based on the optimization of lane line Visual identification technology |
CN109544604A (en) * | 2018-11-28 | 2019-03-29 | 天津工业大学 | Method for tracking target based on cognition network |
CN109670597A (en) * | 2017-09-20 | 2019-04-23 | 顾泽苍 | A kind of more purpose control methods of the machine learning of automatic Pilot |
CN109703563A (en) * | 2017-10-25 | 2019-05-03 | 本田技研工业株式会社 | Vehicle, travel controlling system and travel control method |
CN109712388A (en) * | 2019-01-24 | 2019-05-03 | 华南理工大学 | The street crossing of a kind of non-motor vehicle or pedestrian are intended to detection system and method |
CN109859527A (en) * | 2019-01-30 | 2019-06-07 | 杭州鸿泉物联网技术股份有限公司 | A kind of non-motor vehicle turning method for early warning and device |
CN110378178A (en) * | 2018-09-30 | 2019-10-25 | 长城汽车股份有限公司 | Method for tracking target and device |
CN110531753A (en) * | 2018-05-24 | 2019-12-03 | 通用汽车环球科技运作有限责任公司 | Control system, control method and the controller of autonomous vehicle |
WO2019232913A1 (en) * | 2018-06-08 | 2019-12-12 | 北京洪泰同创信息技术有限公司 | Method for controlling transportation means, device, and system |
CN110738081A (en) * | 2018-07-19 | 2020-01-31 | 杭州海康威视数字技术股份有限公司 | Abnormal road condition detection method and device |
CN111045025A (en) * | 2018-10-15 | 2020-04-21 | 图森有限公司 | Vehicle tracking method and system based on light detection and distance measurement |
CN111095380A (en) * | 2017-09-20 | 2020-05-01 | 本田技研工业株式会社 | Vehicle control device, vehicle control method, and program |
CN111344761A (en) * | 2017-11-15 | 2020-06-26 | 三菱电机株式会社 | Vehicle exterior communication device, vehicle exterior communication method, information processing device, and vehicle exterior communication program |
CN111542295A (en) * | 2017-12-28 | 2020-08-14 | 四川金瑞麒智能科学技术有限公司 | Automatic driving method and system for intelligent wheelchair and computer readable medium |
CN111785009A (en) * | 2020-07-20 | 2020-10-16 | 华录易云科技有限公司 | Pedestrian crossing active early warning method and system based on video detection |
CN111886638A (en) * | 2018-03-28 | 2020-11-03 | 京瓷株式会社 | Image processing device, imaging device, and moving object |
CN111907520A (en) * | 2020-07-31 | 2020-11-10 | 东软睿驰汽车技术(沈阳)有限公司 | Pedestrian posture recognition method and device and unmanned automobile |
CN112009470A (en) * | 2020-09-08 | 2020-12-01 | 科大讯飞股份有限公司 | Vehicle running control method, device, equipment and storage medium |
CN112015178A (en) * | 2020-08-20 | 2020-12-01 | 中国第一汽车股份有限公司 | Control method, device, equipment and storage medium |
CN112131756A (en) * | 2020-10-10 | 2020-12-25 | 清华大学 | Pedestrian crossing scene simulation method considering individual shock rate |
CN112242069A (en) * | 2019-07-17 | 2021-01-19 | 华为技术有限公司 | Method and device for determining vehicle speed |
CN112572462A (en) * | 2019-09-30 | 2021-03-30 | 北京百度网讯科技有限公司 | Automatic driving control method and device, electronic equipment and storage medium |
WO2021134357A1 (en) * | 2019-12-30 | 2021-07-08 | 深圳元戎启行科技有限公司 | Perception information processing method and apparatus, computer device and storage medium |
WO2021134169A1 (en) * | 2019-12-30 | 2021-07-08 | 华为技术有限公司 | Trajectory prediction method and related device |
US20210294323A1 (en) * | 2020-03-17 | 2021-09-23 | Nissan North America, Inc. | Apparatus and Method for Post-Processing a Decision-Making Model of an Autonomous Vehicle Using Multivariate Data |
CN113561974A (en) * | 2021-08-25 | 2021-10-29 | 清华大学 | Collision risk prediction method based on vehicle behavior interaction and road structure coupling |
WO2021217752A1 (en) * | 2020-04-27 | 2021-11-04 | 清华大学 | Vehicle-pedestrian collision risk region calculation method and safety evaluation system |
CN114475587A (en) * | 2022-01-30 | 2022-05-13 | 重庆长安汽车股份有限公司 | Risk assessment algorithm introducing target behaviors and collision probability |
US11500380B2 (en) | 2017-02-10 | 2022-11-15 | Nissan North America, Inc. | Autonomous vehicle operational management including operating a partially observable Markov decision process model instance |
WO2022247733A1 (en) * | 2021-05-25 | 2022-12-01 | 华为技术有限公司 | Control method and apparatus |
US11577746B2 (en) | 2020-01-31 | 2023-02-14 | Nissan North America, Inc. | Explainability of autonomous vehicle decision making |
US11613269B2 (en) | 2019-12-23 | 2023-03-28 | Nissan North America, Inc. | Learning safety and human-centered constraints in autonomous vehicles |
US11635758B2 (en) | 2019-11-26 | 2023-04-25 | Nissan North America, Inc. | Risk aware executor with action set recommendations |
CN116186336A (en) * | 2023-03-01 | 2023-05-30 | 丰田自动车株式会社 | Driving data acquisition and calibration method, device and storage medium |
US11702070B2 (en) | 2017-10-31 | 2023-07-18 | Nissan North America, Inc. | Autonomous vehicle operation with explicit occlusion reasoning |
US11714971B2 (en) | 2020-01-31 | 2023-08-01 | Nissan North America, Inc. | Explainability of autonomous vehicle decision making |
US11874120B2 (en) | 2017-12-22 | 2024-01-16 | Nissan North America, Inc. | Shared autonomous vehicle operational management |
US11899454B2 (en) | 2019-11-26 | 2024-02-13 | Nissan North America, Inc. | Objective-based reasoning in autonomous vehicle decision-making |
US11904854B2 (en) | 2020-03-30 | 2024-02-20 | Toyota Research Institute, Inc. | Systems and methods for modeling pedestrian activity |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07105477A (en) * | 1993-10-08 | 1995-04-21 | Toyota Motor Corp | Pedestrain sensing alarm system for vehicle |
CN102096803A (en) * | 2010-11-29 | 2011-06-15 | 吉林大学 | Safe state recognition system for people on basis of machine vision |
CN104573646A (en) * | 2014-12-29 | 2015-04-29 | 长安大学 | Detection method and system, based on laser radar and binocular camera, for pedestrian in front of vehicle |
EP2883769A2 (en) * | 2013-12-12 | 2015-06-17 | Robert Bosch Gmbh | Method and device for the lateral guidance of a motor vehicle, in particular for assisting evasive action |
CN104802796A (en) * | 2014-01-27 | 2015-07-29 | 罗伯特·博世有限公司 | Method for operating a driver assistance system, and driver assistance system |
CN104915628A (en) * | 2014-03-14 | 2015-09-16 | 株式会社理光 | Pedestrian movement prediction method and device by carrying out scene modeling based on vehicle-mounted camera |
Non-Patent Citations (1)
Title |
---|
Ma Chaoqi: "Research on Pedestrian Detection Technology Fusing Radar and Video", Master's thesis, National University of Defense Technology * |
Cited By (74)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11500380B2 (en) | 2017-02-10 | 2022-11-15 | Nissan North America, Inc. | Autonomous vehicle operational management including operating a partially observable Markov decision process model instance |
CN108733042A (en) * | 2017-04-19 | 2018-11-02 | 上海汽车集团股份有限公司 | The method for tracking target and device of automatic driving vehicle |
WO2018205245A1 (en) * | 2017-05-12 | 2018-11-15 | 中国科学院深圳先进技术研究院 | Strategy network model generation method and apparatus for automatic vehicle driving |
CN108974006B (en) * | 2017-06-01 | 2021-08-10 | 本田技研工业株式会社 | Prediction device |
CN108974006A (en) * | 2017-06-01 | 2018-12-11 | 本田技研工业株式会社 | Prediction meanss, vehicle, prediction technique and storage medium |
CN109407660A (en) * | 2017-08-18 | 2019-03-01 | 通用汽车环球科技运作有限责任公司 | It is controlled using strategy triggering and the independent behaviour executed |
CN111095380B (en) * | 2017-09-20 | 2022-03-15 | 本田技研工业株式会社 | Vehicle control device, vehicle control method, and storage medium |
US11276312B2 (en) | 2017-09-20 | 2022-03-15 | Honda Motor Co., Ltd. | Vehicle control apparatus, vehicle control method, and program |
CN109670597A (en) * | 2017-09-20 | 2019-04-23 | 顾泽苍 | A kind of more purpose control methods of the machine learning of automatic Pilot |
CN111095380A (en) * | 2017-09-20 | 2020-05-01 | 本田技研工业株式会社 | Vehicle control device, vehicle control method, and program |
CN109532826A (en) * | 2017-09-21 | 2019-03-29 | 天津所托瑞安汽车科技有限公司 | A kind of radar anticollision method for early warning based on the optimization of lane line Visual identification technology |
CN109703563B (en) * | 2017-10-25 | 2022-03-11 | 本田技研工业株式会社 | Vehicle, travel control device, and travel control method |
CN109703563A (en) * | 2017-10-25 | 2019-05-03 | 本田技研工业株式会社 | Vehicle, travel controlling system and travel control method |
US11702070B2 (en) | 2017-10-31 | 2023-07-18 | Nissan North America, Inc. | Autonomous vehicle operation with explicit occlusion reasoning |
CN111344761A (en) * | 2017-11-15 | 2020-06-26 | 三菱电机株式会社 | Vehicle exterior communication device, vehicle exterior communication method, information processing device, and vehicle exterior communication program |
US11874120B2 (en) | 2017-12-22 | 2024-01-16 | Nissan North America, Inc. | Shared autonomous vehicle operational management |
CN111542295A (en) * | 2017-12-28 | 2020-08-14 | 四川金瑞麒智能科学技术有限公司 | Automatic driving method and system for intelligent wheelchair and computer readable medium |
CN108256487A (en) * | 2018-01-19 | 2018-07-06 | 北京工业大学 | A kind of driving state detection device and method based on reversed binocular |
CN111886638A (en) * | 2018-03-28 | 2020-11-03 | 京瓷株式会社 | Image processing device, imaging device, and moving object |
CN111886638B (en) * | 2018-03-28 | 2023-01-03 | 京瓷株式会社 | Image processing device, imaging device, and moving object |
CN110531753A (en) * | 2018-05-24 | 2019-12-03 | 通用汽车环球科技运作有限责任公司 | Control system, control method and the controller of autonomous vehicle |
CN110531753B (en) * | 2018-05-24 | 2022-10-21 | 通用汽车环球科技运作有限责任公司 | Control system, control method and controller for autonomous vehicle |
WO2019232913A1 (en) * | 2018-06-08 | 2019-12-12 | 北京洪泰同创信息技术有限公司 | Method for controlling transportation means, device, and system |
CN108764373B (en) * | 2018-06-08 | 2021-11-30 | 北京领骏科技有限公司 | Sensor data filtering and fusing method for automatic driving |
CN108764373A (en) * | 2018-06-08 | 2018-11-06 | 北京领骏科技有限公司 | A kind of sensing data filtering and fusion method in automatic Pilot |
CN110738081A (en) * | 2018-07-19 | 2020-01-31 | 杭州海康威视数字技术股份有限公司 | Abnormal road condition detection method and device |
CN109063642A (en) * | 2018-08-01 | 2018-12-21 | 广州大学 | The prediction technique and system that pedestrian based on HMM algorithm goes across the road |
CN109147389A (en) * | 2018-08-16 | 2019-01-04 | 大连民族大学 | The method of Autonomous Vehicle or DAS (Driver Assistant System) programme path |
CN109165591A (en) * | 2018-08-16 | 2019-01-08 | 大连民族大学 | road pedestrian classification method |
CN108961838A (en) * | 2018-08-16 | 2018-12-07 | 大连民族大学 | Road pedestrian categorizing system |
CN108803626B (en) * | 2018-08-16 | 2021-01-26 | 大连民族大学 | System for planning a route for an autonomous vehicle or a driver assistance system |
CN108803626A (en) * | 2018-08-16 | 2018-11-13 | 大连民族大学 | The system of Autonomous Vehicle or DAS (Driver Assistant System) programme path |
CN109034120A (en) * | 2018-08-27 | 2018-12-18 | 合肥工业大学 | Scene understanding method towards smart machine independent behaviour |
CN109034120B (en) * | 2018-08-27 | 2022-05-10 | 合肥工业大学 | Scene understanding method for autonomous behavior of intelligent device |
CN110378178A (en) * | 2018-09-30 | 2019-10-25 | 长城汽车股份有限公司 | Method for tracking target and device |
CN110378178B (en) * | 2018-09-30 | 2022-01-28 | 毫末智行科技有限公司 | Target tracking method and device |
CN111045025A (en) * | 2018-10-15 | 2020-04-21 | 图森有限公司 | Vehicle tracking method and system based on light detection and distance measurement |
CN109544604B (en) * | 2018-11-28 | 2023-12-01 | 深圳拓扑视通科技有限公司 | Target tracking method based on cognitive network |
CN109544604A (en) * | 2018-11-28 | 2019-03-29 | 天津工业大学 | Method for tracking target based on cognition network |
CN109712388B (en) * | 2019-01-24 | 2021-03-30 | 华南理工大学 | Street crossing intention detection system and method for non-motor vehicle or pedestrian |
CN109712388A (en) * | 2019-01-24 | 2019-05-03 | 华南理工大学 | The street crossing of a kind of non-motor vehicle or pedestrian are intended to detection system and method |
CN109859527A (en) * | 2019-01-30 | 2019-06-07 | 杭州鸿泉物联网技术股份有限公司 | A kind of non-motor vehicle turning method for early warning and device |
WO2021008605A1 (en) * | 2019-07-17 | 2021-01-21 | 华为技术有限公司 | Method and device for determining vehicle speed |
CN112242069A (en) * | 2019-07-17 | 2021-01-19 | 华为技术有限公司 | Method and device for determining vehicle speed |
US11273838B2 (en) | 2019-07-17 | 2022-03-15 | Huawei Technologies Co., Ltd. | Method and apparatus for determining vehicle speed |
CN112242069B (en) * | 2019-07-17 | 2021-10-01 | 华为技术有限公司 | Method and device for determining vehicle speed |
CN112572462A (en) * | 2019-09-30 | 2021-03-30 | 北京百度网讯科技有限公司 | Automatic driving control method and device, electronic equipment and storage medium |
US11529971B2 (en) | 2019-09-30 | 2022-12-20 | Apollo Intelligent Driving Technology (Beijing) Co., Ltd. | Method and apparatus for autonomous driving control, electronic device, and storage medium |
US11899454B2 (en) | 2019-11-26 | 2024-02-13 | Nissan North America, Inc. | Objective-based reasoning in autonomous vehicle decision-making |
US11635758B2 (en) | 2019-11-26 | 2023-04-25 | Nissan North America, Inc. | Risk aware executor with action set recommendations |
US11613269B2 (en) | 2019-12-23 | 2023-03-28 | Nissan North America, Inc. | Learning safety and human-centered constraints in autonomous vehicles |
WO2021134357A1 (en) * | 2019-12-30 | 2021-07-08 | 深圳元戎启行科技有限公司 | Perception information processing method and apparatus, computer device and storage medium |
CN113383283A (en) * | 2019-12-30 | 2021-09-10 | 深圳元戎启行科技有限公司 | Perception information processing method and device, computer equipment and storage medium |
WO2021134169A1 (en) * | 2019-12-30 | 2021-07-08 | 华为技术有限公司 | Trajectory prediction method and related device |
US11714971B2 (en) | 2020-01-31 | 2023-08-01 | Nissan North America, Inc. | Explainability of autonomous vehicle decision making |
US11577746B2 (en) | 2020-01-31 | 2023-02-14 | Nissan North America, Inc. | Explainability of autonomous vehicle decision making |
US11782438B2 (en) * | 2020-03-17 | 2023-10-10 | Nissan North America, Inc. | Apparatus and method for post-processing a decision-making model of an autonomous vehicle using multivariate data |
US20210294323A1 (en) * | 2020-03-17 | 2021-09-23 | Nissan North America, Inc. | Apparatus and Method for Post-Processing a Decision-Making Model of an Autonomous Vehicle Using Multivariate Data |
US11904854B2 (en) | 2020-03-30 | 2024-02-20 | Toyota Research Institute, Inc. | Systems and methods for modeling pedestrian activity |
WO2021217752A1 (en) * | 2020-04-27 | 2021-11-04 | 清华大学 | Vehicle-pedestrian collision risk region calculation method and safety evaluation system |
CN111785009A (en) * | 2020-07-20 | 2020-10-16 | 华录易云科技有限公司 | Pedestrian crossing active early warning method and system based on video detection |
CN111907520A (en) * | 2020-07-31 | 2020-11-10 | 东软睿驰汽车技术(沈阳)有限公司 | Pedestrian posture recognition method and device and unmanned automobile |
CN112015178B (en) * | 2020-08-20 | 2022-10-21 | 中国第一汽车股份有限公司 | Control method, device, equipment and storage medium |
CN112015178A (en) * | 2020-08-20 | 2020-12-01 | 中国第一汽车股份有限公司 | Control method, device, equipment and storage medium |
CN112009470A (en) * | 2020-09-08 | 2020-12-01 | 科大讯飞股份有限公司 | Vehicle running control method, device, equipment and storage medium |
CN112009470B (en) * | 2020-09-08 | 2022-01-14 | 科大讯飞股份有限公司 | Vehicle running control method, device, equipment and storage medium |
CN112131756A (en) * | 2020-10-10 | 2020-12-25 | 清华大学 | Pedestrian crossing scene simulation method considering individual shock rate |
CN112131756B (en) * | 2020-10-10 | 2021-04-30 | 清华大学 | Pedestrian crossing scene simulation method considering individual shock rate |
WO2022247733A1 (en) * | 2021-05-25 | 2022-12-01 | 华为技术有限公司 | Control method and apparatus |
CN113561974A (en) * | 2021-08-25 | 2021-10-29 | 清华大学 | Collision risk prediction method based on vehicle behavior interaction and road structure coupling |
CN113561974B (en) * | 2021-08-25 | 2023-11-24 | 清华大学 | Collision risk prediction method based on coupling of vehicle behavior interaction and road structure |
CN114475587A (en) * | 2022-01-30 | 2022-05-13 | 重庆长安汽车股份有限公司 | Risk assessment algorithm introducing target behaviors and collision probability |
CN114475587B (en) * | 2022-01-30 | 2024-04-30 | 重庆长安汽车股份有限公司 | Risk assessment algorithm for introducing target behaviors and collision probability |
CN116186336A (en) * | 2023-03-01 | 2023-05-30 | 丰田自动车株式会社 | Driving data acquisition and calibration method, device and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN106428000B (en) | 2018-12-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106428000B (en) | A kind of vehicle speed control device and method | |
Gao et al. | Multivariate time series prediction of lane changing behavior using deep neural network | |
CN107609522B (en) | Information fusion vehicle detection system based on laser radar and machine vision | |
Olabiyi et al. | Driver action prediction using deep (bidirectional) recurrent neural network | |
Masmoudi et al. | Object detection learning techniques for autonomous vehicle applications | |
Sivaraman et al. | Learning multi-lane trajectories using vehicle-based vision | |
Zhang et al. | A framework for turning behavior classification at intersections using 3D LIDAR | |
CN112614373B (en) | BiLSTM-based weekly vehicle lane change intention prediction method | |
CN114724392B (en) | Dynamic signal control method for expressway exit ramp and adjacent intersection | |
Palazzo et al. | Domain adaptation for outdoor robot traversability estimation from RGB data with safety-preserving loss | |
CN109343051A (en) | A kind of multi-Sensor Information Fusion Approach driven for advanced auxiliary | |
Zhang | Resnet-based model for autonomous vehicles trajectory prediction | |
Zhao et al. | A review for the driving behavior recognition methods based on vehicle multisensor information | |
Wang et al. | Deep understanding of big geospatial data for self-driving: Data, technologies, and systems | |
Mori et al. | Moving objects detection and classification based on trajectories of LRF scan data on a grid map | |
Schmuedderich et al. | System approach for multi-purpose representations of traffic scene elements | |
Zhao et al. | Improving Autonomous Vehicle Visual Perception by Fusing Human Gaze and Machine Vision | |
Xu et al. | Multiview Fusion 3D Target Information Perception Model in Nighttime Unmanned Intelligent Vehicles | |
Zhang et al. | Prediction of Pedestrian Risky Level for Intelligent Vehicles | |
Jawed et al. | Data-driven vehicle trajectory forecasting | |
EP3879461A1 (en) | Device and method for training a neuronal network | |
Englund | Aware and Intelligent Infrastructure for Action Intention Recognition of Cars and Bicycles. | |
Luo | LiDAR based perception system: Pioneer technology for safety driving | |
Kress et al. | Pose based action recognition of vulnerable road users using recurrent neural networks | |
Maurya et al. | Pedestrian detection and vulnerability decision in videos |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant |  ||