CN109878512A - Automatic Pilot control method, device, equipment and computer readable storage medium - Google Patents
- Publication number: CN109878512A
- Application number: CN201910036809.2A
- Authority: CN (China)
- Legal status: Pending (an assumption; Google has not performed a legal analysis)
Abstract
The present invention provides an automatic driving control method, apparatus, device, and computer-readable storage medium. An embodiment of the invention captures an image of the driving scene where the current driving object is located and, through a first neural network, predicts the movement trend of at least one pedestrian in a future time period t2 based on the action information of the at least one pedestrian in the image over a historical time period t1. Driving control is then performed on the current driving object according to the movement trend of the at least one pedestrian in the future time period t2 and the motion-planning information of the current driving object. Because the action information of the at least one pedestrian over the historical time period t1 is taken into account, the movement trend of the at least one pedestrian in the future time period t2 can be predicted accurately, and combining the pedestrian's movement trend in t2 with the motion plan achieves accurate driving control of the current driving object.
Description
[technical field]
The present invention relates to automatic driving technology, and in particular to an automatic driving control method, apparatus, device, and computer-readable storage medium.
[background technique]
With the development of automatic driving technology, artificial intelligence, vision computing, radar, monitoring devices, and global positioning systems can cooperate to let a computer operate a vehicle automatically and safely without any active human operation. Automatic driving technology can avoid the traffic accidents caused in manual driving by operating errors, fatigue driving, and the like, thereby improving traffic safety.
However, in perceiving and recognizing pedestrians, existing automatic driving vehicles can only recognize that an object is a person or a vehicle; they cannot judge a pedestrian's movement intention, and simply control the vehicle's travel by treating the pedestrian as a stationary obstacle. In fact, a pedestrian may be moving: even one who was stationary a second ago may walk forward or backward the next second. Controlling the vehicle's travel by treating pedestrians only as stationary obstacles may cause traffic accidents and thus reduce traffic safety.
[summary of the invention]
Aspects of the present invention provide an automatic driving control method, apparatus, device, and computer-readable storage medium, so as to improve the safety of automatic driving technology.
One aspect of the present invention provides an automatic driving control method, comprising:
capturing an image of the driving scene where the current driving object is located;
predicting, through a first neural network, the movement trend of at least one pedestrian in a future time period t2 based on the action information of the at least one pedestrian in the image over a historical time period t1;
performing driving control on the current driving object according to the movement trend of the at least one pedestrian in the future time period t2 and the motion-planning information of the current driving object.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which the current driving object comprises: a vehicle or a robot.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which the motion-planning information comprises: a driving path, and the driving direction, velocity, and acceleration at each position point on the driving path.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which the action information comprises any one or more of the following: waving, lifting a leg, turning around, and head movement;
the movement trend comprises: static or moving.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which predicting the movement trend of the at least one pedestrian in the future time period t2 based on the action information of the at least one pedestrian in the image over the historical time period t1 comprises:
for at least one frame of historical image of the at least one pedestrian in the historical time period t1, extracting the limb-action features of each pedestrian in the historical image;
representing the limb-action features of each pedestrian in the historical time period t1 with a respective motion vector;
generating a first offset matrix from the motion vectors of the at least one pedestrian, and obtaining, based on the first offset matrix, a second offset matrix representing the movement trend of the at least one pedestrian in the future time period t2;
obtaining, from the second offset matrix, the motion vectors representing the limb-action features of the at least one pedestrian in the future time period t2;
obtaining the limb-action features corresponding to those motion vectors;
obtaining, from the limb-action features of the at least one pedestrian in the future time period t2, the movement trend of the at least one pedestrian in the future time period t2.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which, when the movement trend is static, performing driving control on the current driving object according to the movement trend of the at least one pedestrian in the future time period t2 and the motion-planning information of the current driving object comprises:
determining the driving-control strategy of the current driving object according to the position of the at least one pedestrian, the distance between the at least one pedestrian and the current driving object, and the motion-planning information and current motion state of the current driving object;
performing driving control on the current driving object based on the motion-planning information and driving-control strategy of the current driving object.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which, when the movement trend is moving, performing driving control on the current driving object according to the movement trend of the at least one pedestrian in the future time period t2 and the motion-planning information of the current driving object comprises:
predicting, through a second neural network, the walking-behavior information of the at least one pedestrian in the future time period t2 based on the movement trend of the at least one pedestrian in the future time period t2;
determining the driving-control strategy of the current driving object according to the walking-behavior information of the at least one pedestrian in the future time period t2 and the motion-planning information and current motion state of the current driving object;
performing driving control on the current driving object based on the motion-planning information and driving-control strategy of the current driving object.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which the walking-behavior information comprises: travel direction, walking speed, and walking path.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which the driving-control strategy comprises any one of the following: ignore, decelerate, accelerate, stop, follow, bypass;
performing driving control on the current driving object based on the motion-planning information and driving-control strategy of the current driving object comprises:
adjusting the motion-planning information of the current driving object based on its driving-control strategy, and performing driving control on the current driving object based on the adjusted motion-planning information.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which the method further comprises:
training the first neural network with sample videos containing pedestrians.
Another aspect of the present invention provides an automatic driving control apparatus, comprising:
an image acquisition unit, configured to capture an image of the driving scene where the current driving object is located;
a first prediction unit, configured to predict, through a first neural network, the movement trend of at least one pedestrian in a future time period t2 based on the action information of the at least one pedestrian in the image over a historical time period t1;
a control unit, configured to perform driving control on the current driving object according to the movement trend of the at least one pedestrian in the future time period t2 and the motion-planning information of the current driving object.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which the current driving object comprises: a vehicle or a robot.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which the motion-planning information comprises: a driving path, and the driving direction, velocity, and acceleration at each position point on the driving path.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which the action information comprises any one or more of the following: waving, lifting a leg, turning around, and head movement;
the movement trend comprises: static or moving.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which the first prediction unit is specifically configured to, through the first neural network:
for at least one frame of historical image of the at least one pedestrian in the historical time period t1, extract the limb-action features of each pedestrian in the historical image;
represent the limb-action features of each pedestrian in the historical time period t1 with a respective motion vector;
generate a first offset matrix from the motion vectors of the at least one pedestrian, and obtain, based on the first offset matrix, a second offset matrix representing the movement trend of the at least one pedestrian in the future time period t2;
obtain, from the second offset matrix, the motion vectors representing the limb-action features of the at least one pedestrian in the future time period t2;
obtain the limb-action features corresponding to those motion vectors;
obtain, from the limb-action features of the at least one pedestrian in the future time period t2, the movement trend of the at least one pedestrian in the future time period t2.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which, when the movement trend is static, the control unit is specifically configured to:
determine the driving-control strategy of the current driving object according to the position of the at least one pedestrian, the distance between the at least one pedestrian and the current driving object, and the motion-planning information and current motion state of the current driving object;
perform driving control on the current driving object based on the motion-planning information and driving-control strategy of the current driving object.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which, when the movement trend is moving, the apparatus further comprises:
a second prediction unit, configured to predict, through a second neural network, the walking-behavior information of the at least one pedestrian in the future time period t2 based on the movement trend of the at least one pedestrian in the future time period t2;
and the control unit is specifically configured to: determine the driving-control strategy of the current driving object according to the walking-behavior information of the at least one pedestrian in the future time period t2 and the motion-planning information and current motion state of the current driving object; and perform driving control on the current driving object based on the motion-planning information and driving-control strategy of the current driving object.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which the walking-behavior information comprises: travel direction, walking speed, and walking path.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which the driving-control strategy comprises any one of the following: ignore, decelerate, accelerate, stop, follow, bypass;
and when performing driving control on the current driving object based on the motion-planning information and driving-control strategy of the current driving object, the control unit is specifically configured to adjust the motion-planning information of the current driving object based on its driving-control strategy, and to perform driving control on the current driving object based on the adjusted motion-planning information.
In combination with the above aspect and any possible implementation thereof, an implementation is further provided in which the apparatus further comprises:
a training unit, configured to train the first neural network with sample videos containing pedestrians.
Another aspect of the present invention provides a device, the device comprising:
one or more processors;
a storage apparatus for storing one or more programs,
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the automatic driving control method provided by the above aspect.
Another aspect of the present invention provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, it implements the automatic driving control method provided by the above aspect.
As can be seen from the above technical solutions, an embodiment of the present invention can capture an image of the driving scene where the current driving object is located; predict, through a first neural network, the movement trend of at least one pedestrian in a future time period t2 based on the action information of the at least one pedestrian in the image over a historical time period t1; and perform driving control on the current driving object according to that movement trend and the motion-planning information of the current driving object. Because the action information of the at least one pedestrian over the historical time period t1 is taken into account, the movement trend of the at least one pedestrian in the future time period t2 can be predicted accurately, and combining the pedestrian's movement trend in t2 with the motion plan achieves accurate driving control of the current driving object. This effectively avoids the traffic accidents caused by controlling the vehicle's travel with pedestrians treated as stationary obstacles, thereby improving traffic safety.
In addition, with the technical solution provided by the present invention, the movement trend of at least one pedestrian in a future time period t2 is predicted through a first neural network based on the action information of the at least one pedestrian in the image over a historical time period t1; owing to the deep-learning capability of the neural network, the pedestrian's movement trend in the future time period can be predicted accurately, improving the accuracy of movement-trend prediction.
[Description of the drawings]
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flow diagram of the automatic driving control method provided by an embodiment of the present invention;
Fig. 2 is a structural diagram of the automatic driving control apparatus provided by another embodiment of the present invention;
Fig. 3 is a structural diagram of the automatic driving control apparatus provided by another embodiment of the present invention;
Fig. 4 is a block diagram of an exemplary computer system/server suitable for implementing embodiments of the present invention.
[specific embodiment]
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art from the described embodiments without creative effort fall within the protection scope of the present invention.
It should be noted that the terminals involved in the embodiments of the present invention may include, but are not limited to: mobile phones, personal digital assistants (Personal Digital Assistant, PDA), wireless handheld devices, tablet computers (Tablet Computer), personal computers (Personal Computer, PC), MP3 players, MP4 players, wearable devices (for example, smart glasses, smart watches, smart bracelets), and the like.
In addition, the term "and/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may indicate: A alone, both A and B, or B alone. The character "/" herein generally indicates an "or" relationship between the objects before and after it.
The main idea of the present invention is to predict the movement trend of a pedestrian in a future time period, and to perform driving control on the current driving object according to the pedestrian's movement trend in the future time period and the motion-planning information of the current driving object, thereby achieving accurate driving control of the current driving object and improving traffic safety.
Fig. 1 is a flow diagram of the automatic driving control method provided by an embodiment of the present invention, as shown in Fig. 1.
101. Capture an image of the driving scene where the current driving object is located.
For example, the image of the driving scene where the current driving object is located can be captured, in real time or at short intervals, by at least one sensor deployed on the current driving object, such as a camera or radar.
102. Predict, through a first neural network, the movement trend of at least one pedestrian in a future time period t2 based on the action information of the at least one pedestrian in the image over a historical time period t1.
103. Perform driving control on the current driving object according to the movement trend of the at least one pedestrian in the future time period t2 and the motion-planning information of the current driving object.
It should be noted that the executing subject of some or all of steps 101 to 103 may be an application in a local terminal device, a functional unit such as a plug-in or software development kit (Software Development Kit, SDK) set in such an application, a processing engine in a network-side server, or a distributed system on the network side; this embodiment imposes no particular limitation on this.
It can be understood that the application may be a native program (nativeApp) installed on the terminal, or a web program (webApp) of a browser on the terminal; this embodiment imposes no particular limitation on this.
In this way, the movement trend of at least one pedestrian in a future time period t2 is predicted through a first neural network based on the action information of the at least one pedestrian over a historical time period t1 in the image of the driving scene where the current driving object is located, and driving control is performed on the current driving object according to that movement trend and the motion-planning information of the current driving object. Because the action information of the at least one pedestrian over the historical time period t1 is taken into account, the movement trend of the at least one pedestrian in the future time period t2 can be predicted accurately, and combining the pedestrian's movement trend in t2 with the motion plan achieves accurate driving control of the current driving object, effectively avoiding the traffic accidents caused by controlling the vehicle's travel with pedestrians treated as stationary obstacles and thereby improving traffic safety.
Optionally, in a possible implementation of this embodiment, the current driving object may include, but is not limited to: a vehicle, a robot, or any other object that can use automatic driving technology. The vehicle may, for example, be an automobile, an electric vehicle, a toy car, or any other traveling object; this embodiment imposes no particular limitation on this.
Optionally, in a possible implementation of this embodiment, the motion-planning information may include, but is not limited to: a driving path, and the driving direction, velocity, and acceleration at each position point on the driving path.
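The motion-planning information described above maps naturally onto a small data structure; the field names below are illustrative assumptions, not terms from the patent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Waypoint:
    x: float             # position point on the driving path
    y: float
    heading_deg: float   # driving direction at this position point
    speed_mps: float     # velocity at this position point
    accel_mps2: float    # acceleration at this position point

@dataclass
class MotionPlan:
    waypoints: List[Waypoint]  # the driving path, one entry per position point
```

A driving-control strategy would then be applied by editing the per-waypoint speeds, accelerations, or positions of such a plan.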
Optionally, in a possible implementation of this embodiment, the action information includes, but is not limited to, any one or more of the following: waving, lifting a leg, turning around, head movement, and so on. The movement trend may include: static or moving.
Optionally, in a possible implementation of this embodiment, step 102 may include:
through the first neural network, for at least one frame of historical image of the at least one pedestrian in the historical time period t1, extracting the limb-action features of each pedestrian in the historical image;
representing the limb-action features of each pedestrian in the historical time period t1 with a respective motion vector;
generating a first offset matrix from the motion vectors of the at least one pedestrian, and obtaining, based on the first offset matrix, a second offset matrix representing the movement trend of the at least one pedestrian in the future time period t2;
obtaining, from the second offset matrix, the motion vectors representing the limb-action features of the at least one pedestrian in the future time period t2;
obtaining the limb-action features corresponding to those motion vectors;
obtaining, from the limb-action features of the at least one pedestrian in the future time period t2, the movement trend of the at least one pedestrian in the future time period t2.
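The patent leaves the network internals unspecified, but the matrix bookkeeping in the steps above can be sketched as follows. The identity mapping used here to derive the second offset matrix from the first is a placeholder for whatever the trained first neural network would learn, and the threshold is an assumed value.

```python
import math

def predict_trends(feature_history, motion_threshold=0.5):
    """feature_history: list of frames over period t1; each frame is a list
    of per-pedestrian limb-action feature vectors (lists of floats).
    Returns a movement trend ("static" or "motion") per pedestrian."""
    first_frame, last_frame = feature_history[0], feature_history[-1]
    # One motion vector per pedestrian: change of limb features over t1.
    # Stacked together, these rows form the first offset matrix.
    first_offset = [
        [b - a for a, b in zip(f0, f1)]
        for f0, f1 in zip(first_frame, last_frame)
    ]
    # Placeholder for the learned mapping to the second offset matrix
    # (predicted change over the future period t2); here: identity copy.
    second_offset = [row[:] for row in first_offset]
    # Trend per pedestrian: "motion" if the predicted limb change is large.
    trends = []
    for row in second_offset:
        magnitude = math.sqrt(sum(v * v for v in row))
        trends.append("motion" if magnitude > motion_threshold else "static")
    return trends
```

A pedestrian whose limb features are unchanged over t1 yields a zero row and is classified as static; a pedestrian with large feature displacement is classified as moving.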
Optionally, in a possible implementation of the above embodiments, pedestrian detection may be performed through the first neural network; limb key-point detection is then performed on the detected pedestrians, the limbs are determined based on the detected key points, and limb-action features are extracted. The limb key points may be preset: for example, hand key points, upper-body key points, arm key points, leg key points, waist key points, head key points, and so on.
Optionally, in a possible implementation of this embodiment, when the movement trend is static, step 103 may include:
determining the driving-control strategy of the current driving object according to the position of the at least one pedestrian, the distance between the at least one pedestrian and the current driving object, and the motion-planning information and current motion state of the current driving object;
performing driving control on the current driving object based on the motion-planning information and driving-control strategy of the current driving object.
Optionally, in a possible implementation of this embodiment, when the movement trend is moving, step 103 may include:
predicting, through a second neural network, the walking-behavior information of the at least one pedestrian in the future time period t2 based on the movement trend of the at least one pedestrian in the future time period t2;
determining the driving-control strategy of the current driving object according to the walking-behavior information of the at least one pedestrian in the future time period t2 and the motion-planning information and current motion state of the current driving object;
performing driving control on the current driving object based on the motion-planning information and driving-control strategy of the current driving object.
The walking-behavior information here may include, but is not limited to: travel direction, walking speed, and walking path.
When a pedestrian's movement trend is moving, the pedestrian's walking-behavior information is predicted based on that movement trend; the driving-control strategy of the current driving object is determined according to the pedestrian's walking-behavior information and the motion-planning information and current motion state of the current driving object; and driving control is performed on the current driving object based on its motion-planning information and driving-control strategy. In this way, more refined motion planning and control of the current driving object is achieved, improving traffic safety.
Optionally, in a possible implementation of this embodiment, the driving-control strategy includes, but is not limited to, any one of the following: ignore, decelerate, accelerate, stop, follow, bypass, and so on. Correspondingly, in this embodiment, when driving control is performed on the current driving object based on its motion-planning information and driving-control strategy, the motion-planning information of the current driving object may be adjusted based on the driving-control strategy, and driving control performed on the current driving object based on the adjusted motion-planning information.
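A sketch of that adjustment step for the speed-related strategies; the scaling factors are illustrative assumptions (the patent does not quantify them), and `follow`/`bypass`, which alter the path rather than the speeds, are omitted.

```python
def adjust_speeds(speeds, strategy):
    """Adjust the per-position-point speeds (m/s) of the motion plan
    according to the chosen driving-control strategy."""
    if strategy == "ignore":
        return list(speeds)               # plan left unchanged
    if strategy == "decelerate":
        return [s * 0.5 for s in speeds]  # assumed braking factor
    if strategy == "accelerate":
        return [s * 1.2 for s in speeds]  # assumed boost factor
    if strategy == "stop":
        return [0.0 for _ in speeds]      # come to a halt and wait
    raise ValueError(f"unsupported strategy: {strategy}")
```

The adjusted speeds would then be written back into the motion plan before the controller executes it.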
For example, suppose it is predicted, from the walking-behavior information of a certain pedestrian and the motion-planning information of the current driving object, that the routes of the pedestrian and of the current driving object will intersect at the next moment. The current driving object can then be controlled in advance to decelerate, to stop, or to adjust its driving path to bypass the pedestrian, avoiding a collision and thereby improving traffic safety.
Among the driving-control strategies:
Ignore means: the pedestrian is outside a certain range of the current driving object, and the pedestrian's behavior will not affect the safe driving of the current driving object.
Decelerate means: the pedestrian is within a certain range of the current driving object and the pedestrian's behavior may interfere with the safe driving of the current driving object; the current driving object is controlled to decelerate in advance so as to avoid colliding with the pedestrian.
Accelerate means: the pedestrian is within a certain range of the current driving object and the pedestrian's behavior may lead to a collision with the current driving object at some future moment; the current driving object is controlled to accelerate so that it passes the possible collision site early and avoids colliding with the pedestrian.
Stop means: within a certain range of the current driving object, the current driving object cannot bypass the pedestrian, or bypassing is risky, and the pedestrian's behavior affects the safe automatic driving of the current driving object; the current driving object is controlled to stop and wait.
Follow means: the pedestrian is ahead of the current driving object within a certain range and travels in the same direction as the current driving object, which cannot bypass, or for which bypassing is risky; the current driving object is controlled to follow the pedestrian at a safe movement speed.
Bypass means: the pedestrian is ahead of the current driving object within a certain range and the current driving object can bypass safely; the current driving object is controlled to travel around the pedestrian.
Further optionally, in a possible implementation of the above embodiments, the first neural network may also extract the limb-action features of each pedestrian in the image captured at the current moment and, based on those features, determine whether each pedestrian's current state is static or moving, as that pedestrian's initial state relative to the future time period.
That is: for each pedestrian, there are two possible initial states, static or moving; and each pedestrian's movement trend in the future time period t2 is likewise one of two, static or moving.
For a pedestrian whose initial state is static, there are two possible movement trends from the current moment into the future time period t2: the first is to remain stationary; the second is to switch from static to moving.
In the first case, the pedestrian can be avoided as a static obstacle: the driving control strategy of the current driving object (ignore, decelerate, accelerate, stop, follow, bypass, etc.) is determined according to the pedestrian's position, the distance between the pedestrian and the current driving object, the motion planning information of the current driving object (the driving path, and the driving direction, velocity, and acceleration at each point on that path), and its current motion state (driving direction, velocity, and acceleration); driving control is then performed on the current driving object based on its motion planning information and the driving control strategy.
In the second case, the pedestrian's direction of travel and walking speed in the future time period t2 can be predicted based on the pedestrian's movement trend in t2, and the pedestrian's walking path in t2 can be constructed, together forming the pedestrian's walking behavior information in t2. The driving control strategy of the current driving object (ignore, decelerate, accelerate, stop, follow, bypass, etc.) is determined according to this walking behavior information, the motion planning information of the current driving object (the driving path, and the driving direction, velocity, and acceleration at each point on that path), and its current motion state (driving direction, velocity, and acceleration); driving control is then performed on the current driving object based on its motion planning information and the driving control strategy.
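The walking-path construction step for this second case might look as follows: a minimal sketch assuming a constant predicted heading and walking speed over t2. The sampling interval `dt_s` and the function name are illustrative assumptions.

```python
import math

def build_walking_path(start_xy, heading_rad, speed_mps, t2_s, dt_s=0.5):
    """Extrapolate a pedestrian's predicted walking path over the future
    period t2, assuming constant heading and speed (illustrative only)."""
    x, y = start_xy
    path = [(x, y)]
    for _ in range(int(t2_s / dt_s)):
        x += speed_mps * dt_s * math.cos(heading_rad)
        y += speed_mps * dt_s * math.sin(heading_rad)
        path.append((x, y))
    return path
```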
For a pedestrian whose initial state is moving, there are likewise two possible movement trends from the current moment into the future time period t2: first, continuing to move; second, switching from moving to stationary.
In the first case, the pedestrian's direction of travel and walking speed in the future time period t2 can be predicted based on the pedestrian's movement trend in t2, and the pedestrian's walking path in t2 can be constructed, together forming the pedestrian's walking behavior information in t2. The driving control strategy of the current driving object (ignore, decelerate, accelerate, stop, follow, bypass, etc.) is determined according to this walking behavior information, the motion planning information of the current driving object (the driving path, and the driving direction, velocity, and acceleration at each point on that path), and its current motion state (driving direction, velocity, and acceleration); driving control is then performed on the current driving object based on its motion planning information and the driving control strategy.
In the second case, the pedestrian can be avoided as a static obstacle: the driving control strategy of the current driving object (ignore, decelerate, accelerate, stop, follow, bypass, etc.) is determined according to the pedestrian's position, the distance between the pedestrian and the current driving object, the motion planning information of the current driving object (the driving path, and the driving direction, velocity, and acceleration at each point on that path), and its current motion state (driving direction, velocity, and acceleration); driving control is then performed on the current driving object based on its motion planning information and the driving control strategy.
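The four (initial state, future trend) combinations above reduce to a two-way dispatch on the predicted trend, which can be sketched as follows. The labels and function name are illustrative assumptions.

```python
def avoidance_mode(initial_state: str, future_trend: str) -> str:
    """Whatever the initial state, what matters for planning is the
    predicted trend in t2 (illustrative dispatch over the four cases)."""
    assert initial_state in ("static", "moving")
    assert future_trend in ("static", "moving")
    if future_trend == "static":
        # Cases static->static and moving->static: avoid as a static obstacle.
        return "treat as static obstacle"
    # Cases static->moving and moving->moving: predict the walking path in t2.
    return "predict walking path in t2"
```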
In the present embodiment, security risk is assessed in advance based on the pedestrian's movement trend in the future time period t2 and the motion planning information and current motion state of the current driving object; a driving control strategy is determined, and the original motion planning information is adjusted based on that strategy, so as to achieve safe, stable autonomous driving of the traveling object.
In addition, before the processes of the embodiments described above, the first neural network may be trained using sample videos that include pedestrians, until a training completion condition is met.
A sample video here (referred to as a first sample video) includes multiple sample images acquired during a continuous period T comprising a period T1 and a period T2, with T = T1 + T2. Action information (referred to as reference action information) can be annotated for each pedestrian in every sample image. The sample images of period T1 are input into the first neural network, which outputs movement-trend prediction information for each pedestrian within period T2; the network parameters of the first neural network are then adjusted according to the difference between the movement-trend prediction information output by the first neural network and the corresponding reference action information. This process can be executed iteratively until a training completion condition is met, for example when the difference between the movement-trend prediction information output by the first neural network and the corresponding reference action information is less than a preset threshold, or when the number of training iterations of the first neural network reaches a preset count.
In this way, training of the first neural network is achieved, so that the movement trend of pedestrians in a future time period can be accurately predicted by the first neural network.
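The stopping rule described for this training procedure, iterating until the prediction/annotation difference falls below a preset threshold or the iteration count reaches a preset number, can be sketched generically. Here `step_fn` stands in for one training iteration of the first neural network and returns the current difference; this interface is an assumption, not the invention's.

```python
def train_until_done(step_fn, loss_threshold, max_iters):
    """Run training iterations until the loss (difference between prediction
    and reference annotation) drops below a threshold, or the iteration
    budget is exhausted. Returns (iterations_run, final_loss)."""
    loss = float("inf")
    for i in range(1, max_iters + 1):
        # One iteration: forward pass, compare with reference action
        # information, adjust network parameters.
        loss = step_fn()
        if loss < loss_threshold:
            return i, loss       # completion: difference below preset threshold
    return max_iters, loss       # completion: preset iteration count reached
```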
In addition, before the processes of the embodiments described above, the second neural network may also be trained using sample videos that include pedestrians, until a training completion condition is met. A sample video here (referred to as a second sample video) includes multiple sample images acquired during a continuous period T2. A movement trend can be annotated for each pedestrian in every sample image, and the walking behavior information of each pedestrian (referred to as reference walking behavior information) can be annotated for the second sample video. The second sample video is input into the second neural network, which outputs walking-behavior prediction information for each pedestrian; the network parameters of the second neural network are then adjusted according to the difference between the walking-behavior prediction information output by the second neural network and the corresponding reference walking behavior information. This process can be executed iteratively until a training completion condition is met, for example when the difference between the walking-behavior prediction information output by the second neural network and the corresponding reference walking behavior information is less than a preset threshold, or when the number of training iterations of the second neural network reaches a preset count.
In this way, training of the second neural network is achieved, so that the walking behavior information of pedestrians in a future time period can be accurately predicted by the second neural network.
Optionally, in a possible implementation of the present embodiment, the first neural network and the second neural network can be implemented based on architectures such as R-FCN, SSD, R-CNN, Fast R-CNN, Faster R-CNN, SPP-Net, DPM, OverFeat, or YOLO; the present embodiment places no particular limitation on this.
In the present embodiment, an image of the driving scene in which the current driving object is located can be acquired; through the first neural network, based on the action information of at least one pedestrian in a historical time period t1 in the image, the movement trend of the at least one pedestrian in a future time period t2 is predicted; and driving control is performed on the current driving object according to the movement trend of the at least one pedestrian in the future time period t2 and the motion planning information of the current driving object. Because the action information of the at least one pedestrian in the historical time period t1 is referenced, accurate prediction of the movement trend of the at least one pedestrian in the future time period t2 can be achieved, and by combining the pedestrian's movement trend in the future time period t2, accurate driving control of the current driving object is realized. This effectively avoids traffic accidents caused by controlling the vehicle's travel while treating pedestrians as stationary obstacles, thereby improving traffic safety.
In addition, with the technical solution provided by the present invention, the movement trend of at least one pedestrian in the future time period t2 is predicted through the first neural network based on the action information of the at least one pedestrian in a historical time period t1 in the image. Owing to the deep learning capability of the neural network, accurate prediction of pedestrians' movement trends in the future time period can be achieved, improving the accuracy of movement-trend prediction.
It should be noted that, for ease of description, the method embodiments described above are presented as a series of combined actions; however, those skilled in the art should understand that the present invention is not limited by the described order of actions, since according to the present invention some steps may be performed in other orders or simultaneously. Secondly, those skilled in the art should also understand that the embodiments described in the specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present invention.
In the embodiments above, the description of each embodiment has its own emphasis; for parts not described in detail in one embodiment, reference can be made to the related descriptions of other embodiments.
Fig. 2 is a structural schematic diagram of an automatic driving control device provided by another embodiment of the present invention. As shown in Fig. 2, the automatic driving control device of this embodiment may include an image acquisition unit 21, a first prediction unit 22, and a control unit 23. The image acquisition unit 21 is configured to acquire an image of the driving scene in which the current driving object is located; the first prediction unit 22 is configured to predict, through a first neural network and based on the action information of at least one pedestrian in a historical time period t1 in the image, the movement trend of the at least one pedestrian in a future time period t2; the control unit 23 is configured to perform driving control on the current driving object according to the movement trend of the at least one pedestrian in the future time period t2 and the motion planning information of the current driving object.
It should be noted that part or all of the automatic driving control device of this embodiment may be an application of a terminal device at the local terminal (i.e., the service provider side), or a functional unit such as a plug-in or software development kit (Software Development Kit, SDK) arranged in the application of the local terminal, or a processing engine located in a server on the network side, or a distributed system located on the network side; the present embodiment places no particular limitation on this.
It can be understood that the application may be a native program (native app) installed on the terminal, or may be a web page program (web app) of a browser on the terminal; the present embodiment places no particular limitation on this.
In this way, through the first neural network, based on the action information of at least one pedestrian in a historical time period t1 in the image of the driving scene in which the current driving object is located, the movement trend of the at least one pedestrian in a future time period t2 is predicted, and driving control is performed on the current driving object according to the movement trend of the at least one pedestrian in the future time period t2 and the motion planning information of the current driving object. Because the action information of the at least one pedestrian in the historical time period t1 is referenced, accurate prediction of the movement trend of the at least one pedestrian in the future time period t2 can be achieved, and by combining the pedestrian's movement trend in the future time period t2, accurate driving control of the current driving object is realized, effectively avoiding traffic accidents caused by controlling the vehicle's travel while treating pedestrians as stationary obstacles, thereby improving traffic safety.
Optionally, in a possible implementation of the present embodiment, the current driving object may include, but is not limited to, any object that can use automatic driving technology, such as a vehicle or a robot. The vehicle may, for example, be any traveling object such as an automobile, an electric vehicle, or a toy car; the present embodiment places no particular limitation on this.
Optionally, in a possible implementation of the present embodiment, the motion planning information may include, but is not limited to: the driving path, and the driving direction, velocity, and acceleration at each point on the driving path.
Optionally, in a possible implementation of the present embodiment, the action information includes, but is not limited to, any one or more of the following: waving, lifting a leg, turning around, head movements, etc. The movement trend may include: stationary or moving.
Optionally, in a possible implementation of the present embodiment, the first prediction unit 22 is specifically configured to, through the first neural network: extract, for each of at least one frame of history images of the at least one pedestrian within the historical time period t1, the limb-movement features of each pedestrian in that history image; represent each pedestrian's limb-movement features within the historical time period t1 with a displacement vector; generate a first offset matrix according to the displacement vectors of the at least one pedestrian, and obtain, based on the first offset matrix, a second offset matrix used to represent the movement trend of the at least one pedestrian in the future time period t2; obtain, from the second offset matrix, displacement vectors used to represent the limb-movement features of the at least one pedestrian in the future time period t2; obtain the limb-movement features corresponding to those displacement vectors; and obtain, according to the limb-movement features of the at least one pedestrian in the future time period t2, the movement trend of the at least one pedestrian in the future time period t2.
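One way to read the offset-matrix pipeline is sketched below. Everything concrete here is an assumption rather than the invention's method: the per-pedestrian displacement vectors become the rows of the first offset matrix, the learned mapping to the second offset matrix is stood in for by a plain matrix product, and the trend is read off from the predicted displacement magnitude.

```python
import math

def matmul(a, b):
    """Tiny dense matrix multiply (rows of a times columns of b)."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)] for row in a]

def predict_trends(first_offset, predictor, move_threshold=0.1):
    """Rows of `first_offset` are the pedestrians' t1 displacement vectors.
    An assumed learned linear map `predictor` produces the second offset
    matrix for t2; each pedestrian's trend is read from the predicted
    displacement magnitude (illustrative sketch)."""
    second_offset = matmul(first_offset, predictor)
    trends = []
    for row in second_offset:
        magnitude = math.sqrt(sum(v * v for v in row))
        trends.append("moving" if magnitude > move_threshold else "static")
    return trends
```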
Further optionally, in a possible implementation of the embodiments described above, the first prediction unit 22 can also be used to extract, through the first neural network, the limb-movement features of each pedestrian in the image acquired at the current moment, and determine, based on those features, whether each pedestrian's state at the current moment is stationary or moving, as that pedestrian's initial state relative to the future time period.
Optionally, in a possible implementation of the present embodiment, when the movement trend is stationary, the control unit 23 is specifically configured to: determine the driving control strategy of the current driving object according to the position of the at least one pedestrian, the distance between the at least one pedestrian and the current driving object, and the motion planning information and current motion state of the current driving object; and perform driving control on the current driving object based on the motion planning information of the current driving object and the driving control strategy.
Optionally, in a possible implementation of the present embodiment, as shown in Fig. 3, when the movement trend is moving, the automatic driving control device provided by this embodiment may further include a second prediction unit 31, configured to predict, through a second neural network and based on the movement trend of the at least one pedestrian in the future time period t2, the walking behavior information of the at least one pedestrian in the future time period t2. Correspondingly, in this embodiment, the control unit 23 is specifically configured to: determine the driving control strategy of the current driving object according to the walking behavior information of the at least one pedestrian in the future time period t2 and the motion planning information and current motion state of the current driving object; and perform driving control on the current driving object based on the motion planning information of the current driving object and the driving control strategy.
The walking behavior information may include, but is not limited to: the direction of travel, the walking speed, and the walking path.
When a pedestrian's movement trend is moving, the pedestrian's walking behavior information is predicted based on the movement trend, and the driving control strategy of the current driving object is determined according to the pedestrian's walking behavior information and the motion planning information and current motion state of the current driving object; driving control is then performed on the current driving object based on its motion planning information and the driving control strategy. In this way, more refined motion planning and control of the current driving object is achieved, improving traffic safety.
Optionally, in a possible implementation of the present embodiment, the driving control strategy includes, but is not limited to, any one of the following: ignore, decelerate, accelerate, stop, follow, bypass, etc. Correspondingly, in this embodiment, when performing driving control on the current driving object based on its motion planning information and driving control strategy, the control unit 23 is specifically configured to adjust the motion planning information of the current driving object based on the driving control strategy, and to perform driving control on the current driving object based on the adjusted motion planning information.
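The adjustment of motion planning information by the driving control strategy might be sketched as a rescaling of the planned velocity profile. The waypoint representation and the scale factors below are illustrative assumptions; a real bypass, for instance, would replace the path itself rather than scale speeds.

```python
def adjust_motion_plan(plan, strategy):
    """Adjust a motion plan, assumed here to be a list of
    (position, planned_velocity) waypoints, according to the strategy.
    Scale factors are illustrative, not the invention's values."""
    factors = {
        "ignore": 1.0,      # keep the original plan
        "decelerate": 0.5,  # slow down in advance
        "accelerate": 1.2,  # pass the possible collision site early
        "stop": 0.0,        # stop and wait
        "follow": 0.8,      # follow at a safe movement speed
    }
    f = factors.get(strategy, 1.0)  # "bypass" would instead replace the path
    return [(pos, v * f) for pos, v in plan]
```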
For example, suppose that, according to the walking behavior information of a certain pedestrian and the motion planning information of the current driving object, it is predicted that the pedestrian's route and the current driving object's route will intersect at the next moment; the current driving object can then be controlled in advance to decelerate, stop, or adjust its driving path to bypass the pedestrian, etc., so as to avoid a collision event and thereby improve traffic safety.
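The route-intersection example above can be sketched as a constant-velocity conflict check; the horizon, safety radius, and time step are illustrative assumptions.

```python
import math

def will_conflict(vehicle_xy, vehicle_v, ped_xy, ped_v,
                  horizon_s, safety_m, dt_s=0.1):
    """Step the vehicle's planned path and the pedestrian's predicted path
    forward (constant velocity assumed) and flag a conflict if they come
    within the safety radius within the horizon. Illustrative sketch."""
    t = 0.0
    vx, vy = vehicle_xy
    px, py = ped_xy
    while t <= horizon_s:
        if math.hypot(vx - px, vy - py) < safety_m:
            return True  # trigger decelerate / stop / bypass in advance
        vx += vehicle_v[0] * dt_s
        vy += vehicle_v[1] * dt_s
        px += ped_v[0] * dt_s
        py += ped_v[1] * dt_s
        t += dt_s
    return False
```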
In the driving control strategy, ignoring means: the pedestrian is outside a certain range of the current driving object, and the pedestrian's behavior will not affect the safe driving of the current driving object;
Decelerating means: the pedestrian is within a certain range of the current driving object, and the pedestrian's behavior may interfere with the safe driving of the current driving object; in this case, the current driving object is controlled to decelerate in advance, so as to avoid a collision between the current driving object and the pedestrian;
Accelerating means: the pedestrian is within a certain range of the current driving object, and the pedestrian's behavior may cause the pedestrian to collide with the current driving object at some future moment; in this case, the current driving object is controlled to accelerate so that it passes the possible collision site in advance, avoiding a collision between the current driving object and the pedestrian;
Stopping means: a pedestrian is within a certain range of the current driving object, the current driving object cannot bypass the pedestrian (or bypassing is risky), and the pedestrian's behavior affects the safe driving of the autonomously driven current driving object; the current driving object is controlled to stop and wait.
Following means: a pedestrian is within a certain range ahead of the current driving object, the pedestrian and the current driving object travel in the same direction, and the current driving object cannot bypass the pedestrian (or bypassing is risky); the current driving object is controlled to follow the pedestrian at a safe movement speed;
Bypassing means: a pedestrian is within a certain range ahead of the current driving object, and the current driving object has enough space to pass the pedestrian safely; the current driving object is controlled to travel around the pedestrian.
Optionally, in a possible implementation of the present embodiment, as shown in Fig. 3, the automatic driving control device provided by this embodiment may further include a training unit 32, configured to train the first neural network using sample videos that include pedestrians.
Optionally, in a possible implementation of the present embodiment, the training unit 32 may also be used to train the second neural network using sample videos that include pedestrians.
It should be noted that the method in the embodiment corresponding to Fig. 1 can be implemented by the automatic driving control device provided in this embodiment. For a detailed description, reference may be made to the related content in the embodiment corresponding to Fig. 1, which will not be repeated here.
In the present embodiment, an image of the driving scene in which the current driving object is located can be acquired; through the first neural network, based on the action information of at least one pedestrian in a historical time period t1 in the image, the movement trend of the at least one pedestrian in a future time period t2 is predicted; and driving control is performed on the current driving object according to the movement trend of the at least one pedestrian in the future time period t2 and the motion planning information of the current driving object. Because the action information of the at least one pedestrian in the historical time period t1 is referenced, accurate prediction of the movement trend of the at least one pedestrian in the future time period t2 can be achieved, and by combining the pedestrian's movement trend in the future time period t2, accurate driving control of the current driving object is realized. This effectively avoids traffic accidents caused by controlling the vehicle's travel while treating pedestrians as stationary obstacles, thereby improving traffic safety.
In addition, with the technical solution provided by the present invention, the movement trend of at least one pedestrian in the future time period t2 is predicted through the first neural network based on the action information of the at least one pedestrian in a historical time period t1 in the image. Owing to the deep learning capability of the neural network, accurate prediction of pedestrians' movement trends in the future time period can be achieved, improving the accuracy of movement-trend prediction.
In addition, an embodiment of the present invention also provides a device, including: one or more processors; and a storage apparatus for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the automatic driving control method of any of the embodiments of Fig. 1 above.
In addition, an embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the automatic driving control method of any of the embodiments of Fig. 1 above is implemented.
Fig. 4 shows a block diagram of an exemplary computer system/server suitable for implementing embodiments of the present invention. The computer system/server shown in Fig. 4 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present invention.
As shown in Fig. 4, the computer system/server takes the form of a general-purpose computing device. Its components may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 connecting the different system components (including the system memory 28 and the processing unit 16).
The bus 18 represents one or more of several classes of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, these architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
The computer system/server typically comprises a variety of computer-system-readable media. These media can be any available media that can be accessed by the computer system/server, including volatile and non-volatile media, and removable and non-removable media.
The system memory 28 may include computer-system-readable media in the form of volatile memory, such as random access memory (RAM) 30 and/or cache memory 32. The computer system/server may further include other removable/non-removable, volatile/non-volatile computer system storage media. By way of example only, a storage system 34 can be used to read and write non-removable, non-volatile magnetic media (not shown in Fig. 4, commonly referred to as a "hard disk drive"). Although not shown in Fig. 4, a disk drive for reading and writing a removable non-volatile magnetic disk (such as a "floppy disk") and an optical disc drive for reading and writing a removable non-volatile optical disc (such as a CD-ROM, DVD-ROM, or other optical media) can also be provided. In these cases, each drive can be connected to the bus 18 through one or more data media interfaces. The system memory 28 may include at least one program product having a set of (for example, at least one) program modules configured to perform the functions of the various embodiments of the present invention.
A program/utility 40 having a set of (at least one) program modules 42 may be stored, for example, in the system memory 28. Such program modules 42 include, but are not limited to, an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may include an implementation of a network environment. The program modules 42 generally perform the functions and/or methods of the embodiments described in the present invention.
The computer system/server may also communicate with one or more external devices 14 (such as a keyboard, a pointing device, a display 24, etc.), with one or more devices that enable a user to interact with the computer system/server, and/or with any device (such as a network card, a modem, etc.) that enables the computer system/server to communicate with one or more other computing devices. Such communication can take place through input/output (I/O) interfaces 44. Moreover, the computer system/server can communicate with one or more networks (such as a local area network (LAN), a wide area network (WAN), and/or a public network, for example the Internet) through a network adapter 20. As shown, the network adapter 20 communicates with the other modules of the computer system/server through the bus 18. It should be understood that, although not shown in the figure, other hardware and/or software modules can be used in conjunction with the computer system/server, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, etc.
The processing unit 16, by running the programs stored in the system memory 28, executes various functional applications and data processing, for example implementing the automatic driving control method provided by the embodiment corresponding to Fig. 1.
Another embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the automatic driving control method provided by the embodiment corresponding to Fig. 1 is implemented.
Specifically, any combination of one or more computer-readable media can be used. A computer-readable medium can be a computer-readable signal medium or a computer-readable storage medium. A computer-readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples (a non-exhaustive list) of computer-readable storage media include: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In this document, a computer-readable storage medium can be any tangible medium that contains or stores a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code therein. Such a propagated data signal can take a variety of forms, including, but not limited to, an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium can also be any computer-readable medium other than a computer-readable storage medium that can send, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
The program code contained on a computer-readable medium can be transmitted with any suitable medium, including, but not limited to, wireless, wire, optical cable, RF, etc., or any suitable combination of the above.
Computer program code for carrying out operations of the present invention can be written in one or more programming languages or a combination thereof; the programming languages include object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code can execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or can be connected to an external computer (for example, through the Internet using an Internet service provider).
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the systems, devices, and units described above can refer to the corresponding processes in the foregoing method embodiments, and will not be repeated here.
In the several embodiments provided by the present invention, it should be understood that the disclosed systems, devices, and methods can be implemented in other ways. For example, the device embodiments described above are merely illustrative; the division of the units, for instance, is only a logical functional division, and there may be other division manners in actual implementation: multiple units or components can be combined or integrated into another system, or some features can be ignored or not executed. Furthermore, the mutual couplings, direct couplings, or communication connections shown or discussed can be indirect couplings or communication connections through some interfaces, devices, or units, and can be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separated, and components displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the various embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
The above integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform part of the steps of the methods described in the various embodiments of the present invention. The foregoing storage medium includes various media that can store program code, such as a USB flash disk, a removable hard disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments, or make equivalent replacements of some of the technical features therein; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the various embodiments of the present invention.
Claims (22)
1. An automatic driving control method, comprising:
acquiring an image of a driving scene where a current driving object is located;
predicting, by a first neural network, a movement trend of at least one pedestrian in a future time period t2, based on action information of the at least one pedestrian in the image in a historical time period t1; and
performing driving control on the current driving object according to the movement trend of the at least one pedestrian in the future time period t2 and motion planning information of the current driving object.
2. The method according to claim 1, wherein the current driving object comprises a vehicle or a robot.
3. The method according to claim 1, wherein the motion planning information comprises: a driving path, and a driving direction, velocity, and acceleration at each position point on the driving path.
4. The method according to claim 1, wherein the action information comprises any one or more of the following: waving, lifting a leg, turning around, and head movement; and
the movement trend comprises: being static or moving.
5. The method according to claim 1, wherein predicting the movement trend of the at least one pedestrian in the future time period t2 based on the action information of the at least one pedestrian in the image in the historical time period t1 comprises:
extracting, for at least one frame of historical image of the at least one pedestrian in the historical time period t1, a limb action feature of each pedestrian in the historical image;
representing the limb action feature of each pedestrian in the historical time period t1 with a motion vector, respectively;
generating a first offset matrix according to the motion vectors of the at least one pedestrian, and obtaining, based on the first offset matrix, a second offset matrix representing the movement trend of the at least one pedestrian in the future time period t2;
obtaining, from the second offset matrix, motion vectors representing limb action features of the at least one pedestrian in the future time period t2;
obtaining the limb action features corresponding to the motion vectors that respectively represent the limb action features of the at least one pedestrian in the future time period t2; and
obtaining the movement trend of the at least one pedestrian in the future time period t2 according to the limb action features of the at least one pedestrian in the future time period t2.
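The pipeline of claim 5 can be sketched in code. The array shapes, the frame-difference construction of the first offset matrix, the linear extrapolation used for the second offset matrix, and the magnitude threshold `eps` are all illustrative assumptions made for this sketch; the patent does not fix the form of the offset matrices or of the learned mapping between them.

```python
import numpy as np

def predict_movement_trend(history, t2_steps=1, eps=1e-2):
    """Sketch of the claim-5 prediction pipeline.

    history: array of shape (T, P, D) -- T frames in historical period t1,
    P pedestrians, each with a D-dimensional limb-action motion vector.
    Returns a per-pedestrian movement trend, "static" or "moving".
    """
    # First offset matrix: frame-to-frame changes of the motion vectors
    # over the historical period t1.
    first_offsets = np.diff(history, axis=0)                 # (T-1, P, D)
    # Second offset matrix: here simply the mean historical offset carried
    # forward t2_steps frames (a linear stand-in for the learned mapping).
    second_offsets = first_offsets.mean(axis=0) * t2_steps   # (P, D)
    # Future motion vectors, i.e. predicted limb-action features in t2.
    future_vectors = history[-1] + second_offsets            # (P, D)
    # Movement trend per pedestrian: "moving" if the predicted offset
    # magnitude is non-negligible, otherwise "static".
    displacement = np.linalg.norm(second_offsets, axis=1)
    return ["moving" if d > eps else "static" for d in displacement]
```

With steadily changing motion vectors the sketch reports "moving"; with constant vectors it reports "static", matching the static/moving trend distinction of claim 4.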
6. The method according to claim 1, wherein when the movement trend is static, performing driving control on the current driving object according to the movement trend of the at least one pedestrian in the future time period t2 and the motion planning information of the current driving object comprises:
determining a driving control strategy of the current driving object according to a position of the at least one pedestrian, a distance between the at least one pedestrian and the current driving object, and the motion planning information and a current motion state of the current driving object; and
performing driving control on the current driving object based on the motion planning information and the driving control strategy of the current driving object.
7. The method according to claim 1, wherein when the movement trend is moving, performing driving control on the current driving object according to the movement trend of the at least one pedestrian in the future time period t2 and the motion planning information of the current driving object comprises:
predicting, by a second neural network, walking behavior information of the at least one pedestrian in the future time period t2, based on the movement trend of the at least one pedestrian in the future time period t2;
determining a driving control strategy of the current driving object according to the walking behavior information of the at least one pedestrian in the future time period t2, and the motion planning information and a current motion state of the current driving object; and
performing driving control on the current driving object based on the motion planning information and the driving control strategy of the current driving object.
8. The method according to claim 7, wherein the walking behavior information comprises: a walking direction, a walking speed, and a walking path.
9. The method according to any one of claims 6 to 8, wherein the driving control strategy comprises any one of the following: ignoring, decelerating, accelerating, stopping, following, and bypassing; and
performing driving control on the current driving object based on the motion planning information and the driving control strategy of the current driving object comprises:
adjusting the motion planning information of the current driving object based on the driving control strategy of the current driving object, and performing driving control on the current driving object based on the adjusted motion planning information.
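The strategy-driven adjustment of claim 9 can be sketched as a mapping from a driving control strategy to a modified motion plan. The plan fields (`speed`, `path`, `lead_speed`), the scaling factors, and the `detour_waypoint` are hypothetical placeholders for this sketch, not the patent's specification.

```python
def adjust_motion_planning(plan, strategy):
    """Illustrative claim-9 step: adjust a simple motion plan according
    to one of the six driving control strategies. Returns a new plan;
    the input plan is left unmodified."""
    plan = dict(plan)  # copy so the original planning information is kept
    if strategy == "ignore":
        pass                                   # keep the plan unchanged
    elif strategy == "decelerate":
        plan["speed"] *= 0.5                   # assumed scaling factor
    elif strategy == "accelerate":
        plan["speed"] *= 1.5                   # assumed scaling factor
    elif strategy == "stop":
        plan["speed"] = 0.0
    elif strategy == "follow":
        # Match the followed pedestrian's speed, never exceeding the plan.
        plan["speed"] = min(plan["speed"], plan.get("lead_speed", plan["speed"]))
    elif strategy == "bypass":
        plan["path"] = plan["path"] + ["detour_waypoint"]  # hypothetical waypoint
    else:
        raise ValueError(f"unknown strategy: {strategy}")
    return plan
```

Driving control then proceeds on the adjusted plan, while the original planning information remains available for subsequent decisions.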
10. The method according to claim 1, further comprising:
training the first neural network using sample videos including pedestrians.
11. An automatic driving control apparatus, comprising:
an image acquisition unit configured to acquire an image of a driving scene where a current driving object is located;
a first prediction unit configured to predict, by a first neural network, a movement trend of at least one pedestrian in a future time period t2, based on action information of the at least one pedestrian in the image in a historical time period t1; and
a control unit configured to perform driving control on the current driving object according to the movement trend of the at least one pedestrian in the future time period t2 and motion planning information of the current driving object.
12. The apparatus according to claim 11, wherein the current driving object comprises a vehicle or a robot.
13. The apparatus according to claim 11, wherein the motion planning information comprises: a driving path, and a driving direction, velocity, and acceleration at each position point on the driving path.
14. The apparatus according to claim 11, wherein the action information comprises any one or more of the following: waving, lifting a leg, turning around, and head movement; and
the movement trend comprises: being static or moving.
15. The apparatus according to claim 11, wherein the first prediction unit is specifically configured to, by the first neural network:
extract, for at least one frame of historical image of the at least one pedestrian in the historical time period t1, a limb action feature of each pedestrian in the historical image;
represent the limb action feature of each pedestrian in the historical time period t1 with a motion vector, respectively;
generate a first offset matrix according to the motion vectors of the at least one pedestrian, and obtain, based on the first offset matrix, a second offset matrix representing the movement trend of the at least one pedestrian in the future time period t2;
obtain, from the second offset matrix, motion vectors representing limb action features of the at least one pedestrian in the future time period t2;
obtain the limb action features corresponding to the motion vectors that respectively represent the limb action features of the at least one pedestrian in the future time period t2; and
obtain the movement trend of the at least one pedestrian in the future time period t2 according to the limb action features of the at least one pedestrian in the future time period t2.
16. The apparatus according to claim 11, wherein when the movement trend is static, the control unit is specifically configured to:
determine a driving control strategy of the current driving object according to a position of the at least one pedestrian, a distance between the at least one pedestrian and the current driving object, and the motion planning information and a current motion state of the current driving object; and
perform driving control on the current driving object based on the motion planning information and the driving control strategy of the current driving object.
17. The apparatus according to claim 11, wherein when the movement trend is moving, the apparatus further comprises:
a second prediction unit configured to predict, by a second neural network, walking behavior information of the at least one pedestrian in the future time period t2, based on the movement trend of the at least one pedestrian in the future time period t2;
wherein the control unit is specifically configured to: determine a driving control strategy of the current driving object according to the walking behavior information of the at least one pedestrian in the future time period t2, and the motion planning information and a current motion state of the current driving object; and perform driving control on the current driving object based on the motion planning information and the driving control strategy of the current driving object.
18. The apparatus according to claim 17, wherein the walking behavior information comprises: a walking direction, a walking speed, and a walking path.
19. The apparatus according to any one of claims 16 to 18, wherein the driving control strategy comprises any one of the following: ignoring, decelerating, accelerating, stopping, following, and bypassing; and
when performing driving control on the current driving object based on the motion planning information and the driving control strategy of the current driving object, the control unit is specifically configured to adjust the motion planning information of the current driving object based on the driving control strategy of the current driving object, and perform driving control on the current driving object based on the adjusted motion planning information.
20. The apparatus according to claim 11, further comprising:
a training unit configured to train the first neural network using sample videos including pedestrians.
21. A device, comprising:
one or more processors; and
a storage apparatus for storing one or more programs,
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the automatic driving control method according to any one of claims 1 to 10.
22. A computer-readable storage medium having a computer program stored thereon, wherein, when the program is executed by a processor, the automatic driving control method according to any one of claims 1 to 10 is implemented.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910036809.2A CN109878512A (en) | 2019-01-15 | 2019-01-15 | Automatic Pilot control method, device, equipment and computer readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910036809.2A CN109878512A (en) | 2019-01-15 | 2019-01-15 | Automatic Pilot control method, device, equipment and computer readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109878512A true CN109878512A (en) | 2019-06-14 |
Family
ID=66926033
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910036809.2A Pending CN109878512A (en) | 2019-01-15 | 2019-01-15 | Automatic Pilot control method, device, equipment and computer readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109878512A (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110239529A (en) * | 2019-06-28 | 2019-09-17 | 北京海益同展信息科技有限公司 | Control method for vehicle, device and computer readable storage medium |
CN110316186A (en) * | 2019-07-01 | 2019-10-11 | 百度在线网络技术(北京)有限公司 | Vehicle collision avoidance pre-judging method, device, equipment and readable storage medium storing program for executing |
CN111907520A (en) * | 2020-07-31 | 2020-11-10 | 东软睿驰汽车技术(沈阳)有限公司 | Pedestrian posture recognition method and device and unmanned automobile |
CN112572462A (en) * | 2019-09-30 | 2021-03-30 | 北京百度网讯科技有限公司 | Automatic driving control method and device, electronic equipment and storage medium |
CN112987754A (en) * | 2021-04-14 | 2021-06-18 | 北京三快在线科技有限公司 | Unmanned equipment control method and device, storage medium and electronic equipment |
GB2591515A (en) * | 2020-01-31 | 2021-08-04 | Mclaren Automotive Ltd | Track assistant |
WO2021237768A1 (en) * | 2020-05-29 | 2021-12-02 | 初速度(苏州)科技有限公司 | Data-driven-based system for implementing automatic iteration of prediction model |
CN114261406A (en) * | 2020-09-15 | 2022-04-01 | 丰田自动车株式会社 | Open type vehicle and operation management system thereof |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106504266A (en) * | 2016-09-29 | 2017-03-15 | 北京市商汤科技开发有限公司 | The Forecasting Methodology of walking behavior and device, data processing equipment and electronic equipment |
US9665802B2 (en) * | 2014-11-13 | 2017-05-30 | Nec Corporation | Object-centric fine-grained image classification |
CN106864361A (en) * | 2017-02-14 | 2017-06-20 | 驭势科技(北京)有限公司 | Vehicle and the method for people's car mutual, system, device and storage medium outside car |
CN108196535A (en) * | 2017-12-12 | 2018-06-22 | 清华大学苏州汽车研究院(吴江) | Automated driving system based on enhancing study and Multi-sensor Fusion |
CN108416321A (en) * | 2018-03-23 | 2018-08-17 | 北京市商汤科技开发有限公司 | For predicting that target object moves method, control method for vehicle and the device of direction |
- 2019-01-15 CN CN201910036809.2A patent/CN109878512A/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9665802B2 (en) * | 2014-11-13 | 2017-05-30 | Nec Corporation | Object-centric fine-grained image classification |
CN106504266A (en) * | 2016-09-29 | 2017-03-15 | 北京市商汤科技开发有限公司 | The Forecasting Methodology of walking behavior and device, data processing equipment and electronic equipment |
CN106864361A (en) * | 2017-02-14 | 2017-06-20 | 驭势科技(北京)有限公司 | Vehicle and the method for people's car mutual, system, device and storage medium outside car |
CN108196535A (en) * | 2017-12-12 | 2018-06-22 | 清华大学苏州汽车研究院(吴江) | Automated driving system based on enhancing study and Multi-sensor Fusion |
CN108416321A (en) * | 2018-03-23 | 2018-08-17 | 北京市商汤科技开发有限公司 | For predicting that target object moves method, control method for vehicle and the device of direction |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110239529A (en) * | 2019-06-28 | 2019-09-17 | 北京海益同展信息科技有限公司 | Control method for vehicle, device and computer readable storage medium |
CN110316186A (en) * | 2019-07-01 | 2019-10-11 | 百度在线网络技术(北京)有限公司 | Vehicle collision avoidance pre-judging method, device, equipment and readable storage medium storing program for executing |
US11529971B2 (en) | 2019-09-30 | 2022-12-20 | Apollo Intelligent Driving Technology (Beijing) Co., Ltd. | Method and apparatus for autonomous driving control, electronic device, and storage medium |
CN112572462A (en) * | 2019-09-30 | 2021-03-30 | 北京百度网讯科技有限公司 | Automatic driving control method and device, electronic equipment and storage medium |
GB2591515A (en) * | 2020-01-31 | 2021-08-04 | Mclaren Automotive Ltd | Track assistant |
GB2591515B (en) * | 2020-01-31 | 2023-07-12 | Mclaren Automotive Ltd | Track assistant |
US11745756B2 (en) | 2020-01-31 | 2023-09-05 | Mclaren Automotive Limited | Track assistant |
WO2021237768A1 (en) * | 2020-05-29 | 2021-12-02 | 初速度(苏州)科技有限公司 | Data-driven-based system for implementing automatic iteration of prediction model |
CN111907520B (en) * | 2020-07-31 | 2022-03-15 | 东软睿驰汽车技术(沈阳)有限公司 | Pedestrian posture recognition method and device and unmanned automobile |
CN111907520A (en) * | 2020-07-31 | 2020-11-10 | 东软睿驰汽车技术(沈阳)有限公司 | Pedestrian posture recognition method and device and unmanned automobile |
CN114261406A (en) * | 2020-09-15 | 2022-04-01 | 丰田自动车株式会社 | Open type vehicle and operation management system thereof |
CN114261406B (en) * | 2020-09-15 | 2023-07-25 | 丰田自动车株式会社 | Open type vehicle and operation management system thereof |
US11951984B2 (en) | 2020-09-15 | 2024-04-09 | Toyota Jidosha Kabushiki Kaisha | Open vehicle and operation management system thereof |
CN112987754A (en) * | 2021-04-14 | 2021-06-18 | 北京三快在线科技有限公司 | Unmanned equipment control method and device, storage medium and electronic equipment |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109878512A (en) | Automatic Pilot control method, device, equipment and computer readable storage medium | |
CN111033512B (en) | Motion control device for communicating with autonomous traveling vehicle based on simple two-dimensional planar image pickup device | |
CN111860155B (en) | Lane line detection method and related equipment | |
JP6744679B2 (en) | Human-machine hybrid decision making method and apparatus | |
US11100646B2 (en) | Future semantic segmentation prediction using 3D structure | |
US11776155B2 (en) | Method and apparatus for detecting target object in image | |
CN109817021A (en) | A kind of laser radar trackside blind area traffic participant preventing collision method and device | |
CN110070056A (en) | Image processing method, device, storage medium and equipment | |
US11718306B2 (en) | Method and apparatus for acquiring sample deviation data, and electronic device | |
US20220035733A1 (en) | Method and apparatus for checking automatic driving algorithm, related device and storage medium | |
Sabeti et al. | Toward AI-enabled augmented reality to enhance the safety of highway work zones: Feasibility, requirements, and challenges | |
EP3937077B1 (en) | Lane marking detecting method, apparatus, electronic device, storage medium, and vehicle | |
CN112419722A (en) | Traffic abnormal event detection method, traffic control method, device and medium | |
CN107515607A (en) | Control method and device for unmanned vehicle | |
JP2022023910A (en) | Method for acquiring traffic state and apparatus thereof, roadside device, and cloud control platform | |
Li et al. | DBUS: Human driving behavior understanding system | |
CN113821720A (en) | Behavior prediction method and device and related product | |
CN112258837A (en) | Vehicle early warning method, related device, equipment and storage medium | |
Chen et al. | AI-based vehicular network toward 6G and IoT: Deep learning approaches | |
CN113901341A (en) | Navigation information prompting method, device, medium and program product | |
CN109726447A (en) | Pedestrian's evacuation method, device and storage medium around automatic driving vehicle | |
CN113392793A (en) | Method, device, equipment, storage medium and unmanned vehicle for identifying lane line | |
CN113703704B (en) | Interface display method, head-mounted display device, and computer-readable medium | |
CN109885392A (en) | Distribute the method and device of vehicle computing resource | |
CN116868239A (en) | Static occupancy tracking |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20190614 |