CN109523574B - Walking track prediction method and electronic equipment - Google Patents

Walking track prediction method and electronic equipment

Info

Publication number
CN109523574B
Authority
CN
China
Prior art keywords
historical
distance
sensitivity attribute
distance sensitivity
different
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811609759.4A
Other languages
Chinese (zh)
Other versions
CN109523574A (en)
Inventor
杨大业
宋建华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201811609759.4A
Publication of CN109523574A
Application granted
Publication of CN109523574B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30196 Human being; Person
    • G06T 2207/30232 Surveillance
    • G06T 2207/30241 Trajectory

Abstract

The application provides a walking track prediction method and electronic equipment, wherein the method comprises the following steps: obtaining a motion track of a first object; determining historical positions and historical walking speeds of the first object at different historical moments at least based on the motion trail; obtaining at least one distance sensitivity attribute corresponding to the first object at different historical moments, wherein the distance sensitivity attribute is used for representing a safety distance required to be kept between the first object and at least one second object except the first object; and predicting the walking track of the first object at least according to the historical position, the historical walking speed and the at least one distance sensitivity attribute of the first object at different historical moments.

Description

Walking track prediction method and electronic equipment
Technical Field
The present disclosure relates to the field of trajectory prediction technologies, and in particular, to a walking trajectory prediction method and an electronic device.
Background
With the popularization of video monitoring, digitalized video images provide a basis for predicting the walking trajectory of an object in real time. Predicting an object's walking trajectory makes it possible to give early warning of potential accidents; for example, predicting people's walking trajectories can provide a reliable basis for crowd-flow control in a shopping mall, the investigation of suspects, and the like.
Disclosure of Invention
The purpose of the present disclosure is to provide a walking track prediction method and an electronic device, so as to broaden the range of scenarios in which the walking trajectory of an object can be predicted and to improve prediction accuracy.
In order to achieve the above purpose, the present disclosure provides the following technical solutions:
a walking trajectory prediction method comprises the following steps:
obtaining a motion track of a first object;
determining historical positions and historical walking speeds of the first object at different historical moments at least based on the motion trail;
obtaining at least one distance sensitivity attribute corresponding to the first object at different historical moments, wherein the distance sensitivity attribute is used for representing a safety distance required to be kept between the first object and at least one second object except the first object;
and predicting the walking track of the first object at least according to the historical position, the historical walking speed and the at least one distance sensitivity attribute of the first object at different historical moments.
Preferably, the motion trajectory of the first object is obtained based on a video image, and the obtaining of the at least one distance sensitivity attribute corresponding to the first object at different historical moments includes:
determining historical relative movement information between the first object and at least one second object except the first object at different historical moments;
determining at least one distance sensitivity attribute corresponding to the first object at different historical moments based on historical relative movement information between the first object and at least one second object other than the first object at different historical moments.
Preferably, the obtaining of at least one distance sensitivity attribute corresponding to the first object at different historical moments includes:
acquiring at least one preset distance sensitivity attribute corresponding to the first object at the different historical moments.
Preferably, the predicting of the walking trajectory of the first object according to at least the historical position, the historical walking speed and the at least one distance sensitivity attribute of the first object at different historical moments comprises:
determining, according to a preset correspondence between distance sensitivity attributes and behavior pattern categories, the behavior pattern category to which the at least one distance sensitivity attribute corresponding to the first object belongs at different historical moments;
and predicting the walking trajectory of the first object at least based on the historical position, the historical walking speed and the behavior pattern category to which the first object belongs at different moments.
Preferably, the predicting of the walking trajectory of the first object based on at least the historical position, the historical walking speed and the behavior pattern category to which the first object belongs at different moments includes:
inputting the historical position, the historical walking speed and the behavior pattern category to which the first object belongs at different moments into a pre-trained recurrent neural network model, so as to output the walking trajectory of the first object through the recurrent neural network model.
Preferably, the recurrent neural network model is trained based on the positions, walking speeds and behavior pattern categories of a plurality of sample users at a plurality of different time instants.
Preferably, the at least one distance sensitivity attribute comprises at least one of:
a first distance sensitivity attribute for characterizing a safety distance that is reserved for the first object to avoid actively making contact with at least one second object while the first object is moving;
and
a second distance sensitivity attribute for characterizing a safety distance that the first object reserves to avoid passive contact with at least one second object.
An electronic device, comprising:
a memory for obtaining a motion trajectory of a first object;
and the processor is used for determining the historical position and the historical walking speed of a first object in the video images at different historical moments at least based on the motion trail, obtaining at least one distance sensitivity attribute corresponding to the first object at different historical moments, wherein the distance sensitivity attribute is used for representing a safe distance required to be kept between the first object and at least one second object except the first object, and predicting the walking trail of the first object at least according to the historical position, the historical walking speed and the at least one distance sensitivity attribute of the first object at different historical moments.
Preferably, the electronic device further comprises:
and the acquisition equipment is used for acquiring the motion trail of the first object.
An electronic device, comprising:
a first acquisition unit for acquiring a motion trajectory of a first object;
the first determining unit is used for determining the historical position and the historical walking speed of a first object in the video images at different historical moments at least based on the motion trail;
the second determining unit is used for obtaining at least one distance sensitivity attribute corresponding to the first object at different historical moments, wherein the distance sensitivity attribute is used for representing a safety distance required to be kept between the first object and at least one second object except the first object;
the first prediction unit is used for predicting the walking track of the first object at least according to the historical position, the historical walking speed and the at least one distance sensitivity attribute of the first object at different historical moments.
As can be seen from the above solutions, the embodiments of the present disclosure provide a walking trajectory prediction method. The method obtains a motion trajectory of a first object, determines historical positions and historical walking speeds of the first object at different historical moments based on the motion trajectory, and obtains at least one distance sensitivity attribute corresponding to the first object at different historical moments, where the distance sensitivity attribute is used to represent a safety distance that needs to be maintained between the first object and at least one second object other than the first object. The walking trajectory of the first object is then predicted at least according to the historical positions, the historical walking speeds and the at least one distance sensitivity attribute of the first object at different historical moments. In this way, the historical positions, the historical walking speeds and the distance sensitivity attribute corresponding to the first object are fully utilized in predicting the walking trajectory of the first object, which improves the prediction accuracy of the walking trajectory.
Drawings
In order to illustrate the embodiments of the present disclosure or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings described below show only some embodiments of the present disclosure, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a walking trajectory prediction method according to an embodiment of the present application;
fig. 2 is a schematic flow chart of a walking trajectory prediction method according to a second embodiment of the present application;
fig. 3 is a schematic diagram of relative positions of a first object and a second object according to a second embodiment of the method of the present application;
fig. 4 is a schematic flow chart of a walking trajectory prediction method according to a third embodiment of the present application;
fig. 5 is a schematic flow chart of a walking trajectory prediction method according to a fourth embodiment of the present application;
fig. 6 is a schematic diagram of a first corresponding relationship provided in a fourth embodiment of the method of the present application;
fig. 7 is a schematic flowchart of a walking trajectory prediction method according to a fifth embodiment of the present application;
fig. 8a is a structural diagram of a long short-term memory (LSTM) neural network model according to a fifth embodiment of the present application;
fig. 8b is a functional diagram of a long short-term memory neural network model according to a fifth embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device according to a fifth embodiment of the present application.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims, as well as in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It should be understood that the data so used may be interchanged under appropriate circumstances such that embodiments of the application described herein may be practiced otherwise than as specifically illustrated.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are only a part of the embodiments of the present disclosure, and not all of the embodiments. All other embodiments, which can be derived by one of ordinary skill in the art from the embodiments disclosed herein without inventive faculty, are intended to be within the scope of the disclosure.
An embodiment of the present application provides a method for predicting a walking trajectory, as shown in fig. 1, the method includes the following steps:
step 101: obtaining a motion track of a first object;
the motion trail is preferably acquired based on a video image, the video image comprises the motion trail of the first object, and the behavior of the first object can be monitored and acquired through the acquisition equipment. The video image may or may not include the first object.
The video image contains a first object that can be located in an acquisition area of at least one acquisition device. If at least two capturing devices capture the motion trail of the first object, video images captured by the at least two capturing devices and containing the motion trail of the first object can be acquired.
If the video image does not contain the first object, the capturing device for capturing the motion trail of the first object may be disposed on the first object to capture the video image at the angle of the first object, and the video image may also embody the motion trail of the first object.
It should be noted that the walking track prediction method provided by the present application may be applied to an electronic device provided with a collection device, or may also be applied to an electronic device capable of communicating with a collection device.
In addition, the motion trajectory can also be obtained in other ways, for example through a positioning-capable electronic device carried by the user, such as a smart watch or a mobile phone.
Step 102: determining historical positions and historical walking speeds of the first object at different historical moments at least based on the motion trail;
the historical position and the historical walking speed of the first object at different historical moments can be determined through analyzing the first motion trail.
The historical time is a time that has passed, and specifically may be a plurality of times before the current time, so as to improve the accuracy of the prediction of the walking trajectory of the first object at the next time adjacent to the current time. In this case, the motion trajectory may be located to determine a historical position and a historical walking speed of the first object in the video images at a plurality of previous time instants adjacent to the current time instant.
The current time is a time at which the walking trajectory prediction method is currently executed, and specifically, a time at which the motion trajectory of the first object is obtained may be regarded as the current time.
The historical walking speed can be obtained from the distance moved by the first object between two adjacent historical moments divided by the time interval between those two moments. It should be noted that the historical walking speed here also includes the movement direction of the first object.
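As a minimal sketch of this step (an illustration only, assuming positions have already been extracted from the video image; the function and variable names are hypothetical), the historical walking speed at each moment can be computed as displacement divided by elapsed time, which keeps the direction information:

```python
from typing import List, Tuple

def estimate_velocities(
    positions: List[Tuple[float, float]],   # (x, y) at each historical moment
    timestamps: List[float],                 # corresponding times in seconds
) -> List[Tuple[float, float]]:
    """Velocity between adjacent historical moments: displacement / time interval."""
    velocities = []
    for (x0, y0), (x1, y1), t0, t1 in zip(
        positions, positions[1:], timestamps, timestamps[1:]
    ):
        dt = t1 - t0
        # (vx, vy) keeps the movement direction as well as the magnitude
        velocities.append(((x1 - x0) / dt, (y1 - y0) / dt))
    return velocities
```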
Step 103: obtaining at least one distance sensitivity attribute corresponding to the first object at different historical moments;
the distance sensitivity attribute is used to characterize a safe distance that needs to be maintained between the first object and at least one second object other than the first object.
Specifically, the at least one distance sensitivity attribute includes at least one of:
a first distance sensitivity attribute for characterizing a safety distance that the first object reserves to avoid actively making contact with at least one second object while the first object is moving. In colloquial terms, the first distance sensitivity attribute is a safety distance that the first object reserves to prevent active contact with the second object during movement of the first object.
A second distance sensitivity attribute for characterizing a safety distance that the first object reserves to avoid passive contact with at least one second object. Colloquially, the second distance sensitivity attribute is the safety distance at which the first object changes its current state to avoid a collision once it finds that the second object may come into contact with it; pausing the movement or changing the current walking trajectory are examples of ways in which the first object changes its current state.
It should be noted that, in the present application, the first object may be of various types, and specifically, the first object may be a real person, or may be an intelligent device, such as an unmanned automobile, a logistics robot, a match robot during a soccer match, and the like. The manner in which different types of objects determine the distance sensitivity attribute may vary, as will be described in more detail below.
Step 104: and predicting the walking track of the first object at least according to the historical position, the historical walking speed and the at least one distance sensitivity attribute of the first object at different historical moments.
It should be noted that predicting the walking trajectory of the first object may mean predicting the walking trajectory of the first object at the moment next to the current moment, which ensures timeliness.
Therefore, in this embodiment, the motion trajectory of a first object is acquired, the historical position and the historical walking speed of the first object at different historical moments are determined based on the motion trajectory, and at least one distance sensitivity attribute corresponding to the first object at different historical moments is acquired, the distance sensitivity attribute representing the safety distance that needs to be kept between the first object and at least one second object other than the first object. The walking trajectory of the first object is then predicted at least according to the historical position, the historical walking speed and the at least one distance sensitivity attribute of the first object at different historical moments. In this way, the position, the walking speed and the distance sensitivity attribute corresponding to the first object are fully utilized in predicting the walking trajectory of the first object, which improves the prediction accuracy.
In the present application, there are various ways to determine the distance sensitivity attribute, and specifically, a second embodiment of the method of the present application provides a walking trajectory prediction method, as shown in fig. 2, the method includes the following steps:
step 201: acquiring a motion track of a first object;
step 202: determining historical positions and historical walking speeds of the first object at different historical moments at least based on the motion trail;
step 203: determining historical relative movement information between the first object and at least one second object except the first object at different historical moments;
in this embodiment, the motion trajectory of the first object is obtained based on the video image, and the motion trajectory of the first object is not only included in the video image, but also includes the motion trajectory of at least one second object other than the first object, which is any one of the objects other than the first object in the video image. Then historical relative movement information between the first object and the second object at different historical times can be determined by analyzing the video images. The historical relative movement information is used at least to characterize a movement distance between the first object and the second object.
Step 204: determining at least one distance sensitivity attribute corresponding to the first object at different historical moments based on historical relative movement information between the first object and at least one second object except the first object at different historical moments;
step 203 and step 204 are specific implementations of obtaining at least one distance sensitivity attribute corresponding to the first object at different historical times.
The distance sensitivity attribute is used to characterize a safe distance that needs to be maintained between the first object and at least one second object other than the first object.
Specifically, the at least one distance sensitivity attribute includes at least one of:
a first distance sensitivity attribute for characterizing a safety distance that the first object reserves to avoid actively making contact with at least one second object while the first object is moving.
Colloquially, the first distance sensitivity attribute is a safety distance that the first object reserves to prevent active contact with the second object during movement. For example, if a first object has a second object walking in the same direction as the first object in front of the first object during normal walking, the first object will keep a certain distance from the second object walking in front in order to prevent contact with the second object, and the distance is the first distance sensitivity attribute.
For example, at a plurality of historical times, if the first object is maintained at a distance of 1.2m from the second object as determined by analyzing the video images, the first distance sensitivity attribute value may be determined to be 1.2 m.
A second distance sensitivity attribute for characterizing a safety distance that the first object reserves to avoid passive contact with at least one second object.
Colloquially, the second distance sensitivity attribute is the safety distance at which the first object changes its current state to avoid a collision once it finds that the second object may come into contact with it; pausing the movement or changing the current walking trajectory are examples of ways in which the first object changes its current state. For example, while a first object is walking normally, a second object in front of it moves towards it; when the first object finds that the second object may collide with it, it changes its walking trajectory, and the distance between the first object and the second object at the moment the trajectory is changed is the second distance sensitivity attribute.
For example, as shown in fig. 3, the distance between the first object P1 and the second object P2 is D1 at the first historical time, D2 at the second historical time, and the walking trajectory is changed by the first object P1 at the third historical time, so the distance D3 between the first object P1 and the second object P2 at the third historical time is the second distance sensitivity attribute value.
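The following sketch shows one possible way (an assumption made for illustration, not the patent's stated method) to derive the two attributes from the relative-distance history: the first attribute as the separation maintained while following, and the second attribute as the distance to the second object at the moment the first object changes its walking trajectory, detected here by a heading change above a threshold:

```python
import math
from typing import List, Optional, Tuple

Point = Tuple[float, float]

def _heading(p0: Point, p1: Point) -> float:
    return math.atan2(p1[1] - p0[1], p1[0] - p0[0])

def first_attribute(distances: List[float]) -> float:
    # Separation the first object keeps while moving without evasive action,
    # here simply the minimum maintained distance over the history (e.g. 1.2 m).
    return min(distances)

def second_attribute(
    first_track: List[Point],
    distances: List[float],           # distance to the second object per moment
    turn_threshold_rad: float = 0.5,  # assumed threshold for "trajectory changed"
) -> Optional[float]:
    for i in range(1, len(first_track) - 1):
        d = _heading(first_track[i], first_track[i + 1]) - _heading(first_track[i - 1], first_track[i])
        turn = abs(math.atan2(math.sin(d), math.cos(d)))  # wrap difference to [-pi, pi]
        if turn > turn_threshold_rad:
            return distances[i]       # e.g. D3 at the third historical moment in fig. 3
    return None                       # no trajectory change observed
```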
In this embodiment, the first object may be a person. Of course, it may also be an intelligent device, such as an unmanned automobile, a logistics robot, or a competition robot used, for example, in a soccer match. It should be noted that, unlike a person, an intelligent device has no consciousness of its own, so the first distance sensitivity attribute and/or the second distance sensitivity attribute of the intelligent device may be set in its system; the setting may be preconfigured or adjusted at any time during actual use. While moving, the intelligent device keeps a certain safety distance from other second objects based on the set first distance sensitivity attribute and/or second distance sensitivity attribute, so the first distance sensitivity attribute and/or the second distance sensitivity attribute corresponding to the intelligent device can also be determined by analyzing the video image.
The second object may be an object in motion, including an object in motion of the same type as the first object, and/or an object in motion of a different type than the first object.
If the first object is a person, the second object may also be a person; if the first object is a smart device, the second object may be a smart device or a person.
Step 205: and predicting the walking track of the first object at least according to the historical position, the historical walking speed and the at least one distance sensitivity attribute of the first object at different historical moments.
Therefore, in the embodiment, the travel track of the first object can be predicted by fully utilizing the historical position, the historical travel speed and the distance sensitivity attribute corresponding to the first object, and the prediction accuracy is improved.
A third embodiment of the method of the present application provides a walking trajectory prediction method, and as shown in fig. 4, the method includes the following steps:
step 401: acquiring a motion track of a first object;
step 402: determining historical positions and historical walking speeds of the first object at different historical moments at least based on the motion trail;
step 403: acquiring at least one distance sensitivity attribute corresponding to the first object at set different historical moments;
step 403 is to obtain a specific implementation of at least one distance sensitivity attribute corresponding to the first object at different historical times.
In this embodiment, the first object is a smart device. To prevent the smart device from colliding with other objects while it moves, a distance sensitivity attribute is set for the smart device, so in this step the at least one preset distance sensitivity attribute corresponding to the first object at different historical moments can be obtained.
The second object may be an object in motion, including an object in motion of the same type as the first object, and/or an object in motion of a different type than the first object.
The distance sensitivity attribute is used to characterize a safe distance that needs to be maintained between the first object and at least one second object other than the first object.
Specifically, the at least one distance sensitivity attribute includes at least one of:
a first distance sensitivity attribute for characterizing a safety distance that the first object reserves to avoid actively making contact with at least one second object while the first object is moving. Colloquially, the first distance sensitivity attribute is a safety distance that the first object reserves to prevent active contact with the second object during movement. For example, setting the first distance sensitivity attribute to 0.5m, the smart device will maintain a safe distance of at least 0.5m from other walking objects.
A second distance sensitivity attribute for characterizing a safety distance that the first object reserves to avoid passive contact with at least one second object.
Colloquially, the second distance sensitivity attribute is the safety distance at which the first object changes its current state to avoid being hit once it finds that the second object may come into contact with it; pausing the movement or changing the current walking trajectory are examples of ways in which the first object changes its current state. For example, if the second distance sensitivity attribute is set to 0.3 m, the first object changes its walking trajectory when, during normal walking, it finds that a second object is 0.3 m away from it.
It should be noted that the walking trajectory prediction method provided in this embodiment may be applied to the smart device itself, and may of course also be applied to an electronic device different from the smart device. In the latter case, the electronic device may obtain the distance sensitivity attribute through communication with the smart device (the first object), or the electronic device may store the distance sensitivity attributes of different smart devices in advance and determine the distance sensitivity attribute corresponding to the smart device by obtaining the identifier of the smart device whose trajectory is currently being predicted.
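As a minimal sketch of this lookup (purely illustrative; the data structures, identifiers and values such as 0.5 m and 0.3 m are assumptions based on the examples above), the electronic device can prefer attributes reported by the device and fall back to a pre-stored table keyed by device identifier:

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class DistanceSensitivity:
    first: float   # safety distance kept while moving, e.g. 0.5 m
    second: float  # distance that triggers a trajectory change, e.g. 0.3 m

# Pre-stored attributes keyed by smart-device identifier (assumed data).
PRESET_ATTRIBUTES: Dict[str, DistanceSensitivity] = {
    "logistics-robot-01": DistanceSensitivity(first=0.5, second=0.3),
}

def get_attributes(
    device_id: str,
    reported: Optional[DistanceSensitivity] = None,  # obtained via communication, if any
) -> Optional[DistanceSensitivity]:
    # Prefer what the smart device reports; otherwise look up the preset table.
    return reported or PRESET_ATTRIBUTES.get(device_id)
```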
Step 404: and predicting the walking track of the first object at least according to the historical position, the historical walking speed and the at least one distance sensitivity attribute of the first object at different historical moments.
Therefore, in the embodiment, the travel track of the first object can be predicted by fully utilizing the historical position, the historical travel speed and the distance sensitivity attribute corresponding to the first object, and the prediction accuracy is improved.
The fourth embodiment of the present application provides a walking trajectory prediction method, as shown in fig. 5, the method includes the following steps:
step 501: acquiring a motion track of a first object;
step 502: determining historical positions and historical walking speeds of the first object at different historical moments at least based on the motion trail;
step 503: acquiring at least one distance sensitivity attribute corresponding to the first object at different historical moments;
step 504: determining a behavior pattern type to which at least one distance sensitivity attribute corresponding to the first object belongs at different historical moments according to a preset corresponding relation between the distance sensitivity attribute and the behavior pattern type;
in this embodiment, a corresponding relationship between the distance sensitivity attribute and the behavior pattern category is pre-established, and specifically, if the distance sensitivity attribute includes a first distance sensitivity attribute and a second distance sensitivity attribute, the corresponding relationship may be a first corresponding relationship between the first distance sensitivity attribute and the behavior pattern category and between the second distance sensitivity attribute and the behavior pattern category.
For example, as shown in fig. 6, let d1 denote the value of the first distance sensitivity attribute and d2 the value of the second distance sensitivity attribute, and let D1 and D2 be the corresponding preset thresholds; the following correspondence then holds:
d1 greater than D1 and d2 greater than D2 correspond to behavior pattern category 1;
d1 less than D1 and d2 greater than D2 correspond to behavior pattern category 2;
d1 less than D1 and d2 less than D2 correspond to behavior pattern category 3;
d1 greater than D1 and d2 less than D2 correspond to behavior pattern category 4.
Through the first corresponding relation, the behavior pattern types to which the first distance sensitivity attribute and the second distance sensitivity attribute corresponding to the first object at different historical moments belong can be determined.
The correspondence may be a second correspondence of the first distance sensitivity attribute to the behavior pattern category if the distance sensitivity attribute only includes a first distance sensitivity attribute.
For example, with d1 denoting the value of the first distance sensitivity attribute and D1 its preset threshold, the correspondence is as follows:
d1 less than D1 corresponds to behavior pattern category 5;
d1 greater than D1 corresponds to behavior pattern category 6.
And determining the behavior pattern type to which the first distance sensitivity attribute corresponding to the first object at different historical moments belongs through the second corresponding relation.
The correspondence may be a third correspondence of the second distance sensitivity attribute to the behavior pattern category if the distance sensitivity attribute includes only the second distance sensitivity attribute.
For example, with d2 denoting the value of the second distance sensitivity attribute and D2 its preset threshold, the correspondence is as follows:
d2 less than D2 corresponds to behavior pattern category 7;
d2 greater than D2 corresponds to behavior pattern category 8.
And determining the behavior pattern type to which the second distance sensitivity attribute corresponding to the first object belongs at different historical moments through the third corresponding relation.
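A minimal sketch of the first correspondence follows (an illustration under the notation assumed above, with d1 and d2 as attribute values and D1 and D2 as preset thresholds; the text does not specify how boundary cases such as d1 equal to D1 are handled, so they are treated arbitrarily here):

```python
def behavior_pattern_category(d1: float, d2: float, D1: float, D2: float) -> int:
    """Map the two distance sensitivity attribute values to categories 1-4 (cf. fig. 6)."""
    if d1 > D1 and d2 > D2:
        return 1
    if d1 < D1 and d2 > D2:
        return 2
    if d1 < D1 and d2 < D2:
        return 3
    return 4  # d1 > D1 and d2 < D2 (boundary cases also fall through to here)
```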
Step 505: predicting the walking trajectory of the first object at least based on the historical position, the historical walking speed and the behavior pattern category to which the first object belongs at different moments.
Steps 504 and 505 are a specific implementation of predicting the walking trajectory of the first object based on at least the historical position, the historical walking speed and the behavior pattern category to which the first object belongs at different moments.
The predicting of the walking trajectory of the first object at least based on the historical position, the historical walking speed and the behavior pattern category to which the first object belongs at different moments may include:
inputting the historical position, the historical walking speed and the behavior pattern category to which the first object belongs at different moments into a pre-trained recurrent neural network model, so as to output the walking trajectory of the first object through the recurrent neural network model.
The recurrent neural network model is obtained by training based on the positions, walking speeds and behavior pattern classes of a plurality of sample users at a plurality of different moments.
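As an illustrative sketch of how such training data might be assembled (the feature layout and names are assumptions, not the patent's specification), each historical moment of a sample user can be encoded as position, velocity and a one-hot behavior pattern category, with the position at the following moment as the training target:

```python
from typing import List, Sequence, Tuple
import numpy as np

NUM_CATEGORIES = 4  # e.g. the four categories of the first correspondence

def encode_moment(pos: Sequence[float], vel: Sequence[float], category: int) -> np.ndarray:
    # Categories are assumed to be numbered 1..NUM_CATEGORIES.
    one_hot = np.zeros(NUM_CATEGORIES)
    one_hot[category - 1] = 1.0
    return np.concatenate([np.asarray(pos, dtype=float), np.asarray(vel, dtype=float), one_hot])

def make_training_pair(
    positions: List[Sequence[float]],   # aligned per historical moment
    velocities: List[Sequence[float]],
    categories: List[int],
) -> Tuple[np.ndarray, np.ndarray]:
    # Inputs cover moments 0..T-2; the target is the position at moments 1..T-1.
    x = np.stack([
        encode_moment(p, v, c)
        for p, v, c in zip(positions[:-1], velocities[:-1], categories[:-1])
    ])
    y = np.asarray(positions[1:], dtype=float)
    return x, y
```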
For convenience of understanding, a specific example of a walking trajectory prediction method is provided in the method embodiment five of the present application, and as shown in fig. 7, the method includes the following steps:
step 701: acquiring a video image containing a motion track of a first person;
step 702: extracting historical positions of a first person at different historical moments from the video image, and calculating historical walking speeds at different historical moments and historical relative movement information of a plurality of second persons around the distance;
step 703: determining a first distance sensitivity attribute and a second distance sensitivity attribute of the first person and a plurality of second persons around the first person based on the historical relative movement information;
step 704: classifying the first person by using the first distance sensitivity attribute and the second distance sensitivity attribute to obtain a behavior pattern class to which the first person belongs, wherein the classification mode refers to the attached figure 6;
step 705: and inputting the historical position, the historical walking speed and the attributive behavior pattern type of the first object at different moments into a long and short memory neural network model trained in advance, so as to output the walking track of the first object through the long and short memory neural network model.
The long short-term memory (LSTM) neural network model is a specific type of recurrent neural network model; its structure is shown in fig. 8a and its functional diagram in fig. 8b. As can be seen from fig. 8b, the behavior pattern category is associated, as a hidden-state layer, with each LSTM model; that is, the behavior pattern category carried in the hidden-state layer of one object's LSTM model can serve as an input to the LSTM models of other objects.
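A minimal PyTorch sketch of such a model follows (an illustration under assumptions: the layer sizes, the embedding of the behavior pattern category, and the way neighbouring objects' hidden states are pooled are choices made here for the example, not details given by the patent):

```python
import torch
import torch.nn as nn

class TrajectoryLSTM(nn.Module):
    def __init__(self, num_categories: int = 4, hidden: int = 64):
        super().__init__()
        self.cat_embed = nn.Embedding(num_categories, 8)
        # Input per moment: position and velocity (x, y, vx, vy), plus the
        # object's behavior pattern category, plus a summary (here a mean)
        # of the other objects' hidden states.
        self.lstm = nn.LSTM(4 + 8 + hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)  # predicted next (x, y)

    def forward(self, feats, category, neighbour_hidden):
        # feats: (B, T, 4); category: (B, T) with values in 0..num_categories-1;
        # neighbour_hidden: (B, T, hidden) pooled from other objects' LSTMs.
        cat = self.cat_embed(category)
        inp = torch.cat([feats, cat, neighbour_hidden], dim=-1)
        out, _ = self.lstm(inp)
        return self.head(out)  # (B, T, 2): position predicted for the next moment
```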
Corresponding to the foregoing method for predicting a walking trajectory, an embodiment of the present application further provides an electronic device, as shown in fig. 9, in a first embodiment of the apparatus of the present application, an electronic device includes: a memory 100 and a processor 200;
the memory 100 is used for obtaining a motion track of a first object;
the motion trajectory is preferably obtained based on a video image, the video image includes a motion trajectory of the first object, the behavior of the first object may be monitored and obtained by the capture device, and when the video image is obtained by the electronic device, the motion trajectory is stored in the memory 100.
The video image may or may not include the first object.
The video image contains a first object that can be located in the acquisition area of the at least one acquisition device. If at least two capturing devices capture the motion trail of the first object, video images captured by the at least two capturing devices and containing the motion trail of the first object can be acquired.
If the video image does not contain the first object, the capturing device for capturing the motion trajectory of the first object may be disposed on the first object so as to capture the video image from the viewpoint of the first object; such a video image can also reflect the motion trajectory of the first object.
It should be noted that the electronic device provided by the present application may further include a capturing device for capturing a video image including the motion trajectory of the first object. Or the electronic device provided by the application can also be communicated with the acquisition device to acquire the video image containing the motion trail of the first object and store the video image in the memory.
In addition, the motion trail can be obtained by other means, such as an electronic device with a positioning function carried by the user, such as an electronic watch and a mobile phone.
A processor 200, configured to determine, based on at least the motion trajectory, a historical position and a historical walking speed of the first object at different historical times, determine at least one distance sensitivity attribute corresponding to the first object at different historical times, and predict a walking trajectory of the first object at least according to the historical position, the historical walking speed, and the at least one distance sensitivity attribute of the first object at different historical times.
A historical moment is a moment that has already passed, and may specifically be one of a plurality of moments before the current moment, so as to improve the accuracy of predicting the walking trajectory of the first object at the next moment adjacent to the current moment. In this case, the motion trajectory may be analyzed to determine the historical position and historical walking speed of the first object in the video images at a plurality of previous moments adjacent to the current moment.
The current time is a time at which the walking trajectory prediction method is currently executed, and specifically, a time at which a motion trajectory including the first object is obtained may be regarded as the current time.
The historical walking speed can be obtained from the distance moved by the first object between two adjacent historical moments divided by the time interval between those two moments. It should be noted that the historical walking speed here also includes the movement direction of the first object.
The distance sensitivity attribute is used to characterize a safe distance that needs to be maintained between the first object and at least one second object other than the first object.
Specifically, the at least one distance sensitivity attribute includes at least one of:
a first distance sensitivity attribute for characterizing a safety distance that the first object reserves to avoid actively making contact with at least one second object while the first object is moving. Colloquially, the first distance sensitivity attribute is a safety distance that the first object reserves to prevent active contact with the second object during movement.
A second distance sensitivity attribute for characterizing a safety distance that the first object reserves to avoid passive contact with at least one second object. Colloquially, the second distance sensitivity attribute is the safety distance at which the first object changes its current state to avoid being hit once it finds that the second object may come into contact with it; pausing the movement or changing the current walking trajectory are examples of ways in which the first object changes its current state.
It should be noted that, in the present application, the first object may be of various types, and specifically, the first object may be a real person, or may be an intelligent device, such as an unmanned automobile, a logistics robot, a match robot during a soccer match, and the like. The manner in which different types of objects determine the distance sensitivity attribute may vary, as will be described in more detail below.
It should be noted that predicting the walking trajectory of the first object may mean predicting the walking trajectory of the first object at the moment next to the current moment, which ensures timeliness.
Therefore, in this embodiment, the motion trajectory of a first object is acquired, the historical position and the historical walking speed of the first object at different historical moments are determined based on the motion trajectory, and at least one distance sensitivity attribute corresponding to the first object at different historical moments is determined, the distance sensitivity attribute representing the safety distance that needs to be kept between the first object and at least one second object other than the first object. The walking trajectory of the first object is then predicted at least according to the historical position, the historical walking speed and the at least one distance sensitivity attribute of the first object at different historical moments. In this way, the position, the walking speed and the distance sensitivity attribute corresponding to the first object are fully utilized in predicting the walking trajectory of the first object, which improves the prediction accuracy.
In a second embodiment of the apparatus of the present application, the obtaining, by a processor, at least one distance sensitivity attribute corresponding to the first object at different historical times includes: historical relative movement information between the first object and at least one second object except the first object at different historical moments is determined, and at least one distance sensitivity attribute corresponding to the first object at different historical moments is determined based on the historical relative movement information between the first object and the at least one second object except the first object at different historical moments.
In this embodiment, the motion trajectory of the first object is obtained based on a video image, and the video image contains not only the motion trajectory of the first object but also the motion trajectory of at least one second object other than the first object; the second object is any object other than the first object in the video image. Historical relative movement information between the first object and the second object at different historical moments can then be determined by analyzing the video image. The historical relative movement information is used to characterize the distance between the first object and the second object.
The distance sensitivity attribute is used to characterize a safe distance that needs to be maintained between the first object and at least one second object other than the first object.
Specifically, the at least one distance sensitivity attribute includes at least one of:
a first distance sensitivity attribute for characterizing a safety distance that the first object reserves to avoid actively making contact with at least one second object while the first object is moving.
In colloquial terms, the first distance sensitivity attribute is a safety distance that the first object reserves to prevent active contact with the second object during movement of the first object. For example, if a first object has a second object walking in the same direction as the first object in front of the first object during normal walking, the first object will keep a certain distance from the second object walking in front in order to prevent contact with the second object, and the distance is the first distance sensitivity attribute.
For example, at a plurality of historical times, if the first object is maintained at a distance of 1.2m from the second object as determined by analyzing the video images, the first distance sensitivity attribute value may be determined to be 1.2 m.
A second distance sensitivity attribute for characterizing a safety distance that the first object reserves to avoid passive contact with at least one second object.
Colloquially, the second distance sensitivity attribute is the safety distance at which the first object changes its current state to avoid a collision once it finds that the second object may come into contact with it; pausing the movement or changing the current walking trajectory are examples of ways in which the first object changes its current state. For example, while a first object is walking normally, a second object in front of it moves towards it; when the first object finds that the second object may collide with it, it changes its walking trajectory, and the distance between the first object and the second object at the moment the trajectory is changed is the second distance sensitivity attribute.
In this embodiment, the first object may be a person, and certainly may also be an intelligent device, such as an unmanned automobile, a logistics robot, or a competition robot used, for example, in a soccer match. It should be noted that, unlike a person, an intelligent device has no consciousness of its own, so the first distance sensitivity attribute and/or the second distance sensitivity attribute of the intelligent device may be set in the system; the setting may be preconfigured or adjusted at any time during actual use. While moving, the intelligent device keeps a certain safety distance from other second objects based on the preset first distance sensitivity attribute and/or second distance sensitivity attribute, and the processor can then determine the first distance sensitivity attribute and/or the second distance sensitivity attribute corresponding to the intelligent device by analyzing the video image.
The second object may be an object in motion, including an object in motion of the same type as the first object, and/or an object in motion of a different type than the first object.
In a third embodiment of the apparatus of the present application, the obtaining, by the processor, of at least one distance sensitivity attribute corresponding to the first object at different historical moments includes: acquiring at least one preset distance sensitivity attribute corresponding to the first object at the different historical moments.
In this embodiment, the first object is an intelligent device. To prevent the intelligent device from colliding with other objects while it walks, a distance sensitivity attribute is set for the intelligent device, so that the at least one preset distance sensitivity attribute corresponding to the first object at different historical moments can be obtained.
The second object may be an object in motion, including an object in motion of the same type as the first object, and/or an object in motion of a different type than the first object.
The distance sensitivity attribute is used to characterize a safe distance that needs to be maintained between the first object and at least one second object other than the first object.
Specifically, the at least one distance sensitivity attribute includes at least one of:
a first distance sensitivity attribute for characterizing a safety distance that the first object reserves to avoid actively making contact with at least one second object while the first object is moving. Colloquially, the first distance sensitivity attribute is a safety distance that the first object reserves to prevent active contact with the second object during movement. For example, setting the first distance sensitivity attribute to 0.5m, the smart device will maintain a safe distance of at least 0.5m from other walking objects.
A second distance sensitivity attribute for characterizing a safety distance reserved for the first object to avoid passive contact with at least one second object.
Colloquially, the second distance sensitivity attribute is the safety distance at which the first object changes its current state to avoid a collision once it finds that the second object may come into contact with it; pausing the movement or changing the current walking trajectory are examples of ways in which the first object changes its current state. For example, if the second distance sensitivity attribute is set to 0.3 m, the first object changes its walking trajectory when, during normal walking, it finds that a second object is 0.3 m away from it.
It should be noted that the electronic device provided in this embodiment may itself be the intelligent device, and certainly may also be different from the intelligent device. When it is different from the intelligent device, the electronic device may acquire the distance sensitivity attribute through communication with the intelligent device (the first object), or the distance sensitivity attributes of different intelligent devices may be stored in the electronic device in advance, and the distance sensitivity attribute corresponding to the intelligent device is determined by acquiring the identifier of the intelligent device whose trajectory is currently being predicted.
In a fourth embodiment of the apparatus of the present application, the predicting, by the processor, of the walking trajectory of the first object according to at least the historical position, the historical walking speed and the at least one distance sensitivity attribute of the first object at different historical moments includes: determining, according to a preset correspondence between distance sensitivity attributes and behavior pattern categories, the behavior pattern category to which the at least one distance sensitivity attribute corresponding to the first object belongs at different historical moments, and predicting the walking trajectory of the first object at least based on the historical position, the historical walking speed and the behavior pattern category to which the first object belongs at different moments.
In this embodiment, a corresponding relationship between the distance sensitivity attribute and the behavior pattern category is pre-established, and specifically, if the distance sensitivity attribute includes a first distance sensitivity attribute and a second distance sensitivity attribute, the corresponding relationship may be a first corresponding relationship between the first distance sensitivity attribute and the behavior pattern category and between the second distance sensitivity attribute and the behavior pattern category.
The processor can determine the behavior pattern types to which the first distance sensitivity attribute and the second distance sensitivity attribute corresponding to the first object at different historical moments belong through the first corresponding relation.
The correspondence may be a second correspondence of the first distance sensitivity attribute to the behavior pattern category if the distance sensitivity attribute includes only the first distance sensitivity attribute.
The processor can determine the behavior pattern type to which the first distance sensitivity attribute corresponding to the first object belongs at different historical moments through the second corresponding relation.
The correspondence may be a third correspondence of the second distance sensitivity attribute to the behavior pattern category if the distance sensitivity attribute includes only the second distance sensitivity attribute.
The processor can determine the behavior pattern type to which the second distance sensitivity attribute corresponding to the first object belongs at different historical moments through the third corresponding relation.
Specifically, the predicting, by the processor, of the walking trajectory of the first object based on at least the historical position, the historical walking speed and the behavior pattern category to which the first object belongs at different moments includes: inputting the historical position, the historical walking speed and the behavior pattern category to which the first object belongs at different moments into a pre-trained recurrent neural network model, so as to output the walking trajectory of the first object through the recurrent neural network model.
The recurrent neural network model is obtained by training based on the positions, walking speeds and behavior pattern classes of a plurality of sample users at a plurality of different moments.
The present application further provides an electronic device, as shown in fig. 10, in a fifth embodiment of the apparatus of the present application, an electronic device includes: a first acquisition unit 1001, a first determination unit 1002, a second determination unit 1003, and a first prediction unit 1004; wherein:
a first acquisition unit 1001 for acquiring a motion trajectory of a first object;
the motion trajectory is preferably obtained based on a video image, the video image includes a motion trajectory of the first object, and the behavior of the first object can be monitored and obtained through the acquisition device. The video image may or may not include the first object.
The video image contains a first object that can be located in an acquisition area of at least one acquisition device. If at least two capturing devices capture the motion trail of the first object, video images captured by the at least two capturing devices and containing the motion trail of the first object can be acquired.
If the video image does not contain the first object, the capturing device for capturing the motion trajectory of the first object may be disposed on the first object so as to capture the video image from the viewpoint of the first object; such a video image can also reflect the motion trajectory of the first object.
It should be noted that the electronic device provided by the present application may have a capturing device for capturing a video image including a motion trajectory of the first object. Or the electronic device provided by the application can also communicate with the acquisition device to acquire the video image containing the motion trail of the first object.
In addition, the motion trail can be obtained by other means, such as an electronic device with a positioning function carried by the user, such as an electronic watch and a mobile phone.
A first determining unit 1002, configured to determine historical positions and historical walking speeds of the first object at different historical moments based on at least the motion trajectory;
a historical moment is a moment that has already passed, and specifically may be one of several moments immediately preceding the current moment; using these moments improves the accuracy of predicting the walking trajectory of the first object at the moment following the current one. In this case, the first object can be located in the video images to determine its historical positions and historical walking speeds at the several moments preceding the current moment.
The current moment is the moment at which the walking trajectory prediction is performed; specifically, the moment at which the first acquisition unit obtains the motion trajectory of the first object may be regarded as the current moment.
The historical walking speed can be obtained from the distance the first object moves between two adjacent historical moments and the time elapsed between those moments. It should be noted that the historical walking speed here also carries the movement direction of the first object.
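For illustration, the following minimal sketch derives per-interval velocity vectors (and therefore movement directions) from positions at adjacent historical moments; the coordinate units and timestamps are assumed.

```python
# Per-interval velocity vectors (vx, vy) between adjacent historical moments.
# Positions are assumed to be in metres and timestamps in seconds.

def walking_speeds(positions, timestamps):
    speeds = []
    for (x0, y0), (x1, y1), t0, t1 in zip(positions, positions[1:],
                                          timestamps, timestamps[1:]):
        dt = t1 - t0
        speeds.append(((x1 - x0) / dt, (y1 - y0) / dt))
    return speeds

print(walking_speeds([(0.0, 0.0), (1.0, 0.5), (2.2, 0.9)], [0.0, 1.0, 2.0]))
# roughly [(1.0, 0.5), (1.2, 0.4)]; each vector also encodes the movement direction
```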
A second determining unit 1003, configured to obtain at least one distance sensitivity attribute corresponding to the first object at different historical times.
The distance sensitivity attribute is used to characterize a safe distance that needs to be maintained between the first object and at least one second object other than the first object.
Specifically, the at least one distance sensitivity attribute includes at least one of:
a first distance sensitivity attribute for characterizing a safety distance that is reserved for the first object to avoid actively making contact with at least one second object while the first object is moving. Colloquially, the first distance sensitivity attribute is a safety distance that the first object reserves to prevent active contact with the second object during movement.
A second distance sensitivity attribute for characterizing a safety distance that the first object reserves to avoid passive contact with at least one second object. Colloquially, the second distance sensitivity attribute is the safety distance reserved for the first object to change its current state, for example by pausing its movement or changing its current walking trajectory, to prevent a collision when it finds that a second object may contact it.
It should be noted that, in the present application, the first object may be of various types, and specifically, the first object may be a real person, or may be an intelligent device, such as an unmanned automobile, a logistics robot, a match robot during a soccer match, and the like. The manner in which different types of objects determine the distance sensitivity attribute may vary, as will be described in more detail below.
A first prediction unit 1004, configured to predict a walking trajectory of the first object at least according to the historical position, the historical walking speed, and the at least one distance sensitivity attribute of the first object at different historical times.
It should be noted that predicting the walking trajectory of the first object may mean predicting its walking trajectory at the moment immediately following the current moment, which keeps the prediction timely.
It can be seen that, in this embodiment, the motion trajectory of a first object is acquired, the historical positions and historical walking speeds of the first object at different historical moments are determined based on that trajectory, and at least one distance sensitivity attribute corresponding to the first object at those moments is determined; the walking trajectory of the first object is then predicted from the historical positions, the historical walking speeds and the at least one distance sensitivity attribute. Because the distance sensitivity attribute characterizes the safety distance to be kept between the first object and at least one second object other than the first object, the method and the device make full use of the position, the walking speed and the distance sensitivity attribute of the first object, which improves the prediction accuracy.
In an embodiment of the apparatus of the present application, the second determining unit includes: a first determination module and a second determination module, wherein:
the first determination module is used for determining historical relative movement information between the first object and at least one second object other than the first object at different historical moments;
in this embodiment, the motion trajectory of the first object is obtained from the video image, and the video image contains not only the motion trajectory of the first object but also the motion trajectory of at least one second object, where a second object is any object in the video image other than the first object. Historical relative movement information between the first object and the second object at different historical moments can then be determined by analyzing the video images. The historical relative movement information is used to characterize the movement distance between the first object and the second object.
And the second determining module is used for determining at least one distance sensitivity attribute corresponding to the first object at different historical moments based on historical relative movement information between the first object at different historical moments and at least one second object except the first object.
The distance sensitivity attribute is used to characterize a safe distance that needs to be maintained between the first object and at least one second object other than the first object.
Specifically, the at least one distance sensitivity attribute includes at least one of:
a first distance sensitivity attribute for characterizing a safety distance that the first object reserves to avoid actively making contact with at least one second object while the first object is moving.
Colloquially, the first distance sensitivity attribute is a safety distance that the first object reserves to prevent active contact with the second object during movement. For example, if a first object has a second object walking in the same direction as the first object in front of the first object during normal walking, the first object will keep a certain distance from the second object walking in front in order to prevent contact with the second object, and the distance is the first distance sensitivity attribute.
For example, at a plurality of historical times, if the first object is maintained at a distance of 1.2m from the second object by analyzing the video images, the first distance sensitivity attribute value may be determined to be 1.2 m.
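A minimal sketch of this determination is given below, under the assumption that the per-moment positions of the first object and the second object have already been extracted from the video images; taking the median of the relative distances as the maintained distance is an illustrative choice, not a requirement of the present application.

```python
# Sketch: derive the first distance sensitivity attribute from historical
# relative movement information (per-moment distances between the two objects).
import math
import statistics

def relative_distances(traj_first, traj_second):
    """Per-moment Euclidean distance between two equally long trajectories."""
    return [math.dist(p, q) for p, q in zip(traj_first, traj_second)]

def first_distance_sensitivity(traj_first, traj_second):
    # Median of the maintained distances, used here as a robust estimate.
    return statistics.median(relative_distances(traj_first, traj_second))

follower = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
leader   = [(1.2, 0.0), (2.2, 0.0), (3.2, 0.0)]
print(first_distance_sensitivity(follower, leader))  # ≈ 1.2 m, as in the example above
```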
A second distance sensitivity attribute for characterizing a safety distance that the first object reserves to avoid passive contact with at least one second object.
Colloquially, the second distance sensitivity attribute is the safety distance reserved for the first object to change its current state, for example by pausing its movement or changing its current walking trajectory, to prevent a collision when it finds that a second object may contact it. For example, during normal walking of a first object, a second object in front of it moves towards it; the first object changes its walking trajectory when it finds that the second object may collide with it, and the distance between the first object and the second object at the moment the trajectory is changed is the second distance sensitivity attribute.
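The following hedged sketch estimates the second distance sensitivity attribute as the relative distance at the moment the first object visibly changes course; detecting the course change through a heading-change threshold is an assumed criterion used only for illustration.

```python
# Sketch: the second distance sensitivity attribute is taken as the distance to
# the second object at the moment the first object changes its walking trajectory.
import math

def second_distance_sensitivity(traj_first, traj_second, turn_threshold_deg=30.0):
    headings = []
    for (x0, y0), (x1, y1) in zip(traj_first, traj_first[1:]):
        headings.append(math.degrees(math.atan2(y1 - y0, x1 - x0)))
    for i in range(1, len(headings)):
        if abs(headings[i] - headings[i - 1]) > turn_threshold_deg:
            # The object changed course at moment i; use the distance to the
            # second object at that moment.
            return math.dist(traj_first[i], traj_second[i])
    return None  # no avoidance manoeuvre observed in this history window

walker   = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (2.5, 0.8)]   # swerves at the end
oncoming = [(4.0, 0.0), (3.3, 0.0), (2.3, 0.0), (1.6, 0.0)]
print(second_distance_sensitivity(walker, oncoming))          # ≈ 0.3 m
```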
In this embodiment, the first object may be a person, or it may be an intelligent device such as an unmanned automobile, a logistics robot, or a match robot used during a soccer match. It should be noted that, unlike a person, an intelligent device has no awareness of its own, so its first distance sensitivity attribute and/or second distance sensitivity attribute may be configured in its system; the setting may be preset or adjusted at any time during actual use. While walking, the intelligent device keeps a certain safety distance from other second objects based on the configured first and/or second distance sensitivity attribute, so the first and/or second distance sensitivity attribute corresponding to the intelligent device can likewise be determined by analyzing the video images.
The second object may be an object in motion, including an object in motion of the same type as the first object, and/or an object in motion of a different type than the first object.
In another apparatus embodiment of the present application, the second determining unit includes a first acquisition module for acquiring the preset at least one distance sensitivity attribute corresponding to the first object at different historical moments.
In this embodiment, the first object is an intelligent device. To prevent the intelligent device from colliding with other objects while walking, a distance sensitivity attribute is set for it, and the first acquisition module can acquire the preset at least one distance sensitivity attribute corresponding to the first object at different historical moments.
The second object may be an object in motion, including an object in motion of the same type as the first object, and/or an object in motion of a different type than the first object.
The distance sensitivity attribute is used to characterize a safe distance that needs to be maintained between the first object and at least one second object other than the first object.
Specifically, the at least one distance sensitivity attribute includes at least one of:
a first distance sensitivity attribute for characterizing a safety distance that the first object reserves to avoid actively making contact with at least one second object while the first object is moving. Colloquially, the first distance sensitivity attribute is a safety distance that the first object reserves to prevent active contact with the second object during movement. For example, setting the first distance sensitivity attribute to 0.5m, the smart device will maintain a safe distance of at least 0.5m from other walking objects.
A second distance sensitivity attribute for characterizing a safety distance that the first object reserves to avoid passive contact with at least one second object.
Colloquially, the second distance sensitivity attribute is the safety distance reserved for the first object to change its current state, for example by pausing its movement or changing its current walking trajectory, to prevent a collision when it finds that a second object may contact it. For example, if the second distance sensitivity attribute is set to 0.3 m, the first object changes its walking trajectory when, during normal walking, it finds a second object within 0.3 m of itself.
It should be noted that the electronic device provided in this embodiment may itself be the intelligent device, or it may be a device distinct from the intelligent device. When the electronic device is distinct from the intelligent device, it may acquire the distance sensitivity attribute by communicating with the intelligent device (the first object), or it may pre-store the distance sensitivity attributes of different intelligent devices and determine the attribute for a given device by acquiring the identifier of the intelligent device whose trajectory is currently being predicted.
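A minimal sketch of the pre-stored lookup mentioned above is shown below; the device identifiers and attribute values are purely illustrative.

```python
# Sketch: pre-stored distance sensitivity attributes, resolved by the identifier
# of the intelligent device currently being predicted.
PRESTORED_ATTRIBUTES = {
    "logistics_robot_01": {"first_m": 0.5, "second_m": 0.3},
    "soccer_robot_07":    {"first_m": 0.2, "second_m": 0.1},
}

def attributes_for(device_id):
    return PRESTORED_ATTRIBUTES.get(device_id)

print(attributes_for("logistics_robot_01"))  # {'first_m': 0.5, 'second_m': 0.3}
```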
In an eighth embodiment of the apparatus of the present application, the first prediction unit includes a third determining module and a first prediction module; wherein:
a third determining module, configured to determine, according to a preset correspondence between distance sensitivity attributes and behavior pattern categories, a behavior pattern category to which at least one distance sensitivity attribute corresponding to the first object belongs at different historical times;
in this embodiment, a correspondence between the distance sensitivity attribute and the behavior pattern category is established in advance. Specifically, if the distance sensitivity attribute includes a first distance sensitivity attribute and a second distance sensitivity attribute, the correspondence may be a first correspondence that relates both the first distance sensitivity attribute and the second distance sensitivity attribute to the behavior pattern category.
Through the first correspondence, the behavior pattern categories to which the first distance sensitivity attribute and the second distance sensitivity attribute of the first object at different historical moments belong can be determined.
If the distance sensitivity attribute includes only the first distance sensitivity attribute, the correspondence may be a second correspondence between the first distance sensitivity attribute and the behavior pattern category.
Through the second correspondence, the behavior pattern category to which the first distance sensitivity attribute of the first object at different historical moments belongs can be determined.
If the distance sensitivity attribute includes only the second distance sensitivity attribute, the correspondence may be a third correspondence between the second distance sensitivity attribute and the behavior pattern category.
Through the third correspondence, the behavior pattern category to which the second distance sensitivity attribute of the first object at different historical moments belongs can be determined.
The first prediction module is used for predicting the walking trajectory of the first object based on the historical positions, the historical walking speeds and the behavior pattern categories to which the attributes of the first object at different moments belong.
The first prediction module is specifically configured to input the historical positions, the historical walking speeds and the behavior pattern categories of the first object at different moments into a pre-trained recurrent neural network model, so that the recurrent neural network model outputs the walking trajectory of the first object.
The recurrent neural network model is obtained by training based on the positions, walking speeds and behavior pattern classes of a plurality of sample users at a plurality of different moments.
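The following is a hedged sketch of how such a model could be trained from sample users, where each training example is a short sequence of positions, walking speeds and one-hot behavior pattern categories and the target is the position at the following moment; the synthetic data, the mean-squared-error loss and the Adam optimiser are assumptions rather than requirements of the present application.

```python
# Sketch of training a recurrent model from sample-user sequences.
import torch
import torch.nn as nn

num_categories, seq_len, hidden = 3, 8, 64
lstm = nn.LSTM(input_size=4 + num_categories, hidden_size=hidden, batch_first=True)
head = nn.Linear(hidden, 2)
optimizer = torch.optim.Adam(list(lstm.parameters()) + list(head.parameters()), lr=1e-3)
loss_fn = nn.MSELoss()

# Synthetic stand-in for the positions, walking speeds and behavior pattern
# categories of a plurality of sample users at a plurality of moments.
sequences = torch.randn(32, seq_len, 4 + num_categories)
next_positions = torch.randn(32, 2)

for epoch in range(5):
    optimizer.zero_grad()
    out, _ = lstm(sequences)
    pred = head(out[:, -1])                # predict the position at the next moment
    loss = loss_fn(pred, next_positions)
    loss.backward()
    optimizer.step()
```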
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (7)

1. A walking trajectory prediction method comprises the following steps:
obtaining a motion track of a first object;
determining historical positions and historical walking speeds of the first object at different historical moments at least based on the motion trail;
obtaining at least one distance sensitivity attribute corresponding to the first object at different historical moments, wherein the distance sensitivity attribute is used for representing a safety distance required to be kept between the first object and at least one second object except the first object;
determining the behavior pattern type to which at least one distance sensitivity attribute corresponding to the first object belongs at different historical moments according to the preset corresponding relation between the distance sensitivity attribute and the behavior pattern type;
predicting the walking track of the first object at least based on the historical position, the historical walking speed and the belonged behavior pattern category of the first object at different moments;
the at least one distance sensitivity attribute comprises at least one of:
a first distance sensitivity attribute for characterizing a safety distance reserved for the first object to avoid actively making contact with at least one second object while the first object is moving;
and
a second distance sensitivity attribute for characterizing a safety distance reserved for the first object to avoid passive contact with at least one second object, the second distance sensitivity attribute being the safety distance reserved for the first object to change its current state to prevent a collision when the first object finds that the second object may contact it;
the motion trail of the first object is obtained based on video images, and the obtaining of at least one distance sensitivity attribute corresponding to the first object at different historical moments comprises:
determining historical relative movement information between the first object and at least one second object except the first object at different historical moments;
determining at least one distance sensitivity attribute corresponding to the first object at different historical moments based on historical relative movement information between the first object and at least one second object other than the first object at different historical moments.
2. The method for predicting walking trajectory according to claim 1, wherein said obtaining at least one distance sensitivity attribute corresponding to said first object at different historical moments comprises:
and acquiring at least one distance sensitivity attribute corresponding to the first object at set different historical moments.
3. The walking trajectory prediction method according to claim 1, wherein the predicting the walking trajectory of the first object based on at least the historical position, the historical walking speed and the attributed behavior pattern category of the first object at different time comprises:
and inputting the historical position, the historical walking speed and the attributive behavior pattern category of the first object at different moments into a pre-trained recurrent neural network model so as to output the walking track of the first object through the recurrent neural network model.
4. The walking trajectory prediction method according to claim 3, wherein the recurrent neural network model is trained based on positions, walking speeds and behavior pattern categories of a plurality of sample users at a plurality of different time instants.
5. An electronic device, comprising:
a memory for obtaining a motion trajectory of a first object;
a processor, configured to determine, based on at least the motion trajectory, a historical position and a historical walking speed of a first object in video images at different historical times, obtain at least one distance sensitivity attribute corresponding to the first object at different historical times, where the distance sensitivity attribute is used to represent a safety distance that needs to be maintained between the first object and at least one second object other than the first object, and determine, according to a preset correspondence between the distance sensitivity attribute and a behavior pattern category, a behavior pattern category to which the at least one distance sensitivity attribute corresponding to the first object at different historical times belongs; predicting the walking track of the first object at least based on the historical position, the historical walking speed and the belonged behavior pattern category of the first object at different moments;
the at least one distance sensitivity attribute comprises at least one of:
a first distance sensitivity attribute for characterizing a safety distance that is reserved for the first object to avoid actively making contact with at least one second object while the first object is moving;
and
a second distance sensitivity attribute for characterizing a safety distance reserved for the first object to avoid passive contact with at least one second object, the second distance sensitivity attribute being the safety distance reserved for the first object to change its current state to prevent a collision when the first object finds that the second object may contact it;
the motion trail of the first object is obtained based on video images, and the obtaining of at least one distance sensitivity attribute corresponding to the first object at different historical moments comprises:
determining historical relative movement information between the first object and at least one second object except the first object at different historical moments;
determining at least one distance sensitivity attribute corresponding to the first object at different historical moments based on historical relative movement information between the first object and at least one second object other than the first object at different historical moments.
6. The electronic device of claim 5, further comprising:
and the acquisition equipment is used for acquiring the motion trail of the first object.
7. An electronic device, comprising:
a first acquisition unit for acquiring a motion trajectory of a first object;
the first determining unit is used for determining the historical position and the historical walking speed of the first object in the video images at different historical moments at least based on the motion trail;
the second determining unit is used for obtaining at least one distance sensitivity attribute corresponding to the first object at different historical moments, wherein the distance sensitivity attribute is used for representing a safety distance required to be kept between the first object and at least one second object except the first object;
the first prediction unit is used for determining the behavior pattern type to which at least one distance sensitivity attribute corresponding to the first object belongs at different historical moments according to the preset corresponding relation between the distance sensitivity attribute and the behavior pattern type; predicting the walking track of the first object at least based on the historical position, the historical walking speed and the attributive behavior pattern category of the first object at different moments; the at least one distance sensitivity attribute comprises at least one of:
a first distance sensitivity attribute for characterizing a safety distance that is reserved for the first object to avoid actively making contact with at least one second object while the first object is moving;
and
a second distance sensitivity attribute for characterizing a safety distance reserved for the first object to avoid passive contact with at least one second object, the second distance sensitivity attribute being the safety distance reserved for the first object to change its current state to prevent a collision when the first object finds that the second object may contact it;
the motion trajectory of the first object is obtained based on a video image, and the second determining unit is specifically configured to:
determining historical relative movement information between the first object and at least one second object except the first object at different historical moments;
determining at least one distance sensitivity attribute corresponding to the first object at different historical moments based on historical relative movement information between the first object and at least one second object other than the first object at different historical moments.
CN201811609759.4A 2018-12-27 2018-12-27 Walking track prediction method and electronic equipment Active CN109523574B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811609759.4A CN109523574B (en) 2018-12-27 2018-12-27 Walking track prediction method and electronic equipment

Publications (2)

Publication Number Publication Date
CN109523574A CN109523574A (en) 2019-03-26
CN109523574B true CN109523574B (en) 2022-06-24

Family

ID=65797395

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811609759.4A Active CN109523574B (en) 2018-12-27 2018-12-27 Walking track prediction method and electronic equipment

Country Status (1)

Country Link
CN (1) CN109523574B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110827326B (en) * 2019-11-14 2023-05-02 清华大学苏州汽车研究院(吴江) Method, device, equipment and storage medium for generating simulation man-vehicle conflict scene model
EP4074563A4 (en) * 2019-12-30 2022-12-28 Huawei Technologies Co., Ltd. Trajectory prediction method and related device
CN113689660B (en) * 2020-05-19 2023-08-29 三六零科技集团有限公司 Safety early warning method of wearable device and wearable device
CN112183221A (en) * 2020-09-04 2021-01-05 北京科技大学 Semantic-based dynamic object self-adaptive trajectory prediction method
CN113504527B (en) * 2021-09-13 2021-12-14 北京海兰信数据科技股份有限公司 Radar target prediction processing method and system
CN113951767A (en) * 2021-11-08 2022-01-21 珠海格力电器股份有限公司 Control method and device for movable equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104827963A (en) * 2015-04-01 2015-08-12 北京京东尚科信息技术有限公司 Method of collision avoidance and intelligent early warning for vehicle, control system and control device
CN106023647A (en) * 2016-05-05 2016-10-12 苏州京坤达汽车电子科技有限公司 Driving habit and state self-adaptive vehicle safety distance early-warning control device
CN106023244A (en) * 2016-04-13 2016-10-12 南京邮电大学 Pedestrian tracking method based on least square locus prediction and intelligent obstacle avoidance model
CN106926844A (en) * 2017-03-27 2017-07-07 西南交通大学 A kind of dynamic auto driving lane-change method for planning track based on real time environment information
CN108255182A (en) * 2018-01-30 2018-07-06 上海交通大学 A kind of service robot pedestrian based on deeply study perceives barrier-avoiding method
US10156850B1 (en) * 2017-12-08 2018-12-18 Uber Technologies, Inc. Object motion prediction and vehicle control systems and methods for autonomous vehicles

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104882025B (en) * 2015-05-13 2017-02-22 东华大学 Crashing detecting and warning method based on vehicle network technology
CN108803617B (en) * 2018-07-10 2020-03-20 深圳大学 Trajectory prediction method and apparatus
CN108958263A (en) * 2018-08-03 2018-12-07 江苏木盟智能科技有限公司 A kind of Obstacle Avoidance and robot

Also Published As

Publication number Publication date
CN109523574A (en) 2019-03-26

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant