CN112734938A - Pedestrian position prediction method, device, computer equipment and storage medium - Google Patents

Pedestrian position prediction method, device, computer equipment and storage medium

Info

Publication number
CN112734938A
CN112734938A (application CN202110037224.XA)
Authority
CN
China
Prior art keywords
time
time interval
pedestrian
target
moment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110037224.XA
Other languages
Chinese (zh)
Inventor
杨旭
孙鑫
Current Assignee
Beijing Aibee Technology Co Ltd
Original Assignee
Beijing Aibee Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Aibee Technology Co Ltd
Priority to CN202110037224.XA
Publication of CN112734938A
Legal status: Pending

Classifications

    • G06T19/003: Navigation within 3D models or images
    • G01C17/02: Magnetic compasses
    • G01C21/08: Navigation by terrestrial means involving use of the magnetic field of the earth
    • G01C21/18: Stabilised platforms, e.g. by gyroscope
    • G01P3/64: Devices characterised by the determination of the time taken to traverse a fixed distance
    • G06F16/5866: Retrieval characterised by using metadata using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
    • G06F16/587: Retrieval characterised by using metadata using geographical or spatial information, e.g. location
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06T19/006: Mixed reality
    • G06V40/103: Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • G06V40/20: Movements or behaviour, e.g. gesture recognition
    • G06F2203/011: Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
    • G06T2200/04: Indexing scheme for image data processing or generation, in general, involving 3D image data

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Library & Information Science (AREA)
  • Automation & Control Theory (AREA)
  • Computer Hardware Design (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Geology (AREA)
  • General Life Sciences & Earth Sciences (AREA)
  • Environmental & Geological Engineering (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application relates to a pedestrian position prediction method and apparatus, a computer device, and a storage medium. The method comprises the following steps: acquiring the estimated walking step length within a time interval, the pedestrian direction angle data at a target moment within the time interval, and the pedestrian position information at the starting moment of the time interval, wherein the time interval is the interval between the first stepping moment and the second stepping moment of two adjacent steps, and the starting moment of the time interval is the first stepping moment; obtaining the estimated walking speed within the time interval according to the estimated walking step length within the time interval; and obtaining the pedestrian position information at the target moment within the time interval according to the estimated walking speed, the pedestrian direction angle data at the target moment, the pedestrian position information at the starting moment of the time interval, and the time difference between the target moment and the starting moment. By adopting the method, the output frequency of pedestrian position information can be increased.

Description

Pedestrian position prediction method, device, computer equipment and storage medium
Technical Field
The present application relates to the field of positioning technologies, and in particular, to a method and an apparatus for predicting a pedestrian position, a computer device, and a storage medium.
Background
AR (Augmented Reality) and VR (Virtual Reality) are computer simulation technologies capable of creating a virtual world and allowing users to experience it. With the development of computer simulation technologies such as AR/VR, the requirements on the output of position information for personnel positioning are increasingly high.
However, conventional PDR (Pedestrian Dead Reckoning) technology locates the pedestrian position according to the pedestrian's stepping frequency; that is, position information is output only once per pedestrian step. The output frequency is therefore low and cannot match the image update frequency of AR/VR, which seriously affects the fluency of position display in AR/VR image frames.
Disclosure of Invention
In view of the above, it is necessary to provide a pedestrian position prediction method, apparatus, computer device and storage medium for solving the above technical problems.
A pedestrian position prediction method, the method comprising:
acquiring the predicted walking step length in a time interval, pedestrian direction angle data at a target moment in the time interval and pedestrian position information at the starting moment of the time interval; the time interval is the interval from the first step time to the second step time in the two adjacent step processes, and the starting time of the time interval is the first step time;
obtaining the estimated walking speed in the time interval according to the estimated walking step length in the time interval;
and obtaining the pedestrian position information at the target moment in the time interval according to the estimated walking speed, the pedestrian direction angle data at the target moment, the pedestrian position information at the starting moment of the time interval and the time difference between the target moment and the starting moment.
In one embodiment, the target moment is the timestamp carried by an image frame when that image frame is correspondingly updated in augmented reality navigation and/or virtual reality navigation.
In one embodiment, the obtaining of the predicted walking step length in the time interval, the pedestrian direction angle data at the target time in the time interval, and the pedestrian position information at the starting time of the time interval includes:
calculating to obtain the estimated walking step length in the time interval according to the pedestrian acceleration data acquired by the accelerometer and a preset step length estimation algorithm;
acquiring pedestrian direction angle data at a target moment in a time interval through a gyroscope and/or a magnetic compass;
and obtaining the pedestrian position information of the initial moment according to the estimated walking step length in the time interval and the pedestrian direction angle data at the initial moment of the time interval.
In one embodiment, the obtaining the estimated walking speed in the time interval according to the estimated walking step in the time interval includes:
calculating the interval duration between the second stepping time and the first stepping time in the time interval;
and performing division calculation according to the estimated walking step length and the interval duration in the time interval to obtain the estimated walking speed in the time interval.
In one embodiment, the obtaining pedestrian position information at the target time within the time interval according to the estimated walking speed, the pedestrian direction angle data at the target time, the pedestrian position information at the starting time of the time interval, and the time difference between the target time and the starting time includes:
checking the estimated walking speed according to a preset speed threshold;
if the estimated walking speed is less than or equal to a preset speed threshold, calculating to obtain a movement distance corresponding to the target time according to the interval duration of the target time and the first stepping time in the time interval and the estimated walking speed;
and performing fusion calculation according to the movement distance corresponding to the target moment, the pedestrian direction angle data at the target moment and the pedestrian position information at the first stepping moment to obtain the pedestrian position information at the target moment.
In one embodiment, the method comprises:
if the estimated walking speed is greater than a preset speed threshold, calculating to obtain a movement distance corresponding to the target moment according to the interval duration of the target moment and the first stepping moment in the time interval and the preset speed threshold;
and performing fusion calculation according to the movement distance corresponding to the target moment, the pedestrian direction angle data at the target moment and the pedestrian position information at the first stepping moment to obtain the pedestrian position information at the target moment.
In one embodiment, the obtaining of the pedestrian position information at the target time by performing fusion calculation according to the movement distance corresponding to the target time, the pedestrian direction angle data at the target time, and the pedestrian position information at the first step time includes:
multiplying the moving distance corresponding to the target moment by the cosine of the pedestrian direction angle at the target moment to obtain a first multiplication result, and adding the first multiplication result and the position abscissa in the pedestrian position information at the first stepping moment to obtain a predicted position abscissa of the pedestrian position at the target moment;
and performing multiplication operation according to the movement distance corresponding to the target time and the sine value of the pedestrian direction angle at the target time to obtain a second multiplication operation result, and performing addition operation on the second multiplication operation result and the position ordinate in the pedestrian position information at the first stepping time to obtain the predicted position ordinate of the pedestrian position at the target time.
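The two operations of this embodiment can be sketched as follows (a minimal illustration; the function and parameter names are not from the patent, and the direction angle is assumed to be given in radians):

```python
import math

def fuse_position(x_first, y_first, distance, heading_rad):
    """Fusion calculation from the embodiment above.

    The movement distance is multiplied by the cosine (resp. sine) of the
    pedestrian direction angle at the target moment, and the result is added
    to the abscissa (resp. ordinate) of the pedestrian position at the first
    stepping moment, giving the predicted position at the target moment.
    """
    x = x_first + distance * math.cos(heading_rad)
    y = y_first + distance * math.sin(heading_rad)
    return x, y
```

For example, a pedestrian at (1, 2) who moves 2 m with heading 0 rad ends up at (3, 2).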
A pedestrian position prediction apparatus, the apparatus comprising:
the acquisition module is used for acquiring the predicted walking step length in a time interval, the pedestrian direction angle data at the target moment in the time interval and the pedestrian position information at the starting moment of the time interval; the time interval is the interval between a first step time and a second step time in the two adjacent step processes, and the starting time of the time interval is the first step time;
the speed estimation module is used for acquiring the estimated walking speed in the time interval according to the estimated walking step length in the time interval;
and the position estimation module is used for obtaining the pedestrian position information at the target moment in the time interval according to the estimated walking speed, the pedestrian direction angle data at the target moment, the pedestrian position information at the starting moment of the time interval and the time difference value between the target moment and the starting moment.
In one embodiment, the position estimation module is specifically configured to verify the estimated walking speed according to a preset speed threshold;
if the estimated walking speed is less than or equal to a preset speed threshold, calculating to obtain a movement distance corresponding to the target time according to the interval duration of the target time and the first stepping time in the time interval and the estimated walking speed;
and performing fusion calculation according to the movement distance corresponding to the target moment, the pedestrian direction angle data at the target moment and the pedestrian position information at the first stepping moment to obtain the pedestrian position information at the target moment.
A computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
acquiring the predicted walking step length in a time interval, pedestrian direction angle data at a target moment in the time interval and pedestrian position information at the starting moment of the time interval; the time interval is the interval between a first step time and a second step time in the two adjacent step processes, and the starting time of the time interval is the first step time;
obtaining the estimated walking speed in the time interval according to the estimated walking step length in the time interval;
and obtaining the pedestrian position information at the target moment in the time interval according to the estimated walking speed, the pedestrian direction angle data at the target moment, the pedestrian position information at the starting moment of the time interval and the time difference between the target moment and the starting moment.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring the predicted walking step length in a time interval, pedestrian direction angle data at a target moment in the time interval and pedestrian position information at the starting moment of the time interval; the time interval is the interval from the first step time to the second step time in the two adjacent step processes, and the starting time of the time interval is the first step time;
obtaining the estimated walking speed in the time interval according to the estimated walking step length in the time interval;
and obtaining the pedestrian position information at the target moment in the time interval according to the estimated walking speed, the pedestrian direction angle data at the target moment, the pedestrian position information at the starting moment of the time interval and the time difference between the target moment and the starting moment.
According to the pedestrian position prediction method, apparatus, computer device and storage medium, the estimated walking step length within a time interval, the pedestrian direction angle data at the target moment within the time interval, and the pedestrian position information at the starting moment of the time interval are obtained; the time interval is the interval from the first stepping moment to the second stepping moment of two adjacent steps, and the starting moment of the time interval is the first stepping moment. The estimated walking speed within the time interval is obtained from the estimated walking step length within the time interval, and the pedestrian position information at the target moment within the time interval is obtained from the estimated walking speed, the pedestrian direction angle data at the target moment, the pedestrian position information at the starting moment of the time interval, and the time difference between the target moment and the starting moment. By adopting the method, the position change of the pedestrian within a walking time interval is simulated from the estimated walking speed within the interval, the pedestrian direction angle data at the target moment, and the initial position at the starting moment of each interval, so that pedestrian position information at the target moment within the interval is obtained and the output frequency of pedestrian position information is increased.
Drawings
FIG. 1 is a flow diagram illustrating a method for predicting a pedestrian location in one embodiment;
FIG. 2 is a flowchart illustrating the steps of obtaining data such as estimated walking speed and walking step size in one embodiment;
FIG. 3 is a flowchart illustrating the step of calculating an estimated walking speed according to one embodiment;
FIG. 4 is a schematic flow chart illustrating the steps of calculating a movement distance in one embodiment;
FIG. 5 is a flowchart illustrating the steps performed when the estimated walking speed is greater than a preset threshold in one embodiment;
FIG. 6 is a flowchart illustrating steps for estimating pedestrian location based on estimated walking speed and distance traveled, etc., in one embodiment;
FIG. 7 is an example flow diagram of a method for pedestrian location prediction in one embodiment;
FIG. 8 is a block diagram showing the construction of a pedestrian position predicting apparatus according to one embodiment;
FIG. 9 is a diagram illustrating an internal structure of a computer device according to an embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
In AR/VR navigation, the output display frequency of the image frames of the pedestrian navigation view is 10-30 Hz; however, the output frequency of pedestrian position information traditionally output according to the pedestrian's walking frequency is only 1-3 Hz. The AR/VR image frame updates therefore do not correspond to the output frequency of the pedestrian position information, and the fluency of position display in the image frames cannot be ensured. The following technical scheme is provided for the pedestrian position output frequency requirement corresponding to the AR/VR image frames.
Therefore, in an embodiment, as shown in fig. 1, a pedestrian position prediction method is provided. This embodiment is exemplified by applying the method to a terminal, for example a mobile terminal (e.g., a mobile phone or an AR/VR wearable device), which may integrate positioning data acquisition modules with functions such as an accelerometer, a gyroscope and a magnetic compass, or may directly acquire positioning data from positioning data acquisition devices (e.g., an acceleration sensor, a magnetic sensor, etc.) deployed in the positioning scene. The terminal that performs the method is therefore not particularly limited, and terminal devices that can perform the method are collectively referred to as a computer device in this embodiment. In addition, it is understood that the method may also be applied to a server, or to a system including a terminal and a server and realized through the interaction of the terminal and the server. In this embodiment, the method includes the following steps:
step 101, acquiring predicted walking step length in a time interval, pedestrian direction angle data at a target moment in the time interval, and pedestrian position information at a starting moment of the time interval; the time interval is the interval between the first step time and the second step time in the two adjacent step processes, and the starting time of the time interval is the first step time.
During pedestrian stepping, there is a time interval of a certain duration between steps. For example, in two adjacent steps there is a time interval between the first stepping moment and the second stepping moment (e.g., the landing moment of the pedestrian's left foot is the first stepping moment, and the landing moment of the right foot is the second stepping moment). Therefore, when position prediction output is performed per step, pedestrian position information can be output once at the end of each step, but no pedestrian position information can be obtained within the time interval itself.
In implementation, the computer device acquires the estimated walking step length within the time interval, the pedestrian direction angle data at the target moment within the time interval, and the pedestrian position information at the starting moment (first stepping moment) of the time interval, namely the position coordinates of the pedestrian. The coordinate system of the position coordinates may be a reference coordinate system with the initial position of the pedestrian as the origin of coordinates; thus, the pedestrian's initial position coordinates are (x0, y0) = (0, 0).
Optionally, the target moment within the time interval may be any moment in the interval, according to the output requirement of the position information; provided the precision of the time system is high enough, a moment within the interval can be arbitrarily fine-grained, so the number of target moments within the time interval is not limited.
Optionally, before the first pedestrian position prediction, the historical position data is cleared and the pedestrian's walking speed and walking time are initialized, so that the initial position of the pedestrian is set to the (0,0) point, from which the subsequent stepping positions are predicted.
Optionally, during stepping, the stepping moment can be determined by a preset step detection algorithm every time a step is made. Specifically: the three-axis acceleration during walking is collected by an accelerometer (i.e., acceleration data collected along the three axes of the accelerometer in its own coordinate system), denoted (ax, ay, az); the modulus of the acceleration data corresponding to each frame of the accelerometer output is calculated; a modulus curve is drawn from these values; and the acceleration modulus curve is scanned with a sliding window of preset duration (for example, 1.2 s). When a peak of the acceleration modulus curve appears within the sliding window, it is determined that the pedestrian has taken a step, and the moment corresponding to the curve peak is recorded as the pedestrian stepping moment t_step.
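The step detection just described can be sketched as sliding-window peak detection on the acceleration modulus. This is a minimal illustration, not the patent's exact algorithm: the sample rate and the minimum-peak threshold are assumptions added to make the sketch runnable (the patent specifies only the 1.2 s window).

```python
import math

def detect_steps(samples, window_s=1.2, rate_hz=50.0, min_peak=10.5):
    """Return stepping moments t_step (seconds) from accelerometer samples.

    samples: list of (ax, ay, az) tuples at a fixed sample rate.
    A sample is a step moment when its modulus is the maximum of the
    surrounding window and exceeds min_peak (illustrative threshold).
    """
    mods = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in samples]
    half = int(window_s * rate_hz / 2)  # half-window in samples
    steps = []
    for i, m in enumerate(mods):
        lo, hi = max(0, i - half), min(len(mods), i + half + 1)
        if m >= min_peak and m == max(mods[lo:hi]):
            steps.append(i / rate_hz)  # peak moment recorded as t_step
    return steps
```

With a flat gravity-only signal (modulus about 9.8) and two injected spikes, the two spike moments are reported as steps.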
And 102, obtaining the estimated walking speed in the time interval according to the estimated walking step length in the time interval.
In implementation, the computer device obtains the estimated walking speed within the time interval according to the estimated walking step length within the time interval. Specifically, the estimated walking step length acquired by the computer device may be denoted step_length, and the time interval may be expressed as t_step - last_step_time, where last_step_time denotes the first stepping moment and t_step denotes the second stepping moment. The calculation formula of the estimated walking speed is:

speed = step_length / (t_step - last_step_time)
Optionally, when the stepping position is predicted for the first time, the speed of the pedestrian at the (0,0) point (initial position) is required. The initial walking speed of the pedestrian may be set to 1 m/s (i.e., the initial-position walking speed), and this initial walking speed serves as the basis of the predicted walking speed for calculating the position of the pedestrian at the first stepping start moment. The calculation formula of the predicted walking speed over the whole walking process of the pedestrian is:

speed = 1 m/s, if last_step_time == None
speed = step_length / (t_step - last_step_time), if last_step_time != None

where last_step_time == None corresponds to the first time interval during walking (i.e., there is no previous step), and last_step_time != None corresponds to each subsequent time interval (i.e., the previous stepping moment is not empty).
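The piecewise speed formula above can be written directly as a small function (names are illustrative, following the patent's step_length / t_step / last_step_time notation):

```python
def estimate_speed(step_length, t_step, last_step_time, init_speed=1.0):
    """Estimated walking speed for the interval ending at t_step.

    Before the first step (last_step_time is None) the initial walking
    speed of 1 m/s is used; afterwards the speed is the estimated step
    length divided by the interval duration.
    """
    if last_step_time is None:
        return init_speed
    return step_length / (t_step - last_step_time)
```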
And 103, acquiring pedestrian position information at the target moment in the time interval according to the estimated walking speed, the pedestrian direction angle data at the target moment, the pedestrian position information at the starting moment of the time interval and the time difference between the target moment and the starting moment.
In implementation, the computer device performs fusion calculation according to the estimated walking speed, the time difference value between the target time and the starting time and the pedestrian direction angle data at the target time, and obtains the pedestrian position information at the target time in the time interval on the basis of the pedestrian position information at the starting time of the time interval.
Specifically, the computer device multiplies the time difference between the target moment and the starting moment of the time interval by the estimated walking speed to obtain the movement distance of the pedestrian at the target moment, and then, on the basis of the pedestrian position information at the starting moment, combines the movement distance with the pedestrian direction angle data at the target moment to obtain the pedestrian position information at the target moment.
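Steps 101-103 can be combined into a worked sketch that emits a position at every image-frame timestamp inside one stepping interval, which is the higher output frequency the method targets. A constant heading is assumed here for brevity (the patent reads the direction angle per target moment), and all names are illustrative:

```python
import math

def positions_in_interval(x0, y0, step_length, t_first, t_second,
                          frame_times, heading_rad):
    """Pedestrian positions at each frame timestamp within one stepping
    interval [t_first, t_second], starting from position (x0, y0)."""
    speed = step_length / (t_second - t_first)      # step 102: estimated speed
    out = []
    for t in frame_times:                           # each target moment
        d = speed * (t - t_first)                   # movement distance
        out.append((x0 + d * math.cos(heading_rad),  # step 103: fusion along
                    y0 + d * math.sin(heading_rad)))  # the heading
    return out
```

For a 0.5 s step of 0.7 m sampled at three frame timestamps, this yields positions at 0 m, 0.35 m and 0.7 m along the heading, instead of a single per-step output.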
In the pedestrian position prediction method, the estimated walking step length within a time interval, the pedestrian direction angle data at the target moment within the time interval, and the pedestrian position information at the starting moment of the time interval are obtained; the time interval is the interval from the first stepping moment to the second stepping moment of two adjacent steps, and the starting moment of the time interval is the first stepping moment. The estimated walking speed within the time interval is obtained from the estimated walking step length, and the pedestrian position information at the target moment is obtained from the estimated walking speed, the pedestrian direction angle data at the target moment, the pedestrian position information at the starting moment, and the time difference between the target moment and the starting moment. Because any moment within the time interval can serve as a target moment, position information corresponding to target moments at any required time precision can be output, thereby increasing the output frequency of pedestrian position information.
Optionally, the target time is the time corresponding to an updated image frame in AR/VR (augmented reality and/or virtual reality) navigation. To keep the output frequency of image frames consistent with the output frequency of the pedestrian position information shown in those frames, the timestamp carried by each updated image frame is taken as the target time: the estimated walking step length of the time interval containing that timestamp, the pedestrian direction angle data at the target time (the frame timestamp), and the pedestrian position information at the start time of the interval are obtained according to the processing of step 102 and step 103. If the pedestrian position is located for the timestamp carried by every image frame within the interval, the output frequency of the image frames can be kept consistent with the output frequency of the corresponding pedestrian positions.
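As an illustrative sketch (not part of the embodiment's text; the predictor interface and timestamp representation are assumed here for demonstration), matching the position output frequency to the frame output frequency amounts to evaluating the position predictor at every frame timestamp that falls inside the current step interval:

```python
def positions_for_frames(frame_timestamps, interval_start, interval_end, predict):
    """Treat each image frame's timestamp inside the step interval as a
    target time, so exactly one position is output per image frame."""
    return [(ts, predict(ts)) for ts in frame_timestamps
            if interval_start <= ts <= interval_end]
```

Frames stamped outside the current interval would be handled once their own interval's step length and speed become available.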
In one embodiment, as shown in fig. 2, the specific processing procedure of step 101 is as follows:
step 201, calculating the estimated walking step length within the time interval according to pedestrian acceleration data acquired by an accelerometer and a preset step length estimation algorithm.
The data required for pedestrian position prediction can be acquired in real time during pedestrian travel by corresponding sensor devices, with the computer device then obtaining and processing the sensor data; alternatively, a data acquisition module providing each sensor's acquisition function can be integrated into the computer device, which then completes both data acquisition and internal data processing.
In implementation, the computer device calculates an estimated walking step length within a time interval according to pedestrian acceleration data acquired by an accelerometer (or an acceleration acquisition module) and a preset step length estimation algorithm.
Specifically, the accelerometer collects a sequence of acceleration modulus values over the time interval from the first stepping time to the second stepping time. The computer device calculates the variance of this modulus sequence, denoted var, then determines the interval duration from the first stepping time to the second stepping time, denoted delta_t, from which the step frequency is f = 1/delta_t. Finally, the walking step length is estimated by the step length estimation model step_length = a*f + b*var + c, where a, b and c are model coefficients, generally obtained by pre-calibration.
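The step length model above can be sketched as follows; this is an illustrative implementation, and the coefficient values a, b and c shown are placeholders rather than calibrated values from the embodiment:

```python
import statistics

def estimate_step_length(accel_norms, delta_t, a=0.3, b=0.1, c=0.4):
    """Estimate one step's length from the sequence of acceleration
    modulus values sampled between two consecutive stepping times.
    a, b and c are the pre-calibrated model coefficients."""
    var = statistics.pvariance(accel_norms)  # variance of the modulus sequence
    f = 1.0 / delta_t                        # step frequency from the interval duration
    return a * f + b * var + c               # step_length = a*f + b*var + c
```
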
Step 202, acquiring pedestrian direction angle data at the target time within the time interval through a gyroscope and/or a magnetic compass.
In implementation, pedestrian direction angle data at the target time within the time interval is acquired through a gyroscope and/or a magnetic compass. Calculation examples are given below for obtaining the direction angle with a gyroscope alone, with a magnetic compass alone, and with a combination of the two; since many such calculation methods exist, this embodiment does not limit how the pedestrian direction angle is calculated and acquired.
1) Determining the pedestrian direction angle with a gyroscope: the gyroscope acquires the three-axis angular velocities of the pedestrian while walking (i.e., angular velocities about the three axes of the gyroscope's coordinate system). One axis is selected as the direction axis, generally the z axis, and the direction angle is calculated from the angular velocity about that axis. The specific formula is yaw = t * gyro_z, where yaw is the pedestrian direction angle data, t is the gyroscope sampling time corresponding to the target time within the interval, and gyro_z is the angular velocity about the gyroscope's z axis.
2) Determining the pedestrian direction angle with a magnetic compass: the magnetic compass (also called a magnetic sensor) acquires the magnetic vectors mx, my and mz along its three acquisition directions while the pedestrian walks; the horizontal inclination angle during pedestrian movement is then calculated from the acceleration output by the accelerometer; using this inclination angle, the three-axis magnetic vector is projected onto the horizontal plane to obtain two horizontal magnetic components; finally, the direction angle yaw in the magnetic frame is obtained as the angle of these two horizontal components and used as the pedestrian direction angle data.
3) The direction angle yaw1 is determined with the gyroscope as in method 1), the direction angle yaw2 is determined with the magnetic compass as in method 2), and the two are fused by a complementary filter algorithm or a Kalman filter algorithm to obtain a higher-precision direction angle yaw3 as the pedestrian direction angle data.
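Methods 2) and 3) can be sketched as follows. This is an illustrative sketch: the tilt-compensation sign conventions and the complementary-filter weight alpha are common choices assumed here, not values specified in the embodiment:

```python
import math

def compass_yaw(mx, my, mz, pitch, roll):
    """Method 2): project the three-axis magnetic vector onto the
    horizontal plane using the tilt angles derived from the
    accelerometer, then take the angle of the two horizontal
    components as the direction angle."""
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch)
          + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    return math.atan2(yh, xh)

def fuse_yaw(yaw_gyro, yaw_mag, alpha=0.98):
    """Method 3): complementary-filter fusion of the gyroscope yaw and
    the compass yaw. Blending sines and cosines on the unit circle
    keeps angles near +/-pi from wrapping incorrectly."""
    s = alpha * math.sin(yaw_gyro) + (1 - alpha) * math.sin(yaw_mag)
    c = alpha * math.cos(yaw_gyro) + (1 - alpha) * math.cos(yaw_mag)
    return math.atan2(s, c)
```
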
Step 203, obtaining the pedestrian position information at the start time according to the estimated walking step length within the time interval and the pedestrian direction angle data at the start time of the time interval.
In implementation, the computer device obtains the pedestrian position information at the start time of the time interval according to the estimated walking step length step_length within the time interval and the pedestrian direction angle data yaw at the start time of the interval.
Specifically, when the time interval corresponds to the pedestrian's first step, the pedestrian position information is initialized to (0,0), so the start-time position information of the first time interval (the first stepping time interval) in the pedestrian's walking process is (0,0) and requires no calculation;

when the time interval is not the first stepping time interval, its start time is the end time of the previous step, so predicting the current pedestrian position requires the position information at the end of the previous step as a basis. This embodiment takes the stepping time interval adjacent to the first one as an example. The first stepping time interval runs from time 0 to the end of the first step; the adjacent stepping time interval runs from the start of the second step (numerically equal to the end of the first step) to the end of the second step. When the pedestrian completes the second step, the position information at the end of the first step (i.e., the start of the second step) is required, so the pedestrian position information at the end of the first step is calculated as:
px_1 = px_0 + step_length * cos(yaw)    (1)

py_1 = py_0 + step_length * sin(yaw)    (2)
Here, since the pedestrian position corresponding to time 0 in the first stepping interval is the initialized pedestrian position (0,0), this example has
(px_0, py_0) = (0, 0)
Substituting this into formulas (1) and (2) gives the pedestrian position information (position coordinates) at the first stepping termination moment. Iterating formulas (1) and (2) in the same way yields the pedestrian position information at the start time of each time interval.
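The step-by-step position iteration described above can be sketched as follows (an illustrative sketch; the list-of-steps input format is assumed):

```python
import math

def step_positions(start, steps):
    """Iterate the step-to-step position update: each step advances the
    position by its estimated step length along the direction angle of
    that step. `steps` is a list of (step_length, yaw) pairs."""
    px, py = start
    positions = [(px, py)]
    for step_length, yaw in steps:
        px += step_length * math.cos(yaw)  # abscissa update
        py += step_length * math.sin(yaw)  # ordinate update
        positions.append((px, py))
    return positions
```

Each returned entry is the start-of-interval position for the next stepping interval.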
In one embodiment, as shown in FIG. 3, the specific process of step 102 is as follows:
step 301, calculating the interval duration between the second stepping time and the first stepping time in the time interval.
In implementation, the computer device calculates the interval duration between the second stepping time and the first stepping time, i.e., the full duration of the time interval. The specific calculation formula is t_step - last_step_time, where t_step represents the second stepping time and last_step_time is the first stepping time within the time interval.
Step 302, dividing the estimated walking step length within the time interval by the interval duration to obtain the estimated walking speed within the time interval.
In implementation, the computer device divides the estimated walking step length within the time interval by the interval duration t_step - last_step_time to obtain the estimated walking speed within the time interval.
Specifically, the calculation formula of the estimated walking speed is as follows:
speed = step_length / (t_step - last_step_time)
in this embodiment, the computer device predicts the walking speed of the pedestrian through the estimated walking step length and the interval duration of each time interval to obtain the average speed of the pedestrian in each step time interval process, so as to simulate the running condition of the pedestrian in the step time interval, thereby solving the problem that the existing positioning output device can only output discrete positioning data according to the step frequency.
In one embodiment, as shown in fig. 4, the specific processing procedure of step 103 is as follows:
step 401, calculating the movement distance corresponding to the target time according to the interval duration between the target time and the first stepping time within the time interval and the estimated walking speed.
In implementation, the computer device calculates the movement distance Delta_dist corresponding to the target time according to the interval duration between the target time and the first stepping time within the time interval and the estimated walking speed.
The specific calculation formula is as follows: delta _ dist is speed (t-last _ step _ time), where speed is the estimated walking speed, t is any time in the time interval, and last _ step _ time is the first stepping time, i.e., the starting time of the time interval.
Step 402, performing fusion calculation according to the movement distance corresponding to the target time, the pedestrian direction angle data at the target time and the pedestrian position information at the first stepping time to obtain the pedestrian position information at the target time.
In implementation, the computer device performs fusion calculation according to the movement distance corresponding to the target time, the pedestrian direction angle data at the target time and the pedestrian position information at the first stepping time to obtain the pedestrian position information at the target time.
In this embodiment, the computer device predicts the pedestrian's position at the target time within the time interval from the estimated walking speed and the time difference between the target time and the interval's start time, combined with the pedestrian direction angle data, and outputs the resulting position information, thereby improving the output efficiency of pedestrian positioning data.
In one embodiment, as shown in fig. 5, the method further comprises:
Step 501, checking the estimated walking speed according to a preset speed threshold.
In implementation, the computer device checks the estimated walking speed in advance according to a preset speed threshold.
Specifically, after the estimated walking speed is calculated as in steps 301-302, it needs to be checked against a speed threshold (denoted speed_threshold) to avoid a wrong speed estimate causing a large deviation in the subsequent pedestrian position estimate. Optionally, the speed threshold may be obtained by the computer device screening the maximum from multiple pedestrian walking speed records; it represents the upper limit of pedestrian walking speed, for example 3 to 5 m/s.
Step 502, if the estimated walking speed is less than or equal to the preset speed threshold, calculating the movement distance corresponding to the target time according to the interval duration between the target time and the first stepping time within the time interval and the estimated walking speed.
In implementation, if the estimated walking speed is less than or equal to the preset speed threshold, i.e., no obvious error (clearly abnormal value) has occurred in the speed estimate, the computer device calculates the movement distance Delta_dist corresponding to the target time according to the interval duration between the target time and the first stepping time within the time interval and the estimated walking speed.
The specific calculation formula is: Delta_dist = speed * (t - last_step_time), where speed is the estimated walking speed, t is any time within the time interval, and last_step_time is the first stepping time, i.e., the start time of the time interval.
Step 503, if the estimated walking speed is greater than the preset speed threshold, calculating the movement distance corresponding to the target time according to the interval duration between the target time and the first stepping time within the time interval and the preset speed threshold.
In implementation, if the estimated walking speed is greater than the preset speed threshold, the speed estimate is wrong (the estimated value is clearly abnormal). To keep the pedestrian position prediction process running smoothly, the preset speed threshold is used in place of the estimated walking speed to calculate the movement distance corresponding to the target time. The specific formula is:
Delta_dist=speed_threshold*(t-last_step_time) (3)
Step 504, performing fusion calculation according to the movement distance corresponding to the target time, the pedestrian direction angle data at the target time and the pedestrian position information at the first stepping time to obtain the pedestrian position information at the target time.
In implementation, the computer device performs fusion calculation according to the movement distance corresponding to the target time, the pedestrian direction angle data at the target time and the pedestrian position information at the first stepping time to obtain the pedestrian position information at the target time.
Optionally, to further safeguard the pedestrian position prediction, the computer device may also set a movement distance threshold Dist_threshold for the movement distance obtained at the target time, check that distance to judge whether it is clearly abnormal, and if so (i.e., the movement distance exceeds the threshold), replace the abnormal value with the threshold. Expressed as a formula: Delta_dist_f = min(Delta_dist, Dist_threshold).
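The speed check of steps 502/503 together with the optional distance check can be sketched as follows; the threshold values shown are illustrative defaults, not values from the embodiment:

```python
def clamped_move_distance(speed, t, last_step_time,
                          speed_threshold=5.0, dist_threshold=2.0):
    """Movement distance at target time t with both safeguards: an
    out-of-range speed estimate is replaced by the speed threshold
    (formula (3)), and the resulting distance is capped at the
    distance threshold."""
    checked_speed = min(speed, speed_threshold)        # speed check, steps 502/503
    delta_dist = checked_speed * (t - last_step_time)  # distance at the target time
    return min(delta_dist, dist_threshold)             # optional distance check
```
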
In one embodiment, as shown in fig. 6, the specific processing procedure of step 402 is as follows:
step 601, performing multiplication according to the movement distance corresponding to the target time and the cosine value of the pedestrian direction angle at the target time to obtain a first multiplication result, and adding the first multiplication result to the position abscissa in the pedestrian position information at the first stepping time to obtain the predicted position abscissa of the pedestrian position at the target time.
In implementation, the computer device multiplies the movement distance (Delta_dist) corresponding to the target time by the cosine of the pedestrian direction angle (yaw) at the target time to obtain the first multiplication result Delta_dist * cos(yaw), then adds this result to the position abscissa px_last in the pedestrian position information at the first stepping time to obtain the predicted position abscissa px_predict of the pedestrian position at the target time within the time interval.
The specific calculation formula is: px_predict = px_last + Delta_dist * cos(yaw)    (4).
Step 602, performing multiplication according to the movement distance corresponding to the target time and the sine value of the pedestrian direction angle at the target time to obtain a second multiplication result, and adding the second multiplication result to the position ordinate in the pedestrian position information at the first stepping time to obtain the predicted position ordinate of the pedestrian position at the target time.
In implementation, the computer device multiplies the movement distance (Delta_dist) corresponding to the target time by the sine of the pedestrian direction angle (yaw) at the target time to obtain the second multiplication result Delta_dist * sin(yaw), then adds this result to the position ordinate py_last in the pedestrian position information at the first stepping time to obtain the predicted position ordinate py_predict of the pedestrian position at the target time.
The specific calculation formula is: py_predict = py_last + Delta_dist * sin(yaw)    (5).
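Formulas (4) and (5) can be sketched together as one illustrative helper (variable names follow the embodiment):

```python
import math

def predict_position(px_last, py_last, delta_dist, yaw):
    """Project the movement distance onto the direction angle and add
    it to the position at the first stepping time."""
    px_predict = px_last + delta_dist * math.cos(yaw)  # formula (4)
    py_predict = py_last + delta_dist * math.sin(yaw)  # formula (5)
    return px_predict, py_predict
```
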
In this embodiment, an example of a method for predicting a pedestrian position is further provided, as shown in fig. 7, a specific processing procedure is as follows:
step 701, initializing pedestrian position, walking time and walking speed data;
step 702, monitoring the step of the pedestrian according to a step detection method, if the step of the pedestrian is monitored, executing step 703, otherwise, ending;
step 703, obtaining the stepping times of the stepping time interval during the pedestrian's stepping process (the time interval is the duration from the first stepping time to the second stepping time), obtaining pedestrian direction angle data at the target time within the time interval from a gyroscope/magnetic compass, and estimating the walking step length of the stepping time interval with a step length estimation algorithm;
step 704, obtaining the estimated walking speed in the time interval according to the estimated walking step length in the time interval and the time interval duration (the difference between the second step time and the first step time);
step 705, obtaining the stepping distance of the target time in the time interval according to the predicted walking step length and the pedestrian direction angle data of the target time in the time interval, and obtaining the pedestrian position of each stepping time by combining the time interval duration;
step 706, obtaining a movement distance corresponding to the target time according to the interval duration between the target time and the starting time (first stepping time) in the time interval and the estimated walking speed, and performing fusion calculation according to the movement distance corresponding to the target time, pedestrian direction angle data at the target time and pedestrian position information at the starting time to obtain pedestrian position information at the target time in the time interval.
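Steps 704-706 above can be combined into one end-to-end sketch for a single stepping interval; the dictionary layout of `interval` is an assumed convenience for the example, not a structure from the embodiment:

```python
import math

def predict_at_time(t, interval):
    """Position at target time t within one stepping interval, given
    the quantities the flow of fig. 7 derives: the two stepping times,
    the estimated step length, the yaw at t, and the interval's start
    position."""
    duration = interval["t_step"] - interval["last_step_time"]  # step 704
    speed = interval["step_length"] / duration                  # step 704
    delta_dist = speed * (t - interval["last_step_time"])       # step 706
    px0, py0 = interval["start_position"]
    yaw = interval["yaw"]
    return (px0 + delta_dist * math.cos(yaw),                   # fusion, step 706
            py0 + delta_dist * math.sin(yaw))
```
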
It should be understood that although the various steps in the flowcharts of fig. 1-7 are shown in an order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated otherwise, there is no strict order restriction on the execution of these steps, and they may be performed in other orders. Moreover, at least some of the steps in fig. 1-7 may include multiple sub-steps or stages, which are not necessarily completed at the same time but may be performed at different times, and are not necessarily executed sequentially but may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 8, there is provided a pedestrian position prediction apparatus 800 including: an acquisition module 810, a velocity estimation module 820, and a location estimation module 830, wherein:
the obtaining module 810 is configured to obtain predicted walking step length in a time interval, pedestrian direction angle data at a target time in the time interval, and pedestrian position information at a starting time of the time interval; the time interval is the interval between the first stepping time and the second stepping time in the two adjacent stepping processes, and the starting time of the time interval is the first stepping time;
a speed estimation module 820, configured to obtain an estimated walking speed in a time interval according to the estimated walking step in the time interval;
the position estimation module 830 is configured to obtain pedestrian position information at the target time within the time interval according to the estimated walking speed, the pedestrian direction angle data at the target time, the pedestrian position information at the starting time of the time interval, and the time difference between the target time and the starting time.
In one embodiment, the obtaining module 810 is specifically configured to: calculate the estimated walking step length within the time interval according to pedestrian acceleration data acquired by an accelerometer and a preset step length estimation algorithm;

acquire pedestrian direction angle data at the target time within the time interval through a gyroscope and/or a magnetic compass;

and obtain the pedestrian position information at the start time according to the estimated walking step length within the time interval and the pedestrian direction angle data at the start time of the time interval.
In one embodiment, the speed estimation module 820 is specifically configured to: calculate the interval duration between the second stepping time and the first stepping time within the time interval;

and divide the estimated walking step length within the time interval by the interval duration to obtain the estimated walking speed within the time interval.
In one embodiment, the position estimation module 830 is specifically configured to: check the estimated walking speed according to a preset speed threshold;

if the estimated walking speed is less than or equal to the preset speed threshold, calculate the movement distance corresponding to the target time according to the interval duration between the target time and the first stepping time within the time interval and the estimated walking speed;

and perform fusion calculation according to the movement distance corresponding to the target time, the pedestrian direction angle data at the target time and the pedestrian position information at the first stepping time to obtain the pedestrian position information at the target time.
In one embodiment, the position estimation module 830 is further configured to: if the estimated walking speed is greater than the preset speed threshold, calculate the movement distance corresponding to the target time according to the interval duration between the target time and the first stepping time within the time interval and the preset speed threshold;

and perform fusion calculation according to the movement distance corresponding to the target time, the pedestrian direction angle data at the target time and the pedestrian position information at the first stepping time to obtain the pedestrian position information at the target time.
In one embodiment, the position estimation module 830 is specifically configured to: multiply the movement distance corresponding to the target time by the cosine of the pedestrian direction angle at the target time to obtain a first multiplication result, and add the first multiplication result to the position abscissa in the pedestrian position information at the first stepping time to obtain the predicted position abscissa of the pedestrian position at the target time;

and multiply the movement distance corresponding to the target time by the sine of the pedestrian direction angle at the target time to obtain a second multiplication result, and add the second multiplication result to the position ordinate in the pedestrian position information at the first stepping time to obtain the predicted position ordinate of the pedestrian position at the target time.
For specific definition of the pedestrian position prediction device 800, reference may be made to the above definition of the pedestrian position prediction method, and details thereof are not repeated here. The various modules in the pedestrian position prediction apparatus 800 described above may be implemented in whole or in part by software, hardware, and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a terminal, and its internal structure diagram may be as shown in fig. 9. The computer device includes a processor, a memory, a communication interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The communication interface of the computer device is used for carrying out wired or wireless communication with an external terminal, and the wireless communication can be realized through WIFI, an operator network, NFC (near field communication) or other technologies. The computer program is executed by a processor to implement a pedestrian position prediction method. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the structure shown in fig. 9 is merely a block diagram of part of the structure related to the present disclosure and does not limit the computer device to which the present disclosure is applied; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, comprising a memory and a processor, the memory having a computer program stored therein, the processor implementing the following steps when executing the computer program:
acquiring the predicted walking step length in a time interval, pedestrian direction angle data at a target moment in the time interval and pedestrian position information at the starting moment of the time interval; the time interval is the interval from the first step time to the second step time in the two adjacent step processes, and the starting time of the time interval is the first step time;
obtaining the estimated walking speed in the time interval according to the estimated walking step length in the time interval;
and obtaining the pedestrian position information at the target moment in the time interval according to the estimated walking speed, the pedestrian direction angle data at the target moment, the pedestrian position information at the starting moment of the time interval and the time difference between the target moment and the starting moment.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
calculating to obtain an estimated walking step length within a time interval according to pedestrian acceleration data acquired by an accelerometer and a preset step length estimation algorithm;
acquiring pedestrian direction angle data at a target moment in a time interval through a gyroscope and/or a magnetic compass;
and obtaining pedestrian position information of the initial moment according to the estimated walking step length in the time interval and the pedestrian direction angle data at the initial moment of the time interval.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
calculating the interval duration between the second stepping time and the first stepping time in the time interval;
and performing division calculation according to the estimated walking step length and the interval duration in the time interval to obtain the estimated walking speed in the time interval.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
checking the estimated walking speed according to a preset speed threshold value;
if the estimated walking speed is less than or equal to the preset speed threshold, calculating to obtain a movement distance corresponding to the target time according to the interval duration of the target time and the first stepping time in the time interval and the estimated walking speed;
and performing fusion calculation according to the movement distance corresponding to the target moment, the pedestrian direction angle data at the target moment and the pedestrian position information at the first stepping moment to obtain the pedestrian position information at the target moment.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
if the estimated walking speed is greater than the preset speed threshold, calculating the movement distance corresponding to the target moment according to the interval duration between the target moment and the first stepping moment in the time interval and the preset speed threshold;
and performing fusion calculation according to the movement distance corresponding to the target moment, the pedestrian direction angle data at the target moment and the pedestrian position information at the first stepping moment to obtain the pedestrian position information at the target moment.
In one embodiment, the processor, when executing the computer program, further performs the steps of:
performing multiplication operation according to the movement distance corresponding to the target moment and the cosine value of the pedestrian direction angle at the target moment to obtain a first multiplication operation result, and performing addition operation on the first multiplication operation result and the position abscissa in the pedestrian position information at the first stepping moment to obtain a predicted position abscissa of the pedestrian position at the target moment;
and performing multiplication operation according to the movement distance corresponding to the target time and the sine value of the pedestrian direction angle at the target time to obtain a second multiplication operation result, and performing addition operation on the second multiplication operation result and the position ordinate in the pedestrian position information at the first stepping time to obtain the predicted position ordinate of the pedestrian position at the target time.
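The cosine/sine fusion described above is the standard dead-reckoning position update. A minimal sketch (the names and the radian convention for the direction angle are assumptions):

```python
import math

def fuse_position(x_first, y_first, distance, direction_angle_rad):
    """Advance the position at the first stepping moment by the movement
    distance along the pedestrian direction angle at the target moment."""
    x_pred = x_first + distance * math.cos(direction_angle_rad)
    y_pred = y_first + distance * math.sin(direction_angle_rad)
    return x_pred, y_pred
```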
In one embodiment, a computer-readable storage medium is provided, having a computer program stored thereon which, when executed by a processor, performs the steps of:
acquiring the predicted walking step length in a time interval, pedestrian direction angle data at a target moment in the time interval and pedestrian position information at the starting moment of the time interval; the time interval is the interval from the first stepping time to the second stepping time of two adjacent steps, and the starting time of the time interval is the first stepping time;
obtaining the estimated walking speed in the time interval according to the estimated walking step length in the time interval;
and obtaining the pedestrian position information at the target moment in the time interval according to the estimated walking speed, the pedestrian direction angle data at the target moment, the pedestrian position information at the starting moment of the time interval and the time difference between the target moment and the starting moment.
In one embodiment, the computer program when executed by the processor further performs the steps of:
calculating to obtain an estimated walking step length within a time interval according to pedestrian acceleration data acquired by an accelerometer and a preset step length estimation algorithm;
acquiring pedestrian direction angle data at a target moment in a time interval through a gyroscope and/or a magnetic compass;
and obtaining pedestrian position information of the initial moment according to the estimated walking step length in the time interval and the pedestrian direction angle data at the initial moment of the time interval.
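The disclosure leaves the step length estimation algorithm as a preset choice. One commonly used option, shown here purely as an example (the Weinberg approach; the gain K and the per-step sample window are assumptions, not part of the disclosure), estimates the step length from the acceleration extremes within one detected step:

```python
def weinberg_step_length(accel_samples, k=0.5):
    """Weinberg step-length estimate: L = K * (a_max - a_min) ** 0.25,
    where accel_samples are vertical acceleration values (m/s^2)
    collected by the accelerometer over one detected step and k is an
    empirically calibrated gain."""
    a_max = max(accel_samples)
    a_min = min(accel_samples)
    return k * (a_max - a_min) ** 0.25
```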
In one embodiment, the computer program when executed by the processor further performs the steps of:
calculating the interval duration between the second stepping time and the first stepping time in the time interval;
and performing division calculation according to the estimated walking step length and the interval duration in the time interval to obtain the estimated walking speed in the time interval.
In one embodiment, the computer program when executed by the processor further performs the steps of:
checking the estimated walking speed according to a preset speed threshold value;
if the estimated walking speed is less than or equal to the preset speed threshold, calculating to obtain a movement distance corresponding to the target time according to the interval duration of the target time and the first stepping time in the time interval and the estimated walking speed;
and performing fusion calculation according to the movement distance corresponding to the target moment, the pedestrian direction angle data at the target moment and the pedestrian position information at the first stepping moment to obtain the pedestrian position information at the target moment.
In one embodiment, the computer program when executed by the processor further performs the steps of:
if the estimated walking speed is greater than a preset speed threshold, calculating to obtain a movement distance corresponding to the target moment according to the interval duration of the target moment and the first stepping moment in the time interval and the preset speed threshold;
and performing fusion calculation according to the movement distance corresponding to the target moment, the pedestrian direction angle data at the target moment and the pedestrian position information at the first stepping moment to obtain the pedestrian position information at the target moment.
In one embodiment, the computer program when executed by the processor further performs the steps of:
performing multiplication operation according to the movement distance corresponding to the target moment and the cosine value of the pedestrian direction angle at the target moment to obtain a first multiplication operation result, and performing addition operation on the first multiplication operation result and the position abscissa in the pedestrian position information at the first stepping moment to obtain a predicted position abscissa of the pedestrian position at the target moment;
and performing multiplication operation according to the movement distance corresponding to the target time and the sine value of the pedestrian direction angle at the target time to obtain a second multiplication operation result, and performing addition operation on the second multiplication operation result and the position ordinate in the pedestrian position information at the first stepping time to obtain the predicted position ordinate of the pedestrian position at the target time.
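Taken together, the storage-medium embodiments above describe one prediction cycle: estimate the step length and speed, cap the speed at the preset threshold, integrate over the time elapsed since the first stepping moment, and fuse with the direction angle. A self-contained sketch of that cycle (all names and the 3 m/s default threshold are illustrative assumptions, not part of the claims):

```python
import math

def predict_position(step_length, t_first, t_second, t_target,
                     direction_angle_rad, x_first, y_first,
                     speed_threshold=3.0):
    """One cycle of the described pedestrian position prediction.

    step_length: estimated walking step length (m) for the interval.
    t_first, t_second: first and second stepping moments (s) bounding
        the time interval; t_target lies within it.
    Returns the predicted (x, y) position at the target moment.
    """
    interval = t_second - t_first              # step interval duration
    speed = step_length / interval             # estimated walking speed
    speed = min(speed, speed_threshold)        # check against the threshold
    distance = speed * (t_target - t_first)    # movement distance
    return (x_first + distance * math.cos(direction_angle_rad),
            y_first + distance * math.sin(direction_angle_rad))
```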
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing related hardware. The computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database or another medium used in the embodiments provided herein can include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical storage, or the like. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM), among others.
The technical features of the above embodiments can be combined arbitrarily. For the sake of brevity, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features contains no contradiction, it should be considered to fall within the scope of the present specification.
The above-mentioned embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A pedestrian position prediction method, characterized by comprising:
acquiring the predicted walking step length in a time interval, pedestrian direction angle data at a target moment in the time interval and pedestrian position information at the starting moment of the time interval; the time interval is the interval from the first stepping time to the second stepping time of two adjacent steps, and the starting time of the time interval is the first stepping time;
obtaining the estimated walking speed in the time interval according to the estimated walking step length in the time interval;
and obtaining the pedestrian position information at the target moment in the time interval according to the estimated walking speed, the pedestrian direction angle data at the target moment, the pedestrian position information at the starting moment of the time interval and the time difference between the target moment and the starting moment.
2. The method of claim 1, wherein the target time is the timestamp carried by an image frame when the image frame is correspondingly updated in augmented reality navigation and/or virtual reality navigation.
3. The method of claim 1, wherein the obtaining of the predicted walking step length in the time interval, the pedestrian direction angle data at the target time in the time interval, and the pedestrian position information at the starting time of the time interval comprises:
calculating to obtain an estimated walking step length in the time interval according to pedestrian acceleration data acquired by an accelerometer and a step length estimation method;
acquiring pedestrian direction angle data at a target moment in a time interval through a gyroscope and/or a magnetic compass;
and obtaining the pedestrian position information of the initial moment according to the estimated walking step length in the time interval and the pedestrian direction angle data at the initial moment of the time interval.
4. The method of claim 1, wherein obtaining the estimated walking speed for the time interval from the estimated walking step size for the time interval comprises:
calculating the interval duration between the second stepping time and the first stepping time in the time interval;
and performing division calculation according to the estimated walking step length and the interval duration in the time interval to obtain the estimated walking speed in the time interval.
5. The method of claim 1, wherein obtaining the pedestrian position information at the target time within the time interval according to the estimated walking speed, the pedestrian direction angle data at the target time, the pedestrian position information at the starting time of the time interval, and the time difference between the target time and the starting time comprises:
calculating to obtain a movement distance corresponding to the target moment according to the interval duration of the target moment and the first stepping moment in the time interval and the estimated walking speed;
and performing fusion calculation according to the movement distance corresponding to the target moment, the pedestrian direction angle data at the target moment and the pedestrian position information at the first stepping moment to obtain the pedestrian position information at the target moment.
6. The method of claim 1, wherein obtaining the pedestrian position information at the target time within the time interval according to the estimated walking speed, the pedestrian direction angle data at the target time, the pedestrian position information at the starting time of the time interval, and the time difference between the target time and the starting time comprises:
checking the estimated walking speed according to a preset speed threshold;
if the estimated walking speed is less than or equal to a preset speed threshold, calculating to obtain a movement distance corresponding to the target time according to the interval duration of the target time and the first stepping time in the time interval and the estimated walking speed;
if the estimated walking speed is greater than a preset speed threshold, calculating to obtain a movement distance corresponding to the target moment according to the interval duration of the target moment and the first stepping moment in the time interval and the preset speed threshold;
and performing fusion calculation according to the movement distance corresponding to the target moment, the pedestrian direction angle data at the target moment and the pedestrian position information at the first stepping moment to obtain the pedestrian position information at the target moment.
7. The method according to claim 5 or 6, wherein the obtaining of the pedestrian position information at the target time by performing the fusion calculation according to the moving distance corresponding to the target time, the pedestrian direction angle data at the target time, and the pedestrian position information at the first stepping time comprises:
multiplying the moving distance corresponding to the target moment by the cosine of the pedestrian direction angle at the target moment to obtain a first multiplication result, and adding the first multiplication result and the position abscissa in the pedestrian position information at the first stepping moment to obtain a predicted position abscissa of the pedestrian position at the target moment;
and performing multiplication operation according to the movement distance corresponding to the target time and the sine value of the pedestrian direction angle at the target time to obtain a second multiplication operation result, and performing addition operation on the second multiplication operation result and the position ordinate in the pedestrian position information at the first stepping time to obtain the predicted position ordinate of the pedestrian position at the target time.
8. A pedestrian position prediction apparatus characterized by comprising:
the acquisition module is used for acquiring the predicted walking step length in a time interval, the pedestrian direction angle data at the target moment in the time interval and the pedestrian position information at the starting moment of the time interval; the time interval is the interval from the first stepping time to the second stepping time of two adjacent steps, and the starting time of the time interval is the first stepping time;
the speed estimation module is used for acquiring the estimated walking speed in the time interval according to the estimated walking step length in the time interval;
and the position estimation module is used for obtaining the pedestrian position information at the target moment in the time interval according to the estimated walking speed, the pedestrian direction angle data at the target moment, the pedestrian position information at the starting moment of the time interval and the time difference value between the target moment and the starting moment.
9. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202110037224.XA 2021-01-12 2021-01-12 Pedestrian position prediction method, device, computer equipment and storage medium Pending CN112734938A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110037224.XA CN112734938A (en) 2021-01-12 2021-01-12 Pedestrian position prediction method, device, computer equipment and storage medium


Publications (1)

Publication Number Publication Date
CN112734938A true CN112734938A (en) 2021-04-30

Family

ID=75590633

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110037224.XA Pending CN112734938A (en) 2021-01-12 2021-01-12 Pedestrian position prediction method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112734938A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170138740A1 (en) * 2015-11-12 2017-05-18 Blackberry Limited Utilizing camera to assist with indoor pedestrian navigation
CN108460787A (en) * 2018-03-06 2018-08-28 北京市商汤科技开发有限公司 Method for tracking target and device, electronic equipment, program, storage medium
CN108537094A (en) * 2017-03-03 2018-09-14 株式会社理光 Image processing method, device and system
WO2018210055A1 (en) * 2017-05-15 2018-11-22 腾讯科技(深圳)有限公司 Augmented reality processing method and device, display terminal, and computer storage medium
CN109470238A (en) * 2017-09-08 2019-03-15 中兴通讯股份有限公司 A kind of localization method, device and mobile terminal
CN110579211A (en) * 2018-06-07 2019-12-17 北京嘀嘀无限科技发展有限公司 Walking positioning method and system


Similar Documents

Publication Publication Date Title
CN107888828B (en) Space positioning method and device, electronic device, and storage medium
CN108731664B (en) Robot state estimation method, device, computer equipment and storage medium
WO2020253260A1 (en) Time synchronization processing method, electronic apparatus, and storage medium
CN105931275A (en) Monocular and IMU fused stable motion tracking method and device based on mobile terminal
CN112907678B (en) Vehicle-mounted camera external parameter attitude dynamic estimation method and device and computer equipment
CN108871311B (en) Pose determination method and device
CN111959495B (en) Vehicle control method and device and vehicle
CN110211151B (en) Method and device for tracking moving object
US20150149111A1 (en) Device and method for using time rate of change of sensor data to determine device rotation
CN109781117B (en) Combined positioning method and system
CN108318027B (en) Method and device for determining attitude data of carrier
CN103512584A (en) Navigation attitude information output method, device and strapdown navigation attitude reference system
CN105841695B (en) Information processing apparatus, information processing method, and recording medium
CN110388919B (en) Three-dimensional model positioning method based on feature map and inertial measurement in augmented reality
Porzi et al. Visual-inertial tracking on android for augmented reality applications
CN106370178B (en) Attitude measurement method and device of mobile terminal equipment
CN108389264A (en) Coordinate system determines method, apparatus, storage medium and electronic equipment
CN114061570A (en) Vehicle positioning method and device, computer equipment and storage medium
CN111862150A (en) Image tracking method and device, AR device and computer device
CN113566850B (en) Method and device for calibrating installation angle of inertial measurement unit and computer equipment
CN113498505A (en) Modeling pose of tracked object by predicting sensor data
CN112734938A (en) Pedestrian position prediction method, device, computer equipment and storage medium
CN114001730B (en) Fusion positioning method, fusion positioning device, computer equipment and storage medium
CN116631307A (en) Display method, intelligent wearable device, electronic device, device and storage medium
CN108595095B (en) Method and device for simulating movement locus of target body based on gesture control

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination