CN116680346A - Motion trail analysis method, device and medium - Google Patents
Motion trail analysis method, device and medium
- Publication number
- CN116680346A (Application No. CN202211166370.3A)
- Authority
- CN
- China
- Prior art keywords
- motion
- point
- track
- stay
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
- G06F16/245—Query processing
- G06F16/2458—Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
- G06F16/2462—Approximate or statistical queries
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Abstract
The present application relates to the field of electronic devices, and in particular, to a method, an apparatus, and a medium for analyzing a motion trajectory. The motion trail analysis method comprises the following steps: acquiring track data of a plurality of stay points in a user motion track; calculating motion parameters of each stay point based on the track data of that stay point, wherein the motion parameters comprise parameters of a plurality of dimensions describing the motion of the user at the corresponding stay point; and determining the motion type of the user at each stay point based on the motion parameters of that stay point. The motion trail analysis method can determine the motion type of the user at each stay point based on the multi-dimensional motion parameters of the stay point. Furthermore, based on the motion type of the user at each stay point, the motion information of the user at that stay point can be determined, which enriches the motion track information of the user and allows the motion track of the user to be analyzed more thoroughly.
Description
Technical Field
The present application relates to the field of electronic devices, and in particular, to a method, an apparatus, and a medium for analyzing a motion trajectory.
Background
With the development of the internet, the positioning function of an electronic device can be extended to serve various fields, such as sports and fitness or point of interest (POI) analysis of user locations, through analysis and research on the location points reported by the positioning of the electronic device. The positioning function may be implemented by a wireless fidelity (Wi-Fi) network of the electronic device, a Global Positioning System (GPS), a base station identification number (ID), and the like.
For example, in fig. 1, the mobile phone 100 may determine its current location, including the longitude and latitude of the mobile phone 100 at time t, by acquiring GPS positioning signals from a plurality of satellites 200. When the user carries the mobile phone 100 in a motion state, a plurality of location points (i.e. track points) in the motion state of the user can be determined as track data of the mobile phone 100, as shown in fig. 2. The trajectory data consists of a series of time-ordered trajectory points, which may be represented as P = {p1, p2, p3, …, pn}, where any i-th trajectory point pi = {longitude, latitude, time}. Here, time refers to a point in time, for example, 11:00 on December 1.
Currently, analysis of track points generally includes identifying the track points in track data that characterize a stay (i.e., stay points), so as to divide a user motion track into stay points and moving track segments. For example, a stay point set S1, a stay point set S2, and a stay point set S3 may be identified in fig. 2. Further, the motion trajectory of the mobile phone 100 in fig. 2 may be divided into: moving track segment M1 - stay point set S1 - moving track segment M2 - stay point set S2 - moving track segment M3 - stay point set S3 - moving track segment M4.
However, stay points are of different types; for example, the user may be in a loitering motion state or a static motion state within a stay point area, and the stay durations at different stay points also differ, so different application scenarios exist for different kinds of stay points. Identifying stay points only through a stay point identification algorithm therefore cannot support further analysis and research on the motion trail of the user. In addition, because the reliability of the stay points identified by a stay point identification algorithm varies from stay point to stay point, and the existing scheme has no quality standard that can indicate the quality of stay point identification, the accuracy of further analysis and research on the motion trail of the user is affected.
Disclosure of Invention
The embodiment of the application provides a motion trail analysis method, device and medium, which solve the problem that the existing stay point identification algorithm cannot determine the motion type of a user at a stay point.
In a first aspect, an embodiment of the present application provides a motion trajectory analysis method, which is applied to an electronic device, and includes: acquiring track data of a plurality of stay points in a user motion track, wherein the track data comprises the longitude, latitude and time of the stay points; calculating motion parameters of each stay point based on the track data of that stay point, wherein the motion parameters comprise parameters of a plurality of dimensions related to the motion generated when the user, performing the motion corresponding to the motion track, is at the corresponding stay point; and determining the motion type of the user at each stay point based on the motion parameters of that stay point.
It is understood that the type of motion may include, but is not limited to, absolute rest, relative rest, loitering movement, and the like. For absolute rest, the user is in a completely stationary state at the stay point and no movement occurs; for relative rest, the user moves during the stay at the stay point, but the moving distance is short and the moving speed and moving time are small; for loitering movement, the user moves back and forth at the stay point.
The motion trail analysis method provided by the embodiment of the application can determine the motion type of the user at each stay point based on the multidimensional motion parameters of each stay point. Furthermore, based on the motion type of the user at each stay point, the motion information of the user at each stay point is determined, the motion track information of the user is enriched, and the motion track of the user is better analyzed.
In a possible implementation of the first aspect described above, the motion parameter includes at least one of: speed parameters, mobility parameters, direction change parameters.
In a possible implementation of the first aspect described above, the motion type includes at least one of: absolute rest, relative rest, loitering movement.
In a possible implementation of the first aspect, the dwell point includes at least two track points in a user motion track; calculating motion parameters of each dwell point based on the trajectory data of each dwell point, including: determining a speed parameter of the stay point based on a curve distance between a first time track point and a last time track point in at least two track points corresponding to the stay point and a first time of the first time track point and a second time of the last time track point; determining a mobility parameter of the dwell point based on the curve distance and the linear distance between the first time track point and the last time track point; and determining the direction change parameters of the stay points based on the movement directions of two adjacent track points in at least two track points corresponding to the stay points.
In a possible implementation of the first aspect, determining a motion type of the user at each dwell point based on the motion parameter of each dwell point includes: determining preset weights of motion parameters of the stay points; and determining the confidence degree of the stay point based on the motion parameter of the stay point and the preset weight of the motion parameter of the stay point, wherein the higher the confidence degree of the stay point is, the higher the probability that the user is stationary at the stay point is.
In a possible implementation of the first aspect, the confidence of the stay point is calculated by the following formula:

C(i) = Σ_{j=1}^{J} ω_j^i · x_j^i

wherein J represents the number of dimensions of the motion parameters; ω_j^i represents the preset weight of the motion parameter of the j-th dimension of the i-th stay point; x_j^i represents the motion parameter of the j-th dimension of the i-th stay point; and C(i) represents the confidence of the i-th stay point.
In a possible implementation manner of the first aspect, the electronic device further records the motion states of the user and the time corresponding to each motion state; and determining the motion type of the user at each stay point based on the motion parameters of each stay point further comprises: determining a stationary time ratio of the stay point based on the motion states, the time corresponding to each motion state and the track data of the stay point, wherein the stationary time ratio represents the proportion of the time during which the user is stationary to the total stay time in the region corresponding to the stay point; and determining the confidence of the stay point based on the stationary time ratio.
In a possible implementation of the first aspect, determining the confidence of the stay point based on the stationary time ratio includes: if the stationary time ratio is larger than a first stationary threshold, determining that the confidence of the stay point is 1; if the stationary time ratio is smaller than a second stationary threshold, determining that the confidence of the stay point is 0; and if the stationary time ratio is smaller than or equal to the first stationary threshold and larger than or equal to the second stationary threshold, determining the preset weights of the motion parameters of the stay point, and determining the confidence of the stay point based on the motion parameters of the stay point and the preset weights of those motion parameters.
In a possible implementation of the first aspect, the method further includes: track data of a plurality of track points in a user motion track are obtained, and stay points in the plurality of track points are determined by adopting a stay point identification algorithm.
In a second aspect, an embodiment of the present application provides a motion trajectory analysis device, which is applied to an electronic device, including: the data acquisition module is used for acquiring track data of a plurality of stay points in a user motion track, wherein the track data comprises longitude, latitude and time of the stay points; the parameter calculation module is used for calculating the motion parameters of each stay point based on the track data of each stay point, wherein the motion parameters comprise parameters of a plurality of dimensions related to the motion generated when a user moves to the corresponding stay point in the motion corresponding to the motion track of the user; and the type determining module is used for determining the motion type of the user at each stopover point based on the motion parameters of each stopover point.
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; one or more memories; the one or more memories store one or more programs that, when executed by the one or more processors, cause the electronic device to perform the motion profile analysis method described above.
In a fourth aspect, an embodiment of the present application provides a computer readable storage medium, where instructions are stored, where the instructions, when executed on a computer, cause the computer to perform the above-described motion trajectory analysis method.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a computer program/instruction which, when executed by a processor, implements the above-described motion profile analysis method.
Drawings
FIG. 1 is a schematic diagram illustrating an application scenario for determining a user motion profile, according to some embodiments of the application;
FIG. 2 is a schematic diagram illustrating trace points of a user motion trace, according to some embodiments of the application;
FIG. 3 is a schematic diagram illustrating a method of determining a movement point and a dwell point, according to some embodiments of the application;
FIG. 4 is a schematic diagram illustrating a division of motion direction classes, according to some embodiments of the application;
FIG. 5 is a diagram illustrating another exemplary division of the direction of motion level, according to some embodiments of the application;
FIG. 6 is a flow chart illustrating a method of motion trajectory analysis, according to some embodiments of the application;
FIG. 7 is a schematic diagram illustrating a method of determining a dwell point, according to some embodiments of the application;
FIG. 8 is a flow chart illustrating another method of motion trajectory analysis, according to some embodiments of the application;
FIG. 9 is a schematic diagram illustrating a hardware architecture of an electronic device, according to some embodiments of the application;
fig. 10 is a schematic diagram schematically illustrating a program module of a motion trajectory analysis device according to some embodiments of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
As described above, the existing stay point recognition algorithm is only used to recognize whether a track point is a stay point; it cannot determine the motion type of the user at the stay point, for example, whether the user is in absolute rest, relative rest, or loitering at the stay point, and therefore cannot provide further motion information of the user at the stay point.

It can be understood that a stay point can be one track point in the motion track of the user, or can be a track point set formed by a plurality of consecutive track points within a certain motion area.
Therefore, the embodiment of the application discloses a motion trail analysis method which is applied to an electronic device. Specifically, in the motion trail analysis method disclosed in the embodiment of the present application, the electronic device may calculate motion parameters of the user in multiple dimensions for each stay point, for example, a speed parameter, a mobility parameter, a direction change parameter, etc. of the user at the stay point, according to the track data of each track point corresponding to the stay point obtained by a positioning system (for example, the GPS positioning system or the BeiDou positioning system), and then determine the motion type at each stay point based on the motion parameters of that stay point. For example, the type of motion may include, but is not limited to, absolute rest, relative rest, loitering movement, and the like. For absolute rest, the user is in a completely stationary state at the stay point and no movement occurs; for relative rest, the user moves during the stay at the stay point, but the moving distance is short and the moving speed and moving time are small; for loitering movement, the user moves back and forth at the stay point.
It can be understood that, when the dwell point is a single track point, the direction change parameter, the mobility parameter, and the like of the dwell point cannot be calculated or are meaningless, so the motion track analysis method in the application is suitable for determining the motion type of the user at the dwell point composed of the track point set.
It will be appreciated that in some embodiments, the speed parameter of the user at a stay point may characterize the speed of movement of the electronic device between the track points within the stay point. For example, if the set of track points of the movement area A2 is identified as a stay point, the electronic device may determine the speed parameter of the stay point corresponding to the movement area A2 based on the curve distance from the first time track point P3 to the last time track point P4 in the movement area A2 and the times corresponding to the track point P3 and the track point P4. The electronic device may determine the speed of movement of the user within each stay point based on the calculated speed parameter of that stay point.

The mobility parameter may characterize the degree of dispersion of the plurality of track points corresponding to the stay point. The smaller the mobility parameter is, the higher the degree of dispersion is, and the weaker the mobility of the electronic device is within the consecutive stay points or the area corresponding to the stay point; the larger the mobility parameter is, the lower the degree of dispersion is, and the stronger the mobility of the electronic device is within the consecutive stay points or the area corresponding to the stay point. For example, the set of track points of the motion area A2 in fig. 3 is a stay point, and its mobility parameter may characterize the degree of dispersion of the track points in the motion area A2. In some embodiments, the mobility parameter of the motion area A2 may be calculated by dividing the linear distance between the track point P3 and the track point P4 by the curve distance between them.

The direction change parameter can represent how the movement direction of the electronic device changes during movement within the area where the stay point is located. In some embodiments, the direction change parameter may be characterized by a directional entropy. The magnitude of the directional entropy indicates the degree of change of the movement direction of the electronic device in the area where the stay point is located: when the directional entropy is large, the direction of the electronic device changes considerably in the area where the stay point is located, and when the directional entropy is small, the movement direction of the electronic device in the area where the stay point is located is relatively uniform.
Further, it is understood that in some embodiments, the motion type may be quantified by a numerical value calculated by integrating the motion parameters, hereinafter referred to as the confidence. For example, the confidence of the stay point can be obtained by combining the motion parameters of each dimension with the weights preset for the motion parameters of each dimension. With the weights of all motion parameters set equal: if the calculated confidence of the stay point is high, for example, exceeds a preset first threshold, the motion type of the user is absolute rest; if the calculated confidence of the stay point is moderately high, for example, between the preset first threshold and a preset second threshold, the motion type of the user is relative rest; and if the calculated confidence of the stay point is low, for example, below the preset second threshold, the motion type of the user is loitering movement. In other embodiments, the motion types may also include other types, and the preset thresholds may accordingly also include a preset third threshold, a preset fourth threshold, and so on. The motion type of the user at the stay point can be determined according to the calculated confidence of the stay point and the threshold interval corresponding to each motion type.
In addition, the confidence of each stay point can be used both for judging the stay point type and for evaluating the accuracy of the stay points identified by each identification algorithm. For example, if the calculated stay point confidence is high, it may be determined that the accuracy of the identified stay point is high; if the calculated stay point confidence is moderate, it may be determined that the accuracy of the identified stay point is moderate; and if the calculated stay point confidence is low, it may be determined that the accuracy of the identified stay point is relatively low. In some embodiments, accuracy levels and the confidence interval corresponding to each level may be set, and the accuracy level of a stay point identification algorithm is then determined based on the calculated stay point confidence and the confidence interval corresponding to each accuracy level. In the embodiment of the application, by evaluating the confidence of the stay points identified by various stay point identification algorithms, the stay point identification algorithm with higher accuracy can be selected for stay point identification.
In addition, it can be appreciated that in some embodiments, the motion state of the user may also be acquired through a sensor of the electronic device, such as a gyroscope sensor, an acceleration sensor, etc., and the dwell point confidence may be calculated in combination with the motion state of each dwell point.
The motion state may characterize the state of the user relative to a stationary reference object within each stay point, and may include, but is not limited to, stationary, walking, running, riding a bike, etc. The electronic device may record the start time at which the motion state changes and the type of the new motion state, and determine the motion state of the user at each stay point according to the time corresponding to each stay point and the duration of each motion state. For example, when the electronic device detects that the motion state of the user changes at time 1 and determines that the changed motion state is stationary, it may record the stationary state and the corresponding change time 1; when it detects a second motion state change of the user at time 2 and determines that the changed motion state is walking, it records the walking state and the corresponding change time 2; when a third motion state change occurs at time 3 and the changed motion state is determined to be running, it records the running state and the corresponding change time 3; and when a fourth motion state change occurs at time 4 and the changed motion state is determined to be stationary, it records the stationary state and the corresponding change time 4. Further, the electronic device may determine the motion state of each stay point based on the above recorded data in combination with the time of each stay point.

It will be appreciated that in some embodiments, calculating the stay point confidence in combination with the motion state of each stay point specifically includes: determining the motion states of the plurality of track points of the stay point in the motion area, and calculating the stationary time ratio, i.e. the ratio of the stationary duration within the motion area to the time interval between the start and the end of the stay in the area. If the stationary time ratio is greater than a first stationary threshold (and/or the stationary duration exceeds a duration threshold), the confidence of the stay point in the area may be determined to be 1; if it is less than a second stationary threshold, the confidence of the stay point in the area is determined to be 0. It will be appreciated that a confidence of 1 indicates that the electronic device is in an absolute rest state, and a confidence of 0 indicates that the electronic device is in a continuous motion state.
It should be understood that the above examples of the calculation methods for the mobility parameter, the speed parameter, and the direction change parameter are merely examples of embodiments of the present application, and those skilled in the art may calculate the corresponding motion parameter or calculate other motion parameters to determine the motion type of the user by using other methods, which is not limited in this application.
It is understood that electronic devices include, but are not limited to, cell phones (including folding screen cell phones), tablet computers, laptop computers, desktop computers, servers, wearable devices, head mounted displays, mobile email devices, in-vehicle devices, portable game players, portable music players, reader devices, televisions with one or more processors embedded therein or coupled thereto, and the like.
In some embodiments, the electronic device may acquire track data of a plurality of track points of the motion track, then upload the track data to the server, the server may identify the stay points by using a stay point identification algorithm based on the track data, calculate motion parameters of each stay point, and then determine a motion type of the user at each stay point based on the calculated motion parameters of each stay point. The server side can return the calculated confidence coefficient of each stay point to the electronic equipment. That is, the execution body of the scheme in the embodiment of the present application may be a server or an electronic device, which is not limited in this aspect of the present application.
In some embodiments, for the moving track segments in the user's motion track, the motion parameters of each moving track segment may also be calculated, and the confidence of each moving track segment may be calculated based on those motion parameters. Furthermore, the moving track segments in the motion track of the user can be analyzed and studied based on the confidence of each moving track segment, and the calculated confidence can be used when applying the motion track to different scenarios.
The motion trail analysis method provided by the embodiment of the application can determine the motion type of the user at each stay point based on the multidimensional motion parameters of each stay point. Furthermore, based on the motion type of the user at each stay point, the motion information of the user at each stay point is determined, the motion track information of the user is enriched, and the motion track of the user is better analyzed.
The following describes in detail the motion trajectory analysis method provided in the embodiment of the present application with reference to fig. 6.
Fig. 6 is a flow chart illustrating a method of motion trajectory analysis, according to some embodiments of the application. It can be understood that the execution body in the embodiment of the present application is an electronic device.
The motion trail analysis method provided by the embodiment of the application comprises the following steps:
601: and acquiring track data of a plurality of track points in the motion track of the user.
It may be appreciated that the track data may include a longitude and a latitude of each track point and a time when the electronic device is located at the track point, where the time represents a time corresponding to the track point. The longitude and latitude of each track point in the track data may be obtained in various manners, for example, by a positioning function such as a GPS positioning system, a Wi-Fi network, a base station identification number, and the like.
In some embodiments, the trajectory data may be represented as a time-ordered sequence. For example, for a plurality of track points in a user motion track, the track data may be ordered according to the time of each track point and represented as P = {p1, p2, p3, …, pn}, and for any i-th track point, the track data pi = {LOi, LAi, Ti}, where LOi represents the longitude of the i-th track point, LAi represents the latitude of the i-th track point, and Ti represents the time of the i-th track point.
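As an illustration of this time-ordered representation, a minimal Python sketch follows; the TrackPoint structure and its field names are assumptions introduced only for the example and are not part of the application:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TrackPoint:
    longitude: float   # LOi, in degrees
    latitude: float    # LAi, in degrees
    timestamp: float   # Ti, e.g. seconds since the epoch

def sort_track(points: List[TrackPoint]) -> List[TrackPoint]:
    """Order the track points by time so that P = {p1, p2, ..., pn} is time-ordered."""
    return sorted(points, key=lambda p: p.timestamp)

# Example: three raw positioning fixes, possibly received out of order.
track = sort_track([
    TrackPoint(116.397, 39.909, 1_700_000_060.0),
    TrackPoint(116.396, 39.908, 1_700_000_000.0),
    TrackPoint(116.398, 39.910, 1_700_000_120.0),
])
print([p.timestamp for p in track])
```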
602: and determining the stay point in the plurality of track points based on the track data of the plurality of track points.
It is understood that a stay point is a track point, or a set of track points, generated while the user stays at a relatively fixed location for a period of time.

In some embodiments, stay points among the track points may be identified based on the straight-line distance between track points and the time difference between track points. Specifically, when it is determined that the distance between two track points satisfies a preset distance threshold and the time difference between the track points satisfies a preset time difference threshold, the two track points may be determined to be stay points. For example, assuming that the linear distance between the track point P1 and the track point P5 in fig. 3 is smaller than the preset distance threshold and the time difference between the track point P5 and the track point P1 is greater than the preset time difference threshold, it may be determined that the track point P1 and the track point P5 are stay points.

Further, if, along the user motion track, other track points lie between the two track points, then the two track points and the other track points between them are all stay points. For example, assuming that the linear distance between the track point P1 and the track point P2 in fig. 3 is smaller than the preset distance threshold and the time difference between the track point P2 and the track point P1 is greater than the preset time difference threshold, it may be determined that the track point P1, the track point P5 and the track point P2 are stay points.
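A minimal sketch of the distance-threshold and time-difference-threshold check described above is given below; the haversine distance approximation and the particular threshold values are assumptions chosen only for illustration:

```python
import math

def haversine_m(lon1, lat1, lon2, lat2):
    """Straight-line (great-circle) distance between two lon/lat points, in meters."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def find_stay_spans(track, dist_th_m=200.0, time_th_s=20 * 60):
    """track: list of (lon, lat, timestamp) tuples ordered by time.
    Returns index ranges (i, j) whose points all lie within dist_th_m of point i
    and whose total duration exceeds time_th_s, i.e. candidate stay points."""
    spans, i = [], 0
    while i < len(track) - 1:
        j = i + 1
        while j < len(track) and haversine_m(track[i][0], track[i][1],
                                             track[j][0], track[j][1]) < dist_th_m:
            j += 1
        if track[j - 1][2] - track[i][2] > time_th_s:
            spans.append((i, j - 1))   # points i..j-1 form one stay point
            i = j
        else:
            i += 1
    return spans
```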
In other embodiments, the probability of each track point being a stop point may be calculated by calculating the motion parameters of each track point and combining the motion parameters of each track point. Further, among the plurality of trajectory points, a trajectory point whose probability satisfies a preset probability threshold is determined as a stay point. The motion parameters can include the motion speed and motion state of the user at each track point. The motion state may include, but is not limited to, stationary, walking, running, riding, etc., and may be obtained by analysis of data collected by sensors of the electronic device, such as gyroscopic sensors, acceleration sensors, etc.
In some embodiments, a stay point may include at least one track point. That is, a stay point is a set of at least one track point. When identifying stay points, the plurality of track points may be divided into different motion areas: the range corresponding to a motion area may be determined based on the distance between the first time track point and the last time track point within the area, or the track points may be divided into motion areas containing a preset number of track points according to their sequence numbers in the sequence corresponding to the track data. Then, whether the track point set of each motion area is a stay point is determined based on the track data of the at least one track point in that area.
In some embodiments, the linear distance and the time difference between the first time track point and the last time track point in a motion area may be calculated; when the calculated linear distance satisfies the preset distance threshold and the calculated time difference satisfies the preset time difference threshold, the track point set in the motion area is determined to be a stay point. For example, in fig. 3, the motion area A2 may be divided based on a preset motion area dividing method, where the first time track point is the track point P3 and the last time track point is the track point P4. Assuming that the linear distance between the track point P3 and the track point P4 satisfies the preset distance threshold, and the time difference between the track point P3 and the track point P4 satisfies the preset time difference threshold, then in the motion area A2, the set of track points including the track point P3 and the track point P4 is a stay point.
Further, in some embodiments, the track point sets within a plurality of motion areas may be determined to be one stay point. For example, the user motion trajectory shown in fig. 7 includes a plurality of track points. Taking 4 track points as one motion area, the area A7 in fig. 7, which includes 12 track points P701, P702, …, P712, can be divided into 3 motion areas. When it is determined that the track point sets in the 3 motion areas each satisfy the preset stay point judgment condition (for example, as in the above example, the straight-line distance between the first time track point and the last time track point in each motion area satisfies the preset distance threshold and the time difference between them satisfies the preset time difference threshold), it may be further judged whether the track points across a plurality of (at least two) motion areas satisfy the preset stay point judgment condition. Specifically, for example, if it is determined that the linear distance between the first time track point P701 of the first motion area and the last time track point P712 of the third motion area in the area A7 satisfies the preset distance threshold, and the time difference between the track point P701 and the track point P712 satisfies the preset time difference threshold, then it may be determined that the track point set of the three motion areas in the area A7 is one stay point.
It can be understood that the algorithm for identifying the stop point and the condition for determining the stop point in the embodiment of the present application are not limited to the above examples, and other methods may be adopted, which are not limited in this aspect of the present application.
In some embodiments, the electronic device may directly acquire track data in which the stay points of the user motion track have already been distinguished, and perform the calculations of step 603 and step 604 described below without performing the processes of step 601 and step 602 described above.
603: and calculating the motion parameters of each dwell point based on the track data of each dwell point.
It is understood that the motion parameters of each stay point may include, but are not limited to, a speed parameter, a mobility parameter, and a direction change parameter.
It will be appreciated that in some embodiments, the speed parameter of the user at a stay point may characterize the speed of movement of the electronic device between the track points within the stay point. For example, if the set of track points of the movement area A2 is identified as a stay point, the electronic device may determine the speed parameter of the stay point corresponding to the movement area A2 based on the curve distance from the first time track point P3 to the last time track point P4 in the movement area A2 and the times corresponding to the track point P3 and the track point P4. The electronic device may determine the speed of movement of the user within each stay point based on the calculated speed parameter of that stay point.
The mobility parameter may characterize a degree of dispersion of a plurality of trajectory points corresponding to the dwell point. The smaller the mobility parameter is, the higher the discrete degree is, and the weaker the mobility of the electronic equipment in a plurality of continuous stay points or areas corresponding to the stay points is; the larger the mobility parameter, the lower the degree of dispersion, and the stronger the mobility of the electronic device in a plurality of continuous stop points or areas corresponding to the stop points.
For example, the set of track points of the motion area A2 in fig. 3 is a stay point, and its mobility parameter may characterize the degree of dispersion of the track points in the motion area A2. In some embodiments, the mobility parameter of the motion area A2 may be calculated by dividing the linear distance between the track point P3 and the track point P4 by the curve distance between them.
The direction change parameter can represent how the movement direction of the electronic device changes during movement within the motion area where the track points are located. In some embodiments, the direction change parameter may be characterized by a directional entropy. The magnitude of the directional entropy indicates the degree of change of the movement direction of the electronic device in the area where the stay point is located: when the directional entropy is large, the direction of the electronic device changes considerably in the area where the stay point is located, and when the directional entropy is small, the movement direction of the electronic device in the area where the stay point is located is relatively consistent.

When calculating the directional entropy, the motion directions in space or in the plane can be divided into N direction levels, where different direction levels correspond to different ranges of motion direction; the direction levels of the motion directions of the track points in the area are determined, and the directional entropy of the motion area is then determined according to the number of motion directions contained in each direction level. For example, in fig. 4, the 360-degree range of motion directions in space is divided into eight direction ranges, i.e. eight direction levels I, II, III, …, VIII. For the 13 track points in the motion area A2 in fig. 3, the direction level corresponding to the motion direction of each pair of adjacent track points can be determined, and the directional entropy in the motion area A2 can then be calculated by combining the direction levels of the 12 motion directions of the 13 track points. In fig. 5, the four quadrants of a plane are divided into 8 sub-quadrants, each sub-quadrant corresponding to one direction level I, II, III, …, VIII; for the 13 track points in the motion area A2 in fig. 3, the direction level corresponding to the motion direction of each pair of adjacent track points can be determined, and the directional entropy in the motion area A2 can then be calculated by combining the direction levels of the 12 motion directions. The electronic device may determine, based on the calculated direction change parameter of each stay point, how the movement direction of the user changes within that stay point, for example, whether the movement direction changes continuously, remains unchanged, or changes only slightly.
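As an illustration of quantizing a motion direction into one of eight direction levels, a possible sketch follows; the sector boundaries and the function name are assumptions made only for the example:

```python
import math

def direction_level(dx, dy, num_levels=8):
    """Quantize the motion direction between two adjacent track points into one of
    num_levels equal angular sectors (level 0 covers [0 deg, 45 deg) when num_levels == 8).
    dx, dy: displacement in longitude and latitude between the two points."""
    angle = math.degrees(math.atan2(dy, dx)) % 360.0   # heading in [0, 360)
    return int(angle // (360.0 / num_levels))          # 0 .. num_levels - 1

# Example: the direction from P3 to the next track point, a small eastward move.
print(direction_level(dx=0.0004, dy=0.0001))   # level 0
```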
604: and determining the motion type of the user at each dwell point based on the motion parameters of each dwell point.
It is understood that the type of motion may include, but is not limited to, absolute rest, relative rest, loitering movement, and the like. For absolute rest, the user is in a completely stationary state at the stay point and no movement occurs; for relative rest, the user moves during the stay at the stay point, but the moving distance is short and the moving speed and moving time are small; for loitering movement, the user moves back and forth at the stay point.
In some embodiments, the motion type may be quantified by a numerical value calculated by integrating the motion parameters, i.e. the confidence.

It can be appreciated that in some embodiments, the weights of the motion parameters may be preset in the electronic device, and the confidence of each stay point may then be calculated based on the motion parameters of that stay point and the corresponding preset weights. For example, the motion parameters of each dimension may be weighted by the preset weight of the motion parameter of that dimension to obtain the confidence of the stay point, which will be described below with reference to the corresponding formula and is not repeated here.

With the weights of all motion parameters set equal: if the calculated confidence of the stay point is high, for example, exceeds a preset first threshold, the motion type of the user is absolute rest; if the calculated confidence of the stay point is moderately high, for example, between the preset first threshold and a preset second threshold, the motion type of the user is relative rest; and if the calculated confidence of the stay point is low, for example, below the preset second threshold, the motion type of the user is loitering movement. In other embodiments, the motion types may further include other types, and the preset thresholds may accordingly include a preset third threshold, a preset fourth threshold, and so on. The motion type of the user at the stay point is determined according to the calculated confidence of the stay point and the threshold interval corresponding to each motion type.
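Under the equal-weight setting described above, a minimal sketch of mapping a calculated confidence to a motion type might look as follows; the threshold values 0.8 and 0.4 are assumptions chosen only for illustration:

```python
def motion_type_from_confidence(confidence, first_th=0.8, second_th=0.4):
    """Map a stay-point confidence in [0, 1] to a motion type label."""
    if confidence > first_th:
        return "absolute rest"      # high confidence: user is completely static
    if confidence >= second_th:
        return "relative rest"      # confidence between the two thresholds
    return "loitering movement"     # low confidence: user moves back and forth

print(motion_type_from_confidence(0.9))   # absolute rest
print(motion_type_from_confidence(0.6))   # relative rest
print(motion_type_from_confidence(0.2))   # loitering movement
```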
In some embodiments, the weights of the different motion parameters may be set according to different application requirements. For example, if it is desired to screen out, among the stay points, the stay points in a relatively stationary motion state, the weights of the motion parameters may be set to be the same. If it is desired to screen out the stay points in an absolutely stationary motion state, the weight of the speed parameter may be increased and the weights of the other types of motion parameters may be decreased. If it is desired to screen out the stay points in a loitering motion state, the weight of the speed parameter may be decreased and the weights of the other types of motion parameters may be increased.
In some embodiments, the motion parameters may also include a motion state. The motion state may be acquired through a sensor of the electronic device, for example, a gyroscope sensor, an acceleration sensor, and the like, and the motion state may include, but is not limited to, stationary, walking, running, riding, and the like. In particular, the electronic device may record a start time at which the movement state changes.
For example, a change in the motion state occurs when the electronic device is at time 1, and the changed motion state is determined to be stationary; a second change in the motion state occurs at time 2, and the changed motion state is determined to be walking; a third change in the motion state occurs at time 3, and the changed motion state is determined to be running; a fourth change in the motion state occurs at time 4, and the changed motion state is determined to be stationary. It may then be determined that the electronic device is in a stationary state from time 1 to time 2, in a walking state from time 2 to time 3, in a running state from time 3 to time 4, and again in a stationary state starting at time 4. The electronic device may determine the motion state of each track point according to the time corresponding to each track point and the duration of each motion state. It will be appreciated that when the user moves together with the electronic device, the motion state of the electronic device is the motion state of the user.

Further, when determining the confidence of a stay point, the determination may also be performed in combination with the above motion states. Specifically, the confidence of the stay point can be determined based on the stationary time ratio, i.e. the ratio of the duration during which the motion state is stationary to the total duration of the stay point within the motion track section where the stay point is located. For example, if the stationary time ratio is greater than the first stationary threshold (and/or the stationary duration exceeds a duration threshold), the confidence of the stay point in the area may be determined to be 1; if it is less than the second stationary threshold, the confidence of the stay point in the area may be determined to be 0. It will be appreciated that a confidence of 1 indicates that the electronic device is in an absolute rest state, and a confidence of 0 indicates that the electronic device is in a continuous motion state. If the stationary time ratio is between the first stationary threshold and the second stationary threshold, the confidence of the stay point can be calculated in combination with the motion parameters other than the motion state, as described in detail below and not repeated here.
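A sketch combining the recorded state-change log with the stay interval is given below; the state labels, the threshold values 0.9 and 0.1, and the helper names are assumptions made only for the example:

```python
def stationary_time_ratio(state_log, start_t, end_t):
    """state_log: list of (timestamp, state) entries ordered by time, e.g.
    [(t1, "stationary"), (t2, "walking"), ...]; each state lasts until the next entry.
    Returns the fraction of [start_t, end_t] during which the state is 'stationary'."""
    total = end_t - start_t
    if total <= 0:
        return 0.0
    stationary = 0.0
    for k, (t, state) in enumerate(state_log):
        seg_start = max(t, start_t)
        seg_end = min(state_log[k + 1][0] if k + 1 < len(state_log) else end_t, end_t)
        if state == "stationary" and seg_end > seg_start:
            stationary += seg_end - seg_start
    return stationary / total

def confidence_from_ratio(ratio, weighted_confidence, first_th=0.9, second_th=0.1):
    """Apply the first/second stationary thresholds; when the ratio falls in between,
    fall back to the weighted motion-parameter confidence of formula (8)."""
    if ratio > first_th:
        return 1.0                  # treated as absolute rest
    if ratio < second_th:
        return 0.0                  # treated as continuous motion
    return weighted_confidence      # pre-computed C(i) from the motion parameters

log = [(0, "stationary"), (600, "walking"), (900, "stationary")]
ratio = stationary_time_ratio(log, start_t=0, end_t=1200)     # 900 s stationary out of 1200 s
print(ratio)                                                   # 0.75
print(confidence_from_ratio(ratio, weighted_confidence=0.6))   # between thresholds -> 0.6
```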
The motion trail analysis method provided by the embodiment of the application can calculate the confidence coefficient of each stay point based on the motion parameters of each stay point. Furthermore, the accuracy of each stay point identified by the identification algorithm can be evaluated based on the confidence coefficient obtained by calculation, the motion information of the user at each stay point can be determined based on the confidence coefficient, the motion track information of the user is enriched, and the motion track of the user is better analyzed.
In some embodiments, the above-mentioned method for analyzing a motion trajectory of a user may also be used for analyzing a moving track segment in a motion trajectory of a user. That is, the motion parameters of multiple dimensions of each moving track section are determined, and then the motion type of the user in each moving track section is determined based on the motion parameters of multiple dimensions, wherein the motion type can comprise uniform motion, uniform acceleration motion, variable acceleration motion, deceleration motion, static motion and the like. Further, the confidence of each moving track section can be calculated based on the motion parameters of multiple dimensions and combined with the preset weights of the motion parameters of the dimensions.
The calculation process of the motion parameters and the confidence of the stay points is further described below.
In some embodiments, the speed parameter of a stay point may be calculated by the following formula:

S(i) = TotalDist(P_u, P_l) / TimeGap(P_u, P_l)   formula (1);

wherein P_u represents the first time track point in the i-th stay point, P_l represents the last time track point in the i-th stay point, TotalDist(P_u, P_l) represents the curve distance between the track point P_u and the track point P_l in the i-th stay point, TimeGap(P_u, P_l) represents the time interval (time difference) between the user passing the track point P_u and the track point P_l in the i-th stay point, and S(i) represents the speed parameter of the i-th stay point.
It will be appreciated that the curve distance can be obtained by accumulating the distance between every two adjacent track points in the stay point. In other embodiments, the electronic device may collect an accumulated step count for each track point, and the curve distance may be derived from the difference between the accumulated step counts collected at the first time track point and the last time track point of the stay point.
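A sketch of formula (1) for a single stay point follows; the haversine helper used to approximate the adjacent-point distances is an assumption introduced for the example:

```python
import math

def haversine_m(lon1, lat1, lon2, lat2):
    """Great-circle distance between two lon/lat points, in meters."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def speed_parameter(stay):
    """stay: list of (lon, lat, timestamp) tuples of one stay point, ordered by time.
    TotalDist is the curve distance accumulated over adjacent points; TimeGap is the
    time difference between the first time and last time track points."""
    total_dist = sum(haversine_m(a[0], a[1], b[0], b[1]) for a, b in zip(stay, stay[1:]))
    time_gap = stay[-1][2] - stay[0][2]
    return total_dist / time_gap if time_gap > 0 else 0.0   # S(i), formula (1)

stay = [(116.3960, 39.9080, 0.0), (116.3962, 39.9081, 60.0), (116.3961, 39.9079, 120.0)]
print(round(speed_parameter(stay), 3))   # meters per second
```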
In other embodiments, in order to obtain a more accurate calculation result of the speed parameter, the electronic device may further be provided with a preset speed threshold, and, based on the speed parameter S(i) calculated by the above formula (1), the speed parameter of the i-th stay point may be determined by the following formula:

S(i)' = min(S(i) / Sth, 1)   formula (2);

wherein Sth represents the preset speed threshold, and S(i)' represents the speed parameter calculated in combination with the preset speed threshold.
Further, to facilitate calculation of the confidence level of the dwell point, the velocity parameter calculated by the above formula (2) may be further processed by the following formula:
S(i)'' = 1 - S(i)'   formula (3);

Through this formula, the calculated speed parameter of the i-th stay point is brought onto the same scale as the mobility parameter and the direction change parameter in the following formulas, which facilitates the calculation of the stay point confidence.
It will be appreciated that the velocity parameter for calculating the confidence level of the stay point may be calculated using any one of the above formulas (1) to (3), which is not limited in the present application.
In other embodiments, the speed parameter S(i) calculated by the above formula (1) may be directly substituted into formula (3), that is, S(i)'' = 1 - S(i), without calculating the speed parameter S(i)' by formula (2) in combination with the preset speed threshold.
In some embodiments, the mobility parameter of a stay point may be calculated by the following formula:

M(i) = DirectDist(P_u, P_l) / TotalDist(P_u, P_l)   formula (4);

wherein P_u represents the first time track point in the i-th stay point, P_l represents the last time track point in the i-th stay point, DirectDist(P_u, P_l) represents the linear distance between the track point P_u and the track point P_l in the i-th stay point, TotalDist(P_u, P_l) represents the curve distance between the track point P_u and the track point P_l in the i-th stay point, and M(i) represents the mobility parameter of the i-th stay point.
In other embodiments, in order to obtain a more accurate calculation result of the mobility parameter, the electronic device may further be provided with a preset mobility threshold, and, based on the mobility parameter M(i) calculated by the above formula (4), the mobility parameter of the i-th stay point may be determined by the following formula:

M(i)' = min(M(i) / Mth, 1)   formula (5);

wherein Mth represents the preset mobility threshold, and M(i)' represents the mobility parameter calculated in combination with the preset mobility threshold.
Further, in order to facilitate calculation of the confidence level of the dwell point, the mobility parameter calculated by the above formula (5) may be further processed by the following formula:
M(i)'' = 1 - M(i)'   formula (6);

Through this formula, the calculated mobility parameter of the i-th stay point is brought onto the same scale as the speed parameter in the above formulas and the direction change parameter in the following formula, which facilitates the calculation of the stay point confidence.
It will be appreciated that the mobility parameter for calculating the confidence level of the stay point may be calculated using any one of the above formulas (4) to (6), which is not limited in the present application.
In other embodiments, the mobility parameter M(i) calculated by the above formula (4) may be directly substituted into formula (6), that is, M(i)'' = 1 - M(i), without calculating the mobility parameter M(i)' by formula (5) in combination with the preset mobility threshold.
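A minimal sketch of formulas (4) to (6) follows; DirectDist and TotalDist are taken as already computed distances (for example via the haversine helper sketched above), and the clamped form of M(i)' is an assumption:

```python
def mobility_parameter(direct_dist_m, total_dist_m):
    """M(i) = DirectDist / TotalDist, formula (4): close to 1 when the user moves straight
    through the area, close to 0 when the path doubles back on itself."""
    return direct_dist_m / total_dist_m if total_dist_m > 0 else 0.0

def mobility_score(m, m_th=0.5):
    """Formulas (5) and (6) with an assumed clamped normalization by the preset
    mobility threshold Mth, then inverted so that a higher score suggests staying put."""
    m_clamped = min(m / m_th, 1.0)   # M(i)'
    return 1.0 - m_clamped           # M(i)''

# Example: a 40 m straight-line displacement over a 160 m walked path.
print(round(mobility_score(mobility_parameter(40.0, 160.0)), 2))   # 0.5
```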
In some embodiments, the directional entropy of a stay point may be calculated by the following formula:

E(i) = -Σ_{d=1}^{D} (n_d / N) · log(n_d / N)   formula (7);

wherein d represents the d-th direction level, D represents the number of direction levels, n_d represents the number of motion directions in the d-th direction level of the i-th stay point, N represents the total number of motion directions within the stay point, and E(i) represents the directional entropy of the i-th stay point.
It is understood that a movement direction may be the direction of movement between every two adjacent track points in the i-th stay point, so the number of movement directions is related to the number of track points in the stay point: a stay point containing m track points may contain m - 1 movement directions. For example, in fig. 7, the 12 track points in area A7 constitute one stay point, and 11 movement directions may be included in that stay point.
It can be understood that when the movement directions of the track points within the stay point change little and are relatively concentrated, the value of the directional entropy tends to be low, and when the movement directions change greatly, the value of the directional entropy tends to be high.
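As an illustration only, the following sketch computes a directional entropy in the spirit of formula (7), assuming the formula is the Shannon entropy of the distribution of movement directions over D direction levels; the default of eight direction levels and the binning scheme are assumptions.

```python
import math

# Hedged sketch of formula (7): E(i) = -sum over d of (n_d / N) * log(n_d / N).
def directional_entropy(bearings_deg: list[float], num_levels: int = 8) -> float:
    """bearings_deg: movement direction between each pair of adjacent track points, in degrees."""
    if not bearings_deg:
        return 0.0
    counts = [0] * num_levels
    for bearing in bearings_deg:
        counts[int((bearing % 360.0) // (360.0 / num_levels))] += 1   # assign direction to a level
    n_total = len(bearings_deg)                                       # N: total movement directions
    return -sum((n_d / n_total) * math.log(n_d / n_total)
                for n_d in counts if n_d > 0)
```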
In some embodiments, the confidence of each dwell point may be determined by the following equation:
wherein J represents the number of dimensions of the motion parameters, the weight term represents the preset weight of the motion parameter (i.e., calculation factor) of the j-th dimension of the i-th stay point, the parameter term represents the motion parameter of the j-th dimension of the i-th stay point, and C(i) represents the confidence of the i-th stay point.
It can be understood that the j-th motion parameter of the i-th stay point may include, but is not limited to, the speed parameter calculated by formula (1), (2) or (3), the mobility parameter calculated by formula (4), (5) or (6), and the directional entropy calculated by formula (7).
It will be appreciated that the confidence of a stay point depends most strongly on the motion parameter with the greatest weight: when the value of that motion parameter is not consistent with an expected stay point, the confidence of the stay point may be relatively low, and when it is consistent with an expected stay point, the confidence may be relatively high.
In some embodiments, the preset weight in the above formula (8) may be adjusted according to actual application requirements.
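As an illustration only, the following sketch evaluates the confidence of formula (8) under the assumption that it is a weighted sum of the J motion parameters of the i-th stay point; the dimension names and numeric values below are illustrative rather than values taken from the patent.

```python
# Hedged sketch of formula (8): C(i) = sum over j of preset_weight_j * motion_parameter_j.
def stay_point_confidence(parameters: dict[str, float], weights: dict[str, float]) -> float:
    return sum(weights[name] * value for name, value in parameters.items())

# Example: three calculation factors with preset weights chosen to sum to 1 (illustrative).
confidence = stay_point_confidence(
    {"speed": 0.8, "mobility": 0.7, "direction_entropy": 0.2},
    {"speed": 0.4, "mobility": 0.3, "direction_entropy": 0.3},
)
```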
The following describes another motion trajectory analysis method according to the embodiment of the present application in detail with reference to fig. 8.
Fig. 8 is a flow chart illustrating another method of motion trajectory analysis, according to some embodiments of the application. It can be understood that the execution body in the embodiment of the present application is an electronic device.
The motion trail analysis method provided by the embodiment of the application comprises the following steps:
801: track data of a plurality of track points in a user motion track are acquired. Step 801 is the same as step 601 in fig. 6, and will not be described herein.
802: the dwell point identification distinguishes between movement trajectories and dwell points.
It is understood that the process of distinguishing the movement track from the stop point in step 802 is the same as that in step 602 in fig. 6, and will not be described herein.
803: and calculating the stationary time duty ratio of the motion state in the dwell point to be stationary.
It can be appreciated that the motion state within the stay point can be determined using data collected by the sensors of the electronic device, as specifically described above and not repeated here.
In some embodiments, the electronic device may generate, based on the data collected by each sensor, a motion state sequence that characterizes the motion state of the user. The motion state sequence may be expressed as K = {K1, K2, K3, …, Km}. The i-th motion state Ki may include the motion state type and a corresponding timestamp, i.e., Ki = {timestamp, motion state type}, where the timestamp is the time at which the corresponding motion state type starts.
In some embodiments, the resting time ratio of each stay point may be obtained by dividing the time spent in the resting state within the stay point by the total duration of the stay point, i.e., resting time ratio = resting-state duration / total stay-point duration.
In some embodiments, the resting time ratio may also be obtained in other ways; the present application is not limited in this regard.
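As an illustration only, the following sketch derives the resting time ratio of step 803 from a motion state sequence of the form K = {K1, …, Km}, assuming each entry carries the start timestamp and the motion state type as described above; the state label "stationary" and the function signature are illustrative.

```python
# Hedged sketch: resting time ratio = resting-state duration / total stay-point duration.
def resting_time_ratio(states: list[tuple[float, str]], dwell_end_ts: float) -> float:
    """states: (start_timestamp_seconds, state_type) entries covering the stay point."""
    if not states:
        return 0.0
    states = sorted(states)
    total = dwell_end_ts - states[0][0]
    if total <= 0:
        return 0.0
    stationary = 0.0
    for (start, state), (next_start, _) in zip(states, states[1:] + [(dwell_end_ts, "")]):
        if state == "stationary":
            stationary += next_start - start   # a state lasts until the next state begins
    return stationary / total
```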
804: it is determined whether the rest time duty cycle is greater than 0.9 and the duration is greater than 1 hour. If it is determined that the resting time ratio is greater than 0.9 and the duration is greater than 1 hour, indicating that the user stays for a long period of time within the stay point, in an absolute resting state, i.e. the user's motion type is absolute resting, step 806 is performed, otherwise step 805 is performed.
It will be appreciated that 0.9 is the first rest threshold in the foregoing, and 1 hour is the rest duration threshold.
In other embodiments, the first rest threshold and the rest duration threshold may take other values, which the present application is not limited to.
In some embodiments, step 804 may also determine only whether the resting time ratio is greater than 0.9, or only whether the resting duration is greater than 1 hour; the present application is not limited in this regard.
805: and judging whether the static time duty ratio is smaller than 0.1. If the rest time ratio is less than 0.1, it indicates that the user is in a continuous motion state in the rest point, that is, the motion type of the user is continuous motion, step 808 is performed, otherwise step 807 is performed.
It is understood that 0.1 is the second rest threshold in the foregoing.
806: in the absolute rest state, the dwell point confidence is assigned a value of 1.
It will be appreciated that when the resting time ratio of the stay point satisfies the condition in step 804, the confidence of the stay point is high; the stay point can be treated as a stay point in the absolutely resting state, its confidence is assigned a value of 1, and no further confidence calculation is required.
807: and calculating each factor of the stay points in the track, and calculating the confidence coefficient of each stay point. The calculation process of each factor of the dwell point in step 807, that is, each motion parameter of the dwell point, and the process of calculating the confidence of the dwell point based on each factor are the same as steps 603 and 604 in fig. 6, and will not be described in detail herein.
808: in the continuous motion state, the confidence value of the stay point is assigned to 0.
It will be appreciated that when the rest time of the resting point satisfies the above step 805, it indicates that the confidence of the resting point is low, and the resting point can be used as a resting point in a continuous motion state, where the confidence is assigned to 0, and further calculation of the confidence of the resting point is not required.
In the embodiment of the present application, before the confidence of the stay points is calculated from the motion parameters, the confidence of some stay points is determined directly from the duration or the resting time ratio of the resting state of each stay point. This reduces the amount of confidence calculation and allows the confidence of each stay point to be obtained more quickly.
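As an illustration only, the following sketch mirrors the decision flow of steps 804 to 808 using the thresholds 0.9, 0.1 and 1 hour discussed above; confidence_from_factors is a placeholder standing in for the factor-based calculation of step 807, and its weighted-sum form is an assumption.

```python
FIRST_REST_THRESHOLD = 0.9           # step 804: first resting threshold
SECOND_REST_THRESHOLD = 0.1          # step 805: second resting threshold
REST_DURATION_THRESHOLD_S = 3600.0   # step 804: resting duration threshold (1 hour)

def confidence_from_factors(parameters: dict[str, float], weights: dict[str, float]) -> float:
    """Placeholder for step 807 (the factor-based calculation of steps 603-604)."""
    return sum(weights[name] * value for name, value in parameters.items())

def stay_point_confidence_fig8(rest_ratio: float, rest_duration_s: float,
                               parameters: dict[str, float], weights: dict[str, float]) -> float:
    if rest_ratio > FIRST_REST_THRESHOLD and rest_duration_s > REST_DURATION_THRESHOLD_S:
        return 1.0                    # step 806: absolute rest, confidence assigned 1
    if rest_ratio < SECOND_REST_THRESHOLD:
        return 0.0                    # step 808: continuous motion, confidence assigned 0
    return confidence_from_factors(parameters, weights)   # step 807
```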
Fig. 9 is a schematic diagram of a hardware structure of an electronic device 100, for executing related instructions in a motion trajectory analysis method according to an embodiment of the present application.
As shown in fig. 9, the electronic device 100 may include a processor 110, a memory 180, a sensor module 190, a display module 120, a mobile communication module 150, a wireless communication module 160, an audio module 170, an interface module 130, a power module 140, and the like. Wherein the sensor module 190 may include a pressure sensor, an acceleration sensor, a touch sensor, a gyro sensor, etc. Wherein the audio module 170 may include a speaker 170A, a receiver 170B, a microphone 170C, and an earphone interface 170D, etc.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, certain components may be separated, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors. The controller can generate an operation control signal according to the instruction operation code and the time sequence signal to finish the control of instruction fetching and instruction execution. A memory may also be provided in the processor 110 for storing instructions and data. In the embodiment of the present application, relevant instructions and data for executing the motion trajectory analysis method of the present application may be stored in the memory for the processor 110 to call, and the processor 110 may control execution of each step of executing the motion trajectory analysis method through the controller, and the specific implementation process is described in detail above and will not be repeated here.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor and other components through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor through an I2C interface, such that the processor 110 communicates with the touch sensor through an I2C bus interface to implement the touch function of the electronic device 100.
The MIPI interface may be used to connect the processor 110 with peripheral devices such as the display module 120. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. The processor 110 and the display module 120 communicate through a DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect processor 110 to display module 120, sensor module 190, and the like. The GPIO interface may also be configured as an I2C interface, MIPI interface, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The electronic device 100 implements display functions through a GPU, a display module 120, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display module 120 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display module 120 is used for displaying images, videos, and the like. The display module 120 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display modules 120, N being a positive integer greater than 1.
The speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals. The electronic device 100 may play music or make hands-free calls through the speaker 170A.
The receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals. When the electronic device 100 answers a call or a voice message, the voice can be heard by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "mic" or "mike", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak close to the microphone 170C, inputting a sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, sound source identification, directional recording, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor is used for sensing a pressure signal and can convert the pressure signal into an electric signal. In some embodiments, the pressure sensor may be disposed at the display module 120. Pressure sensors are of many kinds, such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, etc. The capacitive pressure sensor may be a plate comprising at least two parallel plates with conductive material. When a force is applied to the pressure sensor, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display module 120, the electronic device 100 detects the intensity of the touch operation according to the pressure sensor. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor. In some embodiments, touch operations that act on the same touch location, but at different touch operation strengths, may correspond to different operation instructions. For example: when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for checking the short message. And executing the instruction of newly creating the short message when the touch operation with the touch operation intensity being larger than or equal to the first pressure threshold value is used for the short message application icon.
The acceleration sensor may detect the magnitude of acceleration of the electronic device 100 in various directions (typically along three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The acceleration sensor can also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications. In the embodiment of the present application, the motion state of the user can be determined based on the data collected by the acceleration sensor.
The gyroscope sensor may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocities of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyroscope sensor. The gyroscope sensor may also be used for navigation and somatosensory game scenarios. In the embodiment of the present application, the motion state of the user can be determined based on the data collected by the gyroscope sensor together with the data collected by the acceleration sensor.
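As an illustration only, the following sketch shows one way a "stationary" motion state could be inferred from acceleration sensor data, by thresholding the variance of the acceleration magnitude; the threshold value and the function name are assumptions and not taken from the patent.

```python
import statistics

# Hedged sketch: low variance of the acceleration magnitude suggests a stationary user.
def is_stationary(accel_xyz: list[tuple[float, float, float]], var_threshold: float = 0.05) -> bool:
    if len(accel_xyz) < 2:
        return False
    magnitudes = [(x * x + y * y + z * z) ** 0.5 for x, y, z in accel_xyz]
    return statistics.pvariance(magnitudes) < var_threshold
```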
The touch sensor is also known as a "touch panel". The touch sensor may be disposed on the display module 120, and the touch sensor and the display module 120 form a touchscreen, also referred to as a "touch-controlled screen". The touch sensor is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display module 120.
It is to be understood that the system configuration shown in fig. 9 above does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown in FIG. 9, or may combine some components, or split some components, or a different arrangement of components.
Fig. 10 is a schematic diagram of a program module of a motion trajectory analysis device 1000 according to an embodiment of the present application, which may be used to execute instructions corresponding to the motion trajectory analysis method according to the embodiment of the present application, for example, execute instructions corresponding to steps 601 to 604.
As shown in fig. 10, the motion trajectory analysis device 1000 includes:
the data acquisition module 1001 is configured to acquire track data of a plurality of stay points in a motion track of a user, where the track data includes longitude, latitude, and time of the stay points.
The parameter calculation module 1002 is configured to calculate, based on the trajectory data of each dwell point, a motion parameter of each dwell point, where the motion parameter includes parameters of multiple dimensions related to a motion generated when a user moves to a corresponding dwell point in a motion corresponding to a motion trajectory of the user.
The type determining module 1003 is configured to determine a motion type of the user at each dwell point based on the motion parameters of each dwell point.
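As an illustration only, the following sketch wires the three modules of fig. 10 together; the class name, method names, and placeholder return values are assumptions and not the patent's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class StayPoint:
    longitude: float
    latitude: float
    timestamp: float

class MotionTrajectoryAnalysisDevice:
    def acquire_data(self, raw_records) -> list[StayPoint]:
        """Data acquisition module 1001: build stay points from (longitude, latitude, time) records."""
        return [StayPoint(*record) for record in raw_records]

    def compute_parameters(self, stay_point: StayPoint) -> dict[str, float]:
        """Parameter calculation module 1002: placeholder motion parameters for one stay point."""
        return {"speed": 0.0, "mobility": 0.0, "direction_entropy": 0.0}

    def determine_type(self, parameters: dict[str, float]) -> str:
        """Type determination module 1003: map motion parameters to a motion type (claim 3 labels)."""
        score = sum(parameters.values()) / max(len(parameters), 1)
        return "absolute rest" if score > 0.5 else "loitering movement"
```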
Reference in the specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one example implementation or technique according to the disclosure. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment.
The present disclosure also relates to an apparatus for performing the operations herein. The apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer-readable medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, application-specific integrated circuits (ASICs), or any type of media suitable for storing electronic instructions, each of which may be coupled to a computer system bus. Furthermore, the computers referred to in the specification may include a single processor or may be architectures employing multiple processors for increased computing power.
Additionally, the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the disclosed subject matter. Accordingly, the present disclosure is intended to be illustrative, but not limiting, of the scope of the concepts discussed herein.
Claims (12)
1. A motion trail analysis method, applied to an electronic device, characterized by comprising the following steps:
acquiring track data of a plurality of stay points in a user motion track, wherein the track data comprises longitude, latitude and time of the stay points;
calculating motion parameters of each stay point based on the track data of each stay point, wherein the motion parameters comprise parameters of a plurality of dimensions related to the motion generated when the user moves to the corresponding stay point during the motion corresponding to the motion track of the user;
and determining the motion type of the user at each stay point based on the motion parameters of each stay point.
2. The motion profile analysis method of claim 1, wherein the motion parameters include at least one of:
speed parameters, mobility parameters, direction change parameters.
3. The motion profile analysis method of claim 1, wherein the motion type comprises at least one of:
Absolute rest, relative rest, loitering movement.
4. The motion trajectory analysis method according to claim 1, wherein the dwell point includes at least two trajectory points in the user motion trajectory;
the calculating the motion parameter of each dwell point based on the trajectory data of each dwell point includes:
determining a speed parameter of the stay point based on a curve distance between a first time track point and a last time track point in at least two track points corresponding to the stay point, and a first time of the first time track point and a second time of the last time track point;
determining a mobility parameter of the dwell point based on the curve distance and a straight line distance between the first time track point and the last time track point;
and determining the direction change parameters of the stay points based on the movement directions of two adjacent track points in at least two track points corresponding to the stay points.
5. The method of claim 1, wherein determining the motion type of the user at each dwell point based on the motion parameters of each dwell point comprises:
Determining preset weights of the motion parameters of the stay points;
and determining the confidence coefficient of the stay point based on the motion parameter of the stay point and the preset weight of the motion parameter of the stay point, wherein the higher the confidence coefficient of the stay point is, the higher the probability that the user is stationary at the stay point is.
6. The motion profile analysis method according to claim 5, wherein the confidence level of the stay point is calculated by the following formula:
wherein:
J represents the number of dimensions of the motion parameters;
the weight term represents the preset weight of the motion parameter of the j-th dimension of the i-th stay point;
the parameter term represents the motion parameter of the j-th dimension of the i-th stay point;
and C(i) represents the confidence of the i-th stay point.
7. The movement trace analysis method according to claim 1, wherein the electronic device further records a movement state of the user and a time corresponding to each movement state;
the determining the motion type of the user at each dwell point based on the motion parameters of each dwell point further comprises:
determining a resting time duty ratio of the stay point based on the motion states, the time corresponding to each motion state, and the track data of the stay point, wherein the resting time duty ratio represents the proportion of the time the user is in the resting state within the region corresponding to the stay point to the total time the user is within that region;
And determining the confidence of the dwell point based on the rest time duty ratio.
8. The motion trail analysis method of claim 7, wherein the determining the confidence of the stay point based on the resting time duty ratio comprises:
if the resting time duty ratio is greater than a first resting threshold, determining that the confidence of the stay point is 1;
if the resting time duty ratio is less than a second resting threshold, determining that the confidence of the stay point is 0;
if the resting time duty ratio is less than or equal to the first resting threshold and greater than or equal to the second resting threshold, determining a preset weight of the motion parameter of the stay point, and determining the confidence of the stay point based on the motion parameter of the stay point and the preset weight of the motion parameter of the stay point.
9. The motion trajectory analysis method according to claim 1, further comprising:
and acquiring track data of a plurality of track points in the user motion track, and determining stay points in the plurality of track points by adopting a stay point identification algorithm.
10. An electronic device, comprising:
a memory for storing instructions for execution by one or more processors of the electronic device, and
A processor, which is one of the processors of the electronic device, for controlling the execution of the motion trajectory analysis method according to any one of claims 1 to 9.
11. A computer readable storage medium having stored thereon instructions which, when executed on a computer, cause the computer to perform the motion profile analysis method of any one of claims 1 to 9.
12. A computer program product, characterized in that it comprises instructions that, when executed, cause a computer to perform the movement trace analysis method according to any one of claims 1 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202211166370.3A CN116680346B (en) | 2022-09-23 | 2022-09-23 | Motion trail analysis method, device and medium
Publications (2)
Publication Number | Publication Date |
---|---|
CN116680346A true CN116680346A (en) | 2023-09-01 |
CN116680346B CN116680346B (en) | 2024-04-16 |
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017031856A1 (en) * | 2015-08-25 | 2017-03-02 | 百度在线网络技术(北京)有限公司 | Information prediction method and device |
CN107016126A (en) * | 2017-05-12 | 2017-08-04 | 西南交通大学 | A kind of multi-user's model movement pattern method based on sequential mode mining |
CN110888979A (en) * | 2018-09-10 | 2020-03-17 | 中国移动通信集团有限公司 | Interest region extraction method and device and computer storage medium |
CN109446186A (en) * | 2018-09-27 | 2019-03-08 | 江苏大学 | A kind of social relationships judgment method based on motion track |
CN111694905A (en) * | 2019-03-13 | 2020-09-22 | 杭州海康威视系统技术有限公司 | Track playback method and device |
US20200348838A1 (en) * | 2019-07-22 | 2020-11-05 | Beijing Dajia Internet Information Technology Co., Ltd. | Method, device, electronic device, and storage medium for sending and receiving message |
CN111159582A (en) * | 2019-12-20 | 2020-05-15 | 北京邮电大学 | Method and device for processing track data of moving object |
CN113032502A (en) * | 2021-02-09 | 2021-06-25 | 北京工业大学 | Ship anomaly detection method based on improved track segment DBSCAN clustering |
CN113139029A (en) * | 2021-04-25 | 2021-07-20 | 深圳市泰衡诺科技有限公司 | Processing method, mobile terminal and storage medium |
CN113589338A (en) * | 2021-07-29 | 2021-11-02 | 成都乐动信息技术有限公司 | Method and device for detecting stop point of user in motion process and electronic equipment |
CN114357036A (en) * | 2022-01-11 | 2022-04-15 | 拉扎斯网络科技(上海)有限公司 | Method and device for identifying stop point, storage medium and computer equipment |
Non-Patent Citations (4)
Title |
---|
KELLY MERCKAERT et al.: "Real-time motion control of robotic manipulators for safe human–robot coexistence", ROBOTICS AND COMPUTER-INTEGRATED MANUFACTURING, pages 1-14 *
ZHANG HOULU et al.: "Trajectory region division method based on stay-point density clustering" (基于停留点密度聚类的轨迹区域划分方法), Journal of People's Public Security University of China (Science and Technology), pages 102-108 *
LI CHUNTING: "Research on the construction of a user behavior feature model based on semantic stay points" (基于语义停留点的用户行为特征模型的构建研究), Information Technology, pages 1-67 *
HUANG LIANG et al.: "Extraction of stay semantic information from inland-waterway ship trajectories based on reverse geocoding services" (基于反地理编码服务的内河船舶轨迹停留语义信息提取), Navigation of China, pages 88-94 *
Also Published As
Publication number | Publication date |
---|---|
CN116680346B (en) | 2024-04-16 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| GR01 | Patent grant |