CN113554932A - Track playback method and related device

Track playback method and related device

Info

Publication number
CN113554932A
Authority
CN
China
Prior art keywords
control node
electronic device
track
determining
motion
Prior art date
Legal status
Granted
Application number
CN202010329410.6A
Other languages
Chinese (zh)
Other versions
CN113554932B (en)
Inventor
王俊岭
高延龙
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202010329410.6A
Priority to PCT/CN2021/088805 (published as WO2021213451A1)
Publication of CN113554932A
Application granted
Publication of CN113554932B
Legal status: Active

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/003 Maps
    • G09B29/006 Representation of non-cartographic information on maps, e.g. population distribution, wind direction, radiation levels, air and sea routes

Abstract

A track playback method and related apparatus are disclosed. The method includes: an electronic device acquires a motion track to be played back, the motion track comprising a plurality of control nodes; the electronic device extracts a first control node and a second control node that are adjacent in the motion track; the electronic device determines the rotation angle of the first control node according to the first control node, the second control node, and a first direction, where, in the motion track, the second control node follows the first control node; when the electronic device detects that playback of the motion track has reached the first control node, it adjusts the display direction of the motion track according to the rotation angle corresponding to the first control node. By controlling the number of control nodes and their rotation angles, the state changes of the motion track on the map can be stabilized while similarity to the original track is preserved.

Description

Track playback method and related device
Technical Field
The present application relates to the field of electronic map data processing technologies, and in particular, to a track playback method and a related apparatus.
Background
With the development and popularization of sensor technology, Global Navigation Satellite System (GNSS) positioning has become a basic capability of terminal devices such as automobiles, mobile phones, and watches. After a device or application records a user track using GNSS, it can offer the capability to play that track back dynamically. Track playback refers to repeatedly updating the user track and the map state of the display area at appropriate time intervals until the end of the user track is reached.
Existing trajectory thinning techniques can provide a simplified track that approximates the initial motion track. During track playback, trajectory thinning can be used to guide the change track of the map state, i.e., the change in the real geographic coordinates corresponding to the center of the map display area.
However, track playback commonly suffers from two problems. First, if the similarity between the simplified track and the initial motion track is low, the track may run out of the map display area. Second, if the similarity is high, the map state changes too abruptly, causing visual vertigo and possibly subjective discomfort for the user.
Therefore, how to stabilize the state changes of the map while preserving similarity is a problem under study by those skilled in the art.
Disclosure of Invention
The embodiments of the present application provide a track playback method and related apparatus, which can stabilize the state changes of a map while preserving track similarity.
In a first aspect, the present application provides a method for track playback, including:
the electronic device acquires a motion track to be played back, the motion track comprising a plurality of control nodes; the electronic device extracts a first control node and a second control node that are adjacent in the motion track; the electronic device determines the rotation angle of the first control node according to the first control node, the second control node, and a first direction, where, in the motion track, the second control node follows the first control node; when the electronic device detects that the motion track has been played back to the first control node, the electronic device adjusts the display direction of the motion track according to the rotation angle corresponding to the first control node.
In the method of the first aspect, each control node in the motion track corresponds to a rotation angle; the electronic device determines the rotation angle of each control node and dynamically plays the motion track back on its display screen according to those angles. The first direction is the current display direction of the motion track in the display area of the electronic device, which can be understood as the direction pointing from the bottom of the display area vertically toward its top. The number of control nodes in the motion track relates to the similarity between the motion track and the initial motion track, and the rotation angles of the control nodes relate to the stability of playback. By controlling the number of control nodes and their rotation angles, the electronic device can stabilize the state changes of the motion track on the map while preserving similarity.
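As a concrete illustration, the first included angle used throughout the embodiments below can be computed from the first direction and two control nodes. The sketch below is a minimal, hypothetical implementation in planar map coordinates; it is not part of the patent text, and the coordinate convention is an assumption:

```python
import math

def included_angle(first_dir_deg, node_a, node_b):
    """Smallest angle (degrees, in [0, 180]) between the current display
    direction and the segment from node_a to node_b.

    first_dir_deg: display direction as a clockwise bearing from north.
    node_a, node_b: (x, y) planar map coordinates, +y pointing north.
    (Both conventions are illustrative assumptions.)
    """
    dx, dy = node_b[0] - node_a[0], node_b[1] - node_a[1]
    # Bearing of the segment, measured clockwise from north (the +y axis).
    seg_bearing = math.degrees(math.atan2(dx, dy)) % 360
    diff = abs(seg_bearing - first_dir_deg) % 360
    return min(diff, 360 - diff)
```

For example, with the display direction due north, a segment heading due east yields a first included angle of 90 degrees.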
With reference to the first aspect, in some embodiments, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes: when the electronic device detects that a first included angle, formed by the first direction and the line connecting the first control node and the second control node, is less than or equal to a first threshold, the electronic device determines that the rotation angle corresponding to the first control node is 0. In this way, if the first included angle does not exceed the first threshold, the current display direction of the motion track can be considered not to hinder the user's viewing along the first direction, and the display direction need not be rotated. Setting the first threshold reduces how often the display direction of the motion track rotates, stabilizing the state changes of the motion track on the map.
In some embodiments, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes: when the electronic device detects that a first included angle formed by the first direction and the line connecting the first control node and the second control node is greater than the first threshold, the electronic device determines that the rotation angle corresponding to the first control node is the first included angle. In this way, if the first included angle exceeds the first threshold, the display direction of the motion track is rotated so that it matches the first direction; the user can then watch the track playback along the first direction, which improves the user experience.
With reference to the first aspect, in some embodiments, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes: when the electronic device detects that a first included angle formed by the first direction and the line connecting the first control node and the second control node is less than or equal to the first threshold, or that the distance between the first control node and the second control node is less than or equal to a second threshold, the electronic device determines that the rotation angle corresponding to the first control node is 0. That is, even if the first included angle exceeds the first threshold, the display direction of the motion track is not rotated when the physical distance between the control node and the next control node is small (less than or equal to the second threshold). This avoids the display direction changing twice in quick succession because two control nodes are too close together, further improving the stability of the map state.
With reference to the first aspect, in some embodiments, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes: when the electronic device detects that a first included angle formed by the first direction and the line connecting the first control node and the second control node is greater than the first threshold, and that the distance between the first control node and the second control node is greater than the second threshold, the electronic device determines that the rotation angle corresponding to the first control node is the first included angle. Judging by both the first included angle and the distance between the two control nodes reduces how often the display direction of the motion track rotates, stabilizing the state changes of the motion track on the map.
In some embodiments, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes: when the first control node is the initial control node of the motion track, and the electronic device detects that a first included angle formed by the first direction and the line connecting the first control node and the second control node is greater than a third threshold, the electronic device determines that the rotation angle corresponding to the first control node is the first included angle, where the third threshold is less than the first threshold. Because the third threshold is smaller, the initial control node of the motion track triggers a rotation of the display direction more readily than the other control nodes, so the user can view the track along the first direction from the very start of playback.
With reference to the first aspect, in some embodiments, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes: when the electronic device detects that the distance between the first control node and the second control node is greater than a fourth threshold, the electronic device determines that the rotation angle corresponding to the first control node is the first included angle. In this way, when the distance to the next control node is large (greater than the fourth threshold), the display direction of the motion track is rotated even if the first included angle alone would not satisfy the rotation condition. This prevents the user from being unable to view the track along the first direction for a long stretch, ensures that the current direction of travel matches the first direction, and improves the user experience.
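Taken together, the embodiments above amount to a per-node decision rule. The sketch below combines them in one function; the parameter names and all threshold values are illustrative assumptions, not values prescribed by the patent:

```python
def rotation_angle(angle_deg, distance, is_first_node,
                   angle_thresh=30.0, dist_thresh=50.0,
                   first_node_thresh=10.0, far_dist_thresh=500.0):
    """Rotation angle for one control node, per the rules above.

    angle_deg: first included angle between the display direction and
               the segment to the next control node.
    distance:  length of that segment.
    The four thresholds stand in for the first, second, third, and
    fourth thresholds of the embodiments; their values are placeholders.
    """
    if is_first_node:
        # The initial node rotates more readily (third threshold < first).
        return angle_deg if angle_deg > first_node_thresh else 0.0
    if distance > far_dist_thresh:
        # A long segment rotates even if the angle test alone would not.
        return angle_deg
    if angle_deg > angle_thresh and distance > dist_thresh:
        # Both the angle and the distance conditions must hold.
        return angle_deg
    return 0.0
```

A small angle, or a short hop to the next node, yields a rotation of 0, which is exactly what keeps the map state from changing too often.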
With reference to the first aspect, in some embodiments, before the electronic device acquires the motion track to be played back, the method further includes: the electronic device acquires an initial motion track of the electronic device; the electronic device determines a plurality of track points from the initial motion track; and the electronic device determines a plurality of control nodes from the plurality of track points using their position information. The electronic device may determine the control nodes from the track points in several ways, for example by taking the midpoint by distance, the midpoint by time, or a weighted average of the position coordinates of two adjacent track points.
In some embodiments, determining the plurality of control nodes from the plurality of track points using their position information specifically includes: when the electronic device detects that the angle formed by the two line segments connecting a target track point to its two adjacent track points is greater than a fifth threshold, the electronic device judges the target track point to be a redundant node; the electronic device deletes the redundant nodes among the track points, and the remaining track points are the plurality of control nodes. Setting the fifth threshold effectively removes redundant nodes; since redundant nodes have little influence on track similarity, reducing them reduces the number of rotations for the same motion track and improves the stability of the map state changes.
In some embodiments, acquiring the motion track to be played back specifically includes: the electronic device connects the plurality of control nodes into the motion track. The similarity between the motion track and the initial motion track is related to the number of control nodes and to the fifth threshold; by setting a reasonable fifth threshold, the electronic device can balance track similarity against the stability of the map state changes.
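The redundant-node test described above (drop a track point whose two adjacent segments are nearly collinear) might be sketched as follows. The fifth-threshold value of 160 degrees is an illustrative assumption, and for simplicity each point is tested against its original neighbours rather than the already-thinned ones:

```python
import math

def thin_track(points, angle_thresh_deg=160.0):
    """Drop redundant nodes: a point whose two adjacent segments form an
    angle greater than the (fifth) threshold is nearly collinear with its
    neighbours and contributes little to track similarity."""
    def corner_angle(a, b, c):
        # Angle at b between segments b->a and b->c, in degrees.
        v1 = (a[0] - b[0], a[1] - b[1])
        v2 = (c[0] - b[0], c[1] - b[1])
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        n1 = math.hypot(*v1) or 1e-12
        n2 = math.hypot(*v2) or 1e-12
        cos_v = max(-1.0, min(1.0, dot / (n1 * n2)))
        return math.degrees(math.acos(cos_v))

    kept = [points[0]]  # head point is always a control node
    for i in range(1, len(points) - 1):
        if corner_angle(points[i - 1], points[i], points[i + 1]) <= angle_thresh_deg:
            kept.append(points[i])  # a genuine turn: keep as control node
    kept.append(points[-1])  # tail point is always a control node
    return kept
```

Connecting the returned points in order then yields the simplified motion track used for playback.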
In a second aspect, the present application provides an electronic device, which may include: one or more processors, memory, and a display screen; the memory, the display screen, and the one or more processors are coupled, the memory for storing computer program code, the computer program code including computer instructions that the one or more processors invoke to cause the electronic device to perform:
acquiring a motion track to be played back, the motion track comprising a plurality of control nodes; extracting a first control node and a second control node that are adjacent in the motion track; determining the rotation angle of the first control node according to the first control node, the second control node, and a first direction, where, in the motion track, the second control node follows the first control node; and when detecting that the motion track has been played back to the first control node, adjusting the display direction of the motion track according to the rotation angle corresponding to the first control node.
With reference to the second aspect, in some embodiments, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes: when detecting that a first included angle formed by the first direction and the line connecting the first control node and the second control node is less than or equal to a first threshold, determining that the rotation angle corresponding to the first control node is 0.
In some embodiments, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes: when detecting that a first included angle formed by the first direction and the line connecting the first control node and the second control node is greater than the first threshold, determining that the rotation angle corresponding to the first control node is the first included angle.
With reference to the second aspect, in some embodiments, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes: when detecting that a first included angle formed by the first direction and the line connecting the first control node and the second control node is less than or equal to the first threshold, or that the distance between the first control node and the second control node is less than or equal to a second threshold, determining that the rotation angle corresponding to the first control node is 0.
With reference to the second aspect, in some embodiments, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes: when detecting that a first included angle formed by the first direction and the line connecting the first control node and the second control node is greater than the first threshold, and that the distance between the first control node and the second control node is greater than the second threshold, determining that the rotation angle corresponding to the first control node is the first included angle.
In some embodiments, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes: when the first control node is the initial control node of the motion track, and a first included angle formed by the first direction and the line connecting the first control node and the second control node is detected to be greater than a third threshold, determining that the rotation angle corresponding to the first control node is the first included angle, where the third threshold is less than the first threshold.
With reference to the second aspect, in some embodiments, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes: when detecting that the distance between the first control node and the second control node is greater than a fourth threshold, determining that the rotation angle corresponding to the first control node is the first included angle.
With reference to the second aspect, in some embodiments, before acquiring the motion track to be played back, the method further includes: acquiring an initial motion track of the electronic device; determining a plurality of track points from the initial motion track; and determining a plurality of control nodes from the plurality of track points using their position information.
In some embodiments, determining the plurality of control nodes from the plurality of track points using their position information specifically includes: when detecting that the angle formed by the two line segments connecting a target track point to its two adjacent track points is greater than a fifth threshold, judging the target track point to be a redundant node; and deleting the redundant nodes among the track points, the remaining track points being the plurality of control nodes.
In some embodiments, acquiring the motion track to be played back specifically includes: connecting the plurality of control nodes into the motion track.
In a third aspect, an embodiment of the present application provides a track playback system, including an electronic device and a server, wherein,
the server is configured to acquire a motion track to be played back, the motion track comprising a plurality of control nodes;
the server is further configured to extract a first control node and a second control node that are adjacent in the motion track;
the server is further configured to determine the rotation angle of the first control node according to the first control node, the second control node, and a first direction, where, in the motion track, the second control node follows the first control node;
the server is further configured to send the motion track and the rotation angle to the electronic device;
and the electronic device is configured to adjust the display direction of the motion track according to the rotation angle corresponding to the first control node when detecting that the motion track has been played back to the first control node.
With reference to the third aspect, in some embodiments, the server is specifically configured to: when detecting that a first included angle formed by the first direction and the line connecting the first control node and the second control node is less than or equal to a first threshold, determine that the rotation angle corresponding to the first control node is 0.
In some embodiments, the server is specifically configured to: when detecting that a first included angle formed by the first direction and the line connecting the first control node and the second control node is greater than the first threshold, determine that the rotation angle corresponding to the first control node is the first included angle.
With reference to the third aspect, in some embodiments, the server is specifically configured to: when detecting that a first included angle formed by the first direction and the line connecting the first control node and the second control node is less than or equal to the first threshold, or that the distance between the first control node and the second control node is less than or equal to a second threshold, determine that the rotation angle corresponding to the first control node is 0.
With reference to the third aspect, in some embodiments, the server is specifically configured to: when detecting that a first included angle formed by the first direction and the line connecting the first control node and the second control node is greater than the first threshold, and that the distance between the first control node and the second control node is greater than the second threshold, determine that the rotation angle corresponding to the first control node is the first included angle.
In some embodiments, the server is specifically configured to: when the first control node is the initial control node of the motion track, and a first included angle formed by the first direction and the line connecting the first control node and the second control node is detected to be greater than a third threshold, determine that the rotation angle corresponding to the first control node is the first included angle, where the third threshold is less than the first threshold.
With reference to the third aspect, in some embodiments, the server is specifically configured to: when detecting that the distance between the first control node and the second control node is greater than a fourth threshold, determine that the rotation angle corresponding to the first control node is the first included angle.
With reference to the third aspect, in some embodiments, the server is further configured to: before acquiring the motion track to be played back, acquire an initial motion track of the electronic device; determine a plurality of track points from the initial motion track; and determine a plurality of control nodes from the plurality of track points using their position information.
In some embodiments, the server is specifically configured to: when detecting that the angle formed by the two line segments connecting a target track point to its two adjacent track points is greater than a fifth threshold, judge the target track point to be a redundant node; and delete the redundant nodes among the track points, the remaining track points being the plurality of control nodes.
In some embodiments, the server is specifically configured to: connect the plurality of control nodes into the motion track.
In a fourth aspect, an embodiment of the present application provides a computer storage medium, which includes computer instructions, and when the computer instructions are executed on an electronic device, the electronic device is caused to perform the method for track playback in any one of the possible implementations of the first aspect.
In a fifth aspect, the present application provides a computer program product, which when run on a computer, causes the computer to execute the method for track playback in any one of the possible implementations of any one of the above aspects.
It is to be understood that the electronic device provided by the second aspect, the system provided by the third aspect, the computer storage medium provided by the fourth aspect, and the computer program product provided by the fifth aspect are all configured to perform the track playback method provided by the first aspect. For their beneficial effects, refer to the beneficial effects of the method provided by the first aspect, which are not repeated here.
Drawings
Fig. 1 is a schematic diagram of a center point coordinate, an azimuth angle, and a control node according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a method for determining a control node according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of another method for determining a control node according to an embodiment of the present application;
FIG. 4 is a schematic diagram of determining an included angle of rotation according to an embodiment of the present disclosure;
fig. 5 is a schematic diagram illustrating a method for determining a lens action of a control node according to an embodiment of the present disclosure;
fig. 6 is a display diagram of an interface of a track playback method according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 8 is a schematic diagram of a software architecture according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described clearly and in detail below with reference to the accompanying drawings. In the description of the embodiments herein, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" in the text only describes an association between objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, in the description of the embodiments of the present application, "a plurality" means two or more.
In the following, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features referred to. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of embodiments of the application, unless stated otherwise, "plurality" means two or more.
The electronic device related in the embodiments of the present application may be a mobile phone, a tablet computer, a desktop computer, a laptop, a notebook, an Ultra-mobile Personal Computer (UMPC), a server (including a cloud server), a handheld computer, a netbook, a Personal Digital Assistant (PDA), a wearable electronic device (e.g., a sports watch, a smart band, or a smart watch), a virtual reality device, an in-vehicle multimedia device, an unmanned aerial vehicle, an aerial photography instrument, or another device with positioning-data acquisition and/or processing capabilities.
First, some related terms referred to in the present application are explained to facilitate understanding by those skilled in the art.
(1) Track playback: the electronic device reproduces, on a map, the user's track over a time period selected by the user. The track over that time period comprises a series of location points, each of which may include date, time, longitude, latitude, altitude, movement speed, and so on. In this application, location points may be referred to as track points.
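For illustration only, such a track point could be modeled as a small record. The field names and types below are assumptions; the patent does not fix a schema:

```python
from dataclasses import dataclass

@dataclass
class TrackPoint:
    """One recorded location point with the attributes listed above.
    All field names and units are illustrative assumptions."""
    timestamp: float   # date and time, e.g. Unix seconds
    longitude: float   # degrees
    latitude: float    # degrees
    altitude: float    # meters
    speed: float       # movement speed, e.g. m/s
```

A recorded track is then simply an ordered list of such points.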
(2) Map state: the state of the map in the display area during track playback. The map state is described by the coordinates of the map center point and the azimuth angle of the map. As shown in diagram a of fig. 1, the center point of the display area is the intersection of its two diagonals. The center-point coordinates describe the physical coordinates of the map center within the display area, and the map state moves accordingly as the center-point coordinates move. The azimuth angle is shown in diagram b of fig. 1, which illustrates the azimuth angle of the map state when the electronic device is in portrait orientation: arrow ① points in the physical true-north direction, and arrow ② points from the bottom of the display area vertically toward its top. The azimuth angle is the smallest angle swept clockwise from true north to the direction of arrow ②. For example, if the azimuth angle is 45 degrees, the viewing direction of the map (the direction of arrow ②) is 45 degrees east of north; if the azimuth angle is 90 degrees, the viewing direction is due east. Diagram c of fig. 1 shows the azimuth angle when the electronic device is in landscape orientation: arrow ① again points true north, and arrow ② points from the bottom of the display area toward its top. The azimuth angle is defined in the same way as in portrait orientation.
In this application, the direction pointing from the bottom of the display area of the electronic device perpendicularly toward the top of the display area (the arrow direction) may be referred to as the positive direction or the first direction.
(3) Lens action: comprises three basic actions, going straight, rotating, and stopping. Going straight represents changing the map's center point, rotating represents changing the map's azimuth angle, and stopping represents ceasing changes to the map state.
(4) Motion track: the track formed by the map center point coordinates during track playback. The motion track is represented as a finite number of straight-line segments; the intersection points of adjacent segments and the head and tail points of the motion track are defined as control nodes, and the lines connecting the control nodes form the motion track, as shown in diagram d of fig. 1. In this application, each control node corresponds to one lens action, and the electronic device performs the corresponding lens action at each control node, thereby drawing the motion track dynamically.
(5) Thinning algorithm: an algorithm that compresses a large number of redundant track data points and extracts the necessary ones. A curve can be approximated by a series of points, and the algorithm reduces the number of points. In track playback, a track thinning algorithm can provide a simplified track similar to the initial motion track and can be used to guide the change track of the map state, i.e., the change track of the real geographic coordinates corresponding to the map's center point coordinates.
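The application does not name a particular thinning algorithm; as an illustration of the idea, here is a minimal sketch of the classic Ramer-Douglas-Peucker procedure, a common track-thinning choice (the function name and the tolerance value are assumptions, not taken from the patent):

```python
import math

def rdp(points, epsilon):
    """Ramer-Douglas-Peucker thinning: keep only points whose removal would
    change the curve by more than `epsilon` (perpendicular distance)."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    # Find the point farthest from the chord joining the endpoints.
    max_d, max_i = 0.0, 0
    for i in range(1, len(points) - 1):
        px, py = points[i]
        num = abs((y2 - y1) * px - (x2 - x1) * py + x2 * y1 - y2 * x1)
        den = math.hypot(x2 - x1, y2 - y1) or 1e-12
        d = num / den
        if d > max_d:
            max_d, max_i = d, i
    if max_d > epsilon:
        # Recurse on both halves; drop the duplicated split point.
        left = rdp(points[:max_i + 1], epsilon)
        right = rdp(points[max_i:], epsilon)
        return left[:-1] + right
    return [points[0], points[-1]]
```

A larger `epsilon` gives a coarser simplified track, which corresponds directly to the similarity trade-off discussed in this application.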
A conventional track playback method generally uses a thinning algorithm to provide a simplified track similar to the initial track. If the similarity between the simplified track and the initial track is low, the map center point coordinate cannot keep up with the drawing speed of the track, and the track runs out of the map display area. Conversely, if the similarity between the simplified track and the initial motion track is high, the map state changes too sharply, causing visual vertigo and possibly subjective discomfort for the user.
In view of the above technical problems, an embodiment of the present application provides a track playback method that stabilizes the state changes of the map and improves the user experience while preserving similarity. In the track playback process of this application, the electronic device determines the control nodes of the track, then determines the lens action of each control node, and dynamically draws the track playback according to the control nodes and their lens actions. These two processes are described separately below.
(I) Determining the control nodes of the track.
Step one: acquiring the initial motion track of the electronic device.
The electronic equipment acquires an initial motion track of the electronic equipment, wherein the initial motion track comprises various types of tracks such as a walking track, a running track, a mountain climbing track and an amusement park track. The electronic device may obtain the initial motion trajectory locally, or may obtain the initial motion trajectory at other electronic devices (e.g., a cloud server), which is not limited in the present application.
The electronic device may obtain the initial motion trajectory in the time period according to the time period selected by the user, or obtain the initial motion trajectory between two positions according to the starting position and the ending position selected by the user, which is not limited in the present application.
In some possible embodiments, the electronic device may obtain the initial motion track according to a preset condition. The preset condition may be the motion track with the longest continuous motion time, the motion track with the longest continuous motion distance, the motion track of the day with the longest motion distance in a month or year, and the like. The preset condition may be set or selected by the user, or may be triggered automatically in the electronic device according to a certain time point and/or place.
For example, when the electronic device detects that the current position is a park, it automatically acquires its historical motion track in the park; when the electronic device detects that the current date is December 31, 2019, it automatically acquires the motion track of the day with the longest motion distance in the past year (December 31, 2018 to December 31, 2019); when the electronic device detects a user operation requesting the motion track with the longest continuous motion time, it acquires the motion track with the longest continuous motion time.
Step two: calculating the track points of the initial motion track.
A track point is defined as Pi(x, y), where i is the index of the track point within the entire initial motion track and (x, y) is the coordinate of the track point, which may be an absolute coordinate (e.g., longitude and latitude) or a relative coordinate. The track data set of the motion is defined as P = {Pi}, i = 1, 2, …, N, where N is a positive integer representing the number of track points of the current initial motion track. Each track point may include date, time, longitude, latitude, altitude, movement speed, and the like.
After the electronic device acquires its initial motion track, it divides the initial motion track into K segments according to the segment number K and the total length L of the current initial motion track; the segment interval is L/K, and one track point is taken at every L/K of distance. The start point and the end point of the initial motion track are each a track point, so the number of track points is N = K + 1. The distance may be an actual geographic distance or a distance on the map.
In some possible embodiments, the trace points may also be location points having the same linear distance separation. The straight-line distance between each track point and the adjacent track point is the same. The straight-line distance may be a distance in an actual geographical location or a distance on a map.
In some possible embodiments, the trace points may also be location points having the same time interval. For example, the electronic device divides the initial motion trajectory into K segments according to the number K of segments and the total time T consumed by the current initial motion trajectory, calculates the segment interval to be T/K, and takes a track point every T/K time period. The initial point and the terminal point of the initial motion trail are respectively a track point, and at the moment, K +1 track points exist.
Optionally, sampling is not limited to the equidistant and equal-time schemes above; track points may also be obtained according to other sampling algorithms, for example by extracting inflection points of the motion track as track points, which is not limited in this application.
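As a sketch of the equidistant scheme in step two (function and parameter names are illustrative; equal-time sampling would substitute timestamps for arc length):

```python
import math

def sample_by_distance(track, k):
    """Take K+1 track points at equal path-length intervals L/K along
    `track` (a list of (x, y) points), keeping the start and end points."""
    # Cumulative path length at each original point.
    cum = [0.0]
    for (x1, y1), (x2, y2) in zip(track, track[1:]):
        cum.append(cum[-1] + math.hypot(x2 - x1, y2 - y1))
    step = cum[-1] / k          # segment interval L/K
    points, j = [track[0]], 0
    for i in range(1, k):
        target = i * step
        # Advance to the segment containing the target arc length,
        # then interpolate linearly inside it.
        while cum[j + 1] < target:
            j += 1
        t = (target - cum[j]) / (cum[j + 1] - cum[j])
        (x1, y1), (x2, y2) = track[j], track[j + 1]
        points.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
    points.append(track[-1])
    return points
```

The result always has N = K + 1 points, with the head and tail of the initial track included, matching step two above.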
Step three: and generating a control node set of the initial motion trail according to the track points.
After the electronic device acquires the track points of the initial motion track, it selects one control node between every two adjacent track points according to a preset mode, and adds the head and tail track points, forming a control node set. Several possible preset modes are described below by way of example.
Mode one: the distance midpoint of two adjacent track points is the control node. For example, if the distance interval between two adjacent track points is L/K, a control node is taken on the initial motion track between them such that its distance to each of the two adjacent track points is L/2K.
Mode two: the time midpoint of two adjacent track points is the control node. For example, if the time interval between two adjacent track points is T/K, a control node is taken on the initial motion track between them such that its time interval to each of the two adjacent track points is T/2K.
Mode three: the position coordinates of two adjacent track points are weighted-averaged to obtain a new coordinate, which is used as the coordinate of the control node. For example, if the position coordinates of two adjacent track points are (x1, y1) and (x2, y2), the coordinates of the control node are taken as ((x1 + x2)/2, (y1 + y2)/2) with equal weights.
Optionally, the method is not limited to the three modes above; the control node may also be obtained according to other algorithms, for example by weighted-averaging the position coordinates of three consecutive track points to obtain a new coordinate as the control node, and so on.
Fig. 2 exemplarily shows the process of acquiring the control node set. Diagram a of fig. 2 exemplarily shows an initial motion track, on which the electronic device takes one track point at every equal distance. Diagram b of fig. 2 exemplarily takes 10 track points, with the distance (actual geographic distance or distance on the map) between each two adjacent track points being the same. According to preset mode three, the position coordinates of two adjacent track points are weighted-averaged to obtain the coordinate of a control node. As shown in diagram c of fig. 2, the electronic device obtains 11 control nodes.
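Preset mode three can be sketched as follows, assuming equal weights for the weighted average; with 10 track points it yields the 11 control nodes of diagram c of fig. 2 (names and default weights are illustrative):

```python
def control_nodes(track_points, w1=0.5, w2=0.5):
    """Preset mode three: one control node between each pair of adjacent
    track points, taken as a weighted average of their coordinates (equal
    weights assumed here), plus the head and tail track points."""
    nodes = [track_points[0]]
    for (x1, y1), (x2, y2) in zip(track_points, track_points[1:]):
        nodes.append((w1 * x1 + w2 * x2, w1 * y1 + w2 * y2))
    nodes.append(track_points[-1])
    return nodes
```

Ten track points produce nine in-between nodes plus the two endpoints, i.e. eleven control nodes.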
Step four: the electronic equipment connects the plurality of control nodes into a motion track.
It can be understood that, for the same motion track, the greater the segment number K, the greater the number of control nodes and the greater the similarity between the drawn motion track and the actual initial motion track. However, the greater the number of control nodes, the more lens actions there are, which makes the map state change unstable.
In some possible embodiments, redundant control nodes in the control node set obtained in step three may be further screened out and deleted without affecting the similarity. That is, after step three and before step four, the method further includes deleting redundant control nodes from the control node set to form a new control node set.
After the electronic device generates the control node set of the track from the track points, it judges the control nodes in the set in sequence. If the included angle formed at a node by the line segments to its two adjacent nodes is larger than a fifth threshold, the node is judged to be redundant. The head node and the tail node are determined as control nodes without judgment. The control nodes in the set are judged in sequence to form a new control node set.
Fig. 3 illustrates the above process of forming a new control node set. Diagram a of fig. 3 exemplarily shows the control node set obtained in step three, comprising five control nodes A, B, C, D, E. First, the first node A is determined as a control node, the next control node B is acquired, and whether B is redundant is judged. Here, the north direction in fig. 3 is north on the map; for example, in the motion track of diagram a of fig. 3, control node B is to the northeast of control node A. As shown in diagram b of fig. 3, with B as the middle node, the included angle θ formed by the three nodes A, B, C is calculated as θ = |180 − |θ1 − θ2||, where θ1 is the azimuth angle of AB (the angle between due north and the direction of AB) and θ2 is the azimuth angle of BC (the angle between due north and the direction of BC). The size of θ is then judged: if θ is larger than the fifth threshold, control node B is judged to be redundant and is deleted, as shown in diagram c of fig. 3.
Then the next control node C (the node after B) is acquired, and whether C is redundant is judged. With C as the middle node, the included angle θ formed by the three nodes A, C, D is calculated; C is judged to be redundant, and control node C is deleted. The control nodes in the set are judged in sequence until the last control node E; being the tail node, E is directly determined as a control node. As shown in diagram d of fig. 3, a new control node set is formed, comprising three control nodes A, D, E.
In this application, the above fifth threshold may be referred to as the attenuation angle α. The above method of determining the control nodes of the track can effectively delete redundant control points by setting the attenuation angle α. Because redundant control nodes have little influence on the similarity of the track, for the same motion track, reducing redundant control points reduces lens actions and improves the stability of the map state change.
It can be understood that the smaller the attenuation angle α, the greater the number of redundant control nodes screened out, the fewer the remaining control nodes, and the lower the similarity between the drawn trajectory and the actual trajectory for the same motion trajectory. Thus, the number of segments K and the attenuation angle α together determine the similarity of the drawn trajectory to the actual trajectory. In the application, the track similarity and the stability of the map state change can be well controlled by setting the reasonable segmentation quantity K and the attenuation angle alpha. For example, the number of segments K may be set at about 10, and the attenuation angle α may be set at about 160 degrees.
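A minimal sketch of this redundancy screening, assuming planar (x, y) coordinates with y pointing to map north and using the suggested attenuation angle of about 160 degrees (function names are assumptions):

```python
import math

def bearing(p, q):
    """Azimuth of the direction p -> q, measured clockwise from north
    (y axis), in degrees within [0, 360)."""
    (x1, y1), (x2, y2) = p, q
    return math.degrees(math.atan2(x2 - x1, y2 - y1)) % 360

def prune(nodes, alpha=160.0):
    """Delete redundant control nodes: a middle node B is redundant when
    the included angle at B, computed from the azimuths of AB and BC as
    theta = |180 - |theta1 - theta2||, exceeds the attenuation angle alpha.
    Head and tail nodes are always kept."""
    kept = [nodes[0]]
    for b, c in zip(nodes[1:-1], nodes[2:]):
        a = kept[-1]                 # judge against the last kept node
        theta = abs(180 - abs(bearing(a, b) - bearing(b, c)))
        if theta <= alpha:           # sharp enough turn: keep the node
            kept.append(b)
    kept.append(nodes[-1])
    return kept
```

Nearly collinear nodes (θ close to 180 degrees) are dropped, while genuine turns survive, mirroring the A, D, E result of fig. 3.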
In some possible embodiments, by fixing the physical distance between the trace points, different numbers of segments may be determined for different trace lengths. The longer the length of the track is, the more the number of the segments is, and the more the control nodes are; the shorter the length of the trace, the fewer the number of segments and the fewer control nodes.
In some possible embodiments, the different number of segments is determined according to a scale (ratio of display distance to actual distance) on which the motion trajectory is displayed on the display area of the electronic device, wherein the larger the scale, the larger the number of segments. For example, a segment of the same motion trajectory has an actual distance of 5 km, if a scale of the motion trajectory on a display area of the electronic device is 1 cm: 1 km, the total length of the motion trajectory displayed on the display area of the electronic device is 5 cm, 10 trajectory points can be taken from the motion trajectory, and the number of the segments is 9; if the scale of the motion trajectory is 1 cm: 0.5 km (the scale is large), the total length of the motion trajectory displayed on the display area of the electronic device is 10 cm, 20 trajectory points can be taken for the motion trajectory, and the number of segments is 19.
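The scale-dependent segment count in this example reduces to simple arithmetic; in the sketch below, the density of one track point per 0.5 cm of displayed track is inferred from the numbers above and is an assumption:

```python
def segments_from_scale(actual_km, cm_per_km, cm_per_point=0.5):
    """Number of segments K for a track of `actual_km` kilometres shown at
    a scale of `cm_per_km` centimetres per kilometre, taking one track
    point per `cm_per_point` centimetres of displayed track (K+1 points)."""
    displayed_cm = actual_km * cm_per_km
    num_points = int(displayed_cm / cm_per_point)
    return num_points - 1
```

With a 5 km track this reproduces the example: 9 segments at 1 cm : 1 km, and 19 segments at the larger 1 cm : 0.5 km scale.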
(II) Determining the lens action of the control node.
After the electronic equipment acquires the control node set of the track, lens action of each control node in the control node set is determined.
First, the rotation included angle is described. In this application, the rotation included angle of a control node is the included angle (acute angle) between the track direction at the control node and the positive direction. The positive direction is the direction pointing from the bottom of the display area of the electronic device perpendicularly toward its top. Diagram a of fig. 4 exemplarily shows a motion track on the display area, comprising four control nodes A, B, C, D. For control node A, the track direction of A is the AB direction, the positive direction is vertically upward in diagram a of fig. 4, and the rotation included angle of A is θA. If the lens action of A is rotation, the azimuth angle of the map state of the display area changes: with point A as the rotation point, the display area rotates so that the track direction turns toward the positive direction, and the rotation angle is θA, as shown in diagram b of fig. 4. At this time, for control node B, the track direction of B is the BC direction, the positive direction is the AB direction, and the rotation included angle of B is θB. If the lens action of B is going straight, the azimuth angle of the map state does not change, and the positive direction does not change. For control node C, the track direction of C is the CD direction, the positive direction is the AB direction, and the rotation included angle of C is θC.
In this application, the rotation included angle may also be referred to as a first angle.
Several possible preset rules are exemplarily described below.
Preset rule one: if the rotation included angle of the control node is larger than a first threshold, determine the lens action of the control node as rotation; if the rotation included angle is not larger than the first threshold, determine the lens action of the control node as going straight.
Preset rule one determines the lens action of the control node by judging whether the rotation included angle is larger than the first threshold. This reduces the frequency of rotation of the map state, since the azimuth angle need not change every time the track direction changes, and improves the stability of the map state. In this application, the first threshold may be referred to as the rotation suppression angle β, which may be set to about 60 degrees.
Preset rule two: if the rotation included angle of the control node is larger than the first threshold and the physical distance between the control node and the next control node is larger than a second threshold, determine the lens action of the control node as rotation. If the rotation included angle is not larger than the first threshold, or the physical distance between the control node and the next control node is not larger than the second threshold, determine the lens action of the control node as going straight.
That is, even when the rotation included angle of the control node is larger than the first threshold, if the control node is close to the next control node (the distance is not larger than the second threshold), the azimuth angle is not rotated. Preset rule two determines the lens action by judging both whether the rotation included angle is larger than the first threshold and whether the distance to the next control node is larger than the second threshold. This avoids two consecutive lens action changes within a short time when two control nodes are too close together, further improving the stability of the map state. In this application, the second threshold may be referred to as the rotation suppression distance Lβ.
Preset rule three: if the physical distance between the control node and the next control node is larger than a fourth threshold, determine the lens action of the control node as rotation.
That is, even when the rotation included angle of the control node is not larger than the first threshold, if the control node is relatively far from the next control node (the distance is larger than the fourth threshold), the azimuth angle may still be rotated. Preset rule three covers the case where the next control node is far away: even if the rotation included angle does not meet the condition for rotating the azimuth angle, the azimuth angle is still rotated, to avoid the user being unable to watch the track in the positive direction for a long time, to ensure that the forward direction of the current track is approximately the positive direction, and to improve the user experience. In this application, the fourth threshold may be referred to as the forced rotation distance Ls.
Preset rule four: when the control node is the head node, if the rotation included angle of the control node is larger than a third threshold, determine the lens action of the control node as rotation; if not larger than the third threshold, determine the lens action as going straight. The third threshold is smaller than the first threshold.
Preset rule four describes that, when the control node is the head node, the lens action of the head node is determined by judging whether the rotation included angle is larger than the third threshold. Since the third threshold is smaller than the first threshold, the head node rotates the azimuth angle more readily than other control nodes, so that the user views the track in the positive direction from the start of dynamic track playback. In this application, the third threshold may be referred to as the first rotation suppression angle λ, which may be set to about 45 degrees.
Preset rule five: if the control node is the tail node, the lens action of the control node is stop.
The preset rules in this application may include, but are not limited to, one or more of the above. When a preset rule determines the lens action of a control node as rotation, the rotation angle may be the rotation included angle of the control node; to avoid rotating the map state by too large a degree, the rotation angle may also be smaller than the rotation included angle. When the lens action of a control node is determined as going straight, the rotation angle is 0.
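The five preset rules can be combined into a single decision function; the sketch below uses the suggested angle thresholds, while the distance thresholds Lβ and Ls are placeholder values (the application derives them from the track itself):

```python
def lens_action(angle, dist_to_next, is_head=False, is_tail=False,
                beta=60.0, lam=45.0, l_beta=100.0, l_s=500.0):
    """Combine the preset rules: rule 5 (tail stops), rule 4 (head rotates
    above lambda), rules 1+2 (rotate above beta only if the next node is
    farther than l_beta), rule 3 (forced rotation before a long straight
    run), otherwise go straight. Distances l_beta/l_s are placeholders."""
    if is_tail:
        return "stop"
    if is_head:
        return "rotate" if angle > lam else "straight"
    if angle > beta and dist_to_next > l_beta:   # rules one and two
        return "rotate"
    if dist_to_next > l_s:                       # rule three: forced rotation
        return "rotate"
    return "straight"
```

For example, a 70-degree turn rotates only when the next node is far enough away, and even a gentle turn rotates before a very long leg.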
Fig. 5 exemplarily shows the process of determining the lens actions of the control nodes of a track. Diagram a of fig. 5 exemplarily shows a control node set comprising five control nodes A, B, C, D, E. Each control node is analyzed in turn to determine its lens action. Here, the north direction in fig. 5 is north on the map; for example, in the motion track of diagram a of fig. 5, control node B is to the northeast of control node A.
First, for head node A, as shown in diagram b of fig. 5, the azimuth angle θ0 of A (the angle between the AB direction and due north) is acquired. According to preset rule four, since θ0 is larger than the third threshold, the lens action of control node A is determined as rotation, with the maximum rotation angle being θ0.
For B, the next control node after A, the rotation included angle θ of B is obtained as shown in diagram c of fig. 5. Here, θ = 360 − |θ1 − θ2| if |θ1 − θ2| > 180, and θ = |θ1 − θ2| otherwise. θ1 is the azimuth angle of the preceding adjacent rotating node (here, the azimuth angle of A), and θ2 is the azimuth angle of control node B (the angle between the BC direction and due north). After θ is calculated, according to preset rule one, since θ is not larger than the first threshold, the lens action of control node B is determined as going straight.
Optionally, if the control node has no preceding adjacent rotating node, θ1 is the azimuth angle of the positive direction.
For C, the next control node after B, the rotation included angle of C is obtained as shown in diagram d of fig. 5. After the rotation included angle of C is calculated, according to preset rule two, since the rotation included angle is larger than the first threshold but the physical distance of CD is not larger than the second threshold, the lens action of control node C is determined as going straight.
For D, the next control node after C, the rotation included angle of D is obtained as θ3 (the angle between the track direction DE and the current positive direction AB), as shown in diagram e of fig. 5. After the rotation included angle of D is calculated, according to preset rule three, since the physical distance of DE is larger than the fourth threshold, the lens action of control node D is determined as rotation, with the maximum rotation angle being θ3.
For E, the next control node after D: since E is the tail node, according to preset rule five, the lens action of control node E is stop.
In this application, the longer the rotation suppression distance Lβ and the forced rotation distance Ls, the fewer rotations are performed on the map state, and the worse the user experience; the shorter Lβ and Ls, the more rotations are performed, which affects the stability of the map state. By setting reasonable values of Lβ and Ls, this application can well control the stability of the map state change.
In some possible embodiments, the rotation suppression distance Lβ and the forced rotation distance Ls may take different values for different tracks. For example, the electronic device calculates the diagonal of the rectangular box determined by the southwest and northeast coordinates of the track, with diagonal length L. The rotation suppression distance is defined as Lβ = γ · L, where γ is an intensity coefficient with a recommended value of 0.05; the forced rotation distance is defined as Ls = η · L, where η is an intensity coefficient with a recommended value of 0.25.
In this way, the rotation suppression distance Lβ and the forced rotation distance Ls are determined by introducing the diagonal length L of the rectangular box enclosing the track, so different motion tracks have different Lβ and Ls. The longer L, the longer Lβ and Ls, and the fewer rotations are made to the map state; the shorter L, the shorter Lβ and Ls, and the more rotations are made to the map state.
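A sketch of this computation, assuming planar coordinates whose axis-aligned bounding box stands in for the southwest/northeast rectangle (the function name is illustrative):

```python
import math

def rotation_distances(points, gamma=0.05, eta=0.25):
    """Derive the rotation suppression distance L_beta and the forced
    rotation distance L_s from the diagonal of the track's bounding box,
    using the recommended intensity coefficients 0.05 and 0.25."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    diag = math.hypot(max(xs) - min(xs), max(ys) - min(ys))
    return gamma * diag, eta * diag
```

Longer tracks thus automatically get longer suppression distances, and therefore fewer rotations of the map state.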
In this application, for a control node whose lens action is rotation, the angle actually rotated through in the azimuth of the map state is at most the rotation included angle of the control node.
The above describes two processes of the electronic device determining the control node and determining the lens action of the control node. After the electronic device determines the control node and the lens action of the control node, drawing dynamic track playback according to the control node and the lens action of the control node.
The center point of the display area of the electronic device serves as the center point coordinate of the map state; the track the electronic device dynamically displays is the track formed by connecting the control nodes, and the center point coordinate changes continuously as the electronic device draws the dynamic track. When the electronic device detects that playback has reached a control node whose lens action is going straight, the rotation angle is 0 and the center point coordinate moves to the next control node at the corresponding speed. When the electronic device detects that playback has reached a control node whose lens action is rotation, it rotates the azimuth angle of the map state by the rotation angle corresponding to the control node, adjusting the display direction of the motion track, while the center point coordinate moves to the next control node at the corresponding speed. When the electronic device draws the control node whose lens action is stop, drawing ends and track playback is finished.
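The drawing loop described above can be sketched as a generator of map states; the names, the initial azimuth, and the per-leg interpolation count are illustrative, and actual rendering and speed control are omitted:

```python
def playback(nodes, actions, angles, steps_per_leg=4):
    """Minimal sketch of dynamic drawing: the map centre moves node to
    node; a 'rotate' node first turns the map azimuth by its angle, a
    'straight' node leaves the azimuth unchanged, and a 'stop' node ends
    playback. Each yielded tuple is one map state (x, y, azimuth)."""
    azimuth = 0.0
    for i, (node, action, angle) in enumerate(zip(nodes, actions, angles)):
        if action == "stop":
            yield (*node, azimuth)
            return                               # playback finished
        if action == "rotate":
            azimuth = (azimuth + angle) % 360    # turn the map view
        nxt = nodes[i + 1]
        for s in range(steps_per_leg):           # interpolate the centre
            t = s / steps_per_leg
            yield (node[0] + t * (nxt[0] - node[0]),
                   node[1] + t * (nxt[1] - node[1]),
                   azimuth)
```

Each yielded state would be handed to the map renderer as the new center point coordinate and azimuth angle.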
In the track playback method of this application, the control nodes of the track are first selected, and the similarity between the motion track and the initial motion track is controlled by the segment number K and the attenuation angle α. The intensity of the lens actions is controlled by the rotation suppression angle β, the rotation suppression distance Lβ and the forced rotation distance Ls. Because the lens actions and the motion track are controlled by different parameters, the track can be kept in the display area during dynamic playback while visual subjective discomfort is avoided, improving the stability of the map state.
In a possible implementation manner, a Point of interest (POI) is further included in the motion track played back by the electronic device. The POI may be a point preset by the electronic device, such as a mall, a bus station, or the like, and may also be a point detected by the electronic device according to a preset function, such as a point with the highest speed in the motion trajectory, a point with the highest heart rate, or the like.
The point of interest may be a point preset by the electronic device before the electronic device acquires the initial motion trajectory. For example, the points with the highest speed in the motion trajectory are all set as the interest points, and after the electronic device acquires the initial motion trajectory, the track points with the highest speed in the initial motion trajectory are acquired and determined as the interest points. For another example, the electronic device sets a specific building as the interest point, and after the electronic device acquires the initial motion trajectory, if the initial motion trajectory passes through the specific building, the trajectory point passing through the specific building is determined as the interest point.
The interest point may also be a point that the user selects on the initial motion track, together with a corresponding lens action, after the electronic device acquires the initial motion track.
The lens action corresponding to an interest point may be a lens action preset by the electronic device or a lens action selected by the user, and may include pause, zoom, and the like. Pause means that when the electronic device draws to an interest point whose lens action is pause, the map state is paused; the pause time may be 0.5 seconds. Zoom means that when the electronic device draws to an interest point whose lens action is zoom, the scale of the map display is zoomed, for example zooming in on the map when the interest point is reached and zooming back out to the previous scale when leaving the interest point.
The lens action corresponding to a point of interest may also be pause-and-zoom. Pause-and-zoom means that when playback reaches a point of interest whose lens action is pause-and-zoom, the map is zoomed in upon arrival and, after a 0.5-second pause, zoomed back out to the previous scale as playback leaves the point of interest.
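The pause and zoom behaviours described above can be sketched as a small state update. Only the action names and the 0.5-second pause come from the description; the class layout and the zoom factor of 2.0 are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class LensAction:
    pause: bool = False
    pause_seconds: float = 0.5  # pause duration from the description
    zoom: bool = False
    zoom_scale: float = 2.0     # assumed zoom-in factor

@dataclass
class MapState:
    scale: float = 1.0

def enter_poi(state: MapState, action: LensAction) -> float:
    """Apply a POI's lens action on arrival; return the seconds to pause."""
    if action.zoom:
        state.scale *= action.zoom_scale  # zoom in on arrival
    return action.pause_seconds if action.pause else 0.0

def leave_poi(state: MapState, action: LensAction) -> None:
    """Restore the previous map scale when playback leaves the POI."""
    if action.zoom:
        state.scale /= action.zoom_scale

# A pause-and-zoom POI: zoom in, hold for `wait` seconds, then zoom back out
state = MapState()
wait = enter_poi(state, LensAction(pause=True, zoom=True))
```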
In this embodiment, after the electronic device determines the control node, the lens action of the control node, and the point of interest, the motion trajectory is dynamically played back according to the control node, the lens action of the control node, and the point of interest.
According to this embodiment of the application, the electronic device displays a track playback icon in the display area, receives a user operation on the track playback icon, and dynamically plays back the motion track in the display area.
When the electronic device detects that playback has reached a control node whose lens action is straight, the rotation angle of the map state is 0, and the center-point coordinate moves toward the next control node at the corresponding speed. When it reaches a control node whose lens action is rotate, the azimuth of the map state is rotated by the angle corresponding to that control node, the display direction of the motion track is adjusted, and the center-point coordinate simultaneously moves toward the next control node at the corresponding speed. When playback reaches a point of interest whose lens action is pause, the map state is paused; the pause time may be 0.5 seconds, which is not limited herein. When playback reaches a control node whose lens action is stop, playback ends and the motion track has been fully played back.
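The playback dispatch described above can be sketched as a simple loop over control nodes. The node structure (dicts with an `"action"` key) and the helper names are assumptions for illustration, not the patent's actual implementation.

```python
import time

def play_back(nodes):
    """Play back a track given control nodes: dicts with 'action' and,
    for rotate, an 'angle' in degrees. Returns the final azimuth."""
    azimuth = 0.0
    for node in nodes:
        action = node["action"]
        if action == "straight":
            pass                              # azimuth unchanged, keep moving
        elif action == "rotate":
            azimuth += node["angle"]          # rotate the map azimuth
        elif action == "pause":
            time.sleep(node.get("seconds", 0.5))  # hold the map state
        elif action == "stop":
            break                             # end of playback
        # (a real implementation would also move the map center here)
    return azimuth

nodes = [{"action": "straight"},
         {"action": "rotate", "angle": 30.0},
         {"action": "stop"},
         {"action": "rotate", "angle": 99.0}]  # never reached after stop
final_azimuth = play_back(nodes)
```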
Optionally, the display area of the electronic device may further include a text input box. The text input box is used to receive text data entered by the user, such as time information and place information. Upon receiving time information entered by the user, the electronic device can play back the motion track within that time period; upon receiving place information, it can play back a motion track passing through that place.
Optionally, the display area of the electronic device may further include a voice input box. The voice input box may receive voice information input by a user. The electronic equipment receives voice information input by a user, and plays back a corresponding motion track by recognizing keywords in the voice information.
Optionally, the display area of the electronic device may further include a track list. The track list includes one or more motion tracks; when the electronic device receives a click operation by the user on one of the motion tracks, it plays back that track.
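Selecting a track by a user-entered time period, as described above, can be sketched as a simple overlap filter. The tuple layout `(start, stop, track)` is an assumption for illustration.

```python
from datetime import datetime

def tracks_in_period(tracks, begin, end):
    """Return tracks whose recording interval overlaps [begin, end]."""
    return [t for (start, stop, t) in tracks
            if start <= end and stop >= begin]

# Two hypothetical recorded runs
morning = (datetime(2020, 4, 23, 8), datetime(2020, 4, 23, 9), "morning run")
evening = (datetime(2020, 4, 23, 19), datetime(2020, 4, 23, 20), "evening run")
hits = tracks_in_period([morning, evening],
                        datetime(2020, 4, 23, 7),
                        datetime(2020, 4, 23, 10))
```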
Next, the application interface for rendering dynamic track playback in this application is described. As shown in fig. 6, fig. 6 is an exemplary interface display diagram showing the two lens actions of straight and rotate. Diagrams a to b in fig. 6 show the track playback interface display when the lens action is straight, and diagrams b to c in fig. 6 show the track playback interface display when the lens action is rotate.
As shown in diagram a in fig. 6, diagram a includes a trajectory line 601, a position point 602, a distance display area 603, and a speed display area 604. Wherein:
the trajectory line 601 describes the motion process of the trajectory during the dynamic trajectory playback process.
When the position point 602 reaches a control node, the map state is controlled according to the lens action corresponding to that control node. In this application, the position point 602 is the center point of the display area; the center-point coordinate represented by the position point 602 changes according to the control nodes and their lens actions, so the center-point coordinate of the map state changes correspondingly.
The distance display area 603 displays the physical distance traveled from the starting point of the motion track to the current position point 602.
The speed display area 604 displays the movement speed of the current position point 602 along the motion track.
As shown in diagram a in fig. 6, the distance in the distance display area 603 is 1.90 km, and the movement pace in the speed display area 604 is 6 minutes 31 seconds per km.
When the rotation included angle of the control node reached by the position point 602 is not greater than the first threshold, or the physical distance between the control node and the next control node is not greater than the second threshold, the map state is the straight-traveling state. As shown in diagram b in fig. 6, the distance in the distance display area 603 is 1.98 km; the coordinates of the center point of the map change between diagram a and diagram b in fig. 6, while the azimuth does not change. That is, the map state remains the straight-traveling state from diagram a to diagram b in fig. 6.
When the rotation included angle of the control node reached by the position point 602 is greater than the first threshold, or the physical distance between the control node and the next control node is greater than the second threshold, or the physical distance between the control node and the next control node is greater than the fourth threshold, the map state is the rotation state. As shown in diagram c in fig. 6, the distance in the distance display area 603 is 2.05 km; the coordinates of the center point of the map change between diagram b and diagram c in fig. 6, and the azimuth also changes. That is, the map state is the rotating-while-advancing state from diagram b to diagram c in fig. 6.
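The straight/rotate decision can be sketched as the comparison below. The threshold values are illustrative assumptions; only the comparison logic follows the description of the straight-traveling condition (small angle, or short distance to the next control node), with the rotation state as its complement.

```python
# Assumed threshold values for illustration only
FIRST_THRESHOLD = 15.0   # degrees
SECOND_THRESHOLD = 50.0  # metres

def map_state_at(rotation_angle_deg: float, dist_to_next_m: float) -> str:
    """Return 'straight' or 'rotate' for the map state at a control node."""
    if (rotation_angle_deg <= FIRST_THRESHOLD
            or dist_to_next_m <= SECOND_THRESHOLD):
        return "straight"  # small turn or nodes too close: keep heading
    return "rotate"        # large turn with enough distance: rotate the map
```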
In this embodiment of the application, the electronic device may receive initial motion track data acquired by another electronic device, process it, and perform track playback. For example, the user triggers track playback on the electronic device; after receiving the user operation, the electronic device sends a track playback request to the server and receives the initial motion track sent by the server. The electronic device determines the control nodes of the motion track and their lens actions according to the received initial motion track, and performs track playback according to the control nodes and their lens actions. For the method by which the electronic device determines the control nodes and their lens actions, reference may be made to the foregoing embodiments; details are not described again here.
Alternatively, the electronic device may perform track playback by receiving initial motion track data acquired and processed by another electronic device. For example, the user triggers track playback on the electronic device; after receiving the user operation for starting track playback, the electronic device sends a track playback request to the server and receives the control nodes and their lens actions sent by the server. The electronic device then performs track playback according to the control nodes and their lens actions. For the method by which the server determines the control nodes and their lens actions, reference may be made to the foregoing embodiments; details are not repeated here.
Optionally, the electronic device obtains the initial motion trajectory data of the electronic device from the cloud server according to a manner of logging in the unique identity account, and processes and plays back the initial motion trajectory data obtained from the cloud server.
The track playback method provided by the embodiment of the application can also be applied to the field of map navigation.
Specifically, the terminal device obtains the start position and end position selected by the user and sends them to the server. The server obtains a route between the two positions according to the start and end positions, calculates the control nodes in the route and their lens actions by combining the planned route with the track playback method provided in this application, and sends them to the terminal device. The terminal device dynamically displays the recommended route according to the data information sent by the server.
Optionally, the terminal device obtains the start position and end position selected by the user, obtains a route between the two positions according to the start and end positions, and calculates the control nodes in the route and their lens actions by combining the planned route with the track playback method provided in this application. The terminal device then dynamically displays the recommended route according to the calculated control nodes and their lens actions.
For the convenience of understanding the embodiment of the present application, an electronic device to which the embodiment of the present application is applied will be described by taking the electronic device 100 shown in fig. 7 as an example.
Fig. 7 shows a schematic structural diagram of an exemplary electronic device 100 provided in an embodiment of the present application.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 151, a battery 152, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display 195, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than illustrated, or some components may be combined, some components may be separated, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, a Micro Control Unit (MCU), and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. In some embodiments, the electronic device 100 may also include one or more processors 110.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
In this application, the processor 110 may be configured to extract a control node in the motion trajectory and then determine a rotation angle of the control node according to a preset rule, and the processor 110 draws the motion trajectory according to the control node and the rotation angle of the control node. When the processor 110 detects that the motion trajectory is played back to the control node, the display direction of the motion trajectory is adjusted according to the rotation angle corresponding to the control node.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the electronic device 100.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 195, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. Processor 110 and display screen 195 communicate through the DSI interface to implement display functions of electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 195, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, to transmit data between the electronic device 100 and a peripheral device, or to connect earphones and play audio through them. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
In some embodiments, the wireless communication solution provided by the mobile communication module 150 may enable the electronic device to communicate with a device (e.g., a server) in a network, and the WLAN wireless communication solution provided by the wireless communication module 160 may also enable the electronic device to communicate with a device (e.g., a server) in a network and, through that device, with a cloud device. In this way, the electronic device can discover the cloud device and transmit data to it.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then passed to the application processor. The application processor outputs an audio signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display 195. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves. Illustratively, the wireless communication module 160 may include a Bluetooth module, a Wi-Fi module, and the like.
The electronic device 100 may implement display functions via a GPU, a display screen 195, and an application processor, among others. The GPU is a microprocessor for image processing, coupled to a display screen 195 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute instructions to generate or change display information.
The display screen 195 is used to display images, video, and the like. The display screen 195 includes a display panel. The display panel may adopt a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, electronic device 100 may include 1 or N display screens 195, with N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, data such as music, photos, video, etc. are stored in an external memory card.
Internal memory 121 may be used to store one or more computer programs, including instructions. The processor 110 may execute the above-mentioned instructions stored in the internal memory 121, so as to enable the electronic device 100 to perform the data sharing method provided in some embodiments of the present application, and various functional applications and data processing. The internal memory 121 may include a program storage area and a data storage area. Wherein, the storage program area can store an operating system; the storage area may also store one or more application programs (e.g., gallery, contacts, etc.), etc. The storage data area may store data (e.g., photos, contacts, etc.) created during use of the electronic device 100. In addition, the internal memory 121 may include a high speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal for output, and also used to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a hands-free call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mic," is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking with the mouth close to it. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording, and so on.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 195. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 195, the electronic apparatus 100 detects the intensity of the touch operation based on the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position based on the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon flipping open are then set according to the detected open or closed state of the holster or of the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is at rest. The method can also be used for recognizing the posture of the electronic equipment, and is applied to horizontal and vertical screen switching, pedometers and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, taking a picture of a scene, electronic device 100 may utilize range sensor 180F to range for fast focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light to the outside through the light emitting diode. The electronic device 100 detects infrared reflected light from a nearby object using a photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100. The electronic device 100 can detect that the user holds the electronic device 100 close to the ear by using the proximity light sensor 180G, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 195 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint-triggered photographing, fingerprint-based call answering, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 152 to avoid abnormal shutdown due to low temperature. In still other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 152 to avoid abnormal shutdown due to low temperature.
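The threshold-based temperature processing strategy described above can be sketched as follows; the threshold values and action names are illustrative assumptions, not values specified by this application.

```python
def thermal_policy(temp_c, perf_limit=45.0, heat_below=0.0, boost_below=-10.0):
    """Map a temperature reading (deg C) to the actions described above:
    throttle a nearby processor when hot, heat the battery when cold,
    and additionally boost its output voltage when very cold."""
    actions = []
    if temp_c > perf_limit:
        actions.append("reduce_processor_performance")  # thermal protection
    if temp_c < heat_below:
        actions.append("heat_battery")                  # avoid cold shutdown
    if temp_c < boost_below:
        actions.append("boost_battery_output_voltage")  # very cold: also boost
    return actions
```

Note that the two cold-temperature actions stack: a reading below the lowest threshold triggers both battery heating and voltage boosting.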
The touch sensor 180K may also be referred to as a touch panel or touch-sensitive surface. The touch sensor 180K may be disposed on the display screen 195; together they form what is commonly called a touchscreen. The touch sensor 180K is used to detect a touch operation applied on or near it, and may pass the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 195. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100 at a position different from that of the display screen 195.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of a vibrating bone mass of the human vocal part. The bone conduction sensor 180M may also contact the human pulse to receive a blood-pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may parse out a voice signal based on the vocal-part bone-mass vibration signal acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor may parse heart-rate information based on the blood-pressure pulsation signal acquired by the bone conduction sensor 180M, so as to implement a heart-rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key input and generate key signal input related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for an incoming-call vibration prompt as well as touch vibration feedback. For example, touch operations applied in different applications (e.g., photographing, audio playing) may correspond to different vibration feedback effects. The motor 191 may also produce different vibration feedback effects for touch operations applied to different areas of the display screen 195. Different application scenarios (e.g., time reminders, message receipt, alarm clocks, games) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, and may be used to indicate a charging state or a change in battery level, or to indicate a message, a missed call, a notification, and the like.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be attached to or detached from the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a standard SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 195 at the same time; the cards may be of the same type or of different types. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
The electronic device 100 exemplarily shown in fig. 7 may display various user interfaces through the display screen 195. The electronic device 100 may detect a touch operation in each user interface through the touch sensor 180K, such as a click operation in each user interface (e.g., a touch operation on an icon, a double-click operation), an upward or downward sliding operation in each user interface, or an operation of performing a circle-drawing gesture, and so on. In some embodiments, the electronic device 100 may detect a motion gesture performed by the user holding the electronic device 100, such as shaking the electronic device, via the gyroscope sensor 180B, the acceleration sensor 180E, and/or the like. In some embodiments, the electronic device 100 may detect non-touch gesture operations through the camera 193 (e.g., 3D camera, depth camera).
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 100.
Fig. 8 is a block diagram of a software configuration of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, an application layer, an application framework layer, the Android runtime and system libraries, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 8, the application packages may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 8, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, capture the screen, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and answered, browsing history and bookmarks, phone books, and the like.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to construct an application. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables applications to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction, for example, a notification of a completed download or a message alert. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications that appear on the screen as a dialog window. It may, for example, prompt text information in the status bar, sound a prompt tone, vibrate the electronic device, or flash an indicator light.
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part consists of the functional interfaces that the Java language needs to call, and the other part is the Android core library.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide a fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files. The media library may support multiple audio and video encoding formats, such as MPEG5, H.265, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
The software system shown in fig. 8 involves applications that use the track playback capability (e.g., gallery, file manager), an instant sharing module that provides the sharing capability, a map navigation module that provides the positioning capability, an application framework layer that provides WLAN and Bluetooth services, and a kernel and underlying layers that provide the WLAN and Bluetooth capabilities and basic communication protocols.
The embodiment of the present application further provides a computer-readable storage medium. All or part of the procedures in the above method embodiments may be implemented by a computer program instructing relevant hardware; the program may be stored in the above computer storage medium and, when executed, may include the procedures of the above method embodiments. The computer-readable storage medium includes any medium that can store program code, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
As an alternative design, a computer-readable storage medium may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber-optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber-optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
The embodiment of the application also provides a computer program product. The methods described in the above method embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. If implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the above computer instructions are loaded and executed on a computer, the procedures or functions described in the above method embodiments are wholly or partially generated. The computer may be a general purpose computer, a special purpose computer, a computer network, a network appliance, a user device, or other programmable apparatus. The computer instructions may be stored in or transmitted over a computer-readable storage medium. The computer-readable storage medium can be any available medium that can be accessed by a computer or a data storage device, such as a server, a data center, etc., that incorporates one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
The steps in the method of the embodiment of the application can be sequentially adjusted, combined and deleted according to actual needs.
The modules in the device can be merged, divided and deleted according to actual needs.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (22)

1. A method of track playback, comprising:
the method comprises the steps that electronic equipment obtains a motion track to be played back, wherein the motion track comprises a plurality of control nodes;
the electronic equipment extracts a first control node and a second control node which are adjacent in the motion trail;
the electronic equipment determines the rotation angle of the first control node according to the first control node, the second control node and the first direction; wherein the second control node is subsequent to the first control node in the motion profile;
when the electronic equipment detects that the motion track is played back to the first control node, the electronic equipment adjusts the display direction of the motion track according to the rotation angle corresponding to the first control node.
2. The method according to claim 1, wherein the determining, by the electronic device, the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes:
when the electronic device detects that a first included angle formed by the first direction and a connecting line between the first control node and the second control node is smaller than or equal to a first threshold value, the electronic device determines that the rotation angle corresponding to the first control node is 0.
3. The method according to claim 1 or 2, wherein the determining, by the electronic device, the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes:
when the electronic equipment detects that a first included angle formed by the first direction and a connecting line between the first control node and the second control node is larger than a first threshold value, the electronic equipment determines that a rotation angle corresponding to the first control node is the first included angle.
4. The method according to claim 1, wherein the determining, by the electronic device, the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes:
when the electronic device detects that a first included angle formed by the first direction and a connecting line between the first control node and the second control node is smaller than or equal to a first threshold value, or the distance between the first control node and the second control node is smaller than or equal to a second threshold value, the electronic device determines that the rotation angle corresponding to the first control node is 0.
5. The method according to claim 1 or 4, wherein the determining, by the electronic device, the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes:
when the electronic device detects that a first included angle formed by the first direction and a connecting line between the first control node and the second control node is larger than a first threshold value, and the distance between the first control node and the second control node is larger than a second threshold value, the electronic device determines that a rotation angle corresponding to the first control node is the first included angle.
6. The method according to any one of claims 2 to 5, wherein the determining, by the electronic device, the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes:
when the first control node is the initial control node in the motion trail, and the electronic device detects that a first included angle formed by the first direction and the connecting line between the first control node and the second control node is larger than a third threshold, the electronic device determines that the rotation angle corresponding to the first control node is the first included angle; wherein the third threshold is less than the first threshold.
7. The method according to any one of claims 2 to 6, wherein the determining, by the electronic device, the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes:
when the electronic device detects that the length of the connecting line between the first control node and the second control node is larger than a fourth threshold, the electronic device determines that the rotation angle corresponding to the first control node is the first included angle.
8. The method of claim 1, wherein before the electronic device obtains the motion trajectory to be played back, the method further comprises:
the electronic device acquires an initial motion track of the electronic device;
the electronic device determines a plurality of track points from the initial motion track;
and the electronic device determines the plurality of control nodes from the plurality of track points through the position information of the plurality of track points.
9. The method according to claim 8, wherein the electronic device determines the plurality of control nodes from the plurality of track points through the position information of the plurality of track points, and specifically includes:
when the electronic device detects that an included angle formed by the line segments connecting a target track point in the plurality of track points with its two adjacent track points is larger than a fifth threshold, the electronic device determines that the target track point is a redundant node;
and the electronic device deletes the redundant nodes from the plurality of track points, and the track points remaining among the plurality of track points are the plurality of control nodes.
10. The method according to claim 8 or 9, wherein the electronic device obtains the motion trajectory to be played back, specifically comprising:
and the electronic equipment connects the plurality of control nodes into the motion trail.
11. An electronic device, comprising: one or more processors, memory, and a display screen;
the memory, the display screen, and the one or more processors are coupled, the memory for storing computer program code, the computer program code comprising computer instructions, the one or more processors to invoke the computer instructions to cause performance of:
obtaining a motion track to be played back, wherein the motion track comprises a plurality of control nodes;
extracting a first control node and a second control node which are adjacent in the motion trail;
determining a rotation angle of the first control node according to the first control node, the second control node and a first direction; wherein the second control node is subsequent to the first control node in the motion profile;
and when the motion track is detected to be played back to the first control node, adjusting the display direction of the motion track according to the rotating angle corresponding to the first control node.
12. The electronic device according to claim 11, wherein determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes:
and when detecting that a first included angle formed by the first direction and a connecting line between the first control node and the second control node is smaller than or equal to a first threshold value, determining that the rotation angle corresponding to the first control node is 0.
13. The electronic device according to claim 11 or 12, wherein determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes:
and when detecting that a first included angle formed by the first direction and a connecting line between the first control node and the second control node is larger than a first threshold value, determining that the rotating angle corresponding to the first control node is the first included angle.
14. The electronic device according to claim 11, wherein determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes:
when detecting that a first included angle formed by the first direction and a connecting line between the first control node and the second control node is smaller than or equal to a first threshold value, or the distance between the first control node and the second control node is smaller than or equal to a second threshold value, determining that the rotation angle corresponding to the first control node is 0.
15. The electronic device according to claim 11 or 14, wherein determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes:
when detecting that a first included angle formed by the first direction and a connecting line between the first control node and the second control node is larger than a first threshold value, and the distance between the first control node and the second control node is larger than a second threshold value, determining that a rotating angle corresponding to the first control node is the first included angle.
16. The electronic device according to any one of claims 12 to 15, wherein determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes:
when the first control node is the initial control node in the motion trail, and it is detected that a first included angle formed by the first direction and the connecting line between the first control node and the second control node is larger than a third threshold, determining that the rotation angle corresponding to the first control node is the first included angle; wherein the third threshold is less than the first threshold.
17. The electronic device according to any of claims 12-16, wherein determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes:
and when it is detected that the length of the connecting line between the first control node and the second control node is larger than a fourth threshold, determining that the rotation angle corresponding to the first control node is the first included angle.
18. The electronic device of claim 11, wherein before the motion trajectory to be played back is obtained, the one or more processors further invoke the computer instructions to cause performance of:
acquiring an initial motion track of the electronic equipment;
determining a plurality of track points from the initial motion track;
and determining the plurality of control nodes from the plurality of track points through the position information of the plurality of track points.
19. The electronic device according to claim 18, wherein determining the plurality of control nodes from the plurality of trace points through the position information of the plurality of trace points specifically includes:
when it is detected that an included angle formed by the line segments connecting a target track point in the plurality of track points with its two adjacent track points is larger than a fifth threshold, determining that the target track point is a redundant node;
and deleting the redundant nodes in the plurality of track points, wherein the rest track points in the plurality of track points are the plurality of control nodes.
20. The electronic device according to claim 18 or 19, wherein obtaining the motion trajectory to be played back specifically comprises:
and connecting the plurality of control nodes into the motion trail.
21. A computer storage medium comprising computer instructions that, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-10.
22. A computer program product, characterized in that it causes a computer to carry out the method according to any one of claims 1 to 10 when said computer program product is run on the computer.
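As a loose sketch of the claimed logic, the code below combines the redundant-node filter of claims 8 and 9 (drop a track point when the angle formed at it by the segments to its two neighbours exceeds a threshold, i.e. the three points are nearly collinear) with the rotation-angle rule of claims 1 through 5 (rotate the display at a control node only when the heading to the next control node deviates enough from the first direction and the two nodes are far enough apart). The coordinate convention, the "up" reference direction, and all threshold values are illustrative assumptions, not values fixed by the claims.

```python
import math

def filter_control_nodes(points, angle_threshold_deg=170.0):
    """Claims 8-9: keep only genuine turns as control nodes. A point is
    redundant when the angle at it, formed by the segments to its two
    neighbours, exceeds the threshold (nearly collinear)."""
    if len(points) < 3:
        return list(points)
    nodes = [points[0]]
    for prev, cur, nxt in zip(points, points[1:], points[2:]):
        a = math.atan2(prev[1] - cur[1], prev[0] - cur[0])
        b = math.atan2(nxt[1] - cur[1], nxt[0] - cur[0])
        angle = abs(math.degrees(a - b)) % 360.0
        angle = min(angle, 360.0 - angle)
        if angle <= angle_threshold_deg:   # a real turn: keep the node
            nodes.append(cur)
    nodes.append(points[-1])
    return nodes

def rotation_angle(first, second, first_direction_deg=0.0,
                   angle_threshold_deg=15.0, dist_threshold=5.0):
    """Claims 1-5: angle by which to rotate the displayed track when
    playback reaches `first`. Returns 0 when the heading to `second`
    is close to the reference direction or the nodes are too close."""
    dx, dy = second[0] - first[0], second[1] - first[1]
    heading = math.degrees(math.atan2(dx, dy))   # 0 deg = assumed "up" reference
    included = abs(heading - first_direction_deg) % 360.0
    included = min(included, 360.0 - included)
    if included <= angle_threshold_deg or math.hypot(dx, dy) <= dist_threshold:
        return 0.0          # small deviation or short hop: no rotation
    return included         # otherwise rotate by the first included angle
```

During playback, the display direction would be adjusted by `rotation_angle(node, next_node)` each time the track reaches a control node, as in claim 1's final step.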
CN202010329410.6A 2020-04-23 2020-04-23 Track playback method and device Active CN113554932B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010329410.6A CN113554932B (en) 2020-04-23 2020-04-23 Track playback method and device
PCT/CN2021/088805 WO2021213451A1 (en) 2020-04-23 2021-04-21 Track playback method, and related apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010329410.6A CN113554932B (en) 2020-04-23 2020-04-23 Track playback method and device

Publications (2)

Publication Number Publication Date
CN113554932A true CN113554932A (en) 2021-10-26
CN113554932B CN113554932B (en) 2022-07-19

Family

ID=78129459

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010329410.6A Active CN113554932B (en) 2020-04-23 2020-04-23 Track playback method and device

Country Status (2)

Country Link
CN (1) CN113554932B (en)
WO (1) WO2021213451A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114117878B (en) * 2021-11-29 2022-09-23 中国人民解放军国防科技大学 Target motion trajectory segmented compression method based on improved particle swarm optimization
CN116384209B (en) * 2023-05-30 2023-08-08 江苏伟岸纵横科技股份有限公司 Disaster simulation method for emergency simulation exercise

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005249589A (en) * 2004-03-04 2005-09-15 Xanavi Informatics Corp Navigation system, summary map distribution system, car guiding method and map display
CN1754084A (en) * 2003-02-26 2006-03-29 通腾有限责任公司 Navigation device and method for displaying analogue navigation data

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105719351B (en) * 2014-12-04 2018-09-28 AutoNavi Software Co., Ltd. Method and apparatus for displaying an electronic map
JP2018069753A (en) * 2016-10-24 2018-05-10 Aisin AW Co., Ltd. Driving condition display system and driving condition display program
US10830602B2 (en) * 2018-01-08 2020-11-10 Alpine Electronics, Inc. Systems and methods for providing direction guidance during off-road routing

Patent Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1754084A (en) * 2003-02-26 2006-03-29 TomTom B.V. Navigation device and method for displaying simulated navigation data
JP2005249589A (en) * 2004-03-04 2005-09-15 Xanavi Informatics Corp Navigation system, summary map distribution system, vehicle guidance method and map display
KR20070038859A (en) * 2005-10-07 2007-04-11 Hyundai Autonet Co., Ltd. Map display method of navigation system
KR20090056219A (en) * 2007-11-30 2009-06-03 LG Electronics Inc. Method and apparatus for displaying map in navigation
US20110208422A1 (en) * 2010-02-22 2011-08-25 Denso Corporation Trajectory display device
CN101901551A (en) * 2010-06-29 2010-12-01 Shanghai Yindi Information Technology Co., Ltd. Method for optimizing track playback function in vehicle monitoring system
US20130091472A1 (en) * 2010-07-22 2013-04-11 Sony Corporation Information processing apparatus, information processing method, and recording medium
CN103185586A (en) * 2011-12-30 2013-07-03 Shanghai Pateo Electronic Equipment Manufacturing Co., Ltd. Map display method, apparatus for controlling map display and navigation apparatus
CN104603576A (en) * 2012-08-30 2015-05-06 Mitsubishi Electric Corporation Navigation device
CN103927795A (en) * 2013-01-14 2014-07-16 Beijing Sinoiov Information Technology Co., Ltd. Method and system for replaying vehicle historical traveling track
US20160148418A1 (en) * 2013-07-23 2016-05-26 National ICT Australia Limited Geo-located activity visualisation, editing and sharing
CN103500516A (en) * 2013-09-26 2014-01-08 Shenzhen Hongdian Technology Co., Ltd. High-efficiency trace replay method and system based on electronic map
CN106528555A (en) * 2015-09-10 2017-03-22 Shanghai Advanced Research Institute, Chinese Academy of Sciences System for quickly constructing three-dimensional building model
CN107003132A (en) * 2015-09-30 2017-08-01 Huawei Technologies Co., Ltd. Dead reckoning-based calibration method and portable electronic device
CN108780329A (en) * 2016-02-29 2018-11-09 Microsoft Technology Licensing, LLC Vehicle trajectory determination for stabilizing video captured by the vehicle
CN107741944A (en) * 2017-08-09 2018-02-27 Chengdu Luxingtong Information Technology Co., Ltd. Electronic map simulated track playback method and system
CN108344422A (en) * 2018-02-09 2018-07-31 City Life (Beijing) Information Co., Ltd. Navigation method and system
CN109238283A (en) * 2018-08-24 2019-01-18 Guangdong Genius Technology Co., Ltd. Direction adjustment method, apparatus, device and storage medium
CN109959379A (en) * 2019-02-13 2019-07-02 Goertek Technology Co., Ltd. Positioning method and electronic device
CN110553651A (en) * 2019-09-26 2019-12-10 Zhonghu Internet of Things (Guangzhou) Co., Ltd. Indoor navigation method and device, terminal equipment and storage medium

Also Published As

Publication number Publication date
CN113554932B (en) 2022-07-19
WO2021213451A1 (en) 2021-10-28

Similar Documents

Publication Publication Date Title
CN110138959B (en) Method for displaying prompt of human-computer interaction instruction and electronic equipment
CN108829881B (en) Video title generation method and device
CN113645351B (en) Application interface interaction method, electronic device and computer-readable storage medium
CN111061912A (en) Method for processing video file and electronic equipment
CN110022489B (en) Video playing method, device and storage medium
CN113838490B (en) Video synthesis method and device, electronic equipment and storage medium
CN114461111B (en) Function starting method and electronic equipment
CN114173000B (en) Method, electronic equipment and system for replying message and storage medium
CN109819306B (en) Media file clipping method, electronic device and server
WO2021213451A1 (en) Track playback method, and related apparatus
CN116070035B (en) Data processing method and electronic equipment
CN112015943A (en) Humming recognition method and related equipment
CN112637477A (en) Image processing method and electronic equipment
CN116208704A (en) Sound processing method and device
CN111249728B (en) Image processing method, device and storage medium
CN114694646A (en) Voice interaction processing method and related device
CN113489895B (en) Method for determining recommended scene and electronic equipment
CN114995715A (en) Control method of floating ball and related device
CN116561085A (en) Picture sharing method and electronic equipment
CN111722896B (en) Animation playing method, device, terminal and computer readable storage medium
CN115706916A (en) Wi-Fi connection method and device based on position information
CN114911400A (en) Method for sharing pictures and electronic equipment
CN114079691A (en) Equipment identification method and related device
CN113542575A (en) Device pose adjusting method, image shooting method and electronic device
WO2023116669A1 (en) Video generation system and method, and related apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant