WO2021213451A1 - Trajectory playback method and related device - Google Patents

Trajectory playback method and related device

Info

Publication number
WO2021213451A1
Authority
WO
WIPO (PCT)
Prior art keywords
control node
electronic device
trajectory
threshold
rotation angle
Prior art date
Application number
PCT/CN2021/088805
Other languages
English (en)
French (fr)
Inventor
王俊岭
高延龙
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2021213451A1

Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 - Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/003 - Maps
    • G09B29/006 - Representation of non-cartographic information on maps, e.g. population distribution, wind direction, radiation levels, air and sea routes

Definitions

  • This application relates to the technical field of electronic map data processing, in particular to a track playback method and related devices.
  • GNSS Global Navigation Satellite System
  • Trajectory thinning technology can provide a simplified trajectory that is similar to the initial motion trajectory.
  • The simplified trajectory can be used to guide the change trajectory of the map state.
  • Here, the change trajectory of the map state refers to the trajectory traced by the real geographic coordinates corresponding to the center of the map display area.
  • However, trajectory playback has always faced two problems. First, if the similarity between the simplified trajectory and the initial motion trajectory is relatively low, the trajectory may run out of the map display area. Second, if the similarity between the simplified trajectory and the initial motion trajectory is relatively high, the map state changes too drastically, which causes visual dizziness and may cause subjective discomfort to the user.
  • the embodiments of the present application provide a trajectory playback method and related devices, which can stabilize the state change of the map while ensuring the similarity.
  • In a first aspect, this application provides a trajectory playback method, including:
  • The electronic device obtains a motion track to be played back, where the motion track includes a plurality of control nodes; the electronic device extracts an adjacent first control node and second control node from the motion track; the electronic device determines a rotation angle of the first control node according to the first control node, the second control node, and a first direction, where, in the motion track, the second control node comes after the first control node; and when the electronic device detects that the motion track has been played back to the first control node, the electronic device adjusts the display direction of the motion track according to the rotation angle corresponding to the first control node.
  • Each control node in the motion track corresponds to a rotation angle.
  • The electronic device determines the rotation angle of each control node in the motion track and, according to these rotation angles, dynamically plays back the motion track on its display screen.
  • The first direction is the current display direction of the motion track in the display area of the electronic device; it can be understood as the direction that points from the bottom of the display area perpendicularly toward the top of the display area.
  • the number of control nodes in the motion trajectory is related to the similarity between the motion trajectory and the initial motion trajectory, and the rotation angle of the control node in the motion trajectory is related to the stability of the electronic device playing back the motion trajectory.
  • the electronic device can control the number of control nodes and the rotation angle of the control nodes to stabilize the state change of the motion trajectory in the map while ensuring the similarity.
  • In a possible implementation, the electronic device determines the rotation angle of the first control node according to the first control node, the second control node, and the first direction, which specifically includes: when the electronic device detects that the first included angle formed between the first direction and the line connecting the first control node and the second control node is less than or equal to the first threshold, the electronic device determines that the rotation angle corresponding to the first control node is 0.
  • If the first included angle is not greater than the first threshold, it can be considered that the current display direction of the motion track in the display area does not affect the user's viewing in the first direction, so there is no need to rotate the display direction of the motion track.
  • In this way, the rotation frequency of the display direction of the motion track can be reduced, and the state change of the motion track in the map can be stabilized.
  • In a possible implementation, the electronic device determines the rotation angle of the first control node according to the first control node, the second control node, and the first direction, which specifically includes: when the electronic device detects that the first included angle formed between the first direction and the line connecting the first control node and the second control node is greater than the first threshold, the electronic device determines that the rotation angle corresponding to the first control node is the first included angle. In this way, if the first included angle is greater than the first threshold, the display direction of the motion track is rotated so that the display direction of the motion track becomes the first direction, allowing the user to watch the track playback in the first direction, which improves the user experience.
  • In a possible implementation, the electronic device determines the rotation angle of the first control node according to the first control node, the second control node, and the first direction, which specifically includes: when the electronic device detects that the first included angle formed between the first direction and the line connecting the first control node and the second control node is less than or equal to the first threshold, or that the distance between the first control node and the second control node is less than or equal to the second threshold, the electronic device determines that the rotation angle corresponding to the first control node is 0. That is, even if the first included angle is greater than the first threshold, when the physical distance between the control node and the next control node is short (less than or equal to the second threshold), the display direction of the motion track is not rotated. This avoids the situation where two control nodes are so close together that the display direction of the motion track changes twice within a short time, further improving the stability of the map state.
  • In a possible implementation, the electronic device determines the rotation angle of the first control node according to the first control node, the second control node, and the first direction, which specifically includes: when the electronic device detects that the first included angle formed between the first direction and the line connecting the first control node and the second control node is greater than the first threshold, and that the distance between the first control node and the second control node is greater than the second threshold, the electronic device determines that the rotation angle corresponding to the first control node is the first included angle. In this way, the first included angle and the distance between the two control nodes jointly determine whether to rotate the display direction of the motion track, which reduces the rotation frequency of the display direction and stabilizes the state change of the motion track in the map.
  • In a possible implementation, the electronic device determines the rotation angle of the first control node according to the first control node, the second control node, and the first direction, which specifically includes: when the first control node is the first control node in the motion track, and the electronic device detects that the first included angle formed between the first direction and the line connecting the first control node and the second control node is greater than the third threshold, the electronic device determines that the rotation angle corresponding to the first control node is the first included angle, where the third threshold is less than the first threshold. Because the third threshold is smaller than the first threshold, the first control node of the motion track triggers a rotation of the display direction more easily than other control nodes. In this way, from the very beginning of the dynamic track playback, the user can watch the track in the first direction.
  • In a possible implementation, the electronic device determines the rotation angle of the first control node according to the first control node, the second control node, and the first direction, which specifically includes: when the electronic device detects that the distance between the first control node and the second control node is greater than the fourth threshold, the electronic device determines that the rotation angle corresponding to the first control node is the first included angle.
  • This describes that when the control node is far from the next control node (farther than the fourth threshold), even if the first included angle does not meet the condition for rotating the display direction of the motion track, the display direction can still be rotated so that the current forward direction of the track is the first direction; this prevents the user from being unable to view the track in the first direction for a long time and improves the user experience. A sketch of these rotation-angle rules is given below.
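  • The rotation-angle rules above can be summarized compactly. The following is a minimal Python sketch of how a rotation angle could be chosen for one control node from the first included angle and the distance to the next control node; the function name, parameter names, and the concrete threshold values are illustrative assumptions and are not specified by this application.

```python
def rotation_angle(first_angle_deg, node_distance, is_head_node,
                   t1=60.0, t2=50.0, t3=45.0, t4=500.0):
    """Sketch of the rotation-angle rules (t1..t4 are the first..fourth
    thresholds; the numeric defaults are assumptions, with t3 < t1 as required).

    first_angle_deg: first included angle between the first (display) direction
                     and the line from the first to the second control node.
    node_distance:   distance between the first and the second control node.
    is_head_node:    True if this is the first control node of the motion track.
    """
    if is_head_node:
        # Head node: rotate whenever the included angle exceeds the smaller third threshold.
        return first_angle_deg if first_angle_deg > t3 else 0.0
    if node_distance > t4:
        # Forced rotation: the next control node is far away, so rotate regardless of the angle.
        return first_angle_deg
    if first_angle_deg > t1 and node_distance > t2:
        # Ordinary rotation: large included angle and enough spacing to the next control node.
        return first_angle_deg
    # Otherwise keep the current display direction (rotation angle 0).
    return 0.0
```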
  • In a possible implementation, before the electronic device acquires the motion track to be played back, the method further includes: the electronic device acquires an initial motion track of the electronic device; the electronic device determines multiple track points from the initial motion track; and the electronic device determines multiple control nodes from the multiple track points based on the position information of the multiple track points.
  • The manner in which the electronic device determines the multiple control nodes from the multiple track points may include taking the distance midpoint of two adjacent track points, taking the time midpoint of two adjacent track points, or taking a weighted average of the position coordinates of two adjacent track points.
  • In a possible implementation, the electronic device determines the multiple control nodes from the multiple track points based on the position information of the multiple track points, which specifically includes: when the electronic device detects that the angle formed by the line segments connecting a target track point with its two adjacent track points is greater than the fifth threshold, the electronic device judges the target track point to be a redundant node; the electronic device deletes the redundant node from the multiple track points, and the remaining track points among the multiple track points are the multiple control nodes.
  • Setting the fifth threshold in this way can effectively delete redundant nodes. Since redundant nodes have little effect on trajectory similarity, for the same motion track, reducing redundant nodes reduces the number of rotations and improves the stability of map state changes.
  • the electronic device acquiring the movement track to be played back specifically includes: the electronic device connects multiple control nodes to form the movement track.
  • the similarity between the motion trajectory and the initial motion trajectory is related to the number of control nodes and the fifth threshold. By setting a reasonable fifth threshold, the electronic device can well control the trajectory similarity and the stability of the map state change.
  • In a second aspect, the present application provides an electronic device, which may include one or more processors, a memory, and a display screen. The memory and the display screen are coupled with the one or more processors; the memory is used to store computer program code, the computer program code includes computer instructions, and the one or more processors call the computer instructions to cause the electronic device to execute the following:
  • Obtaining a motion track to be played back, where the motion track includes multiple control nodes; extracting an adjacent first control node and second control node from the motion track; determining a rotation angle of the first control node according to the first control node, the second control node, and a first direction, where, in the motion track, the second control node comes after the first control node; and when it is detected that the motion track has been played back to the first control node, adjusting the display direction of the motion track according to the rotation angle corresponding to the first control node.
  • In a possible implementation, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction includes: when it is detected that the first included angle formed between the first direction and the line connecting the first control node and the second control node is less than or equal to the first threshold, determining that the rotation angle corresponding to the first control node is 0.
  • In a possible implementation, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction includes: when it is detected that the first included angle formed between the first direction and the line connecting the first control node and the second control node is greater than the first threshold, determining that the rotation angle corresponding to the first control node is the first included angle.
  • In a possible implementation, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction includes: when it is detected that the first included angle formed between the first direction and the line connecting the first control node and the second control node is less than or equal to the first threshold, or that the distance between the first control node and the second control node is less than or equal to the second threshold, determining that the rotation angle corresponding to the first control node is 0.
  • In a possible implementation, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction includes: when it is detected that the first included angle formed between the first direction and the line connecting the first control node and the second control node is greater than the first threshold, and that the distance between the first control node and the second control node is greater than the second threshold, determining that the rotation angle corresponding to the first control node is the first included angle.
  • In a possible implementation, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction specifically includes: when the first control node is the first control node in the motion track, and it is detected that the first included angle formed between the first direction and the line connecting the first control node and the second control node is greater than the third threshold, determining that the rotation angle corresponding to the first control node is the first included angle, where the third threshold is less than the first threshold.
  • In a possible implementation, determining the rotation angle of the first control node according to the first control node, the second control node, and the first direction includes: when it is detected that the distance between the first control node and the second control node is greater than the fourth threshold, determining that the rotation angle corresponding to the first control node is the first included angle.
  • In a possible implementation, before acquiring the motion track to be played back, the method further includes: acquiring an initial motion track of the electronic device; determining multiple track points from the initial motion track; and determining multiple control nodes from the multiple track points based on the position information of the multiple track points.
  • In a possible implementation, determining the multiple control nodes from the multiple track points based on the position information of the multiple track points specifically includes: when it is detected that the angle formed by the line segments connecting a target track point with its two adjacent track points is greater than the fifth threshold, judging the target track point to be a redundant node; and deleting the redundant node from the multiple track points, the remaining track points among the multiple track points being the multiple control nodes.
  • acquiring the motion track to be played back specifically includes: connecting multiple control nodes to form a motion track.
  • In a third aspect, an embodiment of the present application provides a trajectory playback system, including an electronic device and a server, where:
  • the server is used to obtain the motion trajectory to be played back, and the motion trajectory includes multiple control nodes;
  • the server is also used to extract the adjacent first control node and second control node in the motion track;
  • the server is further configured to determine the rotation angle of the first control node according to the first control node, the second control node, and the first direction; wherein, in the motion trajectory, the second control node is after the first control node;
  • the server is also used to send the motion track and the rotation angle to the electronic device
  • the electronic device is used to adjust the display direction of the movement track according to the rotation angle corresponding to the first control node when it is detected that the movement track is played back to the first control node.
  • the server is specifically configured to: when detecting that the first angle formed by the connection between the first direction and the first control node and the second control node is less than or equal to the first threshold, It is determined that the rotation angle corresponding to the first control node is 0.
  • In a possible implementation, the server is specifically configured to: when detecting that the first included angle formed between the first direction and the line connecting the first control node and the second control node is greater than the first threshold, determine that the rotation angle corresponding to the first control node is the first included angle.
  • the server is specifically configured to: when detecting that the first angle formed by the connection between the first direction and the first control node and the second control node is less than or equal to the first threshold, Or the distance between the first control node and the second control node is less than or equal to the second threshold, and it is determined that the rotation angle corresponding to the first control node is zero.
  • In a possible implementation, the server is specifically configured to: when detecting that the first included angle formed between the first direction and the line connecting the first control node and the second control node is greater than the first threshold, and that the distance between the first control node and the second control node is greater than the second threshold, determine that the rotation angle corresponding to the first control node is the first included angle.
  • the server is specifically configured to: when the first control node is the first control node in the motion track, when detecting the connection between the first direction and the first control node and the second control node The first included angle is greater than the third threshold, and the rotation angle corresponding to the first control node is determined to be the first included angle; wherein, the third threshold is less than the first threshold.
  • In a possible implementation, the server is specifically configured to: when detecting that the distance between the first control node and the second control node is greater than the fourth threshold, determine that the rotation angle corresponding to the first control node is the first included angle.
  • In a possible implementation, the server is further configured to: obtain an initial motion track of the electronic device before acquiring the motion track to be played back; determine multiple track points from the initial motion track; and determine multiple control nodes from the multiple track points based on the position information of the multiple track points.
  • In a possible implementation, the server is specifically configured to: when detecting that the angle formed by the line segments connecting a target track point with its two adjacent track points is greater than the fifth threshold, determine that the target track point is a redundant node; and delete the redundant node from the multiple track points, the remaining track points among the multiple track points being the multiple control nodes.
  • the server is specifically used to connect multiple control nodes to form a motion track.
  • In a fourth aspect, an embodiment of the present application provides a computer storage medium, including computer instructions, which, when run on an electronic device, cause the electronic device to execute the trajectory playback method in any one of the possible implementations of the first aspect.
  • In a fifth aspect, the embodiments of the present application provide a computer program product, which, when run on a computer, causes the computer to execute the trajectory playback method in any one of the possible implementations of any of the foregoing aspects.
  • The electronic device provided in the second aspect, the system provided in the third aspect, the computer storage medium provided in the fourth aspect, and the computer program product provided in the fifth aspect are all used to execute the trajectory playback method provided in the first aspect. Therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the method provided in the first aspect, which are not repeated here.
  • FIG. 1 is a schematic diagram of center point coordinates, azimuth angles, and control nodes provided by an embodiment of the application.
  • FIG. 2 is a schematic diagram of a method for determining a control node provided by an embodiment of the application.
  • FIG. 3 is a schematic diagram of another method for determining a control node provided by an embodiment of the application.
  • FIG. 4 is a schematic diagram of determining a rotation included angle provided by an embodiment of the application.
  • FIG. 5 is a schematic diagram of a method for determining the lens action of a control node provided by an embodiment of the application.
  • FIG. 6 is an interface display diagram of a track playback method provided by an embodiment of the application.
  • FIG. 7 is a schematic structural diagram of an electronic device provided by an embodiment of the application.
  • FIG. 8 is a schematic diagram of a software architecture provided by an embodiment of the application.
  • The terms "first" and "second" are only used for descriptive purposes and cannot be understood as indicating or implying relative importance or implicitly specifying the number of indicated technical features. Therefore, a feature defined with "first" or "second" may explicitly or implicitly include one or more of these features. In the description of the embodiments of the present application, unless otherwise specified, "multiple" means two or more.
  • The electronic devices involved in the embodiments of this application may be mobile phones, tablet computers, desktop computers, laptop computers, notebook computers, Ultra-mobile Personal Computers (UMPC), servers (including cloud servers), handheld computers, netbooks, personal digital assistants (PDAs), wearable electronic devices (such as sports watches, smart bracelets, and smart watches), virtual reality devices, in-vehicle multimedia devices, unmanned aerial vehicles, aerial photography instruments, and other devices equipped with positioning data collection and/or processing capabilities.
  • Track playback: refers to the user selecting a certain time period and the electronic device then reproducing, on the map, the user's trajectory during that time period.
  • the user's trajectory in this time period includes a series of location points, and each location point may include information such as date, time, longitude, latitude, altitude information, and movement speed.
  • a location point may be referred to as a track point.
  • Map state: indicates the state of the map shown in the display area during track playback.
  • This application uses the coordinates of the center point of the map and the azimuth of the map to describe the map state.
  • the center point coordinates describe the physical coordinates of the center point of the map currently in the display area, and the map state moves accordingly with the movement of the center point coordinates.
  • The azimuth angle is shown in diagram b of Figure 1, which exemplarily shows the azimuth angle of the map state when the electronic device is in portrait (vertical screen) orientation.
  • The direction pointed to by arrow 1 indicates the physical true north direction, and the direction pointed to by arrow 2 indicates the direction that points from the bottom of the display area of the electronic device perpendicularly toward the top of the display area.
  • The azimuth angle is the minimum angle obtained by rotating the physical true north direction clockwise to the direction pointed to by arrow 2. For example, if the azimuth angle is 45 degrees, the viewing direction of the map (the direction pointed to by arrow 2) is 45 degrees east of north; if the azimuth angle is 90 degrees, the viewing direction of the map is due east.
  • Diagram c in Figure 1 exemplarily shows the azimuth angle of the map state when the electronic device is in landscape (horizontal screen) orientation.
  • The direction pointed to by arrow 3 indicates the physical true north direction, and the direction pointed to by arrow 4 indicates the direction that points from the bottom of the display area of the electronic device perpendicularly toward the top of the display area.
  • The azimuth angle is defined in the same way as in the portrait case.
  • The direction that points from the bottom of the display area of the electronic device perpendicularly toward the top of the display area may be referred to as the positive direction or the first direction.
  • Lens action (camera action): includes three basic actions: straight, rotate, and stop.
  • Straight represents a change of the map center point, rotate represents a change of the map azimuth angle, and stop represents the end of changes to the map state.
  • Motion trajectory refers to the trajectory formed by the coordinates of the center point of the map during trajectory playback.
  • This application expresses the motion trajectory as a finite number of straight lines, the intersection of adjacent straight lines and the beginning and end points of the motion trajectory are defined as control nodes, and the connection of multiple control nodes forms the motion trajectory, as shown in Figure 1 d .
  • each control node corresponds to a lens action, and the electronic device makes a corresponding lens action according to the control node, thereby dynamically drawing a movement track.
  • Thinning algorithm: refers to an algorithm that compresses a large number of redundant trajectory data points and extracts only the necessary data points; it approximates a curve as a series of points while reducing the number of points.
  • the thinning algorithm of the trajectory can provide a simplified trajectory similar to the initial motion trajectory, which can be used to guide the change trajectory of the map state.
  • the so-called change trajectory of the map state refers to the change trajectory of the real geographic coordinates corresponding to the coordinates of the center point of the map.
  • The existing trajectory playback method usually uses a thinning algorithm to provide a simplified trajectory similar to the initial trajectory. If the similarity between the simplified trajectory and the initial trajectory is relatively low, the map center point coordinates cannot keep up with the drawing speed of the trajectory, which causes the trajectory to run out of the map display area. Conversely, if the similarity between the simplified trajectory and the initial motion trajectory is relatively high, the map state changes too sharply, causing visual dizziness and possibly subjective discomfort to the user.
  • an embodiment of the present application proposes a trajectory playback method, which can stabilize the state change of the map while ensuring the similarity, and improve the user experience.
  • the electronic device first determines the control node of the trajectory, then determines the lens action of the control node, and dynamically draws the trajectory playback curve according to the control node and the lens action of the control node.
  • Step 1: Obtain the initial motion trajectory of the electronic device.
  • the electronic device obtains the initial motion trajectory of the electronic device, and the initial motion trajectory includes various types of trajectories such as walking trajectory, running trajectory, mountaineering trajectory, and amusement park trajectory.
  • the electronic device may obtain the initial motion trajectory locally, or may obtain the initial motion trajectory from other electronic devices (for example, a cloud server), which is not limited in this application.
  • the electronic device may obtain the initial motion trajectory within the time period selected by the user, or may obtain the initial motion trajectory between the two positions according to the start position and the end position selected by the user, which is not limited in this application.
  • the electronic device may be an initial movement track obtained according to a preset condition.
  • The preset condition can be, for example, the track with the longest continuous movement time of the electronic device, the track with the longest continuous movement distance of the electronic device, the track with the longest movement distance of the electronic device in a month or a year, and so on.
  • the preset condition may be set by the user or selected by the user, or may be automatically triggered in the electronic device according to a certain point in time and/or location.
  • For example, when the electronic device detects that the current location is a park, it automatically obtains the historical movement track of the electronic device in that park; when the electronic device detects that the current time is December 31, 2019, it automatically obtains the movement track of the electronic device with the longest distance in the past year (December 31, 2018 to December 31, 2019); and when the electronic device detects a user operation that triggers obtaining the movement track with the longest continuous movement time, it obtains the movement track with the longest continuous movement time of the electronic device.
  • Step 2: Calculate the track points of the initial motion trajectory.
  • A track point is defined as P i (x, y), where i represents the index of the track point within the initial trajectory and (x, y) represents the coordinates of the track point; the coordinates of a track point may be absolute coordinates (e.g., longitude and latitude) or relative coordinates.
  • Each track point can include date, time, longitude, latitude, altitude information, movement speed and other information.
  • After the electronic device obtains the initial motion trajectory of the electronic device, it divides the initial motion trajectory into K segments according to the number of segments K and the total distance L of the current initial motion trajectory, and calculates that the interval between every two adjacent track points is L/K; that is, one track point is taken every distance L/K.
  • the distance can be the actual geographic distance or the distance on the map.
  • the trajectory points may also be location points with the same straight line distance interval.
  • the linear distance between each track point and its adjacent track point is the same.
  • the straight-line distance can be the distance on the actual geographic location or the distance on the map.
  • the track points may also be location points with the same time interval.
  • the electronic device divides the initial motion trajectory into K segments according to the number of segments K and the total time T consumed by the current initial motion trajectory, and calculates that the segment interval is T/K, and every T/K time Take a track point in a segment.
  • the starting point and the end point of the initial motion trajectory are respectively a trajectory point, and there are K+1 trajectory points at this time.
  • the present application can also obtain track points according to other sampling algorithms, such as extracting the inflection point of the motion track as the track point, which is not limited in this application.
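  • As a concrete illustration of Step 2, the following Python sketch samples track points from a recorded trajectory at equal distance intervals (K segments, one point roughly every L/K, plus the start and end points). The data layout (a list of (x, y) coordinate pairs) and the helper names are assumptions made for illustration only.

```python
import math

def cumulative_distances(points):
    """Running distance along the recorded trajectory; points is a list of (x, y) pairs."""
    dists = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dists.append(dists[-1] + math.hypot(x1 - x0, y1 - y0))
    return dists

def sample_track_points(points, k):
    """Take K+1 track points spaced roughly every L/K along the trajectory (Step 2)."""
    dists = cumulative_distances(points)
    total = dists[-1]                       # total distance L
    step = total / k                        # segment interval L/K
    track_points, target, j = [points[0]], step, 0
    for _ in range(k - 1):
        # Advance to the first recorded point at or beyond the next L/K mark.
        while j < len(dists) - 1 and dists[j] < target:
            j += 1
        track_points.append(points[j])
        target += step
    track_points.append(points[-1])         # the start and end points are always track points
    return track_points
```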
  • Step 3: Generate the control node set of the initial motion trajectory according to the track points.
  • After the electronic device obtains the track points of the initial motion trajectory, it takes one control node between every two adjacent track points in a preset manner, and adds the first and last track points to form a control node set.
  • the following exemplarily introduces several possible preset methods.
  • Method one: the midpoint of the distance between two adjacent track points is taken as the control node. For example, if the distance between two adjacent track points is L/K, a control node is taken on the initial motion trajectory between the two adjacent track points such that its distance to each of them is L/2K.
  • Method two: the time midpoint of two adjacent track points is taken as the control node. For example, if the time interval between two adjacent track points is T/K, a control node is taken on the initial motion trajectory between the two adjacent track points such that the time interval between the control node and each of them is T/2K.
  • Method three: perform a weighted average of the position coordinates of two adjacent track points and use the resulting coordinate as the coordinate of the control node. For example, if the position coordinates of two adjacent track points are (x 1, y 1) and (x 2, y 2), the coordinates of the control node are taken as a weighted average of these two points; with equal weights this is the midpoint ((x 1 + x 2)/2, (y 1 + y 2)/2).
  • This application can also obtain control nodes according to other algorithms, for example by taking a weighted average of the position coordinates of three consecutive track points as a new control node coordinate; this is not limited here.
  • Figure 2 exemplarily shows the above process of acquiring the control node set. Diagram a in Figure 2 exemplarily shows an initial motion trajectory.
  • The electronic device then takes a track point at equal distance intervals; diagram b in Figure 2 exemplarily takes 10 track points, where the distance between every two adjacent track points (the actual geographic distance or the distance on the map) is the same.
  • The position coordinates of every two adjacent track points are then weighted and averaged, and each new coordinate is used as the coordinate of a control node; together with the first and last track points, the electronic device obtains 11 control nodes. A sketch of this step is given below.
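  • The following is a minimal Python sketch of Step 3 using method three above: one control node is placed at a weighted average of each pair of adjacent track points, and the head and tail track points are added. The weight value and function name are assumptions; with the default weight of 0.5 the control node is the coordinate midpoint, and 10 track points yield 11 control nodes as in the Figure 2 example.

```python
def control_nodes_from_track_points(track_points, w=0.5):
    """Step 3 sketch: one control node per pair of adjacent track points
    (a weighted average of their coordinates), plus the head and tail track points."""
    nodes = [track_points[0]]                         # head track point is kept as a node
    for (x1, y1), (x2, y2) in zip(track_points, track_points[1:]):
        nodes.append((w * x1 + (1 - w) * x2,          # weighted-average coordinate
                      w * y1 + (1 - w) * y2))
    nodes.append(track_points[-1])                    # tail track point is kept as a node
    return nodes

# For 10 track points this yields 9 intermediate nodes plus the head and tail,
# i.e. 11 control nodes, matching the Figure 2 example.
```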
  • Step 4: The electronic device connects the multiple control nodes to form a motion track.
  • Optionally, the control node set obtained in step 3 can be further screened for redundant control nodes, and the redundant control nodes can be deleted.
  • That is, after step 3 and before step 4, the method may further include deleting redundant control nodes from the above control node set to form a new control node set.
  • After the electronic device generates the control node set of the trajectory according to the track points, it judges the control nodes in the control node set in turn. If the angle formed by the line segments connecting a node with its two adjacent nodes is greater than the fifth threshold, the node is determined to be a redundant node. The first and last nodes are always kept as control nodes and do not need to be judged. After all control nodes have been judged in turn, a new control node set is formed.
  • Fig. 3 exemplarily shows the above-mentioned process of forming a new control node set.
  • Figure a in Figure 3 exemplarily shows the set of control nodes acquired in step three, including five control nodes A, B, C, D, and E.
  • the true north direction in FIG. 3 is the true north direction in the map.
  • the control node B is in the northeast direction of the control node A in the motion trajectory of the a diagram in FIG. 3.
  • The control nodes in the control node set are judged in turn until the last control node E; E is the tail node, so it is directly kept as a control node.
  • a new set of control nodes is formed, including three control nodes A, D, and E.
  • The above fifth threshold may be referred to as the attenuation angle.
  • By setting the attenuation angle, the above method of determining the control nodes of the trajectory can effectively delete redundant control points. Since redundant control nodes have little effect on trajectory similarity, for the same motion trajectory, reducing the number of redundant control points reduces lens movement and improves the stability of map state changes.
  • The smaller the attenuation angle, the more control nodes are screened out as redundant and the fewer control nodes remain.
  • Consequently, the similarity between the drawn trajectory and the actual trajectory becomes lower. Therefore, the number of segments K and the attenuation angle jointly determine the similarity between the drawn trajectory and the actual trajectory.
  • For example, the number of segments K can be set to about 10, and the attenuation angle can be set to about 160 degrees. A sketch of the redundant-node screening is given below.
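  • The redundant-node screening described above can be sketched in Python as follows: for each interior node, the angle formed at that node by the segments to its two adjacent nodes is computed, and the node is dropped when that angle exceeds the attenuation angle (about 160 degrees by default here). The angle computation and the function names are illustrative assumptions.

```python
import math

def angle_at(prev_pt, node, next_pt):
    """Angle in degrees (0..180) formed at `node` by the segments to its two neighbours."""
    v1 = (prev_pt[0] - node[0], prev_pt[1] - node[1])
    v2 = (next_pt[0] - node[0], next_pt[1] - node[1])
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    if n1 == 0 or n2 == 0:
        return 180.0                        # degenerate segment: treat the node as redundant
    cos_a = (v1[0] * v2[0] + v1[1] * v2[1]) / (n1 * n2)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

def thin_control_nodes(nodes, attenuation_angle=160.0):
    """Delete redundant nodes: an interior node whose angle exceeds the attenuation
    angle lies on a near-straight stretch and contributes little to similarity."""
    kept = [nodes[0]]                       # the head node is always kept
    for i in range(1, len(nodes) - 1):
        if angle_at(nodes[i - 1], nodes[i], nodes[i + 1]) <= attenuation_angle:
            kept.append(nodes[i])
    kept.append(nodes[-1])                  # the tail node is always kept
    return kept
```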
  • Optionally, different numbers of segments can be determined for different track lengths.
  • For example, the number of segments can be determined according to the scale (the ratio of the display distance to the actual distance) at which the motion track is displayed in the display area of the electronic device; the larger the scale, the larger the number of segments.
  • For example, if the actual distance of the motion track is 5 kilometers and the scale of the motion track in the display area of the electronic device is 1 cm : 1 km, the total length of the motion track displayed in the display area is 5 cm; 10 track points can then be taken for the motion track, and the number of segments is 9. If the scale of the motion track is 1 cm : 0.5 km (a larger scale), the total length of the motion track displayed in the display area is 10 cm; 20 track points can then be taken, and the number of segments is 19.
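  • The relationship between display scale and segment count in the example above could be captured as in the sketch below. The linear rule of roughly two track points per displayed centimetre is an assumption inferred from the two example values, not a formula stated in this application.

```python
def segment_count(actual_distance_km, scale_km_per_cm):
    """Assumed rule matching the example: about two track points per displayed
    centimetre, hence segments = track points - 1 (5 cm -> 9, 10 cm -> 19)."""
    displayed_cm = actual_distance_km / scale_km_per_cm
    track_points = max(2, round(2 * displayed_cm))
    return track_points - 1

# segment_count(5, 1.0) -> 9 segments (10 track points over a 5 cm track)
# segment_count(5, 0.5) -> 19 segments (20 track points over a 10 cm track)
```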
  • After the electronic device obtains the control node set of the trajectory, it determines the lens action of each control node in the control node set.
  • the rotation included angle of the control node is the included angle (acute angle) between the trajectory direction of the control node and the positive direction.
  • the positive direction is a direction along the vertical direction of the bottom of the display area of the electronic device pointing to the top of the display area.
  • Diagram a in Figure 4 exemplarily shows a section of a motion trajectory in the display area, including four control nodes A, B, C, and D.
  • The trajectory direction of A is the AB direction, the positive direction in diagram a of Figure 4 is vertically upward, and the rotation included angle of A is θ A. If the lens action of A is rotation, the azimuth angle of the map state of the display area changes, with the positive direction as the rotation axis and a rotation angle of θ A, as shown in diagram b of Figure 4.
  • After this rotation, the trajectory direction of B is the BC direction, the positive direction is the AB direction, and the rotation included angle of B is θ B. If the lens action of B is straight, the azimuth angle of the map state of the display area does not change and the positive direction remains unchanged.
  • The trajectory direction of C is the CD direction, the positive direction is still the AB direction, and the rotation included angle of C is θ C.
  • the rotation included angle may also be referred to as the first included angle.
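  • The rotation included angle (first included angle) of a control node can be computed from two azimuth angles, as in the sketch below: the azimuth of the node's trajectory direction and the azimuth of the current positive direction, both measured clockwise from true north. The function name is an assumption; the wrap-around rule matches the azimuth-difference rule used in the Figure 5 example later in this description.

```python
def included_angle(track_azimuth_deg, positive_azimuth_deg):
    """Included angle between the control node's trajectory direction and the
    current positive direction, both given as azimuths in degrees clockwise
    from true north. The result is in the range 0..180 degrees."""
    diff = abs(track_azimuth_deg - positive_azimuth_deg) % 360.0
    return 360.0 - diff if diff > 180.0 else diff

# Example: if the positive direction has azimuth 0 (true north) and the AB direction
# has azimuth 45, the rotation included angle of control node A is 45 degrees.
```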
  • Preset rule one: if the rotation included angle of the control node is greater than the first threshold, the lens action of the control node is determined to be rotation; if the rotation included angle of the control node is not greater than the first threshold, the lens action of the control node is determined to be straight.
  • Preset rule one determines the lens action of the control node by judging whether the rotation included angle is greater than the first threshold.
  • In this way, the rotation frequency of the map state can be reduced: the azimuth angle of the map state does not need to change every time the trajectory direction changes, which improves the stability of the map state.
  • The above first threshold may be referred to as the rotation suppression angle, which may be set at about 60 degrees.
  • Preset rule two: if the rotation included angle of the control node is greater than the first threshold and the physical distance between the control node and the next control node is greater than the second threshold, the lens action of the control node is determined to be rotation. If the rotation included angle of the control node is not greater than the first threshold, or the physical distance between the control node and the next control node is not greater than the second threshold, the lens action of the control node is determined to be straight.
  • That is, even when the rotation included angle of the control node is greater than the first threshold, if the physical distance between the control node and the next control node is short (not greater than the second threshold), the azimuth angle is not rotated.
  • Preset rule two jointly uses whether the rotation included angle is greater than the first threshold and whether the physical distance between the control node and the next control node is greater than the second threshold to determine the lens action of the control node. This avoids the situation where two control nodes are so close together that the lens action changes twice within a short period of time, further improving the stability of the map state.
  • The above second threshold may be referred to as the rotation suppression distance.
  • Preset rule three: if the physical distance between the control node and the next control node is greater than the fourth threshold, the lens action of the control node is determined to be rotation; that is, even if the rotation included angle of the control node does not meet the condition for rotating the azimuth angle, the azimuth angle can still be rotated.
  • Preset rule three describes that when the control node is far away from the next control node, in order to prevent the user from being unable to watch the trajectory in the positive direction for a long time, the azimuth angle can still be rotated so that the current forward direction of the trajectory is approximately the positive direction, which improves the user experience.
  • the aforementioned fourth threshold may be referred to as the forced rotation distance L s .
  • Preset rule four: when the control node is the head node, if the rotation included angle of the control node is greater than the third threshold, the lens action of the control node is determined to be rotation; if the rotation included angle of the control node is not greater than the third threshold, the lens action of the control node is determined to be straight.
  • Here, the third threshold is smaller than the above first threshold.
  • Preset rule four describes that when the control node is the head node, its lens action is determined by judging whether the rotation included angle is greater than the third threshold. Since the third threshold is smaller than the first threshold, the head node triggers a rotation of the azimuth angle more easily than other control nodes, so that from the very beginning of dynamic track playback the user can watch the track in the positive direction.
  • The above third threshold may be referred to as the first rotation suppression angle, which may be set at about 45 degrees.
  • the preset rules in this application may include but are not limited to one or more of the above preset rules.
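  • Putting preset rules one to four together, the lens action of each control node could be decided as in the Python sketch below, using the rotation suppression angle (about 60 degrees), the first rotation suppression angle (about 45 degrees), the rotation suppression distance, and the forced rotation distance L s. The function shape, the ordering of the rules, and the treatment of the tail node are assumptions; this application allows using any one or more of the rules.

```python
def lens_action(included_angle_deg, distance_to_next, is_head, is_tail,
                suppression_angle=60.0, head_suppression_angle=45.0,
                suppression_distance=None, forced_rotation_distance=None):
    """Return ('rotate', angle), ('straight', 0.0), or ('stop', 0.0) for a control node.

    included_angle_deg : rotation included angle at this control node
    distance_to_next   : physical distance to the next control node (None for the tail node)
    """
    if is_tail:
        return ('stop', 0.0)                        # tail node: playback stops here
    if is_head:
        # Preset rule four: the head node rotates more easily (smaller angle threshold).
        if included_angle_deg > head_suppression_angle:
            return ('rotate', included_angle_deg)
        return ('straight', 0.0)
    if forced_rotation_distance is not None and distance_to_next > forced_rotation_distance:
        # Preset rule three: forced rotation when the next control node is far away.
        return ('rotate', included_angle_deg)
    if included_angle_deg > suppression_angle and (
            suppression_distance is None or distance_to_next > suppression_distance):
        # Preset rules one and two: rotate only for a large included angle
        # (and, if a suppression distance is given, enough spacing to the next node).
        return ('rotate', included_angle_deg)
    return ('straight', 0.0)
```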
  • When the electronic device determines that the lens action of a control node is rotation, the rotation angle may be equal to the rotation included angle of the control node; to avoid rotating the map state by too large a degree, the rotation angle may also be smaller than the rotation included angle. When the electronic device determines that the lens action of a control node is straight, the rotation angle is 0.
  • Figure 5 exemplarily shows the process of determining the lens action of each control node in a trajectory. Diagram a in Figure 5 exemplarily shows a control node set including five control nodes A, B, C, D, and E. Each control node is analyzed in turn to determine its lens action.
  • the true north direction in FIG. 5 is the true north direction in the map.
  • the control node B is in the northeast direction of the control node A in the motion trajectory of the a diagram in FIG. 5.
  • For control node A (the head node), the azimuth angle α 0 of A (the angle between the AB direction and the true north direction) is obtained. According to preset rule four, if α 0 is greater than the third threshold, the lens action of control node A is determined to be rotation, where the maximum rotation angle is α 0.
  • For control node B, the rotation included angle of B is obtained as follows: if |α 2 - α 1| is not greater than 180 degrees, the rotation included angle is |α 2 - α 1|; if |α 2 - α 1| is greater than 180 degrees, the rotation included angle is 360 - |α 2 - α 1|. Here, α 1 is the azimuth angle of the nearest preceding rotating node in the forward direction (in this case, the azimuth angle of A), and α 2 is the azimuth angle of control node B (the angle between the BC direction and the true north direction). According to preset rule one, if the rotation included angle is not greater than the first threshold, the lens action of control node B is determined to be straight; α 1 then remains the azimuth angle of the positive direction for the next control node.
  • For control node C, the rotation included angle of C is obtained in the same way. According to preset rule two, if the rotation included angle of C is greater than the first threshold but the physical distance CD is not greater than the second threshold, the lens action of control node C is determined to be straight.
  • For control node D, the rotation included angle of D is obtained as θ 3 (the angle between the trajectory direction DE and the positive direction AB at this time). Because the physical distance DE is greater than the fourth threshold, the lens action of control node D is determined to be rotation, and the maximum rotation angle is θ 3.
  • E is the tail node, so the lens action of control node E is stop.
  • In this way, the stability of the map state change can be well controlled.
  • In addition, the rotation suppression angle may take different values for different trajectories.
  • For example, the electronic device calculates the diagonal of the rectangular box formed by the southwest and northeast corner coordinates of the trajectory, and the length of this diagonal is denoted L.
  • The rotation suppression distance and the forced rotation distance L s are determined by introducing the diagonal distance L of the rectangular box formed by the trajectory; accordingly, different motion trajectories have different rotation suppression distances and forced rotation distances.
  • The longer L is, the longer the rotation suppression distance and the forced rotation distance L s are, and the fewer rotations are performed on the map state; conversely, the shorter L is, the shorter the rotation suppression distance and the forced rotation distance L s are, and the more rotations are performed on the map state.
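  • This description ties the rotation suppression distance and the forced rotation distance L s to the bounding-box diagonal but does not give exact proportions, so the sketch below uses assumed fractions purely for illustration; the only property taken from the description is that both distances grow with the diagonal.

```python
import math

def suppression_distances(southwest, northeast, suppression_frac=0.05, forced_frac=0.25):
    """Derive the rotation suppression distance and the forced rotation distance L_s
    from the trajectory's bounding-box diagonal. The fractions are assumptions."""
    diagonal = math.hypot(northeast[0] - southwest[0], northeast[1] - southwest[1])
    return suppression_frac * diagonal, forced_frac * diagonal
```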
  • the maximum rotation angle of the azimuth angle of the map state is the rotation angle of the control node.
  • the above describes the two processes of the electronic device determining the control node and determining the lens action of the control node. After the electronic device determines the control node and the lens action of the control node, a dynamic trajectory playback is drawn according to the control node and the lens action of the control node.
  • the trajectory dynamically displayed by the electronic device is a trajectory connected according to the control nodes, and the center point coordinates change continuously as the electronic device draws a dynamic trajectory.
  • When the electronic device detects that the motion trajectory has been played back to a control node whose lens action is straight, the rotation angle is 0 and the center point coordinates move toward the next control node at the corresponding speed. When the electronic device detects that the motion trajectory has been played back to a control node whose lens action is rotation, the azimuth angle of the map state is rotated according to the rotation angle corresponding to that control node and the display direction of the motion track is adjusted, while the center point coordinates move toward the next control node at the corresponding speed.
  • In the above trajectory playback method, the control nodes of the trajectory are first selected, and the similarity between the motion trajectory and the initial motion trajectory is controlled by the number of segments K and the attenuation angle.
  • The rotation suppression angle, the rotation suppression distance, and the forced rotation distance L s then control the severity of the lens actions.
  • Because the lens actions and the motion trajectory are controlled by different parameters, the trajectory can be kept in the display area during dynamic playback without causing subjective visual discomfort, which improves the stability of the map state. A sketch of a playback loop driven by the control nodes and their lens actions is given below.
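  • The dynamic playback described above could be driven by a loop like the one below: walk the control nodes in order, rotate the map azimuth at nodes whose lens action is rotation, move the center point coordinates onward otherwise, and stop at the tail node. The map-control callbacks are hypothetical placeholders for whatever map interface the electronic device actually uses, and are not part of this application.

```python
def play_back(control_nodes, actions, rotate_azimuth, move_center, stop_playback):
    """Sketch of a playback driver.

    control_nodes : list of (x, y) control-node coordinates
    actions       : list of ('rotate', angle) / ('straight', 0.0) / ('stop', 0.0),
                    one entry per control node (see the lens_action sketch above)
    rotate_azimuth, move_center, stop_playback : hypothetical map-control callbacks
    """
    for node, (kind, angle) in zip(control_nodes, actions):
        if kind == 'stop':
            stop_playback()                 # tail node: end of track playback
            break
        if kind == 'rotate':
            rotate_azimuth(angle)           # adjust the display direction first
        move_center(node)                   # then move the center point toward the next node
```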
  • In some embodiments, the motion track played back by the electronic device further includes a point of interest (POI).
  • POI can be a point preset by an electronic device, such as a shopping mall, a bus stop, etc., or a point detected by the electronic device according to a preset function, such as the fastest point in the motion track, the fastest heart rate point, and so on.
  • the point of interest may be a point preset by the electronic device before the electronic device acquires the initial motion track.
  • the fastest points in the motion trajectory are set as points of interest, and after the electronic device obtains the initial motion trajectory, the fastest trajectory points in the initial motion trajectory are acquired and determined as the points of interest.
  • the electronic device sets a specific building as a point of interest. After the electronic device obtains the initial motion track, if the initial motion track passes the specific building, the track point passing the specific building is determined as the point of interest.
  • the point of interest may also be that after the electronic device obtains the initial motion track, the user selects the point of interest and the lens action corresponding to the point of interest on the initial motion track.
  • the lens action corresponding to the point of interest may be a lens action preset by the electronic device, or a lens action selected by the user.
  • the camera action corresponding to the point of interest can include pause, zoom, and so on.
  • Pause means that when the electronic device draws to the point of interest where the lens action is paused, the map state is paused, and the pause time can be 0.5 seconds;
  • Zoom means that when the electronic device draws to a point of interest whose lens action is zoom, the display scale of the map is zoomed, for example zooming in on the map when reaching the point of interest and zooming back out to the previous scale when leaving the point of interest.
  • the camera action corresponding to the point of interest can also be pause and zoom.
  • Pause and zoom means that when the electronic device draws to the point of interest where the lens action is paused and zoomed, it will zoom in on the map when it reaches the point of interest, and after 0.5 seconds of pause, leave the point of interest and zoom out to the previous scale.
  • the motion track is dynamically played back according to the control node, the lens action of the control node and the point of interest.
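  • The point-of-interest lens actions (pause, zoom, or pause and zoom) could be applied during playback as in the sketch below. The 0.5-second pause and the zoom-in/zoom-out behaviour follow the description above, while the callback name and the zoom factor are assumptions.

```python
import time

def handle_poi(poi_action, zoom_map, pause_seconds=0.5, zoom_factor=2.0):
    """Apply a point-of-interest lens action during track playback.

    poi_action : 'pause', 'zoom', or 'pause_and_zoom'
    zoom_map   : hypothetical callback that scales the map display by a factor
    """
    if poi_action in ('zoom', 'pause_and_zoom'):
        zoom_map(zoom_factor)               # zoom in when the point of interest is reached
    if poi_action in ('pause', 'pause_and_zoom'):
        time.sleep(pause_seconds)           # pause the map state (0.5 s in the description)
    if poi_action in ('zoom', 'pause_and_zoom'):
        zoom_map(1.0 / zoom_factor)         # zoom back out when leaving the point of interest
```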
  • The electronic device displays a track playback icon in the display area; when the electronic device receives a user operation on the track playback icon, it dynamically plays back the motion track in the display area.
  • When the electronic device detects a control node whose lens action is straight, the rotation angle of the map state is 0 and the center point coordinates move toward the next control node at the corresponding speed. When the electronic device detects a control node whose lens action is rotation, it rotates the azimuth angle of the map state according to the rotation angle corresponding to that control node and adjusts the display direction of the motion track, while the center point coordinates move toward the next control node at the corresponding speed. When the electronic device detects a point of interest whose lens action is pause, the map state is paused; the pause time can be 0.5 seconds, which is not limited here. When the electronic device draws to the control node whose lens action is stop, the playback ends and the playback of the motion track is complete.
  • the display area of the electronic device may also include a text input box.
  • the text input box is used to receive text data input by the user, such as time information and location information input by the user.
  • When the electronic device receives time information input by the user, it can play back the movement track within the time period indicated by the time information; when the electronic device receives location information input by the user, it can play back the movement track passing through the location indicated by the location information.
  • the display area of the electronic device may also include a voice input box.
  • the voice input box can receive voice information input by the user.
  • the electronic device receives the voice information input by the user, recognizes the keywords in the voice information, and plays back the corresponding movement track.
  • the display area of the electronic device may also include a track list.
  • the trajectory list includes one or more motion trajectories, and the electronic device receives a user's click operation on a certain motion trajectory and plays back the motion trajectory.
  • FIG. 6 exemplarily shows the interface display diagrams of two lens actions, straight and rotating.
  • diagrams a to b in FIG. 6 show the track playback interface display in which the lens action is straight.
  • diagrams b to c in FIG. 6 show the track playback interface display in which the lens action is rotation.
  • the display area of diagram a in FIG. 6 includes a trajectory line 601, a position point 602, a distance display area 603, and a speed display area 604.
  • the trajectory line 601 describes the movement process of the trajectory during the dynamic trajectory playback process.
  • the location point 602: when the location point reaches a control node, the state of the map is controlled according to the lens action corresponding to that control node.
  • the location point 602 is the center point of the display area, and the center point coordinates represented by the location point 602 are changed according to the control node and the lens action of the control node, and the center point coordinates of the map state are also changed accordingly.
  • the distance display area 603 is used to display the physical distance of the current position point 602 from the starting point of the motion track.
  • the speed display area 604 is used to display the movement speed of the current position point 602 in the movement track.
  • in diagram a of FIG. 6, the distance in the distance display area 603 is 1.90 kilometers, and the movement speed in the speed display area 604 is 6 minutes and 31 seconds per kilometer.
  • from diagram a to diagram b of FIG. 6, the map state is straight.
  • in diagram b of FIG. 6, the distance in the distance display area 603 is 1.98 kilometers. It can be clearly seen that between diagram a and diagram b of FIG. 6 the center point coordinates of the map have changed while the azimuth has not changed; that is to say, the map state is always straight from diagram a to diagram b of FIG. 6.
  • from diagram b to diagram c of FIG. 6, the map state is the rotating state.
  • in diagram c of FIG. 6, the distance in the distance display area 603 is 2.05 kilometers. It can be clearly seen that between diagram b and diagram c of FIG. 6 the center point coordinates of the map have changed and the azimuth has also changed; that is to say, the map state from diagram b to diagram c of FIG. 6 is a rotating and straight state.
  • the electronic device may process and replay the trajectory by receiving initial motion track data collected by other electronic devices.
  • the user triggers the electronic device to perform trajectory playback on the electronic device, and the electronic device sends a trajectory playback request to the server, and receives the initial motion trajectory sent by the server.
  • the electronic device determines the control node of the motion trajectory and the lens motion of the control node according to the received initial motion trajectory, and performs trajectory playback according to the lens motion of the control node and the control node.
  • for the manner in which the electronic device determines the control nodes and the lens actions of the control nodes, reference may be made to the above-mentioned embodiment, which will not be repeated here.
  • the electronic device may perform trajectory playback by receiving initial motion trajectory data collected and processed by other electronic devices.
  • the user triggers the electronic device to perform trajectory playback on the electronic device.
  • after the electronic device receives the user operation to enable trajectory playback, it sends a trajectory playback request to the server and receives the control nodes and the lens actions of the control nodes sent by the server.
  • the electronic device performs trajectory playback according to the control nodes and the lens actions of the control nodes.
  • the manner in which the server side determines the control node and the lens action of the control node can refer to the above-mentioned embodiment, which will not be repeated here.
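  • As a sketch of the server-assisted variant described above, the control nodes and their lens actions could be exchanged as plain JSON. The payload layout, field names, and example coordinates below are assumptions made for illustration; the application does not specify a transport format.

```python
import json
from typing import Dict, List


def encode_playback_response(nodes: List[Dict]) -> str:
    """Server side: serialize the control nodes and their lens actions as JSON."""
    return json.dumps({"control_nodes": nodes})


def decode_playback_response(payload: str) -> List[Dict]:
    """Device side: recover the control node list for the playback step."""
    return json.loads(payload)["control_nodes"]


# illustrative round trip
example = encode_playback_response([
    {"lat": 39.9042, "lon": 116.4074, "lens_action": "rotate", "rotation_angle": 35.0},
    {"lat": 39.9050, "lon": 116.4100, "lens_action": "straight", "rotation_angle": 0.0},
    {"lat": 39.9061, "lon": 116.4120, "lens_action": "stop", "rotation_angle": 0.0},
])
nodes = decode_playback_response(example)
```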
  • the electronic device obtains its initial movement trajectory data from the cloud server by logging in to a unique account, and processes and replays the initial movement trajectory data obtained from the cloud server.
  • the trajectory playback method provided by the embodiments of the present application can also be applied to the field of map navigation.
  • the terminal device obtains the start position and the end position selected by the user and sends them to the server; the server obtains the route between the two positions according to the start position and the end position, calculates the control nodes and the lens actions of the control nodes in the route by combining the planned route with the trajectory playback method provided in this application, and sends them to the terminal device.
  • the terminal device dynamically displays the recommended route according to the data information sent by the server.
  • in other embodiments, the terminal device obtains the start position and the end position selected by the user, obtains the route between the two positions according to the start position and the end position, and calculates the control nodes and the lens actions of the control nodes in the route by combining the planned route with the trajectory playback method provided in this application.
  • the terminal device dynamically displays the recommended route according to the calculated control node and the lens action of the control node.
  • the electronic device 100 shown in FIG. 7 is taken as an example to introduce the electronic devices to which the embodiments of the present application are applicable.
  • FIG. 7 shows a schematic structural diagram of an exemplary electronic device 100 provided in an embodiment of the present application.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 150, a power management module 151, a battery 152, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display screen 195, a subscriber identification module (SIM) card interface 195, and the like.
  • the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, and ambient light Sensor 180L, bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 100.
  • the electronic device 100 may include more or fewer components than those shown in the figure, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units.
  • the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, a microcontroller unit (MCU), and/or a neural-network processing unit (NPU), etc.
  • the different processing units can be independent devices or integrated in one or more processors.
  • the electronic device 100 may also include one or more processors 110.
  • the controller may be the nerve center and command center of the electronic device 100.
  • the controller can generate operation control signals according to the instruction operation code and timing signals to complete the control of fetching instructions and executing instructions.
  • the processor 110 may be used to extract the control node in the motion trajectory and then determine the rotation angle of the control node according to a preset rule, and the processor 110 draws the motion trajectory according to the control node and the rotation angle of the control node.
  • the processor 110 detects that the motion track is played back to the control node, it adjusts the display direction of the motion track according to the rotation angle corresponding to the control node.
  • a memory may also be provided in the processor 110 to store instructions and data.
  • the memory in the processor 110 is a cache memory.
  • the memory can store instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to use the instruction or data again, it can be directly called from the memory. Repeated access is avoided, the waiting time of the processor 110 is reduced, and the efficiency of the electronic device 100 is improved.
  • the processor 110 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
  • the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may include multiple sets of I2C buses.
  • the processor 110 may be coupled to the touch sensor 180K, charger, flash, camera 193, etc., respectively through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to implement the touch function of the electronic device 100.
  • the I2S interface can be used for audio communication.
  • the processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170.
  • the audio module 170 may transmit audio signals to the wireless communication module 160 through an I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communication to sample, quantize and encode analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a two-way communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • the UART interface is generally used to connect the processor 110 and the wireless communication module 160.
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 may transmit audio signals to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with the display screen 195, the camera 193 and other peripheral devices.
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI) and so on.
  • the processor 110 and the camera 193 communicate through a CSI interface to implement the shooting function of the electronic device 100.
  • the processor 110 and the display screen 195 communicate through a DSI interface to realize the display function of the electronic device 100.
  • the GPIO interface can be configured through software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 195, the wireless communication module 160, the audio module 170, the sensor module 180, and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that complies with the USB standard specification, and specifically may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transfer data between the electronic device 100 and peripheral devices. It can also be used to connect earphones and play audio through earphones. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present application is merely a schematic description, and does not constitute a structural limitation of the electronic device 100.
  • the electronic device 100 may also adopt different interface connection modes in the foregoing embodiments, or a combination of multiple interface connection modes.
  • the wireless communication function of the electronic device 100 can be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 150 can provide a wireless communication solution including 2G/3G/4G/5G and the like applied to the electronic device 100.
  • the mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like.
  • the mobile communication module 150 can receive electromagnetic waves by the antenna 1, and perform processing such as filtering, amplifying and transmitting the received electromagnetic waves to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110.
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • the wireless communication solution provided by the mobile communication module 150 can enable the electronic device to communicate with a device (such as a server) in the network, and the WLAN wireless communication solution provided by the wireless communication module 160 can also enable the electronic device to communicate with a device (such as a server) in the network and to communicate with a cloud device through the device (such as a server) in the network. In this way, the electronic device can discover the cloud device and transmit data to the cloud device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • after being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays an image or video through the display screen 195.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 110 and be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide applications on the electronic device 100 including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), bluetooth (BT), and global navigation satellites. System (global navigation satellite system, GNSS), frequency modulation (FM), near field communication (NFC), infrared technology (infrared, IR) and other wireless communication solutions.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110.
  • the wireless communication module 160 may also receive a signal to be sent from the processor 110, perform frequency modulation, amplify, and convert it into electromagnetic waves to radiate through the antenna 2.
  • the wireless communication module 160 may include a Bluetooth module, a Wi-Fi module, and the like.
  • the electronic device 100 can implement a display function through a GPU, a display screen 195, an application processor, and the like.
  • the GPU is a microprocessor for image processing, connected to the display 195 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations and is used for graphics rendering.
  • the processor 110 may include one or more GPUs, which execute instructions to generate or change display information.
  • the display screen 195 is used to display images, videos, and the like.
  • the display screen 195 includes a display panel.
  • the display panel can use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Miniled, a MicroLed, a Micro-oLed, a quantum dot light-emitting diode (QLED), and the like.
  • the electronic device 100 may include one or N display screens 195, and N is a positive integer greater than one.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects the frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and so on.
  • the NPU is a neural-network (NN) computing processor. Through the NPU, applications such as intelligent cognition of the electronic device 100 can be realized, such as image recognition, face recognition, speech recognition, text understanding, and so on.
  • the external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function. For example, save music, photos, videos and other data in an external memory card.
  • the internal memory 121 may be used to store one or more computer programs, and the one or more computer programs include instructions.
  • the processor 110 can run the above-mentioned instructions stored in the internal memory 121 to enable the electronic device 100 to execute the data sharing methods, various functional applications, and data processing provided in some embodiments of the present application.
  • the internal memory 121 may include a storage program area and a storage data area. Among them, the storage program area can store the operating system; the storage program area can also store one or more application programs (such as a gallery, contacts, etc.) and so on.
  • the data storage area can store data (such as photos, contacts, etc.) created during the use of the electronic device 100.
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. For example, music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into an analog audio signal for output, and is also used to convert an analog audio input into a digital audio signal.
  • the audio module 170 can also be used to encode and decode audio signals.
  • the audio module 170 may be provided in the processor 110, or part of the functional modules of the audio module 170 may be provided in the processor 110.
  • the speaker 170A, also called a "loudspeaker", is used to convert an audio electrical signal into a sound signal.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B also called “earpiece” is used to convert audio electrical signals into sound signals.
  • the electronic device 100 answers a call or voice message, it can receive the voice by bringing the receiver 170B close to the human ear.
  • the microphone 170C, also called a "mic" or "mike", is used to convert a sound signal into an electrical signal.
  • the user can make a sound by approaching the microphone 170C through the human mouth, and input the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement noise reduction functions in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
  • the pressure sensor 180A may be provided on the display screen 195.
  • the capacitive pressure sensor may include at least two parallel plates with conductive materials.
  • the electronic device 100 determines the intensity of the pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations that act on the same touch position but have different touch operation strengths may correspond to different operation instructions. For example: when a touch operation whose intensity of the touch operation is less than the first pressure threshold is applied to the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
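  • The two-threshold behavior described above can be summarized with a small sketch; the threshold value, the normalized intensity scale, and the action names are illustrative assumptions rather than values taken from the application.

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # illustrative value on a normalized 0..1 intensity scale


def handle_message_icon_press(intensity: float) -> str:
    """Map the touch intensity on the short message icon to an operation instruction."""
    if intensity < FIRST_PRESSURE_THRESHOLD:
        return "view_short_message"
    return "create_new_short_message"  # intensity >= first pressure threshold
```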
  • the gyro sensor 180B may be used to determine the movement posture of the electronic device 100.
  • in some embodiments, the angular velocity of the electronic device 100 around three axes (i.e., the x, y, and z axes) can be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shake of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of the flip holster.
  • when the electronic device 100 is a flip device, the electronic device 100 can detect the opening and closing of the flip according to the magnetic sensor 180D, and then set features such as automatic unlocking of the flip cover according to the detected open or closed state.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices, and apply to applications such as horizontal and vertical screen switching, pedometers, and so on.
  • the distance sensor 180F is used to measure distance; the electronic device 100 can measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may use the distance sensor 180F to measure the distance to achieve fast focusing.
  • the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light to the outside through the light emitting diode.
  • the electronic device 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 can determine that there is no object near the electronic device 100.
  • the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • the ambient light sensor 180L is used to sense the brightness of the ambient light.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 195 according to the perceived brightness of the ambient light.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket to prevent accidental touch.
  • the fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access application locks, fingerprint photographs, fingerprint answering calls, and so on.
  • the temperature sensor 180J is used to detect temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold value, the electronic device 100 reduces the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • in other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 152 to avoid an abnormal shutdown of the electronic device 100 caused by low temperature.
  • in some other embodiments, when the temperature is lower than still another threshold, the electronic device 100 boosts the output voltage of the battery 152 to avoid an abnormal shutdown caused by low temperature.
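  • A compact sketch of the temperature-processing strategy described above is given below; all threshold values and action names are assumptions chosen for illustration and are not specified by the application.

```python
def thermal_policy(temp_c: float,
                   high: float = 45.0,
                   low: float = 0.0,
                   very_low: float = -10.0) -> str:
    """Return the illustrative action for a reported temperature."""
    if temp_c > high:
        return "throttle_nearby_processor"     # reduce performance for thermal protection
    if temp_c < very_low:
        return "boost_battery_output_voltage"  # avoid shutdown at very low temperature
    if temp_c < low:
        return "heat_battery"                  # avoid abnormal low-temperature shutdown
    return "normal"
```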
  • the touch sensor 180K can also be called a touch panel or a touch-sensitive surface.
  • the touch sensor 180K may be disposed on the display screen 195, and the touch screen is composed of the touch sensor 180K and the display screen 195, which is also called a “touch screen”.
  • the touch sensor 180K is used to detect touch operations acting on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 195.
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100, which is different from the position of the display screen 195.
  • the bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can obtain the vibration signal of the vibrating bone mass of the human voice.
  • the bone conduction sensor 180M can also contact the human pulse and receive the blood pressure pulse signal.
  • the bone conduction sensor 180M may also be provided in the earphone, combined with the bone conduction earphone.
  • the audio module 170 can parse the voice signal based on the vibration signal of the vibrating bone block of the voice obtained by the bone conduction sensor 180M, and realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beating signal obtained by the bone conduction sensor 180M, and realize the heart rate detection function.
  • the button 190 includes a power-on button, a volume button, and so on.
  • the button 190 may be a mechanical button. It can also be a touch button.
  • the electronic device 100 may receive key input, and generate key signal input related to user settings and function control of the electronic device 100.
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for incoming call vibration notification, and can also be used for touch vibration feedback.
  • touch operations applied to different applications can correspond to different vibration feedback effects.
  • for touch operations acting on different areas of the display screen 195, the motor 191 can also produce different vibration feedback effects.
  • different application scenarios (for example, time reminders, receiving information, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 may be an indicator light, which may be used to indicate the charging status, power change, or to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 195 is used to connect to the SIM card.
  • the SIM card can be inserted into the SIM card interface 195 or pulled out from the SIM card interface 195 to achieve contact and separation with the electronic device 100.
  • the electronic device 100 may support 1 or N SIM card interfaces, and N is a positive integer greater than 1.
  • the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • the same SIM card interface 195 can insert multiple cards at the same time. The types of the multiple cards can be the same or different.
  • the SIM card interface 195 can also be compatible with different types of SIM cards.
  • the SIM card interface 195 may also be compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the electronic device 100 adopts an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
  • the electronic device 100 exemplarily shown in FIG. 7 may display various user interfaces through the display screen 195.
  • the electronic device 100 can detect touch operations in each user interface through the touch sensor 180K, such as a tap operation in each user interface (for example, a touch operation on an icon or a double-tap operation), and, for example, an upward or downward swipe in each user interface, a circle-drawing gesture, and so on.
  • the electronic device 100 may detect a motion gesture performed by the user holding the electronic device 100, such as shaking the electronic device, through the gyroscope sensor 180B, the acceleration sensor 180E, and the like.
  • the electronic device 100 can detect non-touch gesture operations through the camera 193 (eg, a 3D camera, a depth camera).
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • the embodiment of the present application takes an Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100 by way of example.
  • FIG. 8 is a block diagram of the software structure of the electronic device 100 according to an embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Communication between layers through software interface.
  • the Android system is divided into four layers, from top to bottom, the application layer, the application framework layer, the Android runtime and system library, and the kernel layer.
  • the application layer can include a series of application packages.
  • the application package may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, short message, etc.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include a window manager, a content provider, a view system, a phone manager, a resource manager, and a notification manager.
  • the window manager is used to manage window programs.
  • the window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take a screenshot, etc.
  • the content provider is used to store and retrieve data and make these data accessible to applications.
  • the data may include videos, images, audios, phone calls made and received, browsing history and bookmarks, phone book, etc.
  • the view system includes visual controls, such as controls that display text, controls that display pictures, and so on.
  • the view system can be used to build applications.
  • the display interface can be composed of one or more views.
  • a display interface that includes a short message notification icon may include a view that displays text and a view that displays pictures.
  • the phone manager is used to provide the communication function of the electronic device 100. For example, the management of the call status (including connecting, hanging up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification information in the status bar, which can be used to convey notification-type messages, and it can automatically disappear after a short stay without user interaction.
  • the notification manager is used to notify download completion, message reminders, and so on.
  • the notification manager can also be a notification that appears in the status bar at the top of the system in the form of a chart or a scroll bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window. For example, text messages are prompted in the status bar, prompt sounds, electronic devices vibrate, and indicator lights flash.
  • Android Runtime includes core libraries and virtual machines. Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and application framework layer run in a virtual machine.
  • the virtual machine executes the java files of the application layer and the application framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • the system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), three-dimensional graphics processing library (for example: OpenGL ES), 2D graphics engine (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides a combination of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
  • the media library can support a variety of audio and video encoding formats, such as MPEG-4, H.265, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, synthesis, and layer processing.
  • the 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • the software system shown in FIG. 8 involves application presentation (such as the gallery and the file manager) that uses the track playback capability, an instant sharing module that provides sharing capabilities, a map navigation module that provides positioning capabilities, an application framework layer that provides WLAN services and Bluetooth services, and a kernel and bottom layer that provide WLAN and Bluetooth capabilities and basic communication protocols.
  • the embodiment of the present application also provides a computer-readable storage medium. All or part of the processes in the foregoing method embodiments may be completed by a computer program instructing relevant hardware.
  • the program may be stored in the foregoing computer storage medium. When the program is executed, it may include the processes of the foregoing method embodiments.
  • the computer-readable storage medium includes: read-only memory (ROM) or random access memory (RAM), magnetic disks or optical disks, and other media that can store program codes.
  • the computer-readable storage medium may include a RAM, a ROM, an EEPROM, a CD-ROM or other optical disk storage, a magnetic disk storage or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
  • in addition, any connection is properly termed a computer-readable medium. For example, if software is transmitted from a website, a server, or another remote source using a coaxial cable, an optical fiber cable, a twisted pair, a digital subscriber line (DSL), or wireless technologies (such as infrared, radio, and microwave), then the coaxial cable, optical fiber cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of the medium.
  • Magnetic disks and optical disks as used herein include compact disks (CDs), laser disks, optical disks, digital versatile disks (DVD), floppy disks, and blu-ray disks, where disks usually reproduce data magnetically, while optical disks reproduce data optically using lasers. Combinations of the above should also be included in the scope of computer-readable media.
  • the embodiment of the present application also provides a computer program product.
  • the methods described in the foregoing method embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. If it is implemented in software, it can be fully or partially implemented in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the foregoing computer instructions are loaded and executed on a computer, the processes or functions described in the foregoing method embodiments are generated in whole or in part.
  • the above-mentioned computer may be a general-purpose computer, a special-purpose computer, a computer network, network equipment, user equipment, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted through the computer-readable storage medium.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or a data center integrated with one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).
  • the modules in the device of the embodiment of the present application may be combined, divided, and deleted according to actual needs.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Ecology (AREA)
  • Navigation (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are a trajectory playback method and a related apparatus, comprising: an electronic device acquires a motion trajectory to be played back, the motion trajectory comprising a plurality of control nodes; the electronic device extracts a first control node and a second control node that are adjacent in the motion trajectory; the electronic device determines a rotation angle of the first control node according to the first control node, the second control node, and a first direction, wherein, in the motion trajectory, the second control node follows the first control node; and when the electronic device detects that the motion trajectory has been played back to the first control node, the electronic device adjusts the display direction of the motion trajectory according to the rotation angle corresponding to the first control node. By controlling the number of control nodes and the rotation angles of the control nodes, the state change of the motion trajectory on the map can be stabilized while the similarity is ensured.

Description

轨迹回放方法及相关装置
本申请要求于2020年4月23日提交中国专利局、申请号为202010329410.6、申请名称为“轨迹回放方法及相关装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及电子地图数据处理技术领域,尤其涉及轨迹回放方法及相关装置。
背景技术
随着传感器技术的发展和普及,全球卫星导航系统(Global Navigation Satellite System,GNSS)成为终端设备的基础能力,如汽车、手机和手表等。相关设备或应用程序利用GNSS完成用户轨迹记录后,可提供动态回放轨迹的能力。轨迹回放是指以合适的时间间隔不断更新用户轨迹及显示区域的地图状态,直至用户轨迹结尾。
现有的轨迹的抽稀技术可提供与初始运动轨迹近似的简化轨迹。在轨迹回放中,轨迹抽稀可用来指导地图状态的变化轨迹,所谓的地图状态的变化轨迹是指地图中心显示区域对应的真实地理坐标的变化轨迹。
然而,轨迹回放时常存在两个问题,其一是若简化轨迹与初始运动轨迹的相似度比较低,则会引发轨迹跑出地图显示区域的问题;其二是若简化轨迹与初始运动轨迹的相似度比较高,则会导致地图的状态变化过于剧烈,引起视觉的眩晕感,可能会造成用户的主观不适。
所以,如何在保证相似度的情况下,稳定地图的状态变化,是本领域技术人员正在研究的问题。
发明内容
本申请实施例提供了一种轨迹回放方法及相关装置,能够在保证相似度的情况下,稳定地图的状态变化。
第一方面,本申请提供了一种轨迹回放的方法,包括:
电子设备获取待回放的运动轨迹,运动轨迹包括多个控制节点;电子设备提取出运动轨迹中相邻的第一控制节点和第二控制节点;电子设备根据该第一控制节点、该第二控制节点以及第一方向,确定第一控制节点的旋转角度;其中,在运动轨迹中,第二控制节点在第一控制节点之后;当电子设备检测到运动轨迹回放至第一控制节点时,电子设备根据第一控制节点对应的旋转角度,调整运动轨迹的显示方向。
实施第一方面的方法,将运动轨迹中每一个控制节点都对应一个旋转角度,电子设备通过对运动轨迹中每个控制节点的旋转角度进行确定,根据每个控制节点的旋转角度,在电子设备的显示屏上动态回放该运动轨迹。其中第一方向为运动轨迹当时在电子设备的显示区域中的显示方向,该显示方向可以理解为沿着电子设备的显示区域的底部的垂直方向指向显示区域的顶部的方向。运动轨迹中控制节点的数量与该运动轨迹和初始运动轨迹的 相似度有关,以及运动轨迹中控制节点的旋转角度与电子设备回放该运动轨迹的稳定性有关。电子设备可以通过控制控制节点的数量和控制节点的旋转角度,在保证相似度的情况下,稳定地图中运动轨迹的状态变化。
结合第一方面,在一些实施例中,电子设备根据第一控制节点、第二控制节点以及第一方向,确定第一控制节点的旋转角度,具体包括:当电子设备检测第一方向与第一控制节点和第二控制节点之间连线所成的第一夹角小于或等于第一阈值时,电子设备确定第一控制节点对应的旋转角度为0。这种方式,若第一夹角不大于第一阈值时,可以认为当前运动轨迹在显示区域中的显示方向不影响用户在第一方向上的观看,无需对运动轨迹的显示方向进行旋转。通过设置第一阈值,可以减少对运动轨迹的显示方向的旋转频率,稳定地图中运动轨迹的状态变化。
在一些实施例中,电子设备根据第一控制节点、第二控制节点以及第一方向,确定第一控制节点的旋转角度,具体包括:当电子设备检测第一方向与第一控制节点和第二控制节点之间连线所构成的第一夹角小于或等于第一阈值时,电子设备确定第一控制节点对应的旋转角度为第一夹角。这种方式,若第一夹角大于第一阈值时,对运动轨迹的显示方向进行旋转,使运动轨迹的显示方向为第一方向,这样可以使用户在第一方向上进行观看轨迹回放,提升用户体验。
结合第一方面,在一些实施例中,电子设备根据第一控制节点、第二控制节点以及第一方向,确定第一控制节点的旋转角度,具体包括:当电子设备检测第一方向与第一控制节点和第二控制节点之间连线所成的第一夹角小于或等于第一阈值时,或第一控制节点和第二控制节点之间连线的距离小于或等于第二阈值,电子设备确定第一控制节点对应的旋转角度为0。也就是说,若第一夹角大于第一阈值,但是该控制节点与下一个控制节点的物理距离相近(小于或等于第二阈值),也不能旋转运动轨迹的显示方向。这样可以避免由于两个控制节点的距离太相近,从而在短时间内连续变换两次运动轨迹的显示方向的情况,进一步的提高了地图状态的稳定性。
结合第一方面,在一些实施例中,电子设备根据第一控制节点、第二控制节点以及第一方向,确定第一控制节点的旋转角度,具体包括:当电子设备检测第一方向与第一控制节点和第二控制节点之间连线所成的第一夹角大于第一阈值时,且第一控制节点和第二控制节点之间连线的距离大于第二阈值,电子设备确定第一控制节点对应的旋转角度为第一夹角。这种方式,对第一夹角和两个控制节点的距离共同进行判断是否旋转运动轨迹的显示方向,可以减少对运动轨迹的显示方向的旋转频率,稳定地图中运动轨迹的状态变化。
在一些实施例中,电子设备根据第一控制节点、第二控制节点以及第一方向,确定第一控制节点的旋转角度,具体包括:当第一控制节点为运动轨迹中的第一个控制节点时,当电子设备检测第一方向与第一控制节点和第二控制节点之间连线所构成的第一夹角大于第三阈值,电子设备确定第一控制节点对应的旋转角度为第一夹角;其中,第三阈值小于第一阈值。由于第三阈值小于第一阈值,即运动轨迹的第一个控制节点比其他控制节点更容易实现对运动轨迹的显示方向进行旋转。这样可以达到在一开始播放动态轨迹时,就能使用户以第一方向观看轨迹的效果。
结合第一方面,在一些实施例中,电子设备根据第一控制节点、第二控制节点以及第 一方向,确定第一控制节点的旋转角度,具体包括:当电子设备检测第一控制节点和第二控制节点之间连线的距离大于第四阈值,电子设备确定第一控制节点对应的旋转角度为第一夹角。这种方式描述了在控制节点的距离下一个控制节点较远(大于第四阈值)的情况下,即使该第一夹角不满足旋转运动轨迹的显示方向的条件,但为了避免用户长时间无法以第一方向观看轨迹,依然可以对运动轨迹的显示方向进行旋转,保证当前的轨迹前行方向为第一方向,提升用户体验。
结合第一方面,在一些实施例中,在电子设备获取待回放的运动轨迹之前,方法还包括:电子设备获取到电子设备的初始运动轨迹;电子设备从初始运动轨迹中确定出多个轨迹点;电子设备通过多个轨迹点的位置信息,从多个轨迹点中确定出多个控制节点。其中,电子设备从多个轨迹点中确定出多个控制节点的方式可以包括取相邻两个轨迹点的距离中点、时间中点、位置坐标的加权平均等。
在一些实施例中,电子设备通过多个轨迹点的位置信息,从多个轨迹点中确定出多个控制节点,具体包括:当电子设备检测多个轨迹点中目标轨迹点分别与其相邻的两个轨迹点连成的线段所构成的夹角大于第五阈值,电子设备判断目标轨迹点为冗余节点;电子设备在多个轨迹点中删除冗余节点,多个轨迹点中剩余的轨迹点为多个控制节点。这种通过设置第五阈值的方式可以有效的删除冗余节点,由于冗余节点对于轨迹的相似度影响并不大,对于同一个运动轨迹来说,减少冗余节点可以减少旋转次数,提高地图状态变化的稳定性。
在一些实施例中,电子设备获取待回放的运动轨迹,具体包括:电子设备将多个控制节点连线成运动轨迹。其中,运动轨迹与初始运动轨迹的相似度与控制节点的数量以及第五阈值有关,电子设备通过设置合理的第五阈值,可以很好的控制轨迹相似度和地图状态变化的稳定性的问题。
第二方面,本申请提供了一种电子设备,该电子设备可包括:一个或多个处理器、存储器和显示屏;存储器、显示屏与一个或多个处理器耦合,存储器用于存储计算机程序代码,该计算机程序代码包括计算机指令,一个或多个处理器调用该计算机指令以使得电子设备执行:
获取待回放的运动轨迹,运动轨迹包括多个控制节点;提取出运动轨迹中相邻的第一控制节点和第二控制节点;根据第一控制节点、第二控制节点以及第一方向,确定第一控制节点的旋转角度;其中,在运动轨迹中,第二控制节点在第一控制节点之后;当检测到运动轨迹回放至第一控制节点时,根据第一控制节点对应的旋转角度,调整运动轨迹的显示方向。
结合第二方面,在一些实施例中,根据第一控制节点、第二控制节点以及第一方向,确定第一控制节点的旋转角度,具体包括:当检测第一方向与第一控制节点和第二控制节点之间连线所成的第一夹角小于或等于第一阈值时,确定第一控制节点对应的旋转角度为0。
在一些实施例中,根据第一控制节点、第二控制节点以及第一方向,确定第一控制节点的旋转角度,具体包括:当检测第一方向与第一控制节点和第二控制节点之间连线所构成的第一夹角大于第一阈值时,确定第一控制节点对应的旋转角度为第一夹角。
结合第二方面,在一些实施例中,根据第一控制节点、第二控制节点以及第一方向,确定第一控制节点的旋转角度,具体包括:当检测第一方向与第一控制节点和第二控制节点之间连线所成的第一夹角小于或等于第一阈值时,或第一控制节点和第二控制节点之间连线的距离小于或等于第二阈值,确定第一控制节点对应的旋转角度为0。
结合第二方面,在一些实施例中,根据第一控制节点、第二控制节点以及第一方向,确定第一控制节点的旋转角度,具体包括:当检测第一方向与第一控制节点和第二控制节点之间连线所成的第一夹角大于第一阈值时,且第一控制节点和第二控制节点之间连线的距离大于第二阈值,确定第一控制节点对应的旋转角度为第一夹角。
在一些实施例中,根据第一控制节点、第二控制节点以及第一方向,确定第一控制节点的旋转角度,具体包括:当第一控制节点为运动轨迹中的第一个控制节点时,当检测第一方向与第一控制节点和第二控制节点之间连线所构成的第一夹角大于第三阈值,确定第一控制节点对应的旋转角度为第一夹角;其中,第三阈值小于第一阈值。
结合第二方面,在一些实施例中,根据第一控制节点、第二控制节点以及第一方向,确定第一控制节点的旋转角度,具体包括:当检测第一控制节点和第二控制节点之间连线的距离大于第四阈值,确定第一控制节点对应的旋转角度为第一夹角。
结合第二方面,在一些实施例中,在获取待回放的运动轨迹之前,方法还包括:获取到电子设备的初始运动轨迹;从初始运动轨迹中确定出多个轨迹点;通过多个轨迹点的位置信息,从多个轨迹点中确定出多个控制节点。
在一些实施例中,通过多个轨迹点的位置信息,从多个轨迹点中确定出多个控制节点,具体包括:当检测多个轨迹点中目标轨迹点分别与其相邻的两个轨迹点连成的线段所构成的夹角大于第五阈值,判断目标轨迹点为冗余节点;在多个轨迹点中删除冗余节点,多个轨迹点中剩余的轨迹点为多个控制节点。
在一些实施例中,获取待回放的运动轨迹,具体包括:将多个控制节点连线成运动轨迹。
第三方面,本申请实施例提供了一种轨迹回放系统,包括电子设备和服务器,其中,
服务器,用于获取待回放的运动轨迹,运动轨迹包括多个控制节点;
服务器,还用于提取出运动轨迹中相邻的第一控制节点和第二控制节点;
服务器,还用于根据第一控制节点、该第二控制节点以及第一方向,确定第一控制节点的旋转角度;其中,在运动轨迹中,第二控制节点在第一控制节点之后;
服务器,还用于将所述运动轨迹以及所述旋转角度发送给电子设备;
电子设备,用于在检测到运动轨迹回放至第一控制节点时,根据第一控制节点对应的旋转角度,调整运动轨迹的显示方向。
结合第三方面,在一些实施例中,服务器具体用于:当检测第一方向与第一控制节点和第二控制节点之间连线所成的第一夹角小于或等于第一阈值时,确定第一控制节点对应的旋转角度为0。
在一些实施例中,服务器具体用于:当检测第一方向与第一控制节点和第二控制节点之间连线所构成的第一夹角大于第一阈值时,确定第一控制节点对应的旋转角度为第一夹角。
结合第三方面,在一些实施例中,服务器具体用于:当检测第一方向与第一控制节点和第二控制节点之间连线所成的第一夹角小于或等于第一阈值时,或第一控制节点和第二控制节点之间连线的距离小于或等于第二阈值,确定第一控制节点对应的旋转角度为0。
结合第三方面,在一些实施例中,服务器具体用于:当检测第一方向与第一控制节点和第二控制节点之间连线所成的第一夹角大于第一阈值时,且第一控制节点和第二控制节点之间连线的距离大于第二阈值,确定第一控制节点对应的旋转角度为第一夹角。
在一些实施例中,服务器具体用于:当第一控制节点为运动轨迹中的第一个控制节点时,当检测第一方向与第一控制节点和第二控制节点之间连线所构成的第一夹角大于第三阈值,确定第一控制节点对应的旋转角度为第一夹角;其中,第三阈值小于第一阈值。
结合第三方面,在一些实施例中,服务器具体用于:当检测第一控制节点和第二控制节点之间连线的距离大于第四阈值,确定第一控制节点对应的旋转角度为第一夹角。
结合第三方面,在一些实施例中,服务器还用于:在获取待回放的运动轨迹之前获取到电子设备的初始运动轨迹;从初始运动轨迹中确定出多个轨迹点;通过多个轨迹点的位置信息,从多个轨迹点中确定出多个控制节点。
在一些实施例中,服务器具体用于:当检测多个轨迹点中目标轨迹点分别与其相邻的两个轨迹点连成的线段所构成的夹角大于第五阈值,判断目标轨迹点为冗余节点;在多个轨迹点中删除冗余节点,多个轨迹点中剩余的轨迹点为多个控制节点。
在一些实施例中,服务器具体用于:将多个控制节点连线成运动轨迹。
第四方面,本申请实施例提供了一种计算机存储介质,包括计算机指令,当计算机指令在电子设备上运行时,使得电子设备执行上述第一方面任一项可能的实现方式中的轨迹回放的方法。
第五方面,本申请实施例提供了一种计算机程序产品,当计算机程序产品在计算机上运行时,使得计算机执行上述任一方面任一项可能的实现方式中的轨迹回放的方法。
可以理解地,上述提供的第二方面提供的电子设备、第三方面提供的系统、第四方面提供的计算机存储介质,以及第五方面提供的计算机程序产品均用于执行第一方面所提供的轨迹回放的方法,因此,其所能达到的有益效果可参考第一方面所提供的方法中的有益效果,此处不再赘述。
附图说明
图1为本申请实施例提供的中心点坐标、方位角、控制节点的示意图;
图2为本申请实施例提供的一种确定控制节点方法的示意图;
图3为本申请实施例提供的又一种确定控制节点方法的示意图;
图4为本申请实施例提供的一种确定旋转夹角的示意图;
图5为本申请实施例提供的一种确定控制节点的镜头动作的方法的示意图;
图6为本申请实施例提供的一种轨迹回放方法的界面显示图;
图7为本申请实施例提供的一种电子设备的结构示意图;
图8为本申请实施例提供的一种软件架构示意图。
具体实施方式
下面将结合附图对本申请实施例中的技术方案进行清楚、详尽地描述。其中,在本申请实施例的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B;文本中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况,另外,在本申请实施例的描述中,“多个”是指两个或多于两个。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为暗示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征,在本申请实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
本申请实施例中涉及的电子设备可以是手机、平板电脑、桌面型、膝上型、笔记本电脑、超级移动个人计算机(Ultra-mobile Personal Computer,UMPC)、服务器(包括云服务器)、手持计算机、上网本、个人数字助理(Personal Digital Assistant,PDA)、可穿戴电子设备(例如运动手表、智能手环、智能手表)、虚拟现实设备、车载多媒体设备、无人飞行器、航拍仪器等具备定位数据采集和/或处理能力的设备。
首先,对本申请中涉及的部分相关用语进行解释说明,以便于本领域技术人员理解。
(1)轨迹回放:是指用户选择某个时间段,然后电子设备在地图上再现这个时间段内用户的轨迹的出现情况。该时间段内用户的轨迹包括一系列的位置点,每个位置点可以包括日期、时间、经度、纬度、海拔信息、运动速度等信息。本申请中,位置点可以称为轨迹点。
(2)地图状态:表示轨迹回放中,显示区域的地图状态。本申请用地图中心点坐标和地图方位角来描述地图状态,中心点如图1中的a图所示,图1中的a图示出了在显示区域的两对角线的交点,即显示区域的中心点。中心点坐标描述了地图当前在显示区域的中心点的物理坐标,地图状态随着中心点坐标的移动而相应的移动。方位角如图1中的b图所示,图1中的b图示例性的表示了当电子设备竖屏时地图状态的方位角,箭头①指向的方向表示物理的正北方向,箭头②指向的方向表示沿电子设备的显示区域的底部的垂直方向指向显示区域的顶部的方向。其中方位角表示物理的正北方向沿顺时针方向旋转至箭头②指向的方向所形成的最小角度。例如方位角为45度,则地图的视角(箭头②指向的方向)为北偏东45度,例如方位角为90度,则此时地图的视角(箭头②指向的方向)为正东方向。如图1中的c图所示,图1中的c图示例性的表示了当电子设备横屏时地图状态的方位角,箭头③指向的方向表示物理的正北方向,箭头④指向的方向表示沿电子设备的显示区域的底部的垂直方向指向显示区域的顶部的方向。其中,方位角与电子设备竖屏时相同。
本申请中,将沿电子设备的显示区域的底部的垂直方向指向显示区域的顶部的方向(箭头②指向的方向)可以称为正方向或第一方向。
(3)镜头动作:包括直行、旋转和停止三个基础动作,其中直行表示地图中心点的变 化,其中旋转表示地图方位角的变化,其中停止表示停止地图状态的变化。
(4)运动轨迹:指轨迹回放过程中地图中心点坐标形成的轨迹。本申请将运动轨迹表示为有限的多条直线,相邻直线的交点及运动轨迹的首尾点定义为控制节点,多个控制节点的连线形成了运动轨迹,如图1中的d图所示。本申请中,每一个控制节点都对应一个镜头动作,电子设备根据控制节点做出相应的镜头动作,从而动态绘制出运动轨迹。
(5)抽稀算法:指对大量冗余的轨迹数据点进行压缩以提取必要的数据点的一种算法。可以将曲线近似表示为一系列点,并减少点的数量。在轨迹回放中,轨迹的抽稀算法可提供与初始运动轨迹近似的简化轨迹,可用来指导地图状态的变化轨迹。所谓的地图状态的变化轨迹是指地图中心点坐标对应的真实地理坐标的变化轨迹。
现有的轨迹回放方法通常是利用抽稀算法提供与初始轨迹近似的简化轨迹,若简化轨迹与初始轨迹的相似度比较低,则地图中心点坐标跟不上轨迹的绘制速度,会引发轨迹跑出地图显示区域的问题。相反地,若简化轨迹与初始运动轨迹的相似度比较高,则会导致地图的状态变化过于剧烈,引起视觉的眩晕感,可能会造成用户的主观不适。
针对上述技术问题,本申请实施例提出一种轨迹回放方法,可以在保证相似度的情况下,稳定地图的状态变化,提升用户体验。在本申请的轨迹回放过程中,首先电子设备确定轨迹的控制节点,然后确定控制节点的镜头动作,根据控制节点以及控制节点的镜头动作动态绘制轨迹回放曲线。下面分别介绍上述两个过程。
(一)确定轨迹的控制节点。
步骤一、获取电子设备的初始运动轨迹。
电子设备获取电子设备的初始运动轨迹,该初始运动轨迹包括散步轨迹、跑步轨迹、登山轨迹、游乐园轨迹等各种类型轨迹。其中电子设备可以是在本地获取初始运动轨迹,也可以是在其他电子设备(例如云端服务器)获取初始运动轨迹,本申请不做限制。
电子设备可以是依据用户选择的时间段获取该时间段内的初始运动轨迹,也可以是依据用户选择的起始位置和终点位置获取两个位置之间的初始运动轨迹,本申请不做限制。
在一些可能的实施例中,电子设备可以是依据预设条件获取的初始运动轨迹。该预设条件可以是电子设备连续运动时间最长的一次运动轨迹,可以是电子设备连续运动距离最长的一次运动轨迹,可以是电子设备在一个月/一年中运动距离最长的某一天的运动轨迹,等等。其中,该预设条件可以是用户设置或用户选择的,也可以是电子设备中依据某个时间点和/或地点自动触发的。
举例来说,当电子设备检测到当前的位置为公园时,电子设备自动获取电子设备在该公园的历史运动轨迹;当电子设备检测到当前的时间为2019年12月31日时,电子设备自动获取电子设备在一年中(2018年12月31日至2019年12月31日)运动距离最长的某一天的运动轨迹;当电子设备检测到用户触发的获取电子设备连续运动时间最长的一次运动轨迹的操作,则电子设备获取电子设备连续运动时间最长的一次运动轨迹。
步骤二、计算该初始运动轨迹的轨迹点。
轨迹点定义为P i(x,y),其中i表示轨迹点在整个初始运动轨迹中的索引值,(x,y)表示轨迹点的坐标,轨迹点的坐标可以是绝对坐标(例如经纬度)或相对坐标。运动的轨 迹数据集合定义为P={P i},i=1,2,…,N,其中N为正整数,表示当前初始运动轨迹的轨迹点数量。每个轨迹点可以包括日期、时间、经度、纬度、海拔信息、运动速度等信息。
电子设备获取到电子设备的初始运动轨迹后,电子设备根据分段数量K和当前初始运动轨迹的总路程L,根据总路程L将初始运动轨迹分为K段,计算得出每两段路程之间的间隔为L/K,每隔L/K的路程为一个轨迹点。初始运动轨迹的起始点和终点分别为一个轨迹点,此时轨迹点数量有N=K+1个。其中,路程可以是实际的地理路程或地图上的路程。
在一些可能的实施例中,轨迹点还可以是具有相同直线距离间隔的位置点。每一个轨迹点与其相邻的轨迹点的直线距离都相同。该直线距离可以是实际地理位置上的距离也可以是地图上的距离。
在一些可能的实施例中,轨迹点还可以是具有相同时间间隔的位置点。举例来说,电子设备根据分段数量K和当前初始运动轨迹所耗费的总时间T,将初始运动轨迹分为K段,计算得出分段间隔为T/K,每隔T/K的时间段取一个轨迹点。初始运动轨迹的起始点和终点分别为一个轨迹点,此时轨迹点有K+1个。
可选的,不限于上述等距取点和等时间取点的方式,本申请还可以根据其他采样算法获得轨迹点,例如提取运动轨迹的拐点作为轨迹点等,本申请对此不作限制。
步骤三:根据轨迹点生成该初始运动轨迹的控制节点集合。
电子设备获取到初始运动轨迹中的轨迹点后,每两个相邻轨迹点之间按照预设方式取一个控制节点,并添加首尾轨迹点形成控制节点集合。下面示例性的介绍几种可能的预设方式。
方式一,两个相邻轨迹点的距离中点为控制节点。举例来说,两个相邻轨迹点的距离间隔为L/K,在两个相邻轨迹点之间的初始运动轨迹上取一个控制节点,该控制节点与这两个相邻轨迹点的距离都为L/2K。
方式二,两个相邻轨迹点的时间中点为控制节点。举例来说,两个相邻轨迹点的时间间隔为T/K,在两个相邻轨迹点之间的初始运动轨迹上取一个控制节点,该控制节点与这两个相邻轨迹点的时间间隔都为T/2K。
方式三,对两个相邻轨迹点的位置坐标做加权平均,得到一个新的坐标作为控制节点的坐标。举例来说,两个相邻轨迹点的位置坐标分别为(x 1,y 1)和(x 2,y 2),则取控制节点的坐标为
((x 1+x 2)/2, (y 1+y 2)/2)。
可选的,不限于上述三种方式,本申请还可以根据其他算法获得控制节点,例如对三个连续轨迹点的位置坐标做加权平均,得到一个新的坐标作为控制节点等,本申请对此不作限制。
图2示例性的示出了上述获取控制节点集合的过程,如图2中的a图所示,图2中的a图示例性的示出了一条初始运动轨迹。在这条初始运动轨迹中,电子设备每隔相同距离取一个轨迹点。如图2中的b图所示,图2中的b图示例性的取了10个轨迹点,每两个相邻轨迹点之间的距离(实际的地理距离或地图上的距离)相同。根据上述的预设规则三,对两个相邻轨迹点的位置坐标做加权平均,得到一个新的坐标作为控制节点的坐标。如图2 中的c图所示,电子设备获得11个控制节点。
步骤四:电子设备将多个控制节点连线成运动轨迹。
可以理解的,分段数量K越多,则控制节点的数量就会越多,对于同一个运动轨迹来说,绘制出的运动轨迹与实际的初始运动轨迹的相似度也会越高。然而控制节点的数量越多,镜头动作也会越多,导致地图的状态变化不稳定。
在一些可能的实施例中,根据上述获取的控制节点集合,可以在不影响相似度的情况下,进一步筛选上述步骤三中控制节点集合中冗余的控制节点,删除冗余控制节点。
即在步骤三之后,步骤四之前还包括,删除上述控制节点集合中冗余的控制节点,形成新的控制节点集合。
电子设备根据轨迹点生成该轨迹的控制节点集合之后,依次对控制节点集合中的控制节点进行判断。若该节点分别与其相邻的两个节点连成的线段所构成的夹角(锐角)大于第一阈值,则判断该节点为冗余节点。其中,首尾节点均确定为控制节点,无需进行判断。依次对控制节点集合中的控制节点进行判断,形成新的控制节点集合。
图3示例性的示出了上述形成新的控制节点集合的过程。如图3中的a图所示，图3中的a图示例性的示出了步骤三中获取的控制节点集合，包括A、B、C、D、E五个控制节点。首先确定首节点A为控制节点，获取A的下一个控制节点B，判断B是否为冗余节点。其中，图3中的正北方向为地图中的正北方向，例如，此时图3中的a图的运动轨迹中控制节点B在控制节点A的东北方向。如图3中的b图所示，以B为中间节点，计算A、B、C三个节点组成的夹角θ。其中θ=|180-|θ 1-θ 2||，θ 1为AB的方位角（正北方向与AB指向方向的夹角），θ 2为BC的方位角（正北方向与BC指向方向的夹角）。判断θ的大小，若θ大于第五阈值，则判断控制节点B为冗余节点，如图3中的c图所示，删除控制节点B。
然后,获取B的下一个控制节点C,判断C是否为冗余节点。以C为中间节点,计算A、C、D三个节点组成的夹角θ,判断出C为冗余节点,删除控制节点C。这样依次对控制节点集合中的控制节点进行判断,直到最后一个控制节点E,E为尾节点,将尾节点直接确定为控制节点。如图3中的d图所示,形成新的控制节点集合,包括A、D、E三个控制节点。
在本申请中,上述第五阈值可以称为衰减角度α。上述确定轨迹的控制节点的方式,通过设置衰减角度α的方式可以有效的删除冗余控制点,由于冗余控制节点对于轨迹的相似度影响并不大,对于同一个运动轨迹来说,减少冗余控制点可以减少镜头动作,提高地图状态变化的稳定性。
可以理解的,衰减角度α越小,则筛选出的冗余控制节点的数量就会越多,那么剩余的控制节点就越少,对于同一个运动轨迹来说,绘制出的轨迹与实际轨迹的相似度也会越低。因此,分段数量K和衰减角度α共同决定了绘制出的轨迹与实际轨迹的相似度。本申请中,通过设置合理的分段数量K和衰减角度α,可以很好的控制轨迹相似度和地图状态变化的稳定性的问题。例如,分段数量K可以设置在10左右,衰减角度α可以设置在160度左右。
在一些可能的实施方式中,通过固定轨迹点之间的物理距离,可以针对不同的轨迹长 度确定不同的分段数量。轨迹的长度越长,分段数量就越多,控制节点也就更多;轨迹的长度越短,分段数量越少,控制节点也就更少。
在一些可能的实施方式中,根据运动轨迹显示在电子设备的显示区域上的比例尺(显示距离与实际距离的比)来确定不同的分段数量,其中比例尺越大,分段数量就越多。举例来说,一段相同的运动轨迹,实际距离为5公里,若该运动轨迹在电子设备的显示区域上的比例尺为1厘米:1公里,则该运动轨迹显示在电子设备的显示区域上总长有5厘米,可以对该运动轨迹取10个轨迹点,分段数量为9;若该运动轨迹的比例尺为1厘米:0.5公里(比例尺变大),则该运动轨迹显示在电子设备的显示区域上总长有10厘米,可以对该运动轨迹取20个轨迹点,分段数量为19。
(二)确定控制节点的镜头动作。
电子设备获取轨迹的控制节点集合后,确定控制节点集合中每一个控制节点的镜头动作。
首先对旋转夹角进行说明。本申请中,控制节点的旋转夹角为该控制节点的轨迹方向与正方向的夹角。正方向为沿电子设备的显示区域的底部的垂直方向指向显示区域的顶部的方向。如图4所示,图4中的a图示例性的在显示区域上显示了一段运动轨迹,包括A、B、C、D共4个控制节点。对于控制节点A,A的轨迹方向为AB方向,正方向在图4中的a图中竖直向上,则A的旋转夹角为θ_A。若A的镜头动作为旋转,则显示区域的地图状态的方位角发生变化,以A点为旋转点,正方向为旋转轴,向正方向旋转,旋转角度为θ_A,如图4中的b图所示。此时,对于控制节点B来说,B的轨迹方向为BC方向,正方向为AB方向,则B的旋转夹角为θ_B。当B的镜头动作为直行时,显示区域的地图状态的方位角不发生变化,正方向不变。对于控制节点C来说,C的轨迹方向为CD方向,正方向为AB方向,则C的旋转夹角为θ_C。
本申请中,旋转夹角又可以称为第一夹角。
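作为参考,下面给出计算旋转夹角(第一夹角)的一个示意性Python函数,利用轨迹方向与当前正方向两个向量之间的夹角得到0~180度范围内的角度(实现细节为示例性假设):

```python
import math

def rotation_angle(node, next_node, forward_dir):
    """计算控制节点的旋转夹角:轨迹方向(node→next_node)与正方向forward_dir的夹角。"""
    track_dir = (next_node[0] - node[0], next_node[1] - node[1])
    dot = track_dir[0] * forward_dir[0] + track_dir[1] * forward_dir[1]
    cross = track_dir[0] * forward_dir[1] - track_dir[1] * forward_dir[0]
    return abs(math.degrees(math.atan2(cross, dot)))   # 两方向之间的夹角,0~180度

# 例如初始正方向竖直向上时:rotation_angle(A, B, (0, 1))
```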
下面示例性的介绍几种可能的预设规则。
预设规则一,若控制节点的旋转夹角大于第一阈值,则确定该控制节点的镜头动作为旋转。若控制节点的旋转夹角不大于第一阈值,则确定该控制节点的镜头动作为直行。
该预设规则一通过判断旋转夹角是否大于第一阈值,确定控制节点的镜头动作。可以降低地图状态进行旋转的频率,无需每次轨迹方向改变时都改变地图状态的方位角,提高地图状态的稳定性。在本申请中,上述第一阈值可以称为旋转抑制角度β,旋转抑制角度β可以设置在60度左右。
预设规则二,若控制节点的旋转夹角大于第一阈值,且该控制节点与下一个控制节点的物理距离大于第二阈值,则确定该控制节点的镜头动作为旋转。若控制节点的旋转夹角不大于第一阈值,或该控制节点与下一个控制节点的物理距离不大于第二阈值,则确定该控制节点的镜头动作为直行。
也即是说,当控制节点的旋转夹角大于第一阈值时,若该控制节点与下一个控制节点的物理距离相近(不大于第二阈值),也不能旋转方位角。该预设规则二通过判断旋转夹角是否大于第一阈值,以及控制节点与下一个控制节点的物理距离是否大于第二阈值,共同确定控制节点的镜头动作。可以避免由于两个控制节点的距离太相近,从而在短时间内连续变换两次镜头动作的情况,进一步提高了地图状态的稳定性。在本申请中,上述第二阈值可以称为旋转抑制距离L_β。
预设规则三,若控制节点与下一个控制节点的物理距离大于第四阈值,则确定该控制节点的镜头动作为旋转。
也即是说,当控制节点的旋转夹角不大于第一阈值时,若该控制节点与下一个控制节点的物理距离较远(大于第四阈值),也可以旋转方位角。该预设规则三描述了在控制节点距离下一个控制节点较远的情况下,即使该控制节点的旋转夹角不满足旋转方位角的条件,但为了避免用户长时间无法以正方向观看轨迹,依然可以旋转方位角,保证当前的轨迹前行方向近似为正方向,提升用户体验。在本申请中,上述第四阈值可以称为强制旋转距离L_s。
预设规则四,当控制节点为首节点时,若控制节点的旋转夹角大于第三阈值,则确定该控制节点的镜头动作为旋转。若控制节点的旋转夹角不大于第三阈值,则确定该控制节点的镜头动作为直行。其中,该第三阈值小于上述第一阈值。
该预设规则四描述了在控制节点为首节点的情况下,通过判断旋转夹角是否大于第三阈值,来确定首节点的镜头动作。由于第三阈值小于第一阈值,即首节点比其他控制节点更容易实现旋转方位角的镜头动作。达到在一开始播放动态轨迹时,就能使用户以正方向观看轨迹的效果。在本申请中,上述第三阈值可以称为首次旋转抑制角度λ,首次旋转抑制角度λ可以设置在45度左右。
预设规则五,若控制节点为尾节点,则该控制节点的镜头动作为停止。
本申请中的预设规则可以包括但不限于以上一条或多条预设规则。对于上述预设规则来说,电子设备确定该控制节点的镜头动作为旋转,其中,旋转角度可以为该控制节点的旋转夹角的角度。为了避免对地图状态进行的旋转程度较大,旋转角度还可以小于该旋转夹角的角度。电子设备确定该控制节点的镜头动作为直行,即旋转角度为0。
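结合上述预设规则一至五,下面给出一个示意性的Python函数,按单个控制节点的旋转夹角与到下一个控制节点的物理距离确定镜头动作(各阈值取值以及各规则的判断顺序均为示例性假设,并非本申请限定):

```python
BETA = 60.0        # 旋转抑制角度β(第一阈值),示例取值(度)
LAMBDA = 45.0      # 首次旋转抑制角度λ(第三阈值),示例取值(度)
L_BETA = 50.0      # 旋转抑制距离L_β(第二阈值),示例取值(米)
L_S = 250.0        # 强制旋转距离L_s(第四阈值),示例取值(米)

def camera_action(theta, dist_to_next, is_head=False, is_tail=False):
    """根据预设规则一至五确定单个控制节点的镜头动作。
    theta: 旋转夹角(度);dist_to_next: 与下一个控制节点的物理距离。
    返回 ('rotate', 旋转角度) / ('straight', 0) / ('stop', 0)。"""
    if is_tail:                                        # 预设规则五:尾节点停止
        return ('stop', 0.0)
    if is_head:                                        # 预设规则四:首节点阈值更低,更容易旋转
        return ('rotate', theta) if theta > LAMBDA else ('straight', 0.0)
    if dist_to_next > L_S:                             # 预设规则三:距离足够远时强制旋转
        return ('rotate', theta)
    if theta > BETA and dist_to_next > L_BETA:         # 预设规则一/二:夹角较大且距离不近
        return ('rotate', theta)
    return ('straight', 0.0)                           # 其余情况直行

# 这里旋转角度直接取旋转夹角θ;实际实现中也可以取小于θ的值,以减小旋转幅度。
```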
图5示例性的示出了确定一段轨迹中控制节点的镜头动作的过程,如图5中的a图所示,图5中的a图示例性的示出了一个控制节点集合,包括A、B、C、D、E五个控制节点。依次对每一个控制节点进行分析,确定每一个控制节点的镜头动作。其中,图5中的正北方向为地图中的正北方向,例如,此时图5中的a图的运动轨迹中控制节点B在控制节点A的东北方向。
首先,对于首节点A来说,如图5中的b图所示,获取A的方位角θ_0(AB指向方向与正北方向的夹角),根据预设规则四,θ_0大于第三阈值,则控制节点A的镜头动作确定为旋转,其中旋转角度最大为θ_0。
对于A的下一个控制节点B来说,如图5中的c图所示,获取B的旋转夹角θ。其中θ=|θ_1-θ_2|>180°?360°-|θ_1-θ_2|:|θ_1-θ_2|,也即是说,若|θ_1-θ_2|>180°,则θ=360°-|θ_1-θ_2|;若|θ_1-θ_2|≤180°,则θ=|θ_1-θ_2|。θ_1为前向相邻旋转节点的方位角(这里是A的方位角),θ_2为控制节点B的方位角(BC指向方向与正北方向的夹角)。计算出θ后,根据预设规则一,θ不大于第一阈值,则控制节点B的镜头动作确定为直行。
可选的,若控制节点没有前向相邻旋转节点,则θ_1为正方向的方位角。
对于B的下一个控制节点C来说,如图5中的d图所示,获取C的旋转夹角。计算出C的旋转夹角后,根据预设规则二,该旋转夹角虽然大于第一阈值,但CD的物理距离不大于第二阈值,则控制节点C的镜头动作确定为直行。
对于C的下一个控制节点D来说,如图5中的e图所示,获取D的旋转夹角为θ_3(轨迹方向DE与此时的正方向AB方向的夹角)。计算出D的旋转夹角后,根据预设规则三,DE的物理距离大于第四阈值,则控制节点D的镜头动作确定为旋转,其中旋转角度最大为θ_3。
对于D的下一个控制节点E来说,根据预设规则五,E为尾节点,则控制节点E的镜头动作为停止。
在本申请中,旋转抑制距离L_β和强制旋转距离L_s越长,对地图状态进行的旋转次数就越少,用户体验不好;旋转抑制距离L_β和强制旋转距离L_s越短,对地图状态进行的旋转次数就越多,影响地图状态的稳定性。本申请中,通过设置合理的旋转抑制距离L_β和强制旋转距离L_s,可以很好地控制地图状态变化的稳定性的问题。
在一些可能的实施方式中,旋转抑制距离L_β和强制旋转距离L_s针对不同的轨迹可以取不同的值。举例来说,电子设备计算轨迹的西南坐标和东北坐标组成的矩形框的对角线,该对角线距离为L。定义旋转抑制距离L_β=γ*L,其中γ为强度系数,推荐取值为0.05;以及定义强制旋转距离L_s=η*L,其中η为强度系数,推荐取值为0.25。
这种方式,通过引入轨迹形成的矩形框的对角线的距离L,来确定旋转抑制距离L_β和强制旋转距离L_s。不同的运动轨迹的旋转抑制距离L_β和强制旋转距离L_s也不同,其中L越长,旋转抑制距离L_β和强制旋转距离L_s就越长,对地图状态进行的旋转次数就越少;L越短,旋转抑制距离L_β和强制旋转距离L_s就越短,对地图状态进行的旋转次数就越多。
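下面用一小段示意性Python代码说明根据矩形框对角线L计算旋转抑制距离L_β与强制旋转距离L_s的方式(γ、η取上文推荐值,坐标按平面坐标近似处理,仅作示例):

```python
import math

def rotation_distances(nodes, gamma=0.05, eta=0.25):
    """根据轨迹西南/东北坐标组成的矩形框对角线L,计算(L_β, L_s)。"""
    xs = [p[0] for p in nodes]
    ys = [p[1] for p in nodes]
    south_west = (min(xs), min(ys))                # 西南角坐标
    north_east = (max(xs), max(ys))                # 东北角坐标
    L = math.dist(south_west, north_east)          # 矩形框对角线长度
    return gamma * L, eta * L                      # L_β = γ*L, L_s = η*L
```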
本申请中,对于镜头动作为旋转的控制节点,实际中对地图状态方位角的旋转角度最大为该控制节点的旋转夹角。
上述描述了电子设备确定控制节点以及确定控制节点的镜头动作的两个过程。在电子设备确定了控制节点以及控制节点的镜头动作之后,根据控制节点以及控制节点的镜头动作绘制动态的轨迹回放。
以电子设备的显示区域的中心点为地图状态的中心点坐标,电子设备动态显示的轨迹为根据控制节点连接而成的轨迹,中心点坐标随着电子设备绘制动态的轨迹而不断改变。当电子设备检测到运动轨迹回放至镜头动作为直行的控制节点,旋转角度为0,则中心点坐标以相应的速度移动到下一个控制节点;当电子设备检测到运动轨迹回放至镜头动作为旋转的控制节点,则根据该控制节点对应的旋转角度,对地图状态的方位角进行相应的旋转,调整该运动轨迹的显示方向,同时中心点坐标以相应的速度移动到下一个控制节点;当电子设备绘制到镜头动作为停止的控制节点,则结束绘制,完成轨迹回放。
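下面给出一个示意性的回放主循环(Python),其中 move_center、rotate_map 为假设由地图渲染层提供的回调接口,并非本申请限定的接口:

```python
def play_back(nodes, actions, move_center, rotate_map):
    """按控制节点及其镜头动作驱动地图状态变化的示意性回放循环。
    nodes: 控制节点列表;actions: 每个控制节点的(镜头动作, 旋转角度)。"""
    for i, (action, angle) in enumerate(actions):
        if action == 'stop':                      # 尾节点:结束绘制,完成轨迹回放
            break
        if action == 'rotate':                    # 旋转节点:先调整地图方位角(显示方向)
            rotate_map(angle)
        move_center(nodes[i], nodes[i + 1])       # 中心点坐标以相应速度移动到下一个控制节点
```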
本申请提供的轨迹回放方法,首先选取轨迹的控制节点,根据分段数量K和衰减角度α控制运动轨迹与初始运动轨迹的相似度,再根据旋转抑制角度β、旋转抑制距离L_β和强制旋转距离L_s控制镜头动作的剧烈程度。镜头动作和运动轨迹由不同的参数控制,可以达到在动态轨迹播放时,轨迹保持在显示区域内,同时不会引起视觉上的主观不适感的效果,提高地图状态的稳定性。
在一种可能的实施方式中,电子设备回放的运动轨迹中还包括兴趣点(Point of Interest,POI)。其中POI可以是电子设备预设的点,例如商场、公交站等,还可以是电子设备根据预设功能检测到的点,例如运动轨迹中速度最快的点、心率最快的点等等。
兴趣点可以是在电子设备获取初始运动轨迹之前,电子设备预设的点。例如把运动轨迹中速度最快的点都设置为兴趣点,当电子设备获取到初始运动轨迹后,获取该初始运动轨迹中速度最快的轨迹点,确定为兴趣点。又例如电子设备把特定建筑物设置为兴趣点,当电子设备获取到初始运动轨迹后,若该初始运动轨迹经过了该特定建筑物,则将经过该特定建筑物的轨迹点确定为兴趣点。
兴趣点还可以是在电子设备获取初始运动轨迹之后,用户在该初始运动轨迹上选择兴趣点以及兴趣点对应的镜头动作。
其中,兴趣点对应的镜头动作可以是电子设备预设的镜头动作,也可以是用户选择的镜头动作。兴趣点对应的镜头动作可以包括暂停、缩放等。暂停表示当电子设备绘制到镜头动作为暂停的兴趣点,则将地图状态暂停,暂停时间可以是0.5秒;缩放表示当电子设备绘制到镜头动作为缩放的兴趣点,则对地图显示的比例尺进行缩放,例如在到达兴趣点时对地图进行放大,在离开兴趣点时对地图进行缩小到之前的比例尺。
兴趣点对应的镜头动作还可以是暂停并且缩放。暂停并且缩放表示当电子设备绘制到镜头动作为暂停并且缩放的兴趣点,则在到达兴趣点时对地图进行放大,暂停0.5秒后,离开兴趣点,对地图进行缩小到之前的比例尺。
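下面的示意性Python片段给出兴趣点镜头动作(暂停、缩放、暂停并且缩放)的一种可能处理方式,其中 zoom_in、zoom_out 为示例性假设的地图比例尺缩放回调:

```python
import time

def handle_poi(poi_action, zoom_in, zoom_out, pause_seconds=0.5):
    """兴趣点镜头动作的示意性处理:暂停、缩放或暂停并且缩放。"""
    if poi_action in ('zoom', 'pause_and_zoom'):
        zoom_in()                             # 到达兴趣点时对地图进行放大
    if poi_action in ('pause', 'pause_and_zoom'):
        time.sleep(pause_seconds)             # 将地图状态暂停,例如0.5秒
    if poi_action in ('zoom', 'pause_and_zoom'):
        zoom_out()                            # 离开兴趣点时缩小回之前的比例尺
```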
在该实施方式中,在电子设备确定了控制节点、控制节点的镜头动作以及兴趣点之后,根据控制节点、控制节点的镜头动作以及兴趣点,动态回放该运动轨迹。
本申请实施例中,电子设备在显示区域上显示轨迹回放图标,当电子设备接收到用户作用于该轨迹回放图标的用户操作时,电子设备在显示区域中对运动轨迹进行动态回放。
当电子设备检测到镜头动作为直行的控制节点,地图状态的旋转角度为0,中心点坐标以相应的速度移动到下一个控制节点;当电子设备检测到镜头动作为旋转的控制节点,则根据该控制节点对应的旋转角度,对地图状态的方位角进行相应的旋转,调整该运动轨迹的显示方向,同时中心点坐标以相应的速度移动到下一个控制节点;当电子设备检测到镜头动作为暂停的兴趣点,对地图状态暂停,暂停时间可以是0.5秒,这里不作限制;当电子设备绘制到镜头动作为停止的控制节点,则结束回放,完成对运动轨迹的回放。
可选的,电子设备的显示区域还可以包括文本输入框。该文本输入框用于接收用户输入的文本数据,例如用户输入的时间信息和地点信息。电子设备接收到用户输入的时间信息,可以回放在该时间信息的时间段中的运动轨迹;电子设备接收到用户输入的地点信息,可以回放经过该地点信息中的地点的运动轨迹。
可选的,电子设备的显示区域还可以包括语音输入框。该语音输入框可以接收用户输入的语音信息。电子设备接收到用户输入的语音信息,通过识别语音信息中的关键字,回放相应的运动轨迹。
可选的,电子设备的显示区域还可以包括轨迹列表。该轨迹列表包括一条或多条运动轨迹,电子设备接收到用户针对某一条运动轨迹的点击操作,回放该运动轨迹。
接下来,针对本申请的绘制动态轨迹回放的应用界面作出介绍。如图6所示,图6示例性的示出了直行和旋转两种镜头动作的界面显示图。其中,图6中的a图到图6中的b图显示了镜头动作为直行的轨迹回放界面显示,图6中的b图到图6中的c图显示了镜头动作为旋转的轨迹回放界面显示。
如图6中的a图所示,图6中的a图的显示区域中包括轨迹线601、位置点602、路程显示区603以及速度显示区604。其中,
轨迹线601,在动态的轨迹回放过程中,描述了轨迹的运动过程。
位置点602,当位置点到达控制节点时,根据控制节点对应的镜头动作,控制地图状态。本申请中,位置点602为显示区域的中心点,位置点602所表示的中心点坐标根据控制节点以及控制节点的镜头动作进行改变,则地图状态的中心点坐标也相应改变。
路程显示区603,用于显示当前位置点602在运动轨迹中距离起始点的物理距离。
速度显示区604,用于显示当前位置点602在运动轨迹中的运动速度。
如图6中的a图所示,路程显示区603中的路程为1.90公里,速度显示区604中的运动速度为每公里6分31秒。
当位置点602到达的控制节点的旋转夹角不大于第一阈值时;或者该控制节点与下一个控制节点的物理距离不大于第二阈值时,地图状态为直行状态。如图6中的b图所示,路程显示区603中的路程为1.98公里,可以明显的看出图6中的a图和图6中的b图中地图的中心点坐标发生了改变,而方位角并没有发生改变。也即是说,图6中的a图到图6中的b图,地图状态一直为直行状态。
当位置点602到达的控制节点的旋转夹角大于第一阈值、且该控制节点与下一个控制节点的物理距离大于第二阈值时;或者该控制节点与下一个控制节点的物理距离大于第四阈值时,地图状态为旋转状态。如图6中的c图所示,路程显示区603中的路程为2.05公里,可以明显的看出图6中的b图和图6中的c图中地图的中心点坐标发生了改变,并且方位角也发生了改变。也即是说,图6中的b图到图6中的c图,地图状态为旋转且直行状态。
本申请实施例中,电子设备可以通过接收其他电子设备采集的初始运动轨迹数据进行处理和轨迹回放。举例来说,用户在电子设备上触发电子设备进行轨迹回放,电子设备向服务器端发送轨迹回放请求,接收服务器端发送的初始运动轨迹。电子设备根据接收到的初始运动轨迹,确定运动轨迹的控制节点和控制节点的镜头动作,根据控制节点和控制节点的镜头动作进行轨迹回放。其中,电子设备确定控制节点和控制节点的镜头动作的方式可以参考上述实施例,此处不再赘述。
可选的,电子设备可以通过接收其他电子设备采集和处理过的初始运动轨迹数据进行轨迹回放。举例来说,用户在电子设备上触发电子设备进行轨迹回放,电子设备接收到开启轨迹回放的用户操作后,向服务器端发送轨迹回放请求,接收服务器端发送的控制节点和控制节点的镜头动作。电子设备根据控制节点和控制节点的镜头动作进行轨迹回放。其中,服务器端确定控制节点和控制节点的镜头动作的方式可以参考上述实施例,此处不再赘述。
可选的,电子设备根据登录唯一身份账号的方式从云端服务器获取该电子设备的初始运动轨迹数据,对从云端服务器获取的初始运动轨迹数据进行处理和轨迹回放。
本申请实施例提供的轨迹回放方法还可以运用于地图导航领域。
具体来说,终端设备获取用户选择的起始位置和终点位置,将该起始位置和终点位置发送到服务器,服务器根据该起始位置和终点位置获取两个位置之间的路线,结合已规划的路线和本申请提供的轨迹回放方法计算该路线中的控制节点和控制节点的镜头动作,并发送至终端设备。终端设备根据服务器发送的数据信息动态显示所推荐的路线。
可选的,终端设备获取用户选择的起始位置和终点位置,根据该起始位置和终点位置获取两个位置之间的路线,结合已规划的路线和本申请提供的轨迹回放方法,计算该路线中的控制节点和控制节点的镜头动作。终端设备根据计算出的控制节点和控制节点的镜头动作动态显示所推荐的路线。
为便于理解本申请实施例,以图7所示的电子设备100为例对本申请实施例所适用的电子设备进行介绍。
图7示出了本申请实施例提供的示例性电子设备100的结构示意图。
电子设备100可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,充电管理模块150,电源管理模块151,电池152,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏195,以及用户标识模块(subscriber identification module,SIM)卡接口195等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,气压传感器180C,磁传感器180D,加速度传感器180E,距离传感器180F,接近光传感器180G,指纹传感器180H,温度传感器180J,触摸传感器180K,环境光传感器180L,骨传导传感器180M等。
可以理解的是,本申请实施例示意的结构并不构成对电子设备100的具体限定。在本申请另一些实施例中,电子设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,微控制单元MCU,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也 可以集成在一个或多个处理器中。在一些实施例中,电子设备100也可以包括一个或多个处理器110。
其中,控制器可以是电子设备100的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
在本申请中,处理器110可以用于提取出运动轨迹中的控制节点,然后根据预设规则确定控制节点的旋转角度,处理器110根据控制节点以及控制节点的旋转角度绘制该运动轨迹。当处理器110检测到所述运动轨迹回放至控制节点时,根据控制节点对应的旋转角度,调整运动轨迹的显示方向。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了电子设备100的效率。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路(inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(serial clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器180K,充电器,闪光灯,摄像头193等。例如:处理器110可以通过I2C接口耦合触摸传感器180K,使处理器110与触摸传感器180K通过I2C总线接口通信,实现电子设备100的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。在一些实施例中,音频模块170可以通过I2S接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与无线通信模块160。例如:处理器110通过UART接口与无线通信模块160中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以通过UART接口向无线通信模块160传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器110与显示屏195,摄像头193等外围器件。MIPI接 口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现电子设备100的拍摄功能。处理器110和显示屏195通过DSI接口通信,实现电子设备100的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193,显示屏195,无线通信模块160,音频模块170,传感器模块180等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备100充电,也可以用于电子设备100与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备100的结构限定。在另一些实施例中,电子设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
电子设备100的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备100中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备100上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
在一些实施例中,移动通信模块150提供的无线通信的解决方案可使得电子设备可以与网络中的设备(如服务器)通信,无线通信模块160提供的WLAN无线通信的解决方案也可使得电子设备可以与网络中的设备(如服务器)通信,并可以通过网络中的该设备(如服务器)与云端设备通信。这样,电子设备便可以发现云端设备、传输数据至云端设备。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏195显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备100上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。示例性地,无线通信模块160可以包括蓝牙模块、Wi-Fi模块等。
电子设备100通过GPU,显示屏195,以及应用处理器等可以实现显示功能。GPU为图像处理的微处理器,连接显示屏195和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行指令以生成或改变显示信息。
显示屏195用于显示图像,视频等。显示屏195包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备100可以包括1个或N个显示屏195,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备100可以支持一种或多种视频编解码器。这样,电子设备100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)-1,MPEG-2,MPEG-3,MPEG-4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备100的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐、照片、视频等数据保存在外部存储卡中。
内部存储器121可以用于存储一个或多个计算机程序,该一个或多个计算机程序包括指令。处理器110可以通过运行存储在内部存储器121的上述指令,从而使得电子设备100执行本申请一些实施例中所提供的数据分享的方法,以及各种功能应用以及数据处理等。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统;该存储程序区还可以存储一个或多个应用程序(比如图库、联系人等)等。存储数据区可存储电子设备100使用过程中所创建的数据(比如照片,联系人等)。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
电子设备100可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备100可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备100接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。电子设备100可以设置至少一个麦克风170C。在另一些实施例中,电子设备100可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,电子设备100还可以设置三个,四个或更多麦克风170C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏195。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。电子设备100根据电容的变化确定压力的强度。当有触摸操作作用于显示屏195,电子设备100根据压力传感器180A检测所述触摸操作强度。电子设备100也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作用于短消息应用图标时,执行新建短消息的指令。
陀螺仪传感器180B可以用于确定电子设备100的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备100围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器180B检测电子设备100抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消电子设备100的抖动,实现防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景。
气压传感器180C用于测量气压。在一些实施例中,电子设备100通过气压传感器180C测得的气压值计算海拔高度,辅助定位和导航。
磁传感器180D包括霍尔传感器。电子设备100可以利用磁传感器180D检测翻盖皮套的开合。在一些实施例中,当电子设备100是翻盖机时,电子设备100可以根据磁传感器180D检测翻盖的开合。进而根据检测到的皮套的开合状态或翻盖的开合状态,设置翻盖自动解锁等特性。
加速度传感器180E可检测电子设备100在各个方向上(一般为三轴)加速度的大小。当电子设备100静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应用于横竖屏切换,计步器等应用。
距离传感器180F,用于测量距离。电子设备100可以通过红外或激光测量距离。在一些实施例中,拍摄场景,电子设备100可以利用距离传感器180F测距以实现快速对焦。
接近光传感器180G可以包括例如发光二极管(LED)和光检测器,例如光电二极管。发光二极管可以是红外发光二极管。电子设备100通过发光二极管向外发射红外光。电子设备100使用光电二极管检测来自附近物体的红外反射光。当检测到充分的反射光时,可以确定电子设备100附近有物体。当检测到不充分的反射光时,电子设备100可以确定电子设备100附近没有物体。电子设备100可以利用接近光传感器180G检测用户手持电子设备100贴近耳朵通话,以便自动熄灭屏幕达到省电的目的。接近光传感器180G也可用于皮套模式,口袋模式自动解锁与锁屏。
环境光传感器180L用于感知环境光亮度。电子设备100可以根据感知的环境光亮度自适应调节显示屏195亮度。环境光传感器180L也可用于拍照时自动调节白平衡。环境光传感器180L还可以与接近光传感器180G配合,检测电子设备100是否在口袋里,以防误触。
指纹传感器180H用于采集指纹。电子设备100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照,指纹接听来电等。
温度传感器180J用于检测温度。在一些实施例中,电子设备100利用温度传感器180J检测的温度,执行温度处理策略。例如,当温度传感器180J上报的温度超过阈值,电子设备100执行降低位于温度传感器180J附近的处理器的性能,以便降低功耗实施热保护。在另一些实施例中,当温度低于另一阈值时,电子设备100对电池152加热,以避免低温导致电子设备100异常关机。在其他一些实施例中,当温度低于又一阈值时,电子设备100对电池152的输出电压执行升压,以避免低温导致的异常关机。
触摸传感器180K,也可称触控面板或触敏表面。触摸传感器180K可以设置于显示屏195,由触摸传感器180K与显示屏195组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏195提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备100的表面,与显示屏195所处的位置不同。
骨传导传感器180M可以获取振动信号。在一些实施例中,骨传导传感器180M可以获取人体声部振动骨块的振动信号。骨传导传感器180M也可以接触人体脉搏,接收血压跳动信号。在一些实施例中,骨传导传感器180M也可以设置于耳机中,结合成骨传导耳机。音频模块170可以基于所述骨传导传感器180M获取的声部振动骨块的振动信号,解析出语音信号,实现语音功能。应用处理器可以基于所述骨传导传感器180M获取的血压跳动信号解析心率信息,实现心率检测功能。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备100可以接收按键输入,产生与电子设备100的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏195不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口195用于连接SIM卡。SIM卡可以通过插入SIM卡接口195,或从SIM卡接口195拔出,实现和电子设备100的接触和分离。电子设备100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口195可以支持Nano SIM卡,Micro SIM卡,SIM卡等。同一个SIM卡接口195可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口195也可以兼容不同类型的SIM卡。SIM卡接口195也可以兼容外部存储卡。电子设备100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,电子设备100采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在电子设备100中,不能和电子设备100分离。
图7示例性所示的电子设备100可以通过显示屏195显示各个用户界面。电子设备100可以通过触摸传感器180K在各个用户界面中检测触控操作,例如在各个用户界面中的点击操作(如在图标上的触摸操作、双击操作),又例如在各个用户界面中的向上或向下的滑动操作,或执行画圆圈手势的操作,等等。在一些实施例中,电子设备100可以通过陀螺仪传感器180B、加速度传感器180E等检测用户手持电子设备100执行的运动手势,例如晃动电子设备。在一些实施例中,电子设备100可以通过摄像头193(如3D摄像头、深度摄像头)检测非触控的手势操作。
电子设备100的软件系统可以采用分层架构,事件驱动架构,微核架构,微服务架构,或云架构。本申请实施例以分层架构的Android系统为例,示例性说明电子设备100的软件结构。
图8是本申请实施例的电子设备100的软件结构框图。
分层架构将软件分成若干个层,每一层都有清晰的角色和分工。层与层之间通过软件接口通信。在一些实施例中,将Android系统分为四层,从上至下分别为应用程序层,应用程序框架层,安卓运行时(Android runtime)和系统库,以及内核层。
应用程序层可以包括一系列应用程序包。
如图8所示,应用程序包可以包括相机,图库,日历,通话,地图,导航,WLAN,蓝牙,音乐,视频,短信息等应用程序。
应用程序框架层为应用程序层的应用程序提供应用编程接口(application programming interface,API)和编程框架。应用程序框架层包括一些预先定义的函数。
如图8所示,应用程序框架层可以包括窗口管理器,内容提供器,视图系统,电话管理器,资源管理器,通知管理器等。
窗口管理器用于管理窗口程序。窗口管理器可以获取显示屏大小,判断是否有状态栏, 锁定屏幕,截取屏幕等。
内容提供器用来存放和获取数据,并使这些数据可以被应用程序访问。所述数据可以包括视频,图像,音频,拨打和接听的电话,浏览历史和书签,电话簿等。
视图系统包括可视控件,例如显示文字的控件,显示图片的控件等。视图系统可用于构建应用程序。显示界面可以由一个或多个视图组成的。例如,包括短信通知图标的显示界面,可以包括显示文字的视图以及显示图片的视图。
电话管理器用于提供电子设备100的通信功能。例如通话状态的管理(包括接通,挂断等)。
资源管理器为应用程序提供各种资源,比如本地化字符串,图标,图片,布局文件,视频文件等等。
通知管理器使应用程序可以在状态栏中显示通知信息,可以用于传达告知类型的消息,可以短暂停留后自动消失,无需用户交互。比如通知管理器被用于告知下载完成,消息提醒等。通知管理器还可以是以图表或者滚动条文本形式出现在系统顶部状态栏的通知,例如后台运行的应用程序的通知,还可以是以对话窗口形式出现在屏幕上的通知。例如在状态栏提示文本信息,发出提示音,电子设备振动,指示灯闪烁等。
Android Runtime包括核心库和虚拟机。Android runtime负责安卓系统的调度和管理。
核心库包含两部分:一部分是java语言需要调用的功能函数,另一部分是安卓的核心库。
应用程序层和应用程序框架层运行在虚拟机中。虚拟机将应用程序层和应用程序框架层的java文件执行为二进制文件。虚拟机用于执行对象生命周期的管理,堆栈管理,线程管理,安全和异常的管理,以及垃圾回收等功能。
系统库可以包括多个功能模块。例如:表面管理器(surface manager),媒体库(Media Libraries),三维图形处理库(例如:OpenGL ES),2D图形引擎(例如:SGL)等。
表面管理器用于对显示子系统进行管理,并且为多个应用程序提供了2D和3D图层的融合。
媒体库支持多种常用的音频,视频格式回放和录制,以及静态图像文件等。媒体库可以支持多种音视频编码格式,例如:MPEG5,H.265,MP3,AAC,AMR,JPG,PNG等。
三维图形处理库用于实现三维图形绘图,图像渲染,合成,和图层处理等。
2D图形引擎是2D绘图的绘图引擎。
内核层是硬件和软件之间的层。内核层至少包含显示驱动,摄像头驱动,音频驱动,传感器驱动。
图8所示的软件系统涉及到使用轨迹回放能力的应用呈现(如图库,文件管理器),提供分享能力的即时分享模块,提供定位能力的地图导航模块,以及应用框架层提供WLAN服务、蓝牙服务,以及内核和底层提供WLAN蓝牙能力和基本通信协议。
本申请实施例还提供了一种计算机可读存储介质。上述方法实施例中的全部或者部分流程可以由计算机程序来指令相关的硬件完成,该程序可存储于上述计算机存储介质中,该程序在执行时,可包括如上述各方法实施例的流程。该计算机可读存储介质包括:只读存储器(read-only memory,ROM)或随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可存储程序代码的介质。
作为一种可选的设计,计算机可读存储介质可以包括RAM,ROM,EEPROM,CD-ROM或其它光盘存储器,磁盘存储器或其它磁存储设备,或可用于以指令或数据结构的形式承载或存储所需程序代码、并且可由计算机访问的任何其它介质。而且,任何连接被适当地称为计算机可读介质。例如,如果使用同轴电缆,光纤电缆,双绞线,数字用户线(DSL)或无线技术(如红外,无线电和微波)从网站,服务器或其它远程源传输软件,则同轴电缆,光纤电缆,双绞线,DSL或诸如红外,无线电和微波之类的无线技术包括在介质的定义中。如本文所使用的磁盘和光盘包括压缩光盘(CD),激光盘,光盘,数字通用光盘(DVD),软盘和蓝光盘,其中磁盘通常以磁性方式再现数据,而光盘利用激光光学地再现数据。上述的组合也应包括在计算机可读介质的范围内。
本申请实施例还提供了一种计算机程序产品。上述方法实施例中描述的方法可以全部或部分地通过软件、硬件、固件或者其任意组合来实现。如果在软件中实现,可以全部或者部分得通过计算机程序产品的形式实现。计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行上述计算机指令时,全部或部分地产生按照上述方法实施例中描述的流程或功能。上述计算机可以是通用计算机、专用计算机、计算机网络、网络设备、用户设备或者其它可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者通过所述计算机可读存储介质进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质,(例如,软盘、硬盘、磁带)、光介质(例如,DVD)、或者半导体介质(例如,固态硬盘(solid state disk,SSD))等。
本申请实施例方法中的步骤可以根据实际需要进行顺序调整、合并和删减。
本申请实施例装置中的模块可以根据实际需要进行合并、划分和删减。
以上所述,以上实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的范围。

Claims (22)

  1. 一种轨迹回放的方法,其特征在于,包括:
    电子设备获取待回放的运动轨迹,所述运动轨迹包括多个控制节点;
    所述电子设备提取出所述运动轨迹中相邻的第一控制节点和第二控制节点;
    所述电子设备根据所述第一控制节点、所述第二控制节点以及第一方向,确定所述第一控制节点的旋转角度;其中,在所述运动轨迹中,所述第二控制节点在所述第一控制节点之后;
    当所述电子设备检测到所述运动轨迹回放至所述第一控制节点时,所述电子设备根据所述第一控制节点对应的旋转角度,调整所述运动轨迹的显示方向。
  2. 根据权利要求1所述的方法,其特征在于,所述电子设备根据所述第一控制节点、所述第二控制节点以及第一方向,确定所述第一控制节点的旋转角度,具体包括:
    当所述电子设备检测所述第一方向与所述第一控制节点和所述第二控制节点之间连线所成的第一夹角小于或等于第一阈值时,所述电子设备确定所述第一控制节点对应的旋转角度为0。
  3. 根据权利要求1或2所述的方法,其特征在于,所述电子设备根据所述第一控制节点、所述第二控制节点以及第一方向,确定所述第一控制节点的旋转角度,具体包括:
    当所述电子设备检测所述第一方向与所述第一控制节点和所述第二控制节点之间连线所构成的第一夹角大于第一阈值时,所述电子设备确定所述第一控制节点对应的旋转角度为所述第一夹角。
  4. 根据权利要求1所述的方法,其特征在于,所述电子设备根据所述第一控制节点、所述第二控制节点以及第一方向,确定所述第一控制节点的旋转角度,具体包括:
    当所述电子设备检测所述第一方向与所述第一控制节点和所述第二控制节点之间连线所成的第一夹角小于或等于第一阈值时,或所述第一控制节点和所述第二控制节点之间连线的距离小于或等于第二阈值,所述电子设备确定所述第一控制节点对应的旋转角度为0。
  5. 根据权利要求1或4所述的方法,其特征在于,所述电子设备根据所述第一控制节点、所述第二控制节点以及第一方向,确定所述第一控制节点的旋转角度,具体包括:
    当所述电子设备检测所述第一方向与所述第一控制节点和所述第二控制节点之间连线所成的第一夹角大于第一阈值时,且所述第一控制节点和所述第二控制节点之间连线的距离大于第二阈值,所述电子设备确定所述第一控制节点对应的旋转角度为所述第一夹角。
  6. 根据权利要求2-5任一项所述的方法,其特征在于,所述电子设备根据所述第一控制节点、所述第二控制节点以及第一方向,确定所述第一控制节点的旋转角度,具体包括:
    当所述第一控制节点为所述运动轨迹中的第一个控制节点时,当所述电子设备检测所述第一方向与所述第一控制节点和所述第二控制节点之间连线所构成的第一夹角大于第三阈值,所述电子设备确定所述第一控制节点对应的旋转角度为所述第一夹角;其中,所述第三阈值小于所述第一阈值。
  7. 根据权利要求2-6任一项所述的方法,其特征在于,所述电子设备根据所述第一控制节点、所述第二控制节点以及第一方向,确定所述第一控制节点的旋转角度,具体包括:
    当所述电子设备检测所述第一控制节点和所述第二控制节点之间连线的距离大于第四阈值,所述电子设备确定所述第一控制节点对应的旋转角度为所述第一夹角。
  8. 根据权利要求1所述的方法,其特征在于,在电子设备获取待回放的运动轨迹之前,所述方法还包括:
    所述电子设备获取到所述电子设备的初始运动轨迹;
    所述电子设备从所述初始运动轨迹中确定出所述多个轨迹点;
    所述电子设备通过所述多个轨迹点的位置信息,从所述多个轨迹点中确定出所述多个控制节点。
  9. 根据权利要求8所述的方法,其特征在于,所述电子设备通过所述多个轨迹点的位置信息,从所述多个轨迹点中确定出所述多个控制节点,具体包括:
    当所述电子设备检测所述多个轨迹点中目标轨迹点分别与其相邻的两个轨迹点连成的线段所构成的夹角大于第五阈值,所述电子设备判断所述目标轨迹点为冗余节点;
    所述电子设备在所述多个轨迹点中删除所述冗余节点,所述多个轨迹点中剩余的轨迹点为所述多个控制节点。
  10. 根据权利要求8或9所述的方法,其特征在于,所述电子设备获取待回放的运动轨迹,具体包括:
    所述电子设备将所述多个控制节点连线成所述运动轨迹。
  11. 一种电子设备,其特征在于,包括:一个或多个处理器、存储器和显示屏;
    所述存储器、所述显示屏与所述一个或多个处理器耦合,所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令,所述一个或多个处理器调用所述计算机指令以使得执行:
    获取待回放的运动轨迹,所述运动轨迹包括多个控制节点;
    提取出所述运动轨迹中相邻的第一控制节点和第二控制节点;
    根据所述第一控制节点、所述第二控制节点以及第一方向,确定所述第一控制节点的旋转角度;其中,在所述运动轨迹中,所述第二控制节点在所述第一控制节点之后;
    当检测到所述运动轨迹回放至所述第一控制节点时,根据所述第一控制节点对应的旋转角度,调整所述运动轨迹的显示方向。
  12. 根据权利要求11所述的电子设备,其特征在于,根据所述第一控制节点、所述第二控制节点以及第一方向,确定所述第一控制节点的旋转角度,具体包括:
    当检测所述第一方向与所述第一控制节点和所述第二控制节点之间连线所成的第一夹角小于或等于第一阈值时,确定所述第一控制节点对应的旋转角度为0。
  13. 根据权利要求11或12所述的电子设备,其特征在于,根据所述第一控制节点、所述第二控制节点以及第一方向,确定所述第一控制节点的旋转角度,具体包括:
    当检测所述第一方向与所述第一控制节点和所述第二控制节点之间连线所构成的第一夹角大于第一阈值时,确定所述第一控制节点对应的旋转角度为所述第一夹角。
  14. 根据权利要求11所述的电子设备,其特征在于,根据所述第一控制节点、所述第二控制节点以及第一方向,确定所述第一控制节点的旋转角度,具体包括:
    当检测所述第一方向与所述第一控制节点和所述第二控制节点之间连线所成的第一夹角小于或等于第一阈值时,或所述第一控制节点和所述第二控制节点之间连线的距离小于或等于第二阈值,确定所述第一控制节点对应的旋转角度为0。
  15. 根据权利要求11或14所述的电子设备,其特征在于,根据所述第一控制节点、所述第二控制节点以及第一方向,确定所述第一控制节点的旋转角度,具体包括:
    当检测所述第一方向与所述第一控制节点和所述第二控制节点之间连线所成的第一夹角大于第一阈值时,且所述第一控制节点和所述第二控制节点之间连线的距离大于第二阈值,确定所述第一控制节点对应的旋转角度为所述第一夹角。
  16. 根据权利要求12-15任一项所述的电子设备,其特征在于,根据所述第一控制节点、所述第二控制节点以及第一方向,确定所述第一控制节点的旋转角度,具体包括:
    当所述第一控制节点为所述运动轨迹中的第一个控制节点时,当检测所述第一方向与所述第一控制节点和所述第二控制节点之间连线所构成的第一夹角大于第三阈值,确定所述第一控制节点对应的旋转角度为所述第一夹角;其中,所述第三阈值小于所述第一阈值。
  17. 根据权利要求12-16任一项所述的电子设备,其特征在于,根据所述第一控制节点、所述第二控制节点以及第一方向,确定所述第一控制节点的旋转角度,具体包括:
    当检测所述第一控制节点和所述第二控制节点之间连线的距离大于第四阈值,确定所述第一控制节点对应的旋转角度为所述第一夹角。
  18. 根据权利要求11所述的电子设备,其特征在于,在获取待回放的运动轨迹之前,所述方法还包括:
    获取到所述电子设备的初始运动轨迹;
    从所述初始运动轨迹中确定出所述多个轨迹点;
    通过所述多个轨迹点的位置信息,从所述多个轨迹点中确定出所述多个控制节点。
  19. 根据权利要求18所述的电子设备,其特征在于,通过所述多个轨迹点的位置信息,从所述多个轨迹点中确定出所述多个控制节点,具体包括:
    当检测所述多个轨迹点中目标轨迹点分别与其相邻的两个轨迹点连成的线段所构成的夹角大于第五阈值,判断所述目标轨迹点为冗余节点;
    在所述多个轨迹点中删除所述冗余节点,所述多个轨迹点中剩余的轨迹点为所述多个控制节点。
  20. 根据权利要求18或19所述的电子设备,其特征在于,获取待回放的运动轨迹,具体包括:
    将所述多个控制节点连线成所述运动轨迹。
  21. 一种计算机存储介质,其特征在于,包括计算机指令,当所述计算机指令在电子设备上运行时,使得所述电子设备执行如权利要求1至10任一项所述的方法。
  22. 一种计算机程序产品,其特征在于,当所述计算机程序产品在计算机上运行时,使得所述计算机执行如权利要求1至10任一项所述的方法。
PCT/CN2021/088805 2020-04-23 2021-04-21 轨迹回放方法及相关装置 WO2021213451A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010329410.6 2020-04-23
CN202010329410.6A CN113554932B (zh) 2020-04-23 2020-04-23 轨迹回放方法及装置

Publications (1)

Publication Number Publication Date
WO2021213451A1 true WO2021213451A1 (zh) 2021-10-28

Family

ID=78129459

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/088805 WO2021213451A1 (zh) 2020-04-23 2021-04-21 轨迹回放方法及相关装置

Country Status (2)

Country Link
CN (1) CN113554932B (zh)
WO (1) WO2021213451A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114117878A (zh) * 2021-11-29 2022-03-01 中国人民解放军国防科技大学 一种基于改进粒子群寻优的目标运动轨迹分段压缩方法
CN116384209A (zh) * 2023-05-30 2023-07-04 江苏伟岸纵横科技股份有限公司 一种用于应急模拟演练的灾害仿真方法
CN117707368A (zh) * 2023-08-31 2024-03-15 荣耀终端有限公司 移动轨迹拟合方法及电子设备
CN117972357A (zh) * 2024-03-26 2024-05-03 山东科瑞特自动化装备有限责任公司 一种水位测量装置的水位监测数据智能处理方法

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101901551A (zh) * 2010-06-29 2010-12-01 上海英迪信息技术有限公司 车辆监控系统中轨迹回放功能的优化方法
CN103500516A (zh) * 2013-09-26 2014-01-08 深圳市宏电技术股份有限公司 基于电子地图高效率轨迹回放的方法及系统
CN105719351A (zh) * 2014-12-04 2016-06-29 高德软件有限公司 一种显示电子地图的方法和装置
JP2018069753A (ja) * 2016-10-24 2018-05-10 アイシン・エィ・ダブリュ株式会社 運転状況表示システムおよび運転状況表示プログラム
CN109238283A (zh) * 2018-08-24 2019-01-18 广东小天才科技有限公司 一种方向修正方法、装置、设备及存储介质
US20190212164A1 (en) * 2018-01-08 2019-07-11 Alpine Electronics, Inc. Systems and methods for providing direction guidance during off-road routing

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0304358D0 (en) * 2003-02-26 2003-04-02 Palmtop Software B V Navigator 2.0 features
JP2005249589A (ja) * 2004-03-04 2005-09-15 Xanavi Informatics Corp ナビゲーション装置、要約地図配信装置、車両案内方法および地図表示装置
KR20070038859A (ko) * 2005-10-07 2007-04-11 주식회사 현대오토넷 네비게이션 시스템의 지도표시방법
KR20090056219A (ko) * 2007-11-30 2009-06-03 엘지전자 주식회사 네비게이션의 지도 표시 방법 및 장치
JP5382007B2 (ja) * 2010-02-22 2014-01-08 株式会社デンソー 移動軌跡表示装置
JP2012026844A (ja) * 2010-07-22 2012-02-09 Sony Corp 情報処理装置、情報処理方法、及びプログラム
CN103185586B (zh) * 2011-12-30 2017-11-07 上海博泰悦臻电子设备制造有限公司 地图显示方法以及控制地图显示的装置、导航装置
US9470543B2 (en) * 2012-08-30 2016-10-18 Mitsubishi Electric Corporation Navigation apparatus
CN103927795B (zh) * 2013-01-14 2016-08-17 北京中交兴路信息科技有限公司 一种车辆历史行驶轨迹的回放方法和系统
WO2015010165A1 (en) * 2013-07-23 2015-01-29 National Ict Australia Limited Geo-located activity visualisation, editing and sharing
CN106528555B (zh) * 2015-09-10 2019-03-01 中国科学院上海高等研究院 一种快速构建建筑物三维模型的系统
US20180283873A1 (en) * 2015-09-30 2018-10-04 Huawei Technologies Co., Ltd. Calibration method based on dead reckoning technology and portable electronic device
US10271021B2 (en) * 2016-02-29 2019-04-23 Microsoft Technology Licensing, Llc Vehicle trajectory determination to stabilize vehicle-captured video
CN107741944B (zh) * 2017-08-09 2020-04-07 成都路行通信息技术有限公司 一种电子地图仿真轨迹回放方法及系统
CN108344422B (zh) * 2018-02-09 2021-03-30 城市生活(北京)资讯有限公司 一种导航方法及系统
CN109959379B (zh) * 2019-02-13 2021-06-08 歌尔科技有限公司 定位方法及电子设备
CN110553651A (zh) * 2019-09-26 2019-12-10 众虎物联网(广州)有限公司 一种室内导航方法、装置、终端设备及存储介质

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101901551A (zh) * 2010-06-29 2010-12-01 上海英迪信息技术有限公司 车辆监控系统中轨迹回放功能的优化方法
CN103500516A (zh) * 2013-09-26 2014-01-08 深圳市宏电技术股份有限公司 基于电子地图高效率轨迹回放的方法及系统
CN105719351A (zh) * 2014-12-04 2016-06-29 高德软件有限公司 一种显示电子地图的方法和装置
JP2018069753A (ja) * 2016-10-24 2018-05-10 アイシン・エィ・ダブリュ株式会社 運転状況表示システムおよび運転状況表示プログラム
US20190212164A1 (en) * 2018-01-08 2019-07-11 Alpine Electronics, Inc. Systems and methods for providing direction guidance during off-road routing
CN109238283A (zh) * 2018-08-24 2019-01-18 广东小天才科技有限公司 一种方向修正方法、装置、设备及存储介质

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114117878A (zh) * 2021-11-29 2022-03-01 中国人民解放军国防科技大学 一种基于改进粒子群寻优的目标运动轨迹分段压缩方法
CN116384209A (zh) * 2023-05-30 2023-07-04 江苏伟岸纵横科技股份有限公司 一种用于应急模拟演练的灾害仿真方法
CN116384209B (zh) * 2023-05-30 2023-08-08 江苏伟岸纵横科技股份有限公司 一种用于应急模拟演练的灾害仿真方法
CN117707368A (zh) * 2023-08-31 2024-03-15 荣耀终端有限公司 移动轨迹拟合方法及电子设备
CN117972357A (zh) * 2024-03-26 2024-05-03 山东科瑞特自动化装备有限责任公司 一种水位测量装置的水位监测数据智能处理方法
CN117972357B (zh) * 2024-03-26 2024-06-07 山东科瑞特自动化装备有限责任公司 一种水位测量装置的水位监测数据智能处理方法

Also Published As

Publication number Publication date
CN113554932B (zh) 2022-07-19
CN113554932A (zh) 2021-10-26

Similar Documents

Publication Publication Date Title
JP7142783B2 (ja) 音声制御方法及び電子装置
WO2020228815A1 (zh) 一种语音唤醒方法及设备
WO2021213451A1 (zh) 轨迹回放方法及相关装置
CN115473957B (zh) 一种图像处理方法和电子设备
WO2021082835A1 (zh) 启动功能的方法及电子设备
WO2021036770A1 (zh) 一种分屏处理方法及终端设备
CN113838490B (zh) 视频合成方法、装置、电子设备及存储介质
CN109819306B (zh) 一种媒体文件裁剪的方法、电子设备和服务器
WO2021175272A1 (zh) 一种应用信息的显示方法及相关设备
WO2021082815A1 (zh) 一种显示要素的显示方法和电子设备
WO2020239001A1 (zh) 一种哼唱识别方法及相关设备
WO2022160991A1 (zh) 权限控制方法和电子设备
WO2022042769A2 (zh) 多屏交互的系统、方法、装置和介质
WO2021185352A1 (zh) 一种版本升级方法及相关装置
CN111222836A (zh) 一种到站提醒方法及相关装置
WO2022143258A1 (zh) 一种语音交互处理方法及相关装置
CN113641634A (zh) 一种日志流量控制的方法以及电子设备
CN112437341B (zh) 一种视频流处理方法及电子设备
CN113448658A (zh) 截屏处理的方法、图形用户接口及终端
WO2022135157A1 (zh) 页面显示的方法、装置、电子设备以及可读存储介质
WO2021104000A1 (zh) 屏幕显示方法及电子设备
CN116828100A (zh) 蓝牙音频播放方法、电子设备及存储介质
CN116561085A (zh) 图片分享方法和电子设备
WO2023116669A1 (zh) 视频生成系统、方法及相关装置
CN116668762B (zh) 录屏方法及装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21792072

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21792072

Country of ref document: EP

Kind code of ref document: A1