CN106780684B - Animation effect realization method and device - Google Patents


Info

Publication number
CN106780684B
CN106780684B (application CN201710015061.9A)
Authority
CN
China
Prior art keywords
point
transition
plane
svg
line
Prior art date
Legal status
Active
Application number
CN201710015061.9A
Other languages
Chinese (zh)
Other versions
CN106780684A (en)
Inventor
陈翔
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201710015061.9A priority Critical patent/CN106780684B/en
Publication of CN106780684A publication Critical patent/CN106780684A/en
Application granted granted Critical
Publication of CN106780684B publication Critical patent/CN106780684B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/80 2D [Two Dimensional] animation, e.g. using sprites

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The embodiment of the invention discloses an animation effect implementation method and apparatus. The method comprises the following steps: drawing a Scalable Vector Graphics (SVG) line according to a root data source; sequentially acquiring the plane coordinates of the start point, each inflection point, and the end point of the SVG line to form a plane point set; and adding a transition attribute to a custom object and setting a transition time, so that the custom object moves continuously between adjacent points of the SVG line according to the plane point set and the transition time, thereby forming an animation effect.

Description

Animation effect realization method and device
Technical Field
Embodiments of the invention relate to the technical field of data processing, and in particular to an animation effect implementation method and apparatus.
Background
At present, both the iOS (Apple's mobile operating system) client and the Android (the mobile operating system developed by Google) client can realize such animation effects, while the HTML5 (Hyper Text Markup Language 5, H5) client cannot. The main reason is that the iOS and Android clients provide native map capability, so an animation effect can be realized by relying directly on the corresponding Application Programming Interface (API), whereas the H5 client provides no such interface and therefore cannot realize the animation effect by relying on a related interface.
Disclosure of Invention
In view of this, embodiments of the present invention provide an animation effect implementation method and apparatus that can realize animation effects without depending on such a related interface.
The animation effect implementation method provided by an embodiment of the invention comprises the following steps:
drawing a Scalable Vector Graphics (SVG) line according to a root data source;
sequentially acquiring the plane coordinates of the start point, each inflection point, and the end point of the SVG line to form a plane point set;
adding a transition attribute to a custom object and setting a transition time, so that the custom object moves continuously between adjacent points of the SVG line according to the plane point set and the transition time, thereby forming an animation effect.
The animation effect implementation apparatus provided by an embodiment of the invention comprises:
a drawing unit, configured to draw a Scalable Vector Graphics (SVG) line according to a root data source;
a point set forming unit, configured to sequentially acquire the plane coordinates of the start point, each inflection point, and the end point of the SVG line to form a plane point set; and
an animation realization unit, configured to add a transition attribute to a custom object and set a transition time, so that the custom object moves continuously between adjacent points of the SVG line according to the plane point set and the transition time, thereby forming an animation effect.
According to embodiments of the invention, animation effects can be realized without depending on related interfaces, by relying instead on the root data source: an SVG line is drawn in real time directly from the root data source, the plane coordinates of the start point, inflection points, and end point of the drawn SVG line are taken to form a plane point set, and a transition attribute is added to the custom object and a transition time is set, so that the custom object moves continuously between adjacent points of the SVG line according to the plane point set and the transition time, thereby forming an animation effect.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a scene diagram of an animation effect implementation method according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method for implementing animation effects according to an embodiment of the present invention;
FIG. 3 is another flow chart of a method for implementing animation effects according to an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of an animation effect implementing apparatus according to an embodiment of the present invention;
FIG. 5 is another schematic structural diagram of an animation effect implementing apparatus according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Because the H5 end lacks a related interface for realizing animation effects, animation effects cannot be realized at the H5 end in the prior art. Embodiments of the invention therefore provide an animation effect implementation method and apparatus that can realize animation effects without depending on such an interface, providing an effective solution for realizing animation effects at the H5 end. The animation effect implementation method provided by the embodiments of the invention can be implemented in an animation effect implementation apparatus, which may be a mobile phone, a tablet computer, or another terminal device. A specific implementation scenario of the method is shown in FIG. 1: first, a Scalable Vector Graphics (SVG) line is drawn according to a root data source; then the plane coordinates (i.e., plane rectangular coordinates, in which a point is represented by x and y values) of the start point, each inflection point, and the end point of the SVG line (e.g., points A, B, C, and D in FIG. 1) are acquired in sequence to form a plane point set; finally, a transition attribute is added to a custom object and a transition time is set, so that the custom object moves continuously between adjacent points of the SVG line (such as A and B, B and C, C and D) according to the plane point set and the transition time, forming an animation effect. In the method provided by the embodiments of the invention, the SVG line is generated directly from the root data source; after the transition attribute is added to the custom object and the transition time is set, the custom object switches its motion between adjacent points of the SVG line according to the plane point set and the transition time, thereby realizing the animation effect.
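For intuition, the following is a minimal browser-side sketch of the core idea (not code taken from the embodiment): a DOM marker is given a CSS transition on its left and top properties, so that simply updating its coordinates makes the browser animate the move from one plane point to the next. The element, coordinate values, and 2-second transition time are illustrative assumptions.
var marker = document.createElement('div');                        // stand-in for the custom object
marker.style.cssText = 'position:absolute;left:0px;top:0px;width:10px;height:10px;background:red;';
document.body.appendChild(marker);                                  // the marker starts at plane point A (0, 0)
marker.style.transition = 'left 2s linear, top 2s linear';          // transition-duration is the transition time
setTimeout(function () {                                            // wait until the start position has been rendered
    marker.style.left = '100px';                                    // move towards plane point B (100, 40)
    marker.style.top = '40px';                                      // the browser animates the move over 2 s
}, 100);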
Detailed descriptions are given below; the numbering of the following embodiments is not intended to limit their preferred order.
Example one
As shown in fig. 2, the method of the present embodiment includes the following steps:
Step 201: draw an SVG line according to a root data source.
SVG is an XML-based (Extensible Markup Language) graphics format for describing two-dimensional vector graphics; it was developed by the World Wide Web Consortium and is an open standard. SVG strictly follows XML syntax and describes image content with a descriptive, text-format language, so it is a vector graphics format independent of image resolution. Because vector graphics describe objects with points and lines, the files are small while still providing high-definition pictures suitable for direct printing or output. This embodiment therefore draws the SVG line according to the root data source. A specific drawing method may be as follows:
(1) planning a custom line according to the root data source;
That is, an animation line is planned directly from the data provided by the root data source.
(2) acquiring the geographic coordinates of the start point, each inflection point, and the end point of the custom line to form a geographic point set;
The geographic coordinates are spherical coordinates that represent the positions of ground points by longitude and latitude; the geographic point set comprises the geographic coordinates of the start point, each inflection point, and the end point of the custom line.
(3) drawing the SVG line on a map according to the geographic point set.
In a specific implementation, the map may provide a polyline (Polyline) function for drawing a polyline formed by connecting a series of specified points. The function is declared as follows:
BOOL Polyline(LPPOINT lpPoints, int nCount);
Here the parameter lpPoints points to an array of POINT structures or CPoint objects in which the coordinates of the polyline's connection points are stored in order, and nCount is the number of connection points, which must be greater than 1. If the line is drawn successfully the function returns TRUE; otherwise it returns FALSE.
Specifically, in this embodiment, the SVG line may be drawn on the map according to the geographic point set by using the Polyline function provided by the map. The SVG line is the map line that is finally drawn and displayed on the map according to the root data source; in other words, the SVG line is the representation of the custom line on the map.
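The Polyline capability above belongs to whichever map component is used, and the embodiment does not specify its API. Purely for illustration, the sketch below builds the line as a plain SVG polyline element from an already-converted plane point set, which is effectively what such a map overlay renders; the container, size, color, and coordinate values are assumptions:
function drawSvgPolyline(container, planePoints) {
    var ns = 'http://www.w3.org/2000/svg';
    var svg = document.createElementNS(ns, 'svg');                   // host SVG element
    svg.setAttribute('width', 300);
    svg.setAttribute('height', 200);
    var line = document.createElementNS(ns, 'polyline');             // the SVG line itself
    line.setAttribute('points', planePoints.map(function (p) { return p.x + ',' + p.y; }).join(' '));
    line.setAttribute('fill', 'none');
    line.setAttribute('stroke', '#3388ff');
    line.setAttribute('stroke-width', 3);
    svg.appendChild(line);
    container.appendChild(svg);
    return line;
}
// Example: a plane point set corresponding to points A, B, C and D in FIG. 1 (illustrative values).
drawSvgPolyline(document.body, [{x: 10, y: 10}, {x: 100, y: 40}, {x: 160, y: 120}, {x: 240, y: 130}]);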
Step 202: sequentially acquire the plane coordinates of the start point, each inflection point, and the end point of the SVG line to form a plane point set.
Specifically, each geographic coordinate included in the above geographic point set may be converted in sequence into a plane coordinate to constitute the plane point set. Plane coordinates, also called plane rectangular coordinates, represent a point by an (x, y) value. In this embodiment the longitude and latitude are converted into x and y values; the underlying principle is to project the longitude and latitude onto plane rectangular coordinates x and y in a plane rectangular coordinate system according to some projection. The conversion may be performed automatically by a related conversion tool, or calculated automatically by a related algorithm, which is not specifically limited here.
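As one example of such a projection (an assumption; the embodiment does not prescribe which projection or scale is used), a spherical Web Mercator conversion from longitude and latitude to plane coordinates can be sketched as follows:
function lngLatToPlane(lng, lat, pixelsPerMetre) {
    var R = 6378137;                                                 // Earth radius in metres
    var x = R * lng * Math.PI / 180;
    var y = R * Math.log(Math.tan(Math.PI / 4 + (lat * Math.PI / 180) / 2));
    // pixelsPerMetre maps projected metres to screen pixels; its value depends on the map zoom level.
    return { x: x * pixelsPerMetre, y: -y * pixelsPerMetre };        // flip y because screen y grows downwards
}
// Example: convert one geographic point (illustrative coordinates).
var p = lngLatToPlane(113.93, 22.54, 0.001);
console.log(p.x.toFixed(1), p.y.toFixed(1));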
Step 203: add a transition attribute to the custom object and set a transition time, so that the custom object moves continuously between adjacent points of the SVG line according to the plane point set and the transition time, forming an animation effect.
The custom object may refer to a display mark of a moving object on the SVG line, for example, an arrow or other indication icon with a direction, and the like, which is not specifically limited herein.
Before this step is executed, the custom object may be set to inherit the coordinate switching display function of the native map object, so that the custom object can be switched for display at different plane coordinates. The custom object is then set at the start point of the SVG line for display (i.e., placed at the coordinates of the start point of the SVG line) according to the plane point set, the inflection point adjacent to the start point (i.e., the first inflection point) is set as the transition end point, and then a transition attribute is added to the custom object and a transition time is set.
The transition attribute is a composite attribute that can realize animated interaction effects. It comprises four sub-properties that work together to produce a complete transition effect: the property to which the transition applies (transition-property), the duration of the transition (transition-duration), the rate curve of the transition over time (transition-timing-function), and the delay before the transition starts (transition-delay). These four sub-properties are described separately below:
transition-property specifies which property change triggers the transition effect. Its main values are: none (no property transitions), all (all properties transition), and ident (the name of a specific property). When the value is none, the transition stops executing immediately; when the value is all, the transition effect is executed whenever any property value of the element changes; ident designates a particular property of the element.
transition-duration specifies the duration of the element's transition. Its value <time> is a number in seconds (s) or milliseconds (ms), and it can be applied to all elements.
transition-timing-function controls how the rate of change of the property value varies over time. It has six possible values: ease (gradually slower; the default), linear (uniform speed), ease-in (accelerating), ease-out (decelerating), ease-in-out (accelerating then decelerating), and cubic-bezier (a custom timing curve).
transition-delay specifies when the transition starts to execute, i.e., how long after the element's property value changes the transition effect begins. Its value <time> is a number in seconds (s) or milliseconds (ms); it is used much like transition-duration and can also be applied to all elements.
Of these four sub-properties, only <transition-duration> is required and cannot be 0. Both <transition-duration> and <transition-delay> are time values: when two time values appear, the first is <transition-duration> and the second is <transition-delay>; when only one time value appears, it is <transition-duration> and <transition-delay> defaults to 0. Specifically, in this embodiment, transition-duration is the value that is set (the transition time).
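For illustration, the sketch below sets these sub-properties on the custom object's DOM element from JavaScript; the element and the 2-second value are assumptions rather than values taken from the embodiment:
var obj = document.createElement('div');               // stand-in for the custom object's DOM element
obj.style.transitionProperty = 'left, top';            // transition-property: the properties that will animate
obj.style.transitionDuration = '2s';                   // transition-duration: the transition time (must not be 0)
obj.style.transitionTimingFunction = 'linear';         // uniform speed between adjacent points
obj.style.transitionDelay = '0s';                      // transition-delay: start immediately
// Equivalent shorthand: the first time value is the duration, the second (if present) is the delay.
obj.style.transition = 'left 2s linear 0s, top 2s linear 0s';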
After the transition attribute is added to the custom object and the transition time is set, the custom object moves continuously between the start point and the first inflection point of the SVG line according to the transition time.
Thereafter, the transition end point can be reset once every transition time according to the plane point set, until the end point of the SVG line has been set as the transition end point. For example, in the scenario shown in FIG. 1, point B is set as the transition end point the first time, point C the second time, and point D the last time; the custom object then moves from point A to point D according to the transition time (i.e., from A to B during the first transition time T, from B to C during the second transition time T, and from C to D during the third transition time T), thereby forming the animation effect.
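This per-segment stepping can be sketched in the browser as follows; the plane point values and the 2-second transition time T are illustrative assumptions, and the marker element stands in for the custom object:
var points = [{x: 0, y: 0}, {x: 100, y: 40}, {x: 160, y: 120}, {x: 240, y: 130}]; // A, B, C, D
var T = 2000;                                               // transition time per segment, in ms
var marker = document.createElement('div');
marker.style.cssText = 'position:absolute;left:0px;top:0px;width:10px;height:10px;background:red;transition:left 2s linear,top 2s linear;';
document.body.appendChild(marker);                          // the custom object starts at point A
var i = 1;                                                  // the first transition end point is the first inflection point
function step() {
    if (i >= points.length) return;                         // the end point has already been reached
    marker.style.left = points[i].x + 'px';                 // reset the transition end point
    marker.style.top = points[i].y + 'px';
    i++;
    setTimeout(step, T);                                    // reset again after one transition time
}
setTimeout(step, 100);                                      // begin once the start position has been rendered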
It should be noted that in this embodiment the adjacent points of the SVG line are not arbitrary pairs of neighboring points on the line; they are adjacent points obtained from the plane point set, and the pairs are of three types: start point and inflection point, inflection point and inflection point, and inflection point and end point. That is, the custom object of this embodiment realizes the animation effect by switching its motion between points specified by the root data source (plane coordinate points converted from the specified geographic coordinate points), rather than by moving directly through every point on the SVG line.
According to this embodiment of the invention, an animation effect can be realized at the H5 end and the custom object can move along the custom line: by adding the transition attribute to the custom object and setting the transition time, the custom object moves continuously between adjacent points of the SVG line according to the plane point set and the transition time.
Example two
As shown in FIG. 3, the method described in the first embodiment is described in further detail by way of example, and includes:
Step 301: plan a custom line according to a root data source.
That is, an animation line is planned directly from the data provided by the root data source.
Step 302: acquire the geographic coordinates of the start point, each inflection point, and the end point of the custom line to form a geographic point set.
The geographic coordinates are spherical coordinates that represent the positions of ground points by longitude and latitude; the geographic point set comprises the geographic coordinates of the start point, each inflection point, and the end point of the custom line.
Step 303, drawing an SVG line on a map according to the geographical point set;
In a specific implementation, the map may provide a polyline (Polyline) function for drawing a polyline formed by connecting a series of specified points. The function is declared as follows:
BOOL Polyline(LPPOINT lpPoints, int nCount);
Here the parameter lpPoints points to an array of POINT structures or CPoint objects in which the coordinates of the polyline's connection points are stored in order, and nCount is the number of connection points, which must be greater than 1. If the line is drawn successfully the function returns TRUE; otherwise it returns FALSE.
Specifically, in this embodiment, the SVG line may be drawn on the map according to the geographic point set by using the Polyline function provided by the map. The SVG line is the map line that is finally drawn and displayed on the map according to the root data source; in other words, the SVG line is the representation of the custom line on the map.
Step 304, sequentially converting each geographic coordinate in the geographic point set into a plane coordinate to form a plane point set;
Specifically, each geographic coordinate included in the above geographic point set may be converted in sequence into a plane coordinate to constitute the plane point set. Plane coordinates, also called plane rectangular coordinates, represent a point by an (x, y) value. In this embodiment the longitude and latitude are converted into x and y values; the underlying principle is to project the longitude and latitude onto plane rectangular coordinates x and y in a plane rectangular coordinate system according to some projection. The conversion may be performed automatically by a related conversion tool, or calculated automatically by a related algorithm, which is not specifically limited here.
Step 305: set the custom object to inherit the coordinate switching display function of the map's native object, so that the custom object can be switched for display at different plane coordinates.
the custom object may be a display mark of a moving object on the SVG line, for example, an arrow or other indication icon with a direction, and is not limited specifically here.
Step 306: set the custom object at the start point of the SVG line for display according to the plane point set, and set the inflection point adjacent to the start point as the transition end point.
That is, the custom object is placed at the coordinates of the start point of the SVG line, and the first inflection point of the SVG line is set as the transition end point.
The main implementation code for converting the geographic coordinates into the planar coordinates and setting the custom object at the designated position may be as follows:
myOverlay.prototype.anima = function (pnt) {
    var pixel = this.getProjection().fromLatLngToDivPixel(pnt); // convert the geographic coordinate into a plane (pixel) coordinate
    this.dom.style.left = pixel.getX() + 'px';                  // place the custom object at the specified position
    this.dom.style.top = pixel.getY() + 'px';
};
Step 307: add a transition attribute to the custom object and set a transition time, so that the custom object moves continuously between adjacent points of the SVG line according to the transition time, forming an animation effect.
The transition attribute is a composite attribute that can realize animated interaction effects. It comprises four sub-properties that work together to produce a complete transition effect: the property to which the transition applies (transition-property), the duration of the transition (transition-duration), the rate curve of the transition over time (transition-timing-function), and the delay before the transition starts (transition-delay).
Of these four sub-properties, only <transition-duration> is required and cannot be 0. Both <transition-duration> and <transition-delay> are time values: when two time values appear, the first is <transition-duration> and the second is <transition-delay>; when only one time value appears, it is <transition-duration> and <transition-delay> defaults to 0. Specifically, in this embodiment, transition-duration is the value that is set (the transition time).
After the transition attribute is added to the custom object and the transition time is set, the custom object moves continuously between the start point and the first inflection point of the SVG line according to the transition time.
It should be noted that in this embodiment the adjacent points of the SVG line are not arbitrary pairs of neighboring points on the line; they are adjacent points obtained from the plane point set, and the pairs are of three types: start point and inflection point, inflection point and inflection point, and inflection point and end point. That is, the custom object of this embodiment realizes the animation effect by switching its motion between points specified by the root data source (plane coordinate points converted from the specified geographic coordinate points), rather than by moving directly through every point on the SVG line.
Step 308: reset the transition end point once every transition time according to the plane point set, until the end point of the SVG line is set as the transition end point.
The transition end points comprise the inflection points and the end point of the SVG line; that is, the remaining inflection points and the end point of the SVG line are set in turn as the transition end point according to the plane point set. The main implementation of this recursion may be sketched as follows:
var tag = 0;
function go(s, cb, T) {              // s: number of segments; cb: called at each transition end point; T: transition time in ms
    var time = 0;                    // accumulated delay before each segment starts
    for (var j = 0; j < s; j++) {    // step through the incoming point set
        ~function (z, t) {           // capture the segment index and its start time
            setTimeout(function () {
                tag = z + 1;         // index of the next transition end point
                cb(tag);             // e.g. reset the transition end point / call anima() with the next point
            }, t);
        }(j, time);
        time += T;                   // the next segment starts one transition time later
    }
}
With the method of this embodiment, animation effects can be realized without depending on related interfaces, by relying instead on the root data source: the SVG line is drawn in real time directly from the root data source, the plane coordinates of the start point, inflection points, and end point of the drawn SVG line are taken to form a plane point set, a transition attribute is added to the custom object and a transition time is set, and the custom object moves continuously between adjacent points of the SVG line according to the plane point set and the transition time, thereby forming an animation effect. The method of this embodiment can realize animation effects at the H5 end and can make the custom object move along the custom line.
EXAMPLE III
In order to better implement the above method, an embodiment of the present invention further provides an animation effect implementing apparatus. As shown in FIG. 4, the apparatus of this embodiment includes a drawing unit 401, a point set forming unit 402, and an animation realization unit 403, as follows:
(1) a drawing unit 401;
and a drawing unit 401, configured to draw the SVG line according to the root data source.
SVG is an XML-based (Extensible Markup Language) graphics format for describing two-dimensional vector graphics; it was developed by the World Wide Web Consortium and is an open standard. SVG strictly follows XML syntax and describes image content with a descriptive, text-format language, so it is a vector graphics format independent of image resolution; because vector graphics describe objects with points and lines, the files are small while still providing high-definition pictures suitable for direct printing or output. The drawing unit 401 in this embodiment therefore draws the SVG line according to the root data source. The drawing unit 401 may include a planning subunit, an acquiring subunit, and a drawing subunit, as follows:
the planning subunit is used for planning the custom line according to the root data source;
namely, the planning subunit plans an animation line directly according to the data provided by the root data source.
The acquisition subunit is used for acquiring the geographic coordinates of the starting point, each inflection point and the ending point of the custom line to form a geographic point set;
and the geographic coordinates are spherical coordinates representing the positions of the ground points by longitude and latitude, and the geographic point set comprises the geographic coordinates of the starting point, each inflection point and the ending point of the custom line.
And the drawing subunit draws the SVG line on a map according to the geographical point set.
In a specific implementation, the map may provide a Polyline (Polyline) function, and the Polyline function is used to draw a Polyline formed by connecting a series of specified points, and the function is stated as follows:
BOOL Polyline(LPPOINT IpPoints,int nCount);
wherein the parameter IpPoints POINTs to POINT structure or CPoint set, the set stores the coordinates of the polyline connection POINTs in sequence, nCount is the number of connection POINTs, the parameter must be greater than 1, if the drawing line is successful, the function returns TRUE, otherwise returns FALSE.
Specifically, in this embodiment, the drawing subunit may draw the SVG line on the map according to the geographical point set by using a Polyline function provided by the map. The SVG line is a map line which is finally drawn on a map according to the root data source and is displayed, and the expression of the custom line on the map is the SVG line.
(2) A point set forming unit 402;
and a point set forming unit 402, configured to sequentially acquire the plane coordinates of the start point, each inflection point, and the end point of the SVG line to form a plane point set.
Specifically, the point set forming unit 402 may convert each geographic coordinate included in the above geographic point set, in sequence, into a plane coordinate to constitute the plane point set. Plane coordinates, also called plane rectangular coordinates, represent a point by an (x, y) value. In this embodiment the longitude and latitude are converted into x and y values; the underlying principle is to project the longitude and latitude onto plane rectangular coordinates x and y in a plane rectangular coordinate system according to some projection. The conversion may be performed automatically by a related conversion tool, or calculated automatically by a related algorithm, which is not specifically limited here.
(3) An animation realization unit 403;
and an animation implementation unit 403, configured to add a transition attribute to the custom object, and set transition time, so that the custom object continuously moves between adjacent points of the SVG line according to the plane point set and according to the transition time, so as to form an animation effect.
The custom object may be a display mark of a moving object on the SVG line, for example, an arrow or other indication icon with a direction, and is not limited specifically here.
The apparatus of this embodiment may further include a first setting unit and a second setting unit. The first setting unit may set the custom object to inherit the coordinate switching display function of the map's native object, so that the custom object can be switched for display at different plane coordinates. The second setting unit then sets the custom object at the start point of the SVG line for display (i.e., places the custom object at the coordinates of the start point of the SVG line) according to the plane point set and sets the inflection point adjacent to the start point (i.e., the first inflection point) as the transition end point, after which the animation realization unit adds the transition attribute to the custom object and sets the transition time.
The transition attribute is a composite attribute that can realize animated interaction effects. It comprises four sub-properties that work together to produce a complete transition effect: the property to which the transition applies (transition-property), the duration of the transition (transition-duration), the rate curve of the transition over time (transition-timing-function), and the delay before the transition starts (transition-delay). These four sub-properties are described separately below:
transition-property specifies which property change triggers the transition effect. Its main values are: none (no property transitions), all (all properties transition), and ident (the name of a specific property). When the value is none, the transition stops executing immediately; when the value is all, the transition effect is executed whenever any property value of the element changes; ident designates a particular property of the element.
transition-duration specifies the duration of the element's transition. Its value <time> is a number in seconds (s) or milliseconds (ms), and it can be applied to all elements.
transition-timing-function controls how the rate of change of the property value varies over time. It has six possible values: ease (gradually slower; the default), linear (uniform speed), ease-in (accelerating), ease-out (decelerating), ease-in-out (accelerating then decelerating), and cubic-bezier (a custom timing curve).
transition-delay specifies when the transition starts to execute, i.e., how long after the element's property value changes the transition effect begins. Its value <time> is a number in seconds (s) or milliseconds (ms); it is used much like transition-duration and can also be applied to all elements.
Of these four sub-properties, only <transition-duration> is required and cannot be 0. Both <transition-duration> and <transition-delay> are time values: when two time values appear, the first is <transition-duration> and the second is <transition-delay>; when only one time value appears, it is <transition-duration> and <transition-delay> defaults to 0. Specifically, in this embodiment, transition-duration is the value that is set (the transition time).
After the animation realization unit adds the transition attribute to the custom object and sets the transition time, the custom object moves continuously between the start point and the first inflection point of the SVG line according to the transition time.
Thereafter, an updating unit may reset the transition end point once every transition time according to the plane point set, until the end point of the SVG line has been set as the transition end point. For example, in the scenario shown in FIG. 1, point B is set as the transition end point the first time, point C the second time, and point D the last time; the custom object then moves from point A to point D according to the transition time (i.e., from A to B during the first transition time T, from B to C during the second transition time T, and from C to D during the third transition time T), thereby forming the animation effect.
It should be noted that in this embodiment the adjacent points of the SVG line are not arbitrary pairs of neighboring points on the line; they are adjacent points obtained from the plane point set, and the pairs are of three types: start point and inflection point, inflection point and inflection point, and inflection point and end point. That is, the custom object of this embodiment realizes the animation effect by switching its motion between points specified by the root data source (plane coordinate points converted from the specified geographic coordinate points), rather than by moving directly through every point on the SVG line.
It should be noted that, when the animation effect implementation apparatus provided in the foregoing embodiment implements an animation effect, only the division of the above functional modules is used as an example, in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules, so as to complete all or part of the above described functions. In addition, the animation effect implementation device and the animation effect implementation method provided by the above embodiments belong to the same concept, and specific implementation processes thereof are detailed in the method embodiments and are not described herein again.
In this embodiment, animation effects can be realized without depending on related interfaces, by relying instead on the root data source: the drawing unit draws the SVG line in real time directly from the root data source, the point set forming unit takes the plane coordinates of the start point, inflection points, and end point of the drawn SVG line to form a plane point set, and the animation realization unit adds a transition attribute to the custom object and sets a transition time, so that the custom object moves continuously between adjacent points of the SVG line according to the plane point set and the transition time, thereby forming an animation effect.
Example four
An embodiment of the present invention further provides an animation effect implementing device. FIG. 5 shows a schematic structural diagram of the device according to an embodiment of the present invention. Specifically:
the apparatus may include Radio Frequency (RF) circuitry 501, memory 502 including one or more computer-readable storage media, input unit 503, display unit 504, sensor 505, audio circuitry 506, Wireless Fidelity (WiFi) module 507, processor 508 including one or more processing cores, and power supply 509. Those skilled in the art will appreciate that the configuration of the device shown in fig. 5 is not intended to be limiting of the device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components. Wherein:
the RF circuit 501 may be used for receiving and transmitting signals during information transmission and reception or during a call, and in particular, for receiving downlink information of a base station and then sending the received downlink information to the one or more processors 508 for processing; in addition, data relating to uplink is transmitted to the base station. In general, RF circuit 501 includes, but is not limited to, an antenna, at least one Amplifier, a tuner, one or more oscillators, a Subscriber Identity Module (SIM) card, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the RF circuitry 501 may also communicate with networks and other devices via wireless communications. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Message Service (SMS), and the like.
The memory 502 may be used to store software programs and modules, and the processor 508 executes various functional applications and data processing by operating the software programs and modules stored in the memory 502. The memory 502 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the device, and the like. Further, the memory 502 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device. Accordingly, the memory 502 may also include a memory controller to provide the processor 508 and the input unit 503 access to the memory 502.
The input unit 503 may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. In particular, in one particular embodiment, the input unit 503 may include a touch-sensitive surface as well as other input devices. The touch-sensitive surface, also referred to as a touch display screen or a touch pad, may collect touch operations by a user (e.g., operations by a user on or near the touch-sensitive surface using a finger, a stylus, or any other suitable object or attachment) thereon or nearby, and drive the corresponding connection device according to a predetermined program. Alternatively, the touch sensitive surface may comprise two parts, a touch detection means and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 508, and can receive and execute commands sent by the processor 508. In addition, touch sensitive surfaces may be implemented using various types of resistive, capacitive, infrared, and surface acoustic waves. The input unit 503 may include other input devices in addition to the touch-sensitive surface. In particular, other input devices may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 504 may be used to display information input by or provided to the user and various graphical user interfaces of the terminal, which may be made up of graphics, text, icons, video, and any combination thereof. The Display unit 504 may include a Display panel, and optionally, the Display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like. Further, the touch-sensitive surface may overlay the display panel, and when a touch operation is detected on or near the touch-sensitive surface, the touch operation is transmitted to the processor 508 to determine the type of touch event, and then the processor 508 provides a corresponding visual output on the display panel according to the type of touch event. Although in FIG. 5 the touch-sensitive surface and the display panel are two separate components to implement input and output functions, in some embodiments the touch-sensitive surface may be integrated with the display panel to implement input and output functions.
The device may also include at least one sensor 505, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor may include an ambient light sensor that may adjust the brightness of the display panel according to the brightness of ambient light, and a proximity sensor that may turn off the display panel and/or the backlight when the terminal is moved to the ear. As one of the motion sensors, the gravity acceleration sensor can detect the magnitude of acceleration in each direction (generally, three axes), detect the magnitude and direction of gravity when the device is stationary, and can be used for applications of recognizing the posture of the device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured in the device, detailed description is omitted here.
Audio circuitry 506, a speaker, and a microphone may provide an audio interface between the user and the terminal. The audio circuit 506 may transmit the electrical signal converted from the received audio data to a speaker, and convert the electrical signal into a sound signal for output; on the other hand, the microphone converts the collected sound signal into an electric signal, which is received by the audio circuit 506 and converted into audio data, which is then processed by the audio data output processor 508, and then transmitted to, for example, another terminal via the RF circuit 501, or the audio data is output to the memory 502 for further processing. The audio circuit 506 may also include an earbud jack to provide communication of peripheral headphones with the device.
WiFi belongs to short-distance wireless transmission technology, and the device can help a user to receive and send e-mails, browse webpages, access streaming media and the like through the WiFi module 507, and provides wireless broadband internet access for the user. Although fig. 5 shows the WiFi module 507, it is understood that it does not belong to the essential constitution of the device, and may be omitted entirely as needed within the scope not changing the essence of the invention.
The processor 508 is a control center of the apparatus, connects various parts of the entire apparatus using various interfaces and lines, performs various functions of the apparatus and processes data by operating or executing software programs and/or modules stored in the memory 502, and calling data stored in the memory 502, thereby performing overall monitoring of the apparatus. Optionally, processor 508 may include one or more processing cores; preferably, the processor 508 may integrate an application processor, which primarily handles operating systems, user interfaces, application programs, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 508.
The device also includes a power supply 509 (e.g., a battery) for powering the various components, which may preferably be logically connected to the processor 508 via a power management system to manage charging, discharging, and power consumption management functions via the power management system. The power supply 509 may also include any component such as one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
Although not shown, the device may further include a camera, a bluetooth module, etc., which will not be described herein. Specifically, in this embodiment, the processor 508 in the apparatus loads the executable file corresponding to the process of one or more application programs into the memory 502 according to the following instructions, and the processor 508 runs the application programs stored in the memory 502, thereby implementing various functions:
drawing an SVG line according to the root data source;
sequentially acquiring the plane coordinates of the start point, each inflection point, and the end point of the SVG line to form a plane point set;
adding a transition attribute to a custom object and setting a transition time, so that the custom object moves continuously between adjacent points of the SVG line according to the plane point set and the transition time, thereby forming an animation effect.
Specifically, the processor 508 may draw the SVG line as follows:
planning a custom line according to the root data source;
acquiring the geographic coordinates of the start point, each inflection point, and the end point of the custom line to form a geographic point set;
and drawing the SVG line on a map according to the geographic point set.
Specifically, the processor 508 may construct the plane point set as follows:
sequentially converting each geographic coordinate included in the geographic point set into a plane coordinate to form the plane point set.
Further, before adding the transition attribute to the custom object and setting the transition time, the processor 508 is further configured to
set the custom object to inherit the coordinate switching display function of the map's native object, so that the custom object can be switched for display at different plane coordinates.
Further, after setting the custom object to inherit the coordinate switching display function of the map's native object, the processor 508 is further configured to
set the custom object at the start point of the SVG line for display according to the plane point set, and set the inflection point adjacent to the start point as the transition end point.
Further, after adding the transition attribute to the custom object and setting the transition time, the processor 508 is further configured to
reset the transition end point once every transition time according to the plane point set, until the end point of the SVG line is set as the transition end point.
The device of this embodiment can realize animation effects without relying on related interfaces, relying instead on the root data source: it draws the SVG line in real time directly from the root data source, takes the plane coordinates of the start point, inflection points, and end point of the drawn SVG line to form a plane point set, and, by adding a transition attribute to the custom object and setting a transition time, makes the custom object move continuously between adjacent points of the SVG line according to the plane point set and the transition time, thereby forming an animation effect. The device of this embodiment can realize animation effects at the H5 end and can make the custom object move along the custom line.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form. The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer (which may be a personal computer, an apparatus, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (8)

1. An animation effect implementation method, comprising:
drawing a Scalable Vector Graphics (SVG) line according to a root data source;
sequentially acquiring the plane coordinates of the start point, each inflection point, and the end point of the SVG line to form a plane point set;
setting, according to the plane point set, a custom object at the start point of the SVG line for display, and setting the inflection point adjacent to the start point as a transition end point, wherein the custom object can be switched for display at different plane coordinates;
setting the custom object to inherit the coordinate switching display function of the map's native object, so that the custom object can be switched for display at different plane coordinates;
adding a transition attribute to the custom object and setting a transition time, so that the custom object moves continuously between adjacent points of the SVG line according to the plane point set and the transition time to form an animation effect; and
resetting the transition end point once every transition time according to the plane point set, until the end point of the SVG line is set as the transition end point.
2. The method of claim 1, wherein the drawing an SVG line according to a root data source comprises:
planning a custom line according to the root data source;
acquiring the geographic coordinates of the start point, each inflection point, and the end point of the custom line to form a geographic point set; and
drawing the SVG line on a map according to the geographic point set.
3. The method of claim 2, wherein the sequentially acquiring the plane coordinates of the start point, each inflection point, and the end point of the SVG line to form a plane point set comprises:
converting each geographic coordinate included in the geographic point set, in sequence, into a plane coordinate to form the plane point set.
4. An animation effect realization apparatus, comprising:
a drawing unit, configured to draw a Scalable Vector Graphics (SVG) line according to a root data source;
a point set forming unit, configured to sequentially acquire the plane coordinates of the start point, each inflection point, and the end point of the SVG line to form a plane point set;
a second setting unit, configured to set, according to the plane point set, a custom object at the start point of the SVG line for display, and to set the inflection point adjacent to the start point as a transition end point, wherein the custom object can be switched for display at different plane coordinates;
a first setting unit, configured to set, before the animation realization unit adds the transition attribute to the custom object and sets the transition time, the custom object to inherit the coordinate switching display function of the map's native object, so that the custom object can be switched for display at different plane coordinates;
an animation realization unit, configured to add a transition attribute to the custom object and set a transition time, so that the custom object moves continuously between adjacent points of the SVG line according to the plane point set and the transition time to form an animation effect; and
an updating unit, configured to reset, after the animation realization unit adds the transition attribute to the custom object and sets the transition time, the transition end point once every transition time according to the plane point set, until the end point of the SVG line is set as the transition end point.
5. The apparatus of claim 4, wherein the rendering unit comprises:
a planning subunit for planning a custom line according to the root data source;
an acquisition subunit for acquiring geographic coordinates at the starting point, each inflection point and the ending point of the custom line to form a geographic point set;
and a drawing subunit for drawing the SVG line on a map according to the geographic point set.
6. The apparatus according to claim 5, wherein the point set forming unit is specifically configured to:
sequentially convert each geographic coordinate included in the geographic point set into a plane coordinate to form the plane point set.
7. An animation effect realization device, comprising a processor and a memory storing a software program, wherein the processor runs the software program to perform the steps of the method according to any one of claims 1 to 3.
8. A storage medium, comprising instructions which cause a computer device to perform the steps of the method according to any one of claims 1 to 3.
CN201710015061.9A 2017-01-09 2017-01-09 Animation effect realization method and device Active CN106780684B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710015061.9A CN106780684B (en) 2017-01-09 2017-01-09 Animation effect realization method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710015061.9A CN106780684B (en) 2017-01-09 2017-01-09 Animation effect realization method and device

Publications (2)

Publication Number Publication Date
CN106780684A CN106780684A (en) 2017-05-31
CN106780684B true CN106780684B (en) 2021-08-31

Family

ID=58948661

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710015061.9A Active CN106780684B (en) 2017-01-09 2017-01-09 Animation effect realization method and device

Country Status (1)

Country Link
CN (1) CN106780684B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107945253A (en) * 2017-11-21 2018-04-20 腾讯数码(天津)有限公司 A kind of animation effect implementation method, device and storage device
CN109829956A (en) * 2017-11-23 2019-05-31 腾讯科技(深圳)有限公司 Data display method, device and electronic equipment
CN108038890A (en) * 2017-12-06 2018-05-15 广州视源电子科技股份有限公司 Polar plot demenstration method, device, equipment and computer-readable storage medium
CN110187942B (en) * 2018-02-23 2024-04-16 北京京东尚科信息技术有限公司 Method, device, system and medium for realizing connecting point line animation
CN109669751A (en) * 2018-12-14 2019-04-23 Oppo广东移动通信有限公司 A kind of method for drafting of input frame, device, terminal and computer storage medium
CN110555123B (en) * 2019-07-23 2022-09-02 深圳赛安特技术服务有限公司 Method and device for processing curvilinear motion of interface elements and computer equipment
CN111063010A (en) * 2019-12-12 2020-04-24 北京明略软件系统有限公司 Map motion track animation realization method and device, electronic equipment and storage medium
CN111311715B (en) * 2020-02-14 2023-07-21 北京三快在线科技有限公司 Method and device for adding animation effect in webpage

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7587274B2 (en) * 2006-03-14 2009-09-08 Sap Ag System and method for navigating a facility
CN103226604B (en) * 2013-04-27 2017-02-15 上海来信信息科技发展有限公司 SVG-based Web GIS system and relevant energy consumption monitoring system
CN106021519A (en) * 2016-05-24 2016-10-12 腾讯科技(深圳)有限公司 Dynamic display method and device for pictures

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101425225A (en) * 2007-10-30 2009-05-06 厦门雅迅网络股份有限公司 Method for managing automobile line by using GPS technology
CN104616520A (en) * 2014-05-09 2015-05-13 腾讯科技(深圳)有限公司 Method and device for dynamically recording navigation trail

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ArcGIS API for JavaScript realizes the drawing of graphics such as point, multipoint, line, polyline, circle; ProgrammerSought; 《ProgrammerSought》; 20130808; full text *
Baidu Map API Part 3: real-time dynamic trajectory display; liusaint1992; 《https://blog.csdn.net/liusaint1992/article/details/50072929》; 20151127; full text *
Baidu Map API Part 4: implementing dynamic trajectory playback; liusaint1992; 《https://blog.csdn.net/liusaint1992/article/details/50082781》; 20151128; full text *
Drawing a trajectory through multiple points on Google Maps; CSDN; 《https://bbs.csdn.net/topics/390777964》; 20140507; full text *

Also Published As

Publication number Publication date
CN106780684A (en) 2017-05-31

Similar Documents

Publication Publication Date Title
CN106780684B (en) Animation effect realization method and device
CN107741809B (en) Interaction method, terminal, server and system between virtual images
CN110795666B (en) Webpage generation method, device, terminal and storage medium
CN105867751B (en) Operation information processing method and device
US20150082231A1 (en) Method and terminal for displaying desktop
CN107193518B (en) Information display method and terminal equipment
US9760998B2 (en) Video processing method and apparatus
CN111178012A (en) Form rendering method, device and equipment and storage medium
CN107247691B (en) Text information display method and device, mobile terminal and storage medium
CN107666406B (en) Intelligent card display method and device
US11935564B2 (en) Video editing method and intelligent mobile terminal
CN110796725A (en) Data rendering method, device, terminal and storage medium
CN113313804B (en) Image rendering method and device, electronic equipment and storage medium
CN110673770A (en) Message display method and terminal equipment
CN111580815A (en) Editing method of page elements and related equipment
CN109614173B (en) Skin changing method and device
CN109117037B (en) Image processing method and terminal equipment
CN111200648B (en) Service calling method, device, terminal equipment and storage medium
CN106204588B (en) Image processing method and device
CN111142759B (en) Information sending method and electronic equipment
US20160330587A1 (en) Information obtaining method, server, terminal, and system
CN105320532B (en) Method, device and terminal for displaying interactive interface
CN110717486B (en) Text detection method and device, electronic equipment and storage medium
CN111128252A (en) Data processing method and related equipment
CN114510417A (en) Image rendering effect testing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant