CN109343770B - Interactive feedback method, apparatus and recording medium - Google Patents


Info

Publication number: CN109343770B
Application number: CN201811133444.7A
Authority: CN (China)
Prior art keywords: click operation, effect, music, beat, animation
Legal status: Active (the listed status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN109343770A
Inventors: 罗飞虎, 唐嘉莉
Current and original assignee: Tencent Technology (Shenzhen) Co., Ltd.
Legal events: application filed by Tencent Technology (Shenzhen) Co., Ltd.; priority to CN201811133444.7A; publication of application CN109343770A, then of granted patent CN109343770B; currently active.

Classifications

    • G06F3/0484 (G Physics / G06 Computing / G06F Electric digital data processing / G06F3/048 GUI interaction techniques) — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0488 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G11B19/025 (G11 Information storage / G11B Storage based on relative movement between record carrier and transducer / G11B19/02 Control of operating function) — 'Virtual' control panels, e.g. Graphical User Interface [GUI]
    • G11B27/10 — Indexing; Addressing; Timing or synchronising; Measuring tape travel

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An interactive feedback method, apparatus, and recording medium are disclosed. The interactive feedback method comprises the following steps: acquiring the repeated beat time length of a piece of music; dividing the music into a plurality of continuous repeated beat time periods by taking the repeated beat time length as a unit; and when a first click operation is detected in the playing process of the music, determining a first repeated beat time period in which a time point of the first click operation occurs, and playing a first effect corresponding to the first click operation in a second repeated beat time period after the first repeated beat time period.

Description

Interactive feedback method, apparatus and recording medium
Technical Field
The present invention relates to the field of interactive control, and more particularly, to an interactive feedback method, apparatus, and recording medium.
Background
Existing music animations are played back sequentially in time. During playback, only simple control operations can be performed, such as pausing, resuming, or jumping to a given playback position. However, these are all fixed control operations that act on the playback progress of the music animation. In other words, the user can control only the playback progress, by pressing preset icons, while the played content of the music animation itself is fixed and does not change in response to an external input.
Disclosure of Invention
In view of the above circumstances, it is desirable to provide an interactive feedback method capable of changing its play content in response to an external input, an apparatus applying the method, and a computer-readable recording medium storing a computer program of the method.
According to an aspect of the present invention, there is provided an interactive feedback method, including: acquiring the repeated beat time length of a piece of music; dividing the music into a plurality of continuous repeated beat time periods by taking the repeated beat time length as a unit; and when a first click operation is detected in the playing process of the music, determining a first repeated beat time period in which a time point of the first click operation occurs, and playing a first effect corresponding to the first click operation in a second repeated beat time period after the first repeated beat time period.
In addition, in the interactive feedback method according to the present invention, after the first click operation is detected, a step of determining the first effect is further included, and the step of determining the first effect includes: selecting one effect group among a plurality of effect groups based on a time point at which the first click operation occurs, and selecting one effect among the effect groups as the first effect.
In addition, in the interactive feedback method according to the present invention, the selecting one effect group among a plurality of effect groups based on a time point at which the first click operation occurs, and selecting one effect among the effect groups as the first effect further includes: judging whether a second clicking operation is detected before the first clicking operation is detected; if a second click operation is detected before the first click operation is detected, judging whether the time interval between the first click operation and the second click operation is smaller than a preset threshold value; if the time interval is smaller than a preset threshold value, selecting an effect group where a second effect played in response to a second click operation is located, and selecting one effect as the first effect; if a second click operation has not been detected before the first click operation is detected, or if the time interval is greater than a predetermined threshold, selecting a group of effects based on a predetermined rule and selecting one of the effects as the first effect.
In addition, in the interactive feedback method according to the present invention, the first effect includes a first audio corresponding to the first click operation.
In addition, in the interactive feedback method according to the present invention, the first effect includes a first animation corresponding to the first click operation.
In addition, the interactive feedback method according to the present invention may further include: when a first click operation is not detected in the playing process of the music, a scene corresponding to the music is played, wherein the scene comprises a beat animation corresponding to a repeated beat period of the music.
In addition, in the interactive feedback method according to the present invention, playing the first effect corresponding to the first click operation includes: replacing the currently played beat animation with the first animation.
In addition, in the interactive feedback method according to the present invention, the scene further includes a background image, and the method further includes: dividing the music into a plurality of continuous combined beat periods, wherein each of the combined beat periods includes a plurality of the repeated beat periods, and transforming the beat animation and/or the background image when the position at which the music is played enters a next combined beat period from a current combined beat period.
According to another aspect of the present invention, there is provided an interactive feedback device, including: the acquisition unit is used for acquiring the repeated beat time length of the piece of music; a dividing unit configured to divide the music into a plurality of continuous repeated tempo periods in units of the repeated tempo duration; a playback unit configured to perform a playback operation; the detection unit is used for detecting click operation; and the control unit is used for determining a first repeated beat time interval in which a time point of the first click operation is positioned when the detection unit detects the first click operation in the playing process of the music, and controlling the playing unit to play a first effect corresponding to the first click operation in a second repeated beat time interval after the first repeated beat time interval.
In addition, in the interactive feedback device according to an embodiment of the present invention, the control unit is further configured to: after the detection unit detects the first click operation, performing processing for determining the first effect, and the processing for determining the first effect includes: selecting one effect group among a plurality of effect groups based on a time point at which the first click operation occurs, and selecting one effect among the effect groups as the first effect.
In addition, in the interactive feedback device according to an embodiment of the present invention, the processing of selecting one effect group among a plurality of effect groups based on a time point at which the first click operation occurs, and selecting one effect of the effect group as the first effect further includes: judging whether a second clicking operation is detected before the first clicking operation is detected; if a second click operation is detected before the first click operation is detected, judging whether the time interval between the first click operation and the second click operation is smaller than a preset threshold value; if the time interval is smaller than a preset threshold value, selecting an effect group where a second effect played in response to a second click operation is located, and selecting one effect as the first effect; if a second click operation has not been detected before the first click operation is detected, or if the time interval is greater than a predetermined threshold, selecting a group of effects based on a predetermined rule and selecting one of the effects as the first effect.
In addition, in the interactive feedback apparatus according to the embodiment of the present invention, the first effect includes a first audio corresponding to the first click operation.
In addition, in the interactive feedback apparatus according to the embodiment of the present invention, the first effect includes a first animation corresponding to the first click operation.
In addition, in the interactive feedback device according to an embodiment of the present invention, when the detection unit does not detect the first click operation during the playing of the music, the control unit further controls the playing unit to play a scene corresponding to the music, wherein the scene includes a beat animation corresponding to a repeated beat period of the music.
In addition, in the interactive feedback device according to an embodiment of the present invention, when playing the first effect corresponding to the first click operation, the playing unit is further configured to: replace the currently played beat animation with the first animation.
In addition, in the interactive feedback device according to an embodiment of the present invention, the scene further includes a background picture, and the dividing unit is further configured to divide the music into a plurality of consecutive combined beat periods, wherein each of the combined beat periods includes a plurality of the repeated beat periods; and the playback unit is further configured to: and when the position of the music playing enters the next combined beat time period from the current combined beat time period, transforming the beat animation and/or the background image.
According to still another aspect of the present invention, there is provided an interactive feedback apparatus including: a processor, and a memory having a computer program stored thereon which, when executed by the processor, performs the steps of: acquiring the repeated beat time length of a piece of music; dividing the music into a plurality of continuous repeated beat time periods by taking the repeated beat time length as a unit; and when a first click operation is detected in the playing process of the music, determining a first repeated beat time period in which a time point of the first click operation occurs, and playing a first effect corresponding to the first click operation in a second repeated beat time period after the first repeated beat time period.
According to still another aspect of the present invention, there is provided a computer-readable recording medium storing a computer program which, when executed by a processor, implements the steps of: acquiring the repeated beat time length of a piece of music; dividing the music into a plurality of continuous repeated beat time periods by taking the repeated beat time length as a unit; and when a first click operation is detected in the playing process of the music, determining a first repeated beat time period in which a time point of the first click operation occurs, and playing a first effect corresponding to the first click operation in a second repeated beat time period after the first repeated beat time period.
With the interactive feedback method, apparatus, and recording medium according to the embodiments of the present invention, it is possible to change playback content as feedback in response to an external input. In addition, the effect corresponding to the external input can be played while automatically fitting the tempo of the original music, and since the time interval of the clicking operation of the external input is arbitrary and is generally different from the original music tempo, the effect of multi-rhythm combined playing is achieved.
Drawings
FIG. 1 is a flow diagram illustrating an interactive feedback method according to an embodiment of the present invention;
fig. 2 is a diagram showing an example of temporal precedence of performing a first click operation and playing a first effect;
fig. 3 is a flowchart showing an example of how to determine a corresponding play effect when a plurality of click operations are detected during the play of music;
FIG. 4 illustrates one example of a group of animations that is played in response to a click operation;
fig. 5 shows the temporal correspondence of the repetitive beat period of music, the beat animation, and the scene;
FIG. 6 is a functional block diagram illustrating an interactive feedback device according to an embodiment of the present invention;
FIG. 7 shows an interactive feedback device as an example of a hardware entity, according to an embodiment of the invention; and
fig. 8 illustrates a schematic diagram of a computer-readable recording medium according to an embodiment of the present invention.
Detailed Description
Various preferred embodiments of the present invention will be described below with reference to the accompanying drawings. The following description with reference to the accompanying drawings is provided to assist in understanding the exemplary embodiments of the invention as defined by the claims and their equivalents. It includes various specific details to assist understanding, but they are to be construed as merely illustrative. Accordingly, those skilled in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present invention. Also, in order to make the description clearer and simpler, a detailed description of functions and configurations well known in the art will be omitted.
HTML (HyperText Markup Language) marks up the various parts of a web page to be displayed using markup symbols. "Hypertext" means that the page may contain non-text elements such as pictures, links, and even music and programs. HTML5 is the fifth version of HTML. For example, the interactive feedback method according to the present invention may be developed based on HTML5. Of course, the invention is not limited thereto; those skilled in the art will appreciate that the interactive feedback method according to the present invention may also be developed based on other languages. Therefore, before describing embodiments of the present invention, some elements of HTML5 are first described.
canvas: allows a scripting language to dynamically render bitmap images.
audio: allows a scripting language to dynamically play audio files in MP3 or similar formats.
webaudio: a more advanced audio interface that allows a scripting language to dynamically play audio files in MP3 or similar formats and to read the attributes of the audio files.
requestAnimationFrame: a browser interface for timed loop operations, invoked once per display refresh.
Hereinafter, an interactive feedback method according to an embodiment of the present invention will be described with reference to fig. 1. The interactive feedback method is applied during the playing of a piece of music. For example, during the playing of the piece of music, a series of controls may be performed, such as acquiring the background play status, playing, pausing, stopping, controlling the playback progress, and listening for music pause and music stop events.
In a specific implementation, for example, in HTML5, the above-described controls may be implemented through the following interfaces.
wx.getBackgroundAudioPlayerState(OBJECT): used to acquire the background music playing state. The return parameters include: success (callback for a successful interface call), fail (callback for a failed interface call), and complete (callback invoked when the interface call finishes). Here, success further carries the following parameters: duration (length of the selected audio), currentPosition (playback position of the selected audio), status (playing state), downloadPercent (download progress of the audio), and dataUrl (song data link).
wx.playBackgroundAudio(OBJECT): used for playing music.
wx.pauseBackgroundAudio(): used for pausing music playback.
wx.seekBackgroundAudio(OBJECT): used for controlling the music playback progress.
wx.stopBackgroundAudio(): used for stopping music playback.
wx.onBackgroundAudioPlay(CALLBACK): listens for music play events.
wx.onBackgroundAudioPause(CALLBACK): listens for music pause events.
wx.onBackgroundAudioStop(CALLBACK): listens for music stop events.
Referring to fig. 1, the interactive feedback method includes the following steps.
First, in step S101, a repetitive tempo length of a piece of music is acquired.
In music, beats are formed by cyclic repetition of strong and weak time intervals in a certain order. By definition, the strong-weak relationship is an indispensable factor of a beat, and a beat repeats cyclically with the same period: for example, strong-weak repetition (duple meter) or strong-weak-weak repetition (triple meter). It can be seen that the order of the strong-weak relationship plays the major role in such cyclic repetition. Within a meter, this repeating period is called the "unit beat", which is what we ordinarily call "one beat". A unit beat in a strong position is a strong beat, and a unit beat in a weak position is a weak beat. The time occupied by one unit beat is the minimum beat duration. For example, the repeated beat duration determined in step S101 may be equal to one minimum beat duration, or, alternatively, it may be equal to several minimum beat durations.
Hereinafter, description will be given taking an example in which the repetition beat duration is equal to one minimum beat duration. This should, of course, not be construed as limiting. The invention is also applicable to the case where the repeated beat duration is equal to multiple minimum beat durations.
The minimum beat time length is determined by first acquiring the total time length of music and beat data (e.g., the number of beats) of the music, and then dividing the total time length by the number of beats.
In a specific implementation, for example, as described above, the total duration of the music may be obtained through the duration parameter of audio or webaudio in HTML5: t.totalTime = t.bgSoundObj.duration. Then, the tempo data of the music (e.g., how many times the piece of music repeats the beat in total) can be obtained by reading the attributes of the music. Assuming that the piece of music repeats 64 beats in total, the minimum beat duration is t.wordTime = t.totalTime / 64.
Then, in step S102, the music is divided into a plurality of continuous repeated beat periods in units of the repeated beat duration. For example, the repeated beat periods are equal in length and arranged consecutively in chronological order. Alternatively, the repeated beat periods may be of unequal length; for example, the 1st repeated beat period may be equal to 1 repeated beat duration, the 2nd repeated beat period may be equal to 2 repeated beat durations, and so on.
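For the equal-length case, the division of steps S101 and S102 can be sketched in a few lines of JavaScript. This is an illustrative sketch rather than the disclosed implementation; all function names are assumptions, and only `wordTime` follows the t.wordTime notation used in this description.

```javascript
// Sketch of steps S101-S102: divide a piece of music into consecutive
// repeated beat periods of equal length. Names are illustrative.

// Minimum (repeated) beat duration: total duration / number of beats,
// e.g. t.wordTime = t.totalTime / 64.
function repeatedBeatDuration(totalTime, beatCount) {
  return totalTime / beatCount;
}

// 0-based index of the repeated beat period containing time point t.
function periodIndexAt(t, wordTime) {
  return Math.floor(t / wordTime);
}

// [start, end) boundaries of the i-th repeated beat period.
function periodBounds(i, wordTime) {
  return { start: i * wordTime, end: (i + 1) * wordTime };
}
```

For instance, a 128-second piece with 64 beats gives a repeated beat duration of 2 seconds, so a click at 5.3 seconds falls in period 2, i.e. the interval [4, 6).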
Next, in step S103, it is determined whether or not the first click operation is detected during the playing of music. For example, the first click operation may be a click operation of a user on a touch display screen of a playback interface on which the music is displayed. Of course, the invention is not limited thereto. For example, depending on the different devices playing the music, the first clicking operation may also be a clicking operation on a physical key or a specific location of the device housing.
If the judgment in step S103 is yes, that is: the first click operation is detected during the playing of the music, the process proceeds to step S104. In step S104, a first repeating beat period in which a time point at which the first click operation occurs is determined, and a first effect corresponding to the first click operation is played in a second repeating beat period subsequent to the first repeating beat period.
In a specific implementation, cyclically listening to the playback position of the music is implemented using window.requestAnimationFrame. The advantage of window.requestAnimationFrame is that it makes full use of the display's refresh mechanism and comparatively saves system resources. Displays have a fixed refresh rate (60 Hz or 75 Hz), and the basic idea behind window.requestAnimationFrame is to cyclically detect the current playback position of the music at that frequency. If it is detected that the current playback position of the music has entered the second repeated beat period (i.e., if (t.bgSoundObj.position >= t.index * t.wordTime)), the first effect corresponding to the first click operation is played.
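The cyclic check just described can be split into a pure predicate plus a polling loop. The sketch below is an assumption-laden illustration: `hasEnteredPeriod` mirrors the condition above, and the requestAnimationFrame wiring (browser-only) is replaced by a timer fallback so the sketch stays self-contained.

```javascript
// Pure predicate: has the playback position reached the target repeated
// beat period? Mirrors t.bgSoundObj.position >= t.index * t.wordTime.
function hasEnteredPeriod(position, targetIndex, wordTime) {
  return position >= targetIndex * wordTime;
}

// Illustrative polling loop. In a browser this would typically call
// window.requestAnimationFrame(tick) instead of setTimeout.
function pollPlayback(getPosition, targetIndex, wordTime, onEnter) {
  function tick() {
    if (hasEnteredPeriod(getPosition(), targetIndex, wordTime)) {
      onEnter(); // play the first effect
    } else {
      setTimeout(tick, 16); // ~60 Hz fallback for the sketch
    }
  }
  tick();
}
```

The design point is that the frame loop only reads state and tests the predicate each refresh; the effect itself is triggered exactly once, at the first frame whose playback position lies inside the target period.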
On the other hand, if it is determined in step S103 as no, that is, the first click operation is not detected during the playing of the music, it proceeds to process 1, which will be described later.
Fig. 2 is a diagram showing an example of temporal precedence of the first click operation and the first effect. As shown in fig. 2, if the first clicking operation occurs during the first repeated beat period of music, no operation is performed in the current period. Playing a first effect as feedback to the first click operation is started at a start point of a next repeated beat period (i.e., a second repeated beat period) adjacent to the first repeated beat period, and the first effect is continuously played until the second repeated beat period ends, that is: the first effect is played during the second repeating beat period. Shown in fig. 2 is a case where the first effect is played at the next repeated beat period adjacent to the first repeated beat period. Of course, the invention is not limited thereto. For example, depending on the specific design requirements, the first effect may also be played at the nth repeating beat period after the first repeating beat period.
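The timing shown in fig. 2 amounts to computing the effect's playback window from the click time. A hedged sketch follows, where the function name and the parameter n are illustrative; n = 1 reproduces the "adjacent next period" case, and larger n covers the generalization mentioned above.

```javascript
// Sketch of the Fig. 2 timing: a click in repeated beat period i
// schedules its effect over period i + n (n = 1 by default).
function effectWindowForClick(clickTime, wordTime, n = 1) {
  const clickPeriod = Math.floor(clickTime / wordTime);
  const start = (clickPeriod + n) * wordTime;
  return { start: start, end: start + wordTime };
}
```

For example, with a repeated beat duration of 2 seconds, a click at 5.3 seconds (period 2) yields an effect window of [6, 8): the effect starts at the beginning of the next period and lasts until that period ends.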
Therefore, in the interactive feedback method according to the embodiment of the present invention, for the click operation input from the outside, the effect corresponding to the click operation can be played as the feedback of the click operation while automatically attaching to the tempo of the original music. Since the time interval of the click operation of the external input is arbitrary and is usually different from the original music tempo, the effect of multi-rhythm combined performance is realized.
It is noted herein that references to "first" above and "second" below are intended to distinguish between the same terms or to correspond between different terms, without necessarily highlighting any aspect (e.g., temporal) of precedence. For example, the first repeated beat period is intended to define a period corresponding to the first click operation, and the chronological precedence relationship is not emphasized. That is, the first repeated tempo period is not necessarily the first repeated tempo period of music.
In addition, after the first click operation is detected, the method may further include the step of determining the first effect.
For example, the step of determining the first effect may comprise: selecting one effect among a plurality of effects as the first effect based on a time point at which the first click operation occurs.
In this case, a correspondence relationship may be established in advance between a plurality of effects and a plurality of repeated beat periods. As described in step S104 above, based on the point in time at which the first click operation occurs, a repeated beat period during which an effect is to be played may be determined. Further, based on the correspondence relationship established in advance and the repeated beat period, the effect to be played during the repeated beat period can be determined. Alternatively, one effect may be randomly selected among a plurality of effects as the first effect.
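The pre-established correspondence can be as simple as a cyclic mapping from the period index to an effect. The sketch below is one illustrative choice, not the disclosed rule.

```javascript
// Illustrative pre-established correspondence: the repeated beat period
// index selects an effect cyclically from a fixed list.
function effectForPeriod(effects, periodIndex) {
  return effects[periodIndex % effects.length];
}
```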
Or, alternatively, the step of determining the first effect may also include: selecting one effect group from a plurality of effect groups based on a time point at which the first click operation occurs, wherein one effect group includes a plurality of effects. And selecting one effect of the set of effects as the first effect.
When multiple click operations are detected during the playing of music, the corresponding effect may be determined independently for each click operation. Or, alternatively, when a plurality of click operations are detected during the playing of music, it may also be decided whether to associate an effect played in response to two adjacent click operations based on an interval between the two click operations.
Fig. 3 shows an example of how to determine a corresponding play effect when a plurality of click operations are detected during the playing of music.
In the case shown in fig. 3, the step of selecting one effect group among a plurality of effect groups based on a point in time at which the first click operation occurs, and selecting one effect among the effect groups as the first effect further includes the following steps.
First, in step S301, it is determined whether a second click operation has been detected before the first click operation is detected.
If it is determined yes in step S301, that is: the second click operation has been detected before the first click operation is detected, then the process proceeds to step S302.
In step S302, it is determined whether a time interval between the first click operation and the second click operation is less than a predetermined threshold.
If the determination in step S302 is yes, namely: the time interval is smaller than the predetermined threshold, the process proceeds to step S303. In step S303, an effect group in which the second effect played in response to the second click operation is located is selected, and one of the effects is selected as the first effect. For example, the first effect may be an effect in the same set of effects that is linked to the second effect. Alternatively, the first effect may be the same as the second effect. That is, if the interval time between two click operations is short, two effects corresponding thereto are set to be associated.
If the determination in step S301 is negative, that is: the second click operation has not been detected before the first click operation is detected, then the process proceeds to step S304. In step S304, one effect group is selected based on a predetermined rule, and one of the effects is selected as the first effect. For example, one effect group may be randomly selected, and one of the effects (e.g., a first effect) may be selected as the first effect. Alternatively, one effect group may be selected based on a point in time at which the first click operation occurs, and one of the effects (for example, the first effect) may be selected as the first effect. Thereby, a completely new first effect is achieved.
If the determination in step S302 is NO: the time interval is greater than the predetermined threshold, the process also proceeds to step S304. In step S304, one effect group is selected based on a predetermined rule, and one of the effects is selected as the first effect. For example, one effect group may be randomly selected, and one of the effects (e.g., a first effect) may be selected as the first effect. Alternatively, one effect group may be selected based on a point in time at which the first click operation occurs, and one of the effects (for example, the first effect) may be selected as the first effect. Alternatively, a different effect group from the effect group in which the second effect is located may be selected, and one of the effects (for example, the first effect) may be selected as the first effect. Thereby, a completely new first effect is achieved.
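The fig. 3 flow (steps S301 through S304) can be summarized in one selection function. This sketch makes several assumptions not fixed by the description: effect groups are stored as arrays, the "linked" choice is the next effect in the same group, and the predetermined rule of step S304 is a random group pick.

```javascript
// Sketch of the Fig. 3 selection flow. groups: array of effect arrays.
// lastClick: { time, groupIndex, effectIndex } for the second click
// operation, or null if none was detected before the first click.
function selectFirstEffect(groups, lastClick, firstClickTime, threshold) {
  if (lastClick !== null && firstClickTime - lastClick.time < threshold) {
    // S303: stay in the group of the second effect and take the next
    // effect in it, so the two effects are linked.
    const group = groups[lastClick.groupIndex];
    const next = (lastClick.effectIndex + 1) % group.length;
    return { groupIndex: lastClick.groupIndex, effectIndex: next };
  }
  // S304: select a group by a predetermined rule (random pick here)
  // and start a completely new effect from its first entry.
  const g = Math.floor(Math.random() * groups.length);
  return { groupIndex: g, effectIndex: 0 };
}
```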
As an embodiment, the effect played in response to a click operation may include audio. In this case, the first effect includes a first audio corresponding to the first click operation. For example, the first audio may be audio corresponding to a single Chinese character, or audio corresponding to a certain sound (e.g., rata).
In this embodiment, selecting one effect as the first effect specifically includes: selecting one audio among a plurality of audios as the first audio. Correspondingly, selecting one effect group from the plurality of effect groups and selecting one effect in that group as the first effect specifically includes: selecting one audio group from a plurality of audio groups and selecting one audio in that group as the first audio.
Referring to fig. 3, when the interval between two click operations is less than the predetermined threshold, step S302 may include: selecting the audio group containing the second audio played in response to the second click operation, and selecting one audio in that group as the first audio. For example, an audio group may be a group of four audios corresponding to the four characters of an idiom. Assuming that the second audio corresponds to the second character of an idiom (i.e., an audio group), the audio corresponding to the third character of that idiom may be selected as the first audio. Thus, the first effect is chained to the second effect.
When the interval between two click operations is not less than the predetermined threshold, or when there is only one click operation, step S304 may include: selecting one audio group based on a predetermined rule, and selecting one audio in it as the first audio. That is, the audio group needs to be reselected. For example, another idiom (i.e., another audio group) may be selected, and the audio corresponding to its first character selected as the first audio. Thereby, a completely new first effect is presented.
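The idiom case can be sketched as follows. The idiom data (in pinyin), the threshold value, and the helper name `nextWordAudio` are all illustrative assumptions; each four-character idiom stands for one audio group of four word-audios.

```javascript
// Each idiom is one audio group of four word-audios (hypothetical data).
const IDIOMS = [
  ['xin', 'xiang', 'shi', 'cheng'], // first idiom
  ['wan', 'shi', 'ru', 'yi'],       // another idiom
];
const THRESHOLD_MS = 1000; // predetermined threshold (assumed value)

// Continue the idiom for quick successive clicks; otherwise restart from the
// first word of a reselected idiom ("another idiom" rule is assumed here).
function nextWordAudio(clickTime, prev) {
  if (prev && clickTime - prev.time < THRESHOLD_MS &&
      prev.wordIndex + 1 < IDIOMS[prev.idiomIndex].length) {
    return { idiomIndex: prev.idiomIndex, wordIndex: prev.wordIndex + 1, time: clickTime };
  }
  const idiomIndex = prev ? (prev.idiomIndex + 1) % IDIOMS.length : 0;
  return { idiomIndex, wordIndex: 0, time: clickTime };
}
```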
As another implementation, the effects played in response to the click operation may include animations. In this case, the first effect includes a first animation corresponding to the first click operation.
For example, in a specific implementation, the animation may be rendered through the canvas tag in HTML5.
The canvas element was added in HTML5; scripts (usually JavaScript) can be used to draw images in it. It can be used to compose photo collections, create animations, and even perform real-time video processing and rendering.
A canvas is a drawable area defined in HTML code with height and width attributes. JavaScript code can access this area through a complete set of drawing functions, similar to other general-purpose two-dimensional APIs, to dynamically generate graphics.
The canvas element creates a fixed-size drawing surface and exposes one or more rendering contexts ("brushes"), which are used to draw and manipulate the content to be presented.
canvas natively supports only one primitive shape: the rectangle. All other shapes must be built from at least one path. By combining paths, however, complex graphics can be drawn.
canvas provides three methods to draw rectangles:
fillRect(x, y, width, height): draws a filled rectangle.
strokeRect(x, y, width, height): draws a rectangular outline.
clearRect(x, y, width, height): clears the specified rectangular area, making it fully transparent.
The three methods above take the same parameters: x and y are the coordinates of the rectangle's upper-left corner (relative to the canvas origin), and width and height are the width and height of the rectangle to be drawn. The basic element of a graphic is the path.
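The three rectangle methods can be exercised with a small sketch. Since canvas code normally runs in a browser, a tiny recording stub stands in for the real 2D context here so the call sequence can be shown outside one; the stub is purely illustrative, and in a browser one would use `canvas.getContext('2d')` instead.

```javascript
// Stand-in for canvas.getContext('2d'); it only records the calls made.
function makeStubContext() {
  const calls = [];
  const record = name => (...args) => calls.push([name, ...args]);
  return {
    calls,
    fillRect: record('fillRect'),
    strokeRect: record('strokeRect'),
    clearRect: record('clearRect'),
  };
}

const ctx = makeStubContext();
ctx.fillRect(10, 10, 100, 50);   // filled rectangle with top-left at (10, 10)
ctx.strokeRect(10, 10, 100, 50); // outline of the same rectangle
ctx.clearRect(30, 20, 20, 10);   // punch a fully transparent hole into it
```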
A path is a collection of points connected by line segments or curves of different shapes, colors, and widths. A path, or even a subpath, can be closed. Drawing with paths uses the following methods:
beginPath(): starts a new path; once the path is created, subsequent drawing commands are directed to the path to build it up.
moveTo(x, y): moves the pen to the specified coordinates (x, y), i.e., sets the starting point of the path.
closePath(): closes the path; subsequent drawing commands are directed back to the context.
stroke(): draws the outline of the shape with lines.
fill(): fills the path's content area to produce a solid shape.
Line segments, triangles, arcs, and other graphics can be drawn with the above methods.
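A filled triangle drawn with the path methods above can be sketched as follows; `lineTo(x, y)`, which draws a straight segment to (x, y), belongs to the same path API although it is not listed above. As before, a recording stub (built here with a `Proxy`) stands in for the browser's 2D context and is purely illustrative.

```javascript
// Recording stand-in for the 2D context; every method call is logged.
const calls = [];
const ctx = new Proxy({}, {
  get: (_, name) => (...args) => calls.push([String(name), ...args]),
});

// Draw a filled triangle using the path methods described above.
ctx.beginPath();
ctx.moveTo(75, 50);  // starting point of the path
ctx.lineTo(100, 75); // straight segment to (100, 75)
ctx.lineTo(100, 25);
ctx.closePath();     // close the triangle back to (75, 50)
ctx.fill();          // produce a solid shape
```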
The image drawing described above uses default lines and colors. Styles and colors can also be set, for example, with the following properties.
fillStyle = color: sets the fill color of the shape.
strokeStyle = color: sets the color of the shape's outline.
globalAlpha = transparencyValue: sets the transparency of the shape.
lineWidth = value: sets the line width.
lineCap = type: sets the style of the line ends.
lineJoin = type: sets the style of the joints between lines in the same path.
In addition, canvas also provides two methods to render text:
fillText(text, x, y [, maxWidth]): fills in the given text at the specified (x, y) position; the maximum drawing width is optional.
strokeText(text, x, y [, maxWidth]): strokes the outline of the text at the specified (x, y) position; the maximum drawing width is optional.
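The style properties and text methods combine naturally. Again a recording stub stands in for the browser's 2D context so the sketch runs outside a browser; the stub, the concrete values, and the use of the `font` property (a real context property, though not listed above) are illustrative.

```javascript
// Minimal property-and-call recorder standing in for a 2D context.
const ops = [];
const ctx = new Proxy({}, {
  set: (t, prop, value) => { ops.push([String(prop), value]); t[prop] = value; return true; },
  get: (t, prop) => prop in t ? t[prop] : (...args) => ops.push([String(prop), ...args]),
});

ctx.fillStyle = 'orange';             // fill color
ctx.globalAlpha = 0.8;                // transparency
ctx.font = '24px sans-serif';         // text font
ctx.fillText('Hello', 10, 50);        // fill text at (10, 50)
ctx.strokeStyle = 'blue';             // outline color
ctx.strokeText('Hello', 10, 90, 200); // optional maxWidth of 200
```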
The above describes how graphics are drawn with canvas. Of course, existing pictures can also be loaded directly onto the canvas.
The basic steps of drawing an animation frame in a canvas include:
First, the canvas is cleared. Everything must be cleared before each animation frame is drawn; the simplest way to do this is the clearRect() method.
Next, the canvas state is saved. If settings of the canvas (colors, a translated origin, etc.) are changed during drawing but each frame should start from the original state, the canvas state should be saved first.
Next, the animated graphic is drawn. Each animation frame can be drawn using the methods described above.
Finally, the canvas state is restored. If the canvas state was saved earlier, it should be restored after the frame has been drawn.
Additionally, an animation instance can be created, for example, with wx.createAnimation(). Calling step() after a sequence of animation method calls marks the completion of one group of animations; any number of animation methods can be called within a group, all animations in a group start at the same time, and the next group begins after the current group completes. Finally, the animation data is exported through the export() method of the animation instance and passed to a component's animation attribute.
To animate, methods that can trigger redrawing periodically are also needed. For example, the following three methods may be used: setInterval(), setTimeout(), and requestAnimationFrame().
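The four frame-drawing steps above can be sketched as a single frame routine. The recording stub and the frame content (a small moving rectangle) are illustrative; in a browser the routine would be scheduled repeatedly with one of the three methods just mentioned, e.g. requestAnimationFrame().

```javascript
// Recording stand-in for the 2D context (browser code would use the real one).
const calls = [];
const ctx = new Proxy({}, {
  get: (_, name) => (...args) => calls.push([String(name), ...args]),
});

function drawFrame(ctx, x) {
  ctx.clearRect(0, 0, 300, 150); // 1. clear the canvas
  ctx.save();                    // 2. save the canvas state
  ctx.translate(x, 0);           //    state change local to this frame
  ctx.fillRect(0, 60, 20, 20);   // 3. draw the animated graphic
  ctx.restore();                 // 4. restore the canvas state
}

// In a browser: function loop() { drawFrame(ctx, x++); requestAnimationFrame(loop); }
for (let x = 0; x < 3; x++) drawFrame(ctx, x);
```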
In this embodiment, selecting one effect as the first effect specifically includes: selecting one animation among a plurality of animations as the first animation. Correspondingly, selecting one effect group from the plurality of effect groups and selecting one effect in that group as the first effect specifically includes: selecting one animation group from a plurality of animation groups and selecting one animation in that group as the first animation.
Referring to fig. 3, when the interval between two click operations is less than the predetermined threshold, step S302 may include: selecting the animation group containing the second animation played in response to the second click operation, and selecting one animation in that group as the first animation. For example, figs. 4(A)-4(D) illustrate one example of an animation group. As shown in figs. 4(A)-4(D), one animation group may include four animations for a set of consecutive actions. Of course, since only static drawings can be provided here, figs. 4(A)-4(D) show screenshots corresponding to the four animations. Assuming that the second animation corresponds to the second action in an animation group, the animation corresponding to the third action in that group may be selected as the first animation. Thus, the first effect is chained to the second effect.
When the interval between two click operations is not less than the predetermined threshold, or when there is only one click operation, step S304 may include: selecting one animation group based on a predetermined rule, and selecting one animation in it as the first animation. That is, the animation group needs to be reselected. For example, a set of consecutive actions of another cartoon character (i.e., another animation group) may be selected, and the animation corresponding to the first action in that group selected as the first animation. Thereby, a completely new first effect is presented.
As still another embodiment, the effect played in response to a click operation may include both audio and animation, and the audio and animation may be associated with each other. In this case, the first effect includes a first audio and a first animation corresponding to the first click operation, where the first animation may correspond to the first audio. In this embodiment, the first audio played in response to the first click operation may be determined in the same manner as above, and the details will not be repeated. Then, the first animation to be played is determined based on the correspondence between the first audio and the first animation. For example, the four animations included in the animation group shown in fig. 4 may correspond to the audios of the Chinese characters "heart", "want", "thing", "yes", respectively.
The above describes how, when a first click operation is detected during music playback, the effect corresponding to it is determined and played. On the other hand, when no first click operation is detected during music playback, that is, when the determination in step S103 in fig. 1 is no, the following process is executed: playing a scene corresponding to the music, where the scene includes a beat animation corresponding to a repeated beat period of the music. In a specific implementation, the animation may be rendered through the canvas tag in HTML5.
Fig. 5 shows an example of the correspondence in time between the repeated beat periods of the music, the scenes, and the beat animations. In fig. 5, 501 and 502 represent different scenes: scene 501 includes four repeated beat periods 5011, 5012, 5013, and 5014, and scene 502 includes four repeated beat periods 5021, 5022, 5023, and 5024. That is, in the case shown in fig. 5, the scene switches every predetermined number (e.g., four) of repeated beat periods, and different repeated beat periods correspond to different beat animations. Within one scene, although different beat animations correspond to different repeated beat periods, these beat animations belong to the same animation group. Scene 501 is taken as an example. As shown in fig. 5, the four repeated beat periods 5011, 5012, 5013, and 5014 included in scene 501 correspond to four different beat animations (screenshots), respectively. These beat animations represent different movements of the same cartoon character. The beat animation corresponding to repeated beat period 5011 is an animation of the cartoon character jumping upwards; to visually convey rhythm along with the music beat, the beat animation corresponding to the next repeated beat period 5012 is an animation of a different action of the same character, such as jumping downwards. Scene 502 is similar to scene 501 and is not described in detail; it differs from scene 501 in its specific content. For example, scenes 501 and 502 may play beat animations corresponding to the motions of different cartoon characters, or, as described below, may differ in their background images.
In this case, when the first click operation is detected, the beat animation included in the scene is currently being played and overlaps in playback time with the first animation to be played in response to the first click operation. Therefore, playing the first effect corresponding to the first click operation as described above includes: replacing the currently played beat animation with the first animation.
In addition, the scene may further include a background image in addition to the beat animation corresponding to the repeated beat period of the music.
The method may further comprise: dividing the music into a plurality of consecutive combined beat periods, wherein each combined beat period comprises a plurality of the repeated beat periods. For example, if the repeated beat period is t.wordTime = t.totalTime/64 as described above, the combined beat period may be t.sceneTime = t.totalTime/16, so that each combined beat period contains four repeated beat periods. The plurality of combined beat periods correspond to different scenes, respectively, as described above.
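The division can be sketched directly from the total duration. The names `totalTime` and `wordTime` mirror the document's `t.totalTime`/`t.wordTime`; `sceneTime` is an assumed name for the combined beat period, and the concrete duration is hypothetical.

```javascript
const totalTime = 128;            // total music duration in seconds (hypothetical)
const wordTime = totalTime / 64;  // repeated beat period
const sceneTime = totalTime / 16; // combined beat period (one scene)
// With this 64/16 split, each scene spans sceneTime / wordTime = 4 repeated beat periods.

// Map a playback position to its repeated-beat index and scene index.
function beatIndex(position) { return Math.floor(position / wordTime); }
function sceneIndex(position) { return Math.floor(position / sceneTime); }
```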
And when the position of the music playing enters the next combined beat time period from the current combined beat time period, transforming the beat animation. Here, transforming the tempo animation may refer to transforming the tempo animation corresponding to different actions in the same animation group, for example, transforming to the tempo animation corresponding to different actions of the same cartoon character. Alternatively, a transition to a different animation group is also possible, for example, a transition to a beat animation corresponding to the motion of another cartoon character. In addition, when the position where the music is played enters the next combined beat period from the current combined beat period, the background image may be further transformed. For example, in the case where the background image is a monochrome image, the color of the background image may be changed.
In a specific implementation, cyclically listening to the playback position of the music is implemented with window.requestAnimationFrame. The advantage of window.requestAnimationFrame is that it makes full use of the display's refresh mechanism and is comparatively economical with system resources. Displays have a fixed refresh rate (60 Hz or 75 Hz), and the idea behind window.requestAnimationFrame is to redraw in step with that rate; the current playback position of the music is checked cyclically at this frequency. If the current playback position is detected to have entered the next repeated beat period from the current one (i.e., if (t.bgSoundObj.position >= t.index * this.wordTime)), playback switches to the beat animation corresponding to that repeated beat period. Of course, if a click operation has been detected in the current repeated beat period, the beat animation is replaced with the animation corresponding to the click operation when entry into the next repeated beat period is detected. If the current playback position is detected to have entered the next scene from the current one (i.e., if (t.bgSoundObj.position >= t.index * this.sceneTime)), playback switches to the corresponding scene.
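The polling logic can be sketched without a browser by driving one check manually. `bgSoundObj` is an assumed spelling of the music object, the period lengths are hypothetical, and in a browser `checkPosition` would be rescheduled each frame with window.requestAnimationFrame.

```javascript
const state = {
  bgSoundObj: { position: 0 }, // stand-in for the playing music
  wordTime: 2,                 // repeated beat period in seconds (assumed)
  sceneTime: 8,                // combined beat period / scene length (assumed)
  beatIndex: 0,
  sceneIdx: 0,
  events: [],                  // records the switches that would be triggered
};

// One polling step; a browser loop would call this via requestAnimationFrame.
function checkPosition(t) {
  while (t.bgSoundObj.position >= (t.beatIndex + 1) * t.wordTime) {
    t.beatIndex++;
    t.events.push(['beat', t.beatIndex]);   // switch the beat animation
  }
  while (t.bgSoundObj.position >= (t.sceneIdx + 1) * t.sceneTime) {
    t.sceneIdx++;
    t.events.push(['scene', t.sceneIdx]);   // switch the scene
  }
}

state.bgSoundObj.position = 9; // simulate playback progress to 9 seconds
checkPosition(state);
```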
In addition, the interactive feedback method according to the embodiment of the present invention may further include: recording, on top of the original scene, the effects played in response to click operations detected during music playback, and outputting the result after the music finishes. Thus, the user can obtain, replay, or share a personalized music animation that shows his or her own style.
Next, an interactive feedback device according to an embodiment of the present invention will be described with reference to fig. 6. For example, the interactive feedback device may be a smart phone, a PDA (personal digital assistant), a tablet computer, a desktop computer, and various information processing devices.
As shown in fig. 6, the interactive feedback device 600 may include: an obtaining unit 601, a dividing unit 602, a playback unit 603, a detection unit 604, and a control unit 605.
The obtaining unit 601 is configured to obtain a duration of a repeated beat of the piece of music.
The dividing unit 602 is configured to divide the music into a plurality of consecutive repeated tempo periods in units of the repeated tempo duration.
The playback unit 603 is used to perform playback operations, such as playing back music, effects, and the like.
The detection unit 604 is configured to detect a click operation.
The control unit 605 is configured to, when the detection unit detects a first click operation during the playing of the music, determine a first repeated beat period in which the time point at which the first click operation occurs is located, and control the playback unit 603 to play a first effect corresponding to the first click operation in a second repeated beat period subsequent to the first repeated beat period.
Wherein the control unit 605 is further configured to: after the detection unit 604 detects the first click operation, perform processing for determining the first effect, the processing including: selecting one effect group among a plurality of effect groups based on the time point at which the first click operation occurs, and selecting one effect in that group as the first effect.
Therefore, with the interactive feedback device according to the embodiment of the present invention, for an externally input click operation, an effect corresponding to the click operation can be played as feedback while automatically fitting the tempo of the original music. Since the time interval of externally input click operations is arbitrary and usually differs from the original music tempo, an effect of multi-rhythm combined performance is achieved.
Wherein the processing of selecting one effect group among a plurality of effect groups based on a time point at which the first click operation occurs, and selecting one effect of the effect group as the first effect further includes: judging whether a second clicking operation is detected before the first clicking operation is detected; if a second click operation is detected before the first click operation is detected, judging whether the time interval between the first click operation and the second click operation is smaller than a preset threshold value; if the time interval is smaller than a preset threshold value, selecting an effect group where a second effect played in response to a second click operation is located, and selecting one effect as the first effect; if a second click operation has not been detected before the first click operation is detected, or if the time interval is greater than a predetermined threshold, selecting a group of effects based on a predetermined rule and selecting one of the effects as the first effect.
As one embodiment, the first effect includes a first audio corresponding to the first click operation.
As another embodiment, the first effect includes a first animation corresponding to the first click operation.
As still another embodiment, the first effect includes first audio and a first animation corresponding to the first click operation, wherein the first animation may correspond to the first audio.
When the detection unit 604 does not detect the first click operation during the playing of the music, the control unit 605 further controls the playing unit 603 to play a scene corresponding to the music, wherein the scene includes a beat animation corresponding to a repeated beat period of the music.
Wherein, when playing the first effect corresponding to the first click operation, the playing unit 603 is further configured to: and replacing the currently played rhythm animation with the first animation.
In addition, the scene may further include a background picture. The dividing unit 602 is further configured to divide the music into a plurality of consecutive combined beat periods, wherein each of the combined beat periods comprises a plurality of the repeated beat periods; and the play unit 603 is further configured to: and when the position of the music playing enters the next combined beat time period from the current combined beat time period, transforming the beat animation and/or the background image.
Fig. 7 shows an interactive feedback device according to the present invention as an example of a hardware entity. The device comprises a processor 701, a memory 702, and at least one external communication interface 703, all connected by a bus 704.
The processor 701 for data processing may be implemented by a microprocessor, a central processing unit (CPU), a digital signal processor (DSP), or a field-programmable gate array (FPGA); the memory 702 contains operation instructions, which may be computer-executable code, and these operation instructions implement the steps of the interactive feedback method according to the embodiment of the present invention.
Fig. 8 illustrates a schematic diagram of a computer-readable recording medium according to an embodiment of the present invention. As shown in fig. 8, the computer-readable recording medium 800 has computer program instructions 801 stored thereon. The computer program instructions 801, when executed by a processor, perform the interactive feedback method according to an embodiment of the present invention described with reference to the above figures.
Since the operations of the respective units in the interactive feedback device according to the embodiment of the present invention completely correspond to the respective steps in the interactive feedback method described above, details thereof are not described here for the sake of avoiding redundancy.
Thus far, the interactive feedback method, device, and recording medium according to embodiments of the present invention have been described in detail with reference to figs. 1 to 8. With them, playback content can be changed as feedback in response to an external input. In addition, the effect corresponding to the external input can be played while automatically fitting the tempo of the original music; since the time interval of externally input click operations is arbitrary and usually differs from the original music tempo, an effect of multi-rhythm combined performance is achieved. Furthermore, since the interactive feedback method according to the present invention is developed based on HTML5, which is widely deployed across product platforms, development and manufacturing costs are low.
It should be noted that, in the present specification, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises it.
Finally, it should be noted that the series of processes described above includes not only processes performed in time series in the order described herein, but also processes performed in parallel or individually, rather than in time series.
Through the above description of the embodiments, those skilled in the art will clearly understand that the present invention may be implemented by software plus a necessary hardware platform, or entirely by software. With this understanding, all or part of the technical solution of the present invention that contributes over the prior art can be embodied in the form of a software product, which can be stored in a storage medium such as a ROM/RAM, a magnetic disk, or an optical disc, and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to execute the methods of the embodiments, or parts thereof, of the present invention.
The present invention has been described in detail, and the principle and embodiments of the present invention are explained herein by using specific examples, which are only used to help understand the method and the core idea of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (11)

1. An interactive feedback method, comprising:
acquiring the repeated beat time length of a piece of music;
dividing the music into a plurality of continuous repeated beat time periods by taking the repeated beat time length as a unit;
determining a first repeating tempo period in which a point in time at which the first click operation occurs is located when the first click operation is detected during the playing of the music, and playing a first effect corresponding to the first click operation in a second repeating tempo period subsequent to the first repeating tempo period,
wherein, after detecting the first click operation, the method further comprises the step of determining the first effect, and the step of determining the first effect comprises:
judging whether a second clicking operation is detected before the first clicking operation is detected;
if a second click operation is detected before the first click operation is detected, judging whether the time interval between the first click operation and the second click operation is smaller than a preset threshold value;
if the time interval is smaller than a preset threshold value, selecting an effect group where a second effect played in response to a second click operation is located, and selecting one effect as the first effect;
if a second click operation has not been detected before the first click operation is detected, or if the time interval is greater than a predetermined threshold, selecting a group of effects based on a predetermined rule and selecting one of the effects as the first effect.
2. The method of claim 1, wherein the first effect comprises first audio corresponding to the first tap operation.
3. The method of claim 1 or 2, wherein the first effect comprises a first animation corresponding to the first click operation.
4. The method of claim 3, further comprising:
when a first click operation is not detected in the playing process of the music, a scene corresponding to the music is played, wherein the scene comprises a beat animation corresponding to a repeated beat period of the music.
5. The method according to claim 4, wherein, when a first click operation is detected during the playing of the music, playing a first effect corresponding to the first click operation includes: and replacing the currently played rhythm animation with the first animation.
6. The method of claim 4, wherein the scene further comprises a background image, and
the method further comprises:
dividing the music into a plurality of successive combined beat periods, wherein each of the combined beat periods comprises a plurality of the repeating beat periods, an
And when the position of the music playing enters the next combined beat time period from the current combined beat time period, transforming the beat animation and/or the background image.
7. An interactive feedback device comprising:
the acquiring unit is used for acquiring the repeated beat time length of a piece of music;
a dividing unit configured to divide the music into a plurality of continuous repeated tempo periods in units of the repeated tempo duration;
a playback unit configured to perform a playback operation;
the detection unit is used for detecting click operation; and
a control unit configured to determine a first repeating tempo period in which a point in time at which the first click operation occurs is located when the detection unit detects the first click operation during the playing of the music, and control the playing unit to play a first effect corresponding to the first click operation in a second repeating tempo period subsequent to the first repeating tempo period,
wherein the control unit is further configured to: after the detection unit detects the first click operation, performing processing for determining the first effect, and the processing for determining the first effect includes:
judging whether a second clicking operation is detected before the first clicking operation is detected;
if a second click operation is detected before the first click operation is detected, judging whether the time interval between the first click operation and the second click operation is smaller than a preset threshold value;
if the time interval is smaller than a preset threshold value, selecting an effect group where a second effect played in response to a second click operation is located, and selecting one effect as the first effect;
if a second click operation has not been detected before the first click operation is detected, or if the time interval is greater than a predetermined threshold, selecting a group of effects based on a predetermined rule and selecting one of the effects as the first effect.
8. The apparatus of claim 7, wherein the first effect comprises first audio and a first animation corresponding to the first tap operation.
9. The apparatus according to claim 8, wherein when the detection unit does not detect the first click operation during the playing of the music, the control unit further controls the playing unit to play a scene corresponding to the music, wherein the scene includes a beat animation corresponding to a repeated beat period of the music.
10. The apparatus according to claim 9, wherein when a first click operation is detected during the playing of the music, a first effect corresponding to the first click operation is played, and the playing unit is further configured to: and replacing the currently played rhythm animation with the first animation.
11. The apparatus of claim 9, wherein
The scene further comprises a background picture,
the dividing unit is further configured to divide the music into a plurality of consecutive combined beat periods, wherein each of the combined beat periods comprises a plurality of the repeated beat periods; and is
The playback unit is further configured to: and when the position of the music playing enters the next combined beat time period from the current combined beat time period, transforming the beat animation and/or the background image.
CN201811133444.7A 2018-09-27 2018-09-27 Interactive feedback method, apparatus and recording medium Active CN109343770B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811133444.7A CN109343770B (en) 2018-09-27 2018-09-27 Interactive feedback method, apparatus and recording medium


Publications (2)

Publication Number Publication Date
CN109343770A CN109343770A (en) 2019-02-15
CN109343770B true CN109343770B (en) 2021-07-20

Family

ID=65306843

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811133444.7A Active CN109343770B (en) 2018-09-27 2018-09-27 Interactive feedback method, apparatus and recording medium

Country Status (1)

Country Link
CN (1) CN109343770B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110559657B (en) * 2019-08-22 2023-11-03 腾讯科技(深圳)有限公司 Network game control method, device and storage medium
CN112044053B (en) * 2020-09-03 2022-05-17 腾讯科技(深圳)有限公司 Information processing method, device, equipment and storage medium in virtual scene
CN113157369B (en) * 2021-04-07 2023-04-18 杭州网易云音乐科技有限公司 Music playing interaction method and device, storage medium and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104754372A (en) * 2014-02-26 2015-07-01 苏州乐聚一堂电子科技有限公司 Beat-synchronized special effect system and beat-synchronized special effect handling method
CN106575424A (en) * 2014-07-31 2017-04-19 三星电子株式会社 Method and apparatus for visualizing music information
CN108287651A (en) * 2012-05-09 2018-07-17 苹果公司 Method and apparatus for providing touch feedback for the operation executed in the user interface
CN108319413A (en) * 2018-01-29 2018-07-24 咪咕音乐有限公司 A kind of method for playing music, device and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7236226B2 (en) * 2005-01-12 2007-06-26 Ulead Systems, Inc. Method for generating a slide show with audio analysis
CN103680562B (en) * 2012-09-03 2017-03-22 腾讯科技(深圳)有限公司 Point distribution method and device for audio file
US9207857B2 (en) * 2014-02-14 2015-12-08 EyeGroove, Inc. Methods and devices for presenting interactive media items
CN104853238A (en) * 2014-11-25 2015-08-19 苏州乐聚一堂电子科技有限公司 Combined beat special effect system and combined beat special effect processing method
CN106503127B (en) * 2016-10-19 2019-09-27 竹间智能科技(上海)有限公司 Music data processing method and system based on facial action identification


Also Published As

Publication number Publication date
CN109343770A (en) 2019-02-15

Similar Documents

Publication Publication Date Title
CN108279964B (en) Method and device for realizing covering layer rendering, intelligent equipment and storage medium
CN107077274B (en) Method and apparatus for moving context tags in a strip
US11216253B2 (en) Application prototyping tool
CN109343770B (en) Interactive feedback method, apparatus and recording medium
EP2840511A1 (en) System and method for dynamically converting webpage, and computer-readable recording medium
CN108924622B (en) Video processing method and device, storage medium and electronic device
CN109656654B (en) Editing method of large-screen scene and computer-readable storage medium
CN105094804A (en) Method and apparatus for adding animation to page
CN107066186A (en) A kind of UI interfaces character methods of exhibiting and display device based on Canvas
KR101656167B1 (en) Method, apparatus, device, program and recording medium for displaying an animation
CN113655999B (en) Page control rendering method, device, equipment and storage medium
CN104915186B (en) A kind of method and apparatus making the page
CN113688341B (en) Dynamic picture decomposition method and device, electronic equipment and readable storage medium
CN114185465A (en) Information processing method, information processing apparatus, storage medium, and electronic device
CN110471700B (en) Graphic processing method, apparatus, storage medium and electronic device
CN115407985A (en) Virtual multimedia scene editing method, electronic device and storage medium
CN111199568B (en) Vector diagram drawing method and device and computer readable storage medium
CN112015410A (en) Webpage editing method, device and system and computer storage medium
EP1632850A1 (en) Method and system for generating and automatically storing the definitions of states related to the appearance and behavior of programming elements in a software application development tool
CN114547519B (en) Page editing method and device, electronic equipment and readable storage medium
CN113676677B (en) Dynamic picture synthesis method and device, electronic equipment and readable storage medium
CN115329720A (en) Document display method, device, equipment and storage medium
CN110968991A (en) Method and related device for editing characters
CN104516860A (en) Methods and systems for selecting text within a displayed document
CN111782309B (en) Method and device for displaying information and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant