CN113157175A - Special effect engine, interactive special effect generating method using same and user interaction method


Info

Publication number
CN113157175A
Authority
CN
China
Prior art keywords
special effect, interactive, user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010559879.9A
Other languages
Chinese (zh)
Inventor
崔明君
王天源
陈龚
朱艺
杜江杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Publication of CN113157175A
Current legal status: Pending


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 — Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 — ... for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F 3/0485 — Scrolling or panning
    • G06F 3/0486 — Drag-and-drop

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Disclosed are a special effects engine, an interactive special effect generation method using the engine, and a user interaction method applying the resulting interactive special effects. The special effects engine includes: a graphics engine layer for providing general-purpose special effect rendering functions; and a special effects application layer for performing specification management, task management, and interaction parameter configuration on the functions provided by the graphics engine layer. In this way, an interactive special effect that can interact with a user is obtained, and a novel and interesting way of interacting with an interactive special effect is provided.

Description

Special effect engine, interactive special effect generating method using same and user interaction method
Technical Field
The present invention relates to the technical field of special effect processing, and more particularly, to a special effect engine, an interactive special effect generating method using the special effect engine, and a user interaction method applying an interactive special effect.
Background
In the fields of film and television production, animation production, and visualization, special effect rendering is required. However, these rendered effects focus only on the presentation of the effects and cannot interact with the user.
At present, with continuous iteration of the technology and upgrading of interaction modes, special effects with interactivity appear more and more often: for example, a user can change an effect by clicking on it, or drag an effect with a sliding operation.
Accordingly, it is desirable to provide an improved special effects engine capable of making interactive special effects and methods of using the same.
Disclosure of Invention
The present application is proposed to solve the above technical problems. Embodiments of the present application provide a special effects engine, an interactive special effect generation method using the engine, and a user interaction method applying interactive special effects. A special effects application layer performs specification management, task management, and interaction parameter configuration on the general-purpose special effect rendering functions provided by a graphics engine layer, so that an interactive special effect capable of interacting with the user is obtained and a novel and interesting interaction mode applying the interactive special effect is provided.
According to an aspect of the present application, there is provided a special effects engine including: a graphics engine layer for providing general-purpose special effect rendering functions to generate special effect data; and a special effects application layer for performing specification management, task management, and interaction parameter configuration on the special effect data generated by the graphics engine layer.
In the above special effects engine, the special effects application layer includes: a special effects application framework sublayer for performing specification management and task management on the special effect data generated by the graphics engine layer; and a user logic sublayer for receiving user interaction parameter configuration information input by a user.
In the above special effects engine, the specification management includes special effect initialization, interaction information configuration, special effect update execution, and special effect ending.
In the above special effects engine, the interaction information configuration includes configuring at least one of an interactable position, an interactable time, and an interaction input strength of the special effect.
In the above special effects engine, the user interaction parameter configuration information includes at least one of: a special effect display object, a special effect display time, a special effect display position, and a special effect display area size.
In the above special effects engine, the special effects application layer further includes: an interface sublayer for encapsulating the special effects application framework sublayer with interfaces suitable for various operating environments.
According to another aspect of the present application, there is provided an interactive special effect generating method of an application special effect engine, including: obtaining special effect data generated by one or more universal special effect rendering functions provided by a graphic engine layer; generating, by a special effects application layer, a special effects rendering task from the special effects data; configuring the special effect rendering task based on a special effect rendering specification provided by a special effect application layer; receiving interactive parameter configuration information input by a user through a special effect application layer; and generating the interactive special effect based on the interactive parameter configuration information and the special effect rendering task.
In the above interactive special effect generating method, generating, by the special effect application layer, a special effect rendering task from the special effect data includes: combining, by a special effects application framework sublayer of the special effects application layer, a plurality of general-purpose special effect rendering functions into a single special effect rendering task.
In the above interactive special effect generating method, configuring the special effect rendering task based on a special effect rendering specification provided by the special effect application layer includes: configuring the special effect rendering task based on a special effect rendering specification provided by a special effects application framework sublayer of the special effects application layer, wherein the special effect rendering specification includes special effect initialization, interaction information configuration, special effect update execution, and special effect ending.
In the above interactive special effect generating method, the interaction information configuration includes configuring at least one of an interactable position, an interactable time, and an interaction input strength of the special effect.
In the above interactive special effect generating method, receiving, by the special effect application layer, interactive parameter configuration information input by a user includes: and receiving user interaction parameter configuration information input by a user through a user logic sub-layer of the special effect application layer.
In the above method for generating an interactive special effect, the user interaction parameter configuration information includes at least one of: a special effect display object, a special effect display time, a special effect display position, and a special effect display area size.
In the above interactive special effect generating method, before receiving, by the special effect application layer, user interaction parameter configuration information input by a user, the method further includes: interface packaging applicable to various operating environments is carried out on the special effect application framework sublayer through the interface sublayer of the special effect application layer; and receiving, by the special effects application layer, user interaction parameter configuration information input by a user, including: and calling an interface encapsulated by the interface sub-layer by the user logic sub-layer to receive user interaction parameter configuration information input by a user.
According to still another aspect of the present application, there is provided an interactive special effect generating method applying a special effect engine, including: obtaining a fade-in special effect, a wipe special effect and a fade-out special effect provided by a graphics engine layer; combining the fade-in special effect, the wipe special effect and the fade-out special effect into a special effect rendering task by a special effects application framework layer; configuring the special effect rendering task by a special effect rendering specification provided by the special effects application framework layer; receiving, by a user logic layer, interaction parameter configuration information input by a user; and generating the interactive special effect based on the interaction parameter configuration information and the special effect rendering task.
In the above interactive special effect generating method, the special effect rendering specification includes special effect initialization, interaction information configuration, special effect update execution, and special effect ending; the interaction information configuration includes at least one of a wipeable area, a wipeable time, and a wiping input strength in the wipe special effect.
In the above interactive special effect generating method, the user interaction parameter configuration information includes at least one of: a wipe special effect display image, a special effect display position, a special effect display area size, a fade-in special effect time, a fade-out special effect time, a wipe special effect timeout time, and an operation result display time.
According to another aspect of the present application, there is provided a user interaction method applying an interactive special effect, including: displaying a first special effect to a user; receiving an operation of the user responding to the first special effect, the operation being performed by the user on a display area of the first special effect on a touch screen; displaying to the user, based on the operation, a second special effect corresponding to the operation; and displaying different branch scenarios to the user based on the operation result of the operation.
In the above user interaction method applying an interactive special effect, after the second special effect corresponding to the operation is displayed to the user based on the operation, the method further includes: displaying a third special effect corresponding to the first special effect to the user in response to a preset condition being reached.
In the above user interaction method applying an interaction effect, the first effect is a fade-in effect of a wipe mask; the operation is a wiping operation in which the user slides a display region of the wiping mask on a touch screen; the second effect is a wiping effect; and, the third effect is a fade-out effect of the wipe mask.
In the above user interaction method applying an interactive special effect, displaying the wiping special effect to the user based on the wiping operation includes: in response to the wiping operation, performing visibility processing that removes the wiping mask in the area of the wiping operation.
In the above user interaction method applying an interactive special effect, displaying the wiping special effect to the user based on the wiping operation includes: presenting the proportion already wiped to the user.
According to another aspect of the present application, there is provided a special effects engine, comprising: a receiving unit, configured to receive special effect data and interaction parameters of a general-purpose special effect from a terminal; an application unit, configured to perform specification management and task management on the general-purpose special effect using the received special effect data, and to perform interaction parameter configuration based on the received interaction parameters, so as to generate an interactive special effect; and an output unit, configured to output the interactive special effect to the terminal.
According to still another aspect of the present application, there is provided an interactive special effect generating apparatus applying a special effect engine, including: a special effect acquisition unit for acquiring a fade-in special effect, a wipe special effect and a fade-out special effect provided by the graphics engine layer; a special effect combination unit, configured to combine the fade-in special effect, the wipe special effect, and the fade-out special effect into a special effect rendering task by a special effects application framework layer; a task configuration unit, configured to configure the special effect rendering task according to the special effect rendering specification provided by the special effects application framework layer; a parameter receiving unit, configured to receive, through the user logic layer, interaction parameter configuration information input by a user; and a special effect generating unit, configured to generate the interactive special effect based on the interaction parameter configuration information and the special effect rendering task.
In the above interactive special effect generating apparatus, the special effect rendering specification includes special effect initialization, interaction information configuration, special effect update execution, and special effect ending; the interaction information configuration includes at least one of a wipeable area, a wipeable time, and a wiping input strength in the wipe special effect.
In the above interactive special effect generating apparatus, the user interaction parameter configuration information includes at least one of: a wipe special effect display image, a special effect display position, a special effect display area size, a fade-in special effect time, a fade-out special effect time, a wipe special effect timeout time, and an operation result display time.
According to still another aspect of the present application, there is provided a user interaction apparatus applying an interactive special effect, including: a first display unit for displaying a first special effect to a user; an operation receiving unit configured to receive an operation of the user responding to the first special effect, the operation being performed by the user on a display area of the first special effect on a touch screen; a second display unit configured to display a second special effect corresponding to the operation to the user based on the operation; and a third display unit for displaying different branch scenarios to the user based on an operation result of the operation.
In the above user interaction device applying the interactive special effect, the device further includes a fourth display unit, configured to display a third special effect corresponding to the first special effect to the user in response to reaching a preset condition after a second special effect corresponding to the operation is displayed to the user based on the operation.
In the above user interaction device applying an interactive special effect, the first special effect is a fade-in special effect of a wipe mask; the operation is a wiping operation in which the user slides a display region of the wiping mask on a touch screen; the second effect is a wiping effect; and, the third effect is a fade-out effect of the wipe mask.
In the above user interaction apparatus applying an interactive special effect, the second display unit is configured to: in response to the wiping operation, perform visibility processing that removes the wiping mask in the area of the wiping operation.
In the above user interaction apparatus applying an interactive special effect, the second display unit is configured to: present the proportion already wiped to the user.
According to still another aspect of the present application, there is provided an electronic apparatus including: a processor; and a memory having stored therein computer program instructions which, when executed by the processor, cause the processor to perform the interactive special effects generating method of the application special effects engine and the user interaction method of applying interactive special effects as described above.
According to yet another aspect of the present application, there is provided a computer readable medium having stored thereon computer program instructions which, when executed by a processor, cause the processor to perform the method of generating an interactive special effect of an application special effect engine and the method of user interaction of an application interactive special effect as described above.
According to the special effects engine and the interactive special effect generation method using it, the general-purpose special effect rendering functions provided by the graphics engine layer undergo specification management, task management, and interaction parameter configuration in the special effects application layer, so that an interactive special effect capable of interacting with the user is obtained.
In addition, according to the user interaction method applying the interactive special effect, an operation of the user responding to a displayed special effect is received, a special effect corresponding to that operation is displayed, and different branch scenarios are displayed to the user based on the operation result, so that a novel and interesting interaction mode applying the interactive special effect can be provided.
Drawings
The above and other objects, features and advantages of the present application will become more apparent by describing in more detail embodiments of the present application with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application. In the drawings, like reference numbers generally represent like parts or steps.
FIG. 1 illustrates a schematic diagram of a usage scenario of a special effects engine according to an embodiment of the present application.
FIG. 2 illustrates a block diagram of a special effects engine according to an embodiment of the present application.
FIG. 3 illustrates a block diagram of a special effects application layer in a special effects engine according to an embodiment of the present application.
Fig. 4 illustrates a flowchart of an interactive special effects generation method of an application special effects engine according to an embodiment of the present application.
Fig. 5 is a flowchart illustrating an application example one of an interactive special effect generating method of an application special effect engine according to an embodiment of the present application.
Fig. 6 illustrates a schematic diagram of an application scenario of a user interaction method applying an interaction effect according to an embodiment of the present application.
FIG. 7 illustrates a flowchart of a user interaction method applying interactive special effects according to an embodiment of the present application.
Fig. 8 illustrates a scene diagram of an application example two of the special effects engine according to an embodiment of the present application.
Fig. 9 illustrates a block diagram of an interactive special effects generation apparatus applying a special effects engine according to an embodiment of the present application.
FIG. 10 illustrates a block diagram of a user interaction device applying interactive special effects, according to an embodiment of the present application.
FIG. 11 illustrates a block diagram of an electronic device in accordance with an embodiment of the present application.
Detailed Description
Hereinafter, example embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be understood that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and that the present application is not limited by the example embodiments described herein.
Overview of a scene
FIG. 1 illustrates a schematic diagram of a usage scenario of a special effects engine according to an embodiment of the present application.
In an embodiment of the application, the special effects engine is applied to generate interactive special effects. As shown in fig. 1, in the process of generating an interactive special effect, the graphics engine layer of a special effects engine E provides general-purpose special effect rendering functions to generate special effect data for various special effects, and then sends the special effect data to the special effects application layer. The special effects application layer performs specification management and task management on the special effect data provided by the graphics engine layer, receives interaction parameters from the user U, and performs interaction parameter configuration on the special effect data based on those parameters, thereby generating an interactive special effect V.
Exemplary System
FIG. 2 illustrates a block diagram of a special effects engine according to an embodiment of the present application.
As shown in fig. 2, the special effects engine 100 according to an embodiment of the present application includes: a graphics engine layer 110 for providing general-purpose special effect rendering functions to generate special effect data; and a special effects application layer 120, configured to perform specification management, task management, and interaction parameter configuration on the special effect data generated by the graphics engine layer.
The graphics engine layer 110 provides general rendering capability for special effect graphics, supplying the basic services on which the rendering and interactive behaviour of various special effect graphics are further customized. In the embodiment of the present application, the rendered special effect graphics may be two-dimensional (2D) or three-dimensional (3D), and may be static or dynamic, such as a rain effect, a snow effect, a flying effect, or the like. After generating the special effect data through general special effect rendering, the graphics engine layer 110 transmits the data to the special effects application layer 120.
The special effects application layer 120 is configured to perform specification management, task management, and interaction parameter configuration on the special effect data generated by the graphics engine layer; that is, it customizes and creates interactive special effects, including various 2D/3D effects and interaction behaviours, based on the special effect data of the various general-purpose effects provided by the graphics engine layer.
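To make the division of labour concrete, the two layers can be pictured as a pair of cooperating interfaces. The following TypeScript sketch is purely illustrative: the patent does not prescribe concrete types, so every name here (EffectData, GraphicsEngineLayer, and so on) is an assumption.

    // Hypothetical shapes for the two layers; names are illustrative only.

    /** Raw output of one general-purpose rendering function (graphics engine layer). */
    interface EffectData {
      name: string;                 // e.g. "rain", "snow", "fade-in"
      render(frame: number): void;  // draws one frame of the effect
    }

    /** Graphics engine layer: provides general-purpose effect rendering functions. */
    interface GraphicsEngineLayer {
      createEffect(name: string): EffectData;
    }

    /** A rendering task managed by the special effects application layer. */
    interface EffectTask {
      effects: EffectData[];
    }

    /** Parameters supplied by the user through the user logic sublayer. */
    interface UserInteractionParams {
      displayObject?: string;                        // special effect display object
      displayTimeMs?: number;                        // special effect display time
      position?: { x: number; y: number };           // special effect display position
      areaSize?: { width: number; height: number };  // special effect display area size
    }

    /** Special effects application layer: wraps effect data with management. */
    interface EffectsApplicationLayer {
      createTask(effects: EffectData[]): EffectTask;                          // task management
      applySpecification(task: EffectTask): void;                             // specification management
      configureInteraction(task: EffectTask, p: UserInteractionParams): void; // interaction parameters
    }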
Hereinafter, the special effect application layer according to the embodiment of the present application will be described in further detail.
FIG. 3 illustrates a block diagram of a special effects application layer in a special effects engine according to an embodiment of the present application.
As shown in fig. 3, based on the embodiment shown in fig. 2, the special effects application layer 120 according to the embodiment of the present application includes: a special effects application framework sublayer 121, configured to perform specification management and task management on the special effect data generated by the graphics engine layer; and a user logic sublayer 122 for receiving user interaction parameter configuration information input by a user.
The special effects application framework sublayer 121 is configured to perform specification management and task management on the special effect data generated by the graphics engine layer. Specification management of that special effect data includes special effect initialization, interaction information configuration, special effect update execution, and special effect ending.
Here, interaction information configuration means configuring the interactive functionality of the special effect data, and may specifically include configuring at least one of an interactable position, an interactable time, and an interaction input strength of the special effect. Note that, in the embodiment of the present application, the interaction information configuration is what gives the special effect its interactive capability. For example, configuring an interactable position specifies a particular area of the special effect that can receive the user's interactive operation, such as a click or a slide within that area. Configuring an interactable time specifies a period during which the special effect can receive the user's interactive operation; for example, the 2nd to 8th seconds of a 10-second special effect may receive interaction. Configuring an interaction input strength specifies with what strength a user operation must be received to count as an interaction.
Therefore, through the specification management of the special effects application framework sublayer 121, the special effect data generated by the graphics engine layer can be initialized, input information such as the position, time, and strength with which the user touches the screen can be received, special effect updates can be executed, and the special effect can be ended, thereby generating a special effect with the capability of interacting with the user.
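Read as code, this four-phase rendering specification suggests a lifecycle interface like the one below; the field and method names are assumptions made for illustration, continuing the types from the earlier sketch.

    /** Interaction information configured by the framework sublayer (assumed shape). */
    interface InteractionConfig {
      interactableRegion?: { x: number; y: number; width: number; height: number }; // where input is accepted
      interactableWindowMs?: { start: number; end: number }; // e.g. seconds 2-8 of a 10-second effect
      minInputStrength?: number;                             // weakest input that still counts
    }

    /** Lifecycle every managed effect follows under the rendering specification. */
    interface EffectLifecycle {
      init(): void;                               // special effect initialization
      configure(config: InteractionConfig): void; // interaction information configuration
      update(input: { x: number; y: number; strength: number; timeMs: number }): void; // update execution
      finish(): void;                             // special effect ending
    }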
The task management of the special effect application framework sublayer 121 is used to execute a plurality of general special effects provided by the graphics engine layer as tasks in a merged manner or independently.
The user logic sublayer 122 is configured to receive user interaction parameter configuration information from the user, i.e., parameter configuration information related to the specific interactive operation input by the user. Specifically, the user interaction parameter configuration information may include a special effect display object, that is, the object on which the user wants the special effect displayed, such as a person in motion, a vehicle, and the like. It may also include a special effect display time, for example, displaying the special effect for 5 seconds. In addition, it may further include a special effect display position and a special effect display area size, for displaying the special effect at a specific position and within a specific area on the screen.
That is, the interaction information configuration performed by the special effects application framework sublayer 121 gives the special effect data provided by the graphics engine layer its interactive capability, while the user interaction parameter configuration information received by the user logic sublayer 122 specifies how the special effect interacts with the user in a particular case.
Take as an example a special effect in which the user's sliding operation drives a vehicle from the left side to the right side of the screen. The interaction information configuration of the special effects application framework sublayer 121 specifies which region of the special effect graphic can receive the user's sliding operation, within what time range the sliding operation may last, and how strong the received sliding operation must be, giving the special effect graphic its interactive capability. The user interaction parameter configuration information received by the user logic sublayer 122 specifies the object of the sliding operation (the car), the time of the sliding operation (which decides at what speed the car travels), and the start and end points of the sliding operation (which decide from where to where the car travels).
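Using the assumed types from the sketches above, the car example might be configured roughly as follows; every value is invented for illustration.

    // Framework sublayer: what makes the car graphic interactive at all.
    const carInteraction: InteractionConfig = {
      interactableRegion: { x: 0, y: 300, width: 720, height: 120 }, // strip of screen the car occupies
      interactableWindowMs: { start: 0, end: 10_000 },               // slide accepted for 10 seconds
      minInputStrength: 0.2,                                         // ignore very light touches
    };

    // User logic sublayer: how this particular interaction plays out.
    const carParams: UserInteractionParams = {
      displayObject: "car",                  // the object being dragged
      displayTimeMs: 5_000,                  // slide duration, which decides the driving speed
      position: { x: 0, y: 300 },            // starting point of the slide (left edge)
      areaSize: { width: 720, height: 120 }, // slide path spans the screen to the right edge
    };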
Here, as those skilled in the art will understand, when configuring the user interaction parameters, the user may directly specify parameter values, or the user logic sublayer 122 may configure various sensors to sense the user's operation. In particular, the sensors may be voice sensors such as microphones, gesture sensors such as cameras, force sensors, position sensors such as GPS positioning devices, brightness sensors such as light-sensing elements, and the like. These sensors sense the user's operation, and the parameters obtained through them serve as the user interaction parameters.
Therefore, through the special effects application framework sublayer 121 and the user logic sublayer 122, an interactive special effect that realizes an effect of actually interacting with the user can be obtained.
In addition, since the user logic sublayer 122 receives the user interaction parameter configuration information input by the user, the special effects application framework sublayer 121 needs to provide a usage entry point for the resulting special effect task, so that the user interaction parameters received from the user can configure the interaction time, position, and so on of the special effect.
In addition, as shown in fig. 3, the special effects application layer 120 according to the embodiment of the present application optionally includes an interface sub-layer 123 (indicated by a dashed box), where the interface sub-layer 123 is used to perform interface encapsulation on the special effects application framework sub-layer, which is suitable for use in multiple operating environments.
For example, the interface sublayer 123 provides an API (application programming interface) layer that masks language differences and data states, making it convenient to use in any environment. That is, the interface sublayer 123 encapsulates the special effects application framework sublayer 121 behind interfaces for any environment, such as web js, ios, android jni, and the like. With the interface sublayer 123, the user logic sublayer 122 can receive user input by calling the API layer according to the actual service.
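One plausible shape for such an encapsulation is a thin facade with a single JSON-friendly entry point that each runtime (web js, ios, android jni) can bind to; the class below is a speculative sketch over the types assumed earlier, not an API named by the patent.

    /** Platform-neutral facade over the framework sublayer (hypothetical API). */
    class EffectsApi {
      constructor(
        private engine: GraphicsEngineLayer,
        private framework: EffectsApplicationLayer,
      ) {}

      /** Single entry point: any host environment passes a JSON message. */
      handle(message: string): EffectTask {
        const { effects, params } = JSON.parse(message) as {
          effects: string[];
          params: UserInteractionParams;
        };
        const task = this.framework.createTask(effects.map(e => this.engine.createEffect(e)));
        this.framework.applySpecification(task);           // specification management
        this.framework.configureInteraction(task, params); // user-supplied parameters
        return task;
      }
    }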
Therefore, the special effect engine according to the embodiment of the application can realize the rendering and interaction effects of the 2D/3D special effect graph, so that the interaction special effect is obtained according to the actual requirement.
The special effect engine according to the embodiment of the application can be implemented in various terminal devices, such as a server for making interactive special effects. In one example, the special effects engine according to embodiments of the present application may be integrated into the terminal device as one software module and/or hardware module. For example, the special effects engine may be a software module in the operating system of the terminal device, or may be an application developed for the terminal device; of course, the special effects engine may also be one of many hardware modules of the terminal device.
Alternatively, in another example, the special effects engine and the terminal device may be separate devices, and the special effects engine may be connected to the terminal device through a wired and/or wireless network and transmit the interaction information according to an agreed data format.
Exemplary method
Fig. 4 illustrates a flowchart of an interactive special effects generation method of an application special effects engine according to an embodiment of the present application.
As shown in fig. 4, the method for generating an interactive special effect by using an application special effect engine according to an embodiment of the present application includes: s210, obtaining special effect data generated by one or more universal special effect rendering functions provided by a graphic engine layer; s220, generating a special effect rendering task from the special effect data by the special effect application layer; s230, configuring the special effect rendering task based on a special effect rendering specification provided by a special effect application layer; s240, receiving interactive parameter configuration information input by a user through a special effect application layer; and S250, generating the interactive special effect based on the interactive parameter configuration information and the special effect rendering task.
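Steps S210 to S250 amount to a five-stage pipeline. The sketch below strings the assumed interfaces from the earlier sketches together in that order; it is one reading of the flow, not code from the patent.

    /** S210-S250 as a single pipeline over the assumed layer interfaces. */
    function generateInteractiveEffect(
      engine: GraphicsEngineLayer,
      app: EffectsApplicationLayer,
      effectNames: string[],              // which general-purpose effects to use
      userParams: UserInteractionParams,  // interaction parameters from the user
    ): EffectTask {
      // S210: obtain special effect data from the graphics engine layer.
      const data = effectNames.map(name => engine.createEffect(name));
      // S220: generate a single special effect rendering task from the data.
      const task = app.createTask(data);
      // S230: configure the task per the rendering specification (init/config/update/end).
      app.applySpecification(task);
      // S240 + S250: apply the user's interaction parameters, yielding the interactive effect.
      app.configureInteraction(task, userParams);
      return task;
    }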
In the method for generating an interactive special effect by applying a special effect engine according to the embodiment of the present application, generating a special effect rendering task from the special effect data by a special effect application layer includes: and combining the special effect data corresponding to a plurality of universal special effect rendering functions into a single special effect rendering task by a special effect application framework sublayer of the special effect application layer.
In the method for generating an interactive special effect by applying a special effect engine according to the embodiment of the present application, configuring the special effect rendering task based on a special effect rendering specification provided by the special effect application layer includes: configuring the special effect rendering task based on a special effect rendering specification provided by a special effects application framework sublayer of the special effects application layer, wherein the special effect rendering specification includes special effect initialization, interaction information configuration, special effect update execution, and special effect ending.
In the method for generating an interactive special effect by applying a special effect engine according to the embodiment of the application, the configuration of the interaction information includes configuring at least one of an interactable position, an interactable time and an interaction input strength of the special effect.
In the method for generating an interactive special effect of an application special effect engine according to the embodiment of the present application, receiving, by a special effect application layer, interactive parameter configuration information input by a user includes: and receiving user interaction parameter configuration information input by a user through a user logic sub-layer of the special effect application layer.
In the method for generating an interactive special effect by using a special effect engine according to the embodiment of the present application, the user interaction parameter configuration information includes at least one of: a special effect display object, a special effect display time, a special effect display position, and a special effect display area size.
In the method for generating an interactive special effect of an application special effect engine according to an embodiment of the present application, before receiving, by a special effect application layer, user interaction parameter configuration information input by a user, the method further includes: interface packaging applicable to various operating environments is carried out on the special effect application framework sublayer through the interface sublayer of the special effect application layer; and receiving, by the special effects application layer, user interaction parameter configuration information input by a user, including: and calling an interface encapsulated by the interface sub-layer by the user logic sub-layer to receive user interaction parameter configuration information input by a user.
Here, other details in the interactive special effect generating method using the special effect engine according to the embodiment of the present application are completely the same as corresponding details in the special effect engine according to the embodiment of the present application described in the section of "exemplary system" before, and are not described again in order to avoid redundancy.
Application example 1
Fig. 5 is a flowchart illustrating an application example one of an interactive special effect generating method of an application special effect engine according to an embodiment of the present application.
As shown in fig. 5, this application example includes the following steps.
Step S310, the fade-in special effect, the wipe special effect and the fade-out special effect provided by the graphic engine layer are obtained. That is, in this application example, the generated interactive special effects include three parts of fade-in, wipe, and fade-out. Accordingly, a fade-in effect, a wipe effect, and a fade-out effect are acquired from the graphics engine layer.
Specifically, the wiping special effect can be realized by providing a wiping mask, which can be various special effect patterns for simulating the wiping effect, such as rainwater on a vehicle window, snow on the ground, and the like. In the special wiping effect, the user wipes the screen to remove the corresponding part of the wiping mask, so that the wiping effect is realized.
In this application example, a fade-in effect and a fade-out effect are added to make the generated interactive effect more lively.
Step S320, combining the fade-in special effect, the wiping special effect and the fade-out special effect into a special effect rendering task by a special effect application framework layer. That is, by combining the fade-in effect, the wipe effect, and the fade-out effect, one effect rendering task for generating the interactive effect can be obtained.
Step S330, configuring the special effect rendering task by the special effect rendering specification provided by the special effects application framework layer. Here, through the interaction information configuration, at least one of a wipeable area, a wipeable time, and a wiping input strength in the wipe special effect may be configured.
In particular, the window area in the wipe special effect can be configured as the wipeable area, while the window frame or the vehicle interior in the effect is not wipeable. The wiping input strength may also be configured, so that a lighter wiping input produces a weaker wiping effect and a heavier wiping input produces a stronger one. Specifically, the wiping strength can be determined from the area covered by the user's sliding operation along its path: when the wiping force is light, the sliding operation only lightly contacts the screen, so the coverage area is small; when the wiping force is heavy, the sliding operation contacts the screen firmly, so the coverage area is large.
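A minimal sketch of this strength heuristic, assuming touch events that report a contact radius per sample (the patent does not specify the exact measurement):

    interface TouchSample { x: number; y: number; radius: number } // contact radius in pixels

    /**
     * Estimate wiping strength from the area covered along the slide path:
     * a light touch covers little area per sample, a heavy touch covers more.
     */
    function wipeStrength(path: TouchSample[]): number {
      if (path.length === 0) return 0;
      const covered = path.reduce((sum, s) => sum + Math.PI * s.radius * s.radius, 0);
      return covered / path.length; // mean covered area per sample, in px^2
    }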
That is, in this application example, the special effect rendering specification includes special effect initialization, interaction information configuration, special effect update execution, and special effect ending; and at least one of a wipeable area, a wipeable time, and a wiping input strength in the wipe special effect is configured.
Step S340, the user logic layer receives the interaction parameter configuration information input by the user. Here, the user interaction parameter configuration information includes at least one of: a wipe special effect display image, a special effect display position, a special effect display area size, a fade-in special effect time, a fade-out special effect time, a wipe special effect timeout time, and an operation result display time.
For example, the wipe special effect display image may be an image designated by the user for the wiping mask, such as fog, snow, and the like. The special effect display position and the special effect display area size may specify the position and area in which the wipe special effect is displayed, such as the window area of the screen. The fade-in special effect time may specify the time over which the effect fades in on the screen, such as the time for the wiping mask to go from transparency 0 to transparency 1. The fade-out special effect time may specify the time over which the effect fades out, such as the time for the wiping mask to go from transparency 1 to transparency 0. The wipe special effect timeout time may specify how long the user may perform the wiping operation; once the timeout is reached, the user's wiping operation will produce a different result. The operation result display time may specify how long the result of the wiping operation is presented to the user; for example, "wiping completed" may be shown to the user in text or animation for 5 seconds.
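Gathered into one configuration object, the parameters of this application example might look like the following; all names and values are invented for illustration.

    const wipeEffectParams = {
      maskImage: "fog.png",                  // wipe special effect display image
      position: { x: 40, y: 120 },           // special effect display position (the window area)
      areaSize: { width: 640, height: 360 }, // special effect display area size
      fadeInMs: 800,                         // mask transparency 0 -> 1
      fadeOutMs: 800,                        // mask transparency 1 -> 0
      wipeTimeoutMs: 60_000,                 // after this, the wiping operation yields a different result
      resultDisplayMs: 5_000,                // e.g. show "wiping completed" for 5 seconds
    };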
Step S350, generating the interactive special effect based on the interaction parameter configuration information and the special effect rendering task. That is, the interactive special effect generated in application example one can receive the user's sliding operation on the screen: the coordinates of the sliding operation are received through the user logic layer, the wiping strength is determined from the coverage area of the sliding operation along its path, and visibility processing is performed at the corresponding position of the wiping mask, i.e., the corresponding portion of the mask is removed, for example by changing its transparency from 1 to 0. In this way the special effect and the interaction behaviour can be rendered on the screen by the special effect rendering and interaction engine according to the actual scene, so that the user feels as if actually wiping in that scene.
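On the web, one common way to perform this visibility processing is to punch transparent holes into the mask layer wherever the user slides. The sketch below uses the standard Canvas 2D API; the approach is an assumption, since the patent does not name a rendering API.

    /** Erase the wiping mask around a touch point by making those pixels transparent. */
    function applyWipe(ctx: CanvasRenderingContext2D, x: number, y: number, strength: number): void {
      // "destination-out" keeps existing pixels only where the new shape is NOT drawn,
      // so the circle drawn here becomes fully transparent (transparency 1 -> 0).
      ctx.globalCompositeOperation = "destination-out";
      ctx.beginPath();
      ctx.arc(x, y, 10 + 20 * strength, 0, Math.PI * 2); // heavier wipes clear a larger circle
      ctx.fill();
      ctx.globalCompositeOperation = "source-over"; // restore normal drawing
    }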
Interaction example
Fig. 6 illustrates a schematic diagram of an application scenario of a user interaction method applying an interaction effect according to an embodiment of the present application.
In the interaction process, the application example of the method for generating the interaction special effect by applying the special effect engine according to the embodiment of the application is applied to the interaction video.
Here, a common interactive form of interactive video is the branch scenario, also called the AB scenario. The form is relatively simple: an option for a branch scenario is set at a certain node of the video, and the corresponding branch scenario is played according to the user's selection. The branch scenarios may be completely independent of each other, for example with independent story lines and independent endings; of course, a branch scenario may also be only an intermediate passage of the plot, returning to the main story line when it ends.
Since the interactive video may have a requirement of developing from the scenario main line to different branch scenarios, or returning to the scenario main line from different branch scenarios, the interactive video generally consists of a plurality of video segments (or video intervals), for example, the scenario main line is a video segment, and each branch scenario corresponds to a video segment. In addition, the interactive video also comprises an interactive component, such as an interactive component used for presenting options to a user, an interactive component used for receiving user operation and the like.
This interaction example provides a new interaction mode for taking the user through different scenarios: different scenarios are entered through user operations applied to an interactive special effect.
As shown in FIG. 6, a first special effect V1 is displayed to the user U, the first special effect V1 being an interactive special effect generated by the special effects engine as described above. The user U can therefore perform an operation, such as a sliding operation or a click, in response to the first special effect. Then, based on the operation of the user U, a second special effect V2 corresponding to the operation is displayed, and based on the result of the operation, branch scenario 1 or branch scenario 2 is entered.
FIG. 7 illustrates a flow chart of a method of user interaction to apply interactive special effects according to an embodiment of the application.
Specifically, as shown in fig. 7, the interactive process includes the following steps.
Step S410, displaying the first special effect to the user. In one example, the first effect may be a fade-in effect that wipes a mask. For example, the wiping mask is rain on a window of a vehicle.
Step S420, receiving an operation of the user responding to the first special effect, the operation being performed by the user on the display area of the first special effect on the touch screen. In one example, the operation is the user's wiping operation on the screen; in particular, the wiping operation is an operation in which the user slides within the display region of the wiping mask on the screen. That is, the user performs the wiping operation by sliding within the window area displayed on the screen.
Step S430, displaying a second special effect corresponding to the operation to the user based on the operation. In one example, the second special effect is the wiping special effect. That is, as described above, the wiping special effect is visibility processing that, in response to the wiping operation, changes the transparency of the wiping mask in the region of the wiping operation.
In addition, when the wiping special effect is displayed based on the wiping operation, the proportion already wiped may also be presented, to help the user control his or her own wiping operation. For example, while the user wipes the screen, a progress bar reflecting the wiped proportion is displayed in a certain area of the screen, providing real-time feedback to the user.
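The wiped proportion can be estimated by sampling the mask's alpha channel and counting the pixels that have become transparent; a sketch, again assuming the Canvas 2D API:

    /** Fraction of the mask already wiped, estimated from its alpha channel. */
    function wipedRatio(ctx: CanvasRenderingContext2D, width: number, height: number): number {
      const pixels = ctx.getImageData(0, 0, width, height).data; // RGBA bytes
      let cleared = 0;
      for (let i = 3; i < pixels.length; i += 4) { // every 4th byte is the alpha value
        if (pixels[i] === 0) cleared++;            // fully transparent = wiped
      }
      return cleared / (width * height); // drives the progress bar shown to the user
    }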
Step S440, displaying different branch scenarios to the user based on the operation result of the operation. In the above example, different branch scenarios are displayed to the user based on the wipe result. For example, the video jumps to scenario A when the wipe result indicates that a predetermined wipe ratio has been reached, and jumps to scenario B when it has not.
Additionally, a third special effect corresponding to the first special effect may be displayed to the user in response to a preset condition being reached. For example, the preset condition may be reaching the wipe special effect timeout time (such as a timeout of 1 minute), and the third special effect may be the fade-out special effect.
Further, in this interaction example, it may be configured whether the jump is forced, that is, whether a branch scenario is triggered immediately after the operation result is obtained, or only after a countdown ends.
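The jump logic then reduces to comparing the wiped percentage against a threshold and either jumping immediately or waiting out the countdown; the threshold and scenario names below are illustrative assumptions.

    type Scenario = "A" | "B";

    /** Pick the branch scenario from the wipe result. */
    function chooseBranch(wiped: number, threshold = 0.8): Scenario {
      return wiped >= threshold ? "A" : "B"; // e.g. 80% wiped -> scenario A
    }

    /** Jump immediately if forced, otherwise wait for the countdown to end. */
    function scheduleJump(
      wiped: number,
      forceJump: boolean,
      countdownMs: number,
      jump: (s: Scenario) => void,
    ): void {
      const target = chooseBranch(wiped);
      if (forceJump) jump(target);
      else setTimeout(() => jump(target), countdownMs);
    }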
That is, in this interaction example, the user wipes the screen; during wiping, the special effect mask undergoes visibility processing at the coordinates where the user touches the screen, achieving the wiping effect; the wiped proportion is fed back to the user in real time as a progress bar at a certain position on the screen; and finally different scenarios are triggered by calculating the user's wiping percentage. In this way, a novel interactive mode with real-time feedback and the tactile feel of wiping can be provided to the user.
Application example two
Fig. 8 illustrates a scene diagram of an application example two of the special effects engine according to an embodiment of the present application.
As shown in fig. 8, the special effects engine according to the embodiment of the present application may be deployed in the cloud, as shown in C of fig. 8. The special effect engine deployed at the cloud comprises a receiving unit used for receiving special effect data and interaction parameters of the universal special effect from the terminal T. The application unit corresponds to the special effects application layer of the special effects engine as described above, and is configured to perform specification management and task management of the general special effects using the received special effects data, and perform interaction parameter configuration based on the received interaction parameters to generate an interaction special effect, such as V shown in fig. 8. And, the special effects engine further comprises an output unit for outputting the interactive special effects V to the terminal T.
Schematic device
Fig. 9 illustrates a block diagram of an interactive special effects generation apparatus applying a special effects engine according to an embodiment of the present application.
As shown in fig. 9, an interactive special effect generating apparatus 500 of an application special effect engine according to an embodiment of the present application includes: a special effect obtaining unit 510, configured to obtain a fade-in special effect, a wipe special effect, and a fade-out special effect provided by the graphics engine layer; a special effect combining unit 520, configured to combine the fade-in special effect, the wipe special effect, and the fade-out special effect into a special effect rendering task by a special effect application framework layer; a task configuration unit 530, configured to configure the special effect rendering task according to a special effect rendering specification provided by the special effect application framework layer; a parameter receiving unit 540, configured to receive, by the user logic layer, interaction parameter configuration information input by a user; and a special effect generating unit 550, configured to generate the interactive special effect based on the interaction parameter configuration information and the special effect rendering task.
In one example, in the above interactive special effect generating apparatus 500, the special effect rendering specification includes special effect initialization, interaction information configuration, special effect update execution, and special effect ending; the interaction information configuration includes at least one of a wipeable area, a wipeable time, and a wiping input strength in the wipe special effect.
In one example, in the above interactive special effect generating apparatus 500, the user interaction parameter configuration information includes at least one of: a wipe special effect display image, a special effect display position, a special effect display area size, a fade-in special effect time, a fade-out special effect time, a wipe special effect timeout time, and an operation result display time.
FIG. 10 illustrates a block diagram of a user interaction device applying interactive special effects, according to an embodiment of the present application.
As shown in fig. 10, a user interaction apparatus 600 applying an interactive special effect according to an embodiment of the present application includes: a first display unit 610 for displaying a first special effect to a user; an operation receiving unit 620 configured to receive an operation of the user responding to the first special effect, the operation being performed by the user on a display area of the first special effect on a touch screen; a second display unit 630 configured to display a second special effect corresponding to the operation to the user based on the operation; and a third display unit 640 for displaying different branch scenarios to the user based on the operation result of the operation.
In one example, the above user interaction apparatus 600 applying an interactive special effect further includes a fourth display unit, configured to display a third special effect corresponding to the first special effect to the user in response to a preset condition being reached, after the second special effect corresponding to the operation has been displayed to the user based on the operation.
In one example, in the above-described user interaction apparatus 600 applying an interactive special effect, the first special effect is a fade-in special effect of a wipe mask; the operation is a wiping operation in which the user slides over a display region of the wipe mask on a touch screen; the second special effect is a wipe special effect; and the third special effect is a fade-out special effect of the wipe mask.
In an example, in the user interaction apparatus 600 applying the interactive special effect, the second display unit 630 is configured to: in response to the wiping operation, perform visibility processing that removes the wipe mask from the area of the wiping operation.
In an example, in the user interaction apparatus 600 applying the interactive special effect, the second display unit 630 is configured to: display the wiped proportion to the user.
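One plausible realization of this visibility processing and of the wiped proportion, assuming the wipe mask is a single-channel alpha buffer of size w*h (an assumption not specified by the application), is:

function wipeAt(mask: Uint8Array, w: number, h: number,
                cx: number, cy: number, radius: number): number {
  // Visibility processing: clear the mask's alpha in a circle around the
  // touch point so the content underneath becomes visible.
  const x0 = Math.max(0, Math.floor(cx - radius));
  const x1 = Math.min(w, Math.ceil(cx + radius));
  const y0 = Math.max(0, Math.floor(cy - radius));
  const y1 = Math.min(h, Math.ceil(cy + radius));
  for (let y = y0; y < y1; y++) {
    for (let x = x0; x < x1; x++) {
      if ((x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2) {
        mask[y * w + x] = 0;
      }
    }
  }
  // Wiped proportion displayed to the user: fraction of cleared pixels.
  let cleared = 0;
  for (const a of mask) if (a === 0) cleared++;
  return cleared / mask.length;
}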
Here, those skilled in the art will understand that the specific functions and operations of the respective units and modules in the above-described interactive special effect generating apparatus 500 and user interaction apparatus 600 have been described in detail in the foregoing description of the interactive special effect generating method applying a special effects engine and the user interaction method applying an interactive special effect; a repeated description thereof is therefore omitted.
As described above, the interactive special effect generating apparatus 500 applying a special effects engine and the user interaction apparatus 600 applying an interactive special effect according to the embodiments of the present application may be implemented in various terminal devices, such as a smartphone carried by a user. In one example, they may be integrated into the terminal device as software modules and/or hardware modules. For example, each apparatus may be a software module in the operating system of the terminal device, or an application developed for the terminal device; of course, each apparatus may also be one of the many hardware modules of the terminal device.
Alternatively, in another example, the interactive special effect generating apparatus 500 and the user interaction apparatus 600 may be devices separate from the terminal device, connected to it through a wired and/or wireless network and transmitting interaction information in an agreed data format.
Exemplary electronic device
Next, an electronic apparatus according to an embodiment of the present application is described with reference to fig. 11.
Fig. 11 illustrates a block diagram of an electronic device according to an embodiment of the present application.
As shown in fig. 11, the electronic device 10 includes one or more processors 11 and memory 12.
The processor 11 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device 10 to perform desired functions.
The memory 12 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM) and/or cache memory. The non-volatile memory may include, for example, Read-Only Memory (ROM), a hard disk, flash memory, and the like. One or more computer program instructions may be stored on the computer-readable storage medium and executed by the processor 11 to implement the interactive special effect generating method applying a special effects engine and the user interaction method applying an interactive special effect of the various embodiments of the present application described above, and/or other desired functions. Various contents such as special effect data and interaction parameters may also be stored in the computer-readable storage medium.
In one example, the electronic device 10 may further include: an input device 13 and an output device 14, which are interconnected by a bus system and/or other form of connection mechanism (not shown).
The input device 13 may include, for example, a keyboard, a mouse, and the like.
The output device 14 may output various information including the generated interactive special effect to the outside. The output devices 14 may include, for example, a display, speakers, a printer, and a communication network and its connected remote output devices, among others.
Of course, for the sake of simplicity, only some of the components of the electronic device 10 relevant to the present application are shown in fig. 11, and components such as buses, input/output interfaces, and the like are omitted. In addition, the electronic device 10 may include any other suitable components depending on the particular application.
Exemplary computer program product and computer-readable storage medium
In addition to the above-described methods and apparatuses, embodiments of the present application may also take the form of a computer program product comprising computer program instructions that, when executed by a processor, cause the processor to perform the steps of the interactive special effect generating method applying a special effects engine and the user interaction method applying an interactive special effect according to the various embodiments of the present application described in the "exemplary methods" section above.
The program code for performing the operations of the embodiments of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a computer-readable storage medium having stored thereon computer program instructions that, when executed by a processor, cause the processor to perform the steps of the interactive special effect generating method applying a special effects engine and the user interaction method applying an interactive special effect according to the various embodiments of the present application described in the "exemplary methods" section above.
The computer-readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing describes the general principles of the present application in conjunction with specific embodiments. However, it is noted that the advantages, effects, and the like mentioned in the present application are merely examples, not limitations, and should not be considered essential to the various embodiments of the present application. Furthermore, the foregoing disclosure of specific details is for the purpose of illustration and description only, and is not intended to be exhaustive or to limit the application to the precise details disclosed.
The block diagrams of devices, apparatuses, and systems referred to in this application are given only as illustrative examples and are not intended to require or imply that they must be connected, arranged, or configured in the manner shown. As those skilled in the art will recognize, these devices, apparatuses, and systems may be connected, arranged, or configured in any manner. Words such as "including," "comprising," and "having" are open-ended words that mean "including, but not limited to," and are used interchangeably therewith. The word "or" as used herein means, and is used interchangeably with, the word "and/or," unless the context clearly dictates otherwise. The phrase "such as" as used herein means, and is used interchangeably with, the phrase "such as, but not limited to."
It should also be noted that in the devices, apparatuses, and methods of the present application, the components or steps may be decomposed and/or recombined. These decompositions and/or recombinations are to be considered as equivalents of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit embodiments of the application to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.

Claims (23)

1. A special effects engine, comprising:
a graphics engine layer for providing a general special effect rendering function to generate special effect data; and
a special effects application layer for performing specification management, task management, and interaction parameter configuration on the special effect data generated by the graphics engine layer.
2. The special effects engine of claim 1, wherein the special effects application layer comprises:
a special effects application framework sublayer for performing specification management and task management on the special effect data generated by the graphics engine layer; and
a user logic sublayer for receiving user interaction parameter configuration information input by a user.
3. The special effects engine of claim 2, wherein the specification management includes special effect initialization, interaction information configuration, special effect update execution, and special effect end.
4. The special effects engine of claim 3, wherein the interaction information configuration includes configuring at least one of an interactable position, an interactable time, and an interaction input force of the special effect.
5. The special effects engine of claim 2, wherein the user interaction parameter configuration information comprises at least one of the following: a special effect display object, a special effect display time, a special effect display position, and a special effect display area size.
6. The special effects engine of claim 2, wherein the special effects application layer further comprises:
an interface sublayer for performing, on the special effects application framework sublayer, interface encapsulation applicable to multiple operating environments.
7. An interactive special effect generating method applying a special effects engine, comprising:
obtaining special effect data generated by one or more general special effect rendering functions provided by a graphics engine layer;
generating, by a special effects application layer, a special effect rendering task from the special effect data;
configuring the special effect rendering task based on a special effect rendering specification provided by the special effects application layer;
receiving, by the special effects application layer, interaction parameter configuration information input by a user; and
generating the interactive special effect based on the interaction parameter configuration information and the special effect rendering task.
8. The interactive special effects generation method of claim 7, wherein generating, by a special effects application layer, a special effects rendering task from the special effects data comprises:
combining, by a special effects application framework sublayer of the special effects application layer, the special effect data corresponding to a plurality of general special effect rendering functions into a single special effect rendering task.
9. The interactive special effects generation method of claim 7, wherein configuring the special effects rendering task based on a special effects rendering specification provided by a special effects application layer comprises:
configuring the special effect rendering task based on a special effect rendering specification provided by a special effects application framework sublayer of the special effects application layer, wherein the special effect rendering specification includes special effect initialization, interaction information configuration, special effect update execution, and special effect end.
10. The interactive special effect generation method of claim 9, wherein the interaction information configuration includes configuring at least one of an interactable position, an interactable time, and an interaction input force of the special effect.
11. The interactive special effects generation method of claim 7, wherein receiving, by the special effects application layer, the user-input interaction parameter configuration information comprises:
receiving, by a user logic sublayer of the special effects application layer, the user interaction parameter configuration information input by the user.
12. The interactive special effects generation method of claim 11, wherein the user interaction parameter configuration information includes at least one of the following: a special effect display object, a special effect display time, a special effect display position, and a special effect display area size.
13. The interactive special effects generation method of claim 11, further comprising, before the receiving, by the special effects application layer, of the user interaction parameter configuration information input by the user:
performing, by an interface sublayer of the special effects application layer, interface encapsulation applicable to multiple operating environments on the special effects application framework sublayer; and
wherein the receiving, by the special effects application layer, of the user interaction parameter configuration information input by the user comprises:
calling, by the user logic sublayer, an interface encapsulated by the interface sublayer to receive the user interaction parameter configuration information input by the user.
14. An interactive special effect generating method applying a special effects engine, comprising:
obtaining a fade-in special effect, a wipe special effect, and a fade-out special effect provided by a graphics engine layer;
combining, by a special effects application framework layer, the fade-in special effect, the wipe special effect, and the fade-out special effect into a special effect rendering task;
configuring the special effect rendering task according to a special effect rendering specification provided by the special effects application framework layer;
receiving, by a user logic layer, interaction parameter configuration information input by a user; and
generating the interactive special effect based on the interaction parameter configuration information and the special effect rendering task.
15. The interactive special effect generation method according to claim 14, wherein the special effect rendering specification includes special effect initialization, interaction information configuration, special effect update execution, and special effect end; and
the interaction information configuration includes at least one of a wipeable area, a wipeable time, and a force of the wiping input of the wipe special effect.
16. The interactive special effect generation method according to claim 14, wherein the user interaction parameter configuration information includes at least one of the following: a wipe special effect display image, a special effect display position, a special effect display area size, a fade-in special effect time, a fade-out special effect time, a wipe special effect timeout time, and an operation result display time.
17. A user interaction method of applying an interactive special effect, comprising:
displaying the first special effect to a user;
receiving an operation of a user in response to the first special effect, the operation being an operation of the user on a display area of the first special effect on a touch screen;
displaying a second special effect corresponding to the operation to the user based on the operation; and
displaying different branch scenarios to the user based on an operation result of the operation.
18. The user interaction method applying an interactive special effect according to claim 17, further comprising, after the second special effect corresponding to the operation is displayed to the user based on the operation:
displaying a third special effect corresponding to the first special effect to the user in response to a preset condition being reached.
19. The user interaction method applying an interactive special effect according to claim 18, wherein
the first special effect is a fade-in special effect of a wipe mask;
the operation is a wiping operation in which the user slides over a display region of the wipe mask on a touch screen;
the second special effect is a wipe special effect; and
the third special effect is a fade-out special effect of the wipe mask.
20. The user interaction method applying an interactive special effect according to claim 19, wherein displaying the wipe special effect to the user based on the wiping operation comprises:
in response to the wiping operation, performing visibility processing that removes the wipe mask from the area of the wiping operation.
21. The user interaction method applying an interactive special effect according to claim 19, wherein displaying the wipe special effect to the user based on the wiping operation comprises:
displaying the wiped proportion to the user.
22. A special effects engine, comprising:
a receiving unit, configured to receive special effect data of a general special effect and interaction parameters from a terminal;
an application unit, configured to perform specification management and task management on the general special effect using the received special effect data, and to perform interaction parameter configuration based on the received interaction parameters to generate an interactive special effect; and
an output unit, configured to output the interactive special effect to the terminal.
23. An electronic device, comprising:
a processor; and
a memory having stored therein computer program instructions which, when executed by the processor, cause the processor to perform the interactive special effect generating method applying a special effects engine according to any one of claims 7-13, the interactive special effect generating method applying a special effects engine according to any one of claims 14-16, or the user interaction method applying an interactive special effect according to any one of claims 17-21.
CN202010559879.9A 2020-01-22 2020-06-18 Special effect engine, interactive special effect generating method using same and user interaction method Pending CN113157175A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010075600X 2020-01-22
CN202010075600 2020-01-22

Publications (1)

Publication Number Publication Date
CN113157175A (en) 2021-07-23

Family

ID=76882171

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010559879.9A Pending CN113157175A (en) 2020-01-22 2020-06-18 Special effect engine, interactive special effect generating method using same and user interaction method

Country Status (1)

Country Link
CN (1) CN113157175A (en)

Citations (8)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190080017A1 (en) * 2016-03-09 2019-03-14 Alibaba Group Holding Limited Method, system, and device that invokes a web engine
CN105912234A (en) * 2016-04-06 2016-08-31 腾讯科技(深圳)有限公司 Virtual scene interaction method and device
CN106504339A (en) * 2016-11-09 2017-03-15 四川长虹电器股份有限公司 Historical relic 3D methods of exhibiting based on virtual reality
US20180190004A1 (en) * 2016-12-30 2018-07-05 Microsoft Technology Licensing, Llc Interactive and dynamically animated 3d fonts
CN107608608A (en) * 2017-09-25 2018-01-19 浙江科澜信息技术有限公司 A kind of information interacting method, the apparatus and system of three-dimensional graphics renderer engine
CN109766150A (en) * 2017-11-06 2019-05-17 广州市动景计算机科技有限公司 Implementation method, device and the terminal device of interactive animation
CN109240564A (en) * 2018-10-12 2019-01-18 武汉辽疆科技有限公司 Artificial intelligence realizes the device and method of interactive more plot animations branch
CN109710353A (en) * 2018-12-12 2019-05-03 浙江口碑网络技术有限公司 Animated element in the page shows method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
QIU Hang; CHEN Leiting: "Design and Implementation of an Object-Oriented 3D Graphics Engine", Journal of University of Electronic Science and Technology of China, no. 01, 30 January 2010 (2010-01-30) *

Similar Documents

Publication Publication Date Title
KR101329619B1 (en) Computer network-based 3D rendering system
JP4465049B2 (en) Virtual environment navigation aid
CN103092612B (en) Realize method and the electronic installation of Android operation system 3D desktop pinup picture
JP7270661B2 (en) Video processing method and apparatus, electronic equipment, storage medium and computer program
CN103157281B (en) Display method and display equipment of two-dimension game scene
CN110989878B (en) Animation display method and device in applet, electronic equipment and storage medium
EP4044123A1 (en) Display method and device based on augmented reality, and storage medium
AU2021339341B2 (en) Augmented reality-based display method, device, and storage medium
US10013059B2 (en) Haptic authoring tool for animated haptic media production
CN110825467A (en) Rendering method, rendering apparatus, hardware apparatus, and computer-readable storage medium
CN110262763B (en) Augmented reality-based display method and apparatus, storage medium, and electronic device
CN114924712A (en) AUI modularization realization method and system based on domain controller platform
US20170060601A1 (en) Method and system for interactive user workflows
CN109905753B (en) Corner mark display method and device, storage medium and electronic device
CN112435313A (en) Method and device for playing frame animation, electronic equipment and readable storage medium
CN113157175A (en) Special effect engine, interactive special effect generating method using same and user interaction method
CN111915708B (en) Image processing method and device, storage medium and electronic equipment
CN116775174A (en) Processing method, device, equipment and medium based on user interface frame
CN111045674A (en) Interactive method and device of player
CN110662099A (en) Method and device for displaying bullet screen
CN112464126B (en) Method for generating panoramic chart based on Threejs, terminal equipment and storage medium
CN115442650B (en) Barrage information processing method and device, barrage information processing equipment and storage medium
WO2023142756A1 (en) Live broadcast interaction method, device, and system
CN115623147A (en) Video production template generation method and related device
CN114793274A (en) Data fusion method and device based on video projection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination