CN110377192B - Method, device, medium and electronic equipment for realizing interactive effect - Google Patents


Info

Publication number
CN110377192B
CN110377192B (application CN201910527297.XA)
Authority
CN
China
Prior art keywords
effect
information
rule
interactive
image
Prior art date
Legal status
Active
Application number
CN201910527297.XA
Other languages
Chinese (zh)
Other versions
CN110377192A (en)
Inventor
俞亮 (Yu Liang)
Current Assignee
Douyin Vision Co Ltd
Douyin Vision Beijing Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd
Priority to CN201910527297.XA
Publication of CN110377192A
Application granted
Publication of CN110377192B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The present disclosure provides a method, an apparatus, a medium, and an electronic device for implementing an interactive effect. The method includes: acquiring trigger information for an interactive object in a user interface; acquiring interaction information according to the trigger information and an initialization rule; and displaying the interactive effect of the interactive object according to the interaction information and an interaction rule. By causing the interactive object in the user interface to produce a special display effect in response to the trigger information, the present disclosure improves the user's experience.

Description

Method, device, medium and electronic equipment for realizing interactive effect
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method, an apparatus, a medium, and an electronic device for implementing an interactive effect.
Background
With the development of the mobile internet and intelligent terminals, terminal devices have gradually become part of many users' daily lives.
However, the human-machine interaction mode has remained essentially unchanged for decades. The appearance and rigid keys designed around a rectangular frame structure bring a certain convenience to development, but this fast-food style of development and design cannot continue to delight users, nor can it satisfy their desire for novelty and change.
Disclosure of Invention
An object of the present disclosure is to provide a method, an apparatus, a medium, and an electronic device for implementing an interactive effect, which can solve at least one of the above-mentioned technical problems. The specific scheme is as follows:
According to a specific embodiment of the present disclosure, in a first aspect, the present disclosure provides a method for implementing an interactive effect, including:
acquiring trigger information for an interactive object in a user interface;
acquiring interaction information according to the trigger information and an initialization rule;
and displaying the interactive effect of the interactive object according to the interactive information and the interactive rule.
Optionally, the triggering information includes: interactive object information, trigger position and trigger duration;
the initialization rule includes: displaying a time rule, an initialization position rule and an initialization frequency rule;
the interactive information comprises effect display time, an effect center position in a preset display area of the interactive object and effect change frequency;
the acquiring of the interaction information according to the trigger information and the initialization rule includes:
generating the effect center position according to the trigger position and the initialization position rule; the trigger position is the coordinates of the four corners of a rectangle surrounding the touch area; the touch area comprises an area resulting from a touch on the touch screen that is associated with the interactive object;
generating the effect change frequency according to the trigger duration and the initialization frequency rule;
and acquiring the effect display time according to the interactive object information and the display time rule.
Optionally, the initializing rule further includes: effect image rules;
the interactive information comprises information of a first storage structure used for storing image information;
the acquiring of the interaction information according to the trigger information and the initialization rule further comprises:
and acquiring a plurality of preset effect images according to the interactive object information and the effect image rule, and storing the preset effect images in a first storage structure according to a time sequence.
Optionally, the initialization rule further includes an initialization material rule and a synthesis rule;
the acquiring of the interaction information according to the trigger information and the initialization rule further comprises:
acquiring a display material image according to the interactive object information, the touch area and the initialization material rule;
acquiring a plurality of preset effect images according to the interactive object information and the effect image rule;
and respectively synthesizing the preset effect image and the display material image according to the synthesis rule, respectively generating a special effect image, and storing the information of the special effect image in the first storage structure according to a time sequence.
Optionally, the displaying the interactive effect of the interactive object according to the interactive information and the interactive rule includes:
acquiring information of the first storage structure;
circularly displaying the images in the first storage structure in a preset display area of the interactive object according to the effect change frequency within the effect display time range; an image in the first storage structure having a center position displayed at the effect center position.
Optionally, the interactive information further includes a preset effect mode; the preset effect mode includes: a transparent mode, a progressive transparent mode, and an opaque mode;
the displaying the images in the first storage structure in a circulating manner in a preset display area of the interactive object according to the effect change frequency comprises:
and circularly displaying the images in the first storage structure according to a preset effect mode in a preset display area of the interactive object according to the effect change frequency.
Optionally, the initializing rule further includes: an image parameter rule and a preset image generation model associated with the interactive object; the input of the preset image generation model comprises an image change parameter;
the interactive information comprises information of a second storage structure used for storing a plurality of groups of image change parameters;
the acquiring of the interaction information according to the trigger information and the initialization rule further comprises:
and acquiring a plurality of groups of image change parameters according to the interactive object information and the image parameter rule, and storing the image change parameters in a second storage structure according to a time sequence.
Optionally, the displaying the interactive effect of the interactive object according to the interactive information and the interactive rule includes:
acquiring information of the second storage structure;
within the effect display time range, inputting each group of image change parameters in the second storage structure into the preset image generation model according to the effect change frequency cycle;
the preset image generation model generates an effect image according to the image change parameters and displays the effect image in a preset display area of the interactive object; the effect image, a center position of which is displayed on the effect center position.
Optionally, a preset effect mode is further included; the preset effect mode includes: a transparent mode, a progressive transparent mode, and an opaque mode;
the preset image generation model generates an effect image according to the image change parameters, and displays the effect image in a preset display area of the interactive object, and the preset image generation model comprises the following steps:
and the preset image generation model generates an effect image according to the image change parameters, and displays the effect image in a preset display area of the interactive object according to the preset effect mode.
According to a second aspect of the present disclosure, there is provided an apparatus for implementing interactive effects, including:
the trigger information acquisition unit is used for acquiring trigger information for an interactive object in the user interface;
the interactive information acquisition unit is used for acquiring interactive information according to the trigger information and the initialization rule;
and the display unit is used for displaying the interactive effect of the interactive object according to the interactive information and the interactive rule.
According to a third aspect of the present disclosure, there is provided a computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, implementing the method for implementing an interactive effect according to any one of the first aspect.
According to a fourth aspect of the present disclosure, there is provided an electronic device, including: one or more processors; and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method for implementing an interactive effect according to any one of the first aspect.
Compared with the prior art, the scheme of the embodiment of the disclosure at least has the following beneficial effects:
the present disclosure provides a method, an apparatus, a medium, and an electronic device for implementing an interactive effect, the method including: acquiring trigger information aiming at an interactive object in a user interface; acquiring interaction information according to the trigger information and an initialization rule; and displaying the interactive effect of the interactive object according to the interactive information and the interactive rule. According to the method and the device, the interactive objects in the user interface generate special display effects through the trigger information, and the use experience of the user is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
fig. 1 is a flowchart of a method for implementing an interactive effect according to an embodiment of the present disclosure;
fig. 2 is a block diagram of units of an apparatus for implementing an interactive effect according to an embodiment of the present disclosure;
fig. 3 is a schematic view of a connection structure of an electronic device according to an embodiment of the disclosure.
Detailed Description
To make the objects, technical solutions and advantages of the present disclosure clearer, the present disclosure will be described in further detail with reference to the accompanying drawings, and it is apparent that the described embodiments are only a part of the embodiments of the present disclosure, rather than all embodiments. All other embodiments, which can be derived by one of ordinary skill in the art from the embodiments disclosed herein without making any creative effort, shall fall within the scope of protection of the present disclosure.
The terminology used in the embodiments of the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in the disclosed embodiments and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise, and "a plurality" typically includes at least two.
It should be understood that the term "and/or" as used herein merely describes an association between associated objects, indicating that three relationships may exist; for example, A and/or B may mean that A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the related objects before and after it are in an "or" relationship.
It should be understood that although the terms first, second, third, etc. may be used to describe technical names in embodiments of the present disclosure, the technical names should not be limited to the terms. These terms are only used to distinguish between technical names. For example, a first check signature may also be referred to as a second check signature, and similarly, a second check signature may also be referred to as a first check signature, without departing from the scope of embodiments of the present disclosure.
The words "if", as used herein, may be interpreted as "at … …" or "at … …" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
It is also noted that the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that an article or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such an article or apparatus. Without further limitation, an element preceded by the phrase "comprising a(n) ..." does not exclude the presence of other identical elements in the article or apparatus that includes the element.
Alternative embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
The first embodiment provided by the present disclosure is an embodiment of a method for realizing an interactive effect.
The embodiment of the present disclosure is described in detail below with reference to fig. 1, where fig. 1 is a flowchart of a method for implementing an interactive effect according to the embodiment of the present disclosure.
Step S101, acquiring trigger information for the interactive object in the user interface.
A user interface (UI) is the medium for interaction and information exchange between a system and a user; it converts information between its internal form and a form acceptable to humans. The user interface enables interaction and communication between the user and the hardware through the related software, so that the user can conveniently and efficiently operate the hardware, achieving bidirectional interaction and completing the work the hardware is expected to perform. For example, in a B/S architecture, the user interface is a web page; in a mobile phone application, the user interface is the graphical interface displayed on the phone screen through which the user exchanges information with the phone.
The interactive object refers to a control in the user interface that can exchange information with the user, such as a confirmation button, a check box, or a text edit box.
The trigger information is the information that starts the display of the interactive effect, for example, the click information generated when a mouse clicks a confirmation button on a computer, or the touch information generated when a finger touches the area of an intelligent terminal's touch screen associated with the confirmation button.
And step S102, acquiring interaction information according to the trigger information and the initialization rule.
The purpose of generating the trigger information is to generate an interactive effect.
First, the interaction information is obtained after the trigger.
Interaction refers to a process in which parties act upon one another. In computer applications, interaction refers to the process of information exchange between a user and a computer in a manner the computer can interpret, and the interaction information is the information generated by this exchange. In the embodiments of the present disclosure, the interaction information is the information, generated by the computer in response to the user's trigger information, that is used to realize the interactive effect.
And step S103, displaying the interactive effect of the interactive object according to the interactive information and the interactive rule.
The interaction rule is a rule set in advance for generating the interactive effect from the interaction information. For example, when a confirmation button is clicked with a mouse, the interaction rule may take the click position as the display center of the confirmation button's display area and display a ring-shaped water-ripple image in that area, centered on the click position.
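To make the three steps concrete, the following Kotlin sketch outlines the flow of Fig. 1; the type and function names are illustrative assumptions, not the reference implementation of the present disclosure.

```kotlin
// Minimal sketch of the flow in Fig. 1. All names are illustrative assumptions.
data class TriggerInfo(
    val objectId: String,                      // interactive object information
    val position: Pair<Float, Float>,          // trigger position
    val durationSec: Float                     // trigger duration
)

data class InteractionInfo(
    val displayTimeSec: Float,                 // effect display time
    val effectCenter: Pair<Float, Float>,      // effect center position
    val changeFrequencyHz: Float               // effect change frequency
)

fun handleTrigger(
    trigger: TriggerInfo,                               // step S101: trigger information from the UI layer
    initializationRule: (TriggerInfo) -> InteractionInfo,
    interactionRule: (String, InteractionInfo) -> Unit
) {
    val info = initializationRule(trigger)              // step S102: acquire interaction information
    interactionRule(trigger.objectId, info)             // step S103: display the interactive effect
}
```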
The embodiment of the present disclosure provides a first specific implementation method for implementing the above interactive effect. The first implementation is mainly explained using the case of clicking a confirmation button with a mouse.
The trigger information includes: interactive object information, trigger location and trigger duration.
The interactive object information may be a name indicating the interactive object, or a unique identification code of the interactive object. For example, the interactive object information is the unique identification code 10010 of the confirmation button in the user interface; the coordinate position where the mouse clicks the confirmation button is the trigger position, for example X = 100 pixels and Y = -100 pixels relative to the upper-left corner of the confirmation button; and the duration for which the mouse presses the confirmation button is the trigger duration, for example 10 seconds.
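Using the TriggerInfo type sketched above, the worked example in this paragraph could be represented as follows (values taken from the example, structure assumed):

```kotlin
// Hypothetical instantiation of the worked example: identifier 10010, click at
// (100, -100) pixels relative to the button's upper-left corner, held for 10 seconds.
val exampleTrigger = TriggerInfo(
    objectId = "10010",
    position = 100f to -100f,
    durationSec = 10f
)
```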
The interactive information comprises effect display time, an effect center position in a preset display area of the interactive object and effect change frequency.
The effect display time is the duration for which the interactive effect is displayed on the interactive object for one trigger.
The effect center position is mainly used to locate the displayed image and to enhance the dynamic display; for example, if the mouse click position is used as the effect center, then when the ring-shaped ripple is displayed, the center of the ripple follows changes in the click position.
The frequency of the effect change, i.e. the frequency of the change of the display image in the display area, is intended to achieve a preset dynamic effect by the change of the display image.
The initialization rule includes: displaying a time rule, an initialization position rule and an initialization frequency rule.
And the display time rule is used for expressing the relation between the interactive object and the effect display time. Different ones of the interactive objects may be set to different ones of the effect display times. For example, the display time rule is that the interactive object and the effect display time are in a one-to-one correspondence relationship, the effect display time of the confirmation key is 3 seconds, and the effect display time of the cancel key is 5 seconds.
And the initialized position rule is used for expressing the relation between the trigger position and the effect center position. For example, the click position of the mouse is taken as the effect center position.
The initialization frequency rule is used to express the relation between the trigger duration and the effect change frequency. For example, the trigger duration and the effect change frequency may be inversely proportional: if the trigger duration of clicking the confirmation button with the mouse is 0.5 seconds, the effect change frequency is 2 times per second; if the trigger duration is 0.1 seconds, the effect change frequency is 10 times per second.
The acquiring of the interaction information according to the trigger information and the initialization rule includes:
and S102-11, generating the effect center position according to the trigger position and the initialization position rule.
And S102-12, generating the effect change frequency according to the trigger duration and the initialization frequency rule.
And S102-13, acquiring the effect display time according to the interactive object information and the display time rule.
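A minimal sketch of steps S102-11 to S102-13 under assumed concrete rules (the click position as the effect center, an inversely proportional frequency rule, and a per-object display-time table); the patent leaves these rules configurable, so the specifics below are only illustrative and reuse the TriggerInfo and InteractionInfo types from the earlier sketch.

```kotlin
// Illustrative initialization rules for acquiring the interaction information.
val displayTimeRule = mapOf("confirm" to 3f, "cancel" to 5f)               // seconds, one-to-one with the object

fun acquireInteractionInfo(trigger: TriggerInfo): InteractionInfo {
    val effectCenter = trigger.position                                     // S102-11: initialization position rule
    val changeFrequencyHz = 1f / trigger.durationSec.coerceAtLeast(0.01f)   // S102-12: e.g. 0.5 s -> 2 Hz, 0.1 s -> 10 Hz
    val displayTimeSec = displayTimeRule[trigger.objectId] ?: 3f            // S102-13: display time rule (assumed default)
    return InteractionInfo(displayTimeSec, effectCenter, changeFrequencyHz)
}
```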
In terms of image processing, the initialization rule further includes: and (5) effect image rules.
The effect image rule is used for expressing the relation between the interactive object and the preset effect image. For example, the interactive object and the preset effect image are in a one-to-one correspondence relationship, the preset effect image of the confirmation key is a plurality of annular water ripple images, and the preset effect image of the cancel key is a plurality of vortex images.
The interactive information comprises information of a first storage structure used for storing information of a plurality of images. For example, the first storage structure is a saved image array, and the information of the first storage structure is address information of the array.
The acquiring of the interaction information according to the trigger information and the initialization rule further comprises:
and acquiring a plurality of preset effect images according to the interactive object information and the effect image rule, and storing the preset effect images in a first storage structure according to a time sequence.
The preset effect images may be images captured from a video in time sequence; displaying the preset effect images in that time sequence in the application realizes the dynamic effect.
The time sequence is the chronological order in which the display content of the images occurs.
The displaying the interactive effect of the interactive object according to the interactive information and the interactive rule includes:
and step S103-11, acquiring the information of the first storage structure.
Continuing the above example, the information of the first storage structure is the address information of an array, and the preset effect images can be obtained through that address information.
Step S103-12, circularly displaying the images in the first storage structure in a preset display area of the interactive object according to the effect change frequency within the effect display time range; an image in the first storage structure having a center position displayed at the effect center position.
I.e. the centre position of the image in the first storage structure coincides with the effect centre position. For example, if the effect center position is a click position of a mouse, the image in the first storage structure is displayed with the click position of the mouse as a center.
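The cyclic display of steps S103-11 and S103-12 can be sketched as a simple frame loop; the drawFrame callback is an assumed hook into whatever surface renders the interactive object, a blocking loop is used only for brevity, and InteractionInfo is reused from the earlier sketch.

```kotlin
// Sketch of the cyclic display: step through the stored frames at the effect change
// frequency until the effect display time elapses; each frame is centred on the
// effect center position.
fun <F> displayLoop(
    frames: List<F>,                                   // first storage structure, chronological order
    info: InteractionInfo,
    drawFrame: (F, Pair<Float, Float>) -> Unit         // assumed rendering hook
) {
    if (frames.isEmpty() || info.changeFrequencyHz <= 0f) return
    val frameIntervalMs = (1000f / info.changeFrequencyHz).toLong()
    val endTimeMs = System.currentTimeMillis() + (info.displayTimeSec * 1000).toLong()
    var index = 0
    while (System.currentTimeMillis() < endTimeMs) {
        drawFrame(frames[index], info.effectCenter)    // image centre aligned with the effect center
        index = (index + 1) % frames.size              // cyclic display
        Thread.sleep(frameIntervalMs)
    }
}
```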
Optionally, the interactive information further includes a preset effect mode; the preset effect mode includes: a transparent mode, a progressive transparent mode, and an opaque mode.
In the progressive transparent mode, the effect image and the original image in the display area of the interactive object are displayed simultaneously according to a preset blending ratio.
The displaying the images in the first storage structure in a circulating manner in a preset display area of the interactive object according to the effect change frequency comprises:
and circularly displaying the images in the first storage structure according to a preset effect mode in a preset display area of the interactive object according to the effect change frequency.
The embodiment of the present disclosure provides a second specific implementation method for implementing the above interactive effect. The second implementation is mainly described using the case where the user taps the confirmation button with a finger on a device having a touch screen. Since some of the contents are the same as in the first implementation, they are not repeated here; refer to the first implementation for details.
The trigger information includes: interactive object information, trigger location and trigger duration.
The trigger position is the coordinates of the four corners of a rectangle surrounding the touch area; the touch area includes an area resulting from a touch on the touch screen that is associated with the interactive object.
The initialization rule includes: displaying a time rule, an initialization position rule and an initialization frequency rule.
The interactive information comprises effect display time, an effect center position in a preset display area of the interactive object and effect change frequency.
And the initialized position rule is used for expressing the relation between the trigger position and the effect center position.
The acquiring of the interaction information according to the trigger information and the initialization rule includes:
and S102-21, generating the effect center position according to the trigger position and the initialization position rule.
That is, the effect center position is obtained by coordinates of the four corners of the rectangle. For example, the center coordinates of the rectangle, that is, the effect center position, are obtained from the coordinates of the four corners of the rectangle.
And S102-22, generating the effect change frequency according to the trigger duration and the initialization frequency rule.
And S102-23, acquiring the effect display time according to the interactive object information and the display time rule.
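Step S102-21 above derives the effect center from the corner coordinates of the rectangle bounding the touch area; a minimal sketch of that computation (point type and averaging approach assumed):

```kotlin
// Sketch of deriving the effect center position from the four corner coordinates of
// the rectangle surrounding the touch area.
data class Point(val x: Float, val y: Float)

fun rectangleCenter(corners: List<Point>): Point {
    require(corners.size == 4) { "expected the four corners of the bounding rectangle" }
    return Point(
        corners.map { it.x }.average().toFloat(),
        corners.map { it.y }.average().toFloat()
    )
}
```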
The initialization rule further comprises an initialization material rule and a synthesis rule.
The initialization material rule represents the relationship between the interactive object information and the touch area on the one hand and the display material image on the other. For example, if the interactive object information is a confirmation button and the touch area is 0.8 cm x 0.5 cm, the display material image is an image of water.
And the synthesis rule represents the relationship between the preset effect image and the display material image and the special effect image.
The acquiring of the interaction information according to the trigger information and the initialization rule further comprises:
and S102-24, acquiring a display material image according to the interactive object information, the touch area and the initialization material rule.
Step S102-25, acquiring a plurality of preset effect images according to the interactive object information and the effect image rule;
and S102-26, respectively synthesizing the preset effect image and the display material image according to the synthesis rule, respectively generating special effect images, and sequentially storing the special effect images in the first storage structure.
The displaying the interactive effect of the interactive object according to the interactive information and the interactive rule includes:
and step S103-21, acquiring the information of the first storage structure.
S103-22, circularly displaying the images in the first storage structure in a preset display area of the interactive object according to the effect change frequency within the effect display time range; an image in the first storage structure having a center position displayed at the effect center position.
Optionally, the interactive information further includes a preset effect mode; the preset effect mode includes: a transparent mode, a progressive transparent mode, and an opaque mode.
The displaying the images in the first storage structure in a circulating manner in a preset display area of the interactive object according to the effect change frequency comprises:
and circularly displaying the images in the first storage structure according to a preset effect mode in a preset display area of the interactive object according to the effect change frequency.
The embodiment of the present disclosure provides a third specific implementation method for implementing the above-mentioned interaction effect. Since some of the contents are the same as those of the first embodiment, further description is omitted here, and for details, reference is made to the first embodiment.
The trigger information includes: interactive object information, trigger location and trigger duration.
The trigger position is a click coordinate of the interactive object clicked by the mouse.
Alternatively, the trigger position is the coordinates of the four corners of a rectangle surrounding the touch area; the touch area includes an area resulting from a touch on the touch screen that is associated with the interactive object.
The initialization rule includes: displaying a time rule, an initialization position rule and an initialization frequency rule.
The interactive information comprises effect display time, an effect center position in a preset display area of the interactive object and effect change frequency.
And the initialized position rule is used for expressing the relation between the trigger position and the effect center position.
The acquiring of the interaction information according to the trigger information and the initialization rule includes:
and S102-31, generating the effect center position according to the trigger position and the initialization position rule.
And when the trigger position is a click coordinate of the interactive object clicked by the mouse, taking the click coordinate as the effect center position.
When the trigger position is the coordinates of the four corners of a rectangle surrounding the touch area, the effect center position is obtained from those corner coordinates; for example, the center coordinates of the rectangle are taken as the effect center position.
And S102-32, generating the effect change frequency according to the trigger duration and the initialization frequency rule.
And S102-33, obtaining the effect display time according to the interactive object information and the display time rule.
Optionally, the initializing rule further includes: an image parameter rule and a preset image generation model associated with the interactive object; the input of the preset image generation model comprises an image change parameter.
And the image parameter rule represents the relationship between the interactive object information and the plurality of groups of image change parameters.
The preset image generation model is a mathematical model and generates an image by inputting image change parameters. The disclosed embodiments can generate a changing image by inputting different image change parameters.
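As a toy stand-in for the preset image generation model, the model can be viewed as any function from a set of image change parameters to an effect frame; the ring-ripple parameterisation below is purely an assumed example.

```kotlin
// The preset image generation model viewed as a function from image change
// parameters to a generated frame; RippleParams and the textual "frame" are assumptions.
typealias ImageGenerationModel<P, F> = (P) -> F

data class RippleParams(val radiusPx: Float, val strokePx: Float)

// Example model: here the "frame" is just a textual description of what would be drawn.
val rippleModel: ImageGenerationModel<RippleParams, String> = { p ->
    "ring ripple: radius=${p.radiusPx}px, stroke=${p.strokePx}px"
}
```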
The interactive information comprises information of a second storage structure used for storing a plurality of groups of image change parameters. For example, the second storage structure is an array for storing the groups of image change parameters, and the information of the second storage structure is the address information of the array.
The acquiring of the interaction information according to the trigger information and the initialization rule further comprises:
and acquiring a plurality of groups of image change parameters according to the interactive object information and the image parameter rule, and sequentially storing the image change parameters in a second storage structure.
The displaying the interactive effect of the interactive object according to the interactive information and the interactive rule includes:
and S103-31, acquiring the information of the second storage structure.
Continuing the above example, the information of the second storage structure is the address information of an array, and the image change parameters can be obtained through that address information.
And S103-32, circularly inputting each group of image change parameters in the second storage structure into the preset image generation model according to the effect change frequency within the effect display time range.
S103-33, the preset image generation model generates an effect image according to the image change parameters, and the effect image is displayed in a preset display area of the interactive object; the effect image, a center position of which is displayed on the effect center position.
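Steps S103-31 to S103-33 can be sketched as a loop that feeds each parameter group to the generation model at the effect change frequency and draws the generated frame at the effect center; the sketch reuses InteractionInfo from the earlier example and again uses a blocking loop only for brevity.

```kotlin
// Sketch of parameter-driven display: cycle through the second storage structure,
// generate a frame per parameter group, and draw it centred on the effect center.
fun <P, F> displayGeneratedEffect(
    paramSets: List<P>,                                // second storage structure, chronological order
    model: (P) -> F,                                   // preset image generation model
    info: InteractionInfo,
    drawFrame: (F, Pair<Float, Float>) -> Unit         // assumed rendering hook
) {
    if (paramSets.isEmpty() || info.changeFrequencyHz <= 0f) return
    val frameIntervalMs = (1000f / info.changeFrequencyHz).toLong()
    val endTimeMs = System.currentTimeMillis() + (info.displayTimeSec * 1000).toLong()
    var index = 0
    while (System.currentTimeMillis() < endTimeMs) {
        drawFrame(model(paramSets[index]), info.effectCenter)
        index = (index + 1) % paramSets.size
        Thread.sleep(frameIntervalMs)
    }
}
```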
Optionally, the interactive information further includes a preset effect mode; the preset effect mode includes: a transparent mode, a progressive transparent mode, and an opaque mode.
The preset image generation model generates an effect image according to the image change parameters, and displays the effect image in a preset display area of the interactive object, and the preset image generation model comprises the following steps:
and the preset image generation model generates an effect image according to the image change parameters, and displays the effect image in a preset display area of the interactive object according to the preset effect mode.
According to the embodiment of the disclosure, the interactive object in the user interface generates a special display effect through the trigger information, so that the use experience of the user is improved.
Corresponding to the first embodiment provided by the present disclosure, the present disclosure also provides a second embodiment, that is, an apparatus for realizing an interactive effect. Since the second embodiment is basically similar to the first embodiment, its description is relatively brief; for relevant details, refer to the corresponding description of the first embodiment. The apparatus embodiments described below are merely illustrative.
Fig. 2 shows an embodiment of an apparatus for implementing an interactive effect provided by the present disclosure. Fig. 2 is a block diagram of units of an apparatus for implementing an interactive effect according to an embodiment of the present disclosure.
Referring to fig. 2, the present disclosure provides an apparatus for implementing an interactive effect, including: a trigger information acquiring unit 201, an interaction information acquiring unit 202 and a display unit 203.
A trigger information obtaining unit 201, configured to obtain trigger information for an interactive object in a user interface;
an acquiring interaction information unit 202, configured to acquire interaction information according to the trigger information and the initialization rule;
and the display unit 203 is configured to display the interaction effect of the interaction object according to the interaction information and the interaction rule.
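Structurally, the three units can be sketched as collaborating components; the constructor parameters below mirror units 201, 202 and 203 and reuse the illustrative TriggerInfo and InteractionInfo types, not the patent's actual classes.

```kotlin
// Structural sketch of the apparatus of Fig. 2 as three collaborating units.
class InteractiveEffectApparatus(
    private val acquireTriggerInfo: () -> TriggerInfo,                     // unit 201
    private val acquireInteractionInfo: (TriggerInfo) -> InteractionInfo,  // unit 202
    private val display: (InteractionInfo) -> Unit                         // unit 203
) {
    fun run() {
        val trigger = acquireTriggerInfo()
        val info = acquireInteractionInfo(trigger)
        display(info)
    }
}
```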
Optionally, the triggering information includes: interactive object information, trigger position and trigger duration;
the initialization rule includes: displaying a time rule, an initialization position rule and an initialization frequency rule;
the interactive information comprises effect display time, an effect center position in a preset display area of the interactive object and effect change frequency;
in the unit 202 for acquiring interaction information, the following are included:
an effect center generating subunit, configured to generate the effect center position according to the trigger position and the initialization position rule; the trigger position is the coordinates of the four corners of a rectangle surrounding the touch area; the touch area comprises an area resulting from a touch on the touch screen that is associated with the interactive object;
the effect change frequency generation subunit is used for generating the effect change frequency according to the trigger duration and the initialization frequency rule;
and the effect display time obtaining subunit is used for obtaining the effect display time according to the interactive object information and the display time rule.
Optionally, the initializing rule further includes: effect image rules;
the interactive information comprises information of a first storage structure used for storing image information;
in the unit 202 for acquiring interactive information, the method further includes:
and the preset effect image saving subunit is used for acquiring a plurality of preset effect images according to the interactive object information and the effect image rule and saving the preset effect images in a first storage structure according to a time sequence.
Optionally, the initialization rule further includes an initialization material rule and a synthesis rule;
in the unit 202 for acquiring interactive information, the method further includes:
the display material image acquisition subunit is used for acquiring a display material image according to the interactive object information, the touch area and the initialization material rule;
the preset effect image obtaining subunit is used for obtaining a plurality of preset effect images according to the interaction object information and the effect image rule;
and the special effect image generating subunit is configured to respectively combine the preset effect image and the display material image according to the combination rule, respectively generate a special effect image, and store information of the special effect image in the first storage structure according to a time sequence.
In the display unit 203, there are included:
an information subunit for acquiring a first storage structure, configured to acquire information of the first storage structure;
the first display subunit is used for circularly displaying the images in the first storage structure in a preset display area of the interactive object according to the effect change frequency within the effect display time range; an image in the first storage structure having a center position displayed at the effect center position.
Optionally, the interactive information further includes a preset effect mode; the preset effect mode includes: a transparent mode, a progressive transparent mode, and an opaque mode;
in the first display subunit, comprising:
and the first preset effect mode displaying subunit is used for circularly displaying the images in the first storage structure according to the preset effect mode in the preset display area of the interactive object according to the effect change frequency within the effect display time range.
Optionally, the initializing rule further includes: an image parameter rule and a preset image generation model associated with the interactive object; the input of the preset image generation model comprises an image change parameter;
the interactive information comprises information of a second storage structure used for storing a plurality of groups of image change parameters;
in the unit 202 for acquiring interactive information, the method further includes:
and the acquisition multi-group image change parameter subunit is used for acquiring multi-group image change parameters according to the interactive object information and the image parameter rule and storing the image change parameters in a second storage structure according to a time sequence.
Optionally, the display unit 203 includes:
a second storage structure information acquiring subunit, configured to acquire information of the second storage structure;
the input subunit is configured to input each group of image change parameters in the second storage structure to the preset image generation model according to the effect change frequency cycle within the effect display time range;
the second display subunit is used for generating an effect image by the preset image generation model according to the image change parameters and displaying the effect image in a preset display area of the interactive object; the effect image, a center position of which is displayed on the effect center position.
Optionally, the interactive information further includes a preset effect mode; the preset effect mode includes: a transparent mode, a progressive transparent mode, and an opaque mode;
in the second display subunit, comprising:
and the second display preset effect mode subunit is used for generating an effect image by the preset image generation model according to the image change parameters and displaying the effect image in a preset display area of the interactive object according to the preset effect mode.
According to the embodiment of the disclosure, the interactive object in the user interface generates a special display effect through the trigger information, so that the use experience of the user is improved.
The embodiment of the present disclosure provides a third embodiment, that is, an electronic device for implementing an interactive effect, the electronic device including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to cause the processor to perform the method for achieving interactive effects as described in the first embodiment.
The fourth embodiment of the present disclosure provides a computer storage medium for implementing an interactive effect, where the computer storage medium stores computer-executable instructions, and the computer-executable instructions can execute the method for implementing an interactive effect described in any of the above method embodiments.
Referring to fig. 3, a schematic structural diagram of an electronic device suitable for implementing an embodiment of the disclosure is shown. The terminal device in the embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 3 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 3, the electronic device may include a processing device (e.g., a central processing unit, a graphic processor, etc.) 301 that may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)302 or a program loaded from a storage device 308 into a Random Access Memory (RAM) 303. In the RAM 303, various programs and data necessary for the operation of the electronic apparatus are also stored. The processing device 301, the ROM 302, and the RAM 303 are connected to each other via a bus 304. An input/output (I/O) interface 305 is also connected to bus 304.
Generally, the following devices may be connected to the I/O interface 305: input devices 306 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 307 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage devices 308 including, for example, magnetic tape, hard disk, etc.; and a communication device 309. The communication means 309 may allow the electronic device to communicate wirelessly or by wire with other devices to exchange data. While fig. 3 illustrates an electronic device having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication means 309, or installed from the storage means 308, or installed from the ROM 302. The computer program, when executed by the processing device 301, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquiring at least two internet protocol addresses; sending a node evaluation request comprising the at least two internet protocol addresses to node evaluation equipment, wherein the node evaluation equipment selects the internet protocol addresses from the at least two internet protocol addresses and returns the internet protocol addresses; receiving an internet protocol address returned by the node evaluation equipment; wherein the obtained internet protocol address indicates an edge node in the content distribution network.
Alternatively, the computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: receiving a node evaluation request comprising at least two internet protocol addresses; selecting an internet protocol address from the at least two internet protocol addresses; returning the selected internet protocol address; wherein the received internet protocol address indicates an edge node in the content distribution network.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Where the name of a unit does not in some cases constitute a limitation of the unit itself, for example, the first retrieving unit may also be described as a "unit for retrieving at least two internet protocol addresses".

Claims (9)

1. A method for implementing interactive effects, comprising:
acquiring trigger information for an interactive object in a user interface;
acquiring interaction information according to the trigger information and an initialization rule;
displaying the interactive effect of the interactive object according to the interactive information and the interactive rule;
the trigger information includes: interactive object information, trigger position and trigger duration;
the initialization rule includes: displaying a time rule, an initialization position rule and an initialization frequency rule;
the interactive information comprises effect display time, an effect center position in a preset display area of the interactive object and effect change frequency;
the acquiring of the interaction information according to the trigger information and the initialization rule includes:
generating the effect center position according to the trigger position and the initialization position rule; the trigger position is the coordinates of the four corners of a rectangle surrounding the touch area; the touch area comprises an area resulting from a touch on the touch screen that is associated with the interactive object;
generating the effect change frequency according to the trigger duration and the initialization frequency rule;
acquiring the effect display time according to the interactive object information and the display time rule;
the initialization rule further includes: effect image rules;
the interactive information comprises information of a first storage structure used for storing image information;
the acquiring of the interaction information according to the trigger information and the initialization rule further comprises:
acquiring a plurality of preset effect images according to the interactive object information and the effect image rule, and storing the preset effect images in a first storage structure according to a time sequence;
the initialization rule also comprises an initialization material rule and a synthesis rule;
the acquiring of the interaction information according to the trigger information and the initialization rule further comprises:
acquiring a display material image according to the interactive object information, the touch area and the initialization material rule;
acquiring a plurality of preset effect images according to the interactive object information and the effect image rule;
and respectively synthesizing the preset effect image and the display material image according to the synthesis rule, respectively generating a special effect image, and storing the information of the special effect image in the first storage structure according to a time sequence.
2. The method of claim 1, wherein displaying the interactive effect of the interactive object according to the interactive information and the interactive rule comprises:
acquiring information of the first storage structure;
circularly displaying the images in the first storage structure in a preset display area of the interactive object according to the effect change frequency within the effect display time range, wherein each image in the first storage structure is displayed with its center position at the effect center position.
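A minimal sketch, reusing the hypothetical names above, of the cyclic display described in claim 2: within the effect display time, the images of the first storage structure are shown in turn at the effect change frequency, each centered at the effect center position. The render callback and the Thread.sleep pacing are simplifications for illustration.

```kotlin
fun displayCyclically(
    images: List<String>,                    // first storage structure, in time order
    displayTimeMs: Long,                     // effect display time
    changeHz: Double,                        // effect change frequency
    center: Pair<Float, Float>,              // effect center position
    render: (image: String, center: Pair<Float, Float>) -> Unit
) {
    if (images.isEmpty() || changeHz <= 0.0) return
    val frameIntervalMs = (1000.0 / changeHz).toLong().coerceAtLeast(1L)
    var elapsedMs = 0L
    var index = 0
    while (elapsedMs < displayTimeMs) {
        render(images[index], center)        // draw the current image at the effect center
        index = (index + 1) % images.size    // cycle back to the first image at the end of the list
        Thread.sleep(frameIntervalMs)        // crude pacing; a real UI would use its frame clock
        elapsedMs += frameIntervalMs
    }
}
```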
3. The method of claim 2, wherein the interactive information further comprises a preset effect mode; the preset effect mode includes: a transparent mode, a progressive transparent mode, and an opaque mode;
the displaying the images in the first storage structure in a circulating manner in a preset display area of the interactive object according to the effect change frequency comprises:
and circularly displaying the images in the first storage structure according to a preset effect mode in a preset display area of the interactive object according to the effect change frequency.
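One possible reading of the three preset effect modes of claim 3, expressed as an alpha value applied to each displayed frame. The specific mapping (a fixed 0.5 alpha for the transparent mode, a fade-out across the image sequence for the progressive transparent mode) is an assumption, not taken from the patent.

```kotlin
enum class EffectMode { TRANSPARENT, PROGRESSIVE_TRANSPARENT, OPAQUE }

fun alphaFor(mode: EffectMode, frameIndex: Int, frameCount: Int): Float = when (mode) {
    EffectMode.OPAQUE -> 1.0f                 // fully visible
    EffectMode.TRANSPARENT -> 0.5f            // fixed partial transparency (illustrative value)
    EffectMode.PROGRESSIVE_TRANSPARENT ->     // fade out progressively over the image sequence
        1.0f - frameIndex.toFloat() / frameCount.coerceAtLeast(1)
}
```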
4. The method of claim 1, wherein the initialization rule further comprises: an image parameter rule and a preset image generation model associated with the interactive object; the input of the preset image generation model comprises an image change parameter;
the interactive information comprises information of a second storage structure used for storing a plurality of groups of image change parameters;
the acquiring of the interaction information according to the trigger information and the initialization rule further comprises:
and acquiring a plurality of groups of image change parameters according to the interactive object information and the image parameter rule, and storing the image change parameters in a second storage structure according to a time sequence.
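A hypothetical sketch of the second storage structure in claim 4: groups of image change parameters produced under an image parameter rule and stored in time order. The concrete parameters (scale, rotation, alpha) and the rule itself are illustrative assumptions.

```kotlin
data class ImageChangeParams(val scale: Float, val rotationDeg: Float, val alpha: Float)

// One possible image parameter rule: the effect grows, rotates, and fades over time.
fun buildSecondStorage(steps: Int): List<ImageChangeParams> =
    (0 until steps).map { i ->
        val t = i.toFloat() / steps.coerceAtLeast(1)
        ImageChangeParams(scale = 1f + t, rotationDeg = 360f * t, alpha = 1f - t)
    }
```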
5. The method of claim 4, wherein displaying the interactive effect of the interactive object according to the interactive information and the interactive rule comprises:
acquiring information of the second storage structure;
within the effect display time range, circularly inputting each group of image change parameters in the second storage structure into the preset image generation model according to the effect change frequency;
the preset image generation model generates an effect image according to the image change parameters and displays the effect image in a preset display area of the interactive object, wherein the effect image is displayed with its center position at the effect center position.
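Continuing with the same hypothetical names, the sketch below mirrors claim 5: within the effect display time, each group of image change parameters from the second storage structure is fed cyclically, at the effect change frequency, into a preset image generation model (abstracted here as a plain function) whose output is shown at the effect center position.

```kotlin
fun runGeneratedEffect(
    paramGroups: List<ImageChangeParams>,       // second storage structure, in time order
    displayTimeMs: Long,                        // effect display time
    changeHz: Double,                           // effect change frequency
    center: Pair<Float, Float>,                 // effect center position
    generate: (ImageChangeParams) -> String,    // preset image generation model (stubbed)
    show: (image: String, center: Pair<Float, Float>) -> Unit
) {
    if (paramGroups.isEmpty() || changeHz <= 0.0) return
    val intervalMs = (1000.0 / changeHz).toLong().coerceAtLeast(1L)
    var elapsedMs = 0L
    var index = 0
    while (elapsedMs < displayTimeMs) {
        val image = generate(paramGroups[index])  // the model produces one effect image per parameter group
        show(image, center)                       // displayed with its center at the effect center position
        index = (index + 1) % paramGroups.size    // cycle through the parameter groups
        Thread.sleep(intervalMs)
        elapsedMs += intervalMs
    }
}
```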
6. The method of claim 5, wherein the interactive information further comprises a preset effect mode; the preset effect mode includes: a transparent mode, a progressive transparent mode, and an opaque mode;
the preset image generation model generating an effect image according to the image change parameters and displaying the effect image in a preset display area of the interactive object comprises:
and the preset image generation model generates an effect image according to the image change parameters, and displays the effect image in a preset display area of the interactive object according to the preset effect mode.
7. An apparatus for implementing interactive effects, comprising:
a trigger information acquisition unit, used for acquiring trigger information for an interactive object in a user interface;
an interactive information acquisition unit, used for acquiring interactive information according to the trigger information and an initialization rule;
a display unit, used for displaying the interactive effect of the interactive object according to the interactive information and an interactive rule;
the trigger information includes: interactive object information, trigger position and trigger duration;
the initialization rule includes: a display time rule, an initialization position rule and an initialization frequency rule;
the interactive information comprises effect display time, an effect center position in a preset display area of the interactive object and effect change frequency;
the acquiring of the interaction information according to the trigger information and the initialization rule includes:
generating the effect center position according to the trigger position and the initialization position rule; the trigger position comprises the coordinates of the four corners of a rectangle surrounding a touch area; the touch area comprises an area produced by a touch on a touch screen associated with the interactive object;
generating the effect change frequency according to the trigger duration and the initialization frequency rule;
acquiring the effect display time according to the interactive object information and the display time rule;
the initialization rule further includes: effect image rules;
the interactive information comprises information of a first storage structure used for storing image information;
the acquiring of the interaction information according to the trigger information and the initialization rule further comprises:
acquiring a plurality of preset effect images according to the interactive object information and the effect image rule, and storing the preset effect images in a first storage structure according to a time sequence;
the initialization rule also comprises an initialization material rule and a synthesis rule;
the acquiring of the interaction information according to the trigger information and the initialization rule further comprises:
acquiring a display material image according to the interactive object information, the touch area and the initialization material rule;
acquiring a plurality of preset effect images according to the interactive object information and the effect image rule;
and synthesizing each preset effect image with the display material image according to the synthesis rule to generate respective special effect images, and storing information of the special effect images in the first storage structure according to a time sequence.
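Finally, a sketch of the apparatus of claim 7 as three cooperating units, reusing the hypothetical types and functions from the earlier sketches; the class names and the hard-coded demo values are assumptions for illustration only.

```kotlin
class TriggerInfoAcquisitionUnit {
    // In a real UI this would read touch events; the values here are hard-coded for the demo.
    fun acquire(): TriggerInfo = TriggerInfo(
        objectId = "demo-object",
        touchRect = listOf(0f to 0f, 100f to 0f, 0f to 40f, 100f to 40f),
        durationMs = 300L
    )
}

class InteractionInfoAcquisitionUnit(private val rule: InitializationRule) {
    fun acquire(trigger: TriggerInfo): InteractionInfo = acquireInteractionInfo(trigger, rule)
}

class DisplayUnit {
    fun display(info: InteractionInfo) = displayCyclically(
        info.effectImages, info.effectDisplayTimeMs, info.effectChangeHz, info.effectCenter
    ) { image, center -> println("draw $image centered at $center") }
}

fun main() {
    val rule = InitializationRule(
        displayTimeMsByObject = mapOf("demo-object" to 500L),
        frequencyPerSecondOfDuration = 10.0,
        effectImagePathsByObject = mapOf("demo-object" to listOf("frame0.png", "frame1.png"))
    )
    val trigger = TriggerInfoAcquisitionUnit().acquire()
    val info = InteractionInfoAcquisitionUnit(rule).acquire(trigger)
    DisplayUnit().display(info)
}
```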
8. A computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, carries out the method according to any one of claims 1 to 6.
9. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the method of any one of claims 1 to 6.
CN201910527297.XA 2019-06-18 2019-06-18 Method, device, medium and electronic equipment for realizing interactive effect Active CN110377192B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910527297.XA CN110377192B (en) 2019-06-18 2019-06-18 Method, device, medium and electronic equipment for realizing interactive effect

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910527297.XA CN110377192B (en) 2019-06-18 2019-06-18 Method, device, medium and electronic equipment for realizing interactive effect

Publications (2)

Publication Number Publication Date
CN110377192A CN110377192A (en) 2019-10-25
CN110377192B true CN110377192B (en) 2020-12-08

Family

ID=68249101

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910527297.XA Active CN110377192B (en) 2019-06-18 2019-06-18 Method, device, medium and electronic equipment for realizing interactive effect

Country Status (1)

Country Link
CN (1) CN110377192B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111309206A (en) 2020-02-04 2020-06-19 北京达佳互联信息技术有限公司 Data processing method and device, electronic equipment and storage medium

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108549512A (en) * 2018-03-30 2018-09-18 武汉斗鱼网络科技有限公司 A kind of display methods, device and computer equipment

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7546543B2 (en) * 2004-06-25 2009-06-09 Apple Inc. Widget authoring and editing environment
DE112013002412T5 (en) * 2012-05-09 2015-02-19 Apple Inc. Apparatus, method and graphical user interface for providing feedback for changing activation states of a user interface object
CN106612229B (en) * 2015-10-23 2019-06-25 腾讯科技(深圳)有限公司 The method and apparatus that user-generated content is fed back and shows feedback information
CN105630383B (en) * 2015-12-22 2019-01-29 武汉斗鱼网络科技有限公司 A kind of method and system promoting attention rate in human-computer interaction process
CN109062652A (en) * 2018-08-15 2018-12-21 Oppo广东移动通信有限公司 Fingerprint recognition reminding method, device, storage medium and electronic equipment

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108549512A (en) * 2018-03-30 2018-09-18 武汉斗鱼网络科技有限公司 A kind of display methods, device and computer equipment

Also Published As

Publication number Publication date
CN110377192A (en) 2019-10-25

Similar Documents

Publication Publication Date Title
CN109408685B (en) Thinking guide graph display method and device
RU2677595C2 (en) Application interface presentation method and apparatus and electronic device
US11003315B2 (en) Terminal device and sharing method thereof
EP3979048A1 (en) User terminal device and displaying method thereof
CN111459586B (en) Remote assistance method, device, storage medium and terminal
CN108984707B (en) Method, device, terminal equipment and storage medium for sharing personal information
KR102004986B1 (en) Method and system for executing application, device and computer readable recording medium thereof
US20210191684A1 (en) Control method, control device, control system, electronic whiteboard, and mobile terminal
CN111291244B (en) House source information display method, device, terminal and storage medium
KR20140070218A (en) Mobile apparatus displaying end effect and cotrol method there of
CN112230909A (en) Data binding method, device and equipment of small program and storage medium
EP4258165A1 (en) Two-dimensional code displaying method and apparatus, device, and medium
EP4210320A1 (en) Video processing method, terminal device and storage medium
CN110188299B (en) Response type page processing method and device and electronic equipment
CN106233237A (en) A kind of method and apparatus of the new information processed with association
US20220392130A1 (en) Image special effect processing method and apparatus
CN109389365B (en) Multi-person collaborative document processing method and device and electronic equipment
US9256358B2 (en) Multiple panel touch user interface navigation
CN110377192B (en) Method, device, medium and electronic equipment for realizing interactive effect
CN113783995A (en) Display control method, display control device, electronic apparatus, and medium
CN112911052A (en) Information sharing method and device
CN109313529A (en) Carousel between document and picture
CN111913614A (en) Multi-picture display control method and device, storage medium and display
CN107765858B (en) Method, device, terminal and storage medium for determining face angle
CN112492399A (en) Information display method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Patentee after: Douyin Vision Co.,Ltd.

Address before: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Patentee before: Tiktok vision (Beijing) Co.,Ltd.

Address after: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Patentee after: Tiktok vision (Beijing) Co.,Ltd.

Address before: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Patentee before: BEIJING BYTEDANCE NETWORK TECHNOLOGY Co.,Ltd.

CP01 Change in the name or title of a patent holder