WO2022062788A1 - Interactive special effect display method and terminal - Google Patents

Interactive special effect display method and terminal

Info

Publication number
WO2022062788A1
Authority
WO
WIPO (PCT)
Prior art keywords
display area
video playback
playback interface
display
interactive
Prior art date
Application number
PCT/CN2021/113600
Other languages
English (en)
French (fr)
Inventor
王慧
赵军
徐兴灿
刘庆
马哲
Original Assignee
北京达佳互联信息技术有限公司
Priority date
Filing date
Publication date
Application filed by 北京达佳互联信息技术有限公司
Publication of WO2022062788A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 Animation

Definitions

  • the present disclosure relates to the field of Internet technologies, and in particular, to a method and terminal for displaying interactive special effects.
  • a short video application is installed in the terminal, and the user views the short video file through the short video application installed in the terminal.
  • the user can interact with the short video file through the terminal, for example, in order to express his love for the short video file, the user can like the short video file.
  • the terminal will display the interactive special effect corresponding to the interactive process.
  • Embodiments of the present disclosure provide a method and a terminal for displaying interactive special effects.
  • the technical solution is as follows:
  • a method for displaying interactive special effects includes:
  • a second display area is determined in the video playback interface based on the first display area, and the second display area is different from the first display area.
  • an interactive special effect display device is provided, and the device includes:
  • a first display unit configured to display an interactive special effect corresponding to the interactive operation in the first display area of the video playing interface in response to an interactive operation on the video playing interface
  • the determining unit is configured to, in response to the trigger time of the interactive operation being within a target time period, determine a second display area in the video playback interface based on the first display area, the second display area not overlapping the first display area;
  • the second display unit is configured to display the additional special effect corresponding to the interactive operation in the target time period in the second display area.
  • a terminal includes a processor and a memory, the memory stores at least one piece of program code, and the at least one piece of program code is loaded and executed by the processor to implement the following steps:
  • a second display area is determined in the video playback interface based on the first display area, and the second display area is different from the first display area.
  • a computer-readable storage medium is provided, where at least one piece of program code is stored in the computer-readable storage medium, and the at least one piece of program code is loaded and executed by a processor to implement the following steps:
  • a second display area is determined in the video playback interface based on the first display area, and the second display area is different from the first display area.
  • a computer program product is provided; when the program code in the computer program product is executed by a processor of a terminal, the following steps are implemented:
  • a second display area is determined in the video playback interface based on the first display area, and the second display area is different from the first display area.
  • the additional special effects corresponding to the target time period are displayed in the second display area in the video playback interface, wherein there is no overlapping area between the second display area and the first display area where the interactive special effects are located, so that the interactive special effects and additional special effects displayed in the video playback interface do not overlap, thereby optimizing the display effect of the special effects.
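  • To make the flow summarized above concrete, here is a minimal Kotlin sketch of the described steps; the names (InteractiveOp, Rect, renderEffect, proposeSecondArea) are illustrative assumptions, not an API defined by the disclosure.

```kotlin
import kotlin.random.Random

// Illustrative types; not part of the patent text.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)
data class InteractiveOp(val triggerTimeMs: Long, val x: Float, val y: Float)

// Coordinate-intersection test: two axis-aligned rectangles overlap
// if and only if they intersect on both axes.
fun overlaps(a: Rect, b: Rect): Boolean =
    a.left < b.right && b.left < a.right && a.top < b.bottom && b.top < a.bottom

fun onInteractiveOperation(
    op: InteractiveOp,
    targetPeriod: LongRange,
    firstArea: Rect,                  // first display area (interactive special effect)
    proposeSecondArea: () -> Rect     // candidate area for the additional special effect
) {
    renderEffect("interactive", firstArea)        // display the interactive special effect
    if (op.triggerTimeMs in targetPeriod) {       // trigger time within the target time period?
        var second = proposeSecondArea()
        while (overlaps(second, firstArea)) {     // keep the two display areas disjoint
            second = proposeSecondArea()
        }
        renderEffect("additional", second)        // display the additional special effect
    }
}

// Stand-in for the terminal's actual rendering call.
fun renderEffect(kind: String, area: Rect) = println("render $kind effect in $area")

fun main() {
    val screen = Rect(0f, 0f, 1080f, 1920f)
    val firstArea = Rect(480f, 900f, 600f, 1020f)
    val op = InteractiveOp(triggerTimeMs = 5_000L, x = 540f, y = 960f)
    onInteractiveOperation(op, 0L..10_000L, firstArea) {
        val x = Random.nextFloat() * (screen.right - 120f)
        val y = Random.nextFloat() * (screen.bottom - 120f)
        Rect(x, y, x + 120f, y + 120f)
    }
}
```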
  • FIG. 1 is a schematic diagram of an implementation environment involved in a method for displaying interactive special effects provided according to an exemplary embodiment
  • FIG. 2 is a flowchart of a method for displaying interactive special effects provided according to an exemplary embodiment
  • FIG. 3 is a flowchart of a method for displaying interactive special effects provided according to an exemplary embodiment
  • FIG. 4 is a schematic diagram of a video playback interface provided according to an exemplary embodiment
  • FIG. 5 is a flowchart of a method for displaying interactive special effects provided according to an exemplary embodiment
  • FIG. 6 is a schematic diagram of a video playback interface provided according to an exemplary embodiment
  • FIG. 7 is a flowchart of a method for displaying interactive special effects provided according to an exemplary embodiment
  • FIG. 8 is a schematic diagram of a video playback interface provided according to an exemplary embodiment
  • FIG. 9 is a schematic diagram of a video playback interface provided according to an exemplary embodiment.
  • FIG. 10 is a schematic diagram of a video playback interface provided according to an exemplary embodiment
  • FIG. 11 is a schematic diagram of a video playback interface provided according to an exemplary embodiment
  • Fig. 12 is a block diagram of an interactive special effect display device provided according to an exemplary embodiment
  • FIG. 13 is a schematic structural diagram of a terminal provided according to an exemplary embodiment.
  • FIG. 1 is a schematic diagram of an implementation environment involved in a method for displaying interactive special effects according to an exemplary embodiment.
  • the implementation environment includes: a terminal 101 and a server 102 .
  • the terminal 101 and the server 102 are connected through a wireless network.
  • a target application program is installed in the terminal 101, and the terminal 101 can perform data connection with the server 102 through the target application program, so as to realize functions such as data transmission and message interaction.
  • the server 102 provides services for the target application.
  • the target application is a video playback application, a short video playback application, a social networking application with a video playback function, an information browsing application, or the like.
  • the terminal 101 interacts with the video screen provided by the target application through the target application, and displays corresponding interactive special effects.
  • the terminal 101 is a mobile phone, a tablet computer, a wearable device, a computer or other electronic device.
  • the server 102 is a server, or a server cluster composed of several servers, or a cloud computing service center, which is not specifically limited in this embodiment of the present disclosure.
  • FIG. 2 is a flowchart of a method for displaying interactive special effects according to an exemplary embodiment.
  • the execution body of the method is the terminal 101 in FIG. 1 .
  • the method includes the following steps:
  • in response to an interactive operation on the video playing interface, the terminal displays an interactive special effect corresponding to the interactive operation in a first display area of the video playing interface.
  • the terminal determines a second display area in the video playback interface based on the first display area, and the second display area does not overlap with the first display area.
  • the terminal displays additional special effects corresponding to the interactive operation within the target time period in the second display area.
  • determining, based on the first display area, a second display area that does not overlap with the first display area in the video playback interface includes: determining a third display area in the video playback interface; in a case where the third display area does not overlap with the first display area, using the third display area as the second display area; or, in a case where the third display area overlaps with the first display area, determining a new third display area in the video playback interface.
  • determining the third display area in the video playback interface includes: determining a target position in the video playback interface; obtaining size information of the additional special effect; and determining, based on the size information, the third display area centered on the target position.
  • determining the target position in the video playback interface includes: determining a fourth display area in the video playback interface, where the fourth display area is the display area where the video image played in the video playback interface is located; reducing the fourth display area by a target padding to obtain a fifth display area, where the target padding is padding that matches the size information of the additional special effect; and determining the target position in the fifth display area.
  • determining a second display area in the video playback interface based on the first display area includes: determining a sixth display area in the video playback interface, where the sixth display area is a display area where an existing special effect is located in the video playback interface, and the existing special effect is an interactive special effect or an additional special effect corresponding to a previous interactive operation; determining a seventh display area in the video playback interface; in a case where the seventh display area overlaps with neither the first display area nor the sixth display area, using the seventh display area as the second display area; or, in a case where the seventh display area overlaps with the first display area or the sixth display area, determining a new seventh display area in the video playback interface.
  • displaying the interactive special effect corresponding to the interactive operation in the first display area of the video playback interface includes: determining, in the video playback interface, a first display area that does not overlap with the sixth display area; and displaying the interactive special effect corresponding to the interactive operation in the first display area.
  • determining, based on the first display area, a second display area that does not overlap with the first display area in the video playback interface includes: taking the first display area as a central area, and determining the peripheral area of the first display area as the second display area.
  • in a case where there are multiple second display areas, displaying the additional special effects corresponding to the target time period in the second display areas includes: displaying the additional special effect through each of the plurality of second display areas in turn in a clockwise order; or, displaying the additional special effect through each of the plurality of second display areas in turn in a counterclockwise order.
  • the additional special effects corresponding to the target time period are displayed in the second display area in the video playback interface, wherein there is no overlapping area between the second display area and the first display area where the interactive special effects are located, so that the interactive special effects and additional special effects displayed in the video playback interface do not overlap, thereby optimizing the display effect of the special effects.
  • FIG. 3 is a flowchart of a method for displaying interactive special effects according to an exemplary embodiment.
  • in this embodiment, the case where the terminal determines the second display area according to the existing first display area is taken as an example for description. As shown in FIG. 3, the method includes the following steps:
  • in response to an interactive operation on the video playing interface, the terminal displays an interactive special effect corresponding to the interactive operation in a first display area of the video playing interface.
  • the first display area is a display area used for displaying interactive special effects in the video playback interface.
  • the video playback interface is an interface for displaying video images.
  • the video picture is a video picture displayed to the user through the terminal and capable of transmitting information to the user.
  • the video playing interface is an interface in a video playing application or a short video playing application, or the video playing interface is an interface in an instant messaging application or an information browsing application. In the embodiments of the present disclosure, this is not specifically limited.
  • the video playing interface is a full-screen interface of the terminal, or the video playing interface is an interface of the terminal screen occupied by the video picture displayed in the terminal. In the embodiments of the present disclosure, this is not specifically limited.
  • the interactive operation is an operation received by the terminal through the video playback interface.
  • the interactive operation is a like operation, a comment operation, a forwarding operation, and the like.
  • the like operation is a single-click operation, a double-click operation, a long-press operation, etc. received by the terminal on the video playback interface.
  • the terminal displays the like button through the video playback interface, and the like operation is a trigger operation of the like button received by the terminal.
  • the comment operation is an operation of completing a comment received by the terminal.
  • the terminal displays a comment box and a finish button through the video playback interface, and in response to the comment box being triggered, the terminal receives comment information input by the user through the comment box; and in response to the finish button being triggered, the terminal determines that a comment operation is received.
  • after receiving the comment operation, the terminal directly displays the interactive special effect corresponding to the comment operation.
  • the terminal obtains comment information corresponding to the comment operation, and in response to the comment information including a preset target keyword, displays the interactive special effect corresponding to the target keyword in the comment operation.
  • the forwarding operation is a forwarding operation received by the terminal for the video picture in the video playback interface.
  • the terminal displays a forwarding button through the video playback interface, and in response to the forwarding button being triggered, the terminal executes a forwarding process, and in response to completing the forwarding process, the terminal determines that a forwarding operation has been received.
  • the interactive special effect is set in advance, and is display content corresponding to the interactive operation that is composed of at least one element such as an animation element or a picture.
  • the interactive special effect is an animation composed of icons
  • the interactive special effect is a special effect composed of pictures and animation elements.
  • the animation element is a lottie (an open source animation library) animation element or the like.
  • the interactive special effect is a word art animation marked with a target keyword or words such as "comment success", “forward success”, “like success” or "thank you for your support”, etc.
  • the terminal receives the interactive operation for the video playback interface, and determines the interactive special effect corresponding to the interactive operation.
  • the terminal obtains the interactive special effect corresponding to the interactive operation from the server in advance, and stores the obtained corresponding relationship between the interactive operation and the interactive special effect locally.
  • the terminal invokes the pre-stored corresponding relationship between the interactive operation and the interactive special effect, and acquires the interactive special effect corresponding to the interactive operation according to the corresponding relationship.
  • the terminal in response to an interactive operation on the video playing interface, acquires an interactive special effect corresponding to the interactive operation from the server.
  • the terminal sends an acquisition request to the server, where the acquisition request carries the operation identifier of the interactive operation; the server receives the acquisition request, determines the interactive special effect corresponding to the operation identifier in the acquisition request, and sends the determined interactive special effect to the terminal; correspondingly, the terminal receives the interactive special effect sent by the server.
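  • As a hedged illustration of the lookup-then-fetch behaviour described above, the following Kotlin sketch first consults a locally stored correspondence between operations and effects and falls back to a server request that carries the operation identifier; EffectServer, InteractiveEffect and EffectRepository are hypothetical names, not part of the disclosure.

```kotlin
// Hypothetical types standing in for the real effect resources and server channel.
data class InteractiveEffect(val id: String, val animationBytes: ByteArray)

interface EffectServer {
    // The acquisition request carries the operation identifier; the server resolves
    // and returns the interactive special effect corresponding to that identifier.
    fun fetchEffect(operationId: String): InteractiveEffect
}

class EffectRepository(private val server: EffectServer) {
    // Correspondence between interactive operations and effects, stored locally in advance.
    private val localCache = mutableMapOf<String, InteractiveEffect>()

    fun effectFor(operationId: String): InteractiveEffect =
        localCache.getOrPut(operationId) { server.fetchEffect(operationId) }
}

fun main() {
    val fakeServer = object : EffectServer {
        override fun fetchEffect(operationId: String) =
            InteractiveEffect(operationId, ByteArray(0))
    }
    val repo = EffectRepository(fakeServer)
    println(repo.effectFor("like").id)   // resolved via the server, then cached
    println(repo.effectFor("like").id)   // served from the local correspondence
}
```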
  • the terminal determines the first display area in the video playback interface.
  • the terminal determines, according to the interactive operation, a first display area in the video playback interface for displaying the interactive special effect.
  • the first display area is an area where an operation button corresponding to the interactive operation is located.
  • the terminal determines the operation button corresponding to the interactive operation, and further determines the first display area corresponding to the operation button.
  • the first display area is a display area around the operation button, or the first display area is any display area in a video playback interface.
  • the terminal predetermines the first display area in the video playback interface, and the terminal does not need to determine the first display area during the process of displaying the interactive special effect, which improves the efficiency of displaying the interactive special effect.
  • the first display area is a display area corresponding to the position generated by the interactive operation.
  • the terminal determines the operation position generated by the interactive operation, takes the operation position as the center position of the first display area, and obtains the first display area.
  • the terminal determines the position of the first display area according to the position generated by the interactive operation, so that the position of the first display area corresponds to the interactive operation position of the user, which improves the interestingness of the interactive operation.
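  • A tiny sketch of this centering step (the Rect type and the size values are illustrative): the operation position becomes the center of the first display area.

```kotlin
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

// Build the first display area so that the touch point of the interactive
// operation sits at its center; width and height come from the effect's size.
fun firstDisplayArea(touchX: Float, touchY: Float, effectW: Float, effectH: Float): Rect =
    Rect(touchX - effectW / 2, touchY - effectH / 2,
         touchX + effectW / 2, touchY + effectH / 2)

fun main() {
    // e.g. a double-tap "like" at (320, 540) with a 120x120 heart animation
    println(firstDisplayArea(320f, 540f, 120f, 120f))
}
```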
  • the terminal determines a third display area in the video playback interface.
  • the third display area is a preselected display area determined by the terminal in the video playback interface according to the size information of the additional special effect.
  • the additional special effect is an additional special effect displayed when the interactive operation is generated.
  • the composition form of the additional special effect is similar to that of the interactive special effect, and will not be repeated here.
  • the target time period is the time period specified by the developer.
  • the target time period is the time period in which any festival is located, or any time period in a day, or other time periods, and the like.
  • the additional special effects of different target time periods are the same or different, which are not specifically limited in the embodiments of the present disclosure. For example, if the additional special effect is a special effect corresponding to any festival, the additional special effect generated by any operation is the same, such as fireworks special effects.
  • the additional special effect is the special effect corresponding to the keyword included in the interactive operation detected within the target time period, and different keywords correspond to different special effects.
  • the additional special effects are special effects corresponding to the interactive operation of the target account during the target time period, and the additional special effects generated by different target accounts may be the same or different.
  • the target account is a member account, an account whose level exceeds a preset level, and the like.
  • the process of determining the third display area in the video playback interface by the terminal is implemented through the following steps (1)-(3), including:
  • the terminal determines the target position in the video playback interface.
  • the terminal randomly determines a position in the video playback interface, and uses the randomly determined position as the target position.
  • the terminal determines a fourth display area occupied by a displayed video image, performs a reduction process on the fourth display area, and randomly selects a target position in the reduced display area. The process is achieved through the following steps (1-1)-(1-3), including:
  • the terminal determines a fourth display area in the video playback interface, where the fourth display area is the display area where the video image played in the video playback interface is located.
  • the fourth display area is the display area in the terminal screen occupied by the video picture.
  • the terminal converts the rendering data corresponding to the video picture into coordinate data corresponding to the terminal screen coordinate system, and obtains the fourth display area occupied by the video picture in the video playing interface.
  • the terminal performs reduction processing on the fourth display area based on the target padding to obtain the fifth display area, where the target padding is the padding matching the size information of the additional special effect.
  • the fifth display area is a display area obtained by reducing the fourth display area according to the target padding.
  • the size information includes the shape of the special effect and the maximum area of the display region occupied during the display process.
  • the terminal acquires size information of the additional special effect, and determines the target padding according to the size information.
  • the matching between the target padding and the size information of the additional special effect means that the target padding is not less than half of the maximum width of the additional special effect.
  • the terminal determines the target padding according to the size information of the additional special effect, and removes the edge portion of the fourth display area according to the target padding to obtain the fifth display area.
  • the terminal determines the target position in the fifth display area.
  • the terminal randomly selects a position from the fifth display area as the target position.
  • steps (1-1)-(1-3) may be executed at any point before this step; the execution order of steps (1-1)-(1-3) is not specifically limited.
  • the terminal determines the size information of the additional special effect according to the special effect information of the additional special effect.
  • This step is similar to the process in which the terminal determines the size information of the additional special effect in step (1-2), and will not be repeated here.
  • the terminal determines, in the video playback interface, the third display area centered on the target position.
  • the terminal takes the target position as the center of the third display area, and determines the size information of the third display area according to the size information of the additional special effect, wherein the size of the third display area is not smaller than the size of the additional special effect.
  • the second display area is determined by whether the third display area overlaps with the first display area of the existing interactive special effect in the video playback interface, which improves the efficiency of determining the second display area.
  • after determining the third display area, the terminal determines whether there is an overlapping area between the third display area and the first display area. If there is no overlapping area between the third display area and the first display area, the terminal executes step 303; if there is an overlapping area between the third display area and the first display area, the terminal executes step 304. The terminal determines whether there is an overlapping area between the first display area and the third display area through a coordinate intersection judgment method (see the sketch below).
  • the overlapping of regions means that there is an intersection between different display regions, that is, there is an intersecting part; the non-overlapping region means that there is no intersection between different display regions.
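  • The padding shrink, random target position, centered third display area and coordinate-intersection check described in steps (1)-(3) and step 302 can be sketched as follows; Rect and the helper names are assumptions for illustration, and the target padding is taken as half of the effect's maximum width, matching the definition above.

```kotlin
import kotlin.random.Random

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

// (1-2) Shrink the fourth display area (the played video picture) by the target
// padding, producing the fifth display area in which target positions are chosen.
fun shrinkByPadding(fourth: Rect, padding: Float): Rect =
    Rect(fourth.left + padding, fourth.top + padding,
         fourth.right - padding, fourth.bottom - padding)

// (1-3) Randomly pick a target position inside the fifth display area.
fun randomTarget(fifth: Rect): Pair<Float, Float> =
    Pair(Random.nextFloat() * (fifth.right - fifth.left) + fifth.left,
         Random.nextFloat() * (fifth.bottom - fifth.top) + fifth.top)

// (3) The third display area is centered on the target position and is at least
// as large as the additional special effect.
fun centeredArea(cx: Float, cy: Float, w: Float, h: Float): Rect =
    Rect(cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)

// Coordinate-intersection judgment between two display areas (steps 302/303).
fun overlaps(a: Rect, b: Rect): Boolean =
    a.left < b.right && b.left < a.right && a.top < b.bottom && b.top < a.bottom

fun main() {
    val fourth = Rect(0f, 200f, 1080f, 1720f)           // area of the played video picture
    val effectW = 200f
    val effectH = 200f
    val padding = maxOf(effectW, effectH) / 2           // not less than half the maximum width
    val fifth = shrinkByPadding(fourth, padding)
    val (cx, cy) = randomTarget(fifth)
    val third = centeredArea(cx, cy, effectW, effectH)
    val first = Rect(800f, 1500f, 1000f, 1700f)         // first display area (interactive effect)
    println("candidate=$third overlapsFirst=${overlaps(third, first)}")
}
```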
  • the terminal uses the third display area as the second display area.
  • that is, the third display area is determined as the second display area for displaying the additional special effect.
  • in the case that the first display area and the third display area have an overlapping area, the terminal returns to step 302, and determines a new third display area in the video playback interface.
  • the terminal performs step 302 to re-determine a new third display area, determines whether the new third display area has an overlapping area with the first display area, and so on, until there is no overlapping area between the first display area and the new third display area; the new third display area is then determined as the second display area.
  • the new third display area and the original third display area are different display areas.
  • the second display area is determined by determining whether the third display area overlaps with the first display area of the existing interactive special effect in the video playback interface, which prevents the selected second display area from overlapping with the first display area and optimizes the display effect.
  • the terminal also counts the number of times the third display area is repeatedly determined; in response to the number of times the terminal determines the third display area exceeding the preset threshold, the terminal directly determines the first-determined third display area as the second display area. The process is: the terminal counts the number of times the third display area is determined in the video playback interface; and in response to the number of times exceeding a preset threshold, the first-determined third display area is determined as the second display area.
  • the preset threshold is set as required, and in this embodiment of the present disclosure, the preset threshold is not specifically limited.
  • the preset threshold is 10, 15, or 20, and so on.
  • the number of times of re-determining the third display area is counted, and when the number of times exceeds a preset threshold, the third display area is not repeatedly determined, thereby ensuring that a display area for the additional special effect can be determined within a preset time (see the sketch below).
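  • A sketch of this re-determination loop with the preset threshold fallback; all names are illustrative, and the candidate generator is passed in so the sketch stays short.

```kotlin
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

fun overlaps(a: Rect, b: Rect): Boolean =
    a.left < b.right && b.left < a.right && a.top < b.bottom && b.top < a.bottom

// Repeatedly determine a third display area until one does not overlap the first
// display area; once the attempt count exceeds the preset threshold, fall back to the
// first-determined third display area so that a display area is found in bounded time.
fun determineSecondArea(
    firstArea: Rect,
    presetThreshold: Int,
    proposeThirdArea: () -> Rect
): Rect {
    val firstCandidate = proposeThirdArea()
    var candidate = firstCandidate
    var attempts = 1
    while (overlaps(candidate, firstArea)) {
        if (attempts > presetThreshold) return firstCandidate   // stop re-determining
        candidate = proposeThirdArea()                          // new third display area
        attempts++
    }
    return candidate                                            // used as the second display area
}

fun main() {
    val first = Rect(100f, 100f, 300f, 300f)
    // toy generator: the first two candidates overlap the first area, the third does not
    val candidates = listOf(
        Rect(150f, 150f, 350f, 350f),
        Rect(250f, 150f, 450f, 350f),
        Rect(400f, 150f, 600f, 350f)
    )
    var i = 0
    println(determineSecondArea(first, presetThreshold = 10) { candidates[i++] })
}
```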
  • the terminal displays additional special effects corresponding to the interactive operation in the target time period in the second display area.
  • the terminal renders the additional special effect to the video playback interface corresponding to the second display area.
  • FIG. 4 shows a schematic diagram of a video playback interface provided according to an exemplary embodiment.
  • the video playback interface includes labels such as "Home", "Featured", "Columns", and "Me"; the video playback interface also includes the currently displayed video picture, the account of the publisher of the video picture, the video title corresponding to the video picture, the background music used, and the like, as well as the publisher's avatar, a follow tag, a like button, a comment button, a share button, and so on.
  • An additional special effect is displayed in the second display area, and the additional special effect is a holiday icon, for example, the additional special effect is an icon with the words "Happy Holidays".
  • An interactive special effect is displayed in the first display area, and the interactive special effect is a heart-shaped icon generated after a like.
  • the additional special effects corresponding to the target time period are displayed in the second display area in the video playback interface, wherein the second display area does not have an overlapping area with the first display area where the interactive special effects are located, so that the interactive special effects and additional special effects displayed in the video playback interface do not overlap, thereby optimizing the display effect of the special effects.
  • the terminal continuously receives the interactive operations input by the user, each interactive operation generates corresponding interactive special effects and additional special effects, and each interactive special effect and additional special effect is displayed in the video playback interface for a preset duration. Therefore, there may be situations where multiple special effects are displayed at the same time in the video playback interface.
  • FIG. 5 is a flowchart of a method for displaying interactive special effects according to an exemplary embodiment.
  • in this embodiment, the case where the terminal determines the second display area according to the existing first display area and the sixth display area where existing special effects are located is taken as an example for description. As shown in FIG. 5, the method includes the following steps:
  • in response to an interactive operation on a video playing interface, the terminal displays an interactive special effect corresponding to the interactive operation in a first display area of the video playing interface.
  • This step is similar to step 301 and will not be repeated here.
  • the terminal determines at least one sixth display area in the video playback interface.
  • the at least one sixth display area is a display area where the existing special effects are located in the video playback interface.
  • the existing special effects are the interactive special effects and additional special effects generated by the interactive operation before the current interactive operation.
  • the at least one sixth display area includes one sixth display area, or a plurality of (two or more) sixth display areas.
  • the terminal determines a seventh display area in the video playback interface.
  • This step is similar to step 302 and will not be repeated here.
  • in a case where the seventh display area does not overlap with the first display area or the at least one sixth display area, the terminal takes the seventh display area as the second display area.
  • This step is similar to step 303 and will not be repeated here.
  • if there is an overlapping area between the first display area and the seventh display area, or there is an overlapping area between the seventh display area and the at least one sixth display area, the terminal returns to step 503 and determines a new seventh display area in the video playback interface (see the sketch below).
  • This step is similar to step 304 and will not be repeated here.
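  • The check against both the first display area and every sixth display area can be sketched like this (again with illustrative names); the occupied areas are simply collected into a list.

```kotlin
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

fun overlaps(a: Rect, b: Rect): Boolean =
    a.left < b.right && b.left < a.right && a.top < b.bottom && b.top < a.bottom

// The seventh display area is acceptable only if it overlaps neither the first
// display area nor any sixth display area holding an existing special effect.
fun isFree(seventh: Rect, firstArea: Rect, sixthAreas: List<Rect>): Boolean =
    !overlaps(seventh, firstArea) && sixthAreas.none { overlaps(seventh, it) }

fun main() {
    val first = Rect(100f, 100f, 300f, 300f)
    val sixthAreas = listOf(Rect(400f, 400f, 500f, 500f))            // existing special effects
    println(isFree(Rect(600f, 600f, 700f, 700f), first, sixthAreas)) // true: usable as second area
    println(isFree(Rect(250f, 250f, 450f, 450f), first, sixthAreas)) // false: overlaps the first area
}
```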
  • the terminal displays additional special effects corresponding to the interactive operation within the target time period in the second display area.
  • This step is similar to step 305 and will not be repeated here.
  • FIG. 6 shows a schematic diagram of a video playing interface provided according to an exemplary embodiment.
  • the second display area displays additional special effects of this interactive operation corresponding to the target time period, and the additional special effects are holiday icons, for example, the additional special effects are icons with the words "Happy Holidays".
  • the interactive special effect corresponding to this interactive operation is displayed in the first display area.
  • the interactive special effect is a heart-shaped icon generated after a like.
  • the video playback interface also displays the interactive special effects or additional special effects corresponding to interactive operations before this interactive operation, for example, an icon with the words "forwarding successful" corresponding to a previous forwarding operation, or additional special effects corresponding to other interactive operations.
  • the terminal determines the first display area, that is, determines a display area that does not have an overlapping area with at least one sixth display area.
  • the process of determining the first display area by the terminal is implemented through the following steps (1)-(2), including:
  • the terminal determines a first display area that does not overlap with at least one sixth display area in the video playback interface.
  • This step is similar to steps 502-505 and will not be repeated here.
  • the terminal displays the interactive special effect corresponding to the interactive operation in the first display area.
  • the terminal determines the first display area according to the display area that does not overlap with the current at least one sixth display area, so that the interactive special effect will not overlap with the existing special effects, thereby optimizing the display effect of special effects.
  • the additional special effects are displayed in the second display area in the video playback interface, wherein the second display area and the first display area where the interactive special effects are located do not have an overlapping area, so that the interactive special effects and additional special effects displayed in the video playback interface do not overlap, thereby optimizing the display effect of the special effects.
  • the terminal determines the second display area according to the display area that does not overlap with the current at least one sixth display area, so that the additional special effect will not overlap with the existing special effects during display, thereby optimizing the display effect of special effects.
  • the terminal determines the display area of the interactive special effect and the additional special effect according to the operation position of the interactive operation, and in the display area, the interactive special effect and the additional special effect are displayed without overlapping.
  • FIG. 7 is a flowchart of a method for displaying interactive special effects according to an exemplary embodiment.
  • in this embodiment, the case where the terminal determines the first display area and the second display area according to the operation position of the interactive operation is taken as an example for description. As shown in FIG. 7, the method includes the following steps:
  • in response to an interactive operation on the video playing interface, the terminal displays an interactive special effect corresponding to the interactive operation in a first display area of the video playing interface.
  • This step is similar to step 301 and will not be repeated here.
  • the terminal determines the first display area; taking the first display area as the central area, the terminal determines the peripheral area of the first display area as the second display area.
  • the terminal detects the interactive operation, and determines the operation position of the interactive operation in the video playback interface.
  • the second display area is a single display area.
  • FIG. 8 shows a schematic diagram of a video playing interface provided according to an exemplary embodiment.
  • the first display area is a circular area
  • the second display area is an annular area around the first display area.
  • the second display area is a plurality of display areas, see FIG. 9 , which shows a schematic diagram of a video playback interface provided according to an exemplary embodiment.
  • the first display area is a circular area
  • the plurality of second display areas are a plurality of fan-shaped areas around the first display area.
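  • One way to picture the fan-shaped second display areas is to split the ring around the center of the first display area into equal angular sectors; the sketch below computes each sector's angular range and a representative center point. The geometry is purely illustrative and is not mandated by the disclosure.

```kotlin
import kotlin.math.cos
import kotlin.math.sin

// A fan-shaped (sector) region around the circular first display area,
// described by its angular range and a representative center point.
data class Sector(val startDeg: Double, val endDeg: Double, val centerX: Double, val centerY: Double)

fun sectorsAround(cx: Double, cy: Double, innerRadius: Double, outerRadius: Double, count: Int): List<Sector> {
    val step = 360.0 / count
    val midRadius = (innerRadius + outerRadius) / 2
    return (0 until count).map { i ->
        val start = i * step
        val midAngleRad = Math.toRadians(start + step / 2)
        Sector(start, start + step,
               cx + midRadius * cos(midAngleRad),
               cy + midRadius * sin(midAngleRad))
    }
}

fun main() {
    // e.g. six sectors around a circular first display area centered at (540, 960)
    sectorsAround(540.0, 960.0, 80.0, 200.0, 6).forEach(::println)
}
```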
  • the terminal displays additional special effects corresponding to the interactive operation within the target time period in the second display area.
  • the terminal displays interactive special effects in the first display area, and displays additional special effects in the second display area.
  • as shown in FIG. 8, the additional special effects are displayed radially in the annular second display area, and are displayed simultaneously in the annular second display area.
  • FIG. 10 shows a schematic diagram of a video playback interface provided according to an exemplary embodiment.
  • the terminal displays additional special effects corresponding to the target time period through each of the plurality of second display areas in a clockwise order.
  • FIG. 11 shows a schematic diagram of a video playback interface provided according to an exemplary embodiment.
  • the terminal displays additional special effects corresponding to the target time period through each of the plurality of second display areas in turn counterclockwise.
  • the additional special effects displayed in the plurality of second display areas are additional special effects corresponding to the same interactive operation, or, the additional special effects displayed in the plurality of second display areas are additional special effects corresponding to different interactive operations, In the embodiments of the present disclosure, this is not specifically limited.
  • when the additional special effects displayed in the plurality of second display areas are additional special effects corresponding to the same interactive operation, each interactive operation generates a plurality of additional special effects, which are the same as or different from each other, and the plurality of additional special effects are displayed in the plurality of second display areas in turn in a clockwise or counterclockwise direction; when the additional special effects displayed in the plurality of second display areas are additional special effects corresponding to different interactive operations, each interactive operation generates one additional special effect.
  • the terminal determines the second display area of the additional special effect corresponding to the current interactive operation according to the second display area of the additional special effect corresponding to the previous interactive operation, wherein the two second display areas are adjacent.
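  • The adjacency rule above amounts to stepping the sector index by one for each new additional special effect, with the step direction giving the clockwise or counterclockwise order; a minimal index-based sketch follows (names assumed for illustration).

```kotlin
enum class Direction { CLOCKWISE, COUNTERCLOCKWISE }

// Given the index of the second display area used for the previous interactive
// operation, pick the adjacent second display area for the current operation.
// Whether +1 means clockwise depends on how the sectors are numbered; this
// sketch assumes increasing index corresponds to the clockwise direction.
fun nextSecondAreaIndex(previousIndex: Int, areaCount: Int, direction: Direction): Int =
    when (direction) {
        Direction.CLOCKWISE -> (previousIndex + 1) % areaCount
        Direction.COUNTERCLOCKWISE -> (previousIndex - 1 + areaCount) % areaCount
    }

fun main() {
    var index = 0
    repeat(5) {
        index = nextSecondAreaIndex(index, areaCount = 6, direction = Direction.COUNTERCLOCKWISE)
        println("display the additional special effect in second display area #$index")
    }
}
```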
  • the first display area is a display area corresponding to the position where the interactive operation is generated in the video playback interface.
  • the terminal determines the generation coordinates of the interactive operation, and determines the coordinates as the position of the interactive operation.
  • the first display area is an operation position predetermined according to the type of the interactive operation when an interactive operation is generated in the video playback interface.
  • the terminal determines the operation type of the interactive operation, and according to the operation type, determines the operation position corresponding to the operation type from the corresponding relationship between the operation type and the operation position stored in the terminal.
  • the first display area is an operation position that does not overlap with other display areas in the video playback interface.
  • the terminal detects the interactive operation, and determines the current operation position that does not overlap with other operation positions from the video playback interface.
  • the additional special effects corresponding to the target time period are displayed in the second display area in the video playback interface, wherein there is no overlapping area between the second display area and the first display area where the interactive special effects are located, so that the interactive special effects and additional special effects displayed in the video playback interface do not overlap, thereby optimizing the display effect of the special effects.
  • Fig. 12 is a block diagram of an interactive special effect display device provided according to an exemplary embodiment. Referring to Figure 12, the device includes:
  • the first display unit 1201 is configured to, in response to the interactive operation on the video playing interface, display the interactive special effect corresponding to the interactive operation in the first display area of the video playing interface;
  • the determining unit 1202 is configured to, in response to the trigger time of the interactive operation being within the target time period, determine a second display area in the video playback interface based on the first display area, the second display area not overlapping the first display area;
  • the second display unit 1203 is configured to display additional special effects corresponding to the interactive operation in the target time period in the second display area.
  • the determining unit 1202 includes:
  • a first determination subunit configured to determine a third display area in the video playback interface
  • the second determination subunit is configured to use the third display area as the second display area when the first display area and the third display area do not have an overlapping area; or,
  • the second determination subunit is configured to determine a new third display area in the video playback interface when the first display area and the third display area have an overlapping area.
  • the second determination subunit is configured to determine the target position in the video playback interface; obtain size information of the additional special effect; based on the size information of the additional special effect, in the video playback interface, determine the third display area centered on the target position.
  • the second determination subunit is configured to determine a fourth display area in the video playback interface, where the fourth display area is the display area where the video image played in the video playback interface is located; reduce the fourth display area by a target padding to obtain a fifth display area, where the target padding is padding that matches the size information of the additional special effect; and determine the target position in the fifth display area.
  • the determining unit 1202 includes:
  • the third determination subunit is configured to determine a sixth display area in the video playback interface, where the sixth display area is a display area where an existing special effect is located in the video playback interface, and the existing special effect is the previous interactive operation Corresponding interactive special effects or additional special effects;
  • a fourth determination subunit configured to determine a seventh display area in the video playback interface
  • the fifth determination subunit is configured to, when the first display area and the seventh display area do not have an overlapping area, and the seventh display area and the sixth display area do not have an overlapping area, determine the seventh display area as the second display area; or,
  • the fifth determining subunit is configured to, when the first display area and the seventh display area have an overlapping area, or when the seventh display area and the sixth display area have an overlapping area, determine a new seventh display area in the video playback interface.
  • the first display unit 1201 is configured to determine, in the video playback interface, a first display area that does not overlap with the sixth display area, and to display the interactive special effect corresponding to the interactive operation in the first display area.
  • the determining unit 1202 includes:
  • the sixth determining subunit is configured to take the first display area as a central area, and determine a peripheral area of the first display area as the second display area.
  • the second display unit 1203 is configured to display the additional special effect through each second display area of the multiple second display areas in turn in a clockwise order; or,
  • the second display unit 1203 is configured to display the additional special effect through each second display area of the plurality of second display areas in turn counterclockwise.
  • the additional special effects corresponding to the target time period are displayed in the second display area in the video playback interface, wherein there is no overlapping area between the second display area and the first display area where the interactive special effects are located, so that the interactive special effects and additional special effects displayed in the video playback interface do not overlap, thereby optimizing the display effect of the special effects.
  • when the interactive special effect display device provided by the above-mentioned embodiments displays interactive special effects, the division of the above-mentioned functional modules is merely used as an example for illustration; in practical applications, the above-mentioned functions can be allocated to different functional modules as required, that is, the internal structure of the device can be divided into different functional modules to complete all or part of the functions described above.
  • the interactive special effect display device and the interactive special effect display method provided by the above embodiments belong to the same concept, and the specific implementation process is detailed in the method embodiment, which will not be repeated here.
  • FIG. 13 is a schematic structural diagram of a terminal 1300 provided according to an exemplary embodiment.
  • the terminal 1300 is a portable mobile terminal, such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop or a desktop computer.
  • Terminal 1300 may also be called user equipment, portable terminal, laptop terminal, desktop terminal, and the like by other names.
  • the terminal 1300 includes: a processor 1301 and a memory 1302 .
  • the processor 1301 includes one or more processing cores, such as a 4-core processor, an 8-core processor, and the like. In some embodiments, the processor 1301 is implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). In some embodiments, the processor 1301 also includes a main processor and a coprocessor; the main processor is a processor for processing data in a wake-up state, also referred to as a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in a standby state.
  • the processor 1301 is integrated with a GPU (Graphics Processing Unit), and the GPU is used for rendering and drawing the content that needs to be displayed on the display screen.
  • the processor 1301 further includes an AI (Artificial Intelligence) processor, where the AI processor is used to process computing operations related to machine learning.
  • memory 1302 includes one or more computer-readable storage media that are non-transitory. In some embodiments, memory 1302 also includes high-speed random access memory and non-volatile memory, such as one or more disk storage devices or flash storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 1302 is used to store at least one instruction, and the at least one instruction is used to be executed by the processor 1301 to implement the interactive special effect display method provided by the method embodiments of the present disclosure.
  • the terminal 1300 may optionally further include: a peripheral device interface 1303 and at least one peripheral device.
  • the processor 1301, the memory 1302 and the peripheral device interface 1303 are connected by a bus or a signal line.
  • each peripheral device is connected to the peripheral device interface 1303 through a bus, signal line or circuit board.
  • the peripheral device includes: at least one of a radio frequency circuit 1304 , a display screen 1305 , a camera assembly 1306 , an audio circuit 1307 , a positioning assembly 1308 and a power supply 1309 .
  • the peripheral device interface 1303 may be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1301 and the memory 1302.
  • the processor 1301, the memory 1302, and the peripheral device interface 1303 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1301, the memory 1302, and the peripheral device interface 1303 are implemented on a separate chip or circuit board, which is not limited in this embodiment.
  • the radio frequency circuit 1304 is used for receiving and transmitting RF (Radio Frequency, radio frequency) signals, also called electromagnetic signals.
  • the radio frequency circuit 1304 communicates with communication networks and other communication devices via electromagnetic signals.
  • the radio frequency circuit 1304 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
  • radio frequency circuitry 1304 includes: an antenna system, an RF transceiver, one or more amplifiers, tuners, oscillators, digital signal processors, codec chipsets, subscriber identity module cards, and the like.
  • radio frequency circuitry 1304 communicates with other terminals via at least one wireless communication protocol.
  • the wireless communication protocol includes but is not limited to: the World Wide Web, metropolitan area networks, intranets, various generations of mobile communication networks (2G, 3G, 4G and 5G), wireless local area networks and/or WiFi (Wireless Fidelity) networks.
  • the radio frequency circuit 1304 further includes a circuit related to NFC (Near Field Communication, short-range wireless communication), which is not limited in the present disclosure.
  • the display screen 1305 is used to display UI (User Interface, user interface).
  • the UI includes graphics, text, icons, video, and any combination thereof.
  • the display screen 1305 also has the ability to acquire touch signals on or above the surface of the display screen 1305 .
  • the touch signal is input to the processor 1301 as a control signal for processing.
  • the display screen 1305 is also used to provide virtual buttons and/or virtual keyboards, also called soft buttons and/or soft keyboards.
  • the display screen 1305 is a flexible display screen disposed on a curved or folded surface of the terminal 1300; the display screen 1305 can even be set as a non-rectangular irregular figure, that is, a special-shaped screen.
  • the display screen 1305 is made of materials such as LCD (Liquid Crystal Display, liquid crystal display), OLED (Organic Light-Emitting Diode, organic light emitting diode).
  • the camera assembly 1306 is used to capture images or video.
  • camera assembly 1306 includes a front-facing camera and a rear-facing camera.
  • the front camera is arranged on the front panel of the terminal, and the rear camera is arranged on the back of the terminal.
  • there are at least two rear cameras, each of which is any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blur function, and the main camera and the wide-angle camera are fused to realize panoramic shooting, VR (Virtual Reality) shooting functions or other fused shooting functions.
  • the camera assembly 1306 also includes a flash.
  • the flash is a single color temperature flash, and in some embodiments, the flash is a dual color temperature flash. Dual color temperature flash refers to the combination of warm light flash and cold light flash, which is used for light compensation under different color temperatures.
  • the audio circuit 1307 includes a microphone and a speaker.
  • the microphone is used to collect the sound waves of the user and the environment, convert the sound waves into electrical signals, and input them to the processor 1301 for processing, or to the radio frequency circuit 1304 to realize voice communication.
  • there are multiple microphones which are respectively disposed in different parts of the terminal 1300 .
  • the microphones are array microphones or omnidirectional collection microphones.
  • the speaker is used to convert the electrical signal from the processor 1301 or the radio frequency circuit 1304 into sound waves.
  • the loudspeaker is a conventional thin-film loudspeaker, and in some embodiments, the loudspeaker is a piezoelectric ceramic loudspeaker.
  • when the speaker is a piezoelectric ceramic speaker, it can not only convert electrical signals into sound waves audible to humans, but also convert electrical signals into sound waves inaudible to humans for distance measurement and other purposes.
  • the audio circuit 1307 also includes a headphone jack.
  • the positioning component 1308 is used to locate the current geographic location of the terminal 1300 to implement navigation or LBS (Location Based Service).
  • the positioning component 1308 is a positioning component based on the GPS (Global Positioning System, Global Positioning System) of the United States, the Beidou system of China or the Galileo system of Russia.
  • the power supply 1309 is used to power various components in the terminal 1300 .
  • the power source 1309 is alternating current, direct current, a disposable battery, or a rechargeable battery.
  • the rechargeable battery is a wired rechargeable battery or a wireless rechargeable battery. Wired rechargeable batteries are batteries that are charged through wired lines, and wireless rechargeable batteries are batteries that are charged through wireless coils.
  • the rechargeable battery is also used to support fast charging technology.
  • the terminal 1300 also includes one or more sensors 1310 .
  • the one or more sensors 1310 include, but are not limited to, an acceleration sensor 1311 , a gyro sensor 1312 , a pressure sensor 1313 , a fingerprint sensor 1314 , an optical sensor 1315 and a proximity sensor 1316 .
  • the acceleration sensor 1311 detects the magnitude of acceleration on the three coordinate axes of the coordinate system established by the terminal 1300 .
  • the acceleration sensor 1311 is used to detect the components of the gravitational acceleration on the three coordinate axes.
  • the processor 1301 controls the display screen 1305 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1311 .
  • the acceleration sensor 1311 is also used for game or user movement data collection.
  • the gyroscope sensor 1312 detects the body direction and rotation angle of the terminal 1300 , and the gyroscope sensor 1312 cooperates with the acceleration sensor 1311 to collect 3D actions of the user on the terminal 1300 .
  • the processor 1301 can implement the following functions according to the data collected by the gyro sensor 1312: motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
  • the pressure sensor 1313 is disposed on the side frame of the terminal 1300 and/or the lower layer of the display screen 1305 .
  • the pressure sensor 1313 can detect the user's holding signal of the terminal 1300 , and the processor 1301 performs left and right hand identification or shortcut operations according to the holding signal collected by the pressure sensor 1313 .
  • the processor 1301 controls the operability controls on the UI interface according to the user's pressure operation on the display screen 1305.
  • the operability controls include at least one of button controls, scroll bar controls, icon controls, and menu controls.
  • the fingerprint sensor 1314 is used to collect the user's fingerprint, and the processor 1301 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 1314, or the fingerprint sensor 1314 identifies the user's identity according to the collected fingerprint. When the user's identity is identified as a trusted identity, the processor 1301 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings.
  • the fingerprint sensor 1314 is disposed on the front, back, or side of the terminal 1300 . When the terminal 1300 is provided with a physical button or a manufacturer's logo, the fingerprint sensor 1314 is integrated with the physical button or the manufacturer's logo.
  • Optical sensor 1315 is used to collect ambient light intensity.
  • the processor 1301 controls the display brightness of the display screen 1305 according to the ambient light intensity collected by the optical sensor 1315 . Specifically, when the ambient light intensity is high, the display brightness of the display screen 1305 is increased; when the ambient light intensity is low, the display brightness of the display screen 1305 is decreased.
  • the processor 1301 also dynamically adjusts the shooting parameters of the camera assembly 1306 according to the ambient light intensity collected by the optical sensor 1315 .
  • a proximity sensor 1316 also called a distance sensor, is usually provided on the front panel of the terminal 1300.
  • the proximity sensor 1316 is used to collect the distance between the user and the front of the terminal 1300 .
  • when the proximity sensor 1316 detects that the distance between the user and the front of the terminal 1300 gradually decreases, the processor 1301 controls the display screen 1305 to switch from the bright-screen state to the off-screen state; when the proximity sensor 1316 detects that the distance between the user and the front of the terminal 1300 gradually increases, the processor 1301 controls the display screen 1305 to switch from the off-screen state to the bright-screen state.
  • the structure shown in FIG. 13 does not constitute a limitation on the terminal 1300, which can include more or fewer components than shown, combine some components, or adopt a different arrangement of components.
  • a computer-readable storage medium is also provided, where at least one piece of program code is stored in the computer-readable storage medium, and the at least one piece of program code is loaded and executed by a terminal to implement the interactive special effect display method in the foregoing embodiments.
  • the computer-readable storage medium is a memory.
  • the computer-readable storage medium is a ROM (Read-Only Memory), a RAM (Random Access Memory), a CD-ROM (Compact Disc Read-Only Memory), a magnetic tape, a floppy disk, an optical data storage device, or the like.
  • a computer program product is also provided; when the program code in the computer program product is executed by a processor of the terminal, the interactive special effect display method according to the embodiments of the present disclosure is implemented.

Abstract

本公开关于一种互动特效展示方法及终端。方法包括:响应于针对视频播放界面的互动操作,在该视频播放界面的第一显示区域中展示该互动操作对应的互动特效;响应于该互动操作的触发时间在目标时间段内,基于该第一显示区域,在该视频播放界面中确定第二显示区域,该第二显示区域与该第一显示区域不重叠;在该第二显示区域中展示该目标时间段内该互动操作对应的附加特效。

Description

互动特效展示方法及终端
本公开基于申请日为2020年09月28日、申请号为202011045824.2的中国专利申请提出,并要求该中国专利申请的优先权,该中国专利申请的全部内容在此引入本公开作为参考。
技术领域
本公开涉及互联网技术领域,特别涉及一种互动特效展示方法及终端。
背景技术
随着互联网技术的发展,用户能够通过终端进行的活动越来越丰富。例如,终端中安装短视频应用,用户通过终端中安装的短视频应用查看短视频文件。在此过程中,用户能够通过终端与该短视频文件进行互动,例如,为了表示对短视频文件的喜爱,用户能够对该短视频文件进行点赞。相应的,终端会展示该互动过程对应的互动特效。
发明内容
本公开实施例提供了一种互动特效展示方法及终端。所述技术方案如下:
根据本公开实施例的一方面,提供了一种互动特效展示方法,所述方法包括:
响应于针对视频播放界面的互动操作,在所述视频播放界面的第一显示区域中展示所述互动操作对应的互动特效;
响应于所述互动操作的触发时间在目标时间段内,基于所述第一显示区域,在所述视频播放界面中确定第二显示区域,所述第二显示区域与所述第一显示区域不重叠;
在所述第二显示区域中展示所述目标时间段内所述互动操作对应的附加特效。
根据本公开实施例的另一方面,提供了一种互动特效展示装置,所述装置包括:
第一展示单元,被配置为响应于针对视频播放界面的互动操作,执行在所述视频播放界面的第一显示区域中展示所述互动操作对应的互动特效;
确定单元,被配置为响应于所述互动操作的触发时间在目标时间段内,执行基于所述第一显示区域,在所述视频播放界面中确定第二显示区域,所述第二显示区域与所述第一显示区域不重叠;
第二展示单元,被配置为执行在所述第二显示区域中展示所述目标时间段内所述互动操作对应的附加特效。
根据本公开实施例的另一方面,提供了一种终端,所述终端包括处理器和存储器,所述存储器中存储有至少一条程序代码,所述至少一条程序代码由所述处理器加载并执行,以实现如下步骤:
响应于针对视频播放界面的互动操作,在所述视频播放界面的第一显示区域中展示所述互动操作对应的互动特效;
响应于所述互动操作的触发时间在目标时间段内,基于所述第一显示区域,在所述视频播放界面中确定第二显示区域,所述第二显示区域与所述第一显示区域不重叠;
在所述第二显示区域中展示所述目标时间段内所述互动操作对应的附加特效。
根据本公开实施例的另一方面,提供了一种计算机可读存储介质,所述计算机可读存储介质中存储有至少一条程序代码,所述至少一条程序代码由处理器加载并执行,以实现如下 步骤:
响应于针对视频播放界面的互动操作,在所述视频播放界面的第一显示区域中展示所述互动操作对应的互动特效;
响应于所述互动操作的触发时间在目标时间段内,基于所述第一显示区域,在所述视频播放界面中确定第二显示区域,所述第二显示区域与所述第一显示区域不重叠;
在所述第二显示区域中展示所述目标时间段内所述互动操作对应的附加特效。
根据本公开实施例的另一方面,提供了一种计算机程序产品,在所述计算机程序产品中的程序代码由终端的处理器执行的情况下,以实现如下步骤:
响应于针对视频播放界面的互动操作,在所述视频播放界面的第一显示区域中展示所述互动操作对应的互动特效;
响应于所述互动操作的触发时间在目标时间段内,基于所述第一显示区域,在所述视频播放界面中确定第二显示区域,所述第二显示区域与所述第一显示区域不重叠;
在所述第二显示区域中展示所述目标时间段内所述互动操作对应的附加特效。
在本公开实施例中,将目标时间段对应的附加特效展示在视频播放界面中的第二显示区域中,其中,第二显示区域与互动特效所在的第一显示区域不存在重叠区域,使得视频播放界面中展示的互动特效和附加特效不重叠,进而优化了特效的显示效果。
附图说明
图1是根据一示例性实施例提供的一种互动特效展示方法所涉及的实施环境的示意图;
图2是根据一示例性实施例提供的一种互动特效展示方法流程图;
图3是根据一示例性实施例提供的一种互动特效展示方法流程图;
图4是根据一示例性实施例提供的一种视频播放界面的示意图;
图5是根据一示例性实施例提供的一种互动特效展示方法流程图;
图6是根据一示例性实施例提供的一种视频播放界面的示意图;
图7是根据一示例性实施例提供的一种互动特效展示方法流程图;
图8是根据一示例性实施例提供的一种视频播放界面的示意图;
图9是根据一示例性实施例提供的一种视频播放界面的示意图;
图10是根据一示例性实施例提供的一种视频播放界面的示意图;
图11是根据一示例性实施例提供的一种视频播放界面的示意图;
图12是根据一示例性实施例提供的一种互动特效展示装置的框图;
图13是根据一示例性实施例提供的一种终端的结构示意图。
具体实施方式
图1为根据一示例性实施例提供的一种互动特效展示方法所涉及的实施环境的示意图。参见图1,该实施环境包括:终端101和服务器102。其中,终端101和服务器102之间通过无线网络连接。
终端101中安装有目标应用程序,终端101能够通过该目标应用程序与服务器102进行数据连接,以实现例如数据传输、消息交互等功能。其中,服务器102为该目标应用程序提供服务。在一些实施例中,该目标应用程序为视频播放应用程序、短视频播放应用程序、具有视频播放功能的社交应用程序或信息浏览应用程序等。相应的,终端101通过该目标应用程序与该目标应用程序提供的视频画面进行交互,展示相应的互动特效。
在一些实施例中,该终端101为手机、平板电脑、可穿戴设备、电脑或其他电子设备。服务器102是一台服务器,或者由若干台服务器组成的服务器集群,或者是一个云计算服务中心,在本公开实施例中,对此不作具体限定。
图2为根据一示例性实施例提供的一种互动特效展示方法流程图。该方法的执行主体为图1中的终端101,如图2所示,该方法包括以下步骤:
在201中,响应于针对视频播放界面的互动操作,终端在该视频播放界面的第一显示区域中展示该互动操作对应的互动特效。
在202中,响应于该互动操作的触发时间在目标时间段内,终端基于该第一显示区域,在该视频播放界面中确定第二显示区域,第二显示区域与第一显示区域不重叠。
在203中,终端在该第二显示区域中展示该目标时间段内该互动操作对应的附加特效。
在一些实施例中,该基于该第一显示区域,在该视频播放界面中确定与该第一显示区域不重叠的第二显示区域,包括:
在该视频播放界面中确定第三显示区域;
在该第一显示区域和该第三显示区域不存在重叠区域的情况下,将该第三显示区域作为该第二显示区域;或者,
在该第一显示区域和该第三显示区域存在重叠区域的情况下,在该视频播放界面中确定新的第三显示区域。
在一些实施例中,该在该视频播放界面中确定第三显示区域,包括:
在该视频播放界面中确定目标位置;
获取该附加特效的尺寸信息;
基于该附加特效的尺寸信息,在该视频播放界面中,确定以该目标位置为中心的该第三显示区域。
在一些实施例中,该在该视频播放界面中确定目标位置,包括:
在该视频播放界面中确定第四显示区域,该第四显示区域为该视频播放界面中播放的视频画面所在的显示区域;
以目标内边距,对该第四显示区域进行缩小处理,得到第五显示区域,该目标内边距为与该附加特效的尺寸信息匹配的内边距;
在该第五显示区域中,确定该目标位置。
在一些实施例中,该基于该第一显示区域,在该视频播放界面中确定第二显示区域,包括:
在该视频播放界面中确定第六显示区域,该第六显示区域为该视频播放界面中已有特效所在的显示区域,该已有特效为之前的互动操作对应的互动特效或附加特效;
在该视频播放界面中确定第七显示区域;
在该第一显示区域和该第七显示区域不存在重叠区域,且,该第七显示区域和该第六显示区域不存在重叠区域的情况下,将该第七显示区域作为该第二显示区域;
在该第一显示区域和该第七显示区域存在重叠区域,或,该第七显示区域和该第六显示区域存在重叠区域的情况下,在该视频播放界面中确定新的第七显示区域。
在一些实施例中,该在该视频播放界面的第一显示区域中展示该互动操作对应的互动特效,包括:
在该视频播放界面中确定与该第六显示区域不重叠的第一显示区域;
将该互动操作对应的互动特效展示在该第一显示区域中。
在一些实施例中,该基于该第一显示区域,在该视频播放界面中确定与该第一显示区域不重叠的第二显示区域,包括:
以该第一显示区域为中心区域,将该第一显示区域的外围区域确定为该第二显示区域。
在一些实施例中,该第二显示区域为多个,该在该第二显示区域中展示该目标时间段对应的附加特效,包括:
顺时针依次通过该多个第二显示区域中的各个第二显示区域分别展示该附加特效;或者,
逆时针依次通过该多个第二显示区域中的各个第二显示区域分别展示该附加特效。
在本公开实施例中,将目标时间段对应的附加特效展示在视频播放界面中的第二显示区域中,其中,第二显示区域与互动特效所在的第一显示区域不存在重叠区域,使得视频播放界面中展示的互动特效和附加特效不重叠,进而优化了特效的显示效果。
在本公开实施例中,终端接收到互动操作时,确定该互动操作对应的互动特效以及该互动操作产生的附加特效。图3为根据一示例性实施例提供的一种互动特效展示方法流程图。在本公开实施例中,以终端根据已有的第一显示区域确定第二显示区域为例进行说明。如图3所示,该方法包括以下步骤:
在301中,响应于针对视频播放界面的互动操作,终端在该视频播放界面的第一显示区域中展示该互动操作对应的互动特效。
第一显示区域为视频播放界面中用于展示互动特效的显示区域。该视频播放界面为用于展示视频画面的界面。该视频画面为通过终端向用户展示、能够向用户传递信息的视频画面。
该视频播放界面为视频播放应用程序或短视频播放应用程序中的界面,或者,该视频播放界面为即时通信应用程序或信息浏览应用程序中的界面。在本公开实施例中,对此不作具体限定。
需要说明的一点是,该视频播放界面为终端的全屏界面,或者,该视频播放界面为终端中展示的视频画面所占终端屏幕的界面。在本公开实施例中,对此不作具体限定。
该互动操作为终端通过该视频播放界面接收的操作。例如,该互动操作为点赞操作、评论操作、转发操作等。其中,该点赞操作为终端接收到的对该视频播放界面的单击操作、双击操作、长按操作等。或者,终端通过该视频播放界面展示点赞按钮,则该点赞操作为终端接收到的对该点赞按钮的触发操作。
该评论操作为终端接收到的评论完成的操作。相应的,终端通过该视频播放界面展示评论框和完成按钮,响应于评论框被触发,终端接收用户通过该评论框输入的评论信息;响应于完成按钮被触发,终端确定接收到评论操作。在一些实施例中,终端接收到评论操作后直接展示该评论操作对应的互动特效。在另一些实施例中,终端接收到评论操作后,获取该评论操作对应的评论信息,响应于该评论信息中包括预设的目标关键字,展示该评论操作中目标关键字对应的互动特效。
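As an illustration of the keyword check described above, the following Kotlin sketch returns the interactive special effect bound to the first preset target keyword found in the comment text; the function and parameter names are assumptions made for this sketch and are not part of the disclosure.

    // Minimal sketch, assuming effects are keyed by preset target keywords.
    // Returns null when the comment contains none of the keywords, in which case
    // the effect normally shown for the comment operation can be used instead.
    fun effectForComment(comment: String, keywordEffects: Map<String, String>): String? =
        keywordEffects.entries.firstOrNull { comment.contains(it.key) }?.value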
该转发操作为终端接收到的对该视频播放界面中的视频画面的转发操作。相应的,终端通过该视频播放界面展示转发按钮,响应于该转发按钮被触发,终端执行转发流程,响应于完成转发流程,终端确定接收到了转发操作。
该互动特效为事先设置的,并且,该互动特效是由该互动操作对应的动画元素、图片等至少一种元素组成的显示内容。例如,该互动特效为图标组成的动画,该互动特效为图片和动画元素组成的特效。在一些实施例中,该动画元素为lottie(一种开源动画库)动画元素等。或者,该互动特效为标有目标关键字或“评论成功”、“转发成功”、“点赞成功”或“谢谢支持”等字样的艺术字动画等。
终端接收到针对视频播放界面的互动操作,确定该互动操作对应的互动特效。在一些实施例中,终端事先从服务器中获取互动操作对应的互动特效,将获取到的互动操作与互动特效的对应关系存储在本地。相应的,响应于针对该视频播放界面的互动操作,终端调用事先存储的互动操作和互动特效的对应关系,根据该对应关系获取该互动操作对应的互动特效。在另一些实施例中,响应于针对该视频播放界面的互动操作,终端从服务器中获取该互动操作对应的互动特效。相应的,终端向服务器发送获取请求,该获取请求中携带该互动操作的操作标识,服务器接收该获取请求,根据该获取请求中的操作标识,确定该操作标识对应的互动特效,将该互动特效发送给终端,相应的,终端接收该服务器发送的互动特效。
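A minimal Kotlin sketch of the two retrieval paths described above, assuming each effect resource is identified by a simple string: the locally stored operation-to-effect mapping is consulted first, and a server request carrying the operation identifier is issued only when the mapping has no entry. EffectStore and fetchFromServer are illustrative names, not APIs from the disclosure.

    // Sketch only: localMapping models the correspondence obtained from the server in
    // advance; fetchFromServer models the on-demand request carrying the operation id.
    // getOrPut also caches the fetched effect locally for later interactive operations.
    class EffectStore(
        private val localMapping: MutableMap<String, String>,
        private val fetchFromServer: (operationId: String) -> String
    ) {
        fun effectFor(operationId: String): String =
            localMapping.getOrPut(operationId) { fetchFromServer(operationId) }
    }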
另外,终端在通过第一显示区域显示互动特效之前,在视频播放界面中确定该第一显示区域。相应的,终端根据该互动操作,确定该视频播放界面中用于展示该互动特效的第一显示区域。
在一些实施例中,该第一显示区域为该互动操作对应的操作按钮所在的区域。相应的,终端确定该互动操作对应的操作按钮,进而确定该操作按钮对应的第一显示区域。在一些实施例中,该第一显示区域为该操作按钮周围的显示区域,或者,该第一显示区域为视频播放界面中的任一显示区域。
在本公开实施例中,终端预先确定视频播放界面中的第一显示区域,终端无需在显示互动特效的过程中再确定第一显示区域,提高了展示互动特效的效率。
在另一些实施例中,该第一显示区域为该互动操作产生的位置对应的显示区域。相应的,终端确定该互动操作产生的操作位置,将该操作位置作为该第一显示区域的中心位置,得到第一显示区域。
在本公开实施例中,终端根据互动操作产生的位置确定第一显示区域的位置,使得第一显示区域的位置与用户的互动操作位置对应,提高了互动操作的趣味性。
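For illustration, a first display area centered on the operation position might be built as follows, assuming axis-aligned rectangular display areas in screen coordinates; the Rect type and the function name are assumptions of this sketch.

    // Axis-aligned rectangle in screen coordinates (defined here so the sketch stands alone).
    data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

    // First display area: a rectangle of the interactive effect's size whose center is
    // the position where the interactive operation was produced.
    fun areaCenteredAt(x: Float, y: Float, effectWidth: Float, effectHeight: Float): Rect =
        Rect(x - effectWidth / 2f, y - effectHeight / 2f, x + effectWidth / 2f, y + effectHeight / 2f)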
在302中,响应于该互动操作的触发时间在目标时间段内,终端在该视频播放界面中确定第三显示区域。
其中,该第三显示区域为终端根据附加特效的尺寸信息在视频播放界面中确定的预选显示区域。其中,该附加特效为产生互动操作时展示的附加特效。该附加特效的组成形式与互动特效相似,在此不再赘述。
该目标时间段为开发人员规定的时间段。例如,该目标时间段为任一节日所在的时间段,或者,一天中的任一时间段,或者,其他时间段等。需要说明的一点是,不同目标时间段的附加特效相同或者不同,在本公开实施例中,对此不作具体限定。例如,附加特效为任一节日对应的特效,则任一操作产生的附加特效相同,例如都为烟花特效等。
另外,该附加特效为在目标时间段内检测到的互动操作中包括的关键字对应的特效,则不同的关键字对应不同的特效。或者,附加特效为在目标时间段内目标账号产生互动操作时对应的特效,则不同的目标账户产生的附加特效可能相同也可能不同。其中,目标账号为会员账号、等级超过预设等级的账号等。
在本公开实施例中,终端在该视频播放界面中确定第三显示区域的过程,通过以下步骤(1)-(3)实现,包括:
(1)终端在该视频播放界面中确定目标位置。
在一些实施例中,终端在该视频播放界面中随机确定一个位置,将该随机确定的位置作为目标位置。在另一个实施例中,终端在视频播放界面中,确定显示视频画面所占用的第四显示区域,对该第四显示区域进行缩小处理,在缩小处理后的显示区域中随机选择目标位置。该过程通过以下步骤(1-1)-(1-3)实现,包括:
(1-1)终端在该视频播放界面中确定第四显示区域,该第四显示区域为该视频播放界面中播放的视频画面所在的显示区域。
第四显示区域为视频画面占用的终端屏幕中的显示区域。其中,终端将视频画面对应的渲染数据转换成终端屏幕坐标系对应的坐标数据,得到视频画面在该视频播放界面中所占的第四显示区域。
(1-2)终端以目标内边距,对该第四显示区域进行缩小处理,得到第五显示区域,该目标内边距为与该附加特效的尺寸信息匹配的内边距。
第五显示区域为根据目标内边距对第四显示区域进行缩小处理后得到的显示区域。其中,该尺寸信息包括特效信息的形状以及展示过程中占用的显示区域的最大面积。在本公开实施例中,终端获取附加特效的尺寸信息,根据该尺寸信息确定目标内边距。该目标内边距与该附加特效的尺寸信息匹配是指,该目标内边距不小于该附加特效的最大宽度的一半。
相应的,在本公开实施例中,终端根据该附加特效的尺寸信息确定该目标内边距,根据该目标内边距,去除第四显示区域的边缘部分,得到第五显示区域。
(1-3)终端在该第五显示区域中,确定该目标位置。
在一些实施例中,终端从第五显示区域中随机选择一个位置作为目标位置。
需要说明的一点是,步骤(1-1)-(1-3)在本步骤之前的任一步骤执行,在本公开实施例中,对步骤(1-1)-(1-3)的执行顺序不作具体限定。
在本公开实施例中,通过在视频播放界面中确定第五显示区域,进而在第五显示区域内随机选取目标位置并根据该目标位置确定附加特效的展示区域,这样在展示附加特效时,能够保证展示的附加特效全部在视频播放界面中,防止展示的附加特效产生缺失。
(2)终端根据该附加特效的特效信息,确定该附加特效的尺寸信息。
本步骤与步骤(1-2)中终端确定附加特效的尺寸信息的过程相似,在此不再赘述。
(3)终端基于该尺寸信息,在该视频播放界面中,确定以该目标位置为中心的该第三显示区域。
在本公开实施例中,终端将该目标位置作为第三显示区域的中心,根据该附加特效的尺寸信息确定该第三显示区域的尺寸信息,其中,该第三显示区域的尺寸不小于该附加特效的尺寸。
在本公开实施例中,通过第三显示区域与视频播放界面中已有的互动特效的第一显示区域是否重叠来确定第二显示区域,提高了确定第二显示区域的效率。
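A minimal Kotlin sketch of steps (1)-(3) above, assuming rectangular display areas and that the video-frame (fourth) display area has already been converted to screen coordinates; the Rect type, the function name, and the choice of half the effect's width and height as the target padding are assumptions of this sketch.

    import kotlin.random.Random

    data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

    // Shrink the fourth display area by a padding matched to the additional effect's size,
    // pick a random target position inside the resulting fifth display area, and build the
    // third display area centered on that position with the effect's size.
    fun proposeThirdArea(videoFrameArea: Rect, effectWidth: Float, effectHeight: Float): Rect {
        // A padding of at least half the effect's extent keeps a centered effect fully inside the frame.
        val padX = effectWidth / 2f
        val padY = effectHeight / 2f
        val fifth = Rect(
            videoFrameArea.left + padX, videoFrameArea.top + padY,
            videoFrameArea.right - padX, videoFrameArea.bottom - padY
        )
        // Random target position inside the fifth display area.
        val cx = fifth.left + Random.nextFloat() * (fifth.right - fifth.left)
        val cy = fifth.top + Random.nextFloat() * (fifth.bottom - fifth.top)
        // Third display area: no smaller than the additional effect, centered on the target position.
        return Rect(cx - padX, cy - padY, cx + padX, cy + padY)
    }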
终端在确定第三显示区域后,确定该第三显示区域与第一显示区域是否存在重叠区域。在第三显示区域与第一显示区域不存在重叠区域的情况下,终端执行步骤303,在第三显示区域与第一显示区域之间存在重叠区域的情况下,终端执行步骤304。其中,终端通过坐标相交判断法,确定第一显示区域与第三显示区域是否存在重叠区域。
需要说明的是,区域重叠指代不同显示区域之间存在交集,即存在相交的部分;区域不重叠指代不同显示区域之间不存在交集。
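The coordinate-intersection check mentioned above can be sketched as a standard axis-aligned rectangle intersection test in Kotlin; the Rect type is the same illustrative assumption used in the earlier sketches.

    data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

    // Two display areas overlap exactly when their projections intersect on both axes;
    // areas that merely touch at an edge are treated as non-overlapping here.
    fun overlaps(a: Rect, b: Rect): Boolean =
        a.left < b.right && b.left < a.right && a.top < b.bottom && b.top < a.bottom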
在303中,在该第一显示区域和该第三显示区域不存在重叠区域的情况下,终端将该第三显示区域作为第二显示区域。
在本公开实施例中,第一显示区域与第三显示区域不存在重叠区域,则第一显示区域与第三显示区域之间显示的内容互相不会产生影响,则将该第三显示区域确定为用于展示附加特效的第二显示区域。
在304中,在该第一显示区域和该第三显示区域存在重叠区域的情况下,终端返回步骤302,在视频播放界面中确定新的第三显示区域。
在本公开实施例中,第一显示区域与第三显示区域存在重叠区域,则第一显示区域与第三显示区域之间显示的内容互相会产生影响,则终端执行步骤302重新确定新的第三显示区域,再判断新的第三显示区域是否与第一显示区域存在重叠区域,以此类推,直到第一显示区域与新的第三显示区域不存在重叠区域为止,将该新的第三显示区域确定为第二显示区域。其中,该新的第三显示区域与原第三显示区域为不同的显示区域。
在本公开实施例中,通过确定第三显示区域与视频播放界面中已有的互动特效的第一显示区域是否重叠来确定第二显示区域,防止了选择的第二显示区域与第一显示区域发生重叠,优化了显示效果。
需要说明的一点是,在一些实施例中,终端还统计重复确定第三显示区域的次数,响应于终端确定第三显示区域的次数超过预设阈值,终端直接将首次确定的第三显示区域确定为第二显示区域。该过程为:终端统计在该视频播放界面中确定第三显示区域的次数;响应于该次数超过预设阈值,将首次确定的第三显示区域确定为该第二显示区域。
其中,该预设阈值根据需要进行设置,在本公开实施例中,对该预设阈值不作具体限定。例如,该预设阈值为10、15或20等。
在本公开实施例中,通过统计重新确定第三显示区域的次数,在次数超过预设阈值的情况下,不再重复确定第三显示区域,从而保证了能够在预设时间内确定附加特效的渲染区域。
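Putting the pieces together, the selection of the second display area with a capped number of retries might look like the following Kotlin sketch; the threshold value, the function names, and the existingAreas parameter (left empty when no earlier effects are on screen, or filled with the sixth display areas described in the next embodiment) are assumptions of this sketch.

    data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float)

    fun overlaps(a: Rect, b: Rect): Boolean =
        a.left < b.right && b.left < a.right && a.top < b.bottom && b.top < a.bottom

    // Propose candidate (third) display areas until one is free of overlap with the first
    // display area and all existing effect areas; once the attempt count exceeds the preset
    // threshold, fall back to the first candidate so a rendering area is found in time.
    fun chooseSecondArea(
        firstArea: Rect,
        existingAreas: List<Rect>,
        proposeCandidate: () -> Rect,   // e.g. proposeThirdArea(...) from the earlier sketch
        maxAttempts: Int = 10           // preset threshold; the value is an assumption
    ): Rect {
        var candidate = proposeCandidate()
        val firstCandidate = candidate
        var attempts = 1
        while (overlaps(candidate, firstArea) || existingAreas.any { overlaps(candidate, it) }) {
            if (attempts >= maxAttempts) return firstCandidate
            candidate = proposeCandidate()
            attempts++
        }
        return candidate
    }

The same loop shape also applies when the first display area itself must avoid the display areas of effects that are already on screen.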
在305中,终端在该第二显示区域中展示该目标时间段内该互动操作对应的附加特效。
在本公开实施例中,终端将附加特效渲染至该第二显示区域对应的视频播放界面中。参见图4,其示出了根据一示例性实施例提供的一种视频播放界面的示意图。在一些实施例中, 该视频播放界面中包括“首页”、“精选”、“栏目”和“我”等标签,并且,该视频播放界面中还包括当前展示的视频画面,该视频画面的发布者的账号、该视频画面对应的视频标题、使用的背景音乐等,以及,该发布者的头像、加关注标签、点赞按钮、评论按钮、分享按钮等。
第二显示区域中展示附加特效,该附加特效为节日图标,例如,该附加特效为“节日快乐”字样的图标。第一显示区域中展示互动特效,该互动特效为点赞后产生的心形图标。
在本公开实施例中,将目标时间段对应的附加特效展示在视频播放界面中的第二显示区域,其中,第二显示区域与互动特效所在的第一显示区域不存在重叠区域,使得视频播放界面中展示的互动特效和附加特效不重叠,进而优化了特效的显示效果。
在一些实施例中,终端连续接收到用户输入的互动操作,每个互动操作产生对应的互动特效和附加特效,并且,每个互动特效和附加特效在视频播放界面中显示预设时长,因此,会出现视频播放界面中同时展示多个特效的情况。图5为根据一示例性实施例提供的一种互动特效展示方法流程图。在本公开实施例中,以终端根据已有的第一显示区域和已有特效的第六显示区域确定第二显示区域为例进行说明。如图5所示,该方法包括以下步骤:
在501中,响应于针对视频播放界面的互动操作,终端在该视频播放界面的第一显示区域中展示该互动操作对应的互动特效。
本步骤与步骤301相似,在此不再赘述。
在502中,响应于该互动操作的触发时间在目标时间段内,终端在该视频播放界面中确定至少一个第六显示区域。
其中,该至少一个第六显示区域为该视频播放界面中已有特效所在的显示区域。该已有特效为在本次互动操作之前的互动操作产生的互动特效和附加特效。
需要说明的是,至少一个第六显示区域包括:一个第六显示区域,或,包括两个在内的多个第六显示区域。
在503中,终端在该视频播放界面中确定第七显示区域。
本步骤与步骤302相似,在此不再赘述。
在504中,在该第一显示区域和该第七显示区域不存在重叠区域,且,该第七显示区域和该至少一个第六显示区域不存在重叠区域的情况下,终端将该第七显示区域作为该第二显示区域。
本步骤与步骤303相似,在此不再赘述。
在505中,在该第一显示区域和该第七显示区域存在重叠区域,或,该第七显示区域和该至少一个第六显示区域存在重叠区域的情况下,终端返回步骤503,在视频播放界面中确定新的第七显示区域。
本步骤与步骤304相似,在此不再赘述。
在506中,终端在该第二显示区域中展示该目标时间段内该互动操作对应的附加特效。
本步骤与步骤305相似,在此不再赘述。
参见图6,其示出了根据一示例性实施例提供的一种视频播放界面的示意图。其中,第二显示区域中展示目标时间段对应的本次互动操作的附加特效,该附加特效为节日图标,例如,该附加特效为“节日快乐”字样的图标。第一显示区域中展示本次互动操作对应的互动特效,该互动特效为点赞后产生的心形图标,该视频播放界面中还展示在本次互动操作之前的互动操作对应的互动特效或附加特效,例如,之前转发操作对应的“转发成功”等字样的图标,或者,其他互动操作对应的附加特效。
另外,需要说明的一点是,在本公开实施例中,终端确定第一显示区域,即是确定一个与至少一个第六显示区域不存在重叠区域的显示区域。相应的,终端确定第一显示区域的过程通过以下步骤(1)-(2)实现,包括:
(1)终端在该视频播放界面中确定与至少一个第六显示区域不重叠的第一显示区域。
本步骤与步骤502-505相似,在此不再赘述。
(2)终端将该互动操作对应的互动特效展示在该第一显示区域中。
在本公开实施例中,终端在确定第一显示区域的过程中,根据与当前至少一个第六显示区域不重叠的显示区域确定第一显示区域,使得终端在展示互动特效的过程中,不会与已有特效发生重叠,进而优化了特效的显示效果。
在本公开实施例中,将附加特效展示在视频播放界面中的第二显示区域中,其中,第二显示区域与互动特效所在的第一显示区域不存在重叠区域,使得视频播放界面中展示的互动特效和附加特效不重叠,进而优化了特效的显示效果。
并且,终端在确定第二显示区域的过程中,根据与当前至少一个第六显示区域不重叠的显示区域确定第二显示区域,使得终端在展示附加特效的过程中,不会与已有特效发生重叠,进而优化了特效的显示效果。
在一些实施例中,终端根据互动操作的操作位置确定互动特效和附加特效的显示区域,在该显示区域中,不重叠地显示互动特效和附加特效。图7为根据一示例性实施例提供的一种互动特效展示方法流程图。在本公开实施例中,以终端根据互动操作的操作位置确定第一显示区域和第二显示区域为例进行说明。如图7所示,该方法包括以下步骤:
在701中,响应于针对视频播放界面的互动操作,终端在该视频播放界面的第一显示区域中展示该互动操作对应的互动特效。
本步骤与步骤301相似,在此不再赘述。
在702中,响应于该互动操作的触发时间在目标时间段内,终端确定该第一显示区域;以该第一显示区域为中心区域,将该第一显示区域的外围区域确定为该第二显示区域。
在本公开实施例中,终端检测到互动操作,确定该互动操作在该视频播放界面中的操作位置。
在一些实施例中,第二显示区域为一个显示区域。参见图8,其示出了根据一示例性实施例提供的一种视频播放界面的示意图。其中,第一显示区域为圆形区域,第二显示区域为第一显示区域外围的环形区域。
在另一些实施例中,第二显示区域为多个显示区域,参见图9,其示出了根据一示例性实施例提供的一种视频播放界面的示意图。其中,第一显示区域为圆形区域,多个第二显示区域为第一显示区域外围的多个扇形区域。
在703中,终端在该第二显示区域中展示该目标时间段内该互动操作对应的附加特效。
在本公开实施例中,终端将互动特效展示在第一显示区域中,将附加特效展示在第二显示区域中。
在一些实施例中,请继续参见图8,附加特效在该环形的第二显示区域中以放射性的显示形式,同时展示在该环形的第二显示区域中。在一些实施例中,参见图10,其示出了根据一示例性实施例提供的一种视频播放界面的示意图。终端顺时针依次通过该多个第二显示区域中的各个第二显示区域分别展示该目标时间段对应的附加特效。在另一些实施例中,参见图11,其示出了根据一示例性实施例提供的一种视频播放界面的示意图。终端逆时针依次通过该多个第二显示区域中的各个第二显示区域分别展示该目标时间段对应的附加特效。
需要说明的一点是,该多个第二显示区域中展示的附加特效为同一互动操作对应的附加特效,或者,该多个第二显示区域中展示的附加特效为不同互动操作对应的附加特效,在本公开实施例中,对此不作具体限定。
其中,在该多个第二显示区域中展示的附加特效为同一互动操作对应的附加特效的情况下,每个互动操作产生多个附加特效,该多个附加特效相同或者不同,该多个附加特效依次顺时或逆时方向展示在多个第二显示区域中;在该多个第二显示区域中展示的附加特效为不同互动操作对应的独家特效的情况下,每个互动操作产生一个附加特效,终端根据上一个互动操作对应的附加特效的第二显示区域,确定当前互动操作对应的附加特效的第二显示区域,其中,这两个第二显示区域相邻。
在本公开实施例中,通过在事先确定的显示区域中确定第二显示区域和第一显示区域,防止了第一显示区域与第二显示区域之间产生重叠,从而优化了显示效果。
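For the ring and sector layouts described in this embodiment, the multiple second display areas around the first display area could be generated as equal sectors and ordered clockwise or counterclockwise, as in the following Kotlin sketch; the Sector representation and the screen-coordinate angle convention (y axis pointing down, angles increasing clockwise) are assumptions of this sketch.

    // One sector of the peripheral region around the first display area.
    data class Sector(
        val centerX: Float, val centerY: Float,
        val innerRadius: Float, val outerRadius: Float,
        val startDeg: Float, val sweepDeg: Float
    )

    // Split the peripheral ring into `count` equal sectors and return them in the order
    // in which the additional effects are to be shown.
    fun sectorsAround(
        centerX: Float, centerY: Float,
        innerRadius: Float, outerRadius: Float,
        count: Int, clockwise: Boolean
    ): List<Sector> {
        val sweep = 360f / count
        val sectors = List(count) { i ->
            Sector(centerX, centerY, innerRadius, outerRadius, startDeg = i * sweep, sweepDeg = sweep)
        }
        // With angles increasing clockwise in screen coordinates, the natural order is clockwise;
        // reverse it to show the additional effects counterclockwise instead.
        return if (clockwise) sectors else sectors.reversed()
    }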
需要说明的另一点是,在一些实施例中,该第一显示区域为视频播放界面中产生互动操作的位置对应的显示区域。相应的,终端确定互动操作的产生坐标,将该坐标确定为该互动操作的位置。在另一些实施例中,该第一显示区域为视频播放界面中产生互动操作时,根据互动操作的类型预先确定的操作位置。相应的,终端确定互动操作的操作类型,根据该操作类型,从终端中存储的操作类型和操作位置的对应关系中,确定该操作类型对应的操作位置。在另一些实施例中,该第一显示区域为视频播放界面中与其他显示区域不重叠的操作位置。相应的,终端检测到互动操作,从该视频播放界面中确定当前与其他操作位置不重叠的操作位置。
在本公开实施例中,将目标时间段对应的附加特效展示在视频播放界面中的第二显示区域中,其中,第二显示区域与互动特效所在的第一显示区域不存在重叠区域,使得视频播放界面中展示的互动特效和附加特效不重叠,进而优化了特效的显示效果。
图12是根据一示例性实施例提供的一种互动特效展示装置的框图。参见图12,装置包括:
第一展示单元1201,被配置为响应于针对视频播放界面的互动操作,在该视频播放界面的第一显示区域中展示该互动操作对应的互动特效;
确定单元1202,被配置为响应于该互动操作的触发时间在目标时间段内,基于该第一显示区域,在该视频播放界面中确定第二显示区域,所述第二显示区域与所述第一显示区域不重叠;
第二展示单元1203,被配置为在该第二显示区域中展示该目标时间段内所述互动操作对应的附加特效。
在一些实施例中,该确定单元1202包括:
第一确定子单元,被配置为在该视频播放界面中确定第三显示区域;
第二确定子单元,被配置为在该第一显示区域和该第三显示区域不存在重叠区域的情况下,将该第三显示区域作为该第二显示区域;或者,
该第二确定子单元,被配置为在该第一显示区域和该第三显示区域存在重叠区域的情况下,在该视频播放界面中确定新的第三显示区域。
在一些实施例中,该第二确定子单元,被配置为在该视频播放界面中确定目标位置;获取该附加特效的尺寸信息;基于该附加特效的尺寸信息,在该视频播放界面中,确定以该目标位置为中心的该第三显示区域。
在一些实施例中,该第二确定子单元,被配置为在该视频播放界面中确定第四显示区域,该第四显示区域为该视频播放界面中播放的视频画面所在的显示区域;以目标内边距,对该第四显示区域进行缩小处理,得到第五显示区域,该目标内边距为与该附加特效的尺寸信息匹配的内边距;在该第五显示区域中,确定该目标位置。
在一些实施例中,该确定单元1202包括:
第三确定子单元,被配置为在该视频播放界面中确定第六显示区域,该第六显示区域为该视频播放界面中已有特效所在的显示区域,所述已有特效为之前的互动操作对应的互动特效或附加特效;
第四确定子单元,被配置为在该视频播放界面中确定第七显示区域;
第五确定子单元,被配置为在该第一显示区域和该第七显示区域不存在重叠区域,且,该第七显示区域和该第六显示区域不存在重叠区域的情况下,将该第七显示区域作为该第二显示区域;或者,
该第五确定子单元,被配置为在该第一显示区域和该第七显示区域存在重叠区域,或,该第七显示区域和该第六显示区域存在重叠区域的情况下,在该视频播放界面中确定新的第七显示区域。
在一些实施例中,该第一展示单元1201,被配置为在该视频播放界面中确定与该第六显示区域不重叠的第一显示区域;将该互动操作对应的互动特效展示在该第一显示区域中。
在一些实施例中,该确定单元1202包括:
第六确定子单元,被配置为以该第一显示区域为中心区域,将该第一显示区域的外围区域确定为该第二显示区域。
在一些实施例中,该第二显示区域为多个,该第二展示单元1203,被配置为顺时针依次通过该多个第二显示区域中的各个第二显示区域分别展示该附加特效;或者,
该第二展示单元1203,被配置为逆时针依次通过该多个第二显示区域中的各个第二显示区域分别展示该附加特效。
在本公开实施例中,将目标时间段对应的附加特效展示在视频播放界面中的第二显示区域中,其中,第二显示区域与互动特效所在的第一显示区域不存在重叠区域,使得视频播放界面中展示的互动特效和附加特效不重叠,进而优化了特效的显示效果。
需要说明的是:上述实施例提供的互动特效展示装置在展示互动特效时,仅以上述各功能模块的划分进行举例说明,实际应用中,能够根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。另外,上述实施例提供的互动特效展示装置与互动特效展示方法实施例属于同一构思,其具体实现过程详见方法实施例,这里不再赘述。
图13是根据一示例性实施例提供的一种终端1300的结构示意图。在一些实施例中,该终端1300是便携式移动终端,比如:智能手机、平板电脑、MP3播放器(Moving Picture Experts Group Audio Layer III,动态影像专家压缩标准音频层面3)、MP4(Moving Picture Experts Group Audio Layer IV,动态影像专家压缩标准音频层面4)播放器、笔记本电脑或台式电脑。终端1300还可能被称为用户设备、便携式终端、膝上型终端、台式终端等其他名称。
通常,终端1300包括有:处理器1301和存储器1302。
在一些实施例中,处理器1301包括一个或多个处理核心,比如4核心处理器、8核心处理器等。在一些实施例中,处理器1301采用DSP(Digital Signal Processing,数字信号处理)、FPGA(Field-Programmable Gate Array,现场可编程门阵列)、PLA(Programmable Logic Array,可编程逻辑阵列)中的至少一种硬件形式来实现。在一些实施例中,处理器1301也包括主处理器和协处理器,主处理器是用于对在唤醒状态下的数据进行处理的处理器,也称CPU(Central Processing Unit,中央处理器);协处理器是用于对在待机状态下的数据进行处理的低功耗处理器。在一些实施例中,处理器1301集成有GPU(Graphics Processing Unit,图像处理器),GPU用于负责显示屏所需要显示的内容的渲染和绘制。一些实施例中,处理器1301还包括AI(Artificial Intelligence,人工智能)处理器,该AI处理器用于处理有关机器学习的计算操作。
在一些实施例中,存储器1302包括一个或多个计算机可读存储介质,该计算机可读存储介质是非暂态的。在一些实施例中,存储器1302还包括高速随机存取存储器,以及非易失性存储器,比如一个或多个磁盘存储设备、闪存存储设备。在一些实施例中,存储器1302中的非暂态的计算机可读存储介质用于存储至少一个指令,该至少一个指令用于被处理器1301所执行以实现本公开中方法实施例提供的互动特效展示方法。
在一些实施例中,终端1300还可选包括有:外围设备接口1303和至少一个外围设备。在一些实施例中,处理器1301、存储器1302和外围设备接口1303之间通过总线或信号线相连。在一些实施例中,各个外围设备通过总线、信号线或电路板与外围设备接口1303相连。具体地,外围设备包括:射频电路1304、显示屏1305、摄像头组件1306、音频电路1307、定位组件1308和电源1309中的至少一种。
外围设备接口1303可被用于将I/O(Input/Output,输入/输出)相关的至少一个外围设备连接到处理器1301和存储器1302。在一些实施例中,处理器1301、存储器1302和外围设备接口1303被集成在同一芯片或电路板上;在一些其他实施例中,处理器1301、存储器1302和外围设备接口1303中的任意一个或两个在单独的芯片或电路板上实现,本实施例对此不加以限定。
射频电路1304用于接收和发射RF(Radio Frequency,射频)信号,也称电磁信号。射频电路1304通过电磁信号与通信网络以及其他通信设备进行通信。射频电路1304将电信号转换为电磁信号进行发送,或者,将接收到的电磁信号转换为电信号。在一些实施例中,射频电路1304包括:天线系统、RF收发器、一个或多个放大器、调谐器、振荡器、数字信号处理器、编解码芯片组、用户身份模块卡等等。在一些实施例中,射频电路1304通过至少一种无线通信协议来与其他终端进行通信。该无线通信协议包括但不限于:万维网、城域网、内联网、各代移动通信网络(2G、3G、4G及5G)、无线局域网和/或WiFi(Wireless Fidelity,无线保真)网络。在一些实施例中,射频电路1304还包括NFC(Near Field Communication,近距离无线通信)有关的电路,本公开对此不加以限定。
显示屏1305用于显示UI(User Interface,用户界面)。在一些实施例中,该UI包括图形、文本、图标、视频及其他们的任意组合。当显示屏1305是触摸显示屏时,显示屏1305还具有采集在显示屏1305的表面或表面上方的触摸信号的能力。在一些实施例中,该触摸信号作为控制信号输入至处理器1301进行处理。此时,显示屏1305还用于提供虚拟按钮和/或虚拟键盘,也称软按钮和/或软键盘。在一些实施例中,显示屏1305为一个,设置在终端1300的前面板;在另一些实施例中,显示屏1305为至少两个,分别设置在终端1300的不同表面或呈折叠设计;在另一些实施例中,显示屏1305是柔性显示屏,设置在终端1300的弯曲表面上或折叠面上。甚至,显示屏1305还设置成非矩形的不规则图形,也即异形屏。在一些实施例中,显示屏1305采用LCD(Liquid Crystal Display,液晶显示屏)、OLED(Organic Light-Emitting Diode,有机发光二极管)等材质制备。
摄像头组件1306用于采集图像或视频。在一些实施例中,摄像头组件1306包括前置摄像头和后置摄像头。通常,前置摄像头设置在终端的前面板,后置摄像头设置在终端的背面。在一些实施例中,后置摄像头为至少两个,分别为主摄像头、景深摄像头、广角摄像头、长焦摄像头中的任意一种,以实现主摄像头和景深摄像头融合实现背景虚化功能、主摄像头和广角摄像头融合实现全景拍摄以及VR(Virtual Reality,虚拟现实)拍摄功能或者其他融合拍摄功能。在一些实施例中,摄像头组件1306还包括闪光灯。在一些实施例中,闪光灯是单色温闪光灯,在一些实施例中,闪光灯是双色温闪光灯。双色温闪光灯是指暖光闪光灯和冷光闪光灯的组合,用于不同色温下的光线补偿。
在一些实施例中,音频电路1307包括麦克风和扬声器。麦克风用于采集用户及环境的声波,并将声波转换为电信号输入至处理器1301进行处理,或者输入至射频电路1304以实现语音通信。出于立体声采集或降噪的目的,在一些实施例中,麦克风为多个,分别设置在终端1300的不同部位。在一些实施例中,麦克风是阵列麦克风或全向采集型麦克风。扬声器则用于将来自处理器1301或射频电路1304的电信号转换为声波。在一些实施例中,扬声器是传统的薄膜扬声器,在一些实施例中,扬声器是压电陶瓷扬声器。当扬声器是压电陶瓷扬声器时,不仅能够将电信号转换为人类可听见的声波,也能够将电信号转换为人类听不见的声波以进行测距等用途。在一些实施例中,音频电路1307还包括耳机插孔。
定位组件1308用于定位终端1300的当前地理位置,以实现导航或LBS(Location Based Service,基于位置的服务)。在一些实施例中,定位组件1308是基于美国的GPS(Global Positioning System,全球定位系统)、中国的北斗系统或俄罗斯的伽利略系统的定位组件。
电源1309用于为终端1300中的各个组件进行供电。在一些实施例中,电源1309是交流电、直流电、一次性电池或可充电电池。当电源1309包括可充电电池时,该可充电电池是有线充电电池或无线充电电池。有线充电电池是通过有线线路充电的电池,无线充电电池是通过无线线圈充电的电池。该可充电电池还用于支持快充技术。
在一些实施例中,终端1300还包括有一个或多个传感器1310。该一个或多个传感器1310包括但不限于:加速度传感器1311、陀螺仪传感器1312、压力传感器1313、指纹传感器1314、光学传感器1315以及接近传感器1316。
在一些实施例中,加速度传感器1311检测以终端1300建立的坐标系的三个坐标轴上的加速度大小。比如,加速度传感器1311用于检测重力加速度在三个坐标轴上的分量。在一些实施例中,处理器1301根据加速度传感器1311采集的重力加速度信号,控制显示屏1305以横向视图或纵向视图进行用户界面的显示。在一些实施例中,加速度传感器1311还用于游戏或者用户的运动数据的采集。
在一些实施例中,陀螺仪传感器1312检测终端1300的机体方向及转动角度,陀螺仪传感器1312与加速度传感器1311协同采集用户对终端1300的3D动作。处理器1301根据陀螺仪传感器1312采集的数据,能够实现如下功能:动作感应(比如根据用户的倾斜操作来改变UI)、拍摄时的图像稳定、游戏控制以及惯性导航。
在一些实施例中,压力传感器1313设置在终端1300的侧边框和/或显示屏1305的下层。当压力传感器1313设置在终端1300的侧边框时,能够检测用户对终端1300的握持信号,由处理器1301根据压力传感器1313采集的握持信号进行左右手识别或快捷操作。当压力传感器1313设置在显示屏1305的下层时,由处理器1301根据用户对显示屏1305的压力操作,实现对UI界面上的可操作性控件进行控制。可操作性控件包括按钮控件、滚动条控件、图标控件、菜单控件中的至少一种。
指纹传感器1314用于采集用户的指纹,由处理器1301根据指纹传感器1314采集到的指纹识别用户的身份,或者,由指纹传感器1314根据采集到的指纹识别用户的身份。在识别出用户的身份为可信身份时,由处理器1301授权该用户执行相关的敏感操作,该敏感操作包括解锁屏幕、查看加密信息、下载软件、支付及更改设置等。在一些实施例中,指纹传感器1314被设置在终端1300的正面、背面或侧面。当终端1300上设置有物理按键或厂商Logo时,指纹传感器1314与物理按键或厂商Logo集成在一起。
光学传感器1315用于采集环境光强度。在一个实施例中,处理器1301根据光学传感器1315采集的环境光强度,控制显示屏1305的显示亮度。具体地,当环境光强度较高时,调高显示屏1305的显示亮度;当环境光强度较低时,调低显示屏1305的显示亮度。在另一个实施例中,处理器1301还根据光学传感器1315采集的环境光强度,动态调整摄像头组件1306的拍摄参数。
接近传感器1316,也称距离传感器,通常设置在终端1300的前面板。接近传感器1316用于采集用户与终端1300的正面之间的距离。在一个实施例中,当接近传感器1316检测到用户与终端1300的正面之间的距离逐渐变小时,由处理器1301控制显示屏1305从亮屏状态切换为息屏状态;当接近传感器1316检测到用户与终端1300的正面之间的距离逐渐变大时,由处理器1301控制显示屏1305从息屏状态切换为亮屏状态。
本领域技术人员能够理解,图13中示出的结构并不构成对终端1300的限定,能够包括比图示更多或更少的组件,或者组合某些组件,或者采用不同的组件布置。
在示例性实施例中,还提供了一种计算机可读存储介质,计算机可读存储介质中存储至少一条程序代码,至少一条程序代码由终端加载并执行,以实现上述实施例中的互动特效展示方法。在一些实施例中,该计算机可读存储介质是存储器。例如,该计算机可读存储介质是ROM(Read-Only Memory,只读存储器)、RAM(Random Access Memory,随机存取存储器)、CD-ROM(Compact Disc Read-Only Memory,紧凑型光盘只读储存器)、磁带、软盘和光数据存储设备等。
在示例性实施例中,还提供了一种计算机程序产品,在所述计算机程序产品中的程序代码由终端的处理器执行的情况下,以实现如本公开实施例所述的互动特效展示方法的指令。
本领域普通技术人员能够理解,实现上述实施例的全部或部分步骤能够通过硬件来完成,也能够通过程序指令相关的硬件来完成,该程序存储于一种计算机可读存储介质中,上述提到的存储介质是只读存储器、磁盘或光盘等。
本公开所有实施例均可以单独被执行,也可以与其他实施例相结合被执行,均视为本公开要求的保护范围。

Claims (26)

  1. 一种互动特效展示方法,所述方法包括:
    响应于针对视频播放界面的互动操作,在所述视频播放界面的第一显示区域中展示所述互动操作对应的互动特效;
    响应于所述互动操作的触发时间在目标时间段内,基于所述第一显示区域,在所述视频播放界面中确定第二显示区域,所述第二显示区域与所述第一显示区域不重叠;
    在所述第二显示区域中展示所述目标时间段内所述互动操作对应的附加特效。
  2. 根据权利要求1所述的方法,其中,所述基于所述第一显示区域,在所述视频播放界面中确定第二显示区域,包括:
    在所述视频播放界面中确定第三显示区域;
    在所述第一显示区域和所述第三显示区域不存在重叠区域的情况下,将所述第三显示区域作为所述第二显示区域;或者,
    在所述第一显示区域和所述第三显示区域存在重叠区域的情况下,在所述视频播放界面中确定新的第三显示区域。
  3. 根据权利要求2所述的方法,其中,所述在所述视频播放界面中确定第三显示区域,包括:
    在所述视频播放界面中确定目标位置;
    获取所述附加特效的尺寸信息;
    基于所述附加特效的尺寸信息,在所述视频播放界面中,确定以所述目标位置为中心的所述第三显示区域。
  4. 根据权利要求3所述的方法,其中,所述在所述视频播放界面中确定目标位置,包括:
    在所述视频播放界面中确定第四显示区域,所述第四显示区域为所述视频播放界面中播放的视频画面所在的显示区域;
    以目标内边距,对所述第四显示区域进行缩小处理,得到第五显示区域,所述目标内边距为与所述附加特效的尺寸信息匹配的内边距;
    在所述第五显示区域中,确定所述目标位置。
  5. 根据权利要求1所述的方法,其中,所述基于所述第一显示区域,在所述视频播放界面中确定第二显示区域,包括:
    在所述视频播放界面中确定第六显示区域,所述第六显示区域为所述视频播放界面中已有特效所在的显示区域,所述已有特效为之前的互动操作对应的互动特效或附加特效;
    在所述视频播放界面中确定第七显示区域;
    在所述第一显示区域和所述第七显示区域不存在重叠区域,且,所述第七显示区域和所述第六显示区域不存在重叠区域的情况下,将所述第七显示区域作为所述第二显示区域;
    在所述第一显示区域和所述第七显示区域存在重叠区域,或,所述第七显示区域和所述第六显示区域存在重叠区域的情况下,在所述视频播放界面中确定新的第七显示区域。
  6. 根据权利要求5所述的方法,其中,所述在所述视频播放界面的第一显示区域中展示所述互动操作对应的互动特效,包括:
    在所述视频播放界面中确定与所述第六显示区域不重叠的第一显示区域;
    将所述互动操作对应的互动特效展示在所述第一显示区域中。
  7. 根据权利要求1所述的方法,其中,所述基于所述第一显示区域,在所述视频播放界面中确定第二显示区域,包括:
    以所述第一显示区域为中心区域,将所述第一显示区域的外围区域确定为所述第二显示区域。
  8. 根据权利要求7所述的方法,其中,所述第二显示区域为多个,所述在所述第二显示区域中展示所述目标时间段内所述互动操作对应的附加特效,包括:
    顺时针依次通过所述多个第二显示区域中的各个第二显示区域分别展示所述附加特效;或者,
    逆时针依次通过所述多个第二显示区域中的各个第二显示区域分别展示所述附加特效。
  9. 一种互动特效展示装置,所述装置包括:
    第一展示单元,被配置为响应于针对视频播放界面的互动操作,在所述视频播放界面的第一显示区域中展示所述互动操作对应的互动特效;
    确定单元,被配置为响应于所述互动操作的触发时间在目标时间段内,基于所述第一显示区域,在所述视频播放界面中确定第二显示区域,所述第二显示区域与所述第一显示区域不重叠;
    第二展示单元,被配置为在所述第二显示区域中展示所述目标时间段内所述互动操作对应的附加特效。
  10. 根据权利要求9所述的装置,其中,所述确定单元包括:
    第一确定子单元,被配置为在所述视频播放界面中确定第三显示区域;
    第二确定子单元,被配置为在所述第一显示区域和所述第三显示区域不存在重叠区域的情况下,将所述第三显示区域作为所述第二显示区域;或者,
    所述第二确定子单元,被配置为在所述第一显示区域和所述第三显示区域存在重叠区域的情况下,在所述视频播放界面中确定新的第三显示区域。
  11. 根据权利要求10所述的装置,其中,所述第二确定子单元,被配置为在所述视频播放界面中确定目标位置;获取所述附加特效的尺寸信息;基于所述附加特效的尺寸信息,在所述视频播放界面中,确定以所述目标位置为中心的所述第三显示区域。
  12. 根据权利要求11所述的装置,其中,所述第二确定子单元,被配置为在所述视频播放界面中确定第四显示区域,所述第四显示区域为所述视频播放界面中播放的视频画面所在的显示区域;以目标内边距,对所述第四显示区域进行缩小处理,得到第五显示区域,所述目标内边距为与所述附加特效的尺寸信息匹配的内边距;在所述第五显示区域中,确定所述目标位置。
  13. 根据权利要求9所述的装置,其中,所述确定单元包括:
    第三确定子单元,被配置为在所述视频播放界面中确定第六显示区域,所述第六显示区域为所述视频播放界面中已有特效所在的显示区域,所述已有特效为之前的互动操作对应的互动特效或附加特效;
    第四确定子单元,被配置为在所述视频播放界面中确定第七显示区域;
    第五确定子单元,被配置为在所述第一显示区域和所述第七显示区域不存在重叠区域,且,所述第七显示区域和所述第六显示区域不存在重叠区域的情况下,将所述第七显示区域作为所述第二显示区域;或者,
    所述第五确定子单元,被配置为在所述第一显示区域和所述第七显示区域存在重叠区域,或,所述第七显示区域和所述第六显示区域存在重叠区域的情况下,在所述视频播放界面中确定新的第七显示区域。
  14. 根据权利要求13所述的装置,其中,所述第一展示单元,被配置为在所述视频播放界面中确定与所述第六显示区域不重叠的第一显示区域;将所述互动操作对应的互动特效展示在所述第一显示区域中。
  15. 根据权利要求9所述的装置,其中,所述确定单元包括:
    第六确定子单元,被配置为以所述第一显示区域为中心区域,将所述第一显示区域的外围区域确定为所述第二显示区域。
  16. 根据权利要求15所述的装置,其中,所述第二显示区域为多个,所述第二展示单元,被配置为顺时针依次通过所述多个第二显示区域中的各个第二显示区域分别展示所述附加特效;或者,
    所述第二展示单元,被配置为逆时针依次通过所述多个第二显示区域中的各个第二显示区域分别展示所述附加特效。
  17. 一种终端,包括:
    处理器;
    用于存储所述处理器可执行的至少一条程序代码的存储器;
    其中,所述处理器被配置为执行所述至少一条程序代码,以实现如下步骤:
    响应于针对视频播放界面的互动操作,在所述视频播放界面的第一显示区域中展示所述互动操作对应的互动特效;
    响应于所述互动操作的触发时间在目标时间段内,基于所述第一显示区域,在所述视频播放界面中确定第二显示区域,所述第二显示区域与所述第一显示区域不重叠;
    在所述第二显示区域中展示所述目标时间段内所述互动操作对应的附加特效。
  18. 根据权利要求17所述的终端,其中,所述基于所述第一显示区域,在所述视频播放界面中确定第二显示区域,包括:
    在所述视频播放界面中确定第三显示区域;
    在所述第一显示区域和所述第三显示区域不存在重叠区域的情况下,将所述第三显示区域作为所述第二显示区域;或者,
    在所述第一显示区域和所述第三显示区域存在重叠区域的情况下,在所述视频播放界面中确定新的第三显示区域。
  19. 根据权利要求18所述的终端,其中,所述在所述视频播放界面中确定第三显示区域,包括:
    在所述视频播放界面中确定目标位置;
    获取所述附加特效的尺寸信息;
    基于所述附加特效的尺寸信息,在所述视频播放界面中,确定以所述目标位置为中心的所述第三显示区域。
  20. 根据权利要求19所述的终端,其中,所述在所述视频播放界面中确定目标位置,包括:
    在所述视频播放界面中确定第四显示区域,所述第四显示区域为所述视频播放界面中播放的视频画面所在的显示区域;
    以目标内边距,对所述第四显示区域进行缩小处理,得到第五显示区域,所述目标内边距为与所述附加特效的尺寸信息匹配的内边距;
    在所述第五显示区域中,确定所述目标位置。
  21. 根据权利要求17所述的终端,其中,所述基于所述第一显示区域,在所述视频播放界面中确定第二显示区域,包括:
    在所述视频播放界面中确定第六显示区域,所述第六显示区域为所述视频播放界面中已有特效所在的显示区域,所述已有特效为之前的互动操作对应的互动特效或附加特效;
    在所述视频播放界面中确定第七显示区域;
    在所述第一显示区域和所述第七显示区域不存在重叠区域,且,所述第七显示区域和所述第六显示区域不存在重叠区域的情况下,将所述第七显示区域作为所述第二显示区域;
    在所述第一显示区域和所述第七显示区域存在重叠区域,或,所述第七显示区域和所述第六显示区域存在重叠区域的情况下,在所述视频播放界面中确定新的第七显示区域。
  22. 根据权利要求21所述的终端,其中,所述在所述视频播放界面的第一显示区域中展示所述互动操作对应的互动特效,包括:
    在所述视频播放界面中确定与所述第六显示区域不重叠的第一显示区域;
    将所述互动操作对应的互动特效展示在所述第一显示区域中。
  23. 根据权利要求17所述的终端,其中,所述基于所述第一显示区域,在所述视频播放界面中确定第二显示区域,包括:
    以所述第一显示区域为中心区域,将所述第一显示区域的外围区域确定为所述第二显示区域。
  24. 根据权利要求23所述的终端,其中,所述第二显示区域为多个,所述在所述第二显示区域中展示所述目标时间段内所述互动操作对应的附加特效,包括:
    顺时针依次通过所述多个第二显示区域中的各个第二显示区域分别展示所述附加特效;或者,
    逆时针依次通过所述多个第二显示区域中的各个第二显示区域分别展示所述附加特效。
  25. 一种计算机可读存储介质,在所述计算机可读存储介质中的至少一条程序代码由终端的处理器执行的情况下,所述终端能够执行如下步骤:
    响应于针对视频播放界面的互动操作,在所述视频播放界面的第一显示区域中展示所述互动操作对应的互动特效;
    响应于所述互动操作的触发时间在目标时间段内,基于所述第一显示区域,在所述视频播放界面中确定第二显示区域,所述第二显示区域与所述第一显示区域不重叠;
    在所述第二显示区域中展示所述目标时间段内所述互动操作对应的附加特效。
  26. 一种计算机程序产品,包括计算机程序代码,所述计算机程序代码被处理器执行时实现如下步骤:
    响应于针对视频播放界面的互动操作,在所述视频播放界面的第一显示区域中展示所述互动操作对应的互动特效;
    响应于所述互动操作的触发时间在目标时间段内,基于所述第一显示区域,在所述视频播放界面中确定第二显示区域,所述第二显示区域与所述第一显示区域不重叠;
    在所述第二显示区域中展示所述目标时间段内所述互动操作对应的附加特效。
PCT/CN2021/113600 2020-09-28 2021-08-19 互动特效展示方法及终端 WO2022062788A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011045824.2A CN112181572A (zh) 2020-09-28 2020-09-28 互动特效展示方法、装置、终端及存储介质
CN202011045824.2 2020-09-28

Publications (1)

Publication Number Publication Date
WO2022062788A1 true WO2022062788A1 (zh) 2022-03-31

Family

ID=73945700

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/113600 WO2022062788A1 (zh) 2020-09-28 2021-08-19 互动特效展示方法及终端

Country Status (2)

Country Link
CN (1) CN112181572A (zh)
WO (1) WO2022062788A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115361566A (zh) * 2022-08-17 2022-11-18 广州繁星互娱信息科技有限公司 直播观看方法、装置、终端及存储介质

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112181572A (zh) * 2020-09-28 2021-01-05 北京达佳互联信息技术有限公司 互动特效展示方法、装置、终端及存储介质
CN112732152B (zh) * 2021-01-27 2022-05-24 腾讯科技(深圳)有限公司 直播处理方法、装置、电子设备及存储介质
CN113190156A (zh) * 2021-05-13 2021-07-30 杭州网易云音乐科技有限公司 音乐播放控制方法、装置、存储介质及电子设备
CN115840610A (zh) * 2021-09-18 2023-03-24 华为技术有限公司 桌面动效显示方法及电子设备
CN114090167B (zh) * 2021-11-30 2024-02-27 东风汽车有限公司东风日产乘用车公司 节日彩蛋展示方法、装置、设备及存储介质
CN114385298A (zh) * 2022-01-12 2022-04-22 北京字跳网络技术有限公司 信息交互方法、装置、设备及存储介质
CN114567805A (zh) * 2022-02-24 2022-05-31 北京字跳网络技术有限公司 确定特效视频的方法、装置、电子设备及存储介质
CN116737028A (zh) * 2022-03-02 2023-09-12 北京字跳网络技术有限公司 短视频的播放方法、装置及电子设备

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102779028A (zh) * 2011-05-09 2012-11-14 腾讯科技(深圳)有限公司 一种客户端特效合成引擎的实现方法及装置
CN105302408A (zh) * 2014-06-24 2016-02-03 腾讯科技(深圳)有限公司 对悬浮按钮的位置进行调节的方法、装置及终端
CN106469165A (zh) * 2015-08-18 2017-03-01 腾讯科技(深圳)有限公司 弹幕展示方法及弹幕展示装置
CN106878825A (zh) * 2017-01-09 2017-06-20 腾讯科技(深圳)有限公司 基于直播的声效展示方法和装置
CN108234903A (zh) * 2018-01-30 2018-06-29 广州市百果园信息技术有限公司 互动特效视频的处理方法、介质和终端设备
CN109568937A (zh) * 2018-10-31 2019-04-05 北京市商汤科技开发有限公司 游戏控制方法及装置、游戏终端及存储介质
US20200014986A1 (en) * 2016-06-02 2020-01-09 John Senew Apparatus and method for displaying video
CN112181572A (zh) * 2020-09-28 2021-01-05 北京达佳互联信息技术有限公司 互动特效展示方法、装置、终端及存储介质

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102436341A (zh) * 2011-09-19 2012-05-02 百度在线网络技术(北京)有限公司 在移动终端的显示屏幕上进行内容操作的方法与装置
CN104184731B (zh) * 2014-08-22 2017-10-27 广州华多网络科技有限公司 一种信息显示方法、装置及系统
CN106354381B (zh) * 2015-07-22 2019-12-20 腾讯科技(深圳)有限公司 图像文件的处理方法及装置
CN106686398A (zh) * 2017-01-16 2017-05-17 北京达佳互联信息技术有限公司 一种信息交互方法、相关设备及系统
CN107241636A (zh) * 2017-05-25 2017-10-10 北京潘达互娱科技有限公司 一种虚拟礼物展示方法及装置
CN110062269A (zh) * 2018-01-18 2019-07-26 腾讯科技(深圳)有限公司 附加对象显示方法、装置及计算机设备
CN109859102B (zh) * 2019-02-01 2021-07-23 北京达佳互联信息技术有限公司 特效显示方法、装置、终端及存储介质
CN110337023B (zh) * 2019-07-02 2022-05-13 游艺星际(北京)科技有限公司 动画显示方法、装置、终端及存储介质
CN110830813B (zh) * 2019-10-31 2020-11-06 北京达佳互联信息技术有限公司 一种视频切换的方法、装置、电子设备及存储介质
CN111182343B (zh) * 2019-12-09 2021-09-24 腾讯科技(深圳)有限公司 动画素材的播放方法和装置、存储介质及电子装置
CN111601139A (zh) * 2020-04-27 2020-08-28 维沃移动通信有限公司 信息显示方法、电子设备及存储介质
CN111526411A (zh) * 2020-04-29 2020-08-11 北京字节跳动网络技术有限公司 视频的处理方法、装置、设备及介质

Also Published As

Publication number Publication date
CN112181572A (zh) 2021-01-05

Similar Documents

Publication Publication Date Title
WO2022062788A1 (zh) 互动特效展示方法及终端
CN108776568B (zh) 网页页面的显示方法、装置、终端及存储介质
CN112162671B (zh) 直播数据处理方法、装置、电子设备及存储介质
WO2022121358A1 (zh) 信息显示方法及其装置
CN109729411B (zh) 直播互动方法及装置
WO2022088884A1 (zh) 页面展示方法和终端
CN108737897B (zh) 视频播放方法、装置、设备及存储介质
CN107908929B (zh) 播放音频数据的方法和装置
CN109275013B (zh) 虚拟物品展示的方法、装置、设备及存储介质
WO2022033227A1 (zh) 信息显示方法及装置
CN112016941A (zh) 虚拟物品的领取方法、装置、终端及存储介质
CN109766098B (zh) 应用程序的运行方法、设备及存储介质
CN108900925B (zh) 设置直播模板的方法和装置
WO2023050737A1 (zh) 基于直播间的资源展示方法及终端
EP4093032A1 (en) Method and apparatus for displaying data
EP4184412A1 (en) Method and apparatus for presenting resources
CN112044065B (zh) 虚拟资源的显示方法、装置、设备及存储介质
WO2022134632A1 (zh) 作品处理方法及装置
WO2022095465A1 (zh) 信息显示方法及装置
CN107656794B (zh) 界面显示方法和装置
US20220291791A1 (en) Method and apparatus for determining selected target, device, and storage medium
CN109800003B (zh) 应用下载方法、装置、终端及存储介质
WO2022127488A1 (zh) 人机交互界面的控制方法、装置、计算机设备及存储介质
CN113613028A (zh) 直播数据处理方法、装置、终端、服务器及存储介质
EP4125274A1 (en) Method and apparatus for playing videos

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21871161

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21871161

Country of ref document: EP

Kind code of ref document: A1