WO2020103657A1 - Video file playing method, device and storage medium (视频文件播放方法、装置和存储介质) - Google Patents

Video file playing method, device and storage medium

Info

Publication number
WO2020103657A1
Authority
WO
WIPO (PCT)
Prior art keywords
animation
click
screen
position information
file
Prior art date
Application number
PCT/CN2019/114292
Other languages
English (en)
French (fr)
Inventor
袁佳平
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司
Priority to EP19886284.9A priority Critical patent/EP3886449A4/en
Publication of WO2020103657A1 publication Critical patent/WO2020103657A1/zh
Priority to US17/085,797 priority patent/US11528535B2/en

Links

Images

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N21/8146 Monomedia components thereof involving graphical data, e.g. 3D object, 2D graphics
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213 Monitoring of end-user related data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/80 2D [Two Dimensional] animation, e.g. using sprites
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47 End-user applications
    • H04N21/472 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47205 End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for manipulating displayed content, e.g. interacting with MPEG-4 objects, editing locally
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N21/816 Monomedia components thereof involving special video data, e.g. 3D video
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 Assembly of content; Generation of multimedia applications
    • H04N21/854 Content authoring
    • H04N21/8545 Content authoring for generating interactive applications

Definitions

  • The present application relates to the technical field of data processing, and in particular to a video file playing method, device, and storage medium.
  • Video generally refers to the various technologies that capture, record, process, store, transmit, and reproduce a series of still images in the form of electrical signals.
  • When images change continuously at a rate above 24 frames per second, the principle of persistence of vision means the human eye cannot distinguish the individual static pictures, and the sequence appears as a smooth, continuous visual effect.
  • Such a continuous sequence of pictures is called video.
  • The development of network technology has also allowed recorded video clips to exist on the Internet in the form of streaming media, where they can be received and played by terminal devices such as computers and mobile phones.
  • The user's interactive operations with a video may include video playback control operations, such as starting and pausing playback, adjusting the playback volume, and controlling the playback progress.
  • An embodiment of the present application provides a video file playback method, executed by a terminal device, which includes:
  • playing an animation file frame by frame according to the playback time of a video file, where the video file includes at least one display object and the animation file includes an animation element generated according to the display object; and
  • determining the interactive operation corresponding to a hit animation element, and performing the interactive operation.
  • An embodiment of the present application provides a video file playback device, including:
  • a playback unit, configured to play an animation file frame by frame according to the playback time of a video file during playback of the video file, where the video file includes at least one display object and the animation file includes an animation element generated according to the display object;
  • a first determining unit, configured to determine the click position information of a screen click event when a screen click event is detected while the video file is playing;
  • a matching unit, configured to determine, in the animation file, an animation element display area matching the click position information of the screen click event;
  • a second determining unit, configured to determine the animation element hit by the screen click event according to the matching animation element display area;
  • the playback unit is further configured to determine an interactive operation corresponding to the hit animation element and perform the interactive operation.
  • An embodiment of the present application provides a computing device, including at least one processor and at least one memory, where the memory stores a computer program that, when executed by the processor, causes the processor to perform the steps of any of the video file playback methods described above.
  • An embodiment of the present application also provides a computer-readable medium storing a computer program executable by a terminal device; when the program runs on the terminal device, it causes the terminal device to perform the steps of any of the video file playback methods described above.
  • FIG. 1 is a schematic diagram of an application scenario of a video file playing method according to an embodiment of the present application
  • FIG. 2 is a schematic diagram of a video file in an embodiment of this application.
  • FIG. 3 is a schematic diagram of an animation file generated in an embodiment of this application.
  • FIG. 4 is a schematic diagram of an animation frame including animation elements in an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a frame editing environment of animation production software in an embodiment of the present application.
  • FIG. 6A is a schematic diagram of a video playback interface after combining an animation file and a video file in an embodiment of the present application;
  • FIG. 6B is a schematic diagram of another video playback interface after combining an animation file and a video file in an embodiment of the present application;
  • FIG. 7A is a schematic diagram of an implementation process of a video file playing method in an embodiment of this application.
  • FIG. 7B is a schematic flowchart of step S72 in the embodiment of the present application.
  • FIG. 7C is a specific flowchart of determining an animation element display area corresponding to an animation element included in the animation file in an embodiment of the present application
  • FIG. 7D is a specific flow diagram of obtaining the display position information of the animation element in the screen coordinate system in step S721 in the embodiment of the present application;
  • FIG. 7E is a specific flow diagram of determining an animation element display area in the animation file that coincides with the click area in step S712 in an embodiment of the present application;
  • FIG. 8 is a schematic diagram of the relationship between the click area and the display area of the animation element in the embodiment of the present application.
  • FIG. 9 is a schematic diagram of an implementation process of a video file playing method according to another embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of a video file playback device in an embodiment of this application.
  • FIG. 11 is a schematic structural diagram of a computing device according to an embodiment of the present application.
  • Embodiments of the present application provide a video file playback method, device, and storage medium.
  • HTML5 defines a standard way to include video through the video element, which can be used to play video.
  • The terminal device in this application may be a personal computer (PC), tablet computer, personal digital assistant (PDA), personal communication service (PCS) phone, notebook computer, mobile phone, or similar terminal device, or a computer with a mobile terminal, such as a portable, pocket-sized, handheld, computer-built-in, or vehicle-mounted mobile device that can provide users with voice and/or data connectivity and exchange voice and/or data with a wireless access network.
  • PC: Personal Computer
  • PDA: Personal Digital Assistant
  • PCS: Personal Communication Service
  • "Multiple" or "several" refers to two or more.
  • "And/or" describes an association between related objects and indicates that three relationships are possible; for example, "A and/or B" can mean: A exists alone, A and B exist at the same time, or B exists alone.
  • The character "/" generally indicates an "or" relationship between the related objects.
  • FIG. 1 is a schematic diagram of an application scenario of a video file playing method provided by an embodiment of the present application.
  • The user 10 accesses the server 12 through a client installed in the terminal device 11, where the client may be a web browser or an application client installed in a terminal device such as a mobile phone or tablet computer.
  • the server 12 may provide different services for the user 10, for example, video playback services, game services, and so on.
  • The terminal device 11 and the server 12 communicate over a network, which may be a local area network, a cellular network, a wide area network, or the like.
  • The terminal device 11 may be a portable device (for example, a mobile phone, tablet, or notebook computer) or a personal computer (PC), and the server 12 may be any device that can provide Internet services.
  • Existing user interaction with video is limited to the user's control of video playback; the user cannot further interact with a display object in the video, such as a character in the video.
  • The embodiments of the present application therefore provide a video file playing method for detecting user interaction with objects displayed in the video.
  • the video resources involved in the embodiments of the present application include two parts, one is an animation file and the other is a video file.
  • The animation file can be generated with Flash software or other animation editing tools and carries the animation effects and logic information; the video file can be a file in a format such as MP4.
  • the video file includes at least one display object, and the animation file includes an animation element generated according to each of the display objects in each video frame in the video file, and the animation element is used to identify the display area of the corresponding display object.
  • The position of each animation element is determined according to the corresponding display object in each video frame of the video file; that is, the animation file includes an animation element corresponding to the display object in each video frame of the video.
  • In this way, the movement position of the animation element can be kept consistent with the movement position of the display object in the video file, so that by comparing the position of the user's screen click operation with the position of the animation element, the user's interaction with the display object in the video file can be identified.
  • When an interaction is identified, a corresponding interactive result can be displayed, for example an interactive reward score; when the score reaches a certain value, a membership upgrade, game clearance, or other interactive reward can be granted.
  • For example, a corresponding animation file is created for the frame image 201 shown in FIG. 2, and the created animation file includes an animation element corresponding to the display object in the frame image 201, as shown in FIG. 3.
  • The display object 202 may be the hands of the person in the frame image 201; the animation element then corresponds to the person's hands in the frame image 201 of the video file.
  • The animation element may be a rectangular frame representing the position of the person's hands 202 in the animation file.
  • As shown in FIG. 4, the rectangular frame 40 in the figure is an animation element created with animation production software to represent the position 202 of the display object in the video.
  • FIG. 5 is a schematic diagram of a frame editing environment of animation production software.
  • The rectangular frame 40 of FIG. 4, indicating the position of the person's hands 202, is placed in the frame of the animation file corresponding to the frame image 201.
  • During playback, the rectangular frames in each frame of the animation file are played frame by frame in synchronization with the video, so that the moving position of each rectangular frame stays consistent with the moving position of the corresponding display object in the video.
  • When the user clicks on the screen, it is determined whether the clicked position matches the position of the rectangular frame representing the display object in the video; if it matches, it is determined that the user has hit the display object in the video.
  • In that case, the interactive operation corresponding to the animation element is performed.
  • The interactive operation may be a corresponding response operation, such as awarding the user a score, thereby enabling the user to interact with the objects shown in the video.
  • FIG. 6A is a schematic diagram of a video playback interface after an animation file and a video file are combined.
  • During playback, the animation frames are played synchronously, and the moving position of the rectangular frame 40 always stays consistent with the moving position of the person's hands 202 in the video file.
  • The interactive operation may be, for example, scoring the user: if a hit is determined, 10 points may be added; otherwise, no points are scored. This realizes user interaction with the displayed object in the video.
  • Alternatively, the interactive operation corresponding to the animation element may be preset behavior feedback by a character in the video; for example, when it is determined that the user hits the character's hands, the character in the video is triggered to interact with the user by clapping hands in real time.
  • During specific implementation, the animation file can be played frame by frame in a hidden layer according to the playback time of the video file; that is, the animation file played synchronously with the video file is invisible to the user. It is used to determine whether the user's screen click operation hits the display position of a display object in the video file, and the interaction between the user and the display object is recognized according to the result of that judgment.
  • FIG. 6B is a schematic diagram of another video playback interface after the animation file and the video file are combined.
  • An operation guide area can also be added to the animation file, as shown by the circular area 60 in FIG. 6B. The operation guide area is visible to the user, and it may be larger than the animation element display area described above so that the user can identify it easily.
  • During playback of a video file, the animation frame that should currently be played can be determined from the current playhead position of the video file and the frame rate of the video playback.
  • The playhead position refers to the current play position of the video file. For example, if a video file has currently played to 16 seconds, the current play position of the video file is 16 seconds; that is, the playhead position is 16 seconds.
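To make the synchronization concrete, the mapping from playhead position to animation frame can be sketched as follows. This is a minimal illustration only; the function and variable names are assumptions, not taken from the patent.

```javascript
// Map the video playhead position (in seconds) to the animation frame
// that should currently be shown, so the overlay stays in sync.
function currentAnimationFrame(playheadSeconds, frameRate, totalFrames) {
  // The frame index is the number of whole frame intervals elapsed.
  const frame = Math.floor(playheadSeconds * frameRate);
  // Clamp so a playhead at or past the end maps to the last frame.
  return Math.min(frame, totalFrames - 1);
}

// e.g. at a 16-second playhead with a 24 fps animation, frame 384 is shown
console.log(currentAnimationFrame(16, 24, 1000)); // 384
```

In a browser implementation, `playheadSeconds` would typically come from the video element's current playback time, polled on each render tick.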
  • During specific implementation, a display position information set in the video coordinate system may be established for the animation element. This set includes the display position information of the animation element corresponding to the display object in each video frame; the display position information is the position information of the display object corresponding to the animation element at its different display positions in the video.
  • The display position information can be expressed as the starting point coordinates (x0, y0) of the display position together with the width value w1 and height value h1 of the display area, which correspond to a rectangular display area in the video.
  • The display parameters of the screen may include the display screen size, the display resolution, and so on.
  • The display position information set of the animation element in the screen coordinate system is obtained by converting each piece of display position information in the video coordinate system into the corresponding display position information in the screen coordinate system.
  • In the screen coordinate system, the display position information likewise includes starting point coordinates (x0', y0'), a display width value w2, and a display height value h2, which correspond to a rectangular area on the display screen; the rectangular area covers a number of pixel coordinates.
  • Each piece of display position information in the animation element's display position information set in the video coordinate system is converted into the corresponding display position information in the screen coordinate system, yielding the display position information set in the screen coordinate system. That set includes multiple pieces of display position information, each of which corresponds to a rectangular area on the display screen that likewise covers a number of pixel coordinates.
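The conversion between the two coordinate systems can be sketched as a simple scaling. This assumes uniform scaling of the video to the on-screen playback area with no letterboxing; all names are hypothetical.

```javascript
// Convert an animation element's rectangle from video coordinates to
// screen coordinates by scaling with the ratio between the on-screen
// playback size and the video's native size.
function toScreenCoords(videoRect, videoSize, screenSize) {
  const sx = screenSize.width / videoSize.width;   // horizontal scale
  const sy = screenSize.height / videoSize.height; // vertical scale
  return {
    x: videoRect.x * sx,
    y: videoRect.y * sy,
    width: videoRect.width * sx,
    height: videoRect.height * sy,
  };
}

const r = toScreenCoords(
  { x: 100, y: 50, width: 80, height: 40 }, // rect (x0, y0, w1, h1) in video coords
  { width: 1280, height: 720 },             // native video resolution
  { width: 640, height: 360 }               // on-screen playback size
);
console.log(r); // { x: 50, y: 25, width: 40, height: 20 }
```

A fuller implementation would also account for letterboxing offsets when the video and screen aspect ratios differ.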
  • Accordingly, the animation file includes a display list composed of the animation elements corresponding to each display object in each video frame of the video file; the display list also includes the display position information of each animation element in the video coordinate system and the screen coordinate system. Since the display position of a display object differs between video frames, one display object may correspond to multiple animation elements at different display positions.
  • The display position information of each animation element in the video coordinate system and the screen coordinate system constitutes a prototype chain for each display object. Table 1 shows a possible data structure of the display list:
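Since Table 1 itself is not reproduced in this extract, the following is only a hypothetical sketch of what such a display list might look like; all field names are illustrative, not taken from the patent.

```javascript
// Hypothetical display list: per display object, one entry per video frame,
// holding the element's rectangle in both coordinate systems.
const displayList = [
  {
    displayObject: "hands", // the tracked object in the video
    frames: [
      {
        frameIndex: 0,
        videoRect: { x: 100, y: 50, width: 80, height: 40 },  // video coords
        screenRect: { x: 50, y: 25, width: 40, height: 20 },  // screen coords
      },
      // ...one entry per video frame in which the object appears
    ],
  },
];

// Look up the screen-space rectangle of an object at a given frame.
function rectAt(list, objectName, frameIndex) {
  const obj = list.find((o) => o.displayObject === objectName);
  if (!obj) return null;
  const entry = obj.frames.find((f) => f.frameIndex === frameIndex);
  return entry ? entry.screenRect : null;
}

console.log(rectAt(displayList, "hands", 0)); // { x: 50, y: 25, width: 40, height: 20 }
```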
  • Based on the above, an embodiment of the present application provides a video file playback method, as shown in FIG. 7A, which may include the following steps:
  • The animation file is played frame by frame according to the playing time of the video file.
  • The video file includes at least one display object, and the animation file includes an animation element generated according to the display object.
  • During playback of the video file, the corresponding animation file is played frame by frame, so that each video frame of the video file and each frame of the animation file are played synchronously, keeping the moving position of the animation element in each animation frame consistent with the moving position of the display object in each video frame of the video file.
  • When a screen click event is detected, the click position information of the screen click event is determined. The click position information can be expressed as the coordinates at which the user clicked the screen, i.e. the horizontal coordinate X and the vertical coordinate Y of the clicked point.
  • According to the click position information of the screen click event, an animation element display area matching the click position information is determined in the animation file.
  • The animation element display area matching the click position information of the screen click event may be a display area that coincides with the position indicated by the click position information.
  • FIG. 7B is a schematic flowchart of step S72. As shown in FIG. 7B, it includes the following steps:
  • S711: Determine the click area corresponding to the click position information of the screen click event.
  • S712: Determine an animation element display area in the animation file that coincides with the click area, and use the determined animation element display area as the animation element display area matching the screen click event.
  • During specific implementation, a certain tolerance can be added to the detected click coordinates to obtain a corresponding click area that simulates the contact area of the user's finger.
  • For example, a circular area centered on the click coordinate with radius L can be determined as the click area corresponding to the click event, where L is a preset value; in actual implementation, the size of L can be set according to actual needs, which is not limited in the embodiments of the present application.
  • Alternatively, the click coordinate can be used as the center point of a rectangular area determined by a preset length and width, or of a square area determined by a preset side length, and that area can be used as the click area.
  • The click area can also be another regular or irregular area, which is not limited in the embodiments of the present application.
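A minimal sketch of constructing the click areas described above; the function names and the default tolerance value are assumptions for illustration.

```javascript
// Circular click area: centered on the click coordinate with preset radius L.
function circularClickArea(x, y, L) {
  return { kind: "circle", cx: x, cy: y, r: L };
}

// Rectangular click area: centered on the click coordinate with preset
// width and height (a square is the special case width === height).
function rectangularClickArea(x, y, width, height) {
  return { kind: "rect", x: x - width / 2, y: y - height / 2, width, height };
}

console.log(rectangularClickArea(100, 100, 40, 20));
// { kind: 'rect', x: 80, y: 90, width: 40, height: 20 }
```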
  • As noted above, a display object can correspond to multiple animation elements in the animation file.
  • After the click area corresponding to the screen click event is determined, the display position information of each animation element in the screen coordinate system is traversed in turn, and for the display area corresponding to each piece of display position information, it is determined whether that area coincides with the click area.
  • For example, the circular area 80 shown in FIG. 8 is the click area determined in step S72, and the rectangular area 81 in FIG. 8 is the display area of an animation element on the display screen (also called the animation element display area). If the circular area 80 overlaps the rectangular area 81, it can be determined that the screen click event hits the animation element.
  • The following method can be used to determine whether the circular area and the rectangular area overlap: determine whether any pixel coordinate included in the click area coincides with at least one pixel coordinate included in the animation element display area. If so, the click area coincides with the animation element display area; if no pixel coordinate in the click area coincides with any pixel coordinate in the animation element display area, the click area does not coincide with the animation element display area.
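The pixel-coincidence test above can also be computed in closed form for a circular click area and a rectangular display area: the two overlap exactly when the point of the rectangle closest to the circle's center lies within the circle's radius. A sketch of that equivalent check (names are illustrative):

```javascript
// True iff a circle { cx, cy, r } overlaps an axis-aligned
// rectangle { x, y, width, height }.
function circleIntersectsRect(circle, rect) {
  // Clamp the circle center to the rectangle to find its closest point.
  const closestX = Math.max(rect.x, Math.min(circle.cx, rect.x + rect.width));
  const closestY = Math.max(rect.y, Math.min(circle.cy, rect.y + rect.height));
  const dx = circle.cx - closestX;
  const dy = circle.cy - closestY;
  // Compare squared distances to avoid a square root.
  return dx * dx + dy * dy <= circle.r * circle.r;
}

const hit = circleIntersectsRect(
  { cx: 90, cy: 50, r: 15 },
  { x: 100, y: 40, width: 40, height: 20 }
);
const miss = circleIntersectsRect(
  { cx: 10, cy: 10, r: 5 },
  { x: 100, y: 40, width: 40, height: 20 }
);
console.log(hit, miss); // true false
```

This avoids enumerating pixel coordinates, which matters when the check runs on every click against every element in the display list.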
  • FIG. 7C is a specific flowchart of determining the animation element display area corresponding to an animation element included in the animation file. As shown in FIG. 7C, it includes the following steps:
  • S721: Obtain the display position information of the animation element in the screen coordinate system.
  • S722: Determine the animation element display area corresponding to the animation element according to the display position information of the animation element in the screen coordinate system.
  • FIG. 7D is a specific flow diagram of obtaining the display position information of the animation element in the screen coordinate system in step S721. As shown in FIG. 7D, it specifically includes the following steps:
  • S731: Determine the display position information of the animation element in the video coordinate system according to the display position information of the corresponding display object in different video frames.
  • S732: Convert the display position information of the animation element in the video coordinate system into display position information in the screen coordinate system according to the display parameters of the display screen on which the video file is played.
  • The display position information includes the coordinates of the starting point of the animation element display area and the width and height values of the animation element display area.
  • FIG. 7E is a specific flow diagram of determining the animation element display area in the animation file that coincides with the click area in step S712. As shown in FIG. 7E, it specifically includes the following steps:
  • S73: Determine the animation element hit by the screen click event according to the matching animation element display area. During specific implementation, if the click area is determined to match the animation element display area corresponding to any animation element, it can further be determined whether that animation element corresponds to a specified display object, for example, whether it is an animation element representing the character's hands in the video. If it is, it is determined that the screen click event is a hit; otherwise, the screen click event is ignored.
  • The interactive operation may be a corresponding response operation, such as showing a corresponding score to the user, or a character in the video making preset behavior feedback, such as a preset gesture action.
  • For example, when it is determined that the screen click event hits the animation element, the interactive operation corresponding to the animation element is performed; the interactive operation may be a response operation such as scoring the user, so that the user interacts with the display objects in the video.
  • Alternatively, the interactive operation may be preset behavior feedback by a character in the video; for example, when it is determined that the user hits the character's hands, the character in the video is triggered to interact with the user by clapping hands in real time.
  • Taking the tracking of gestures in a video file as an example, as shown in FIG. 9, the method can include the following steps:
  • The loaded assets include a video file and an animation file generated for the video file, where the movement position of the gesture-tracking animation element included in the animation file corresponds to the movement position of the person's hands in the video file.
  • The loaded animation component is parsed, the parsed animation is rendered onto a canvas, and the canvas is then placed above the video window. It should be noted that the animation rendered on the canvas is invisible to the user.
  • step S94 Determine whether the playback of the video file is completed. If yes, perform step S913. If no, perform step S95.
  • the playback head position of the video is detected in real time, the frame position of the currently playing animation file is calculated according to the frame rate of the video playback and the current playback head position, and the currently playing animation is changed again according to the calculation result Frame to achieve the purpose of synchronizing the gesture animation element with the position of the hands in the video.
  • step S910 determines whether the click coordinate matches any display area of the animation element. If yes, perform step S910; if not, perform step S912.
  • this step first determine the corresponding click area according to the click coordinates, and then determine whether there is at least one pixel coordinate corresponding to at least one animation element included in the display list according to the pixel coordinates included in the click area The coordinates of the included pixels coincide. If yes, it is determined that the click coordinates match any display area of the animation element. If not, it is determined that the click coordinates do not match the display area of all animation elements.
  • step S910 Determine whether the animation element whose position matches is a gesture animation element. If yes, perform step S911; if not, perform step S912.
  • the matching animation element is a gesture animation element. If it is, it can be determined that the screen click event hits the character's hands in the video. If not, the operation can be ignored.
  • step S911: determine that the screen click event hit the character's hands in the video, determine the interactive operation corresponding to the hit hands, execute the interactive operation, and return to step S94.
  • with the video file playback method, an animation file including corresponding animation elements is generated for the display objects included in the video file. When the video is played, animation frames are played synchronously according to the video playback progress while screen click events are listened for; when a screen click event is detected, the corresponding click area is determined from the click position information of the event, and the click area is compared against the animation element display areas to judge whether the screen click event hit an animation element. This realizes deep interaction between the user and the display objects in the video and improves the user's experience of interacting with the video: instead of merely performing rigid, tedious click interactions, the user interacts richly with the characters in the video as if immersed in it, which increases the user's initiative and enthusiasm in participating in video operations.
  • the video file playback method provided in the embodiment of the present application may be applied to a client including a video player such as a video playback client, or may be applied to a video playback webpage, which is not limited in the embodiment of the present application.
  • a video file playback apparatus is also provided in the embodiments of the present application. Since the principle by which the apparatus solves the problem is similar to that of the video file playback method, the implementation of the apparatus may refer to the implementation of the method, and repeated details are not described again.
  • FIG. 10 is a schematic structural diagram of a video file playback device provided by an embodiment of the present application, including:
  • the playing unit 101 is configured to play the animation file frame by frame according to the playing time of the video file during the playing of the video file, where the video file includes at least one display object and the animation file includes animation elements generated from the display objects;
  • the first determining unit 102 is configured to determine the click position information of the screen click event when a screen click event is heard during playing a video file;
  • the matching unit 103 is configured to determine an animation element display area matching the screen click event in the animation file according to the click position information of the screen click event;
  • the second determining unit 104 is configured to determine the animation element corresponding to the matched display area as the animation element hit by the screen click event;
  • the playing unit 101 is further configured to determine an interactive operation corresponding to the hit animation element and execute the interactive operation.
  • the matching unit 103 is configured to determine a click area including the click position according to the click position information of the screen click event, determine an animation element display area in the animation file that coincides with the click area, and use the determined animation element display area as the animation element display area matching the screen click event.
  • the click location information includes click coordinates
  • the matching unit 103 is configured to determine a click area including the click coordinates according to the click coordinates of the screen click event.
  • the video file playback device provided by the embodiments of the present application further includes:
  • the third determining unit 105 is configured to: before the matching unit determines, in the animation file according to the click position information of the screen click event, an animation element display area matching the click position information of the screen click event, obtain, for any animation element included in the animation file, the display position information of the animation element in the screen coordinate system; and determine the animation element display area corresponding to the animation element according to the display position information of the animation element in the screen coordinate system.
  • the third determining unit 105 is configured to determine the display position information of the animation element in the video coordinate system according to the display position information of the display object corresponding to the animation element in different video frames, and to convert, according to the display parameters of the display screen playing the video file, the display position information of the animation element in the video coordinate system into the display position information in the screen coordinate system.
  • the display position information includes the coordinates of the starting point of the display area of the animation element and the width and height values of the display area of the animation element;
  • the matching unit 103 is configured to: for any animation element display area in the animation file, determine the coordinates of the pixels included in the animation element display area according to the coordinates of the starting point of the animation element display area and the width and height values of the animation element display area; and, if the coordinates of a pixel included in the click area coincide with the coordinates of at least one pixel included in the animation element display area, determine that the animation element display area is the animation element display area in the animation file that coincides with the click area.
  • the playback unit 101 is configured to determine the animation frame in the currently playing animation file according to the current playback position of the video file and the frame rate of the video playback.
  • the interactive operation may be to perform a corresponding response operation, such as showing a corresponding score to the user, or a character in the video may make preset behavior feedback, such as a preset gesture action.
  • for convenience of description, the above parts are described as divided by function into modules (or units). Of course, when implementing this application, the functions of the modules (or units) may be implemented in one or more pieces of software or hardware.
  • the computing device may include at least one processor and at least one memory.
  • the memory stores program code, and when the program code is executed by the processor, the processor is caused to perform the steps of the video file playback method according to the various exemplary embodiments of the present application described above in this specification.
  • the processor may perform step S71 as shown in FIG. 7A: when a screen click event is detected during the playback of a video file, determine the click position information of the screen click event; step S72: determine, in the animation file according to the click position information of the screen click event, the animation element display area matching the screen click event; step S73: determine the animation element corresponding to the matched animation element display area as the animation element hit by the screen click event; and step S74: determine the interactive operation corresponding to the hit animation element, and execute the interactive operation.
  • the computing device 110 according to this embodiment of the present application is described below with reference to FIG. 11.
  • the computing device 110 shown in FIG. 11 is only an example, and should not bring any limitation to the functions and usage scope of the embodiments of the present application.
  • the computing device 110 is expressed in the form of a general-purpose computing device.
  • the components of the computing device 110 may include, but are not limited to: the at least one processor 111, the at least one memory 112, and a bus 113 connecting different system components (including the memory 112 and the processor 111).
  • the bus 113 represents one or more of several types of bus structures, including a memory bus or a memory controller, a peripheral bus, a processor, or a local bus that uses any of a variety of bus structures.
  • the memory 112 may include a readable medium in the form of volatile memory, such as random access memory (RAM) 1121 and / or cache memory 1122, and may further include read only memory (ROM) 1123.
  • the memory 112 may further include a program / utility tool 1125 having a set of (at least one) program modules 1124.
  • program modules 1124 include but are not limited to: an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may include an implementation of a network environment.
  • the computing device 110 may also communicate with one or more external devices 114 (e.g., keyboard, pointing device, etc.), with one or more devices that enable a user to interact with the computing device 110, and/or with any device (e.g., router, modem, etc.) that enables the computing device 110 to communicate with one or more other computing devices. Such communication may be performed through an input/output (I/O) interface 115.
  • the computing device 110 may also communicate with one or more networks (such as a local area network (LAN), a wide area network (WAN), and / or a public network, such as the Internet) through the network adapter 116. As shown, the network adapter 116 communicates with other modules for the computing device 110 via the bus 113.
  • it should be understood that other hardware and/or software modules may be used in conjunction with the computing device 110, including but not limited to: microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
  • various aspects of the video file playback method provided by the present application may also be implemented in the form of a program product, which includes program code; when the program product runs on a computer device, the program code is used to cause the computer device to perform the steps in the video file playback method according to the various exemplary embodiments of the present application described above in this specification.
  • the computer device may perform step S71 as shown in FIG. 7A.
  • step S71: when a screen click event is detected during the playback of a video file, determine the click position information of the screen click event; step S72: determine, in the animation file according to the click position information of the screen click event, the animation element display area matching the screen click event; step S73: determine the animation element corresponding to the matched animation element display area as the animation element hit by the screen click event; and step S74: determine the interactive operation corresponding to the hit animation element, and execute the interactive operation.
  • the program product may employ any combination of one or more computer-readable storage media.
  • the readable medium may be a readable signal medium or a computer-readable storage medium.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples (a non-exhaustive list) of computer-readable storage media include: an electrical connection with one or more wires, a portable disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • the program product for playing video files may use a portable compact disk read-only memory (CD-ROM) and include program code, and may run on a computing device.
  • the program product of the present application is not limited to this.
  • the computer-readable storage medium may be any tangible medium including or storing a program, which may be used by or in combination with an instruction execution system, apparatus, or device.
  • the readable signal medium may include a data signal that is propagated in baseband or as part of a carrier wave, in which readable program code is carried. This propagated data signal can take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the above.
  • the readable signal medium may also be any readable medium other than a computer-readable storage medium, and the readable medium may send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device.
  • the program code included on the readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wired, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • the program code for performing the operations of the present application can be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • the program code may be executed entirely on the user's computing device, partly on the user's device, as an independent software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server On the implementation.
  • the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet by using an Internet service provider).
  • the embodiments of the present application may be provided as methods, systems, or computer program products. Therefore, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the present application may take the form of a computer program product implemented on one or more computer usable storage media (including but not limited to disk storage, CD-ROM, optical storage, etc.) including computer usable program code.
  • These computer program instructions can be provided to the processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
  • These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operation steps are performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

Abstract

This application discloses a video file playback method, apparatus, and storage medium. The video file playback method includes: during playback of a video file, playing an animation file frame by frame according to the playback time of the video file, where the video file includes at least one display object and the animation file includes animation elements generated from the display objects; when a screen click event is detected during playback of the video file, determining click position information of the screen click event; determining, in the animation file according to the click position information of the screen click event, an animation element display area that matches the click position information of the screen click event; determining, according to the matched animation element display area, the animation element hit by the screen click event; and determining an interactive operation corresponding to the hit animation element, and executing the interactive operation.

Description

Video file playback method, apparatus, and storage medium
This application claims priority to Chinese Patent Application No. 201811377143.9, entitled "Video file operation method, apparatus, and storage medium", filed with the China National Intellectual Property Administration on November 19, 2018, which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of data processing technologies, and in particular, to a video file playback method, apparatus, and storage medium.
Background
Video broadly refers to the various technologies for capturing, recording, processing, storing, transmitting, and reproducing a series of still images as electrical signals. When successive images change at more than 24 frames per second, the human eye cannot distinguish individual still frames because of the persistence-of-vision principle, and the images appear as a smooth, continuous visual effect; such continuous images are called video. The development of network technologies has also enabled recorded video clips to exist on the Internet as streaming media that can be received and played by terminal devices such as computers and mobile phones.
In existing video playback, a user's interaction with a video may include playback control operations, for example, starting and pausing playback, adjusting the playback volume, and controlling the playback progress.
Summary
An embodiment of this application provides a video file playback method, performed by a terminal device, including:
during playback of a video file, playing an animation file frame by frame according to the playback time of the video file, where the video file includes at least one display object and the animation file includes animation elements generated from the display objects;
when a screen click event is detected during playback of the video file, determining click position information of the screen click event;
determining, in the animation file according to the click position information of the screen click event, an animation element display area that matches the click position information of the screen click event;
determining, according to the matched animation element display area, the animation element hit by the screen click event; and
determining an interactive operation corresponding to the hit animation element, and executing the interactive operation.
An embodiment of this application provides a video file playback apparatus, including:
a playing unit, configured to play the animation file frame by frame according to the playback time of the video file during playback of the video file, where the video file includes at least one display object and the animation file includes animation elements generated from the display objects;
a first determining unit, configured to determine click position information of a screen click event when the screen click event is detected during playback of the video file;
a matching unit, configured to determine, in the animation file according to the click position information of the screen click event, an animation element display area that matches the click position information of the screen click event;
a second determining unit, configured to determine, according to the matched animation element display area, the animation element hit by the screen click event; and
the playing unit, further configured to determine an interactive operation corresponding to the hit animation element and execute the interactive operation.
An embodiment of this application provides a computing apparatus, including at least one processor and at least one memory, where the memory stores a computer program that, when executed by the processor, causes the processor to perform the steps of any one of the foregoing video file playback methods.
An embodiment of this application further provides a computer-readable medium storing a computer program executable by a terminal device, where the program, when run on the terminal device, causes the terminal device to perform the steps of any one of the foregoing video file playback methods.
Brief Description of the Drawings
The accompanying drawings described here are provided for a further understanding of this application and constitute a part of this application. The exemplary embodiments of this application and their descriptions are used to explain this application and do not constitute an improper limitation on it. In the drawings:
FIG. 1 is a schematic diagram of an application scenario of a video file playback method according to an embodiment of this application;
FIG. 2 is a schematic diagram of a video file in an embodiment of this application;
FIG. 3 is a schematic diagram of a generated animation file in an embodiment of this application;
FIG. 4 is a schematic diagram of an animation frame including an animation element in an embodiment of this application;
FIG. 5 is a schematic diagram of the frame editing environment of animation production software in an embodiment of this application;
FIG. 6A is a schematic diagram of a video playback interface combining an animation file with a video file in an embodiment of this application;
FIG. 6B is a schematic diagram of another video playback interface combining an animation file with a video file in an embodiment of this application;
FIG. 7A is a schematic flowchart of a video file playback method in an embodiment of this application;
FIG. 7B is a schematic flowchart of the details of step S72 in an embodiment of this application;
FIG. 7C is a schematic flowchart of determining the animation element display areas corresponding to the animation elements included in the animation file in an embodiment of this application;
FIG. 7D is a schematic flowchart of obtaining, in step S721, the display position information of an animation element in the screen coordinate system in an embodiment of this application;
FIG. 7E is a schematic flowchart of determining, in step S712, the animation element display area in the animation file that coincides with the click area in an embodiment of this application;
FIG. 8 is a schematic diagram of the relationship between a click area and an animation element display area in an embodiment of this application;
FIG. 9 is a schematic flowchart of a video file playback method according to another embodiment of this application;
FIG. 10 is a schematic structural diagram of a video file playback apparatus in an embodiment of this application;
FIG. 11 is a schematic structural diagram of a computing apparatus according to an embodiment of this application.
Embodiments
To enable users to interact with the display objects in a video file and to improve the experience of interacting with a video, embodiments of this application provide a video file playback method, apparatus, and storage medium.
The embodiments of this application are described below with reference to the accompanying drawings. It should be understood that the embodiments described here are only used to illustrate and explain this application and are not intended to limit it, and that, where no conflict arises, the embodiments and the features of the embodiments may be combined with each other.
First, some terms involved in the embodiments of this application are explained to facilitate understanding by those skilled in the art.
Canvas: a part of HTML5 that allows scripting languages to dynamically render bitmap images.
Video tag: HTML5 defines a markup method that includes video through the video element; the video element can be used to play videos.
The terminal device in this application may be a personal computer (PC), tablet computer, personal digital assistant (PDA), personal communication service (PCS) phone, notebook computer, mobile phone, or other terminal device; it may also be a computer with a mobile terminal, for example a portable, pocket-sized, handheld, computer-built-in, or vehicle-mounted mobile apparatus, capable of providing voice and/or data connectivity to the user and exchanging voice and/or data with a radio access network.
In addition, the terms "first", "second", and so on in the specification, claims, and drawings of the embodiments of this application are used to distinguish similar objects and are not necessarily used to describe a particular order or sequence. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments described here can be implemented in orders other than those illustrated or described here.
"Multiple" or "several" mentioned in this document means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate three cases: A alone, both A and B, and B alone. The character "/" generally indicates an "or" relationship between the preceding and following associated objects.
FIG. 1 is a schematic diagram of an application scenario of the video file playback method provided by an embodiment of this application. A user 10 accesses a server 12 through a client installed on a terminal device 11, where the client may be a web browser or an application client installed on a terminal device such as a mobile phone or tablet computer. The server 12 may provide the user 10 with different services, for example, a video playback service or a game service.
The terminal device 11 and the server 12 are communicatively connected through a network, which may be a local area network, a cellular network, a wide area network, or the like. The terminal device 11 may be a portable device (for example, a mobile phone, tablet, or notebook computer) or a personal computer (PC), and the server 12 may be any device capable of providing Internet services.
Existing user-video interaction is limited to the user's control of video playback; the user cannot further interact with the display objects in the video, that is, with the characters in the video. In view of this, an embodiment of this application provides a video file playback method for detecting the user's interaction with the display objects in a video. The video resource involved in the embodiments of this application includes two parts: an animation file and a video file. The animation file, containing animation effects and logic information, may be generated with an animation editing tool such as Flash software, and the video file may be a file in a format such as mp4. The video file includes at least one display object, the animation file includes an animation element generated for each display object in each video frame of the video file, and an animation element is used to identify the display area of its corresponding display object.
To enable the user to interact with the characters in the video, in the embodiments of this application the position of the animation element corresponding to each display object is first determined from the display objects in each video frame of the video file; that is, one or more animation elements can be determined for each video frame. Animation production software is then used to produce an animation file that includes the animation elements corresponding to the display objects in each video frame. By playing the animation file frame by frame in synchronization with the video, the movement position of an animation element can be kept consistent with the movement position of the corresponding display object in the video file. In this way, by comparing the position of the user's screen click operation with the positions of the animation elements, the user's interaction with the display objects in the video file can be recognized. Further, according to the recognition result, a corresponding interaction result can be presented, for example awarding an interaction score; when the score reaches a certain value, a membership upgrade, game level completion, interaction reward, or the like can be granted.
Taking the frame image of the video file shown in FIG. 2 as an example, in this embodiment a corresponding animation file is produced for the frame image 201 shown in FIG. 2; the produced animation file contains the animation element corresponding to the display object in the frame image 201, as shown in FIG. 3. In this example, the display object 202 may be the hands of the character in the frame image 201, so the animation element corresponds to the character's hands in the frame image 201 of the video file. In a specific implementation, the animation element may be a rectangular box in the animation file indicating the position of the character's hands 202; as shown in FIG. 4, the rectangular box 40 in the figure is the animation element produced with animation production software to represent the position of the display object 202 in the video. FIG. 5 is a schematic diagram of the frame editing environment of the animation production software; the rectangular box 40 of FIG. 4, indicating the position of the character's hands 202, is placed in the frame of the animation file corresponding to the frame image 201. As the video plays, the rectangular boxes in the animation frames corresponding to the video frames are played back synchronously frame by frame, so that the movement position of each rectangular box stays consistent with the movement position of the display object in the video.
In some embodiments, when the user clicks the screen, it is determined whether the position the user clicked matches the position of the rectangular box representing a display object in the video; if they match, it is determined that the user hit the display object in the video.
For example, when the user clicks the screen, it is determined whether the clicked position is the same as the position of the rectangular box representing a display object in the video; if they are the same, it is determined that the user hit the display object in the video, that is, hit the character's hands in the video. When it is determined that the user hit a display object in the video, that is, hit an animation element, the interactive operation corresponding to the animation element is executed. The interactive operation may be a corresponding response operation, for example scoring the user, thereby realizing interaction between the user and the display objects in the video.
FIG. 6A is a schematic diagram of a video playback interface combining the animation file with the video file. Animation frames are played in synchronization with the video, and the movement position of the rectangular box 40 always stays consistent with the movement position of the character's hands 202 in the video file. Thus, when the user clicks the screen, whether the user hit the character's hands 202 in the video can be determined by checking whether the clicked position is the same as the position of the rectangular box 40 representing the hands 202. According to the result, a corresponding response operation can be executed, for example scoring the user: if a hit is determined, 10 points may be added; otherwise no points are given. This realizes interaction between the user and the display objects in the video.
Alternatively, when it is determined that the user hit a display object in the video, that is, hit an animation element, the interactive operation corresponding to the animation element is executed, and the interactive operation may be preset behavioral feedback performed by a character in the video. For example, when it is determined that the user hit the character's hands in the video, the character in the video is triggered to perform a real-time hand-clapping interaction with the user.
In a specific implementation, the animation file may be played frame by frame in a hidden layer according to the playback time of the video file; that is, the animation file played in synchronization with the video file is invisible to the user. It can be used to determine whether the user's screen click operation hits the display position of a display object in the video file and, according to the result, to recognize the user's interaction with the display objects in the video.
FIG. 6B is a schematic diagram of another video playback interface combining the animation file with the video file. In a specific implementation, to guide the user to interact with the display objects in the video file, an operation guide area may also be added to the animation file, as shown by the circular area 60 in FIG. 6B. The operation guide area is visible to the user and may be larger than the animation element display area described above so that the user can identify it easily.
In the embodiments of this application, during playback of the video file, the animation frame that should currently be played can be determined from the current playhead position of the video file and the frame rate of video playback. For example, it can be determined according to the following formula: animation frame position = current video playhead position (seconds) * video playback frame rate (frames/second). For example, if the current video playhead position is 1 second and the video playback frame rate is 24 frames/second, the animation frame that should currently be played is frame 24.
Here, the playhead position refers to the current playback position of the video file; for example, if a video file has currently played to 16 seconds, the current playback position of the video file is 16 seconds, that is, the playhead position is 16 seconds.
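The formula above (animation frame position = current playhead position in seconds × playback frame rate) can be sketched as a small helper. This is an illustrative sketch: the function name and the choice of flooring to a whole frame are assumptions, not taken from the patent text.

```typescript
// Compute which animation frame should currently be shown, given the
// video playhead position (seconds) and the playback frame rate (fps).
// Per the text: animation frame position = playhead (s) * frame rate (fps).
function animationFramePosition(playheadSeconds: number, frameRate: number): number {
  // Floor to the frame that has already started playing.
  return Math.floor(playheadSeconds * frameRate);
}
```

For a playhead at 1 second and a 24 frames/second video this yields frame 24, matching the example in the text.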
In a specific implementation, to determine whether the user's screen click event hits an animation element, in the embodiments of this application a set of display position information in the video coordinate system can be established for the animation element. The set of display position information in the video coordinate system includes the display position information of the animation element corresponding to the display object in each video frame; these pieces of display position information are the position information of the display object corresponding to the animation element at its different display positions in the video. Any piece of display position information can be expressed as the starting point coordinates (x0, y0) of the display position together with the width value w1 and height value h1 of the display area, corresponding to a rectangular display area in the video.
Because a video file may be played on terminal devices with different screen sizes, when playing the video the video coordinate system needs to be converted into the screen coordinate system according to the display parameters of the display screen, so as to adapt to terminal screens of different sizes; the display parameters of the display screen may include the display screen size, the display resolution, and so on. In the embodiments of this application, a set of display position information of the animation element in the screen coordinate system is obtained by converting each piece of display position information in the video coordinate system into the corresponding display position information in the screen coordinate system. Each piece of display position information in the screen coordinate system likewise includes starting point information (x0', y0'), a display width value w2, and a display height value h2, corresponding to a rectangular area on the display screen that contains a number of pixel coordinates.
Therefore, before the video file is played, each piece of display position information in the animation element's set of display position information in the video coordinate system is converted, according to the display parameters of the display screen of the terminal device playing the video file, into display position information in the corresponding screen coordinate system, yielding the corresponding set of display position information. The set of display position information in the screen coordinate system includes multiple pieces of display position information in the screen coordinate system, each corresponding to a rectangular area on the display screen that also contains a number of pixel coordinates.
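The conversion from the video coordinate system to the screen coordinate system can be sketched as below. A uniform scale per axis is assumed for illustration; the text only requires that the conversion use the display screen's parameters (screen size, resolution), so refinements such as letterbox offsets are omitted, and all names here are assumptions.

```typescript
// A display position: starting point (x0, y0) plus width and height,
// describing a rectangular display area (w1/h1 in video coordinates,
// w2/h2 in screen coordinates in the text).
interface DisplayPosition { x0: number; y0: number; w: number; h: number; }

// Convert one piece of display position information from video
// coordinates to screen coordinates by scaling each axis.
function toScreenCoordinates(
  pos: DisplayPosition,
  videoWidth: number, videoHeight: number,
  screenWidth: number, screenHeight: number
): DisplayPosition {
  const sx = screenWidth / videoWidth;
  const sy = screenHeight / videoHeight;
  return { x0: pos.x0 * sx, y0: pos.y0 * sy, w: pos.w * sx, h: pos.h * sy };
}
```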
In a specific implementation, if the video file includes multiple display objects, the animation file correspondingly includes a display list composed of the animation elements corresponding to each display object in each video frame of the video file. The display list also includes the display position information of the animation elements in the video coordinate system and in the screen coordinate system. Because a display object's display position differs across video frames, one display object may correspond to multiple animation elements at different display positions, and the display position information of the animation elements of each video frame in the video coordinate system and the screen coordinate system forms the prototype chain of each display object. Table 1 shows one possible data structure of the display list:
Table 1
[Table 1 is reproduced only as an image in the original publication.]
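Because Table 1 is reproduced only as an image, the display list it illustrates can only be sketched; the types below are an assumed reading of the surrounding description (one entry per animation element, with per-video-frame display positions in both coordinate systems), not the table itself.

```typescript
// One display position: rectangular area start point plus width/height.
interface DisplayPosition { x0: number; y0: number; w: number; h: number; }

// Display-list entry for one animation element: the display object it
// tracks, and its display positions keyed by video frame number, kept
// in both the video and the screen coordinate system.
interface DisplayListEntry {
  elementId: string;        // e.g. "gesture" (assumed identifier)
  displayObjectId: string;  // the tracked display object in the video
  videoPositions: Map<number, DisplayPosition>;
  screenPositions: Map<number, DisplayPosition>;
}

// The display list covers every animation element in the animation file.
type DisplayList = DisplayListEntry[];
```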
Based on the animation file generated from the display objects included in the video file and the display list established for the animation elements in that animation file, an embodiment of this application provides a video file playback method which, as shown in FIG. 7A, may include the following steps:
S71. During playback of a video file, play an animation file frame by frame according to the playback time of the video file, where the video file includes at least one display object and the animation file includes animation elements generated from the display objects.
In some embodiments, during playback of the video file, the animation file is played frame by frame according to the current playback time of the video file, so that each video frame of the video file is played in synchronization with each frame of the animation file, keeping the movement positions of the animation elements in each frame of the animation file consistent with the movement positions of the display objects in each video frame of the video file.
S72. When a screen click event is detected during playback of the video file, determine click position information of the screen click event.
In a specific implementation, during playback of the video file, all of the user's screen click events are listened for; when a user click on the screen is captured, the click position information of the screen click event is determined. The click position information can be expressed as the click coordinates of the user's click on the screen, that is, the horizontal coordinate X and vertical coordinate Y at which the screen was clicked.
S72. According to the click position information of the screen click event, determine, in the animation file, an animation element display area that matches the click position information of the screen click event. The animation element display area that matches the click position information of the screen click event may be a display area that is the same as the click position information of the screen click event.
In some embodiments, FIG. 7B is a schematic flowchart of the details of step S72. As shown in FIG. 7B, it includes the following steps:
S711. According to the click position information of the screen click event, determine a click area that includes the click position.
S712. Determine the animation element display area in the animation file that coincides with the click area, and use the determined animation element display area as the animation element display area matching the screen click event.
In the embodiments of this application, to improve the accuracy of interaction detection, a certain tolerance can be added to the detected click coordinates to obtain a corresponding click area that simulates the area of the user's finger. In a specific implementation, a click area including the click coordinates can be determined from the click coordinates; for example, a circular area centered on the click coordinates with radius L can be determined as the click area corresponding to the click event, where L is a preset value whose size can be set according to actual needs and is not limited in the embodiments of this application. As another example, a rectangular area determined by a preset length and width and centered on the click coordinates, or a square area determined by a preset side length and centered on the click coordinates, may be used as the click area; of course, the click area may also be another regular or irregular area, which is not limited in the embodiments of this application. On this basis, the animation element display area in the animation file that coincides with the click area is determined, and the determined animation element display area is used as the animation element display area matching the screen click event.
In a specific implementation, for each display object present in the video file, because its position in each video frame of the video may differ, that is, each display object's position may move from frame to frame, the display object may correspond to multiple animation elements in the animation file. In the embodiments of this application, after the click area corresponding to the screen click event is determined, the display position information of each animation element in the screen coordinate system needs to be traversed in turn, and for the animation element display area corresponding to that display position information it is determined whether it coincides with the click area.
As shown in FIG. 8, the circular area 80 in FIG. 8 is the click area determined in step S72, and the rectangular area 81 in FIG. 8 is the display area of an animation element on the display screen, which may also be called the animation element display area. If the circular area 80 and the rectangular area 81 overlap, it can be determined that this screen click event hit the animation element.
In some embodiments, whether the circular area and the rectangular area overlap can be judged as follows: determine whether any pixel coordinate included in the click area coincides with at least one pixel coordinate included in the animation element display area; if so, the click area coincides with the animation element display area; if no pixel coordinate included in the click area coincides with any pixel coordinate included in the animation element display area, the click area does not coincide with the animation element display area.
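The coincidence test between a circular click area (such as area 80) and a rectangular animation element display area (such as area 81) can also be written in closed form instead of enumerating pixel coordinates. This is a sketch with assumed names; the continuous distance check below closely approximates the pixel-coincidence test described above.

```typescript
interface Rect { x0: number; y0: number; w: number; h: number; }

// Does a circular click area (center cx, cy; tolerance radius L)
// overlap a rectangular animation element display area?
// Clamping the circle's center into the rectangle yields the nearest
// point of the rectangle; the areas overlap iff that point lies
// within the radius.
function clickAreaCoincides(cx: number, cy: number, L: number, area: Rect): boolean {
  const nx = Math.min(Math.max(cx, area.x0), area.x0 + area.w);
  const ny = Math.min(Math.max(cy, area.y0), area.y0 + area.h);
  const dx = cx - nx;
  const dy = cy - ny;
  return dx * dx + dy * dy <= L * L;
}
```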
In some embodiments, before the animation element display area matching the screen click event is determined in the animation file according to the click position information of the screen click event, the animation element display areas corresponding to the animation elements included in the animation file may also be determined. FIG. 7C is a schematic flowchart of determining the animation element display areas corresponding to the animation elements included in the animation file; as shown in FIG. 7C, it includes the following steps:
S721. For any animation element included in the animation file, obtain the display position information of the animation element in the screen coordinate system.
S722. Determine the animation element display area corresponding to the animation element according to its display position information in the screen coordinate system.
FIG. 7D is a schematic flowchart of obtaining, in step S721, the display position information of an animation element in the screen coordinate system. As shown in FIG. 7D, it includes the following steps: S731. Determine the display position information of the animation element in the video coordinate system according to the display position information of the display object corresponding to the animation element in different video frames.
S732. Convert the display position information of the animation element in the video coordinate system into display position information in the screen coordinate system according to the display parameters of the display screen playing the video file.
In some embodiments, the display position information includes the starting point coordinates of the animation element display area and the width and height values of the animation element display area. FIG. 7E is a schematic flowchart of determining, in step S712, the animation element display area in the animation file that coincides with the click area. As shown in FIG. 7E, it includes the following steps:
S741. For any animation element display area in the animation file, determine the pixel coordinates included in the animation element display area according to the starting point coordinates of the animation element display area and its width and height values.
S742. If a pixel coordinate included in the click area coincides with at least one pixel coordinate included in the animation element display area, determine that the animation element display area is the animation element display area in the animation file that coincides with the click area.
S73. Determine, according to the matched animation element display area, the animation element hit by the screen click event. In a specific implementation, if it is determined that the click area matches the animation element display area corresponding to an animation element, it can further be determined whether that animation element is the animation element corresponding to a specified display object, for example whether it is the animation element representing the character's hands in the video; if so, this screen click event is determined to be a hit; otherwise, this screen click event is ignored.
S74. Determine the interactive operation corresponding to the hit animation element, and execute the interactive operation.
In some embodiments, the interactive operation may be executing a corresponding response operation, for example showing the user a corresponding score, or may be preset behavioral feedback made by a character in the video, for example a preset gesture action.
In a specific implementation, when it is determined that the user hit a display object in the video, that is, hit an animation element, the interactive operation corresponding to the animation element is executed; the interactive operation may be executing a corresponding response operation, for example scoring the user, thereby realizing interaction between the user and the display objects in the video.
Alternatively, when it is determined that the user hit a display object in the video, that is, hit an animation element, the interactive operation corresponding to the animation element is executed, and the interactive operation may be preset behavioral feedback performed by a character in the video, for example, when it is determined that the user hit the character's hands in the video, triggering the character in the video to perform a real-time hand-clapping interaction with the user.
To better understand the embodiments of this application, the implementation process is described below in connection with the loading of a video file or game file. For ease of description, tracking a gesture in a video file is taken as an example in the embodiments of this application. As shown in FIG. 9, the process may include the following steps:
S91. Load the asset resources needed to play the video.
In this step, the loaded asset resources include the video file and the animation file generated for the video file, where the movement position of the gesture-tracking animation element included in the animation file corresponds to the movement position of the character's hands in the video file.
S92. Parse the loaded animation file.
In this step, the loaded animation file is parsed, the parsed animation is rendered onto a canvas, and the canvas is then placed above the video window. It should be noted that the animation rendered on the canvas is invisible to the user.
S93. Play the video file.
S94. Determine whether playback of the video file is complete; if yes, perform step S913; if no, perform step S95.
S95. Play animation frames synchronously according to the video playback progress.
In a specific implementation, while the video file is playing, the playhead position of the video is detected in real time, the frame position of the animation file that should currently be played is calculated from the video playback frame rate and the current playhead position, and the currently playing animation frame is updated according to the calculation result, so that the gesture animation element stays synchronized with the position of the character's hands in the video.
S96. Listen for screen click events.
S97. When a screen click event is detected, obtain the coordinates of the click position.
Screen click events are listened for during video playback; when a user click on the screen is captured, the coordinates of the click position are recorded.
S98. Traverse all animation elements included in the display list corresponding to the animation file.
In a specific implementation, after a screen click event is detected during video playback, all animation elements in the canvas display list are traversed.
S99. Determine whether the click coordinates match any animation element display area; if yes, perform step S910; if no, perform step S912.
In this step, the click area corresponding to the click coordinates is first determined; then, from the pixel coordinates included in the click area, it is determined whether at least one of them coincides with a pixel coordinate included in the animation element display area of at least one animation element in the display list. If yes, it is determined that the click coordinates match an animation element display area; if no, it is determined that the click coordinates do not match any animation element display area.
S910. Determine whether the position-matched animation element is the gesture animation element; if yes, perform step S911; if no, perform step S912.
In this step, it is further determined whether the position-matched animation element is the gesture animation element; if so, it can be determined that this screen click event hit the character's hands in the video; if not, this operation can be ignored.
S911. Determine that this screen click event hit the character's hands in the video, determine the interactive operation corresponding to the hit hands, execute the interactive operation, and return to step S94.
S912. Ignore this click, and return to step S96.
S913. Play the landing page.
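The decision logic of steps S97 through S912 can be condensed into one pure dispatch function; the type and function names below are assumptions for illustration, not from the patent.

```typescript
interface Rect { x0: number; y0: number; w: number; h: number; }
interface AnimElement { id: string; isGesture: boolean; screenArea: Rect; }

// Given the click coordinates, the click-area tolerance radius L, and the
// animation elements of the currently shown animation frame, return the
// gesture element that was hit (step S911), or null when the click is
// ignored (step S912). Traversing the display list is step S98.
function resolveClick(x: number, y: number, L: number, elements: AnimElement[]): AnimElement | null {
  for (const el of elements) {
    // S99: does the circular click area overlap this element's area?
    const nx = Math.min(Math.max(x, el.screenArea.x0), el.screenArea.x0 + el.screenArea.w);
    const ny = Math.min(Math.max(y, el.screenArea.y0), el.screenArea.y0 + el.screenArea.h);
    const dx = x - nx;
    const dy = y - ny;
    if (dx * dx + dy * dy <= L * L) {
      // S910: only a hit on the gesture element counts; a positional
      // match on any other element is ignored, per the flowchart.
      return el.isGesture ? el : null;
    }
  }
  return null; // no element matched the click position
}
```

A caller would then execute the interactive operation for a non-null result and return to step S94, or return to listening (step S96) otherwise.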
With the video file playback method provided by the embodiments of this application, an animation file including corresponding animation elements is generated for the display objects included in a video file. When the video is played, animation frames are played synchronously according to the video playback progress while screen click events are listened for. When a screen click event is detected, the corresponding click area is determined from the click position information of the screen click event, and the click area is compared against the animation element display areas to judge whether the screen click event hit an animation element. This realizes deep interaction between the user and the display objects in the video and improves the user's experience of interacting with the video: instead of merely performing rigid, tedious click interactions with the video, the user interacts richly with the characters in the video as if immersed in it, which increases the user's initiative and enthusiasm in participating in video operations.
The video file playback method provided by the embodiments of this application may be applied to a client that contains a video player, such as a video playback client, or to a video playback web page; the embodiments of this application do not limit this.
Based on the same inventive concept, the embodiments of this application further provide a video file playback apparatus. Because the principle by which the apparatus solves the problem is similar to that of the video file playback method, the implementation of the apparatus can refer to the implementation of the method, and repeated details are not described again.
FIG. 10 is a schematic structural diagram of the video file playback apparatus provided by an embodiment of this application, including:
a playing unit 101, configured to play the animation file frame by frame according to the playback time of the video file during playback of the video file, where the video file includes at least one display object and the animation file includes animation elements generated from the display objects;
a first determining unit 102, configured to determine click position information of a screen click event when the screen click event is detected during playback of the video file;
a matching unit 103, configured to determine, in the animation file according to the click position information of the screen click event, an animation element display area that matches the screen click event;
a second determining unit 104, configured to determine the animation element corresponding to the matched display area as the animation element hit by the screen click event; and
the playing unit 101, configured to determine the interactive operation corresponding to the hit animation element and execute the interactive operation.
In some embodiments, the matching unit 103 is configured to determine a click area including the click position according to the click position information of the screen click event, determine the animation element display area in the animation file that coincides with the click area, and use the determined animation element display area as the animation element display area matching the screen click event.
In some embodiments, the click position information includes click coordinates; and
the matching unit 103 is configured to determine a click area including the click coordinates according to the click coordinates of the screen click event.
In some embodiments, the video file playback apparatus provided by the embodiments of this application further includes:
a third determining unit 105, configured to: before the matching unit determines, in the animation file according to the click position information of the screen click event, the animation element display area matching the click position information of the screen click event, obtain, for any animation element included in the animation file, the display position information of the animation element in the screen coordinate system, and determine the animation element display area corresponding to the animation element according to its display position information in the screen coordinate system.
In some embodiments, the third determining unit 105 is configured to determine the display position information of the animation element in the video coordinate system according to the display position information of the display object corresponding to the animation element in different video frames, and to convert, according to the display parameters of the display screen playing the video file, the display position information of the animation element in the video coordinate system into display position information in the screen coordinate system.
In some embodiments, the display position information includes the starting point coordinates of the animation element display area and the width and height values of the animation element display area; and
the matching unit 103 is configured to: for any animation element display area in the animation file, determine the pixel coordinates included in the animation element display area according to the starting point coordinates of the animation element display area and its width and height values; and, if a pixel coordinate included in the click area coincides with at least one pixel coordinate included in the animation element display area, determine that the animation element display area is the animation element display area in the animation file that coincides with the click area.
In some embodiments, the playing unit 101 is configured to determine the currently playing animation frame in the animation file according to the current playback position of the video file and the frame rate of video playback.
In some embodiments, the interactive operation may be executing a corresponding response operation, for example showing the user a corresponding score, or may be preset behavioral feedback made by a character in the video, for example a preset gesture action.
For convenience of description, the above parts are described as divided by function into modules (or units). Of course, when implementing this application, the functions of the modules (or units) may be implemented in one or more pieces of software or hardware.
Having introduced the video file playback method and apparatus of the exemplary embodiments of this application, a computing apparatus according to another exemplary embodiment of this application is introduced next.
Those skilled in the art can understand that the various aspects of this application can be implemented as a system, method, or program product. Therefore, the various aspects of this application can be specifically implemented in the following forms: a fully hardware implementation, a fully software implementation (including firmware, microcode, and the like), or an implementation combining hardware and software, which may collectively be referred to here as a "circuit", "module", or "system".
In some possible implementations, a computing apparatus according to this application may include at least one processor and at least one memory. The memory stores program code that, when executed by the processor, causes the processor to perform the steps of the video file playback methods according to the various exemplary embodiments of this application described above in this specification. For example, the processor may perform step S71 as shown in FIG. 7A: when a screen click event is detected during playback of a video file, determine the click position information of the screen click event; step S72: determine, in the animation file according to the click position information of the screen click event, the animation element display area matching the screen click event; step S73: determine the animation element corresponding to the matched animation element display area as the animation element hit by the screen click event; and step S74: determine the interactive operation corresponding to the hit animation element, and execute the interactive operation.
The computing apparatus 110 according to this embodiment of this application is described below with reference to FIG. 11. The computing apparatus 110 shown in FIG. 11 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of this application.
As shown in FIG. 11, the computing apparatus 110 is represented in the form of a general-purpose computing device. The components of the computing apparatus 110 may include, but are not limited to: the at least one processor 111, the at least one memory 112, and a bus 113 connecting the different system components (including the memory 112 and the processor 111).
The bus 113 represents one or more of several types of bus structures, including a memory bus or memory controller, a peripheral bus, a processor, or a local bus using any of a variety of bus structures.
The memory 112 may include a readable medium in the form of volatile memory, such as random access memory (RAM) 1121 and/or cache memory 1122, and may further include read-only memory (ROM) 1123.
The memory 112 may also include a program/utility 1125 having a set of (at least one) program modules 1124; such program modules 1124 include but are not limited to an operating system, one or more application programs, other program modules, and program data, and each of these examples, or some combination thereof, may include an implementation of a network environment.
The computing apparatus 110 may also communicate with one or more external devices 114 (for example, a keyboard or pointing device), with one or more devices that enable a user to interact with the computing apparatus 110, and/or with any device (for example, a router or modem) that enables the computing apparatus 110 to communicate with one or more other computing devices. Such communication may be performed through an input/output (I/O) interface 115. The computing apparatus 110 may also communicate with one or more networks (for example, a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) through a network adapter 116. As shown in the figure, the network adapter 116 communicates with the other modules of the computing apparatus 110 through the bus 113. It should be understood that, although not shown in the figure, other hardware and/or software modules may be used in conjunction with the computing apparatus 110, including but not limited to microcode, device drivers, redundant processors, external disk drive arrays, RAID systems, tape drives, and data backup storage systems.
In some possible implementations, the various aspects of the video file playback method provided by this application can also be implemented in the form of a program product that includes program code; when the program product runs on a computer device, the program code is used to cause the computer device to perform the steps of the video file playback methods according to the various exemplary embodiments of this application described above in this specification. For example, the computer device may perform step S71 as shown in FIG. 7A: when a screen click event is detected during playback of a video file, determine the click position information of the screen click event; step S72: determine, in the animation file according to the click position information of the screen click event, the animation element display area matching the screen click event; step S73: determine the animation element corresponding to the matched animation element display area as the animation element hit by the screen click event; and step S74: determine the interactive operation corresponding to the hit animation element, and execute the interactive operation.
The program product may employ any combination of one or more computer-readable storage media. The readable medium may be a readable signal medium or a computer-readable storage medium. The computer-readable storage medium may be, for example but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples (a non-exhaustive list) of computer-readable storage media include: an electrical connection with one or more wires, a portable disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The program product for video file playback of the implementations of this application may employ a portable compact disc read-only memory (CD-ROM), include program code, and run on a computing device. However, the program product of this application is not limited to this; in this document, a computer-readable storage medium may be any tangible medium that includes or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
A readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which readable program code is carried. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A readable signal medium may also be any readable medium other than a computer-readable storage medium, and the readable medium may send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device.
The program code included on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wired, optical cable, RF, and the like, or any suitable combination of the foregoing.
The program code for performing the operations of this application can be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++ as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as an independent software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. Where a remote computing device is involved, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet by using an Internet service provider).
It should be noted that, although several units or subunits of the apparatus are mentioned in the detailed description above, this division is merely exemplary and not mandatory. In fact, according to the implementations of this application, the features and functions of two or more units described above may be embodied in one unit; conversely, the features and functions of one unit described above may be further divided and embodied by multiple units.
In addition, although the operations of the method of this application are described in a particular order in the drawings, this does not require or imply that the operations must be performed in that particular order, or that all of the illustrated operations must be performed, to achieve the desired result. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step for execution, and/or one step may be decomposed into multiple steps for execution.
Those skilled in the art should understand that the embodiments of this application may be provided as a method, a system, or a computer program product. Therefore, this application may take the form of a fully hardware embodiment, a fully software embodiment, or an embodiment combining software and hardware. Moreover, this application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, and the like) that contain computer-usable program code.
This application is described with reference to flowcharts and/or block diagrams of the method, device (system), and computer program product according to the embodiments of this application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions can be provided to the processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, so that a series of operation steps are performed on the computer or other programmable device to produce computer-implemented processing, whereby the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Although the embodiments of this application have been described, those skilled in the art, once aware of the basic inventive concept, can make additional changes and modifications to these embodiments. Therefore, the appended claims are intended to be construed as covering the embodiments described above and all changes and modifications falling within the scope of this application.
Obviously, those skilled in the art can make various changes and variations to this application without departing from its spirit and scope. Thus, if these modifications and variations of this application fall within the scope of the claims of this application and their technical equivalents, this application is also intended to include them.

Claims (15)

  1. A video file playing method, performed by a terminal device, the method comprising:
    while a video file is being played, playing an animation file frame by frame according to the playback time of the video file, the video file comprising at least one display object, and the animation file comprising an animation element generated from the display object;
    when a screen click event is detected during playback of the video file, determining click position information of the screen click event;
    determining, in the animation file according to the click position information of the screen click event, an animation element display area that matches the click position information of the screen click event;
    determining, according to the matched animation element display area, an animation element hit by the screen click event; and
    determining an interactive operation corresponding to the hit animation element, and performing the interactive operation.
  2. The method according to claim 1, wherein determining, in the animation file according to the click position information of the screen click event, an animation element display area that matches the click position information of the screen click event comprises:
    determining, according to the click position information of the screen click event, a click region that includes the click position; and
    determining an animation element display area in the animation file that overlaps the click region, and taking the determined animation element display area as the animation element display area that matches the screen click event.
  3. The method according to claim 2, wherein the click position information comprises click coordinates; and
    determining, according to the click position information of the screen click event, a click region that includes the click position comprises:
    determining, according to the click coordinates of the screen click event, a region that includes the click coordinates as the click region.
  4. The method according to claim 2, wherein before determining, in the animation file according to the click position information of the screen click event, an animation element display area that matches the click position information of the screen click event, the method further comprises:
    for any animation element contained in the animation file, obtaining display position information of the animation element in a screen coordinate system; and
    determining, according to the display position information of the animation element in the screen coordinate system, the animation element display area corresponding to the animation element.
  5. The method according to claim 4, wherein obtaining display position information of an animation element in the screen coordinate system comprises:
    determining display position information of the animation element in a video coordinate system according to display position information, in different video frames, of the display object corresponding to the animation element; and
    converting the display position information of the animation element in the video coordinate system into display position information in the screen coordinate system according to display parameters of the display screen on which the video file is played.
  6. The method according to claim 5, wherein the display position information comprises starting-point coordinates of the animation element display area as well as a width value and a height value of the animation element display area; and
    determining an animation element display area in the animation file that overlaps the click region comprises:
    for any animation element display area in the animation file, determining, according to the starting-point coordinates of the animation element display area and the width value and height value of the animation element display area, the pixel coordinates included in the animation element display area; and
    if the pixel coordinates included in the click region coincide with at least one pixel coordinate included in the animation element display area, determining the animation element display area as the animation element display area in the animation file that overlaps the click region.
  7. The method according to claim 1, wherein playing, frame by frame according to the playback time of the video file, the animation file generated from the video file comprises:
    determining the currently played animation frame in the animation file according to the current playback position of the video file and the frame rate of video playback.
  8. The method according to claim 1, wherein playing, frame by frame according to the playback time of the video file, the animation file generated from the video file comprises:
    playing the animation file frame by frame in a hidden layer according to the playback time of the video file.
  9. A video file playing apparatus, comprising:
    a playing unit, configured to play an animation file frame by frame according to the playback time of a video file while the video file is being played, the video file comprising at least one display object, and the animation file comprising an animation element generated from the display object;
    a first determining unit, configured to determine click position information of a screen click event when the screen click event is detected during playback of the video file;
    a matching unit, configured to determine, in the animation file according to the click position information of the screen click event, an animation element display area that matches the click position information of the screen click event;
    a second determining unit, configured to determine, according to the matched animation element display area, an animation element hit by the screen click event; and
    the playing unit, further configured to determine an interactive operation corresponding to the hit animation element, and perform the interactive operation.
  10. The apparatus according to claim 9, wherein the matching unit is configured to: determine, according to the click position information of the screen click event, a click region that includes the click position; determine an animation element display area in the animation file that overlaps the click region; and take the determined animation element display area as the animation element display area that matches the screen click event.
  11. The apparatus according to claim 10, wherein the click position information comprises click coordinates; and
    the matching unit is configured to determine, according to the click coordinates of the screen click event, a click region that includes the click coordinates.
  12. The apparatus according to claim 10, further comprising:
    a third determining unit, configured to: before the matching unit determines, in the animation file according to the click position information of the screen click event, an animation element display area that matches the screen click event, obtain, for any animation element contained in the animation file, display position information of the animation element in a screen coordinate system; and determine, according to the display position information of the animation element in the screen coordinate system, the animation element display area corresponding to the animation element.
  13. The apparatus according to claim 12, wherein
    the third determining unit is configured to: determine display position information of the animation element in a video coordinate system according to display position information, in different video frames, of the display object corresponding to the animation element; and convert the display position information of the animation element in the video coordinate system into display position information in the screen coordinate system according to display parameters of the display screen on which the video file is played.
  14. A computing device, comprising at least one processor and at least one memory, wherein the memory stores a computer program which, when executed by the processor, causes the processor to perform the steps of the method according to any one of claims 1 to 8.
  15. A computer-readable medium storing a computer program executable by a terminal device, wherein the program, when run on the terminal device, causes the terminal device to perform the steps of the method according to any one of claims 1 to 8.
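The coordinate conversion recited in claim 5 can be illustrated with a minimal sketch. The function name and the assumption of a uniform fit-to-width scale with vertical letterboxing are illustrative only; the claim leaves the concrete display parameters unspecified.

```python
def video_to_screen(x_v, y_v, video_w, video_h, screen_w, screen_h):
    """Map a point from the video coordinate system to the screen
    coordinate system, assuming the video is scaled uniformly to the
    screen width and centered vertically (letterboxed)."""
    scale = screen_w / video_w
    offset_y = (screen_h - video_h * scale) / 2.0
    return (x_v * scale, y_v * scale + offset_y)
```

For example, with a 640x360 video shown full-width on a 1280x720 screen, the scale factor is 2 and there is no vertical offset, so video point (100, 50) maps to screen point (200, 100).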
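Claims 2, 3, and 6 together amount to a rectangle-overlap hit test between a click region built around the click coordinates and each animation element's display area (start point plus width and height). A hypothetical sketch follows; the element table, the square click region, the default radius, and all names are assumptions for illustration, not taken from the application:

```python
def rects_overlap(ax, ay, aw, ah, bx, by, bw, bh):
    """True if two axis-aligned rectangles share at least one point."""
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def hit_elements(click_x, click_y, elements, radius=10):
    """Return ids of animation elements whose display area
    (x, y, width, height) overlaps a square click region
    centered on the click coordinates."""
    left, top, side = click_x - radius, click_y - radius, 2 * radius
    return [eid for eid, (x, y, w, h) in elements.items()
            if rects_overlap(left, top, side, side, x, y, w, h)]
```

Testing overlap at the rectangle level is equivalent to, and far cheaper than, the per-pixel comparison the claim language describes, since two axis-aligned rectangles share at least one pixel exactly when they intersect.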
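The frame synchronization of claim 7 derives the current animation frame index from the video's playback position and the playback frame rate. A one-line sketch, assuming a position in seconds and a constant frame rate (the clamp to the animation's length is an added safeguard, not part of the claim):

```python
def current_animation_frame(position_seconds, frame_rate, total_frames):
    """Index of the animation frame matching the video playback position,
    clamped to the last available frame of the animation file."""
    return min(int(position_seconds * frame_rate), total_frames - 1)
```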
PCT/CN2019/114292 2018-11-19 2019-10-30 Video file playing method, apparatus and storage medium WO2020103657A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP19886284.9A EP3886449A4 (en) 2018-11-19 2019-10-30 VIDEO FILE PLAYBACK METHOD AND APPARATUS, AND STORAGE MEDIUM
US17/085,797 US11528535B2 (en) 2018-11-19 2020-10-30 Video file playing method and apparatus, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811377143.9A CN110166842B (zh) 2018-11-19 2018-11-19 Video file operation method, apparatus and storage medium
CN201811377143.9 2018-11-19

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/085,797 Continuation US11528535B2 (en) 2018-11-19 2020-10-30 Video file playing method and apparatus, and storage medium

Publications (1)

Publication Number Publication Date
WO2020103657A1 true WO2020103657A1 (zh) 2020-05-28

Family

ID=67645172

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/114292 WO2020103657A1 (zh) 2018-11-19 2019-10-30 Video file playing method, apparatus and storage medium

Country Status (4)

Country Link
US (1) US11528535B2 (zh)
EP (1) EP3886449A4 (zh)
CN (1) CN110166842B (zh)
WO (1) WO2020103657A1 (zh)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110166842B (zh) 2018-11-19 2020-10-16 Shenzhen Tencent Information Technology Co., Ltd. Video file operation method, apparatus and storage medium
CN111625099B (zh) 2020-06-02 2024-04-16 Shanghai SenseTime Intelligent Technology Co., Ltd. Animation display control method and apparatus
CN113271486B (zh) 2021-06-03 2023-02-28 Beijing Youzhuju Network Technology Co., Ltd. Interactive video processing method and apparatus, computer device and storage medium
CN114385270A (zh) 2022-01-19 2022-04-22 Ping An Pay Technology Service Co., Ltd. Method, apparatus, device and storage medium for automatically generating web page animation

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102868919A (zh) * 2012-09-19 2013-01-09 Shanghai Gaoyue Culture Media Co., Ltd. Interactive playing device and playing method
CN103108248A (zh) * 2013-01-06 2013-05-15 Wang Ruchi Method and system for implementing interactive video
CN105828160A (zh) * 2016-04-01 2016-08-03 Tencent Technology (Shenzhen) Co., Ltd. Video playing method and apparatus
CN108022279A (zh) * 2017-11-30 2018-05-11 Guangzhou Baiguoyuan Information Technology Co., Ltd. Method and apparatus for adding video special effects, and smart mobile terminal
CN108462883A (zh) * 2018-01-08 2018-08-28 Ping An Technology (Shenzhen) Co., Ltd. Live-streaming interaction method, apparatus, terminal device and storage medium
CN110166842A (zh) * 2018-11-19 2019-08-23 Shenzhen Tencent Information Technology Co., Ltd. Video file operation method, apparatus and storage medium

Family Cites Families (96)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6057833A (en) * 1997-04-07 2000-05-02 Shoreline Studios Method and apparatus for providing real time enhancements and animations over a video image
US6268864B1 (en) * 1998-06-11 2001-07-31 Presenter.Com, Inc. Linking a video and an animation
US6714202B2 (en) * 1999-12-02 2004-03-30 Canon Kabushiki Kaisha Method for encoding animation in an image file
US8171430B1 (en) * 2001-07-24 2012-05-01 Adobe Systems Incorporated System and method for providing an image and image instructions responsive to a mouse cursor position
US20030046348A1 (en) * 2001-08-29 2003-03-06 Pinto Albert Gregory System and method of converting video to bitmap animation for use in electronic mail
AU2002336354A1 (en) * 2001-09-15 2003-04-01 Michael Neuman Dynamic variation of output media signal in response to input media signal
AUPS058602A0 (en) * 2002-02-15 2002-03-14 Canon Kabushiki Kaisha Representing a plurality of independent data items
US6937950B2 (en) * 2002-12-26 2005-08-30 International Business Machines Corporation Animated graphical object notification system
US8079052B2 (en) * 2004-04-23 2011-12-13 Concurrent Computer Corporation Methods, apparatuses, and systems for presenting advertisement content within trick files
US7990386B2 (en) * 2005-03-24 2011-08-02 Oracle America, Inc. Method for correlating animation and video in a computer system
BRPI0613542B1 (pt) 2005-06-02 2018-05-08 Tencent Tech Shenzhen Co Ltd Method for displaying animation
AU2005202866A1 (en) * 2005-06-29 2007-01-18 Canon Kabushiki Kaisha Storing video data in a video file
US20070030273A1 (en) * 2005-08-08 2007-02-08 Lager Interactive Inc. Method of serially connecting animation groups for producing computer game
US7675520B2 (en) * 2005-12-09 2010-03-09 Digital Steamworks, Llc System, method and computer program for creating two dimensional (2D) or three dimensional (3D) computer animation from video
US7911467B2 (en) * 2005-12-30 2011-03-22 Hooked Wireless, Inc. Method and system for displaying animation with an embedded system graphics API
CN101005609B (zh) 2006-01-21 2010-11-03 Tencent Technology (Shenzhen) Co., Ltd. Method and system for generating interactive video images
US8645991B2 (en) * 2006-03-30 2014-02-04 Tout Industries, Inc. Method and apparatus for annotating media streams
WO2007134115A2 (en) * 2006-05-09 2007-11-22 Disney Enterprises, Inc. Interactive animation
US20080165195A1 (en) * 2007-01-06 2008-07-10 Outland Research, Llc Method, apparatus, and software for animated self-portraits
WO2009137368A2 (en) * 2008-05-03 2009-11-12 Mobile Media Now, Inc. Method and system for generation and playback of supplemented videos
US8594740B2 (en) * 2008-06-11 2013-11-26 Pantech Co., Ltd. Mobile communication terminal and data input method
WO2010051493A2 (en) * 2008-10-31 2010-05-06 Nettoons, Inc. Web-based real-time animation visualization, creation, and distribution
US8493408B2 (en) * 2008-11-19 2013-07-23 Apple Inc. Techniques for manipulating panoramas
US20100134499A1 (en) * 2008-12-03 2010-06-03 Nokia Corporation Stroke-based animation creation
US8836706B2 (en) * 2008-12-18 2014-09-16 Microsoft Corporation Triggering animation actions and media object actions
EP2387850A4 (en) * 2009-01-14 2012-07-18 Innovid Inc OBJECTS ASSOCIATED WITH A VIDEO
CA2760289A1 (en) * 2009-04-27 2010-11-11 Sonoma Data Solutions Llc A method and apparatus for character animation
US8369974B2 (en) * 2009-06-16 2013-02-05 Kyran Daisy Virtual phonograph
US10721526B2 (en) * 2009-12-15 2020-07-21 Sony Corporation Enhancement of main items video data with supplemental audio or video
US20110170008A1 (en) * 2010-01-13 2011-07-14 Koch Terry W Chroma-key image animation tool
US9373186B2 (en) * 2010-06-14 2016-06-21 Nintendo Co., Ltd. Device and method utilizing animated frames to dynamically create snapshots for selectable menus
US9071885B2 (en) * 2010-08-18 2015-06-30 Demand Media, Inc. Systems, methods, and machine-readable storage media for presenting animations overlying multimedia files
WO2012040827A2 (en) * 2010-10-01 2012-04-05 Smart Technologies Ulc Interactive input system having a 3d input space
US20120123865A1 (en) * 2010-11-12 2012-05-17 Cellco Partnership D/B/A Verizon Wireless Enhanced shopping experience for mobile station users
JP5645626B2 (ja) * 2010-12-06 2014-12-24 Canon Kabushiki Kaisha Display control device, display control method, program, and storage medium
JP6033792B2 (ja) * 2011-01-18 2016-11-30 Savant Systems LLC Remote control interface providing head-up operation and visual feedback
US20120326993A1 (en) * 2011-01-26 2012-12-27 Weisman Jordan K Method and apparatus for providing context sensitive interactive overlays for video
US8990689B2 (en) * 2011-02-03 2015-03-24 Sony Corporation Training for substituting touch gestures for GUI or hardware keys to control audio video play
US8527359B1 (en) * 2011-02-23 2013-09-03 Amazon Technologies, Inc. Immersive multimedia views for items
US20140026036A1 (en) * 2011-07-29 2014-01-23 Nbor Corporation Personal workspaces in a computer operating environment
US20130031463A1 (en) * 2011-07-29 2013-01-31 Denny Jaeger Personal workspaces in a computer operating environment
US9007381B2 (en) * 2011-09-02 2015-04-14 Verizon Patent And Licensing Inc. Transition animation methods and systems
US9641790B2 (en) * 2011-10-17 2017-05-02 Microsoft Technology Licensing, Llc Interactive video program providing linear viewing experience
WO2013074926A1 (en) * 2011-11-18 2013-05-23 Lucasfilm Entertainment Company Ltd. Path and speed based character control
KR101870775B1 (ko) * 2012-02-08 2018-06-26 Samsung Electronics Co., Ltd. Method and apparatus for playing animation in a portable terminal
WO2013116937A1 (en) * 2012-02-09 2013-08-15 Flixel Photos Inc. Systems and methods for creation and sharing of selectively animated digital photos
US8471857B1 (en) * 2012-04-12 2013-06-25 Google Inc. Changing animation displayed to user
US9465882B2 (en) * 2012-07-19 2016-10-11 Adobe Systems Incorporated Systems and methods for efficient storage of content and animation
US9254436B2 (en) * 2012-07-23 2016-02-09 Zynga Inc. Regular visitor to friend board in viral game
KR20140017397A (ko) * 2012-08-01 2014-02-11 Samsung Electronics Co., Ltd. Electronic device and method for searching a map using browsing history
US9826286B2 (en) * 2012-09-18 2017-11-21 Viacom International Inc. Video editing method and tool
US9558578B1 (en) * 2012-12-27 2017-01-31 Lucasfilm Entertainment Company Ltd. Animation environment
US9349206B2 (en) * 2013-03-08 2016-05-24 Apple Inc. Editing animated objects in video
CN105164628B (zh) * 2013-03-14 2018-11-16 Huawei Technologies Co., Ltd. Lens touch graphic effect for mobile devices
US20140280490A1 (en) * 2013-03-15 2014-09-18 Atakan Artun Systems and methods for visual communication
US20140317511A1 (en) * 2013-04-18 2014-10-23 Google Inc. Systems and Methods for Generating Photographic Tours of Geographic Locations
US20140355961A1 (en) * 2013-05-31 2014-12-04 Microsoft Corporation Using simple touch input to create complex video animation
US10332233B2 (en) * 2013-08-14 2019-06-25 Flipboard, Inc. Preloading animation files in a memory of a client device
CN104423814A (zh) * 2013-08-20 2015-03-18 Tencent Technology (Shenzhen) Co., Ltd. Method and browser for controlling interaction with network media information
KR102173123B1 (ko) * 2013-11-22 2020-11-02 Samsung Electronics Co., Ltd. Method and apparatus for recognizing a specific object in an image in an electronic device
CN103678631B (zh) * 2013-12-19 2016-10-05 Huawei Technologies Co., Ltd. Page rendering method and apparatus
CN104899912B (zh) * 2014-03-07 2019-07-05 Tencent Technology (Shenzhen) Co., Ltd. Animation production and playback methods and devices
US20150339006A1 (en) * 2014-05-21 2015-11-26 Facebook, Inc. Asynchronous Preparation of Displayable Sections of a Graphical User Interface
US10042537B2 (en) * 2014-05-30 2018-08-07 Apple Inc. Video frame loupe
WO2016013893A1 (en) * 2014-07-25 2016-01-28 Samsung Electronics Co., Ltd. Displaying method, animation image generating method, and electronic device configured to execute the same
US9471203B1 (en) * 2014-09-02 2016-10-18 Audible, Inc. Presenting animated visual supplemental content
CN106201161B (zh) * 2014-09-23 2021-09-03 Beijing Samsung Telecommunication Technology Research Co., Ltd. Display method and system for an electronic device
US10476937B2 (en) * 2014-10-20 2019-11-12 Facebook, Inc. Animation for image elements in a display layout
US9472009B2 (en) * 2015-01-13 2016-10-18 International Business Machines Corporation Display of context based animated content in electronic map
US10685471B2 (en) * 2015-05-11 2020-06-16 Facebook, Inc. Methods and systems for playing video while transitioning from a content-item preview to the content item
US10871868B2 (en) * 2015-06-05 2020-12-22 Apple Inc. Synchronized content scrubber
EP3113470B1 (en) * 2015-07-02 2020-12-16 Nokia Technologies Oy Geographical location visual information overlay
US9532106B1 (en) * 2015-07-27 2016-12-27 Adobe Systems Incorporated Video character-based content targeting
US9665972B2 (en) * 2015-07-28 2017-05-30 Google Inc. System for compositing educational video with interactive, dynamically rendered visual aids
US10356493B2 (en) * 2015-12-22 2019-07-16 Google Llc Methods, systems, and media for presenting interactive elements within video content
US20170178685A1 (en) * 2015-12-22 2017-06-22 Le Holdings (Beijing) Co., Ltd. Method for intercepting video animation and electronic device
WO2017117422A1 (en) * 2015-12-29 2017-07-06 Echostar Technologies L.L.C Methods and apparatus for presenting advertisements during playback of recorded television content
JP2019511139A (ja) * 2015-12-29 2019-04-18 ImpressView Inc. System and method for presenting video and associated documents and for tracking the viewing thereof
CN105847998A (zh) * 2016-03-28 2016-08-10 Le Holdings (Beijing) Co., Ltd. Video playing method, playing terminal and media server
DK201670596A1 (en) * 2016-06-12 2018-02-19 Apple Inc Digital touch on live video
CN106210808B (zh) 2016-08-08 2019-04-16 Tencent Technology (Shenzhen) Co., Ltd. Media information delivery method, terminal, server and system
US10127632B1 (en) * 2016-09-05 2018-11-13 Google Llc Display and update of panoramic image montages
US10809956B1 (en) * 2016-11-17 2020-10-20 Pinterest, Inc. Supplemental content items
US10321092B2 (en) * 2016-12-28 2019-06-11 Facebook, Inc. Context-based media effect application
US10521468B2 (en) * 2017-06-13 2019-12-31 Adobe Inc. Animated seek preview for panoramic videos
US10878851B2 (en) * 2017-08-18 2020-12-29 BON2 Media Services LLC Embedding interactive content into a shareable online video
CN109420338A (zh) * 2017-08-31 2019-03-05 Tencent Technology (Shenzhen) Co., Ltd. Virtual scene display method and apparatus simulating camera movement, and electronic device
US11259088B2 (en) * 2017-10-27 2022-02-22 Google Llc Previewing a video in response to computing device interaction
CN108012179B (zh) * 2017-11-08 2020-08-21 Beijing Mijing Hefeng Technology Co., Ltd. Live-streaming-based data analysis method, apparatus and terminal device
US10599289B1 (en) * 2017-11-13 2020-03-24 Snap Inc. Interface to display animated icon
CN108256062B (zh) * 2018-01-16 2020-11-24 Ctrip Travel Information Technology (Shanghai) Co., Ltd. Web page animation implementation method and apparatus, electronic device, and storage medium
US10616666B1 (en) * 2018-02-27 2020-04-07 Halogen Networks, LLC Interactive sentiment-detecting video streaming system and method
US10375313B1 (en) * 2018-05-07 2019-08-06 Apple Inc. Creative camera
US10546409B1 (en) * 2018-08-07 2020-01-28 Adobe Inc. Animation production system
TWI728446B (zh) * 2019-08-28 2021-05-21 ASUSTeK Computer Inc. User interface control method and electronic device
US10769682B1 (en) * 2019-09-11 2020-09-08 James Trevor McDonald Touch and hold system and method to activate and control mobile content


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3886449A4 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112650401A (zh) * 2020-12-24 2021-04-13 Beijing Baidu Netcom Science and Technology Co., Ltd. Information carousel method and apparatus, electronic device and storage medium
CN112650401B (zh) * 2020-12-24 2023-09-01 Beijing Baidu Netcom Science and Technology Co., Ltd. Information carousel method and apparatus, electronic device and storage medium
CN114449336A (zh) * 2022-01-20 2022-05-06 Hangzhou Hikvision Digital Technology Co., Ltd. Vehicle trajectory animation playing method, apparatus and device
CN114449336B (zh) * 2022-01-20 2023-11-21 Hangzhou Hikvision Digital Technology Co., Ltd. Vehicle trajectory animation playing method, apparatus and device
CN114928755A (zh) * 2022-05-10 2022-08-19 MIGU Culture Technology Co., Ltd. Video production method, electronic device and computer-readable storage medium
CN114928755B (zh) * 2022-05-10 2023-10-20 MIGU Culture Technology Co., Ltd. Video production method, electronic device and computer-readable storage medium

Also Published As

Publication number Publication date
EP3886449A1 (en) 2021-09-29
CN110166842A (zh) 2019-08-23
US11528535B2 (en) 2022-12-13
US20210051374A1 (en) 2021-02-18
EP3886449A4 (en) 2021-11-03
CN110166842B (zh) 2020-10-16

Similar Documents

Publication Publication Date Title
WO2020103657A1 (zh) Video file playing method, apparatus and storage medium
US11158102B2 (en) Method and apparatus for processing information
US9628744B2 (en) Display apparatus and control method thereof
JP2019102063A (ja) Page control method and apparatus
US10921979B2 (en) Display and processing methods and related apparatus
US9864612B2 (en) Techniques to customize a user interface for different displays
WO2017166622A1 (zh) Video playing method, playing terminal and media server
US20190130185A1 (en) Visualization of Tagging Relevance to Video
US20130268826A1 (en) Synchronizing progress in audio and text versions of electronic books
WO2015027912A1 (en) Method and system for controlling process for recording media content
US20110231194A1 (en) Interactive Speech Preparation
AU2012359080A1 (en) Managing playback of supplemental information
US20150347461A1 (en) Display apparatus and method of providing information thereof
WO2023071917A1 (zh) Virtual object interaction method and apparatus, storage medium and computer program product
US11516550B2 (en) Generating an interactive digital video content item
TW201421341A (zh) System and method for generating application page templates for mobile devices, and recording medium thereof
CN113191184A (zh) Real-time video processing method and apparatus, electronic device and storage medium
US20170017632A1 (en) Methods and Systems of Annotating Local and Remote Display Screens
JP2016523011A (ja) System and method for displaying annotated video content by mobile computing devices
WO2023241360A1 (zh) Voice interaction method, apparatus, device and storage medium for online classroom
US20170004859A1 (en) User created textbook
US20230043683A1 (en) Determining a change in position of displayed digital content in subsequent frames via graphics processing circuitry
CN114449355B (zh) Live-streaming interaction method, apparatus, device and storage medium
WO2022231703A1 (en) Integrating overlaid digital content into displayed data via processing circuitry using a computing memory and an operating system memory
CN109343761B (zh) Data processing method based on intelligent interactive device, and related device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19886284

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2019886284

Country of ref document: EP

Effective date: 20210621