CN112423087A - Video interaction information display method and terminal equipment

Info

Publication number: CN112423087A
Application number: CN202011289985.6A
Authority: CN (China)
Other languages: Chinese (zh)
Inventors: 盛碧星, 张升辉, 李璋毅
Current Assignee: Beijing Zitiao Network Technology Co Ltd
Original Assignee: Beijing Zitiao Network Technology Co Ltd
Application filed by Beijing Zitiao Network Technology Co Ltd
Priority application: CN202011289985.6A
Prior art keywords: user, target, video, target video, interface
Legal status: Pending (assumed status; not a legal conclusion)

Classifications

    • H04N 21/4312 - Generation of visual interfaces for content selection or interaction; content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/4316 - Generation of visual interfaces involving specific graphical features for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
    • H04N 21/44222 - Monitoring of end-user related data; analytics of user selections, e.g. selection of programs or purchase activity
    • H04N 21/4508 - Management operations performed by the client; management of client data or end-user data
    • H04N 21/47217 - End-user interface for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • H04N 21/4756 - End-user interface for inputting end-user data for rating content, e.g. scoring a recommended movie
    • H04N 21/4788 - Supplemental services communicating with other users, e.g. chatting
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Abstract

The embodiments of the present disclosure provide a video interaction information display method and a terminal device, and relate to the technical field of terminals. The method includes: displaying a first interface, where the first interface includes a target control, and the target control is used to trigger display of interaction information of a target video; receiving a first operation, where the first operation is an operation on the target control; and, in response to the first operation, displaying a second interface, where the second interface includes a time axis of the target video and identifiers of the interaction information input by users to the target video. The identifier of each piece of interaction information is located at the time corresponding to that interaction information on the time axis; the time corresponding to a piece of interaction information on the time axis is the time corresponding to its playing progress, and the playing progress of a piece of interaction information is the playing progress of the target video when the user input that interaction information. The embodiments of the present disclosure are used to solve the problem that it is difficult for a user to acquire the interaction information of a video.

Description

Video interaction information display method and terminal equipment
Technical Field
The disclosure relates to the technical field of terminals, in particular to a video interaction information display method and terminal equipment.
Background
With the popularization of mobile terminals and the increase of network speeds, video has become one of the mainstream content transmission media by virtue of its strong appeal, varied forms and content, strong social attributes, fast transmission, simple production, and other characteristics.
Both video creators and video viewers generally care a great deal about how other users interact with the videos they create or watch. For example, when a teacher makes teaching content into a teaching video and publishes it to a designated location for students to watch, the teacher needs to analyze the interaction between the students and the teaching video to obtain information such as how well the students understand the teaching content. For another example, after a short-video creator publishes a short video to a content platform, the creator and some short-video viewers need to learn how much users like the short video based on the interaction between users and the short video. However, in the prior art, it is difficult for a user to acquire the interaction information of a video.
Disclosure of Invention
In view of this, the present disclosure provides a video interaction information display method and a terminal device, which are used to solve the problem that it is difficult for a user to acquire the interaction information of a video.
In order to achieve the above object, the embodiments of the present disclosure provide the following technical solutions:
in a first aspect, an embodiment of the present disclosure provides a method for displaying video interaction information, where the method includes:
displaying a first interface, wherein the first interface comprises a target control, and the target control is used for triggering and displaying the interactive information of a target video;
receiving a first operation, wherein the first operation is an operation on the target control;
in response to the first operation, displaying a second interface, the second interface including: a time axis of the target video and identifiers of the interaction information input by users to the target video, where the identifier of each piece of interaction information is located at the time corresponding to that interaction information on the time axis, the time corresponding to a piece of interaction information on the time axis is the time corresponding to its playing progress on the time axis, and the playing progress of a piece of interaction information is used to represent the playing progress of the target video when the user input that interaction information.
As an optional implementation manner of the embodiment of the present disclosure, the interaction information includes: at least one of comments, likes, and expressions.
As an optional implementation manner of the embodiment of the present disclosure, the second interface further includes: user information;
the user information includes an identification of each user who accessed the video playback interface of the target video.
As an optional implementation manner of the embodiment of the present disclosure, the user information further includes: statistical information of the users who have accessed the video playing interface of the target video; and the statistical information of each user is displayed in association with the identifier of that user.
As an optional implementation manner of the embodiment of the present disclosure, the statistical information of the user includes at least one of the following information:
the play ratio of the user, which is used to represent the ratio of the duration of the video clips played by the user in the target video to the total duration of the target video;
the access duration of the user, which is used to represent the length of time the user has accessed the video playing page of the target video;
the playing duration of the user, which is used to represent the duration of the video clips played by the user in the target video;
the number of comments input by the user to the target video;
the number of expressions input by the user to the target video;
and the number of times the user has liked the target video.
As an optional implementation manner of the embodiment of the present disclosure, the method further includes:
receiving a second operation, wherein the second operation is a trigger operation on the identification of the target user in the user information;
in response to the second operation, displaying a third interface, the third interface comprising: the interaction information of the target user.
As an optional implementation manner of the embodiment of the present disclosure, the third interface specifically includes:
the time axis of the target video and the identification of the target user for the interaction information input by the target video;
the identifier of each piece of interaction information is located at the time corresponding to that interaction information on the time axis, the time corresponding to a piece of interaction information on the time axis is the time corresponding to its playing progress on the time axis, and the playing progress of a piece of interaction information is used to represent the playing progress of the target video when the target user input that interaction information.
As an optional implementation manner of the embodiment of the present disclosure, the method further includes:
highlighting the target time period on the time axis in a preset mode in the third interface;
wherein the target time period is used for characterizing the video segment played by the target user in the target video.
As an optional implementation manner of the embodiment of the present disclosure, the third interface further includes: the user information;
and the identification of the target user is highlighted in the user information in a preset mode.
As an optional implementation manner of the embodiment of the present disclosure, the third interface further includes: an identification of the target user;
and the identification of the target user is displayed in association with the interactive information of the target user.
As an optional implementation manner of the embodiment of the present disclosure, the identifier of the interactive information includes at least one first identifier, where the first identifier is used to identify a comment input by a user on the target video; the method further comprises the following steps:
receiving a third operation, wherein the third operation is an operation on a target first identifier in the at least one first identifier;
and responding to the third operation, and displaying the content of the comment corresponding to the target first identification.
As an optional implementation manner of the embodiment of the present disclosure, the second interface further includes at least one of the following information:
the number of users who have accessed the video playing interface of the target video;
the average play ratio of the users who have accessed the video playing interface of the target video;
the average access duration of the users who have accessed the video playing interface of the target video;
the average playing duration of the users who have accessed the video playing interface of the target video;
the total number of comments input by users to the target video;
the total number of expressions input by users to the target video;
and the number of users who have liked the target video.
In a second aspect, an embodiment of the present disclosure provides a terminal device, including:
the display unit is used for displaying a first interface, the first interface comprises a target control, and the target control is used for triggering and displaying the interactive information of a target video;
a receiving unit, configured to receive a first operation, where the first operation is an operation on the target control;
the display unit is further configured to display a second interface in response to the first operation, where the second interface includes: a time axis of the target video and identifiers of the interaction information input by users to the target video, where the identifier of each piece of interaction information is located at the time corresponding to that interaction information on the time axis, the time corresponding to a piece of interaction information on the time axis is the time corresponding to its playing progress on the time axis, and the playing progress of a piece of interaction information is used to represent the playing progress of the target video when the user input that interaction information.
As an optional implementation manner of the embodiment of the present disclosure, the interaction information includes: at least one of comments, likes, and expressions.
As an optional implementation manner of the embodiment of the present disclosure, the second interface further includes: user information;
the user information includes an identification of a user who accessed the video playback interface of the target video.
As an optional implementation manner of the embodiment of the present disclosure, the user information further includes: statistical information of the users who have accessed the video playing interface of the target video; and the statistical information of each user is displayed in association with the identifier of that user.
As an optional implementation manner of the embodiment of the present disclosure, the statistical information of the user includes at least one of the following information:
the play ratio of the user, which is used to represent the ratio of the duration of the video clips played by the user in the target video to the total duration of the target video;
the access duration of the user, which is used to represent the length of time the user has accessed the video playing page of the target video;
the playing duration of the user, which is used to represent the duration of the video clips played by the user in the target video;
the number of comments input by the user to the target video;
the number of expressions input by the user to the target video;
and the number of times the user has liked the target video.
As an alternative implementation of the disclosed embodiments,
the receiving unit is further configured to receive a second operation, where the second operation is a trigger operation on an identifier of a target user in the user information;
the display unit is further configured to display a third interface in response to the second operation, where the third interface includes: the interaction information of the target user.
As an optional implementation manner of the embodiment of the present disclosure, the third interface specifically includes:
the time axis of the target video and the identification of the target user for the interaction information input by the target video;
the identification of each interactive information is located at the corresponding time of each interactive information on a time axis, the corresponding time of each interactive information on the time axis is the time corresponding to the playing progress of each interactive information on the time axis, and the playing progress of each interactive information is used for representing the playing progress of the target video when the target user inputs each interactive information.
As an optional implementation manner of the embodiment of the present disclosure, the display unit is further configured to highlight, in a preset manner, the target time period on the time axis in the third interface;
wherein the target time period is used for characterizing the video segment played by the target user in the target video.
As an optional implementation manner of the embodiment of the present disclosure, the third interface further includes: the user information;
and the identification of the target user is highlighted in the user information in a preset mode.
As an optional implementation manner of the embodiment of the present disclosure, the third interface further includes: an identification of the target user;
and the identification of the target user is displayed in association with the interactive information of the target user.
As an optional implementation manner of the embodiment of the present disclosure, the identifier of the interactive information includes at least one first identifier, where the first identifier is used to identify a comment input by a user on the target video;
the receiving unit is further configured to receive a third operation, where the third operation is an operation on a target first identifier in the at least one first identifier;
the display unit is further configured to display content of a comment corresponding to the target first identifier in response to the third operation.
As an optional implementation manner of the embodiment of the present disclosure, the second interface further includes at least one of the following information:
the number of users who have accessed the video playing interface of the target video;
the average play ratio of the users who have accessed the video playing interface of the target video;
the average access duration of the users who have accessed the video playing interface of the target video;
the average playing duration of the users who have accessed the video playing interface of the target video;
the total number of comments input by users to the target video;
the total number of expressions input by users to the target video;
and the number of users who have liked the target video.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: a memory for storing a computer program and a processor; the processor is configured to execute the video interaction information presentation method according to the first aspect or any one of the optional embodiments of the first aspect when the computer program is invoked.
In a fourth aspect, an embodiment of the present disclosure provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the video interaction information displaying method according to the first aspect or any optional implementation manner of the first aspect.
According to the video interaction information display method provided by the embodiments of the present disclosure, a target control for triggering display of the interaction information of a target video is displayed on a first interface, a first operation on the target control is received, and, in response to the first operation, a second interface is displayed that includes a time axis of the target video and identifiers of the interaction information input by users to the target video. Because the identifier of each piece of interaction information in the second interface is located at the time corresponding to that interaction information on the time axis, the time corresponding to a piece of interaction information on the time axis is the time corresponding to its playing progress, and the playing progress of a piece of interaction information represents the playing progress of the target video when the user input it, the user can obtain through the second interface both the amount of interaction information input by users and the playing progress of the target video when each piece of interaction information was input, and can therefore associate the interaction information with the playing progress of the target video. The embodiments of the present disclosure can thus solve the problem that it is difficult for a user to acquire the interaction information of a video.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
To more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below; it is obvious that, for those skilled in the art, other drawings can also be obtained from these drawings without inventive effort.
Fig. 1 is a flowchart illustrating steps of a video interaction information displaying method according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a first interface provided by an embodiment of the present disclosure;
fig. 3 is one of schematic diagrams of application scenarios provided by an embodiment of the present disclosure;
FIG. 4 is one of the schematic diagrams of a second interface provided by the embodiments of the present disclosure;
fig. 5 is a second schematic diagram of a second interface provided in an embodiment of the disclosure;
fig. 6 is a second flowchart illustrating steps of a video interaction information displaying method according to an embodiment of the disclosure;
fig. 7 is a second schematic view of an application scenario provided by the embodiment of the present disclosure;
FIG. 8 is one of the schematic diagrams of a third interface provided by the embodiments of the present disclosure;
fig. 9 is a second schematic view of a third interface provided by the embodiment of the present disclosure;
fig. 10 is a third schematic view of a third interface provided by the embodiment of the present disclosure;
fig. 11 is a third schematic view of a second interface provided by the embodiment of the present disclosure;
fig. 12 is a third flowchart illustrating steps of a video interaction information display method provided by an embodiment of the present disclosure;
fig. 13 is a third schematic view of an application scenario provided by the embodiment of the present disclosure;
fig. 14 is a schematic structural diagram of a terminal device provided in the embodiment of the present disclosure;
fig. 15 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure.
Detailed Description
In order that the above objects, features and advantages of the present disclosure may be more clearly understood, aspects of the present disclosure will be further described below. It should be noted that the embodiments and features of the embodiments of the present disclosure may be combined with each other without conflict.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure, but the present disclosure may be practiced in other ways than those described herein; it is to be understood that the embodiments disclosed in the specification are only a few embodiments of the present disclosure, and not all embodiments.
The terms "first" and "second," and the like, in the description and claims of this disclosure are used to distinguish between synchronized objects, and are not used to describe a particular order of objects. For example, the first and second operations are for distinguishing between different operations and are not intended to describe a particular order of operations.
In the disclosed embodiments, words such as "exemplary" or "for example" are used to mean serving as an example, instance, or illustration. Any embodiment or design described as "exemplary" or "e.g.," in an embodiment of the present disclosure is not to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion. Further, in the description of the embodiments of the present disclosure, the meaning of "a plurality" means two or more unless otherwise specified.
The execution subject of the video interaction information display method provided by the embodiments of the present disclosure may be a terminal device. The terminal device may be a mobile phone, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a smart watch, a smart bracelet, or another type of terminal device; the type of the terminal device is not limited in the embodiments of the present disclosure.
The embodiment of the present disclosure provides a method for displaying video interactive information, referring to fig. 1, the method for displaying video interactive information includes the following steps S101 to S103:
and S101, displaying a first interface.
The first interface comprises a target control, and the target control is used for triggering and displaying the interactive information of the target video.
Specifically, the first interface in the embodiments of the present disclosure may be a playing interface of the target video that the terminal device is triggered to display when the user operates the terminal device, an interface for displaying comments on the target video that the terminal device is triggered to display when the user operates the terminal device, or an information flow page displayed by the terminal device.
Illustratively, referring to fig. 2, fig. 2 takes the first interface 20 being a playing detail page of the target video as an example. As shown in fig. 2, the first interface 20 includes a playing window 21 of the target video, a comment display area 22 of the target video, and a target control 23. The target control 23 is displayed at the top of the comment display area 22, and comments made by users are displayed in the comment display area 22.
S102, receiving a first operation.
Wherein the first operation is an operation on the target control.
Specifically, the first operation in the embodiments of the present disclosure may be a touch click operation performed by the user on the target control, a click operation input by the user on the target control through a peripheral device such as a mouse, a voice instruction input by the user, or a specific gesture input by the user.
In some embodiments of the present disclosure, the specific gesture may be any one of a single-tap gesture, a slide gesture, a pressure recognition gesture, a long-press gesture, an area change gesture, a double-press gesture, and a double-tap gesture.
And S103, responding to the first operation, and displaying a second interface.
Wherein the second interface includes: a time axis of the target video and identifiers of the interaction information input by users to the target video, where the identifier of each piece of interaction information is located at the time corresponding to that interaction information on the time axis, the time corresponding to a piece of interaction information on the time axis is the time corresponding to its playing progress on the time axis, and the playing progress of a piece of interaction information is used to represent the playing progress of the target video when the user input that interaction information.
Specifically, in the embodiments of the present disclosure, each piece of interaction information has a corresponding playing progress, and the playing progress of any piece of interaction information is used to represent the playing progress of the target video when the user input that interaction information. For example, if a user inputs first interaction information when the target video has played to three minutes and two seconds, the playing progress of the first interaction information is 03:02; for another example, if a user inputs second interaction information when the target video has played to five minutes and twenty-seven seconds, the playing progress of the second interaction information is 05:27.
Further, in this embodiment of the present disclosure, each piece of interactive information corresponds to a time on a time axis of the target video, and the time corresponding to any piece of interactive information on the time axis is a time corresponding to the playing progress of the piece of interactive information on the time axis. For example: the playing progress of the first interactive information is 03:02, and the corresponding time of the first interactive information on the time axis of the target video is 03: 02; for another example: the playing progress of the second interactive information is 05:27, and the corresponding time of the second interactive information on the time axis of the target video is 05: 27.
As described in the above example, the playing progress of the first interaction information is 03:02, and the time on the time axis of the target video corresponding to this playing progress is 03:02, so the identifier of the first interaction information is displayed at 03:02 on the time axis. The playing progress of the second interaction information is 05:27, and the time on the time axis of the target video corresponding to this playing progress is 05:27, so the identifier of the second interaction information is displayed at 05:27 on the time axis.
It should be noted that, in the embodiments of the present disclosure, the identifier of a piece of interaction information being located at a certain time on the time axis of the target video may mean that it is located at a position overlapping that time, or at a position corresponding to that time; specifically, the position corresponding to any time on the time axis may be above or below that time.
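For illustration only, the mapping from a piece of interaction information's playing progress to a position on the displayed time axis can be sketched as follows. The patent does not prescribe any implementation; the TypeScript form, all names, and the pixel-based layout are assumptions.

```typescript
// Illustrative sketch only (not from the patent): placing interaction-information
// identifiers on a horizontal time axis. All names and the pixel-based layout are assumptions.

interface Interaction {
  userId: string;
  kind: "comment" | "like" | "expression";
  progressSec: number; // playing progress of the target video when the interaction was input
}

// Maps an interaction's playing progress to a horizontal offset on the rendered time axis,
// so its identifier can be drawn at (or above/below) the corresponding moment.
function markerOffsetPx(
  interaction: Interaction,
  videoDurationSec: number,
  timeAxisWidthPx: number
): number {
  const clamped = Math.min(Math.max(interaction.progressSec, 0), videoDurationSec);
  return (clamped / videoDurationSec) * timeAxisWidthPx;
}

// Example: a comment input at 03:02 of a 10:00 video, on a 600 px wide time axis,
// lands 182 px from the left edge of the axis (182 s / 600 s * 600 px).
const offset = markerOffsetPx(
  { userId: "u1", kind: "comment", progressSec: 182 },
  600,
  600
);
console.log(offset); // 182
```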
Optionally, the interaction information includes: at least one of comment, like, and expression (action).
That is, the interactive information in the embodiment of the present disclosure may include: comments, likes, and expressions.
Further optionally, the identifier of each comment is a first identifier;
the identifier of each like is a second identifier;
and the identifier of any expression is the expression itself.
That is, all comments use one kind of identifier, all likes use another kind of identifier, and each expression is identified by the expression itself.
Illustratively, when receiving a first operation on the target control 23 in the first interface 20 shown in fig. 2, the terminal device displays the second interface 30 (shown in fig. 3) in response to the first operation. The second interface includes the time axis 31 of the target video and identifiers of the interaction information input by users to the target video, for example an identifier 32 representing a comment input by a user to the target video, an identifier 33 representing a like input by a user to the target video, and identifiers 34 and 35 representing expressions input by users to the target video, where the identifier of each piece of interaction information is displayed in association with the time corresponding to that interaction information on the time axis.
According to the video interaction information display method provided by the embodiments of the present disclosure, a target control for triggering display of the interaction information of a target video is displayed on a first interface, a first operation on the target control is received, and, in response to the first operation, a second interface is displayed that includes a time axis of the target video and identifiers of the interaction information input by users to the target video. Because the identifier of each piece of interaction information in the second interface is located at the time corresponding to that interaction information on the time axis, the time corresponding to a piece of interaction information on the time axis is the time corresponding to its playing progress, and the playing progress of a piece of interaction information represents the playing progress of the target video when the user input it, the user can obtain through the second interface both the amount of interaction information input by users and the playing progress of the target video when each piece of interaction information was input, can associate the interaction information with the playing progress of the target video, and can thus obtain the related information of the interaction information in more detail. Therefore, the embodiments of the present disclosure can solve the problem that it is difficult for a user to acquire the interaction information of a video.
As an optional implementation manner of the embodiment of the present disclosure, referring to fig. 4, the second interface 30 further includes: user information 41;
the user information 41 includes an identifier of each user who has accessed the video playing interface of the target video.
It should be noted that, although fig. 4 presents the user information in the form of a user list, the embodiments of the present disclosure are not limited thereto; on the basis of the above embodiment, the user information may also be presented in other manners, as long as the identifier of each user who has accessed the video playing interface of the target video can be presented. For example, the user identifiers of the users who have accessed the target video page may be presented in a folded manner, and after a trigger operation of the user on the icon of the folded presentation is responded to, the user identifiers of all users who have accessed the target video page may be presented.
Specifically, the identification of the user may specifically be an avatar of the user and/or a user name of the user.
Since the second interface further includes user information, and a user who has accessed the video playing interface of the target video can be known in detail through the user information, the above embodiment can acquire the related information of the interactive information in more detail.
Further, on the basis of the foregoing embodiment, the user information further includes: statistical information of the users who have accessed the video playing interface of the target video; the statistical information of any user is displayed in association with the identifier of that user.
Optionally, the statistical information of any user includes at least one of the following information:
the play ratio of the user is used for representing the ratio of the time length of the video clip played by the user in the target video to the total time length of the target video;
the access duration of the user is used for representing the time length of the user for accessing the video playing page of the target video;
the playing time length of the user is used for representing the time length of the video clip played by the user in the target video;
the number of comments the user inputs to the target video;
the number of expressions input by the user to the target video;
and the number of times the user has liked the target video.
For example, for a target video with a total duration of 10 minutes, if a user plays a 3-minute video clip in the target video and does not play any other video clip in the target video, then the playing duration of the user is 3 minutes, the access duration of the user is 3 minutes, and the play ratio of the user is 30%.
Illustratively, for a target video with a total duration of 10 minutes, if a user plays the clip 0:00-3:00 of the target video the first time, plays the clip 2:00-6:00 of the target video the second time, and does not play any other video clip in the target video, then the playing duration of the user is 6 minutes, the access duration of the user is 7 minutes, and the play ratio of the user is 60%.
Illustratively, for a target video with a total duration of 10 minutes, if a user plays the clip 0:00-5:00 of the target video the first time, plays the clip 2:00-4:00 of the target video the second time, and does not play any other video clip in the target video, then the playing duration of the user is 5 minutes, the access duration of the user is 7 minutes, and the play ratio of the user is 50%.
Illustratively, for a target video with a total duration of 10 minutes, if a user repeatedly plays a 1-minute-30-second video clip of the target video 3 times and does not play any other video clip in the target video, then the playing duration of the user is 1 minute 30 seconds, the access duration of the user is 4 minutes 30 seconds, and the play ratio of the user is 15%.
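The relationship between playing duration, access duration, and play ratio in these examples can be captured in a short sketch. This is an illustrative reading of the examples, not an implementation from the patent; all type and field names are assumptions.

```typescript
// Illustrative sketch only: one possible reading of how the playing duration, access duration
// and play ratio in the examples above relate to recorded play sessions. The types and field
// names are assumptions, not structures defined by the patent.

interface PlaySession {
  startSec: number; // position in the target video where playback started
  endSec: number;   // position in the target video where playback stopped
}

function userStatistics(sessions: PlaySession[], videoDurationSec: number) {
  // Access duration: total time spent on the video playing page, read here as the sum of
  // the session lengths (replayed segments count every time).
  const accessDurationSec = sessions.reduce((sum, s) => sum + (s.endSec - s.startSec), 0);

  // Playing duration: length of the union of the played segments (overlaps count once).
  const sorted = [...sessions].sort((a, b) => a.startSec - b.startSec);
  let playingDurationSec = 0;
  let coveredUpTo = 0;
  for (const s of sorted) {
    const from = Math.max(s.startSec, coveredUpTo);
    if (s.endSec > from) {
      playingDurationSec += s.endSec - from;
      coveredUpTo = s.endSec;
    }
  }

  const playRatio = playingDurationSec / videoDurationSec;
  return { accessDurationSec, playingDurationSec, playRatio };
}

// Second example above: sessions 0:00-3:00 and 2:00-6:00 of a 10-minute video give a playing
// duration of 6 minutes, an access duration of 7 minutes and a play ratio of 60%.
console.log(userStatistics([{ startSec: 0, endSec: 180 }, { startSec: 120, endSec: 360 }], 600));
```

Under this reading, replayed segments add to the access duration but not to the playing duration, which matches the repeated-clip example above.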
For example, the statistical information of a user may be displayed in association with the identifier of the user by displaying the statistical information below or above the user identifier, and so on; the present disclosure does not limit the association display manner, as long as the correspondence between each piece of statistical information and the identifier of each user can be determined from their positional relationship in actual use.
Illustratively, referring to fig. 5, fig. 5 takes as an example that the statistical information of a user includes the play ratio of the user, the number of comments input by the user to the target video, and the number of expressions input by the user to the target video, that the identifier of a user is the user's avatar and the user's name, and that the statistical information of a user is displayed to the right of the user's avatar and below the user's name. As shown in fig. 5, the user information 41 further includes: the play ratio 51 of the user, the number 52 of comments input by the user to the target video, and the number 53 of expressions input by the user to the target video.
In the above embodiment, the user information further includes statistical information of the user, and the interaction information of each user can be obtained in detail through the statistical information, so that the above embodiment can obtain the relevant information of the interaction information in more detail.
As an optional implementation manner of the embodiment of the present disclosure, in a case that the second interface further includes user information, as shown in fig. 6, on the basis of the above step S101 to step S103, the method provided by the embodiment of the present disclosure further includes:
and S601, receiving a second operation.
Wherein the second operation is a trigger operation on an identifier of a target user in the user information.
Specifically, the second operation in the embodiment of the present disclosure may specifically be a touch click operation on an identifier of the target user, or a click operation input by the user on the identifier of the target user through a peripheral device such as a mouse, or a voice instruction input by the user, or a specific gesture input by the user.
In some embodiments of the present disclosure, the specific gesture may be any one of a single-tap gesture, a slide gesture, a pressure recognition gesture, a long-press gesture, an area change gesture, a double-press gesture, and a double-tap gesture.
And S602, responding to the second operation, and displaying a third interface.
Wherein the third interface comprises: the interaction information of the target user.
As an optional implementation manner of the embodiment of the present disclosure, the third interface specifically includes:
the time axis of the target video and the identification of the interaction information input by the target user to the target video.
The identification of each interactive information is located at the corresponding time of each interactive information on the time axis, the corresponding time of any interactive information on the time axis is the corresponding time of the playing progress of the interactive information on the time axis, and the playing progress of the interactive information is used for representing the playing progress of the target video when the target user inputs the interactive information.
Illustratively, referring to fig. 7, when a second operation on the identifier 410 of a second user in the user information 41 is received, the third interface 70 is displayed. The third interface 70 includes the time axis 71 of the target video and identifiers of the interaction information input by the second user to the target video, and the identifier of each piece of interaction information is located at the time corresponding to that interaction information on the time axis.
In the above embodiment, when a second operation on the identifier of the target user is received, a third interface is further displayed in response to the second operation, and the third interface includes the time axis of the target video and identifiers of the interaction information input by the target user to the target video. Because the identifier of each piece of interaction information in the third interface is located at the time corresponding to that interaction information on the time axis, the time corresponding to a piece of interaction information on the time axis is the time corresponding to its playing progress, and the playing progress of a piece of interaction information is the playing progress of the target video when the target user input it, the amount of interaction information input by the target user and the playing progress of the target video when each piece of interaction information was input can both be obtained through the third interface, and the interaction information input by the target user can therefore be associated with the playing progress of the target video. The above embodiment can thus acquire the related information of the interaction information of each user in more detail.
Further, on the basis of any of the above embodiments, the method for displaying video interaction information provided by the embodiment of the present disclosure further includes:
highlighting the target time period on the time axis in a preset mode in the third interface;
wherein the target time period is used for characterizing the video segment played by the target user in the target video.
Specifically, the target time period may be highlighted with a different brightness and/or a different color.
Illustratively, referring to fig. 8, fig. 8 takes as an example that the second user is the target user and that the second user has played 1:03-3:15, 4:20-5:30, and 9:00-10:00 of the target video. Because the second user has played these parts of the target video, the video clips played by the second user include three: the first video clip is 1:03-3:15 of the target video, the second video clip is 4:20-5:30 of the target video, and the third video clip is 9:00-10:00 of the target video. The time period 81 corresponding to the first video clip is 1:03-3:15 on the time axis 31, the time period 82 corresponding to the second video clip is 4:20-5:30 on the time axis 31, and the time period 83 corresponding to the third video clip is 9:00-10:00 on the time axis 31, so the time period 81, the time period 82, and the time period 83 on the time axis 31 are highlighted in a preset manner.
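A minimal sketch of how the played video clips could be turned into highlight spans over the time axis follows, reusing the pixel-based layout assumed earlier; the names and the rendering approach are assumptions, not taken from the patent.

```typescript
// Illustrative sketch only: turning the video clips played by the target user into highlight
// spans drawn over the time axis, e.g. as bars in a different color or brightness.

interface Segment {
  startSec: number;
  endSec: number;
}

function highlightSpans(
  playedSegments: Segment[],
  videoDurationSec: number,
  timeAxisWidthPx: number
): { leftPx: number; widthPx: number }[] {
  return playedSegments.map((seg) => ({
    leftPx: (seg.startSec / videoDurationSec) * timeAxisWidthPx,
    widthPx: ((seg.endSec - seg.startSec) / videoDurationSec) * timeAxisWidthPx,
  }));
}

// The fig. 8 example: clips 1:03-3:15, 4:20-5:30 and 9:00-10:00 of a 10-minute video become
// three spans to render in the highlight style on a 600 px wide time axis.
const spans = highlightSpans(
  [
    { startSec: 63, endSec: 195 },
    { startSec: 260, endSec: 330 },
    { startSec: 540, endSec: 600 },
  ],
  600,
  600
);
console.log(spans);
```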
As an optional implementation manner of the embodiment of the present disclosure, referring to fig. 9, the third interface 70 further includes: the user information 90;
the identification 410 of the target user is highlighted in the user information 90 in a preset manner.
Fig. 9 illustrates an example in which the identifier of the target user is highlighted by widening the border of the identifier of the target user, but the embodiments of the present disclosure are not limited thereto; in addition to the above manner, the identifier of the target user may also be highlighted by changing the color, brightness, size, and the like of the identifier of the target user.
As an optional implementation manner of the embodiment of the present disclosure, referring to fig. 10, the third interface 70 further includes: an identification 100 of the target user;
the identification 100 of the target user is displayed in association with the interaction information of the target user.
On the basis of any of the above embodiments, the second interface further includes at least one of the following information:
1. the number of users who have accessed the video playback interface of the target video.
2. The average play ratio of the users who have accessed the video playing interface of the target video.
Specifically, the average play ratio is the average value of the play ratios of the users who have accessed the video playing interface of the target video. For example: if three users have accessed the video playing interface of the target video and the play ratios of the three users are a%, b%, and c%, then the average play ratio of the users who have accessed the video playing interface of the target video is: (a% + b% + c%)/3.
3. The average access duration of the users who have accessed the video playing interface of the target video.
Specifically, the average access duration is the average value of the access durations of the users who have accessed the video playing interface of the target video. For example: if 2 users have accessed the video playing interface of the target video and the access durations of the 2 users are A and B, then the average access duration of the users who have accessed the video playing interface of the target video is: (A + B)/2.
4. The average playing duration of the users who have accessed the video playing interface of the target video.
Specifically, the average playing duration is the average value of the playing durations of the users who have accessed the video playing interface of the target video. For example: if 4 users have accessed the video playing interface of the target video and the playing durations of the four users are M, N, P, and Q, then the average playing duration of the users who have accessed the video playing interface of the target video is: (M + N + P + Q)/4.
5. The total number of comments input by users to the target video.
6. The total number of expressions input by users to the target video.
7. The number of users who have liked the target video.
Illustratively, referring to fig. 11, fig. 11 takes as an example that the second interface further includes: the number of users who have accessed the video playing interface of the target video, the average play ratio of those users, and the average playing duration of those users. As shown in fig. 11, the second interface 30 displays the number 111 of users who have accessed the video playing interface of the target video, the average play ratio 112 of those users, and the average playing duration 113 of those users. From fig. 11 it can be quickly obtained that 78 users have accessed the video playing interface of the target video, that their average play ratio is 76%, and that their average playing duration is 7 minutes and 36 seconds.
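A short sketch of how the interface-level figures listed above could be aggregated from per-user statistics is given below; this is an assumed reading rather than an implementation described by the patent, and all structures and names are illustrative.

```typescript
// Illustrative sketch only: aggregating per-user statistics into the interface-level figures
// listed above. The structures and field names are assumptions, not defined by the patent.

interface UserStats {
  playRatio: number;           // 0..1
  accessDurationSec: number;
  playingDurationSec: number;
  commentCount: number;
  expressionCount: number;
  likeCount: number;
}

function interfaceStatistics(users: UserStats[]) {
  const n = users.length;
  const avg = (f: (u: UserStats) => number) =>
    n === 0 ? 0 : users.reduce((sum, u) => sum + f(u), 0) / n;
  const total = (f: (u: UserStats) => number) => users.reduce((sum, u) => sum + f(u), 0);

  return {
    viewerCount: n,                                              // 1. users who accessed the interface
    averagePlayRatio: avg((u) => u.playRatio),                   // 2. average play ratio
    averageAccessDurationSec: avg((u) => u.accessDurationSec),   // 3. average access duration
    averagePlayingDurationSec: avg((u) => u.playingDurationSec), // 4. average playing duration
    totalComments: total((u) => u.commentCount),                 // 5. total comments
    totalExpressions: total((u) => u.expressionCount),           // 6. total expressions
    likeUserCount: users.filter((u) => u.likeCount > 0).length,  // 7. users who liked the video
  };
}
```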
As an optional implementation manner of this embodiment of the present disclosure, in a case that the identifier of the interactive information includes at least one first identifier for identifying a comment that a user inputs to the target video, referring to fig. 12, the method further includes:
and S121, receiving a third operation.
Wherein the third operation is an operation on a target first identifier of the at least one first identifier.
Specifically, the third operation in the embodiments of the present disclosure may be a touch click operation on the target first identifier, a click operation input by the user on the target first identifier through a peripheral device such as a mouse, an operation of hovering a mouse pointer over the target first identifier, a voice instruction input by the user, or a specific gesture input by the user.
In some embodiments of the present disclosure, the specific gesture may be any one of a single-tap gesture, a slide gesture, a pressure recognition gesture, a long-press gesture, an area change gesture, a double-press gesture, and a double-tap gesture.
And S122, responding to the third operation, and displaying the content of the comment corresponding to the target first identification.
It should be noted that, in the embodiment of the present disclosure, the display position of the content of the comment corresponding to the target first identifier is not limited. The content of the comment corresponding to the target first identifier may be displayed at any position according to actual requirements. For example: the content of the comment corresponding to the target first identifier may be displayed in an overlapping manner on the second interface.
Illustratively, referring to fig. 13, when a third operation on the first identifier 131 in the second interface 30 is received, the content 132 of the comment corresponding to the first identifier 131 is displayed on the second interface 30 in an overlapping manner for the user to view.
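As an illustrative aid only, the following TypeScript sketch mirrors the S121/S122 flow: a third operation on a target first identifier is resolved to its comment, which is then overlaid on the second interface. The lookup map, element handling and names are assumptions of this sketch rather than the disclosed implementation.

```typescript
// Hypothetical lookup from a first-identifier id to the comment content it identifies.
const commentsById = new Map<string, string>([
  ["first-id-131", "Example comment text shown when the marker is activated."],
]);

// Sketch of S121/S122: when a third operation (for example, a click) is received on a
// target first identifier, overlay the corresponding comment content on the second interface.
function onFirstIdentifierActivated(markerId: string, secondInterface: HTMLElement): void {
  const content = commentsById.get(markerId);
  if (content === undefined) {
    return; // no comment is associated with this identifier
  }
  const overlay = document.createElement("div");
  overlay.className = "comment-overlay"; // positioning/styling is left to the host page
  overlay.textContent = content;
  secondInterface.appendChild(overlay); // displayed in an overlapping manner, as in fig. 13
}
```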
Based on the same inventive concept, as an implementation of the foregoing method, an embodiment of the present disclosure further provides a terminal device. This terminal device embodiment corresponds to the foregoing method embodiment; for convenience of reading, details described in the foregoing method embodiment are not repeated one by one in this embodiment, but it should be clear that the terminal device in this embodiment can correspondingly implement all the contents of the foregoing method embodiment.
Fig. 14 is a schematic structural diagram of a terminal device provided in the embodiment of the present disclosure, and as shown in fig. 14, the terminal device 1400 provided in this embodiment includes:
the display unit 141 is configured to display a first interface, where the first interface includes a target control, and the target control is used to trigger display of interaction information of a target video;
a receiving unit 142, configured to receive a first operation, where the first operation is an operation on the target control;
the display unit 141 is further configured to display a second interface in response to the first operation, where the second interface includes: the time axis of the target video and the identification of the interactive information input by the user to the target video; the identification of the interactive information is located at the corresponding time of the interactive information on the time axis, the corresponding time of any interactive information on the time axis is the time corresponding to the playing progress of the interactive information on the time axis, and the playing progress of the interactive information is used for representing the playing progress of the target video when the user inputs the interactive information.
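As an illustrative aid only, a minimal TypeScript sketch of the placement rule described above is given below: each identification is positioned on the time axis in proportion to the playing progress at which the interaction was input. The interaction shape, the linear mapping and all names are assumptions of this sketch, not details taken from the disclosure.

```typescript
// Hypothetical interaction record; the shape is an assumption made for this sketch.
interface Interaction {
  userId: string;
  kind: "comment" | "like" | "expression";
  progressSec: number; // playing progress of the target video when the interaction was input
}

// Place an interaction's identification on the time axis: the horizontal offset is
// proportional to the playing progress at which the interaction was input.
function markerOffsetPx(
  interaction: Interaction,
  videoDurationSec: number,
  timeAxisWidthPx: number
): number {
  const clamped = Math.min(Math.max(interaction.progressSec, 0), videoDurationSec);
  return (clamped / videoDurationSec) * timeAxisWidthPx;
}

// Example: a comment input at 95 s of a 300 s video, on a 600 px wide time axis,
// would be drawn 190 px from the left edge of the axis.
const exampleOffset = markerOffsetPx(
  { userId: "u1", kind: "comment", progressSec: 95 },
  300,
  600
);
```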
As an optional implementation manner of the embodiment of the present disclosure, the interaction information includes: at least one of comments, likes, and expressions.
As an optional implementation manner of the embodiment of the present disclosure, the second interface further includes: user information;
the user information includes an identification of each user who accessed the video playback interface of the target video.
As an optional implementation manner of the embodiment of the present disclosure, the user information further includes: statistical information of users who have accessed the video playing interface of the target video; the statistical information of any user is displayed in association with the identification of that user.
As an optional implementation manner of the embodiment of the present disclosure, the statistical information of any user includes at least one of the following information:
the play ratio of the user is used for representing the ratio of the time length of the video clip played by the user in the target video to the total time length of the target video;
the access duration of the user is used for representing the time length of the user for accessing the video playing page of the target video;
the playing time length of the user is used for representing the time length of the video clip played by the user in the target video;
the number of comments the user inputs to the target video;
the number of expressions input by the user to the target video;
and the number of times the user has liked the target video.
As an alternative implementation of the disclosed embodiments,
the receiving unit 142 is further configured to receive a second operation, where the second operation is a trigger operation on an identifier of a target user in the user information;
the display unit 141 is further configured to display a third interface in response to the second operation, where the third interface includes: the interaction information of the target user.
As an optional implementation manner of the embodiment of the present disclosure, the third interface specifically includes:
the time axis of the target video and the identification of the target user for the interaction information input by the target video;
the identification of each interactive information is located at the corresponding time of each interactive information on the time axis, the corresponding time of any interactive information on the time axis is the corresponding time of the playing progress of the interactive information on the time axis, and the playing progress of the interactive information is used for representing the playing progress of the target video when the target user inputs the interactive information.
As an optional implementation manner of the embodiment of the present disclosure, the display unit 141 is further configured to highlight, in a preset manner, the target time period on the time axis in the third interface;
wherein the target time period is used for characterizing the video segment played by the target user in the target video.
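As an illustrative aid only, the following TypeScript sketch captures the two operations implied by the third-interface description: keeping the target user's interactions, and merging the video segments that user played into the target time period to be highlighted (the same segments also yield the user's play ratio). The types and the interval-merging logic are assumptions made for this sketch, not the disclosed implementation.

```typescript
// Hypothetical records used only for this sketch.
interface UserInteraction { userId: string; progressSec: number; }
interface PlayedSegment { startSec: number; endSec: number; }

// Keep only the target user's interactions for display on the third interface.
function interactionsOfUser(all: UserInteraction[], targetUserId: string): UserInteraction[] {
  return all.filter(i => i.userId === targetUserId);
}

// Merge the segments the target user played; the result is the target time period(s)
// that could be highlighted on the time axis.
function mergePlayedSegments(segments: PlayedSegment[]): PlayedSegment[] {
  const sorted = [...segments].sort((a, b) => a.startSec - b.startSec);
  const merged: PlayedSegment[] = [];
  for (const s of sorted) {
    const last = merged[merged.length - 1];
    if (last !== undefined && s.startSec <= last.endSec) {
      last.endSec = Math.max(last.endSec, s.endSec); // overlapping or adjacent segments merge
    } else {
      merged.push({ ...s });
    }
  }
  return merged;
}

// The same segments also give the user's play ratio:
// total played length divided by the total duration of the target video.
function playRatio(segments: PlayedSegment[], videoDurationSec: number): number {
  const played = mergePlayedSegments(segments).reduce((acc, s) => acc + (s.endSec - s.startSec), 0);
  return videoDurationSec > 0 ? played / videoDurationSec : 0;
}
```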
As an optional implementation manner of the embodiment of the present disclosure, the third interface further includes: the user information;
and the identification of the target user is highlighted in the user information in a preset mode.
As an optional implementation manner of the embodiment of the present disclosure, the third interface further includes: an identification of the target user;
and the identification of the target user is displayed in association with the interactive information of the target user.
As an optional implementation manner of the embodiment of the present disclosure, the identifier of the interactive information includes at least one first identifier, where the first identifier is used to identify a comment input by a user on the target video;
the receiving unit 142 is further configured to receive a third operation, where the third operation is an operation on a target first identifier in the at least one first identifier;
the display unit 141 is further configured to display, in response to the third operation, content of a comment corresponding to the target first identifier.
As an optional implementation manner of the embodiment of the present disclosure, the second interface further includes at least one of the following information:
the number of users who have accessed the video playing interface of the target video;
the average playing ratio of users who have accessed the video playing interface of the target video;
the average access duration of users who have accessed the video playing interface of the target video;
the average playing time of users who have accessed the video playing interface of the target video;
the total number of comments input by the user on the target video;
the total number of expressions input by the user to the target video;
and the number of people who have liked the target video.
The terminal device provided in this embodiment may execute the video interaction information display method provided in the foregoing method embodiment, and the implementation principle and the technical effect are similar, which are not described herein again.
Based on the same inventive concept, the embodiment of the disclosure also provides an electronic device. Fig. 15 is a schematic structural diagram of an electronic device provided in the embodiment of the present disclosure. As shown in fig. 15, the electronic device provided in this embodiment includes: a memory 151 and a processor 152, the memory 151 being configured to store a computer program, and the processor 152 being configured to execute, when the computer program is called, the steps in the video interaction information display method provided by the above-mentioned method embodiment.
In particular, the memory 151 may be used to store software programs as well as various data. The memory 151 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the electronic device, and the like. Further, the memory 151 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 152 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 151 and calling data stored in the memory 151, thereby monitoring the electronic device as a whole. Processor 152 may include one or more processing units.
Furthermore, it should be understood that the electronic device provided by the embodiment of the present disclosure may further include: the device comprises a radio frequency unit, a network module, an audio output unit, a receiving unit, a sensor, a display unit, a user receiving unit, an interface unit, a power supply and the like. It will be appreciated by those skilled in the art that the above-described configuration of the electronic device does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components, or some components may be combined, or a different arrangement of components. In the embodiments of the present disclosure, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The radio frequency unit may be configured to receive and transmit signals during information transmission and reception or during a call. Specifically, it receives downlink data from a base station and then sends the received downlink data to the processor 152 for processing; in addition, it transmits uplink data to the base station. Typically, the radio frequency unit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access for the user through the network module, such as helping the user send and receive e-mails, browse webpages, access streaming media and the like.
The audio output unit may convert audio data received by the radio frequency unit or the network module or stored in the memory 151 into an audio signal and output as sound. Also, the audio output unit may also provide audio output related to a specific function performed by the electronic device (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit comprises a loudspeaker, a buzzer, a receiver and the like.
The receiving unit is used for receiving audio or video signals. The receiving unit may include a graphics processing unit (GPU) and a microphone. The GPU processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit, and the image frames processed by the GPU may be stored in the memory 151 (or other storage medium) or transmitted via the radio frequency unit or the network module. The microphone may receive sound and is capable of processing the sound into audio data. In the case of the phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit and output.
The electronic device also includes at least one sensor, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel according to the brightness of ambient light, and a proximity sensor that turns off the display panel and/or the backlight when the electronic device is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., and will not be described herein.
The display unit is used for displaying information input by a user or information provided to the user. The Display unit may include a Display panel, and the Display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user receiving unit may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user receiving unit includes a touch panel and other input devices. A touch panel, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel (e.g., operations by a user on or near the touch panel using a finger, a stylus, or any other suitable object or attachment). The touch panel may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 152, and receives and executes commands sent from the processor 152. In addition, the touch panel may be implemented in various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel, the user receiving unit may include other input devices. Specifically, the other input devices may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel may be overlaid on the display panel, and when the touch panel detects a touch operation thereon or nearby, the touch panel transmits the touch operation to the processor 152 to determine the type of the touch event, and then the processor 152 provides a corresponding visual output on the display panel according to the type of the touch event. Generally, the touch panel and the display panel are two independent components to implement the input and output functions of the electronic device, but in some embodiments, the touch panel and the display panel may be integrated to implement the input and output functions of the electronic device, and the implementation is not limited herein.
The interface unit is an interface for connecting an external device and the electronic equipment. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements in the electronic equipment or may be used to transmit data between the electronic equipment and the external device.
The electronic device may also include a power source (e.g., a battery) for powering the various components, and optionally, the power source may be logically coupled to the processor 152 via a power management system, such that functions such as managing charging, discharging, and power consumption may be performed via the power management system.
The embodiment of the disclosure also provides a computer readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the video interaction information display method provided by the above method embodiment is implemented.
As will be appreciated by one skilled in the art, embodiments of the present disclosure may be provided as a method, system, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present disclosure may take the form of a computer program product embodied on one or more computer-usable storage media having computer-usable program code embodied in the medium.
Computer readable media include both permanent and non-permanent, removable and non-removable storage media. Storage media may implement information storage by any method or technology, and the information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, computer readable media does not include transitory computer readable media (transitory media) such as modulated data signals and carrier waves.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The foregoing are merely exemplary embodiments of the present disclosure, which enable those skilled in the art to understand or practice the present disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (15)

1. A method for displaying video interaction information is characterized by comprising the following steps:
displaying a first interface, wherein the first interface comprises a target control, and the target control is used for triggering and displaying the interactive information of a target video;
receiving a first operation, wherein the first operation is an operation on the target control;
in response to the first operation, displaying a second interface, the second interface comprising: the time axis of the target video and the identification of the interactive information input by the user to the target video; wherein the identification of the interactive information is located at the corresponding time of the interactive information on the time axis, the corresponding time of the interactive information on the time axis is the time corresponding to the playing progress of the interactive information on the time axis, and the playing progress of the interactive information is used for representing the playing progress of the target video when the user inputs the interactive information.
2. The method of claim 1, wherein the interaction information comprises: at least one of comments, likes, and expressions.
3. The method of claim 1, wherein the second interface further comprises: user information;
the user information includes an identification of a user who accessed a video playback page of the target video.
4. The method of claim 3, wherein the user information further comprises: statistical information of users who have accessed the video playing interface of the target video; and the statistical information of the user is displayed in association with the identification of the user.
5. The method of claim 4, wherein the statistical information of the user comprises at least one of the following information:
the play ratio of the user is used for representing the ratio of the time length of the video clip played by the user in the target video to the total time length of the target video;
the access duration of the user is used for representing the time length of the user for accessing the video playing page of the target video;
the playing duration of the user is used for representing the time length of the video clip played by the user in the target video;
the number of comments the user inputs to the target video;
the number of expressions input by the user to the target video;
and the number of times the user has liked the target video.
6. The method of claim 3, further comprising:
receiving a second operation, wherein the second operation is a trigger operation on the identification of the target user in the user information;
and responding to the second operation, and displaying a third interface, wherein the third interface comprises the interaction information of the target user.
7. The method according to claim 6, wherein the third interface comprises in particular:
the time axis of the target video and the identification of the target user for the interaction information input by the target video;
the identification of each interactive information is located at the corresponding time of each interactive information on a time axis, the corresponding time of each interactive information on the time axis is the time corresponding to the playing progress of each interactive information on the time axis, and the playing progress of each interactive information is used for representing the playing progress of the target video when the target user inputs each interactive information.
8. The method of claim 6, further comprising:
highlighting the target time period on the time axis in a preset mode in the third interface;
wherein the target time period is used for characterizing the video segment played by the target user in the target video.
9. The method of claim 6, wherein the third interface further comprises: the user information;
and the identification of the target user is highlighted in the user information in a preset mode.
10. The method of claim 6, wherein the third interface further comprises: an identification of the target user;
and the identification of the target user is displayed in association with the interactive information of the target user.
11. The method of claim 1, wherein the identification of the interactive information comprises at least one first identification, and the first identification is used for identifying a comment input by the user on the target video; the method further comprises the following steps:
receiving a third operation, wherein the third operation is an operation on a target first identifier in the at least one first identifier;
and responding to the third operation, and displaying the content of the comment corresponding to the target first identification.
12. The method of any of claims 1-11, wherein the second interface further comprises at least one of the following information:
the number of users who have accessed the video playing interface of the target video;
the average playing ratio of users who have accessed the video playing interface of the target video;
the average access duration of users who have accessed the video playing interface of the target video;
the average playing time of users who have accessed the video playing interface of the target video;
the total number of comments input by the user on the target video;
the total number of expressions input by the user to the target video;
and the number of people who have liked the target video.
13. A terminal device, comprising:
the display unit is used for displaying a first interface, the first interface comprises a target control, and the target control is used for triggering and displaying the interactive information of a target video;
a receiving unit, configured to receive a first operation, where the first operation is an operation on the target control;
the display unit is further configured to display a second interface in response to the first operation, where the second interface includes: the time axis of the target video and the identification of the interactive information input by the user to the target video; wherein the identification of the interactive information is located at the corresponding moment of the interactive information on the time axis, the corresponding moment of the interactive information on the time axis is the moment corresponding to the playing progress of the interactive information on the time axis, and the playing progress of the interactive information is used for representing the playing progress of the target video when the user inputs the interactive information.
14. An electronic device, comprising: a memory for storing a computer program and a processor; the processor is used for executing the video interaction information display method of any one of claims 1-12 when calling the computer program.
15. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the video interactive information presentation method according to any one of claims 1 to 12.
CN202011289985.6A 2020-11-17 2020-11-17 Video interaction information display method and terminal equipment Pending CN112423087A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011289985.6A CN112423087A (en) 2020-11-17 2020-11-17 Video interaction information display method and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011289985.6A CN112423087A (en) 2020-11-17 2020-11-17 Video interaction information display method and terminal equipment

Publications (1)

Publication Number Publication Date
CN112423087A true CN112423087A (en) 2021-02-26

Family

ID=74831589

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011289985.6A Pending CN112423087A (en) 2020-11-17 2020-11-17 Video interaction information display method and terminal equipment

Country Status (1)

Country Link
CN (1) CN112423087A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090328122A1 (en) * 2008-06-25 2009-12-31 At&T Corp. Method and apparatus for presenting media programs
CN107925788A (en) * 2015-07-10 2018-04-17 株式会社普兰特 Intuitively video content method for regenerating and its user interface device based on data structured
CN105792006A (en) * 2016-03-04 2016-07-20 广州酷狗计算机科技有限公司 Interactive information display method and device
CN107995515A (en) * 2017-11-30 2018-05-04 华为技术有限公司 The method and device of information alert

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022262647A1 (en) * 2021-03-10 2022-12-22 北京字跳网络技术有限公司 Data display method, apparatus and device, and medium
CN113179445A (en) * 2021-04-15 2021-07-27 腾讯科技(深圳)有限公司 Video sharing method based on interactive article and interactive article
CN113301441A (en) * 2021-05-21 2021-08-24 北京字跳网络技术有限公司 Application program interaction method and device and electronic equipment
CN113301441B (en) * 2021-05-21 2023-02-03 北京字跳网络技术有限公司 Application program interaction method and device and electronic equipment
CN113411680A (en) * 2021-06-18 2021-09-17 腾讯科技(深圳)有限公司 Multimedia resource playing method, device, terminal and storage medium
CN115529487A (en) * 2021-06-24 2022-12-27 华为技术有限公司 Video sharing method, electronic device and storage medium
CN113784195A (en) * 2021-08-20 2021-12-10 北京字跳网络技术有限公司 Video page display method and device, electronic equipment and storage medium
CN114125566A (en) * 2021-12-29 2022-03-01 阿里巴巴(中国)有限公司 Interaction method and system and electronic equipment
CN114125566B (en) * 2021-12-29 2024-03-08 阿里巴巴(中国)有限公司 Interaction method, interaction system and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210226