CN116594530A - Method, apparatus, device and storage medium for interaction - Google Patents
- Publication number
- CN116594530A (application CN202310552829.1A)
- Authority
- CN
- China
- Prior art keywords
- target
- control
- media content
- sharing
- interaction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F9/451—Execution arrangements for user interfaces
Abstract
According to embodiments of the present disclosure, methods, apparatuses, devices, and storage media for interaction are provided. In one method, a sharing request associated with a target interaction control is received, the target interaction control being configured to be associated with a predetermined play time of target media content; and sharing information corresponding to the target interaction control is generated. The sharing information at least indicates interaction statistics of the target interaction control and further includes a guidance control, the guidance control being configured to guide to a playback interface of the target media content so that the target media content is played from the predetermined play time. In this way, more modes of user interaction can be added during playback of media content, which facilitates the sharing and dissemination of the media content and improves user retention and stickiness.
Description
Technical Field
Example embodiments of the present disclosure relate generally to the field of computers and, more particularly, relate to methods, apparatuses, devices, and computer-readable storage media for interaction.
Background
With the rapid development of media technology and intelligent devices, more and more applications provide various types of media content to users, bringing convenience to them. While obtaining media content, a user may perform various types of interactive operations, such as commenting, forwarding, and favoriting. Such interactive operations can help draw the attention of more users to the media content.
Disclosure of Invention
In a first aspect of the present disclosure, an interaction method is provided. The method comprises: receiving a sharing request associated with a target interaction control, the target interaction control being configured to be associated with a predetermined play time of target media content; and generating sharing information corresponding to the target interaction control, wherein the sharing information at least indicates interaction statistics of the target interaction control and further includes a guidance control, the guidance control being configured to guide to a playback interface of the target media content so that the target media content is played from the predetermined play time.
In a second aspect of the present disclosure, an apparatus for interaction is provided. The apparatus comprises: a receiving module configured to receive a sharing request associated with a target interaction control, the target interaction control being configured to be associated with a predetermined play time of target media content; and a generation module configured to generate sharing information corresponding to the target interaction control, wherein the sharing information at least indicates interaction statistics of the target interaction control and further includes a guidance control, the guidance control being configured to guide to a playback interface of the target media content so that the target media content is played from the predetermined play time.
In a third aspect of the present disclosure, an electronic device is provided. The device comprises at least one processing unit; and at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit. The instructions, when executed by the at least one processing unit, cause the device to perform the method of the first aspect.
In a fourth aspect of the present disclosure, a computer-readable storage medium is provided. The computer readable storage medium has stored thereon a computer program executable by a processor to implement the method of the first aspect.
It should be understood that the content described in this section is not intended to identify key or essential features of the embodiments of the disclosure, nor is it intended to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which like or similar reference numerals denote like or similar elements:
FIG. 1 illustrates a schematic diagram of an example environment in which embodiments of the present disclosure may be implemented;
FIGS. 2A-2G illustrate schematic diagrams of various examples of media playback interfaces according to some embodiments of the present disclosure;
FIGS. 3A and 3B illustrate flowcharts of examples of different user experience processes according to some embodiments of the present disclosure;
FIG. 4 illustrates a flow chart of an interaction method according to some embodiments of the present disclosure;
FIG. 5 illustrates a block diagram of an apparatus for interaction according to some embodiments of the present disclosure; and
FIG. 6 illustrates a block diagram of an electronic device capable of implementing various embodiments of the present disclosure.
Detailed Description
It will be appreciated that, prior to using the technical solutions disclosed in the embodiments of the present disclosure, the user should be informed, in an appropriate manner and in accordance with relevant laws and regulations, of the type, scope of use, and usage scenarios of the personal information involved in the present disclosure, and the user's authorization should be obtained.
For example, in response to receiving an active request from a user, prompt information is sent to the user to explicitly inform the user that the operation they are requesting will require the acquisition and use of their personal information. In this way, the user can autonomously choose, based on the prompt information, whether to provide personal information to the software or hardware, such as an electronic device, application, server, or storage medium, that performs the operations of the technical solution of the present disclosure.
As an optional but non-limiting implementation, in response to receiving an active request from a user, the prompt information may be sent to the user by way of, for example, a pop-up window in which the prompt information may be presented as text. In addition, the pop-up window may also carry a selection control for the user to choose "agree" or "disagree" to provide personal information to the electronic device.
It will be appreciated that the above-described notification and user authorization process is merely illustrative and not limiting of the implementations of the present disclosure, and that other ways of satisfying relevant legal regulations may be applied to the implementations of the present disclosure.
It will be appreciated that the data (including but not limited to the data itself, the acquisition or use of the data) involved in the present technical solution should comply with the corresponding legal regulations and the requirements of the relevant regulations.
The term "responsive to" as used herein means a state in which a corresponding event occurs or a condition is satisfied. It will be appreciated that the execution timing of a subsequent action that is executed in response to the event or condition is not necessarily strongly correlated with the time at which the event occurs or the condition is established. For example, in some cases, the follow-up actions may be performed immediately upon occurrence of an event or establishment of a condition; in other cases, the subsequent action may be performed after a period of time has elapsed after the event occurred or the condition was established.
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure have been illustrated in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather, these embodiments are provided so that this disclosure will be more thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be noted that any section/subsection headings provided herein are not limiting. Various embodiments are described throughout this document, and any type of embodiment may be included under any section/subsection. Furthermore, the embodiments described in any section/subsection may be combined in any manner with any other embodiment described in the same section/subsection and/or in a different section/subsection.
In describing embodiments of the present disclosure, the term "comprising" and its variants should be taken as open-ended, i.e., "including, but not limited to". The term "based on" should be understood as "based at least in part on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The term "some embodiments" should be understood as "at least some embodiments". The terms "first", "second", and the like may refer to different or the same objects. Other explicit and implicit definitions may also be included below.
As used herein, a "unit", "operating unit", or "subunit" may be implemented by any suitable structure of a machine learning model or network. As used herein, "a set of" elements or similar expressions may include one or more such elements. For example, "a set of convolution units" may include one or more convolution units.
As mentioned briefly above, more and more applications provide various types of media content to users. However, the interactive gameplay offered by some applications during listening or viewing is limited. For example, in some video applications, a user can only post an opinion or question by sending bullet comments. As another example, in some audio applications, users can only post comments. With so few ways to interact, users develop a passive, screen-off usage habit and can only passively consume the content.
Still other applications provide community scenarios, but users can only browse for content of interest in the home-page content feed, and user interaction behavior and audiovisual behavior in the community scenario cannot be well converted into each other.
To this end, embodiments of the present disclosure propose a solution for interaction. According to various embodiments of the present disclosure, a sharing request associated with a target interaction control is received. The target interaction control is configured to be associated with a predetermined play time of the target media content. Sharing information corresponding to the target interaction control is then generated. The sharing information may indicate interaction statistics of the target interaction control and also includes a guidance control. The guidance control is configured to guide to a playback interface of the target media content so that the target media content is played from the predetermined play time. In this way, more modes of user interaction can be added during playback of the media content, which facilitates the sharing and dissemination of the media content and improves user retention and stickiness.
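The overall flow described above — receiving a sharing request for a target interaction control and generating sharing information that carries both the interaction statistics and a guidance control — can be sketched as follows. This is a minimal illustration, not the patent's actual implementation: the names `InteractionControl`, `SharingInfo`, and the `app://` deep-link scheme are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class InteractionControl:
    control_id: str
    media_id: str
    play_time_s: int   # predetermined play time within the target media content
    vote_counts: dict  # interaction option -> number of votes received

@dataclass
class SharingInfo:
    statistics: dict   # interaction statistics shown in the shared post
    guidance_link: str # "guidance control": deep link back into the player

def handle_share_request(control: InteractionControl) -> SharingInfo:
    """Generate sharing information corresponding to a target interaction control."""
    total = sum(control.vote_counts.values()) or 1
    # Summarize historical voting results as percentages.
    stats = {opt: round(100 * n / total) for opt, n in control.vote_counts.items()}
    # The guidance control points to the playback interface and seeks to the
    # predetermined play time.
    link = f"app://play/{control.media_id}?t={control.play_time_s}"
    return SharingInfo(statistics=stats, guidance_link=link)
```

Under these assumptions, a control triggered at the 24th second of media `book42` would yield the link `app://play/book42?t=24` together with rounded vote percentages.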
Example embodiments of the present disclosure are described below with reference to the accompanying drawings.
FIG. 1 illustrates a schematic diagram of an example environment 100 in which embodiments of the present disclosure may be implemented. In the environment 100, an application 120 is installed in a terminal device 110. A user 140 may interact with the application 120 via the terminal device 110 and/or accessory devices of the terminal device 110. The application 120 may be a playback tool application, a social application, or any other suitable application. The application 120 can provide various types of services related to media content to the user 140, including but not limited to listening to/viewing content and user-to-user/user-to-content interactions (e.g., commenting, forwarding, participating in votes, sharing voting results, etc.). As used herein, "media content" includes one or more types of content, such as video, images, motion pictures, image sets, audio, and text.
In the environment 100 of FIG. 1, the terminal device 110 may present an interface 150 of the application 120 if the application 120 is in an active state. The interface 150 may be any of the various types of interfaces that the application 120 can provide, such as a content playback interface, a content authoring interface, a content publishing interface, an interactive-control creation page, a community plaza interface, or a personal homepage. The application 120 may provide content audiovisual functionality, allowing users to listen to/view various types of media content published in the application 120. The application 120 may also provide interactive-control creation functionality, allowing users to interact with other users.
It should be appreciated that although the application 120 is shown in fig. 1 as being included in the terminal device 110, some of the processing capabilities of the application 120 may also be based at least in part on the server 130, for example. For example, a front-end portion of the application 120 (e.g., a portion for presentation) may be included in the terminal device 110, but a back-end portion of the application 120 (e.g., a portion for processing of interactive content or interactive controls) may be included in the server 130.
In some embodiments, the terminal device 110 communicates with the server 130 to enable provision of services for the application 120. The terminal device 110 may be any type of mobile terminal, fixed terminal, or portable terminal, including a mobile handset, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, media computer, multimedia tablet, Personal Communication System (PCS) device, personal navigation device, Personal Digital Assistant (PDA), audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, electronic book device, or game device, including the accessories and peripherals of these devices, or any combination thereof. In some embodiments, the terminal device 110 is also capable of supporting any type of user interface (such as "wearable" circuitry). The server 130 may be any of various types of computing systems/servers capable of providing computing power, including but not limited to mainframes, edge computing nodes, and computing devices in a cloud environment.
It should be understood that the structure and function of the various elements in environment 100 are described for illustrative purposes only and are not meant to suggest any limitation as to the scope of the disclosure.
Some example embodiments of the present disclosure will be described below with continued reference to the accompanying drawings.
FIGS. 2A-2G illustrate schematic diagrams of examples of media playback interfaces according to some embodiments of the present disclosure. The interfaces of FIGS. 2A-2G and the interaction methods associated with these interfaces may be implemented by an appropriate electronic device or combination of electronic devices (e.g., the server 130, the terminal device 110, or a combination of the server 130 and the terminal device 110 in FIG. 1). For convenience of description, the examples of FIGS. 2A to 2G and the interaction method are described below with reference to FIG. 1, taking such an electronic device as an example.
The user 140 may open the application 120 and tap to play target media content. As described above, the application 120 may be a playback tool application. In some embodiments, the target media content includes audio content, video content, and/or combinations thereof. The audio content is, for example, an audiobook or music. The video content is, for example, a film or drama series, or a live stream. For convenience of description, the application 120 is described below as an audiobook application, and the target media content as an audiobook. It should be understood that the target media content may also take other suitable forms.
In the example of FIG. 2A, if the user 140 enters the audiobook player or taps to listen to target media content 220 (e.g., chapter XX "XXXX" of the audiobook "XXXXX"), the electronic device may present a playback interface 210. The playback interface 210 may include interface elements such as the audiobook's title, chapter, cover, playback progress bar, and other functional elements.
In the example of FIG. 2B, while the user 140 is listening, a target interaction control 230 is presented at a predetermined play time 225 (e.g., the 24th second) associated with the audiobook. In some embodiments, the electronic device may play a reminder sound at the predetermined play time 225. In this way, the user's attention may be drawn so that the user can view the target interaction control 230 in a timely manner even in the screen-off state. Additionally or alternatively, the target interaction control 230 may be presented for a predetermined period starting at the predetermined play time 225, for example from the 24th second to 1 minute 24 seconds. In this way, the user has sufficient time to notice, consider, and interact with the control.
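The presentation window just described — the control appears at the predetermined play time and remains visible for a fixed period — can be expressed as a simple visibility check. This is an illustrative sketch only; the 60-second default window is an assumption taken from the "24th second to 1 minute 24 seconds" example.

```python
def control_visible(position_s: float, trigger_s: float, window_s: float = 60.0) -> bool:
    """Return True while the playback position lies within the interaction
    control's presentation window [trigger_s, trigger_s + window_s)."""
    return trigger_s <= position_s < trigger_s + window_s
```

The player would evaluate this check against the current playback position to decide when to show and auto-close the control.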
The target interaction control 230 may be a question-and-answer interaction control, a questionnaire control, or the like. In some embodiments, the target interaction control 230 includes a set of interaction options. For example, in a question-and-answer interaction control, the interaction options include answer options associated with the question. As another example, in a questionnaire control, the interaction options may include an input field associated with the question, allowing the user to freely enter answer information.
In some embodiments, the target interaction control 230 includes a voting control. The voting control includes a set of interaction options, such as answer options. In the example of FIG. 2B, the target interaction control 230 is a voting control that includes a question (e.g., "Where do you think XXX will be hidden?") and a set of answer options.
In some embodiments, the target interaction control 230 includes a preconfigured interaction control and/or an interaction control created by a user. For example, a system administrator preconfigures an interaction control based on the highlights of the current chapter of the audiobook. In this way, the interactive content can be associated with the characters or plot developments of the audiobook, thereby increasing interactivity between the user and the content. As another example, the user 140 creates an interaction control in the current chapter of the audiobook. In this way, interactive content (e.g., questions or polls) can be sent to other users, thereby increasing the personalization of the interactive content and improving interactivity between users.
The user 140 may select one or more interaction options 240 in the voting control. In some embodiments, the electronic device may then present interaction statistics 250 to the user 140. The interaction statistics 250 indicate the voting results of the voting control and may be a summary of historical voting results. The configuration of an interaction control may include the configuration of its presentation content and presentation time. For example, a voting control may be presented at a play time before the answer is revealed. In this way, the user's curiosity can be stimulated, enhancing user stickiness.
In the example of FIG. 2C, the interaction statistics 250 indicate the voting results of multiple users (e.g., 102 persons). For example, 20% of users selected "AAAA", 12% selected "BBBB", 9% selected "CCCC", and 47% selected "DDDD".
Adding voting gameplay to the playback of media content can attract users to listen and interact at the same time. To attract more users, the user 140 may also share the voting results, as described further below.
In some embodiments, the electronic device can present a first sharing portal for sharing the target interaction control 230 in response to selection of at least one interaction option 240 of the set of interaction options. The electronic device may obtain a sharing request associated with the target interaction control based on a triggering operation on the first sharing portal. Additionally or alternatively, the electronic device can present an interface element for closing the target interaction control 230. In this way, the user can conveniently either participate in the interaction or continue listening to/viewing the media content.
In the example of FIG. 2B, when playback reaches the predetermined play time 225, the electronic device may present the voting control together with an interface element 234 (e.g., a close button). If the user 140 does not want to participate in the interaction, the voting control can be closed by clicking the close button. Alternatively, the user 140 may take no action, in which case the voting control automatically closes after a predetermined presentation period. In the example of FIG. 2C, upon the user 140 selecting one or more answer options in the voting control, the electronic device can present the voting results associated with the voting control, a first sharing portal 232 (e.g., a "one-touch sharing" button), and the interface element 234 (e.g., the close button). If the user 140 does not want to share the voting results, the voting control can likewise be closed by clicking the close button, or left to close automatically after the predetermined presentation period.
The user 140 may send a sharing request associated with the target interaction control 230 through the first sharing portal 232 (e.g., by clicking the "one-touch sharing" button). The electronic device may receive the sharing request and generate sharing information based on it. The sharing information may indicate the interaction statistics 250 of the target interaction control 230. For example, the electronic device may receive the request to share the voting results submitted by the user 140 and generate a post (also referred to as a voting post) containing the voting results. The sharing information may also include a guidance control for guiding to the playback interface 210 of the target media content 220 so that the target media content 220 is played from the predetermined play time 225. Additionally or alternatively, the guidance control may be configured to guide to the playback interface 210 so that the target media content 220 is played from an initial time.
In some embodiments, the electronic device can receive a sharing request associated with the target interaction control 230 and generate information to be shared based on the sharing request. The electronic device may then generate the sharing information based on a triggering operation by the user 140. The information to be shared may include an interface element that prompts generation of a guidance control.
In the example of FIG. 2D, the electronic device generates information to be shared 260 (e.g., a preview of the voting post). The information to be shared 260 includes an interface element 265 (e.g., a checkbox) that prompts generation of a guidance control. The checkbox may carry prompt information about the function of the guidance control, such as "Check the 'pass-through' function, and other users can click the entry to jump directly to chapter XX." After confirming that the preview content of the voting post is correct, the user 140 may send the sharing request, and the electronic device may generate the sharing information based on it.
In the example of FIG. 2E, the electronic device may generate and present sharing information 270 (e.g., a voting post) in a sharing interface 215. The voting post includes the poster (e.g., user YYYYY), the posting time (e.g., X month X day, 2023), the voting results, and so forth. The voting post also includes a guidance control 280 (e.g., a "pass-through" button). If a user clicks the "pass-through" button, the relevant chapter of the audiobook that triggered the voting control can be located. Additionally or alternatively, prompt information 285 pops up to indicate that the user has been navigated to the playback interface 210 and that the audiobook is playing from the predetermined play time 225. In the example of FIG. 2F, the prompt information 285 is, for example, "You have been automatically located to the content of chapter XX."
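On the receiving side, activating the guidance control must locate both the playback interface and the predetermined play time. A hedged sketch of parsing such a deep link follows; the `app://play` scheme and the `t` query parameter are illustrative assumptions matching no particular real link format.

```python
from urllib.parse import urlparse, parse_qs

def parse_guidance_link(link: str) -> tuple[str, int]:
    """Extract (media_id, seek_seconds) from a guidance-control deep link,
    e.g. 'app://play/book42?t=24' -> ('book42', 24)."""
    parsed = urlparse(link)
    # The last path segment identifies the target media content.
    media_id = parsed.path.lstrip("/").split("/")[-1]
    # 't' carries the predetermined play time; default to the initial time.
    seek_s = int(parse_qs(parsed.query).get("t", ["0"])[0])
    return media_id, seek_s
```

The player can then open the playback interface for `media_id` and seek to `seek_s`, matching the "pass-through" behavior described above.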
In some embodiments, the sharing information 270 also presents content description information associated with the target media content 220. In this way, relevant information can be seen at a glance, allowing the user to better understand the context of the interaction statistics 250.
In some embodiments, the content description information is determined based on the portion of the target media content 220 associated with the predetermined play time 225. The content description information may include text and/or pictures. For example, the opening paragraph, a highlight paragraph, or a plot summary of the current chapter may be presented as text. In this way, curated or summary information of the target media content 220 may be presented, enabling the audiobook or the target media content as a whole to be recommended.
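One way to derive text description from "the portion associated with the predetermined play time" is to pick the timestamped paragraph that most recently started before that time. The timestamped-paragraph structure below is an assumption made purely for illustration, not the patent's implementation.

```python
def description_near(paragraphs: list[tuple[float, str]], play_time_s: float,
                     max_len: int = 60) -> str:
    """Return the paragraph whose start time is latest but not after
    play_time_s, truncated to max_len characters.
    `paragraphs` is assumed sorted by ascending start time."""
    best = ""
    for start_s, text in paragraphs:
        if start_s <= play_time_s:
            best = text  # keep the latest qualifying paragraph
    return best[:max_len]
```

For a control triggered at the 24th second, this would surface the paragraph being narrated around that moment as the post's content description.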
In some embodiments, the sharing information 270 also presents tags associated with the target media content 220. Such tags may be classification attributes of the target media content 220, recommendations of similar content, related topics, and so on. In this way, more content likely to be of interest can be recommended to the user, improving user retention and stickiness.
In the example of fig. 2E, the voting post includes content description information 290 (e.g., summary information for a highlight paragraph) and a plurality of tags (e.g., "featured voting topic," "mystery fans are all listening," "suspense ceiling").
The interaction path from the playback interface 210 into the sharing interface 215 is described above by way of example in figs. 2A-2G. That is, in some embodiments, the electronic device can present the playback interface 210 of the target media content 220 (e.g., an audio book). When the target media content 220 is played to the predetermined play time 225, the electronic device can present a target interaction control 230 (e.g., a voting control) in the play interface 210. The electronic device may then obtain a sharing request associated with the target interaction control 230 based on a preset operation on the target interaction control 230.
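The step of surfacing the target interaction control when playback reaches the predetermined play time can be sketched as a check run from the player's periodic time-update callback. The data layout (`controls` as a mapping of control id to trigger time) and the tolerance window are assumptions:

```python
def controls_to_show(controls, position_s, tolerance_s=0.5):
    """Return ids of interaction controls whose predetermined play time has
    just been reached by the current playback position.

    `controls` maps control id -> predetermined play time (seconds); a real
    player would call this from its time-update callback and present each
    returned control in the play interface."""
    return sorted(
        cid for cid, t in controls.items()
        if t <= position_s < t + tolerance_s
    )

controls = {"vote-a": 120.0, "vote-b": 480.0}
```

For example, a position of 120.2 s falls inside the window of `vote-a` only, so just that voting control would be presented.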
In other embodiments, the electronic device can present a set of interaction controls 240 associated with the target media content 220 in the playback interface 210 of the target media content 220. Further, the electronic device can present a second sharing portal for sharing the target interaction control in response to selection of the target interaction control from the set of interaction controls 240, and obtain a sharing request associated with the target interaction control 230 based on a triggering operation on the second sharing portal. In this way, the sharing interface 215 can be entered from the summary interface of the interaction controls, providing the user with more ways to interact.
Additionally or alternatively, the play interface 210 includes an interface element (e.g., a "vote entry" button) for entering the summary interface. In the example of fig. 2G, via this button in the play interface 210, the user 140 may enter a summary interface 235 associated with the audio book. The summary interface 235 includes all of the voting controls associated with the audio book. For votes in which the user 140 has participated, the electronic device may present the associated interaction statistics. Additionally or alternatively, the electronic device may also present the number of people who participated in the vote. In this way, the popularity of the media content can be reflected, increasing users' enthusiasm for participating. For votes in which the user 140 has not participated, if the user 140 chooses to participate, for example by selecting one or more interaction options, the electronic device may present the corresponding voting results and a second sharing portal (e.g., a "one-touch sharing" button) based on that selection. After the user confirms, the electronic device can share the generated voting post to the community square.
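The summary interface's distinction between participated and non-participated votes can be sketched as follows; the `votes` record layout is a hypothetical stand-in for whatever the service actually stores. Interaction statistics are exposed only for votes the user has joined, while the participation count is always shown:

```python
def summary_entries(votes, user_id):
    """Build summary-interface rows for all voting controls of a book.

    `votes` is a list of dicts with keys: "id", "participants" (a set of
    user ids), and "results" (option -> count). Results are included only
    for votes the user has participated in."""
    rows = []
    for v in votes:
        joined = user_id in v["participants"]
        row = {
            "id": v["id"],
            "joined": joined,
            "participant_count": len(v["participants"]),
        }
        if joined:
            row["results"] = v["results"]
        rows.append(row)
    return rows

votes = [
    {"id": "vote-1", "participants": {"u1", "u2", "u3"}, "results": {"A": 2, "B": 1}},
    {"id": "vote-2", "participants": {"u2"}, "results": {"A": 1}},
]
rows = summary_entries(votes, "u1")
```

Here user u1 sees the results of vote-1 (already joined) but only the participation count of vote-2.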
In summary, according to various embodiments of the present disclosure, interaction between users and content, and among users, can be increased during the playing of media content. Based on the shared interaction result, other users can like, comment, and so on, and can also quickly locate the core content so as to listen to/watch it or participate in the interaction themselves. In this way, not only can listening behavior be converted into social interaction, but the media content can also be re-promoted and its consumption duration increased.
Figs. 3A and 3B illustrate flowcharts of example experience processes for different users according to some embodiments of the present disclosure. Process 300A takes the experience of user A as an example; process 300B takes the experience of user B as an example. The processes 300A and 300B may be implemented by an appropriate electronic device or combination of electronic devices (e.g., the server 130, the terminal device 110, or a combination of the two in fig. 1). For ease of description, the processes 300A and 300B are described below with reference to fig. 1, using an electronic device as an example.
In the example of fig. 3A, user A is the user who publishes the voting post. At block 310, the electronic device may detect the play time of the audio book that user A is listening to. At block 315, when the predetermined play time is detected, the electronic device may trigger the vote and present a voting control. If user A does not want to participate in the vote, user A may choose to close the voting control, so that process 300A ends at block 320. If user A participates in the vote, the electronic device may present the voting results at block 325. Further, if user A wants to share the voting results, the electronic device may obtain user A's sharing request at block 330 and generate a voting post at block 335. Additionally or alternatively, user A may choose to share with other users or with the community square. If user A abandons sharing, the voting control may be closed, so that process 300A ends at block 320. If user A chooses to share to the community square, the electronic device may publish the voting post to the community square at block 340.
In the example of fig. 3B, user B is a user browsing a voting post or the summary interface of voting controls. In one manner of interaction, at block 345, the electronic device may detect that user B has entered the community square. At block 350, the electronic device may detect user B's tap on the "pass-through" button in a voting post (e.g., one published by user A). At block 355, the electronic device locates the play paragraph (e.g., the predetermined play time) of the corresponding audio book. In another manner of interaction, at block 360, the electronic device may detect that user B has entered the play interface of the audio book. At block 365, the electronic device may detect user B triggering the voting portal. At block 370, the electronic device may present a summary interface of all voting controls associated with the audio book, in which user B may view both the votes he or she has participated in and those not yet participated in. For a vote not yet participated in, if user B chooses to participate, the electronic device may present the voting results at block 375. At block 380, the electronic device may obtain user B's sharing request and generate a voting post at block 385. Upon confirmation by user B, the electronic device may publish the voting post to the community square at block 390.
Through the above example embodiments, a user may participate in a "voting" interaction while listening to an audio book. Through interaction, sharing, and similar actions, the audio book content is promoted a second time and its foreground consumption duration is increased, breaking the traditional, one-way pattern of passively consuming media content. Beyond recommending the audio book, other users can, via the pass-through function of the voting post in the community scene, directly locate the core paragraph of the audio book to listen or to vote. Such a loop of voting, sharing, locating, listening, re-voting, and re-sharing facilitates better viral spread and conversion of media content, and the interactions formed between users and between users and content greatly improve user retention and stickiness.
It should be understood that the specific interface layouts, specific text contents, specific numerical values, etc., shown in the above examples are exemplary only and are not intended to be limiting of the present disclosure.
Example procedure
Fig. 4 illustrates a flowchart of a method 400 for interaction according to some embodiments of the present disclosure. Method 400 may be implemented by an appropriate electronic device or combination of electronic devices (e.g., the server 130, the terminal device 110, or a combination of the two in fig. 1). For ease of description, these are hereinafter collectively referred to as the electronic device, and method 400 is described with reference to fig. 1.
At block 410, the electronic device receives a sharing request associated with a target interaction control, the target interaction control being configured to be associated with a predetermined play time of target media content. At block 420, the electronic device generates sharing information corresponding to the target interaction control, where the sharing information at least indicates interaction statistics of the target interaction control and further includes a guidance control configured to guide to a play interface of the target media content so as to play the target media content from the predetermined play time.
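Blocks 410 and 420 can be sketched as a single function: given a sharing request and the stored statistics, emit sharing information embedding a guidance control that points back into the play interface. The request and store shapes below are illustrative assumptions:

```python
def generate_sharing_info(request, stats_store):
    """Sketch of block 420: build the sharing information for a target
    interaction control, embedding its interaction statistics and a
    guidance control pointing back into the play interface."""
    control_id = request["control_id"]
    return {
        "control_id": control_id,
        "statistics": stats_store[control_id],
        "guidance_control": {
            "target": "play_interface",
            "content_id": request["content_id"],
            "start_at_s": request["play_time_s"],
        },
    }

request = {"control_id": "vote-1", "content_id": "book-001", "play_time_s": 1325.0}
stats_store = {"vote-1": {"A": 12, "B": 30}}
info = generate_sharing_info(request, stats_store)
```

The embedded `guidance_control` carries exactly what a receiving client needs to open the play interface and play from the predetermined play time.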
In some embodiments, the sharing information also presents content description information associated with the target media content.
In some embodiments, the content description information is determined based on a portion of the content of the target media content associated with the predetermined play time.
In some embodiments, to receive the sharing request associated with the target interaction control, the electronic device presents a play interface of the target media content; presents the target interaction control in the play interface when the target media content is played to the predetermined play time; and obtains the sharing request associated with the target interaction control based on a preset operation on the target interaction control.
In some embodiments, the target interaction control comprises a set of interaction options, and to obtain the sharing request associated with the target interaction control based on a preset operation on the target interaction control, the electronic device presents, in response to a selection of at least one interaction option in the set of interaction options, a first sharing portal for sharing the target interaction control; and obtains the sharing request associated with the target interaction control based on a triggering operation on the first sharing portal.
In some embodiments, to receive the sharing request associated with the target interaction control, the electronic device presents a set of interaction controls associated with the target media content in a play interface of the target media content; presents, in response to a selection of the target interaction control from the set of interaction controls, a second sharing portal for sharing the target interaction control; and obtains the sharing request associated with the target interaction control based on a triggering operation on the second sharing portal.
In some embodiments, the electronic device plays the target media content from the predetermined play time in response to the selection of the target interaction control from the set of interaction controls.
In some embodiments, the target interactive controls include preconfigured interactive controls and/or interactive controls created by a user.
In some embodiments, the target media content includes audio content and/or video content.
In some embodiments, the target interaction control comprises a voting control, and the interaction statistics indicate voting results of the voting control.
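For the voting-control case, the interaction statistics are the voting results. A minimal tally, assuming one effective ballot per user (the latest choice wins) — an assumption, since the disclosure does not specify how repeat votes are handled — could look like this:

```python
from collections import Counter

def voting_results(ballots):
    """Tally a voting control's interaction statistics from a list of
    (user_id, option) ballots. One effective ballot per user: a user's
    latest choice replaces any earlier one."""
    latest = {}
    for user, option in ballots:
        latest[user] = option  # later ballots overwrite earlier ones
    counts = Counter(latest.values())
    total = sum(counts.values())
    return {
        opt: {"count": n, "pct": round(100 * n / total, 1)}
        for opt, n in counts.items()
    }

results = voting_results([("u1", "A"), ("u2", "B"), ("u1", "B")])
```

Here u1's second ballot replaces the first, so option B ends up with both effective votes.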
Example apparatus and apparatus
Fig. 5 illustrates a schematic block diagram of an apparatus 500 for interaction according to some embodiments of the present disclosure. The apparatus 500 may be implemented as, or included in, an appropriate electronic device or combination of electronic devices (e.g., the server 130, the terminal device 110, or a combination of the two in fig. 1). The various modules/components in the apparatus 500 may be implemented in hardware, software, firmware, or any combination thereof.
As shown, the apparatus 500 includes a receiving module 510 configured to receive a sharing request associated with a target interaction control, the target interaction control being configured to be associated with a predetermined play time of target media content. The apparatus 500 further includes a generating module 520 configured to generate sharing information corresponding to the target interaction control, where the sharing information at least indicates interaction statistics of the target interaction control and further includes a guidance control configured to guide to a play interface of the target media content so as to play the target media content from the predetermined play time.
In some embodiments, the sharing information also presents content description information associated with the target media content.
In some embodiments, the content description information is determined based on a portion of the content of the target media content associated with the predetermined play time.
In some embodiments, to receive the sharing request associated with the target interaction control, the apparatus 500 further includes: a first presentation module configured to present a play interface of the target media content; a second presentation module configured to present the target interaction control in the play interface when the target media content is played to the predetermined play time; and a first acquisition module configured to obtain the sharing request associated with the target interaction control based on a preset operation on the target interaction control.
In some embodiments, the target interaction control comprises a set of interaction options, and to obtain the sharing request associated with the target interaction control based on a preset operation on the target interaction control, the apparatus 500 further includes: a third presentation module configured to present, in response to a selection of at least one interaction option in the set of interaction options, a first sharing portal for sharing the target interaction control; and a second acquisition module configured to obtain the sharing request associated with the target interaction control based on a triggering operation on the first sharing portal.
In some embodiments, to receive a sharing request associated with the target interactive control, the apparatus 500 further includes a fourth rendering module configured to display the target media content at a playback interface P23031908901CN
Presenting a set of interactive controls associated with the target media content; a fifth rendering module configured to render a second sharing portal for sharing the target interactive control in response to a selection of the target interactive control of the set of interactive controls; and the third acquisition module is configured to acquire the sharing request with the target interaction control based on the triggering operation for the second sharing inlet.
In some embodiments, the apparatus 500 further includes a play module configured to play the target media content from the predetermined play time in response to a selection of the target interaction control from the set of interaction controls.
In some embodiments, the target interactive controls include preconfigured interactive controls and/or interactive controls created by a user.
In some embodiments, the target media content includes audio content and/or video content.
In some embodiments, the target interaction control comprises a voting control, and the interaction statistics indicate voting results of the voting control.
Fig. 6 illustrates a block diagram of an electronic device 600 in which one or more embodiments of the present disclosure may be implemented. It should be understood that the electronic device 600 illustrated in fig. 6 is merely exemplary and should not constitute any limitation on the functionality and scope of the embodiments described herein. The electronic device 600 shown in fig. 6 may be used to implement the server 130, the terminal device 110, or a combination of the two in fig. 1.
As shown in fig. 6, the electronic device 600 is in the form of a general-purpose electronic device. The components of electronic device 600 may include, but are not limited to, one or more processors or processing units 610, memory 620, storage 630, one or more communication units 640, one or more input devices 650, and one or more output devices 660. The processing unit 610 may be an actual or virtual processor and is capable of performing various processes according to programs stored in the memory 620. In a multiprocessor system, multiple processing units execute computer-executable instructions in parallel to increase the parallel processing capabilities of electronic device 600.
The electronic device 600 typically includes a number of computer storage media. Such media may be any available media accessible by the electronic device 600, including but not limited to volatile and non-volatile media, removable and non-removable media. The memory 620 may be volatile memory (e.g., registers, cache, random access memory (RAM)), non-volatile memory (e.g., read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory), or some combination thereof. The storage device 630 may be removable or non-removable media and may include machine-readable media such as flash drives, magnetic disks, or any other media that can store information and/or data (e.g., training data for training) and can be accessed within the electronic device 600.
The electronic device 600 may further include additional removable/non-removable, volatile/nonvolatile storage media. Although not shown in fig. 6, a magnetic disk drive for reading from or writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk may be provided. In these cases, each drive may be connected to a bus (not shown) by one or more data medium interfaces. Memory 620 may include a computer program product 625 having one or more program modules configured to perform the various methods or acts of the various embodiments of the disclosure.
The communication unit 640 enables communication with other electronic devices through a communication medium. Additionally, the functionality of the components of the electronic device 600 may be implemented in a single computing cluster or in multiple computing machines capable of communicating over a communication connection. Thus, the electronic device 600 may operate in a networked environment using logical connections to one or more other servers, a network Personal Computer (PC), or another network node.
The input device 650 may be one or more input devices, such as a mouse, keyboard, or trackball. The output device 660 may be one or more output devices, such as a display, speakers, or printer. As desired, the electronic device 600 may also communicate, via the communication unit 640, with one or more external devices (not shown) such as storage devices and display devices, with one or more devices that enable a user to interact with the electronic device 600, or with any device (e.g., a network card or modem) that enables the electronic device 600 to communicate with one or more other electronic devices. Such communication may be performed via an input/output (I/O) interface (not shown).
According to an exemplary implementation of the present disclosure, a computer-readable storage medium is provided, having stored thereon computer-executable instructions that are executed by a processor to implement the method described above. According to an exemplary implementation of the present disclosure, there is also provided a computer program product tangibly stored on a non-transitory computer-readable medium and including computer-executable instructions that are executed by a processor to implement the method described above.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus, devices, and computer program products implemented according to the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable medium having the instructions stored therein includes an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various implementations of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of implementations of the present disclosure has been provided for illustrative purposes, is not exhaustive, and is not limited to the implementations disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various implementations described. The terminology used herein was chosen in order to best explain the principles of each implementation, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand each implementation disclosed herein.
Claims (13)
1. An interaction method, comprising:
receiving a sharing request associated with a target interaction control, the target interaction control configured to be associated with a predetermined play time of a target media content; and
generating sharing information corresponding to the target interaction control, wherein the sharing information at least indicates interaction statistics of the target interaction control, and the sharing information further comprises a guidance control configured to: guide to a play interface of the target media content to play the target media content from the predetermined play time.
2. The method of claim 1, wherein the sharing information further presents content description information associated with the target media content.
3. The method of claim 2, wherein the content description information is determined based on a portion of content of the target media content associated with the predetermined play time.
4. The method of claim 1, wherein receiving a sharing request associated with a target interaction control comprises:
presenting a playing interface of the target media content;
when the target media content is played to the predetermined play time, presenting the target interaction control in the playing interface; and
acquiring the sharing request associated with the target interaction control based on a preset operation on the target interaction control.
5. The method of claim 4, wherein the target interaction control comprises a set of interaction options, and obtaining the sharing request with the target interaction control based on a preset operation for the target interaction control comprises:
responsive to a selection of at least one interaction option of the set of interaction options, presenting a first sharing portal for sharing the target interaction control; and
and acquiring the sharing request associated with the target interaction control based on a triggering operation on the first sharing portal.
6. The method of claim 1, wherein receiving a sharing request associated with a target interaction control comprises:
presenting a set of interactive controls associated with the target media content in a playback interface of the target media content;
responsive to a selection of the target interactive control from the set of interactive controls, presenting a second sharing portal for sharing the target interactive control; and
and acquiring the sharing request associated with the target interaction control based on a triggering operation on the second sharing portal.
7. The method of claim 6, further comprising:
in response to the selection of the target interaction control from the set of interaction controls, playing the target media content from the predetermined play time.
8. The method of claim 1, wherein the target interactive controls comprise preconfigured interactive controls and/or interactive controls created by a user.
9. The method of claim 1, wherein the target media content comprises audio content and/or video content.
10. The method of claim 1, wherein the target interaction control comprises a voting control and the interaction statistics indicate voting results of the voting control.
11. An apparatus for interaction, comprising:
the system comprises a receiving module configured to receive a sharing request associated with a target interaction control, the target interaction control configured to be associated with a predetermined playing time of target media content; and
the generation module is configured to generate sharing information corresponding to the target interaction control, wherein the sharing information at least indicates interaction statistics of the target interaction control, the sharing information further comprises a guiding control, and the guiding control is configured to: and guiding to a playing interface of the target media content to play the target media content from the preset playing moment.
12. An electronic device, comprising:
at least one processing unit; and
at least one memory coupled to the at least one processing unit and storing instructions for execution by the at least one processing unit, which when executed by the at least one processing unit, cause the electronic device to perform the method of any one of claims 1 to 10.
13. A computer readable storage medium having stored thereon a computer program executable by a processor to implement the method of any of claims 1 to 10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310552829.1A CN116594530A (en) | 2023-05-16 | 2023-05-16 | Method, apparatus, device and storage medium for interaction |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116594530A true CN116594530A (en) | 2023-08-15 |
Family
ID=87593166
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310552829.1A Pending CN116594530A (en) | 2023-05-16 | 2023-05-16 | Method, apparatus, device and storage medium for interaction |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116594530A (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||