CN117631924A - Multimedia resource interaction method, device, medium and electronic equipment

Multimedia resource interaction method, device, medium and electronic equipment

Info

Publication number
CN117631924A
Authority
CN
China
Prior art keywords
target
display
control
interaction
displaying
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311606861.XA
Other languages
Chinese (zh)
Inventor
蔡威
王牧川
叶嘉浩
杨宁宁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Youzhuju Network Technology Co Ltd
Original Assignee
Beijing Youzhuju Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Youzhuju Network Technology Co Ltd filed Critical Beijing Youzhuju Network Technology Co Ltd
Priority to CN202311606861.XA priority Critical patent/CN117631924A/en
Publication of CN117631924A publication Critical patent/CN117631924A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to a multimedia resource interaction method, device, medium and electronic equipment. The method displays a target multimedia resource in a presentation interface for multimedia resources, displays an interaction control in the same interface, and executes a preset interaction action when the interaction control is triggered. In this way, the whole interaction process of displaying the interaction control, triggering it, extracting target objects from a plurality of objects to be extracted, and displaying the extracted target objects is completed without switching interfaces and without interrupting the display of the target multimedia resource. The method not only preserves the display duration of the target multimedia resource, but also lets the user complete the entire interaction of extracting and displaying the target object within a single interface, which can greatly increase the probability that users participate in the interaction.

Description

Multimedia resource interaction method, device, medium and electronic equipment
Technical Field
The present disclosure relates to the field of computer technology, and in particular to a multimedia resource interaction method, device, medium and electronic equipment.
Background
With the rapid growth of the Internet, short-video social software has become increasingly popular. Short video refers to high-frequency, pushed video content played on various new media platforms, suitable for viewing on the move and in short leisure moments. In the related art, while watching a short video a user can generally only perform interaction operations such as liking, commenting, favoriting and sharing, so the available interaction modes are limited.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
In a first aspect, the present disclosure provides a multimedia resource interaction method, including:
displaying the target multimedia resource in a display interface of the multimedia resource;
displaying the interaction control in the display interface;
and when the interaction control is triggered, executing a preset interaction action, wherein the preset interaction action is to extract at least one target object from a plurality of displayed objects to be extracted and display the extracted target object through a display control in the display interface.
In a second aspect, the present disclosure provides a multimedia resource interaction device, including:
a first display module configured to display a target multimedia asset in a display interface of the multimedia asset;
the second display module is configured to display the interaction control in the display interface;
and an execution module configured to execute a preset interaction action when the interaction control is triggered, wherein the preset interaction action is to extract at least one target object from a plurality of displayed objects to be extracted and display the extracted target object through a display control in the display interface.
In a third aspect, the present disclosure provides a computer readable medium having stored thereon a computer program which, when executed by a processing device, performs the steps of the method of the first aspect.
In a fourth aspect, the present disclosure provides an electronic device comprising:
a storage device having a computer program stored thereon;
processing means for executing said computer program in said storage means to carry out the steps of the method of the first aspect.
Based on the above technical solution, the target multimedia resource is displayed in the presentation interface for multimedia resources, the interaction control is displayed in that interface, and the preset interaction action is executed when the interaction control is triggered. The whole interaction process of displaying the interaction control, triggering it, extracting the target object from the plurality of objects to be extracted, and displaying the extracted target object can thus be completed without switching interfaces and without interrupting the display of the target multimedia resource. This not only preserves the display duration of the target multimedia resource, but also lets the user complete the entire interaction of extracting and displaying the target object within a single interface, which can greatly increase the probability that users participate in the interaction.
Additional features and advantages of the present disclosure will be set forth in the detailed description which follows.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale. In the drawings:
fig. 1 is a flow chart illustrating a method of multimedia asset interaction, according to some embodiments.
FIG. 2 is a schematic diagram of a presentation interface shown according to some embodiments.
FIG. 3 is a schematic diagram of an interaction control shown according to some embodiments.
FIG. 4 is a schematic diagram of an interactive control shown according to further embodiments.
FIG. 5 is a schematic diagram illustrating preset interactions, according to some embodiments.
Fig. 6 is a schematic diagram of a progress bar shown in accordance with some embodiments.
Fig. 7 is a schematic diagram of an indication element shown in accordance with some embodiments.
Fig. 8 is a schematic diagram of a multimedia asset presentation layer shown in accordance with some embodiments.
Fig. 9 is a schematic structural diagram of a multimedia asset interaction device, according to some embodiments.
Fig. 10 is a schematic structural diagram of an electronic device shown according to some embodiments.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth here; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of protection of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are intended to be open-ended, i.e., including, but not limited to. The term "based on" is based at least in part on. The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments. Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a", "an" and "a plurality" in this disclosure are illustrative rather than limiting, and those of ordinary skill in the art will appreciate that they are to be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
Fig. 1 is a flow chart illustrating a multimedia resource interaction method, according to some embodiments. An embodiment of the present disclosure provides a multimedia resource interaction method, which may be executed by an electronic device, and in particular by a multimedia resource interaction apparatus, where the apparatus may be implemented by software and/or hardware and configured in the electronic device. As shown in fig. 1, the method may include the following steps.
In step 110, a target multimedia asset is presented in a presentation interface of the multimedia asset.
Here, the target multimedia resource may be a video (a short video, a long video, or another video stream), a carousel picture (i.e., a group made up of a plurality of pictures), or the like. Of course, the target multimedia resource may also be an audio resource. The presentation interface for multimedia resources may refer to a display interface in a multimedia resource program that is used for displaying multimedia resources.
Taking a short-video application scenario as an example, the presentation interface for multimedia resources may be a playback interface in a short-video program, and the electronic device plays the short video in that playback interface. It should be appreciated that where the electronic device is a mobile terminal, the presentation interface may occupy the entire screen area of the mobile terminal, and the target multimedia resource may be presented in the entire area or a part of the presentation interface.
FIG. 2 is a schematic diagram of a presentation interface shown according to some embodiments. As shown in fig. 2, a target multimedia asset 202 is presented in a presentation interface 201.
It should be noted that the presentation interface 201 may further include other elements, such as an interaction element 203 and resource description information 204. The interaction element 203 may refer to an element for receiving an interaction operation performed by the user on the multimedia resource; for example, it may be a like control for liking the multimedia resource, a forwarding control for forwarding it, or a comment control for commenting on it. The resource description information 204 may refer to caption information describing the multimedia resource and/or information describing the publishing account of the multimedia resource, where the account information may include the account name and/or the avatar used by the publishing account.
Of course, the presentation interface may also contain controls for implementing other functions, such as a video capture function, a video upload function, a chat function, and a follow function, which are not described in detail here.
In step 120, the interactive controls are presented in a presentation interface.
Here, the interactive control may be an interface element that is displayed superimposed on the presentation interface. Illustratively, the interactive controls may be presented in a presentation interface through a target animation effect. Wherein the target animation effect may be a dynamic effect similar to a shutter effect, a fly-in effect, etc.
FIG. 3 is a schematic diagram of an interaction control shown according to some embodiments. As shown in fig. 3, interactive controls 301 may be displayed in presentation interface 201.
In some embodiments, the interaction control may be displayed in the presentation interface when presentation of the target multimedia resource reaches a preset time node.
For example, taking a short video, the interaction control may be displayed in the short video's playback interface after the video has played for N seconds, where N may be set according to actual requirements.
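As a minimal sketch of this timing logic (the player query `currentPositionMs`, the callback `showInteractionControl`, and the coroutine polling loop are all illustrative assumptions, not details from this disclosure):

```kotlin
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.delay
import kotlinx.coroutines.launch

// Watches playback progress and shows the interaction control once the
// preset time node (the "N seconds" above) is reached.
class TimedControlTrigger(
    private val presetNodeMs: Long,                 // the preset time node
    private val currentPositionMs: () -> Long,      // assumed player query
    private val showInteractionControl: () -> Unit  // assumed UI callback
) {
    fun watch(scope: CoroutineScope) = scope.launch {
        while (currentPositionMs() < presetNodeMs) {
            delay(100) // poll roughly ten times per second
        }
        showInteractionControl() // e.g. with a fly-in animation effect
    }
}
```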
In step 130, when the interaction control is triggered, a preset interaction action is executed, where the preset interaction action is to extract at least one target object from a plurality of displayed objects to be extracted and display the extracted target object through a display control in the presentation interface.
Here, the interaction control may be triggered by a target trigger operation performed by the user on the interaction control; it may also be triggered when it has been displayed in the presentation interface for a preset duration. That is, the interaction control may be actively triggered by the user or passively triggered through a preset trigger condition.
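A minimal sketch of these two trigger paths, active tap and passive timeout, funneled into one idempotent handler (all names here are illustrative assumptions):

```kotlin
// Ensures the preset interaction action runs exactly once regardless of
// whether the user taps first or the visibility timeout fires first.
class TriggerArbiter(private val onTriggered: () -> Unit) {
    private var fired = false

    // Active path: the user performs the target trigger operation.
    fun onUserTap() = fire()

    // Passive path: the control has been visible for the preset duration.
    fun onVisibleFor(elapsedMs: Long, presetDurationMs: Long) {
        if (elapsedMs >= presetDurationMs) fire()
    }

    private fun fire() {
        if (!fired) {
            fired = true
            onTriggered() // execute the preset interaction action
        }
    }
}
```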
The object to be extracted may be a virtual item such as a coupon, a cash red packet, a red packet cover, or a New Year slip. Of course, the object to be extracted may also be a physical item. It should be noted that the plurality of objects to be extracted may be related to the target multimedia resource; for example, they may be provided by the publisher of the target multimedia resource. Alternatively, they may be a plurality of objects configured by the system by default.
When the interaction control displayed in the presentation interface is triggered, the preset interaction action is executed. The preset interaction action may be to randomly extract at least one target object from the plurality of displayed objects to be extracted and to display the extracted target object through a display control in the presentation interface.
The target object displayed by the display control may be in an unclaimed state; in that case, the user can claim the target object through the display control.
Alternatively, the target object presented by the display control may already be in a claimed state, with the display control used to present the extracted target object that has been claimed for the user. That is, the target object may be sent directly to the user's corresponding virtual account. For example, assuming the extracted target object is a cash gift certificate, the display control may present the value of the certificate together with text such as "the gift certificate has been placed in…" to prompt the user that the certificate has been issued.
It should be understood that the extracted target object can be thought of as a prize drawn by the viewer. That is, the interaction control is equivalent to a lottery control, and when it is triggered, the whole prize drawing and claiming process takes place within the presentation interface through the interaction control.
It is worth noting that while the preset interaction action is executed, the target multimedia resource displayed in the presentation interface is still displayed normally. For example, if the target multimedia resource is a video, playback of the video is not interrupted while the preset interaction action is executed; the video still plays normally. Of course, in other embodiments, displaying the target multimedia resource may be paused while the preset interaction action is performed and resumed afterwards; for example, after the user completes the whole lottery process, that is, after the target object has been displayed, the target multimedia resource continues to be displayed, so as to guarantee its display effect.
It should be noted that the foregoing preset interaction actions are all carried out within the presentation interface. That is, the whole interaction process of displaying the interaction control, triggering it, randomly extracting the target object from the plurality of objects to be extracted, and displaying the extracted target object can be completed in the presentation interface, without the user switching from the presentation interface to another interface.
Therefore, by displaying the target multimedia resource in the presentation interface, displaying the interaction control in the same interface, and executing the preset interaction action when the interaction control is triggered, the whole interaction process of displaying the interaction control, triggering it, randomly extracting the target object from a plurality of objects to be extracted, and displaying the extracted target object can be completed without switching interfaces and without interrupting the display of the target multimedia resource. This not only preserves the display duration of the target multimedia resource, but also lets the user complete the entire interaction of extracting and displaying the target object within a single interface, which can greatly increase the probability that users participate in the interaction.
In some implementations, the interaction control includes a plurality of cards, each card used to display a corresponding object to be extracted. When the interaction control is triggered, the cards are displayed in carousel, at least one target card is extracted from them, and the object to be extracted associated with the target card is determined to be the target object.
Here, the interaction control may include a plurality of cards, each card used to present one object to be extracted; each card may display its corresponding object through text and images.
FIG. 4 is a schematic diagram of an interactive control shown according to further embodiments. As shown in fig. 4, the interactive controls include a first card 401, a second card 402, and a third card 403.
It should be understood that the plurality of cards may be arranged in sequence. Of course, because the display area of the interaction control is limited, some of the cards may be in a hidden state during static display, and the hidden cards are revealed when the interaction control is triggered.
When the interaction control is triggered, an arrow mark appears on it and the cards are displayed in carousel; the carousel gradually slows down and stops, and the card pointed to by the arrow mark is the target card. The display control corresponding to the target card is then shown in the presentation interface through a special effect. The special effect may be that the target card gradually moves to the center of the electronic device's screen in the form of an electronic red packet, and finally the display control appears in the presentation interface as a popup window. The display control is used to claim the target object corresponding to the extracted target card.
FIG. 5 is a schematic diagram illustrating preset interactions, according to some embodiments. As shown in fig. 5, the interactive control 301 displays the object to be extracted through a plurality of cards, when the interactive control 301 is triggered, an arrow mark appears on the interactive control 301, the indicating element is gradually blanked, the plurality of cards begin to be carousel, the carousel is gradually slowed down and stopped, the cards pointed by the arrow mark float to the center of the display interface, meanwhile, other cards are gradually blanked, and finally the extracted target object is displayed in the display interface through the display control 501.
Thus, by displaying the objects to be extracted through a plurality of cards, presenting the cards in carousel, and extracting at least one target card from them, the user can view the objects to be extracted intuitively while the extraction process is shown through an entertaining effect, greatly improving the user experience.
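One plausible realization of this carousel draw is to pick the target card up front and decelerate the rotation so it lands on that card; the sketch below assumes hypothetical `Card`, callback, and timing names, and uses blocking sleeps only for brevity:

```kotlin
import kotlin.random.Random

data class Card(val objectName: String)

// Spins the carousel, slows it down, and stops on a randomly chosen
// target card, mirroring the arrow-mark behavior described above.
fun runCarouselDraw(
    cards: List<Card>,
    onHighlight: (Card) -> Unit, // e.g. move the arrow mark to this card
    onStop: (Card) -> Unit       // e.g. float the card to screen center
): Card {
    val targetIndex = Random.nextInt(cards.size)  // the draw itself
    val totalSteps = cards.size * 2 + targetIndex // at least two full turns
    var delayMs = 60L
    for (step in 0..totalSteps) {
        onHighlight(cards[step % cards.size])
        Thread.sleep(delayMs)               // a real UI would animate on a frame clock
        delayMs = (delayMs * 1.15).toLong() // gradual deceleration
    }
    val target = cards[targetIndex]         // the last highlighted card
    onStop(target)
    return target
}
```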
In some implementations, the interaction control includes an indication element that indicates to the user the target trigger operation through which the interaction control is triggered to execute the preset interaction action.
Here, when the interaction control is triggered by different target trigger operations, the indication element on the interaction control may differ. That is, the style of the indication element may vary according to the target trigger operation that corresponds to the interaction control.
As shown in fig. 4, when the target trigger operation is a click operation, the indication element may be the first control 404. The first control 404 prompts the user to trigger the interaction control by a click operation by displaying the text "click to draw welfare".
In other examples, the target trigger operation may be changing the gesture of the electronic device that displays the presentation interface, with the interaction control triggered to execute the preset interaction action when the gesture change amplitude of the electronic device reaches a target amplitude. Accordingly, the indication element may be a progress bar describing the gesture change amplitude of the electronic device, and the progress bar may take different states under different change amplitudes.
The gesture change amplitude may refer to the amount of gesture change after the electronic device displays the interaction control, for example the shaking amplitude or tilting amplitude of the device. That is, the target trigger operation may be understood as the user shaking or tilting the electronic device, with the interaction control triggered to execute the preset interaction action once the shaking or tilting amplitude reaches a threshold. It is worth noting that the electronic device may detect its gesture change amplitude through a gesture sensor.
In other words, when the target trigger operation is changing the gesture of the electronic device displaying the presentation interface, and the trigger condition is the gesture change amplitude reaching the target amplitude, the indication element may be a progress bar. The progress bar displays the gesture change amplitude of the electronic device, so the user can determine from it whether the condition for triggering the interaction control has been reached.
Fig. 6 is a schematic diagram of a progress bar shown in accordance with some embodiments. As shown in fig. 6, progress bar 601 may show that the typeface of "tilt phone draw welfare" prompts the user to trigger the interaction control by tilting the phone. In addition, when the gesture of the electronic device is detected to change, the progress bar 601 presents different states under different gesture change amplitudes. As shown in fig. 6, the progress bar 601 is gradually extended, and when the target amplitude is reached, the progress bar 601 is filled. At this point, the interactive control is triggered.
In still other examples, the target trigger operation may be changing the gesture of the electronic device that displays the presentation interface, with the interaction control triggered to execute the preset interaction action when the gesture change amplitude reaches the target amplitude, and/or the target trigger operation may be a click operation.
That is, the interactive control may be triggered by a click operation and/or changing a gesture of the electronic device. For example, the user may click on the interaction control first and then trigger the interaction control to perform a preset interaction action by shaking or tilting the electronic device.
Fig. 7 is a schematic diagram of an indication element shown in accordance with some embodiments. As shown in fig. 7, the indication element may be the second control 701. The second control 701 prompts the user to trigger the interaction control by clicking or by tilting the phone by displaying the text "click or tilt the phone to draw welfare".
Thus, through the indication element of the interaction control, the user can intuitively learn how the interaction control is triggered and can therefore trigger it quickly. Moreover, triggering the interaction control by changing the gesture of the electronic device makes the triggering more engaging, improving the user's sense of participation and overall experience.
In some implementations, the interactive controls may be presented in a target area of a presentation interface.
Here, the target area may be a partial area of the presentation interface, and the target area may be set according to actual conditions.
As some examples, the target area may be an area in the presentation interface for carrying resource description information of the target multimedia resource.
As shown in fig. 2 and 3, the interactive control may be displayed in an area of the presentation interface for carrying the resource description information 204 of the target multimedia resource, so as to avoid obscuring the target multimedia resource.
It should be noted that, when the target area is the area of the presentation interface that carries the resource description information of the target multimedia resource, that area may stop displaying the resource description information and display the interaction control instead. For example, when display of the interaction control is triggered, the resource description information may be hidden and the interaction control displayed in the area that carried it.
As yet other examples, the target area may be a single-handed operation area in the presentation interface.
The single-hand operation area refers to the area that a user can directly touch when operating the electronic device with one hand. Presenting the interaction control in the single-hand operation area of the presentation interface enables the user to trigger it quickly and directly.
For example, the single-hand operation area of the presentation interface may be determined according to the model of the electronic device and/or the manner in which the user holds it.
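As a rough sketch of such a heuristic (the region fractions below are pure assumptions; a production version would tune them per device model, as noted above):

```kotlin
data class Region(val left: Int, val top: Int, val right: Int, val bottom: Int)

enum class Hand { LEFT, RIGHT }

// Approximates the thumb-reachable area as the lower part of the screen,
// biased toward the side of the holding hand.
fun oneHandedRegion(screenW: Int, screenH: Int, hand: Hand): Region {
    val top = (screenH * 0.55f).toInt() // lower ~45% of the screen
    return when (hand) {
        Hand.RIGHT -> Region((screenW * 0.35f).toInt(), top, screenW, screenH)
        Hand.LEFT  -> Region(0, top, (screenW * 0.65f).toInt(), screenH)
    }
}
```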
As other examples, the target area may be determined from the target multimedia resource. For example, assuming the target multimedia resource is a video, the target area may be determined according to the video picture; the area in which a person in the picture performs a preset action may be determined as the target area.
Thus, displaying the interaction control in the target area can avoid obscuring the target multimedia resource and guarantee its display effect, or can enable the user to trigger the interaction control quickly and directly, improving the user experience.
In some implementations, the target multimedia resource may be displayed through a multimedia resource display layer located below the presentation interface. Accordingly, the interaction control may first be displayed on the multimedia resource display layer and then controlled to move from that layer to the presentation interface, so that it is displayed in the presentation interface. As the interaction control moves from the multimedia resource display layer to the presentation interface, it interacts with elements in the interface to form a naked-eye three-dimensional effect.
Here, the multimedia resource display layer is used to play multimedia resources; taking short video as an example, it corresponds to the video playing area. The multimedia resource display layer is located below the presentation interface, that is, the presentation interface is superimposed on top of it.
The electronic device may first display the interaction control in the multimedia resource display layer and then control it to move from that layer to the layer where the presentation interface is located. That is, the interaction control floats up from the multimedia resource display layer into the presentation interface.
When the interaction control moves from the multimedia resource display layer to the presentation interface, it interacts with elements in the interface to form a naked-eye three-dimensional effect. Here, the elements in the presentation interface may be the interaction element 203 and/or the resource description information 204 described in the above embodiments.
It should be understood that the interaction control interacting with elements of the presentation interface to form a naked-eye three-dimensional effect may mean that the interaction control appears to break through those elements, producing the effect of punching through from the multimedia resource display layer into the presentation interface.
In a concrete implementation, a first animation resource corresponding to the interaction control may be played on the multimedia resource display layer while a second animation resource is played above the presentation interface. The two animation resources cooperate to form the interaction control, breaking through the elements of the presentation interface and achieving the effect of crossing from the multimedia resource display layer into the presentation interface.
Fig. 8 is a schematic diagram of a multimedia asset presentation layer shown in accordance with some embodiments. As shown in fig. 8, the multimedia asset presentation layer 801 is located below the presentation interface 201. The interaction control 301 is first displayed in the multimedia asset presentation layer 801, then moves from the layer 801 to the presentation interface 201, and is finally displayed above the presentation interface 201 (the effect is shown in fig. 3).
When the interaction control 301 moves from the multimedia resource display layer 801 to the presentation interface 201, it interacts with the resource description information 204 in the presentation interface 201, and together they form a naked-eye three-dimensional effect.
Thus, through this implementation, the interaction control can be displayed with a naked-eye three-dimensional effect, greatly increasing the interest of its presentation.
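A structural sketch of this two-layer trick follows; the `Layer` abstraction and clip names are invented stand-ins for a real view hierarchy and animation resources:

```kotlin
// Playing one clip below the presentation interface and a matched clip
// above it, started together, makes the control appear to punch through
// the interface elements (the naked-eye three-dimensional effect).
interface Layer {
    fun play(clip: String, onEnd: () -> Unit = {})
}

class BreakthroughEffect(
    private val mediaLayer: Layer,  // the multimedia resource display layer
    private val overlayLayer: Layer // a layer above the presentation interface
) {
    fun run(onControlDocked: () -> Unit) {
        // First animation resource: the control rising within the media layer.
        mediaLayer.play("control_rise")
        // Second animation resource: the portion crossing the interface plane,
        // drawn over the interface elements so they appear broken through.
        overlayLayer.play("control_breakthrough", onEnd = onControlDocked)
    }
}
```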
In some implementations, in response to a trigger operation for the display control, a target page associated with the type of the target object is presented.
Here, the trigger operation for the display control may be a click operation on a child control within the display control, where the child control is a control used to implement a page jump; for example, it may be a control such as "learn more" or "I want to double".
The trigger operation for the display control may also fire when the display control has been displayed for a preset duration, for example jumping to another page after an interval of N seconds.
Of course, the trigger operation for the display control may also be triggered by a preset gesture or by shaking the electronic device. In other words, which trigger operation is used for the display control can be set according to actual requirements.
The target object presented by the display control may be in a claimed state, and the target page is associated with the claimed target object presented by the display control. Different target pages may be presented for different types of target objects.
For example, if the target object is a cash red packet, the display control may include a "learn more" child control and an "I want to double" child control. When the user clicks the "I want to double" child control, the target page may be a corresponding task page on which the drawn cash red packet is doubled after completing a task. The electronic device can display the task page in a split-screen form within the presentation interface, so that the user can accept a task on the task page. When the "learn more" child control is triggered, the target page may be a landing page, store, or external platform associated with the cash red packet.
As another example, if the target object is a physical item, the target page may be a commodity page or landing page for acquiring the item. After the display control has displayed the target object for N seconds, a jump from the presentation interface to the corresponding commodity page or landing page is triggered, so that the user can obtain the physical item there. Alternatively, the display control may carry an "immediate pickup" child control; clicking it triggers the jump to the commodity page or landing page for claiming the physical item.
As yet another example, if the target object is a coupon, the target page may be the store page corresponding to the coupon. After the display control has displayed the target object for N seconds, a jump from the presentation interface to the corresponding store page is triggered, so that the user can use the coupon on the store page.
As another example, if the target object is a virtual item such as a New Year slip or a virtual red packet, the target page may be a corresponding landing page, store, or external platform. After the display control has displayed the target object for N seconds, a jump from the presentation interface to that landing page, store, or external platform is triggered.
That is, different types of target objects may be associated with different target pages, so that after extracting a target object the user can jump to the matching type of target page through different operations.
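A compact sketch of this type-to-page dispatch (every type name and page route below is an illustrative assumption):

```kotlin
sealed interface TargetObject
data class CashRedPacket(val amountCents: Long) : TargetObject
data class Coupon(val storeId: String) : TargetObject
data class PhysicalItem(val skuId: String) : TargetObject

// Maps each extracted object type to the target page it is associated with.
fun targetPageFor(obj: TargetObject): String = when (obj) {
    is CashRedPacket -> "app://tasks/double"         // "I want to double" task page
    is Coupon        -> "app://store/${obj.storeId}" // store page for the coupon
    is PhysicalItem  -> "app://goods/${obj.skuId}"   // commodity or landing page
}
```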
It should be noted that the trigger operations corresponding to the display control may differ for different types of target objects, and in an actual application scenario they can be set according to requirements. For example, if the target object is a coupon, the trigger operation may fire when the display control has been displayed for a preset duration; if the target object is a cash red packet, the trigger operation may be a click operation.
Thus, by displaying, in response to the trigger operation for the display control, a target page associated with the type of the target object, a jump page matched to the type of the extracted target object can be configured quickly, supporting different application scenarios.
Fig. 9 is a schematic structural diagram of a multimedia asset interaction device, according to some embodiments. As shown in fig. 9, an embodiment of the present disclosure provides a multimedia asset interaction device 900, where the multimedia asset interaction device 900 includes:
A first display module 901 configured to display a target multimedia resource in a display interface of the multimedia resource;
a second presentation module 902 configured to present an interaction control in the presentation interface;
the execution module 903 is configured to execute a preset interaction action when the interaction control is triggered, where the preset interaction action is to extract at least one target object from the displayed multiple objects to be extracted, and display the extracted target object through a display control in the display interface.
Optionally, the interaction control includes a plurality of cards, each card used to display a corresponding object to be extracted. When the interaction control is triggered, the cards are displayed in carousel, at least one target card is extracted from them, and the object to be extracted associated with the target card is determined to be the target object.
Optionally, the interaction control includes an indication element, where the indication element is used to instruct a user to trigger the interaction control to execute the preset interaction action through a target triggering operation.
Optionally, the target triggering operation includes changing a gesture of an electronic device for displaying the display interface, and triggering the interaction control to execute the preset interaction action when a gesture variation amplitude of the electronic device reaches a target amplitude;
The indication element comprises a progress bar for describing the gesture change amplitude of the electronic equipment, and the progress bar presents different states under different gesture change amplitudes.
Optionally, the first display module 901 is specifically configured to:
displaying the interactive control in a target area of the display interface;
wherein the target area comprises one of:
an area of the display interface for carrying the resource description information of the target multimedia resource;
a single-hand operation area of the display interface.
Optionally, the first display module 901 is specifically configured to:
displaying the target multimedia resources through a multimedia resource display layer positioned at the lower layer of the display interface of the multimedia resources;
the second presentation module 902 is specifically configured to:
and displaying the interactive control on the multimedia resource display layer, and controlling the interactive control to move from the multimedia resource display layer to the display interface so as to display the interactive control in the display interface, wherein when the interactive control moves from the multimedia resource display layer to the display interface, the interactive control interacts with elements in the display interface to form an naked-eye three-dimensional effect.
Optionally, the multimedia resource interaction device 900 further includes:
and the response module is configured to respond to the triggering operation for the showing control and show the target page associated with the type of the target object.
The functional logic executed by each functional module in the above multimedia resource interaction device 900 has already been described in detail in the method portion and will not be repeated here.
Referring now to fig. 10, a schematic diagram of an electronic device (e.g., terminal device) 1000 suitable for use in implementing embodiments of the present disclosure is shown. The terminal devices in the embodiments of the present disclosure may include, but are not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), in-vehicle terminals (e.g., in-vehicle navigation terminals), and the like, and stationary terminals such as digital TVs, desktop computers, and the like. The electronic device shown in fig. 10 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 10, the electronic device 1000 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 1001 that may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 1002 or a program loaded from a storage means 1008 into a Random Access Memory (RAM) 1003. In the RAM 1003, various programs and data necessary for the operation of the electronic apparatus 1000 are also stored. The processing device 1001, the ROM 1002, and the RAM 1003 are connected to each other by a bus 1004. An input/output (I/O) interface 1005 is also connected to bus 1004.
In general, the following devices may be connected to the I/O interface 1005: input devices 1006 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, and the like; an output device 1007 including, for example, a Liquid Crystal Display (LCD), speaker, vibrator, etc.; storage 1008 including, for example, magnetic tape, hard disk, etc.; and communication means 1009. The communication means 1009 may allow the electronic device 1000 to communicate wirelessly or by wire with other devices to exchange data. While fig. 10 shows an electronic device 1000 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication device 1009, or installed from the storage device 1008, or installed from the ROM 1002. The above-described functions defined in the method of the embodiment of the present disclosure are performed when the computer program is executed by the processing device 1001.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some embodiments, the terminal device and the server providing the multimedia resource may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: display the target multimedia resource in a presentation interface for multimedia resources; display the interaction control in the presentation interface; and, when the interaction control is triggered, execute a preset interaction action, where the preset interaction action is to extract at least one target object from a plurality of displayed objects to be extracted and display the extracted target object through a display control in the presentation interface.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages or a combination thereof, including object oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present disclosure may be implemented in software or hardware. The name of a module does not, in some cases, constitute a limitation of the module itself.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the technical principles employed. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the specific combinations of features described above, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the concept of the disclosure, for example, solutions formed by interchanging the above features with (but not limited to) technical features having similar functions disclosed in this disclosure.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above; rather, the specific features and acts described above are example forms of implementing the claims. The specific manner in which the various modules perform operations in the apparatus of the above embodiments has been described in detail in connection with the method embodiments and will not be elaborated here.

Claims (10)

1. A method for multimedia resource interaction, comprising:
displaying the target multimedia resource in a display interface of the multimedia resource;
displaying the interaction control in the display interface;
and when the interaction control is triggered, executing a preset interaction action, wherein the preset interaction action is to extract at least one target object from a plurality of displayed objects to be extracted and display the extracted target object through a display control in the display interface.
2. The method of claim 1, wherein the interaction control comprises a plurality of cards, each card used to display a corresponding object to be extracted; when the interaction control is triggered, the plurality of cards are displayed in carousel, at least one target card is extracted from the plurality of cards, and the object to be extracted associated with the target card is determined to be the target object.
3. The method of claim 1, wherein the interaction control comprises an indication element for indicating to a user the target trigger operation through which the interaction control is triggered to perform the preset interaction action.
4. The method according to claim 3, wherein the target trigger operation comprises a click operation on the interaction control, and/or changing the gesture of the electronic device used to display the display interface and triggering the interaction control to execute the preset interaction action when the gesture change amplitude of the electronic device reaches a target amplitude;
when the target trigger operation comprises changing the gesture of the electronic device used to display the display interface and triggering the interaction control to execute the preset interaction action when the gesture change amplitude reaches the target amplitude, the indication element comprises a progress bar for describing the gesture change amplitude of the electronic device, and the progress bar presents different states under different gesture change amplitudes.
5. The method of claim 1, wherein the presenting the interactive control in the presentation interface comprises:
Displaying the interactive control in a target area of the display interface;
wherein the target area comprises one of:
an area of the display interface for carrying the resource description information of the target multimedia resource;
a single-hand operation area of the display interface.
6. The method of claim 1, wherein the presenting the target multimedia asset in the presentation interface of the multimedia asset comprises:
displaying the target multimedia resources through a multimedia resource display layer positioned at the lower layer of the display interface of the multimedia resources;
the displaying the interactive control in the display interface comprises the following steps:
and displaying the interaction control on the multimedia resource display layer, and controlling the interaction control to move from the multimedia resource display layer to the display interface so as to display the interaction control in the display interface, wherein, when the interaction control moves from the multimedia resource display layer to the display interface, the interaction control interacts with elements in the display interface to form a naked-eye three-dimensional effect.
7. The method according to claim 1, wherein the method further comprises:
and in response to the triggering operation for the display control, displaying a target page associated with the type of the target object.
8. A multimedia asset interaction device, comprising:
a first display module configured to display a target multimedia asset in a display interface of the multimedia asset;
the second display module is configured to display the interaction control in the display interface;
and an execution module configured to execute a preset interaction action when the interaction control is triggered, wherein the preset interaction action is to extract at least one target object from a plurality of displayed objects to be extracted and display the extracted target object through a display control in the display interface.
9. A computer readable medium on which a computer program is stored, characterized in that the program, when executed by a processing device, carries out the steps of the method according to any one of claims 1-7.
10. An electronic device, comprising:
a storage device having a computer program stored thereon;
processing means for executing said computer program in said storage means to carry out the steps of the method according to any one of claims 1-7.
CN202311606861.XA 2023-11-28 2023-11-28 Multimedia resource interaction method, device, medium and electronic equipment Pending CN117631924A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311606861.XA CN117631924A (en) 2023-11-28 2023-11-28 Multimedia resource interaction method, device, medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311606861.XA CN117631924A (en) 2023-11-28 2023-11-28 Multimedia resource interaction method, device, medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN117631924A true CN117631924A (en) 2024-03-01

Family

ID=90029898

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311606861.XA Pending CN117631924A (en) 2023-11-28 2023-11-28 Multimedia resource interaction method, device, medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN117631924A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination