CN117055788A - Method, device and equipment for displaying and interacting with multimedia resources


Info

Publication number
CN117055788A
Authority
CN
China
Prior art keywords
page
multimedia
resource
multimedia resource
displayed
Prior art date
Legal status
Pending
Application number
CN202311071160.0A
Other languages
Chinese (zh)
Inventor
孙晓梦
殷瑞娟
徐滔锴
丁振新
李沁晖
刘妍
郭浩
董澈
唐巧
蒋翔
Current Assignee
Beijing Kuxun Technology Co Ltd
Original Assignee
Beijing Kuxun Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Kuxun Technology Co Ltd
Priority to CN202311071160.0A
Publication of CN117055788A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 - Selection of displayed objects or displayed text elements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a method, a device, and equipment for displaying and interacting with multimedia resources, and belongs to the field of computer technology. The method includes the following steps: displaying a first page of a detail page, in which a first multimedia resource is displayed; and, in response to a target trigger operation performed by a user on the first multimedia resource, displaying a second page in which the first multimedia resource and a second multimedia resource are displayed, the resource category of the second multimedia resource being the same as that of the first multimedia resource. Because the second page displays not only the first multimedia resource but also second multimedia resources of the same resource category, more multimedia resources are displayed and the content is richer, which deepens the user's understanding of the detail page and in turn improves the user's conversion rate. In addition, the second page can be displayed directly through a single target trigger operation on the first multimedia resource, which improves the display efficiency of the second page.

Description

Method, device and equipment for displaying and interacting with multimedia resources
Technical Field
The embodiments of this application relate to the field of computer technology, and in particular to a method, a device, and equipment for displaying and interacting with multimedia resources.
Background
With the continued development of computer technology, more and more objects provide services to users through applications. To provide better service, an object's detail page displays not only the object's textual introduction but also the multimedia resources corresponding to the object, such as images. When the user clicks an image, the clicked image may be displayed on its own.
However, this approach only displays the clicked image on its own: few images are shown, the content is not rich, and the user's understanding of the object remains limited, so the conversion rate for the object is low. If the user wants more information to support a decision, the user must first close the current enlarged image view and can only browse further images by separately tapping an album button, so the flow is not smooth.
Disclosure of Invention
The embodiments of this application provide a method, a device, and equipment for displaying and interacting with multimedia resources, which can be used to solve the problems in the related art. The technical solution is as follows:
In one aspect, an embodiment of the present application provides a method for displaying and interacting with multimedia resources, the method including:
displaying a first page of a detail page, wherein a first multimedia resource is displayed in the first page;
and, in response to a target trigger operation of a user on the first multimedia resource, displaying a second page, wherein the first multimedia resource and a second multimedia resource are displayed in the second page, and the resource category of the second multimedia resource is the same as that of the first multimedia resource.
In one possible implementation, the target trigger operation on the first multimedia resource includes any one of a pull-down operation, a click operation, a double-click operation, or a long-press operation on the first multimedia resource.
In one possible implementation, the target trigger operation for the first multimedia resource is a pull-down operation for the first multimedia resource;
the responding to the target triggering operation of the user for the first multimedia resource displays a second page, comprising:
responding to the pull-down operation of the user on the first multimedia resource, and displaying prompt information, wherein the prompt information is used for indicating the user to finish the pull-down operation;
And displaying the second page in response to the end of the pull-down operation.
In one possible implementation manner, after the displaying the second page, the method further includes:
and, in response to a first trigger operation of the user on the second page, updating the multimedia resources displayed in the second page, wherein the resource category of the multimedia resources displayed in the updated second page is the same as that of the first multimedia resource, and the updated second page includes multimedia resources different from the second multimedia resource.
In one possible implementation, the first trigger operation for the second page includes a slide-up operation for the second page or a slide-down operation for the second page.
In one possible implementation manner, after the displaying the second page, the method further includes:
and in response to a second triggering operation of the user on the second page, canceling the display of the first multimedia resource and the second multimedia resource in the second page, and displaying a third multimedia resource in the second page, wherein the resource category of the third multimedia resource is different from that of the second multimedia resource.
In one possible implementation, the second trigger operation for the second page includes a left-slide operation for the second page or a right-slide operation for the second page.
In a possible implementation manner, a first control and a second control are further displayed in the second page, the first control is used for representing a first resource category, the second control is used for representing a second resource category, the first resource category is a resource category of the first multimedia resource, and the second resource category is different from the first resource category;
the method further comprises the steps of:
and in response to the triggering operation of the user on the second control, canceling the display of the first multimedia resource and the second multimedia resource in the second page, and displaying a fourth multimedia resource in the second page, wherein the fourth multimedia resource is the multimedia resource of the second resource category.
In another aspect, an embodiment of the present application provides a device for displaying and interacting a multimedia resource, where the device includes:
the display module is used for displaying a first page of the detail page, and a first multimedia resource is displayed in the first page;
The display module is further configured to display a second page in response to a target triggering operation of the user on the first multimedia resource, where the first multimedia resource and the second multimedia resource are displayed in the second page, and a resource class of the second multimedia resource is the same as a resource class of the first multimedia resource.
In one possible implementation, the target trigger operation on the first multimedia resource includes any one of a pull-down operation, a click operation, a double-click operation, or a long-press operation on the first multimedia resource.
In one possible implementation, the target trigger operation for the first multimedia resource is a pull-down operation for the first multimedia resource;
the display module is used for responding to the pull-down operation of the user on the first multimedia resource and displaying prompt information, wherein the prompt information is used for indicating the user to finish the pull-down operation; and displaying the second page in response to the end of the pull-down operation.
In one possible implementation, the apparatus further includes:
and the updating module is used for responding to the first triggering operation of the user for the second page, updating the multimedia resources displayed in the second page, wherein the resource category of the multimedia resources displayed in the updated second page is the same as that of the first multimedia resources, and the updated second page comprises multimedia resources different from the second multimedia resources.
In one possible implementation, the first trigger operation for the second page includes a slide-up operation for the second page or a slide-down operation for the second page.
In one possible implementation manner, the display module is configured to cancel displaying the first multimedia resource and the second multimedia resource in the second page in response to a second trigger operation of the user on the second page, and display a third multimedia resource in the second page, where a resource class of the third multimedia resource is different from a resource class of the second multimedia resource.
In one possible implementation, the second trigger operation for the second page includes a left-slide operation for the second page or a right-slide operation for the second page.
In a possible implementation manner, a first control and a second control are further displayed in the second page, the first control is used for representing a first resource category, the second control is used for representing a second resource category, the first resource category is a resource category of the first multimedia resource, and the second resource category is different from the first resource category;
the display module is further configured to cancel displaying the first multimedia resource and the second multimedia resource in the second page in response to a triggering operation of the user on the second control, and display a fourth multimedia resource in the second page, where the fourth multimedia resource is a multimedia resource of the second resource class.
In another aspect, an embodiment of the present application provides a computer device, where the computer device includes a processor and a memory, where the memory stores at least one program code, and the at least one program code is loaded and executed by the processor, so that the computer device implements any one of the foregoing methods for displaying and interacting multimedia resources.
In another aspect, there is further provided a computer readable storage medium having at least one program code stored therein, the at least one program code being loaded and executed by a processor, to cause a computer to implement any of the above-described methods for displaying and interacting with multimedia resources.
In another aspect, a computer program or a computer program product is provided, in which at least one computer instruction is stored, the at least one computer instruction being loaded and executed by a processor, so that the computer implements any of the above-described methods for displaying and interacting with multimedia resources.
The technical scheme provided by the embodiment of the application at least has the following beneficial effects:
according to the technical scheme provided by the embodiment of the application, the first multimedia resources are displayed in the first page, when the target triggering operation of the user for the first multimedia resources is responded, the first multimedia resources are displayed in the second page which is displayed, and the second multimedia resources with the same resource type as the first multimedia resources are also displayed in the second page, so that the number of the multimedia resources displayed in the second page is more, the richness is higher, the degree of knowledge of the user on the detail page is further enhanced, the conversion rate of the user is promoted, and the second page can be directly displayed only by the target triggering operation of the user for the first multimedia resources, the display efficiency of the second page is improved, and the display efficiency of the multimedia resources displayed in the second page is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and a person skilled in the art may obtain other drawings from these drawings without inventive effort.
FIG. 1 is a schematic diagram of an implementation environment of a method for displaying and interacting multimedia resources according to an embodiment of the present application;
FIG. 2 is a flowchart of a method for displaying and interacting multimedia resources according to an embodiment of the present application;
FIG. 3 is a schematic illustration of a display of a second page of an application provided in an embodiment of the present application;
FIG. 4 is a schematic view of displaying a first page according to an embodiment of the present application;
FIG. 5 is a schematic view of another first page according to an embodiment of the present application;
FIG. 6 is a schematic diagram showing a prompt message according to an embodiment of the present application;
FIG. 7 is a schematic diagram of another prompt message provided in an embodiment of the present application;
FIG. 8 is a schematic diagram of displaying a second page according to an embodiment of the present application;
FIG. 9 is a schematic diagram of displaying an updated second page according to an embodiment of the present application;
FIG. 10 is a schematic diagram of a page change after a left-hand slide operation for a second page according to an embodiment of the present application;
FIG. 11 is a schematic diagram of a page displayed after a triggering operation for a second control provided by an embodiment of the present application;
fig. 12 is a schematic structural diagram of a device for displaying and interacting multimedia resources according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of an implementation environment of a method for displaying and interacting multimedia resources according to an embodiment of the present application, where, as shown in fig. 1, the implementation environment includes: a terminal device 101. The terminal device 101 is configured to execute the method for displaying and interacting a multimedia resource provided by the embodiment of the present application.
Alternatively, the terminal device 101 may be any electronic product that can perform human-computer interaction with a user through one or more of a keyboard, a touch pad, a remote controller, voice interaction, or a handwriting device, for example a PC (Personal Computer), a mobile phone, a smart phone, a PDA (Personal Digital Assistant), a wearable device, a PPC (Pocket PC), a tablet computer, a smart in-vehicle device, a smart television, a smart speaker, a smart watch, and the like.
The terminal device 101 may refer broadly to one of a plurality of terminal devices, and the present embodiment is illustrated only with the terminal device 101. Those skilled in the art will appreciate that the number of terminal devices 101 may be greater or lesser. For example, the number of the terminal devices 101 may be only one, or the number of the terminal devices 101 may be tens or hundreds, or more, and the number and the device type of the terminal devices 101 are not limited in the embodiment of the present application.
In the exemplary embodiment of the present application, an application for acquiring resources, which may be any type of application, is installed and operated in the terminal device 101, and the embodiment of the present application is not limited thereto. The application may acquire any type of resource, nor is the embodiment of the application limited in this regard. For example, the application may obtain venue tickets, or the application may reserve hotel rooms, or the application may obtain dining items.
The application may be a stand-alone application that needs to be downloaded and installed, or an embedded program that runs within a host program, such as an applet; this is not limited in the embodiments of this application. When the application is an embedded program, it is a program developed in a programming language that runs by depending on a host program. An embedded program does not need to be downloaded and installed; it only needs to be dynamically loaded in the host program to run. The user can find the needed embedded program by searching, scanning a code, and the like, and open it with a tap. After the user closes the embedded program, it does not occupy the memory of the terminal device, which is very convenient.
It will be appreciated by those skilled in the art that the above-described terminal device 101 is merely illustrative and that other existing or future-occurring terminal devices, as applicable to the present application, are intended to be within the scope of the present application and are incorporated herein by reference.
The embodiment of the present application provides a method for displaying and interacting with multimedia resources, which can be applied to the implementation environment shown in fig. 1. Taking the flowchart of the method shown in fig. 2 as an example, the method may be executed by the terminal device 101 in fig. 1. As shown in fig. 2, the method includes the following steps 201 to 202.
In step 201, a first page of the detail page is displayed, in which a first multimedia resource is displayed.
In one possible implementation, the display interface of the terminal device displays related information of an application, where the related information may be an icon of the application, the name of the application, or other information that uniquely identifies the application, which is not limited in the embodiments of this application. When a user wants to acquire resources, the user selects the related information of the application, and the terminal device displays a first page of the application. A first control is displayed in the first page of the application, and the first control is used for acquiring the type of resources corresponding to that control. One first control or a plurality of first controls may be displayed in the first page of the application, which is not limited in the embodiments of this application; the type corresponding to each first control is different.
In response to the user's trigger operation on the first control, a second page of the application is displayed, in which merchant information of a plurality of merchants is shown; these merchants can provide resources of the type corresponding to the first control. The merchant information of a merchant includes, but is not limited to, the name of the merchant, the star rating of the merchant, the average spending per customer, the distance between the merchant and the user, the types of resources the merchant can provide, the business district where the merchant is located, and the like. The trigger operation on the first control refers to a click operation on the first control.
Fig. 3 is a schematic illustration of a second page of an application according to an embodiment of the present application. In fig. 3, merchant information of a plurality of merchants is displayed. The merchant information of the first merchant is: the name of the first merchant is XXX (XX shop), the star rating of the first merchant is 4.1, the average spending per customer at the first merchant is 134 yuan, the distance between the first merchant and the user is 4.5 km, the type of resources the first merchant can provide is escape-room / immersive interactive experiences, and the business district where the first merchant is located is XX square. The merchant information of the other merchants is shown in fig. 3 and is not described in detail here. Optionally, other information may also be displayed in the second page of the application, which is not limited in the embodiments of this application.
When a user wants to obtain a resource provided by a first merchant, the user selects the merchant information of the first merchant, and a first page of a detail page is displayed; the first page of the detail page refers to the first page displayed after the merchant information of the first merchant is selected. A first multimedia resource is displayed in the first page. The first multimedia resource is a multimedia resource corresponding to the first merchant and may be a video or a picture, which is not limited in the embodiments of this application. When the first multimedia resource is a picture, it may be a still picture or an animated picture, which is likewise not limited in the embodiments of this application.
Optionally, the first page may further have merchant information of the first merchant displayed therein, and the first multimedia resource is displayed on top of the merchant information of the first merchant.
Fig. 4 is a schematic display diagram of a first page according to an embodiment of the present application, where in fig. 4, a first multimedia resource 401 is displayed, and merchant information of a first merchant is also displayed. And in fig. 4, the display position of the first multimedia resource is before the display position of the merchant information of the first merchant.
Optionally, a control (such as the "environment" control shown in fig. 4) for indicating the first resource category may also be displayed in the first page, where the first resource category is a resource category corresponding to the first multimedia resource. When the first multimedia resource is displayed in the first page, the display state of the control which is displayed in the first page and used for indicating the first resource category is a first state, and the first state is used for indicating that the category of the first multimedia resource displayed in the first page is a first resource category. Optionally, displaying the "environment" control in gray in the first page means that the display state of the "environment" control is the first state.
The first page may further have a control (e.g., a "facility" control shown in fig. 4) for indicating a second resource category displayed therein, where the display state of the control for indicating the second resource category is a second state, and the second state is used to indicate that the category of the first multimedia resource displayed in the first page is not the second resource category. When the user slides the first multimedia resource leftwards or rightwards, the first multimedia resource is canceled from being displayed in the first page, the reference multimedia resource is displayed, the reference multimedia resource is the multimedia resource corresponding to the first merchant, and the resource category of the reference multimedia resource is different from that of the first multimedia resource.
Fig. 5 is a schematic view of another display of a first page according to an embodiment of the present application, in (1) of fig. 5, a user slides a first multimedia resource to the left, in (2) of fig. 5, the first multimedia resource is not displayed, a reference multimedia resource is displayed, and an "environment" control displayed in the first page is displayed as white, and a "facility" control is displayed as gray.
In one possible implementation, the server acquires a plurality of multimedia resources corresponding to the first merchant; these may be multimedia resources uploaded by the first merchant itself or multimedia resources uploaded by customers of the first merchant. The server identifies each multimedia resource to obtain its resource category, and processes each multimedia resource to obtain its quality score, where the quality score of a multimedia resource indicates its quality: the higher the quality score, the higher the quality of the resource, and conversely, the lower the quality score, the lower the quality.
According to the resource category and the quality score of each multimedia resource, the server determines, from the plurality of multimedia resources, the multimedia resource with the highest quality score within each resource category. When the first page is displayed, the first multimedia resource displayed in the first page is the multimedia resource with the highest quality score among the multimedia resources of the resource category corresponding to the first multimedia resource. Optionally, the first multimedia resource displayed in the first page may instead be any one of the multimedia resources of that resource category.
Optionally, the process in which the server identifies each multimedia resource and obtains its resource category includes: the server includes a classification model, and for any one of the multimedia resources, that resource is input into the classification model to obtain its resource category. The classification model may be any model capable of classifying multimedia resources, which is not limited in the embodiments of this application. Illustratively, the classification model is a residual network (ResNet) model, or the classification model is a Visual Geometry Group (VGG) convolutional neural network model.
The process of processing each multimedia resource to obtain its quality score includes the following steps: processing any one of the plurality of multimedia resources to obtain the features of that resource; determining the value corresponding to each feature of the resource according to those features; and obtaining the quality score of the resource from the values of its features and the weight parameters of the features. Here, the features of a multimedia resource include, but are not limited to, one or more of its content integrity, subject prominence, sharpness, illumination intensity, color vividness, color harmony, depth of field, composition, and the like.
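To make the scoring step concrete, the following is a minimal Kotlin sketch of the weighted-sum idea described above. The feature names, their normalization to the range 0 to 1, and the weight values are illustrative assumptions, not values taken from the application.

```kotlin
// Illustrative sketch only: feature names, value ranges, and weights are assumptions.
data class ResourceFeatures(
    val contentIntegrity: Double,      // each feature assumed normalized to [0.0, 1.0]
    val subjectProminence: Double,
    val sharpness: Double,
    val illuminationIntensity: Double,
    val colorVividness: Double
)

// Quality score = weighted sum of feature values, scaled to 0..100.
// The weights are hypothetical; in practice they would be configured or learned server-side.
fun qualityScore(f: ResourceFeatures): Double =
    100.0 * (0.30 * f.contentIntegrity +
             0.25 * f.subjectProminence +
             0.20 * f.sharpness +
             0.15 * f.illuminationIntensity +
             0.10 * f.colorVividness)

fun main() {
    val features = ResourceFeatures(0.9, 0.8, 0.95, 0.7, 0.85)
    println("quality score = ${qualityScore(features)}")  // about 85.0 for this input
}
```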
For example, the server obtains 10 pictures corresponding to the first merchant: the first, second, and third pictures belong to a first category, with quality scores of 90, 80, and 75 respectively; the fourth, fifth, and sixth pictures belong to a second category, with quality scores of 83, 92, and 60; and the seventh, eighth, ninth, and tenth pictures belong to a third category, with quality scores of 98, 88, 77, and 78. Accordingly, the highest-scoring picture of the first category is the first picture, that of the second category is the fifth picture, and that of the third category is the seventh picture, so the picture displayed in the first page of the detail page of the first merchant may be the first picture, the fifth picture, or the seventh picture.
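The selection just illustrated amounts to grouping resources by category and keeping the highest-scoring one per group. A minimal Kotlin sketch using the scores from this example (the data class and function names are illustrative assumptions):

```kotlin
// Sketch of picking the highest-scoring resource per category,
// using the 10-picture example from the text.
data class ScoredResource(val id: Int, val category: String, val qualityScore: Int)

fun topResourcePerCategory(resources: List<ScoredResource>): Map<String, ScoredResource> =
    resources
        .groupBy { it.category }
        .mapValues { (_, group) -> group.maxByOrNull { it.qualityScore }!! }

fun main() {
    val pictures = listOf(
        ScoredResource(1, "first", 90), ScoredResource(2, "first", 80), ScoredResource(3, "first", 75),
        ScoredResource(4, "second", 83), ScoredResource(5, "second", 92), ScoredResource(6, "second", 60),
        ScoredResource(7, "third", 98), ScoredResource(8, "third", 88),
        ScoredResource(9, "third", 77), ScoredResource(10, "third", 78)
    )
    // Expected: first -> picture 1 (90), second -> picture 5 (92), third -> picture 7 (98)
    topResourcePerCategory(pictures).forEach { (category, res) ->
        println("$category -> picture ${res.id} (score ${res.qualityScore})")
    }
}
```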
In step 202, in response to a target trigger operation of the user for the first multimedia resource, a second page is displayed, in which the first multimedia resource and the second multimedia resource are displayed.
Wherein the resource class of the second multimedia resource is the same as the resource class of the first multimedia resource.
In one possible implementation, the target trigger operation on the first multimedia resource includes any one of a pull-down operation, a click operation, a double-click operation, or a long-press operation on the first multimedia resource.
Optionally, when the target trigger operation on the first multimedia resource is a pull-down operation on the first multimedia resource, prompt information is displayed in response to the user's pull-down operation on the first multimedia resource, where the prompt information is used to prompt the user to end the pull-down operation; and the second page is displayed in response to the end of the pull-down operation. The prompt information may include any content, which is not limited in the embodiments of this application. Illustratively, the prompt information is "Release to enter the album page".
Here, displaying the prompt information in response to the user's pull-down operation on the first multimedia resource includes: in response to the user's pull-down operation on the first multimedia resource, displaying the prompt information when the execution duration of the pull-down operation is not less than a first duration threshold. The first duration threshold is set based on experience or adjusted according to the implementation environment, which is not limited in the embodiments of this application. Illustratively, the first duration threshold is 3 milliseconds.
Alternatively, the prompt information is displayed in response to the user's pull-down operation on the first multimedia resource when the distance corresponding to the pull-down operation is not smaller than a distance threshold. The distance threshold is set based on experience or adjusted according to the implementation environment, which is not limited in the embodiments of this application. Illustratively, the distance threshold is 2 centimeters. The distance corresponding to the pull-down operation refers to the longitudinal (vertical) distance between the starting position of the pull-down operation and the position reached during the pull-down operation.
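As a rough illustration of the two alternative conditions above (execution duration versus pull-down distance), the following plain-Kotlin sketch shows how a client might decide whether to show the prompt during a pull-down gesture. The PullDownState type, the pixel conversion of the 2 cm example, and the structure of the check are assumptions for illustration, not part of the application.

```kotlin
// Illustrative only: thresholds, types, and the pixel conversion are assumptions.
data class PullDownState(
    val startTimeMillis: Long,   // when the pull-down began
    val startY: Float,           // starting touch position (pixels)
    val currentTimeMillis: Long, // current time while the finger is still down
    val currentY: Float          // current touch position (pixels)
)

const val DURATION_THRESHOLD_MILLIS = 3L   // the "first duration threshold" from the text
const val DISTANCE_THRESHOLD_PX = 236f     // assumed pixel equivalent of the 2 cm example

// Variant 1: show the prompt once the pull-down has lasted at least the duration threshold.
fun shouldShowPromptByDuration(state: PullDownState): Boolean =
    state.currentTimeMillis - state.startTimeMillis >= DURATION_THRESHOLD_MILLIS

// Variant 2: show the prompt once the finger has moved down far enough.
fun shouldShowPromptByDistance(state: PullDownState): Boolean =
    state.currentY - state.startY >= DISTANCE_THRESHOLD_PX

fun main() {
    val state = PullDownState(startTimeMillis = 0, startY = 100f, currentTimeMillis = 5, currentY = 400f)
    if (shouldShowPromptByDuration(state) || shouldShowPromptByDistance(state)) {
        println("Release to enter the album page")  // prompt text from the example
    }
    // On release (end of the pull-down), the second page would then be displayed.
}
```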
Fig. 6 is a schematic diagram of displaying prompt information according to an embodiment of the present application. In fig. 6, a pull-down operation of the user on the first multimedia resource is received, and the execution duration of the pull-down operation is not less than the first duration threshold, or the distance of the pull-down operation is not less than the distance threshold, so prompt information is displayed; the prompt information is "Release to enter the album page".
Optionally, when the target trigger operation on the first multimedia resource is a long-press operation on the first multimedia resource, prompt information is displayed in response to the user's long-press operation on the first multimedia resource, where the prompt information is used to prompt the user to end the long-press operation; and the second page is displayed in response to the end of the long-press operation. The prompt information may include any content, which is not limited in the embodiments of this application. Illustratively, the prompt information is "Release to enter the album page".
Here, displaying the prompt information in response to the user's long-press operation on the first multimedia resource includes: in response to the user's long-press operation on the first multimedia resource, displaying the prompt information if the execution duration of the long-press operation is not less than a second duration threshold. The second duration threshold is set based on experience or adjusted according to the implementation environment, which is not limited in the embodiments of this application. Illustratively, the second duration threshold is 3 milliseconds.
Fig. 7 is a schematic diagram of another prompt message provided in an embodiment of the present application. In fig. 7, a long-press operation of the user on the first multimedia resource is received, and the execution duration of the long-press operation is not less than the second duration threshold, so prompt information is displayed; the prompt information is "Release to enter the album page".
Optionally, the display position of the first multimedia resource in the second page is before the display positions of the second multimedia resources, and the first multimedia resource is displayed at a larger size than the second multimedia resources. When a plurality of second multimedia resources are displayed in the second page, they are displayed in descending order of quality score, that is, a second multimedia resource with a higher quality score is displayed before one with a lower quality score.
Fig. 8 is a schematic diagram of displaying a second page according to an embodiment of the present application. In fig. 8, a first multimedia resource 801, a second multimedia resource 1, a second multimedia resource 2, a second multimedia resource 3, and a second multimedia resource 4 are displayed. The first multimedia resource is larger than the second multimedia resources, and its display position is before theirs. Because the quality score of second multimedia resource 1 is higher than that of second multimedia resource 2, the quality score of second multimedia resource 2 is higher than that of second multimedia resource 3, and the quality score of second multimedia resource 3 is higher than that of second multimedia resource 4, second multimedia resource 1 is displayed before second multimedia resource 2, second multimedia resource 2 before second multimedia resource 3, and second multimedia resource 3 before second multimedia resource 4.
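The layout rule illustrated by fig. 8 (the triggered resource first and largest, the remaining same-category resources in descending quality-score order) can be sketched in Kotlin as follows; the data class, identifiers, and scores are illustrative assumptions.

```kotlin
// Sketch of building the display order for the second page.
data class MediaResource(val id: String, val category: String, val qualityScore: Int)

// The triggered (first) resource is placed first; the other resources of the same
// category follow, sorted from high to low quality score.
fun secondPageOrder(first: MediaResource, candidates: List<MediaResource>): List<MediaResource> {
    val sameCategory = candidates
        .filter { it.category == first.category && it.id != first.id }
        .sortedByDescending { it.qualityScore }
    return listOf(first) + sameCategory
}

fun main() {
    val first = MediaResource("801", "environment", 90)
    val others = listOf(
        MediaResource("r3", "environment", 75),
        MediaResource("r1", "environment", 88),
        MediaResource("r2", "environment", 80),
        MediaResource("f1", "facility", 95)   // different category, not shown on this tab
    )
    // Expected order: 801, r1 (88), r2 (80), r3 (75)
    println(secondPageOrder(first, others).map { it.id })
}
```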
Optionally, a label of each multimedia resource may also be displayed in the second page, where the label indicates the source of the resource. For example, the "merchant upload" label displayed on the first multimedia resource 801 in fig. 8 is the label of the first multimedia resource. The labels of the other multimedia resources are shown in fig. 8 and are not described in detail here.
Optionally, a first control and a second control are further displayed in the second page, the first control is used for representing a first resource category, the second control is used for representing a second resource category, the first resource category is a resource category of the first multimedia resource, and the second resource category is different from the first resource category. As in fig. 8, 802 is a first control and 803 is a second control. The display state of the first control is a first state, the display state of the second control is a second state, the first state is used for indicating that the multimedia resource displayed in the second page is the multimedia resource of the first resource category, and the second state is used for indicating that the multimedia resource displayed in the second page is not the multimedia resource of the second resource category. As shown in fig. 8, the display state of the first control being a first state means that the first control is displayed in gray, and the display state of the second control being a second state means that the second control is displayed in white. Of course, the display form of the first control in the first state and the display form of the second control in the second state may also be other forms, which are not limited in the embodiment of the present application.
In one possible implementation, after the second page is displayed, the multimedia resources displayed in the second page are updated in response to a first trigger operation of the user on the second page, where the resource category of the multimedia resources displayed in the updated second page is the same as that of the first multimedia resource, and the updated second page includes multimedia resources different from the second multimedia resources.
Wherein the first trigger operation for the second page includes a slide-up operation for the second page or a slide-down operation for the second page.
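A minimal sketch of this update step: on a slide-up (or slide-down), more not-yet-shown resources of the same category are appended to the page. The page size, type names, and in-memory paging here are assumptions for illustration.

```kotlin
// Illustrative sketch of updating the second page with more same-category resources.
data class MediaItem(val id: String, val category: String, val qualityScore: Int)

class SecondPage(private val all: List<MediaItem>, private val category: String) {
    private val shown = mutableListOf<MediaItem>()

    // Append the next `pageSize` resources of the current category that are not shown yet,
    // in descending quality-score order (matching the display order described above).
    fun loadMore(pageSize: Int = 4): List<MediaItem> {
        val next = all
            .filter { it.category == category && it !in shown }
            .sortedByDescending { it.qualityScore }
            .take(pageSize)
        shown += next
        return shown.toList()
    }
}

fun main() {
    val all = (1..6).map { MediaItem("env$it", "environment", 100 - it) } +
              listOf(MediaItem("fac1", "facility", 99))
    val page = SecondPage(all, "environment")
    println(page.loadMore().map { it.id })  // first batch: env1..env4
    println(page.loadMore().map { it.id })  // after a slide-up: env1..env6 (env5, env6 added)
}
```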
Fig. 9 is a schematic diagram of displaying an updated second page according to an embodiment of the present application. In fig. 9 there is shown a second multimedia asset 1, a second multimedia asset 2, a second multimedia asset 3, a second multimedia asset 4, a second multimedia asset 5 and a second multimedia asset 6. Wherein the second multimedia asset 5 and the second multimedia asset 6 are multimedia assets of the same asset class as the first multimedia asset, but different from the second multimedia asset.
In one possible implementation, after the second page is displayed, in response to a second trigger operation of the user on the second page, the first multimedia resource and the second multimedia resource are canceled from being displayed in the second page, and the third multimedia resource is displayed in the second page, where a resource class of the third multimedia resource is different from a resource class of the second multimedia resource.
Wherein the second trigger operation for the second page includes a left-slide operation for the second page or a right-slide operation for the second page.
Optionally, in response to a left-sliding operation of the user on the second page, the first multimedia resource and the second multimedia resource are canceled from being displayed in the second page, and the multimedia resource of the resource class indicated by the control located on the right side of the first control and adjacent to the first control is displayed in the second page. And changing the display state of the first control to a second state, and changing the display state of the control which is positioned on the right side of the first control and adjacent to the first control to a first state.
Fig. 10 is a schematic diagram of a page change after a left-sliding operation for a second page according to an embodiment of the present application. In fig. 10 (1), a second page is displayed, in which the "environment" control is displayed in gray, the "facility" control is displayed in white, and the resource category of the multimedia resource displayed in the second page is "environment". In response to the left-sliding operation for the second page, fig. 10 (2) is displayed, the categories of the multimedia resource 1 to the multimedia resource 5 displayed in fig. 10 (2) are all "facilities", and the "facilities" control is displayed in gray, and the "environment" control is displayed in "white".
The page displayed after the right-hand sliding operation for the second page is similar to the page displayed after the left-hand sliding operation for the second page, and will not be described here again.
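The category switch on a left or right slide can be thought of as moving a selection index over the ordered category controls and re-rendering the page. The sketch below models only that state change; the control names, the rendering callback, and the class structure are assumptions for illustration.

```kotlin
// Illustrative sketch of switching resource categories on left/right slide.
class CategoryTabs(private val categories: List<String>, private val render: (String) -> Unit) {
    private var selected = 0

    // A left slide selects the control to the right of the current one (if any);
    // a right slide selects the control to the left.
    fun onLeftSlide()  = select(selected + 1)
    fun onRightSlide() = select(selected - 1)

    private fun select(index: Int) {
        if (index !in categories.indices || index == selected) return
        selected = index
        // Conceptually: the previously selected control returns to the "second state"
        // (e.g. white), the newly selected control is shown in the "first state"
        // (e.g. gray), and the page re-renders resources of the new category.
        render(categories[selected])
    }
}

fun main() {
    val tabs = CategoryTabs(listOf("environment", "facility")) { category ->
        println("now showing resources of category: $category")
    }
    tabs.onLeftSlide()   // environment -> facility
    tabs.onRightSlide()  // facility -> environment
}
```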
Optionally, in response to the triggering operation of the user on the second control, the first multimedia resource and the second multimedia resource are canceled from being displayed in the second page, and the fourth multimedia resource is displayed in the second page, where the fourth multimedia resource is a multimedia resource of the second resource class.
And responding to the triggering operation of the user on the second control, changing the display state of the first control into a second state, and changing the display state of the second control into a first state.
Optionally, the second control is a "facility" control, and in response to a triggering operation of the user on the second control, the first multimedia resource and the second multimedia resource are canceled to be displayed in the second page, and the fourth multimedia resource is displayed in the second page, and the resource category of the fourth multimedia resource is "facility". The "environment" control is displayed as white and the "facilities" control is displayed as "gray".
Fig. 11 is a schematic diagram of the page displayed after a trigger operation on the second control according to an embodiment of the present application. In fig. 11, the first multimedia resource and the second multimedia resources are no longer shown; the fourth multimedia resources (multimedia resource 1 to multimedia resource 5) are shown, the "environment" control is displayed in white, and the "facilities" control is displayed in gray.
According to the method, a first multimedia resource is displayed in a first page. In response to a target trigger operation performed by the user on the first multimedia resource, a second page is displayed in which the first multimedia resource is shown together with second multimedia resources of the same resource category. More multimedia resources are therefore displayed in the second page and the content is richer, which further deepens the user's understanding of the detail page and improves the user's conversion rate. Moreover, the second page can be displayed directly through a single target trigger operation on the first multimedia resource, which improves the display efficiency of the second page and of the multimedia resources displayed in it.
Fig. 12 is a schematic structural diagram of a device for displaying and interacting multimedia resources according to an embodiment of the present application, where, as shown in fig. 12, the device includes:
the display module 1201 is configured to display a first page of the detail page, where a first multimedia resource is displayed;
the display module 1201 is further configured to display a second page in response to a target trigger operation of the user on the first multimedia resource, where the first multimedia resource and the second multimedia resource are displayed in the second page, and a resource class of the second multimedia resource is the same as a resource class of the first multimedia resource.
In one possible implementation, the target trigger operation on the first multimedia resource includes any one of a pull-down operation, a click operation, a double-click operation, or a long-press operation on the first multimedia resource.
In one possible implementation, the target trigger operation for the first multimedia resource is a pull-down operation for the first multimedia resource;
the display module 1201 is configured to respond to a pull-down operation of the user on the first multimedia resource, and display prompt information, where the prompt information is used to instruct the user to end the pull-down operation; and displaying the second page in response to the end of the pull-down operation.
In one possible implementation, the apparatus further includes:
and an updating module 1202, configured to update, in response to a first trigger operation of the user on the second page, the multimedia resources displayed in the second page, where the resource category of the multimedia resources displayed in the updated second page is the same as that of the first multimedia resource, and the updated second page includes multimedia resources different from the second multimedia resources.
In one possible implementation, the first trigger operation for the second page includes a slide-up operation for the second page or a slide-down operation for the second page.
In one possible implementation, the display module 1201 is configured to cancel displaying the first multimedia resource and the second multimedia resource in the second page in response to a second trigger operation of the user on the second page, and display a third multimedia resource in the second page, where a resource class of the third multimedia resource is different from a resource class of the second multimedia resource.
In one possible implementation, the second trigger operation for the second page includes a left-slide operation for the second page or a right-slide operation for the second page.
In one possible implementation manner, a first control and a second control are further displayed in the second page, the first control is used for representing a first resource category, the second control is used for representing a second resource category, the first resource category is a resource category of the first multimedia resource, and the second resource category is different from the first resource category;
The display module 1201 is further configured to cancel displaying the first multimedia resource and the second multimedia resource in the second page in response to a triggering operation of the user on the second control, and display a fourth multimedia resource in the second page, where the fourth multimedia resource is a multimedia resource of the second resource class.
According to the device, a first multimedia resource is displayed in a first page. In response to a target trigger operation performed by the user on the first multimedia resource, a second page is displayed in which the first multimedia resource is shown together with second multimedia resources of the same resource category. More multimedia resources are therefore displayed in the second page and the content is richer, which further deepens the user's understanding of the detail page and improves the user's conversion rate. In addition, the second page can be displayed directly through a single target trigger operation on the first multimedia resource, which improves the display efficiency of the second page and of the multimedia resources displayed in it.
It should be noted that when the apparatus provided in the above embodiment implements its functions, the division into the above functional modules is merely used as an example. In practical applications, the above functions may be assigned to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to implement all or part of the functions described above. In addition, the apparatus provided by the foregoing embodiment and the corresponding method embodiment belong to the same concept; for its specific implementation process, refer to the method embodiment, which is not repeated here.
Fig. 13 shows a block diagram of a terminal apparatus 1300 according to an exemplary embodiment of the present application. The terminal apparatus 1300 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal apparatus 1300 may also be referred to by other names such as user device, portable terminal, laptop terminal, or desktop terminal.
In general, the terminal apparatus 1300 includes: a processor 1301, and a memory 1302.
Processor 1301 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like. Processor 1301 may be implemented in at least one hardware form of DSP (Digital Signal Processing ), FPGA (Field-Programmable Gate Array, field programmable gate array), PLA (Programmable Logic Array ). Processor 1301 may also include a main processor, which is a processor for processing data in an awake state, also called a CPU (Central Processing Unit ), and a coprocessor; a coprocessor is a low-power processor for processing data in a standby state. In some embodiments, processor 1301 may integrate a GPU (Graphics Processing Unit, image processor) for taking care of rendering and rendering of content that the display screen needs to display. In some embodiments, the processor 1301 may also include an AI (Artificial Intelligence ) processor for processing computing operations related to machine learning.
Memory 1302 may include one or more computer-readable storage media, which may be non-transitory. Memory 1302 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1302 is used to store at least one instruction for execution by processor 1301 to implement the method of displaying and interacting with a multimedia asset provided by an embodiment of the method of the present application.
In some embodiments, the terminal device 1300 may further optionally include: a peripheral interface 1303 and at least one peripheral. The processor 1301, the memory 1302, and the peripheral interface 1303 may be connected by a bus or signal lines. The respective peripheral devices may be connected to the peripheral device interface 1303 through a bus, a signal line, or a circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1304, a display screen 1305, a camera assembly 1306, audio circuitry 1307, and a power supply 1309.
A peripheral interface 1303 may be used to connect I/O (Input/Output) related at least one peripheral to the processor 1301 and the memory 1302. In some embodiments, processor 1301, memory 1302, and peripheral interface 1303 are integrated on the same chip or circuit board; in some other embodiments, either or both of the processor 1301, the memory 1302, and the peripheral interface 1303 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.
The Radio Frequency circuit 1304 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 1304 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 1304 converts an electrical signal to an electromagnetic signal for transmission, or converts a received electromagnetic signal to an electrical signal. Optionally, the radio frequency circuit 1304 includes: antenna systems, RF transceivers, one or more amplifiers, tuners, oscillators, digital signal processors, codec chipsets, subscriber identity module cards, and so forth. The radio frequency circuit 1304 may communicate with other terminal devices via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: the world wide web, metropolitan area networks, intranets, generation mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity ) networks. In some embodiments, the radio frequency circuit 1304 may also include NFC (Near Field Communication ) related circuits, which the present application is not limited to.
The display screen 1305 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1305 is a touch display, it can also capture touch signals on or above its surface. Such a touch signal may be input to the processor 1301 as a control signal for processing. In this case, the display screen 1305 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there is one display screen 1305, disposed on the front panel of the terminal device 1300; in other embodiments, there are at least two display screens 1305, disposed on different surfaces of the terminal device 1300 or in a folded design; in still other embodiments, the display screen 1305 may be a flexible display disposed on a curved or folded surface of the terminal device 1300. The display screen 1305 may even be arranged in a non-rectangular, irregular pattern, that is, a shaped screen. The display screen 1305 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
The camera assembly 1306 is used to capture images or video. Optionally, the camera assembly 1306 includes a front camera and a rear camera. In general, the front camera is provided on the front panel of the terminal device 1300, and the rear camera is provided on the rear surface of the terminal device 1300. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused for a background blurring function, and the main camera and the wide-angle camera can be fused for panoramic shooting, Virtual Reality (VR) shooting, or other fused shooting functions. In some embodiments, the camera assembly 1306 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash combines a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuit 1307 may include a microphone and a speaker. The microphone collects sound waves from the user and the environment, converts them into electrical signals, and inputs them to the processor 1301 for processing, or to the radio frequency circuit 1304 for voice communication. For stereo acquisition or noise reduction, a plurality of microphones may be disposed at different positions of the terminal device 1300. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker converts electrical signals from the processor 1301 or the radio frequency circuit 1304 into sound waves. The speaker may be a conventional thin-film speaker or a piezoelectric ceramic speaker. A piezoelectric ceramic speaker can convert electrical signals not only into sound waves audible to humans but also into sound waves inaudible to humans, for ranging and other purposes. In some embodiments, the audio circuit 1307 may also include a headphone jack.
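As a minimal sketch, assuming an Android-style terminal that exposes the microphone through the platform AudioRecord API and that the RECORD_AUDIO permission has already been granted, the collection of sound waves as digital samples could look roughly as follows in Kotlin; onSamplesCaptured is a hypothetical callback standing in for handing the data to the processor 1301 or the radio frequency circuit 1304:

    import android.media.AudioFormat
    import android.media.AudioRecord
    import android.media.MediaRecorder

    // Capture one buffer of PCM samples from the microphone; a real implementation
    // would loop on a worker thread and handle error codes from read().
    fun captureAudioOnce(onSamplesCaptured: (ShortArray) -> Unit) {
        val sampleRate = 44_100
        val bufferSize = AudioRecord.getMinBufferSize(
            sampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT
        )
        val recorder = AudioRecord(
            MediaRecorder.AudioSource.MIC,            // sound waves from the microphone
            sampleRate, AudioFormat.CHANNEL_IN_MONO,
            AudioFormat.ENCODING_PCM_16BIT, bufferSize
        )
        val buffer = ShortArray(bufferSize)
        recorder.startRecording()
        val read = recorder.read(buffer, 0, buffer.size) // electrical signal as digital samples
        if (read > 0) onSamplesCaptured(buffer.copyOf(read))
        recorder.stop()
        recorder.release()
    }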
A power supply 1309 is used to power the various components of the terminal device 1300. The power supply 1309 may be alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 1309 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery: a wired rechargeable battery is charged through a wired line, and a wireless rechargeable battery is charged through a wireless coil. The rechargeable battery may also support fast-charging technology.
In some embodiments, the terminal device 1300 also includes one or more sensors 1310. The one or more sensors 1310 include, but are not limited to: acceleration sensor 1311, gyroscope sensor 1312, pressure sensor 1313, optical sensor 1315, and proximity sensor 1316.
The acceleration sensor 1311 can detect the magnitude of acceleration on each of the three coordinate axes of a coordinate system established with respect to the terminal device 1300. For example, the acceleration sensor 1311 may be used to detect the components of gravitational acceleration on the three coordinate axes. The processor 1301 may control the display screen 1305 to display the user interface in a landscape view or a portrait view based on the gravitational acceleration signal acquired by the acceleration sensor 1311. The acceleration sensor 1311 may also be used for game control or for collecting user motion data.
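As a minimal sketch, assuming an Android-style terminal and the platform sensor API, the landscape/portrait decision described above could be approximated as follows; applyOrientation is a hypothetical helper that the host application would provide:

    import android.content.Context
    import android.hardware.Sensor
    import android.hardware.SensorEvent
    import android.hardware.SensorEventListener
    import android.hardware.SensorManager
    import kotlin.math.abs

    class OrientationWatcher(
        context: Context,
        private val applyOrientation: (landscape: Boolean) -> Unit  // hypothetical helper
    ) : SensorEventListener {

        private val sensorManager =
            context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

        fun start() {
            val accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER) ?: return
            sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_NORMAL)
        }

        fun stop() = sensorManager.unregisterListener(this)

        override fun onSensorChanged(event: SensorEvent) {
            // event.values holds the gravity components on the x, y and z axes.
            val x = event.values[0]
            val y = event.values[1]
            // Gravity mostly along the x axis means the device is held sideways,
            // so the user interface can be laid out in landscape; otherwise portrait.
            applyOrientation(abs(x) > abs(y))
        }

        override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
    }

In practice the raw readings would typically be low-pass filtered and debounced before the layout is switched.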
The gyroscope sensor 1312 may detect the body orientation and rotation angle of the terminal device 1300, and may cooperate with the acceleration sensor 1311 to collect the user's 3D motion with respect to the terminal device 1300. Based on the data collected by the gyroscope sensor 1312, the processor 1301 can implement the following functions: motion sensing (for example, changing the UI according to a tilting operation by the user), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1313 may be disposed on a side frame of the terminal device 1300 and/or beneath the display screen 1305. When the pressure sensor 1313 is disposed on a side frame of the terminal device 1300, it can detect the user's grip signal on the terminal device 1300, and the processor 1301 performs left/right-hand recognition or shortcut operations according to the grip signal collected by the pressure sensor 1313. When the pressure sensor 1313 is disposed beneath the display screen 1305, the processor 1301 controls the operability controls on the UI according to the user's pressure operation on the display screen 1305. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
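As a minimal sketch, assuming an Android-style terminal where touch pressure is reported through MotionEvent.getPressure(), a firm pressure operation on the display could be distinguished from an ordinary tap roughly as follows; the threshold and both callbacks are hypothetical, and the reported pressure scale is device dependent:

    import android.view.MotionEvent
    import android.view.View

    private const val HARD_PRESS_THRESHOLD = 0.8f   // hypothetical, device dependent

    fun bindPressureControl(view: View, onHardPress: () -> Unit, onTap: () -> Unit) {
        view.setOnTouchListener { _, event ->
            if (event.action == MotionEvent.ACTION_UP) {
                // getPressure() reports how firmly the user pressed the screen.
                if (event.pressure > HARD_PRESS_THRESHOLD) onHardPress() else onTap()
            }
            true   // consume the event
        }
    }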
The optical sensor 1315 is used to collect the ambient light intensity. In one embodiment, the processor 1301 may control the display brightness of the display screen 1305 based on the ambient light intensity collected by the optical sensor 1315: when the ambient light intensity is high, the display brightness of the display screen 1305 is turned up; when the ambient light intensity is low, the display brightness of the display screen 1305 is turned down. In another embodiment, the processor 1301 may also dynamically adjust the shooting parameters of the camera assembly 1306 based on the ambient light intensity collected by the optical sensor 1315.
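As a minimal sketch, assuming an Android-style terminal with an ambient light sensor, the brightness adjustment described above could be approximated as follows; the lux-to-brightness mapping is an arbitrary assumption:

    import android.app.Activity
    import android.content.Context
    import android.hardware.Sensor
    import android.hardware.SensorEvent
    import android.hardware.SensorEventListener
    import android.hardware.SensorManager

    fun followAmbientLight(activity: Activity) {
        val sensorManager = activity.getSystemService(Context.SENSOR_SERVICE) as SensorManager
        val lightSensor = sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT) ?: return
        val listener = object : SensorEventListener {
            override fun onSensorChanged(event: SensorEvent) {
                val lux = event.values[0]                       // ambient light intensity in lux
                // Brighter surroundings -> turn the display brightness up; dim surroundings -> down.
                val brightness = (lux / 1000f).coerceIn(0.05f, 1f)
                val params = activity.window.attributes
                params.screenBrightness = brightness
                activity.window.attributes = params             // applies to this window only
            }
            override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
        }
        sensorManager.registerListener(listener, lightSensor, SensorManager.SENSOR_DELAY_NORMAL)
    }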
The proximity sensor 1316, also referred to as a distance sensor, is typically provided on the front panel of the terminal device 1300. The proximity sensor 1316 is used to collect the distance between the user and the front face of the terminal device 1300. In one embodiment, when the proximity sensor 1316 detects that the distance between the user and the front face of the terminal device 1300 gradually decreases, the processor 1301 controls the display screen 1305 to switch from the screen-on state to the screen-off state; when the proximity sensor 1316 detects that this distance gradually increases, the processor 1301 controls the display screen 1305 to switch from the screen-off state to the screen-on state.
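As a minimal sketch, assuming an Android-style terminal holding the WAKE_LOCK permission, the proximity-driven screen-off/screen-on behavior can be delegated to the system through a proximity wake lock; the tag and timeout below are hypothetical:

    import android.content.Context
    import android.os.PowerManager

    // While this wake lock is held, the system turns the display off when the proximity
    // sensor reports that the user is close to the front face (for example during a call)
    // and turns it back on when the distance increases again.
    fun holdProximityWakeLock(context: Context): PowerManager.WakeLock? {
        val powerManager = context.getSystemService(Context.POWER_SERVICE) as PowerManager
        if (!powerManager.isWakeLockLevelSupported(PowerManager.PROXIMITY_SCREEN_OFF_WAKE_LOCK)) {
            return null
        }
        val wakeLock = powerManager.newWakeLock(
            PowerManager.PROXIMITY_SCREEN_OFF_WAKE_LOCK,
            "example:proximity"                    // hypothetical tag
        )
        wakeLock.acquire(10 * 60 * 1000L)          // safety timeout; caller calls release() earlier
        return wakeLock
    }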
It will be appreciated by those skilled in the art that the structure shown in Fig. 13 is not limiting; more or fewer components than shown may be included, certain components may be combined, or a different arrangement of components may be employed.
Fig. 14 is a schematic structural diagram of a server according to an embodiment of the present application. The server 1400 may vary considerably in configuration or performance, and may include one or more processors (Central Processing Units, CPU) 1401 and one or more memories 1402, where at least one program code is stored in the one or more memories 1402 and is loaded and executed by the one or more processors 1401 to implement the method for displaying and interacting with multimedia resources provided by the foregoing method embodiments. Of course, the server 1400 may also have a wired or wireless network interface, a keyboard, and an input/output interface for input and output, and the server 1400 may also include other components for implementing device functions, which are not described here.
In an exemplary embodiment, there is also provided a computer-readable storage medium having stored therein at least one program code, which is loaded and executed by a processor to cause a computer to implement the method of displaying and interacting with multimedia resources according to any of the above.
Alternatively, the above-mentioned computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a compact disc read-only memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, there is also provided a computer program or computer program product having stored therein at least one computer instruction, which is loaded and executed by a processor to cause a computer to implement the method of displaying and interacting with multimedia resources according to any of the above.
It should be noted that the information (including but not limited to user equipment information, user personal information, and the like), data (including but not limited to data used for analysis, stored data, displayed data, and the like), and signals involved in the present application are all authorized by the user or fully authorized by all parties, and the collection, use, and processing of the related data comply with the relevant laws, regulations, and standards of the relevant countries and regions. For example, merchant information is obtained under the condition of full authorization.
It should be understood that references herein to "a plurality" mean two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may indicate that A exists alone, that A and B exist together, or that B exists alone. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
The above embodiments are merely exemplary embodiments of the present application and are not intended to limit the present application. Any modifications, equivalent substitutions, improvements, and the like made within the principles of the present application shall fall within the protection scope of the present application.

Claims (10)

1. A method for displaying and interacting with a multimedia resource, the method comprising:
displaying a first page of a detail page, wherein a first multimedia resource is displayed in the first page;
and in response to a target triggering operation of the user on the first multimedia resource, displaying a second page, wherein the first multimedia resource and a second multimedia resource are displayed in the second page, and a resource category of the second multimedia resource is the same as that of the first multimedia resource.
2. The method of claim 1, wherein the target triggering operation on the first multimedia resource comprises any one of a pull-down operation on the first multimedia resource, a click operation on the first multimedia resource, a double-click operation on the first multimedia resource, or a long-press operation on the first multimedia resource.
3. The method of claim 1, wherein the target triggering operation on the first multimedia resource is a pull-down operation on the first multimedia resource;
wherein the displaying a second page in response to the target triggering operation of the user on the first multimedia resource comprises:
in response to the pull-down operation of the user on the first multimedia resource, displaying prompt information, wherein the prompt information is used to instruct the user to end the pull-down operation;
and displaying the second page in response to the end of the pull-down operation.
4. A method according to any one of claims 1 to 3, wherein after the second page is displayed, the method further comprises:
and in response to a first triggering operation of the user on the second page, updating the multimedia resources displayed in the second page, wherein the resource category of the multimedia resources displayed in the updated second page is the same as that of the first multimedia resource, and those multimedia resources are different from the second multimedia resource.
5. The method of claim 4, wherein the first trigger operation for the second page comprises a slide up operation for the second page or a slide down operation for the second page.
6. A method according to any one of claims 1 to 3, wherein after the second page is displayed, the method further comprises:
and in response to a second triggering operation of the user on the second page, canceling the display of the first multimedia resource and the second multimedia resource in the second page, and displaying a third multimedia resource in the second page, wherein the resource category of the third multimedia resource is different from that of the second multimedia resource.
7. The method of claim 6, wherein the second trigger operation for the second page comprises a left-slide operation for the second page or a right-slide operation for the second page.
8. A method according to any one of claims 1 to 3, wherein a first control and a second control are further displayed in the second page, the first control being used for representing a first resource category, the second control being used for representing a second resource category, the first resource category being a resource category of the first multimedia resource, the second resource category being different from the first resource category;
the method further comprises:
and in response to the triggering operation of the user on the second control, canceling the display of the first multimedia resource and the second multimedia resource in the second page, and displaying a fourth multimedia resource in the second page, wherein the fourth multimedia resource is the multimedia resource of the second resource category.
9. A device for displaying and interacting with multimedia resources, the device comprising:
the display module is used for displaying a first page of the detail page, and a first multimedia resource is displayed in the first page;
the display module is further configured to display a second page in response to a target triggering operation of the user on the first multimedia resource, where the first multimedia resource and the second multimedia resource are displayed in the second page, and a resource class of the second multimedia resource is the same as a resource class of the first multimedia resource.
10. A computer device, characterized in that it comprises a processor and a memory, wherein at least one program code is stored in the memory and is loaded and executed by the processor to cause the computer device to implement the method of displaying and interacting with multimedia resources according to any one of claims 1 to 8.
CN202311071160.0A 2023-08-23 2023-08-23 Method, device and equipment for displaying and interacting multimedia resources Pending CN117055788A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311071160.0A CN117055788A (en) 2023-08-23 2023-08-23 Method, device and equipment for displaying and interacting multimedia resources

Publications (1)

Publication Number Publication Date
CN117055788A true CN117055788A (en) 2023-11-14

Family

ID=88653301

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311071160.0A Pending CN117055788A (en) 2023-08-23 2023-08-23 Method, device and equipment for displaying and interacting multimedia resources

Country Status (1)

Country Link
CN (1) CN117055788A (en)

Similar Documents

Publication Publication Date Title
CN109948581B (en) Image-text rendering method, device, equipment and readable storage medium
CN111368114B (en) Information display method, device, equipment and storage medium
CN110929159B (en) Resource release method, device, equipment and medium
CN111126958B (en) Schedule creation method, schedule creation device, schedule creation equipment and storage medium
CN110045958B (en) Texture data generation method, device, storage medium and equipment
CN112860046A (en) Method, apparatus, electronic device and medium for selecting operation mode
CN111694535B (en) Alarm clock information display method and device
CN113535039B (en) Method and device for updating page, electronic equipment and computer readable storage medium
CN112230822B (en) Comment information display method and device, terminal and storage medium
CN110852093A (en) Text information generation method and device, computer equipment and storage medium
CN112732133B (en) Message processing method and device, electronic equipment and storage medium
CN117055788A (en) Method, device and equipment for displaying and interacting multimedia resources
CN115379274B (en) Picture-based interaction method and device, electronic equipment and storage medium
CN110781343B (en) Method, device, equipment and storage medium for displaying comment information of music
CN115841181B (en) Residual oil distribution prediction method, device, equipment and storage medium
CN113220203B (en) Activity entry display method, device, terminal and storage medium
CN116820318A (en) Content display method, device, equipment and computer readable storage medium
CN117336560A (en) Method, device and equipment for acquiring redemption ticket
CN117539371A (en) Text content display method, apparatus, device and computer readable storage medium
CN116934319A (en) Resource transfer method
CN117942570A (en) Virtual object interaction method, device, equipment and computer readable storage medium
CN117354596A (en) Redemption ticket acquisition method, device, equipment and computer readable storage medium
CN117763232A (en) Resource recommendation method, device, equipment and computer readable storage medium
CN117354571A (en) Live broadcast room management method and equipment
CN116595240A (en) Media content release method, media content display method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination