CN113626127A - Content item display method, device, terminal and storage medium

Content item display method, device, terminal and storage medium

Info

Publication number
CN113626127A
CN113626127A
Authority
CN
China
Prior art keywords
interface
content item
target
controlling
interface element
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110959710.7A
Other languages
Chinese (zh)
Inventor
陈文华
何嘉霖
黎锦昌
王景韩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Kugou Computer Technology Co Ltd
Original Assignee
Guangzhou Kugou Computer Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Kugou Computer Technology Co Ltd filed Critical Guangzhou Kugou Computer Technology Co Ltd
Priority to CN202110959710.7A priority Critical patent/CN113626127A/en
Publication of CN113626127A publication Critical patent/CN113626127A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/70Information retrieval; Database structures therefor; File system structures therefor of video data
    • G06F16/74Browsing; Visualisation therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0277Online advertisement

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Strategic Management (AREA)
  • Multimedia (AREA)
  • Finance (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Development Economics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Databases & Information Systems (AREA)
  • Game Theory and Decision Science (AREA)
  • Data Mining & Analysis (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a content item display method, a content item display device, a terminal and a storage medium, and belongs to the technical field of the internet. The method comprises the following steps: displaying a target interface, the target interface comprising interface elements; under the condition that a content item display condition is met, acquiring a content item to be recommended, wherein the content item at least comprises a video of an object to be recommended; controlling the interface elements in the target interface to disappear; and playing the video in the content item in the target interface after the interface elements disappear. The scheme greatly improves the user's attention to the content item, thereby improving the display effect of the content item.

Description

Content item display method, device, terminal and storage medium
Technical Field
The present application relates to the field of internet technologies, and in particular, to a method, an apparatus, a terminal, and a storage medium for displaying content items.
Background
With the continuous development of internet technology, advertisers can launch advertisements on various platforms such as application clients and websites, so that users can browse the advertisements on the relevant platforms, and the exposure rate of the advertisements is increased.
However, the advertisement is typically presented in the lower right-hand corner of the interface viewed by the user, and the user often does not notice it while viewing the interface. Therefore, the advertisement has a poor display effect and the exposure rate cannot be increased.
Disclosure of Invention
The embodiment of the application provides a content item display method, a content item display device, a terminal and a storage medium, which can improve the attention of a user to a content item and improve the display effect of the content item. The technical scheme is as follows:
in one aspect, a method for presenting a content item is provided, the method comprising:
displaying a target interface, the target interface comprising interface elements;
under the condition that a content item display condition is met, acquiring a content item to be recommended, wherein the content item at least comprises a video of an object to be recommended;
controlling interface elements in the target interface to disappear;
playing a video in the content item in the target interface after the interface element disappears.
In one aspect, there is provided a content item presentation apparatus, the apparatus comprising:
a display module to display a target interface, the target interface including interface elements;
an acquisition module, configured to acquire a content item to be recommended under the condition that a content item display condition is met, wherein the content item at least comprises a video of an object to be recommended;
the display module is further used for controlling interface elements in the target interface to disappear;
the display module is further configured to play the video in the content item in the target interface after the interface element disappears.
In one possible implementation, the display module includes:
the obtaining unit is used for obtaining a disappearing special effect parameter corresponding to the display mode based on the display mode of the interface element;
and the display unit is used for controlling the interface element to disappear based on the acquired disappearing special effect parameter.
In one possible implementation, the display unit is configured to perform any one of the following steps:
the display mode of the interface element is a first display mode, the disappearing special effect parameter corresponding to the first display mode is a first disappearing special effect parameter, the interface element is controlled to move out of the target interface based on the first disappearing special effect parameter, and the transparency of the interface element is controlled to be gradually reduced to 0 in the process of moving out of the target interface;
the display mode of the interface element is a second display mode, the disappearing special effect parameter corresponding to the second display mode is a second disappearing special effect parameter, the interface element is controlled to be gradually reduced until disappearing based on the second disappearing special effect parameter, and the transparency of the interface element is controlled to be reduced to 0 in the reducing process.
In one possible implementation, the display module is configured to perform at least one of:
controlling the interface element to move out of the target interface;
controlling the transparency of the interface element to gradually decrease until the transparency is 0;
and controlling the interface element to gradually shrink until the interface element disappears.
In one possible implementation manner, the display module is further configured to control the interface element to be restored into the target interface.
In one possible implementation, the display module includes:
the acquisition unit is used for acquiring a special effect recovery parameter corresponding to a display mode based on the display mode of the interface element;
and the display unit is used for controlling the interface element to be restored into the target interface based on the acquired special effect restoration parameters.
In one possible implementation manner, the display unit is configured to perform any one of the following steps:
the display mode of the interface element is a first display mode, the recovery special effect parameter corresponding to the first display mode is a first recovery special effect parameter, the interface element is controlled to move back to the target interface based on the first recovery special effect parameter, and the transparency of the interface element is controlled to be increased to 1 in the process of moving back to the target interface;
the display mode of the interface element is a second display mode, the recovery special effect parameter corresponding to the second display mode is a second recovery special effect parameter, the interface element is controlled to be gradually enlarged to a target size based on the second recovery special effect parameter, in the enlargement process, the transparency of the interface element is controlled to be increased to 1, and the target size is the initial size of the interface element in the target interface.
In one possible implementation, the display module is further configured to perform at least one of:
controlling the interface element to move from a current position to an initial position of the interface element in the target interface;
controlling the interface element to be enlarged from the current size to the initial size of the interface element in the target interface;
and controlling the interface element to increase from the current transparency to 1.
In a possible implementation manner, the display module is further configured to display a last video frame of the video after the video is played; and controlling the video frame to move from the current position to the target position.
In a possible implementation manner, the display module is configured to control the video frame to move from the current position to the target position, and control the video frame to gradually shrink during the moving process.
In a possible implementation manner, the display module is configured to control the video frame to move from the current position to the target position, and in the moving process, control the video frame to gradually shrink until disappearing.
In one possible implementation, the content item further includes descriptive information of the object; the display module is further used for controlling the video frames to disappear; and displaying the description information in the content item in the area to which the target position belongs.
In one possible implementation, the content item further includes a logo image of the object; the display module is further used for controlling the video frames to disappear; displaying a logo image in the content item at the target location.
In one possible implementation manner, the display module is configured to perform any one of the following steps:
controlling the logo image to disappear after popping up at the target position;
and controlling the logo image to be restored after popping up at the target position.
In one possible implementation manner, the obtaining module is configured to asynchronously request to obtain the content item in response to a trigger operation on the target interface, and load the content item to the local; the content item is retrieved from the local.
In one possible implementation, the content item presentation condition includes at least one of:
the data type of the content item indicates that the presentation style of the content item is a target presentation style;
the current time is the presentation time indicated by the time configuration information of the content item, and the time configuration information includes the duration of waiting of the content item before presentation.
In one possible implementation, the interface element includes a view element and a tool element; and the display module is used for controlling the view elements in the target interface to disappear and keeping the tool elements in the target interface unchanged.
In one possible implementation, the apparatus further includes:
and the updating module is used for updating the state of the tool element to be in a non-triggerable state, and the tool element in the non-triggerable state does not respond to the trigger operation.
In a possible implementation manner, the update module is further configured to update the state of the tool element to a triggerable state after the video playing is completed, and the tool element in the triggerable state responds to a triggering operation.
In a possible implementation manner, the display module is configured to play the video in the target interface if a video playing condition is met;
the content item presentation condition includes at least one of:
the content item has been acquired;
the target interface is a top-level view, and the top-level view is a view presented to a user by the device;
tool elements in the target interface are in a non-triggerable state;
the target interface is inhibited from responding to a target gesture, the target gesture being a gesture to exit the target interface.
In one possible implementation, the target interface is an audio playing interface of any song; the display module is further configured to perform any one of the following steps:
pausing the playing of the song and playing the video in the target interface;
and playing the video in the target interface while playing the song.
In one aspect, a terminal is provided that includes one or more processors and one or more memories having stored therein at least one program code that is loaded by the one or more processors and executed to implement the operations performed by the content item presentation method according to any one of the possible implementations described above.
In one aspect, a computer-readable storage medium is provided, in which at least one program code is stored, which is loaded and executed by a processor to perform operations performed to implement the content item presentation method according to any one of the above possible implementations.
In one aspect, there is provided a computer program or computer program product comprising: computer program code which, when executed by a terminal, causes the terminal to carry out operations performed by a content item presentation method as in any one of the possible implementations described above.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
according to the content item display method, the content item display device, the terminal and the storage medium, when the content item is displayed in the target interface, the interface element in the target interface is controlled to disappear, and then the video in the content item is played, so that the attention of a user to the content item is greatly improved, and the display effect of the content item is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that other drawings can be derived from these drawings by those skilled in the art without creative effort.
FIG. 1 is a schematic illustration of an implementation environment provided by an embodiment of the present application;
FIG. 2 is a flow chart of a method for presenting content items provided by an embodiment of the present application;
FIG. 3 is a flow chart of a method for presenting content items provided by an embodiment of the present application;
FIG. 4 is a schematic diagram of a target interface provided by an embodiment of the present application;
FIG. 5 is a schematic diagram illustrating the disappearance of an interface element provided by an embodiment of the present application;
FIG. 6 is a schematic diagram illustrating a video for playing an object to be recommended in a target interface according to an embodiment of the present application;
FIG. 7 is a schematic diagram illustrating a video for playing an object to be recommended in a target interface according to an embodiment of the present application;
FIG. 8 is a schematic diagram illustrating moving a video frame to a target position according to an embodiment of the present disclosure;
FIG. 9 is a diagram illustrating description information provided by an embodiment of the present application;
FIG. 10 is a schematic diagram of a content item presentation apparatus according to an embodiment of the present application;
FIG. 11 is a schematic diagram of another content item presentation apparatus according to an embodiment of the present application;
fig. 12 is a schematic structural diagram of a terminal provided in an embodiment of the present application;
fig. 13 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
It will be understood that the terms "first," "second," and the like as used herein may describe various concepts, which are not limited by these terms unless otherwise specified. These terms are only used to distinguish one concept from another. For example, the first display mode may be referred to as a second display mode, and similarly, the second display mode may be referred to as the first display mode without departing from the scope of the present application.
As used herein, "at least one" includes one, two, or more; "a plurality" includes two or more; "each" refers to each of the corresponding plurality; and "any" refers to any one of the plurality. For example, if the plurality of interface elements includes 3 interface elements, "each" refers to each of the 3 interface elements, and "any" refers to any one of the 3 interface elements, which may be the first, the second, or the third.
The content item display method provided by the embodiment of the application is executed by the terminal. Alternatively, the terminal may be a computer, a mobile phone, a tablet computer or other types of terminals.
Fig. 1 is a schematic diagram of an implementation environment provided by an embodiment of the present application, and as shown in fig. 1, the implementation environment includes a terminal 101 and a server 102. The terminal 101 and the server 102 are connected by a wireless or wired network.
The terminal 101 has installed thereon a target application served by the server 102, through which the terminal 101 can implement functions such as data transmission, message interaction, and the like. Optionally, the terminal 101 is a computer, a mobile phone, a tablet computer, or other terminal. Optionally, the target application is a target application in an operating system of the terminal 101, or a target application provided by a third party. For example, the target application is a content sharing application, and the content sharing application has a content sharing function, but the content sharing application can also have other functions, such as a comment function, a shopping function, a navigation function, a game function, and the like. Optionally, the server 102 is a background server of the target application or a cloud server providing services such as cloud computing and cloud storage.
Optionally, the terminal 101 downloads a content item to be recommended from the server 102 to the local, acquires the content item from the local if a content item display condition is met, controls an interface element in the current interface to disappear, and plays a video in the content item in the current interface after the interface element disappears.
The content item presentation method provided by the embodiment of the application can be applied to any scene recommended by the content item.
For example, when a user opens an audio playing interface of a song, if the content item display method provided by the embodiment of the present application is adopted, the content related to the song in the audio playing interface disappears, and at this time, a video of an object to be recommended in a content item is played, so that the user pays attention to the video, and thus, the display effect of the content item is improved.
Fig. 2 is a flowchart of a content item presentation method according to an embodiment of the present application. The embodiment of the present application is exemplarily described by taking the terminal as the execution subject, and the embodiment includes:
201. the terminal displays a target interface, which includes interface elements.
Wherein the target interface may be any interface. Optionally, the terminal is installed with a target application, and the target interface is any interface provided by the target application. For example, the target interface is a home page of the target application; as another example, the target interface provides a function page for the target application, and so on. Optionally, the target interface is an interface provided by a website browsed by the user. Optionally, the target interface is a designated interface. For example, an audio playing application is installed on the terminal, and the target interface is an audio playing interface provided by the audio playing application. The target interface is not limited in the embodiments of the present application.
Optionally, the terminal displays the target interface, including: and the terminal responds to the triggering operation of the interface entrance of the target interface and displays the target interface.
An interface element is any element displayed in the target interface. For example, the interface element is a view element in the target interface; as another example, the interface element is a tool element in the target interface. Wherein, the view element is an element for showing the view content. Optionally, the view element is a visual view control, and the visual view control displays the related view therein. Taking a song playing interface with a target interface as a song as an example, a cover of the song is displayed in the visual control; alternatively, lyrics of a song or the like are displayed in the visualization control.
202. And the terminal acquires the content item to be recommended under the condition that the content item display condition is met, wherein the content item at least comprises the video of the object to be recommended.
In the embodiment of the application, in the process of browsing the target interface by the user, the terminal plays the video of the object to be recommended in the content item in the target interface. In order to clarify the presentation timing of the content item, a content item presentation condition may be set, and in the case where the content item presentation condition is satisfied, the content item to be recommended is acquired and presented. The content item may be data such as an advertisement, and the content item is not limited in this embodiment of the application.
203. And the terminal controls the interface elements in the target interface to disappear.
In order to enable a user to pay attention to a content item when the terminal displays the content item, the embodiment of the application controls interface elements in a target interface to disappear.
204. After the interface element disappears, the video in the content item is played in the target interface.
Because the interface elements in the target interface are disappeared, when the video of the object to be recommended is played in the target interface, the user can pay attention to the video of the object to be recommended.
According to the content item display method provided by the embodiment of the application, when the content item is displayed in the target interface, the interface element in the target interface is controlled to disappear, and then the video in the content item is played, so that the attention of a user to the content item is greatly improved, and the display effect of the content item is improved.
Fig. 3 is a flowchart of a content item presentation method according to an embodiment of the present application. The embodiment of the present application is exemplarily described by taking the terminal as the execution subject, and the embodiment includes:
301. the terminal displays a target interface, which includes interface elements.
The step 301 is similar to the step 201, and is not described in detail herein.
302. And the terminal acquires the content item to be recommended under the condition of meeting the content item display condition, wherein the content item comprises the video and the description information of the object to be recommended.
The content item presentation condition is a condition that is satisfied by presenting a content item to be recommended, and it can be considered that the terminal acquires the content item to be recommended only when the content item presentation condition is satisfied, and then presents the content item.
In one possible implementation, the terminal presents the content item according to a presentation style of the content item when presenting the content item. For example, the presentation style of the content item is static presentation in the lower right corner, and when the terminal presents the content item, the terminal statically displays an image of an object to be recommended in the content item in the lower right corner. If the presentation style of the content item is full-screen playing, the terminal plays the video of the object to be recommended in the content item in full-screen when the content item is presented.
In this embodiment of the application, the data type of the content item is used to indicate the presentation style of the content item, before presenting the content item, the terminal may determine whether the presentation style of the content item is the target presentation style according to the data type of the content item, and acquire the content item when the data type of the content item indicates that the presentation style of the content item is the target presentation style. That is, the content item presentation condition includes that the data type of the content item indicates that the presentation style of the content item is the target presentation style. The target presentation style is a presentation style for presenting the content item to be recommended in the embodiment of the application. For example, the target presentation style is a video playing style, a linkage presentation style, and the like.
In another possible implementation, a presentation time is defined for the content item, and the content item to be recommended is obtained and presented when the presentation time is reached. Optionally, the content item is configured with time configuration information, and the time configuration information includes the length of time the content item waits before being presented. When the current moment is the display moment indicated by the time configuration information of the content item, the content item to be recommended is acquired and displayed. That is, the content item presentation condition includes that the current time is the presentation time indicated by the time configuration information of the content item.
It should be noted that, in the embodiment of the present application, the content item presentation condition is exemplarily described only by taking the data type or the time configuration information as an example, and the content item presentation condition is not limited. In one possible implementation, the content item presentation condition includes: the data type of the content item indicates that the presentation style of the content item is the target presentation style and that the current time is the presentation time indicated by the temporal configuration information of the content item.
In another possible implementation manner, the terminal plays a video of the object to be recommended in the target interface once every first duration. Thus, the content item presentation condition includes that the duration since entering the target interface, or the duration since the video in the content item was last played in the target interface, reaches the first duration.
The data format of the content item can be various, and the presentation style of the content item in different data formats is different. For example, the data form of the content item may be a picture form or a video form. Wherein the presentation style of the content item in the form of a picture and the content item in the form of a video may be different. In the embodiment of the present application, a video of an object to be recommended may be played in a target interface, and if a content item includes only an image, the playing of the video cannot be realized, so that in the embodiment of the present application, whether to acquire the content item to be recommended may be determined according to a data format of the content item, and the content item is displayed according to the display method shown in the embodiment of the present application. Optionally, the content item presentation condition includes that the data form of the content item is a target data form.
The content item in step 302 may be obtained by the terminal from the server, or may be obtained by the terminal. The description will be given taking as an example that the content item is acquired by the terminal from the server. In step 302, the obtaining of the content item to be recommended may be triggering the terminal to load the content item from the server when the content item display condition is satisfied; or when the user enters the target interface, the terminal is triggered to load the content item from the server, the loaded content item is stored locally, and the content item is acquired locally under the condition that the content item display condition is met. For example, the method further comprises: and responding to the triggering operation of the target interface, asynchronously requesting to acquire the content item, and loading the content item to the local. The obtaining of the content item to be recommended in step 302 includes: the content item is retrieved from the local.
For example, after a user enters an audio playing interface of a certain song, the terminal requests a content item from the server and receives the download link, data type, and time configuration information of the content item sent by the server; the terminal performs asynchronous loading according to the download link issued by the server and stores the loaded content item on the user's mobile phone storage in the form of a file; if the data type sent by the server indicates that the display style of the content item is the target display style, a delayed response event is set according to the time configuration information sent by the server, and the content item is displayed after the waiting duration is reached.
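By way of illustration only, the following Kotlin sketch outlines such a flow on an Android client; ContentItemMeta, its fields, and TARGET_STYLE are hypothetical names standing in for the download link, data type, and time configuration information described above, not APIs defined by this application.

```kotlin
import android.os.Handler
import android.os.Looper
import java.io.File
import java.net.URL
import kotlinx.coroutines.CoroutineScope
import kotlinx.coroutines.Dispatchers
import kotlinx.coroutines.launch

// Hypothetical metadata returned by the server for a content item.
data class ContentItemMeta(
    val downloadUrl: String,   // download link issued by the server
    val dataType: String,      // indicates the presentation style of the content item
    val waitMillis: Long       // time configuration: how long to wait before presenting
)

const val TARGET_STYLE = "video_linkage"   // assumed identifier of the target presentation style

class ContentItemLoader(private val cacheDir: File) {

    private val mainHandler = Handler(Looper.getMainLooper())

    /** Asynchronously download the content item to local storage, then schedule its presentation. */
    fun loadAndSchedule(meta: ContentItemMeta, present: (File) -> Unit) {
        if (meta.dataType != TARGET_STYLE) return           // presentation style is not the target style
        val localFile = File(cacheDir, "content_item.mp4")  // stored on the user's phone as a file
        CoroutineScope(Dispatchers.IO).launch {
            URL(meta.downloadUrl).openStream().use { input ->
                localFile.outputStream().use { output -> input.copyTo(output) }
            }
            // Delayed response event: present the locally cached item after the configured wait.
            mainHandler.postDelayed({ present(localFile) }, meta.waitMillis)
        }
    }
}
```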
The time configuration information can be issued by the server when the terminal requests the content item from the server; the time configuration information may also be configuration information sent by the server when the target application on the terminal is started. The embodiment of the present application does not limit the manner of acquiring the time configuration information.
303. And the terminal controls the interface elements in the target interface to disappear.
In the embodiment of the application, in order that a user can better pay attention to a content item when the content item is displayed and the display effect of the content item is improved, interface elements in a target interface can be controlled to disappear first, and then the content item is displayed.
When the interface elements in the control target interface disappear, the terminal can control the interface elements in the control target interface to disappear directly or gradually by adopting any special effect control interface element. The embodiment of the application exemplifies the process of interface element disappearance in the terminal control target interface in the following three disappearance manners.
In one possible implementation, controlling the interface element in the target interface to disappear comprises at least one of:
(1) and controlling the interface element to move out of the target interface.
When the interface element is controlled to move out of the target interface, the interface element can be controlled to move in a certain direction until it moves out of the target interface. The direction may be any direction, for example, directly downward, leftward, rightward, and the like.
In one possible implementation, the terminal needs to control the movement of multiple interface elements out of the target interface. The terminal can control the plurality of interface elements to move in the same direction, and can also control the plurality of interface elements to move in different directions. For example, the terminal controls the plurality of interface elements to move towards the right lower side until the plurality of interface elements all move out of the target interface. For another example, the terminal controls an interface element close to the lower side of the interface to move downwards, controls an interface element close to the upper side of the interface to move upwards, controls an interface element close to the left side of the interface to move leftwards, controls an interface element close to the right side of the interface to move rightwards, and the like.
In one possible implementation manner, the terminal acquires the special effect parameters and controls the interface elements according to the acquired special effect parameters. The terminal controlling the interface element to move out of the target interface includes: obtaining special effect parameters of the interface element, wherein the special effect parameters include a moving track, the moving track being a track from the current position of the interface element out of the target interface, and controlling the interface element to move according to the moving track; or obtaining special effect parameters of the interface element, wherein the special effect parameters include a moving direction, and controlling the interface element to move along the moving direction until the interface element moves out of the target interface.
Optionally, the special effect parameter further includes a moving speed, and the terminal controls the interface element to move out of the target interface according to the moving speed. The embodiment of the present application does not limit the special effect parameters.
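As a rough illustration of how such special effect parameters might be represented and applied on an Android client, the following Kotlin sketch uses a hypothetical MoveOutEffectParams structure; the field names and the use of ViewPropertyAnimator are assumptions, not part of this application.

```kotlin
import android.view.View

// Hypothetical special effect parameters for moving an interface element out of the interface.
data class MoveOutEffectParams(
    val directionXSign: Int,   // -1 = leftward, +1 = rightward, 0 = no horizontal movement
    val directionYSign: Int,   // -1 = upward, +1 = downward, 0 = no vertical movement
    val durationMillis: Long   // controls the moving speed
)

/** Move the interface element along the configured direction until it leaves the target interface. */
fun moveOut(element: View, params: MoveOutEffectParams, interfaceWidth: Int, interfaceHeight: Int) {
    element.animate()
        .translationXBy(params.directionXSign * interfaceWidth.toFloat())
        .translationYBy(params.directionYSign * interfaceHeight.toFloat())
        .setDuration(params.durationMillis)
        .start()
}
```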
It should be noted that, in the embodiment of the present application, when the control interface element moves out of the target interface, other special effects may also be superimposed, for example, the terminal control interface element moves out of the target interface, and in the process of moving out of the target interface, the interface element is controlled to rotate. The embodiment of the present application only takes a rotation effect as an example, and exemplifies the superimposed effect, and does not limit the superimposed effect.
(2) The transparency of the interface element is controlled to gradually decrease until the transparency is 0.
Wherein the transparency of the interface element is 0, which means that the interface element is completely transparent and invisible; a transparency of the interface element of 1 indicates that the interface element is completely opaque and visible. And controlling the transparency of the interface element to gradually decrease until the transparency is 0, namely controlling the interface element to gradually be transparent until the interface element is completely transparent.
In a possible implementation manner, the terminal may obtain the special effect parameters, and control the interface elements according to the obtained special effect parameters. Wherein, controlling the transparency of the interface element to gradually decrease until the transparency is 0 comprises: the method comprises the steps of obtaining special effect parameters of the interface elements, wherein the special effect parameters comprise transparencies corresponding to a plurality of moments, the transparency corresponding to the last moment is 0, and controlling the transparencies of the interface elements to change according to the transparencies corresponding to the moments.
(3) And controlling the interface element to gradually shrink until the interface element disappears.
In one possible implementation manner, the terminal acquires the special effect parameters and controls the interface elements according to the acquired special effect parameters. Wherein, controlling the interface element to gradually shrink until disappearing comprises: the method comprises the steps of obtaining special effect parameters of the interface element, wherein the special effect parameters comprise sizes corresponding to a plurality of moments, the size corresponding to the last moment is 0, and controlling the size of the interface element to change according to the sizes corresponding to the moments.
It should be noted that, in the embodiment of the present application, the method for controlling the interface elements to gradually disappear is only exemplified, and the manner of disappearing the interface elements is not limited.
In one possible implementation, the interface element has multiple display modes, the display modes of the interface element are different, and the way in which the interface element disappears is also different. Wherein, controlling the interface elements in the target interface to disappear comprises: acquiring a disappearing special effect parameter corresponding to the display mode based on the display mode of the interface element; and controlling the interface element to disappear based on the acquired disappearance special effect parameter. The disappearing special effect parameter is a special effect parameter used for controlling the disappearance of the interface element. Different disappearing special effect parameters can be randomly combined and paired with the display mode, and the embodiment of the application does not limit the method.
In the embodiment of the present application, a manner of controlling the interface element to disappear according to the display mode is exemplarily given. Optionally, controlling the interface element to disappear based on the acquired disappearing special effect parameter, including any one of the following steps: the display mode of the interface element is a first display mode, the disappearing special effect parameter corresponding to the first display mode is a first disappearing special effect parameter, the interface element is controlled to move out of the target interface based on the first disappearing special effect parameter, and the transparency of the interface element is controlled to be reduced to 0 in the process of moving out of the target interface; the display mode of the interface element is a second display mode, the disappearing special effect parameter corresponding to the second display mode is a second disappearing special effect parameter, the interface element is controlled to be gradually reduced until disappearing based on the second disappearing special effect parameter, and the transparency of the interface element is controlled to be reduced to 0 in the reducing process.
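The dispatch between display modes could, for example, be sketched as follows on an Android client (Kotlin); the mode names, durations, and animation calls are illustrative assumptions only.

```kotlin
import android.view.View

// Hypothetical display modes of the interface element.
enum class DisplayMode { FIRST, SECOND }

/** Control the interface element to disappear according to its display mode. */
fun vanish(element: View, mode: DisplayMode, interfaceHeight: Int) {
    when (mode) {
        // First mode: move the element out of the target interface while its transparency drops to 0.
        DisplayMode.FIRST -> element.animate()
            .translationY(interfaceHeight.toFloat())  // move downward until it leaves the interface
            .alpha(0f)                                // transparency gradually decreases to 0
            .setDuration(400L)                        // illustrative disappearing special effect parameter
            .withEndAction { element.visibility = View.INVISIBLE }
            .start()
        // Second mode: gradually shrink the element until it disappears while its transparency drops to 0.
        DisplayMode.SECOND -> element.animate()
            .scaleX(0f).scaleY(0f)
            .alpha(0f)
            .setDuration(400L)
            .withEndAction { element.visibility = View.INVISIBLE }
            .start()
    }
}
```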
For example, taking the target interface as an audio playing interface as an example, the display modes of the interface elements include a cover mode, a square cover mode, a circular rotation mode and a singer portrait mode. In the cover mode, the target interface comprises a full-screen song cover view, the terminal can control the song cover view to linearly move downwards, and the transparency of the cover view is reduced to 0 in the moving downwards process; in the square cover mode, the target interface comprises a square cover view displayed on the upper half part of the target interface, and the terminal can control the square cover view to be gradually reduced and control the transparency of the square cover view to be reduced to 0 in the reducing process; in the circular rotation mode, the target interface comprises a circular cover view displayed on the upper half part of the target interface, the circular cover view continuously rotates, the terminal can control the circular cover view to gradually shrink, and control the transparency of the circular cover view to be reduced to 0 in the shrinking process, and the circular cover view keeps rotating in the process; in the singer portrait mode, the target interface comprises a singer portrait view displayed in a full screen mode, the terminal can control the singer portrait view to move linearly downwards, and in the moving downwards process, the transparency of the singer portrait view is controlled to be reduced to 0.
It should be noted that in the embodiment of the present application, the control interface element disappears in order to divert the attention of the user to the content item to be presented, and therefore, some interface elements that do not divert the attention of the user may be retained in the target interface, and only the interface elements that can attract the attention of the user disappear. In one possible implementation, the interface elements include view elements and tool elements; controlling the interface elements in the target interface to disappear, comprising: and controlling the view elements in the target interface to disappear, and keeping the tool elements in the target interface unchanged.
Wherein the view element is an element for presenting view contents, and the tool element is an element for providing a modification function. Taking the target interface as an audio playing interface as an example, the view elements may include lyric elements, album cover elements, and the like; and the tool elements may include a "play/pause" option, a "next" option, a "previous" option, etc.
For example, as shown in FIG. 4, the target interface includes various view elements such as "song title", "lyrics", "song cover", etc., and also includes various tool elements such as the "play/pause" option, the "next" option, the "previous" option, etc. The terminal may control the view elements such as "song title", "lyrics", and "song cover" to move down, and the transparency of the view elements gradually decreases during the downward movement, as shown in FIG. 5. The terminal only controls the view elements to move downwards, and keeps the tool elements such as the "play/pause" option, the "next" option, and the "previous" option unchanged.
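A minimal Kotlin sketch of this selective disappearance, assuming the view elements and tool elements are ordinary Android Views, might look as follows; the function name, slide distance, and duration are assumptions.

```kotlin
import android.view.View

/**
 * Hide only the view elements (e.g. song title, lyrics, song cover) and leave the tool elements
 * ("play/pause", "next", "previous") unchanged in the target interface.
 */
fun hideViewElementsOnly(viewElements: List<View>, slideDistancePx: Float) {
    viewElements.forEach { view ->
        view.animate()
            .translationYBy(slideDistancePx)   // move each view element downwards
            .alpha(0f)                         // its transparency gradually decreases while it moves
            .setDuration(400L)                 // illustrative duration
            .start()
    }
    // The tool elements are simply not touched here, so they remain visible and in place.
}
```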
The tool elements in the target interface can update the content displayed by the view elements in the target interface. Since the view elements have disappeared, the updated view elements cannot be seen even if the tool elements are operated. Thus, interaction with the tool elements may be prohibited. Optionally, the method further comprises: updating the state of the tool elements to a non-triggerable state, wherein a tool element in the non-triggerable state does not respond to a trigger operation.
Optionally, the method further comprises: after the video playing is completed, the state of the tool element is updated to be a triggerable state, and the tool element in the triggerable state responds to a trigger operation. After the video playing is finished, the interface element is displayed again in the target interface, so that the interaction of the tool element can be recovered in order to ensure the normal use of the target interface.
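On an Android client this state switch could be realized with the standard View enabled flag, as in the following sketch; the function names are assumptions.

```kotlin
import android.view.View

/** Put the tool elements into a non-triggerable state so they no longer respond to trigger operations. */
fun disableToolElements(toolElements: List<View>) {
    toolElements.forEach { it.isEnabled = false }
}

/** Restore the tool elements to a triggerable state after the video playing is completed. */
fun enableToolElements(toolElements: List<View>) {
    toolElements.forEach { it.isEnabled = true }
}
```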
304. And after the interface element disappears, the terminal plays the video of the object to be recommended in the content item in the target interface.
When the terminal plays the video of the object to be recommended in the target interface, the terminal can play the video in a full screen mode or in a certain area, and the video is not limited in the embodiment of the application.
In one possible implementation manner, the terminal embeds a video playing control in the target interface, and plays the video through the video playing control. That is to say, after the interface element disappears, the terminal plays the video of the object to be recommended in the target interface, including: and after the interface elements disappear, the terminal inserts a video playing control into the target interface, and plays the video of the object to be recommended in the content item based on the video playing control.
It should be noted that the terminal may initialize the video player and the video playing control in advance, for example, after the content item is loaded locally, the video player and the video playing control may be initialized.
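A simplified Kotlin sketch of embedding a video playing control and playing the locally loaded video is given below, assuming an Android VideoView as the playing control; the container, path, and callback names are assumptions, and the playing conditions discussed next are omitted here.

```kotlin
import android.content.Context
import android.view.ViewGroup
import android.widget.FrameLayout
import android.widget.VideoView

/** Insert a video playing control into the target interface and play the content item's video. */
fun playContentItemVideo(
    context: Context,
    targetInterface: FrameLayout,   // root container of the target interface (assumption)
    localVideoPath: String,         // path of the locally loaded content item
    onCompleted: () -> Unit         // e.g. restore interface elements, re-enable tool elements
) {
    val videoView = VideoView(context)
    targetInterface.addView(
        videoView,
        ViewGroup.LayoutParams(
            ViewGroup.LayoutParams.MATCH_PARENT,
            ViewGroup.LayoutParams.MATCH_PARENT
        )
    )
    videoView.setVideoPath(localVideoPath)
    videoView.setOnCompletionListener { onCompleted() }
    videoView.start()
}
```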
In one possible implementation, some preparation work is required before playing the video, and the video playing can be performed after the preparation work is completed. Optionally, playing the video of the object to be recommended in the target interface includes: under the condition that a video playing condition is met, playing the video in the target interface; the video playing condition includes at least one of: (1) the content item has been acquired; (2) the target interface is a top-level view, which is a view presented to the user by the device; (3) the tool element in the target interface is in a non-triggerable state; (4) the target interface is inhibited from responding to a target gesture, the target gesture being a gesture to exit the target interface.
For example, when the current time is the display time indicated by the time configuration information of the content item, the content item is read locally from the terminal, and whether the target interface is the top-level view is determined; if the target interface is the top-level view, the state of the tool elements is updated to the non-triggerable state, and the response to the target gesture is prohibited; then, the interface elements in the target interface are controlled to disappear, and the video is played in the target interface.
For example, the target interface is an audio playing interface, and the audio playing interface being the top-level view indicates that the user is watching the audio playing interface and that the client to which the audio playing interface belongs runs in the foreground rather than the background. However, even if the client runs in the foreground, if the top-level view is not the audio playing interface, the terminal will not play the video. For example, if the user slides from the audio playing interface to the lyric display interface, the top-level view is the lyric display interface. Since the user usually slides to the lyric display interface to watch the lyrics, the user may object to the content item being displayed if the lyrics are controlled to disappear; therefore, when the top-level view is the lyric display interface, the content item is not displayed.
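As an illustration, the video playing conditions could be checked with a simple guard such as the following Kotlin sketch; all of the flags are hypothetical state maintained by the client, since the application does not prescribe how they are tracked.

```kotlin
// Hypothetical flags maintained by the client.
data class PlayState(
    val contentItemLoaded: Boolean,     // the content item has been acquired locally
    val targetInterfaceOnTop: Boolean,  // the target interface is the top-level view shown to the user
    val toolElementsDisabled: Boolean,  // tool elements are in the non-triggerable state
    val exitGestureBlocked: Boolean     // the target gesture (exiting the interface) is prohibited
)

/** The video is played only when all video playing conditions are met. */
fun canPlayVideo(state: PlayState): Boolean =
    state.contentItemLoaded &&
    state.targetInterfaceOnTop &&
    state.toolElementsDisabled &&
    state.exitGestureBlocked
```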
In a possible implementation manner, the target interface is an audio playing interface of any song, and when the terminal displays the audio playing interface, the terminal may play the audio of the song at the same time. In this application embodiment, when playing a video, the playing of the song may be paused, and the playing of the song may also be continued, which is not limited in this application embodiment. Optionally, playing the video of the object to be recommended in the target interface includes any one of the following steps: pausing the playing of the song and playing the video in the target interface; the video is played in the target interface while the song is played.
It should be noted that if the playing of the song is paused while the video is played, the playing of the song also needs to be resumed after the video playing is completed.
It should be noted that the video may or may not include audio data. Optionally, the video includes audio data, and the playing of the song may be paused while the video is played; optionally, the video does not include audio data, and the playing of the song can be continuously maintained while the video is played, because the video has no sound, and therefore the playing of the video does not affect the playing of the song.
As shown in fig. 6 and 7, the object to be recommended is an earphone, and a video of the earphone is played in the target interface.
305. And after the terminal plays the video, displaying the last video frame of the video.
In the embodiment of the application, after the terminal plays the video, the terminal stays at the last video frame of the video.
306. The terminal controls the video frame to move from the current position to the target position.
The target location may be any location in the target interface, and the location may be a default location of the system, a location set by a user, a location set by a technician, or the like, which is not limited in this embodiment of the application.
In one possible implementation manner, the terminal controls the video frame to move from the current position to the target position through the special effect parameter. Optionally, the terminal controls the video frame to move from the current position to the target position, including: and acquiring special effect parameters including a target position, and controlling the video frame to move from the current position to the target position according to the special effect parameters.
It should be noted that, in the embodiment of the present application, a manner in which the terminal controls the video frame to move from the current position to the target position is not limited.
In one possible implementation manner, the terminal plays the video by embedding a video playing control in the target interface, and therefore, the terminal moves the video playing control, so that the content displayed by the video playing control moves in the target interface. Wherein, the terminal controls the video frame to move from the current position to the target position, including: and the terminal controls the video playing control to move from the current position to the target position.
307. The terminal controls the video frame to disappear, and displays the description information in the content item in the area to which the target position belongs.
In the embodiment of the present application, a display manner of a content item is exemplarily illustrated only by taking an example that a video frame disappears and description information of an object is displayed in an area to which a target position belongs. In another embodiment, the terminal may also control the video frame to move from the current position to a target position, where the video frame is displayed. In one possible implementation, the video frame is displayed full screen, and may be reduced when moved to the target location. Wherein controlling the video frame to move from the current position to the target position comprises: and controlling the video frame to move from the current position to the target position, and controlling the video frame to gradually shrink in the moving process. As shown in fig. 8, the video frame is moved to above the target interface and is zoomed out.
In the embodiment of the present application, the content item further includes description information of the object, and the target position may be any position in a display area of the description information of the object. Alternatively, the target position may be a central position describing a displayed area of the object.
It should be noted that, in the embodiment of the present application, the terminal may control the disappearance of the video frame in a process of moving the video frame from the current position to the target position, or after moving the video frame to the target position, which is not limited in the embodiment of the present application.
In one possible implementation manner, the terminal controls the video frame to gradually disappear in the process of moving the video frame from the current position to the target position. Wherein controlling the video frame to move from the current position to the target position comprises: and controlling the video frame to move from the current position to the target position, and controlling the video frame to gradually shrink until the video frame disappears in the moving process.
When the video frame is controlled to gradually shrink, the video frame can be controlled to shrink around its center; the shrinking manner is not limited in the embodiment of the present application.
For example, the terminal controls the video frame to move from the current position to the central position of the display area of the description information, controls the video frame to gradually shrink in the moving process, just shrinks to disappear when the video frame moves to the central position, and displays the description information of the object to be recommended in the display area at the moment so that the user can notice the description information.
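A Kotlin sketch of this move-and-shrink linkage on an Android client might look as follows, assuming the last video frame is shown in a View (for example the video playing control) and that descriptionView is the area holding the description information; the names and duration are assumptions.

```kotlin
import android.view.View

/**
 * Move the view holding the last video frame from its current position to the target position,
 * shrink it around its center during the move until it disappears, then reveal the description info.
 */
fun moveFrameToTarget(frameView: View, targetX: Float, targetY: Float, descriptionView: View) {
    frameView.animate()
        .x(targetX)               // move towards the target position
        .y(targetY)
        .scaleX(0f)               // gradually shrink during the move...
        .scaleY(0f)
        .setDuration(500L)        // illustrative duration
        .withEndAction {
            frameView.visibility = View.GONE           // ...until the frame disappears
            descriptionView.visibility = View.VISIBLE  // show the description information
        }
        .start()
}
```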
Optionally, the description information of the object to be recommended may be at least one of an object name, a price, a picture, a video, and the like of the object to be recommended, and the description information of the object to be recommended is not limited in this embodiment of the application. Optionally, after the user watches the video of the object to be recommended, a desire to purchase may be generated, so the description information of the object to be recommended may also be a shopping link of the object to be recommended, and by performing a trigger operation on the description information, the terminal may be enabled to display a purchase interface of the object to be recommended.
It should be noted that, in the embodiment of the present application, the presentation method of the content item is exemplarily described by taking the content item including a video of the object to be recommended and description information as an example. In yet another embodiment, the content item includes a video of an object and a logo image, and after controlling the video frame to move from the current position to the target position, the method further includes: controlling the video frame to disappear; and displaying the logo image at the target position. The logo image is an image for showing the outer shape of the object. For example, the object is a headphone, and the logo image is a photograph of the headphone.
In one possible implementation, displaying the logo image at the target position includes any one of the following steps: controlling the logo image to disappear after popping up at the target position; and controlling the logo image to be restored after popping up at the target position.
The terminal can give the logo image a pop-up effect by enlarging and then reducing it. For example, the terminal may display the logo image at its initial size, switch to displaying it at 1.15 times the initial size, and then gradually reduce it from 1.15 times back to the initial size; alternatively, the terminal directly displays the logo image at 1.15 times the initial size and then gradually reduces it to the initial size.
It should be noted that, after the terminal restores the logo image to the initial size, if the logo image continues to be displayed, the effect of restoring the logo image after popping up is achieved; if the logo image disappears after being restored to the initial size, the effect of the logo image disappearing after popping up at the target position is achieved.
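For illustration, one possible way to realize this pop-up effect with the 1.15-times factor mentioned above is sketched below; the animator structure, durations, and view reference are assumptions of the sketch:

    import android.animation.Animator
    import android.animation.AnimatorListenerAdapter
    import android.animation.AnimatorSet
    import android.animation.ObjectAnimator
    import android.view.View

    // Pop-up effect for the logo image: enlarge to 1.15 times the initial size,
    // then restore to the initial size; optionally hide the image afterwards.
    fun popUpLogo(logoView: View, disappearAfterwards: Boolean) {
        val grow = AnimatorSet().apply {
            playTogether(
                ObjectAnimator.ofFloat(logoView, View.SCALE_X, 1.0f, 1.15f),
                ObjectAnimator.ofFloat(logoView, View.SCALE_Y, 1.0f, 1.15f)
            )
            duration = 150L
        }
        val restore = AnimatorSet().apply {
            playTogether(
                ObjectAnimator.ofFloat(logoView, View.SCALE_X, 1.15f, 1.0f),
                ObjectAnimator.ofFloat(logoView, View.SCALE_Y, 1.15f, 1.0f)
            )
            duration = 150L
        }
        AnimatorSet().apply {
            playSequentially(grow, restore)
            if (disappearAfterwards) {
                addListener(object : AnimatorListenerAdapter() {
                    override fun onAnimationEnd(animation: Animator) {
                        logoView.visibility = View.INVISIBLE // disappear after popping up
                    }
                })
            }
            start()
        }
    }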
In another embodiment, the content item includes the video of the object to be recommended, the description information, and the logo image. Since the description information and the logo image are both displayed at the target position, the logo image may disappear after popping up at the target position. In a possible implementation manner, the terminal controls the video frame to disappear at the target position, controls the logo image to pop up and then disappear at the target position on a first layer, and displays the description information in the display area on a second layer, where the first layer is located above the second layer. Therefore, the logo image may cover the description information, and the description information is revealed after the logo image disappears.
The description information can be regarded as an advertisement banner of the object to be recommended, and the logo image can be regarded as an intermediate image of the object to be recommended; the intermediate image is used to visually link the video and the advertisement banner.
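A possible arrangement of the two layers is sketched below, assuming both views are placed in a common FrameLayout; the container and view names are hypothetical:

    import android.view.View
    import android.widget.FrameLayout

    // The description view sits on the lower (second) layer and the logo image on
    // the upper (first) layer, so the logo covers the description until it disappears.
    fun arrangeLayers(container: FrameLayout, descriptionView: View, logoView: View) {
        container.addView(descriptionView) // second layer, drawn first (below)
        container.addView(logoView)        // first layer, drawn last (on top)
    }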
308. The terminal controls the interface element to be restored into the target interface.
After the content item is presented, the interface element needs to be restored into the target interface to ensure normal use of the target interface.
It should be noted that, in a possible implementation manner, the process in which the terminal controls the interface element to disappear and the process of restoring the interface element into the target interface are reverse processes. For example, the terminal may cause the interface element to disappear by reducing its transparency from 1 to 0; when the interface element is restored into the target interface, the terminal controls its transparency to increase from 0 to 1.
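As a sketch of this reverse, fade-back-in restoration (assuming the interface element is an Android View whose transparency was previously animated down to 0; the duration is an assumption):

    import android.view.View

    // Restore an interface element by reversing the disappearance effect:
    // the alpha that was animated from 1 to 0 is animated back from 0 to 1.
    fun fadeElementBackIn(elementView: View) {
        elementView.alpha = 0f
        elementView.visibility = View.VISIBLE
        elementView.animate()
            .alpha(1f)         // transparency increases from 0 back to 1
            .setDuration(300L) // the duration is an assumption
            .start()
    }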
In one possible implementation, when the interface element is restored into the target interface, the display mode of the interface element is considered, and different display modes correspond to different restoration manners. Controlling the interface element to be restored into the target interface includes: acquiring, based on the display mode of the interface element, a recovery special effect parameter corresponding to the display mode; and controlling the interface element to be restored into the target interface based on the acquired recovery special effect parameter.
Optionally, controlling the interface element to be restored into the target interface based on the acquired recovery special effect parameter includes any one of the following steps: in a case that the display mode of the interface element is the first display mode, the recovery special effect parameter corresponding to the first display mode is a first recovery special effect parameter; based on the first recovery special effect parameter, the interface element is controlled to move back into the target interface, and in the process of moving back into the target interface, the transparency of the interface element is controlled to increase to 1; in a case that the display mode of the interface element is the second display mode, the recovery special effect parameter corresponding to the second display mode is a second recovery special effect parameter; based on the second recovery special effect parameter, the interface element is controlled to gradually enlarge to a target size, and in the enlargement process, the transparency of the interface element is controlled to increase to 1, where the target size is the initial size of the interface element in the target interface.
The manner of restoring the interface element based on its display mode mirrors the manner of controlling the interface element to disappear based on its display mode, and is not described in detail here.
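The mode-dependent restoration could, for example, be dispatched as sketched below; the mode names, initial position, and duration are assumptions of this illustration:

    import android.view.View

    // Illustrative display modes (the names are assumptions, not from the embodiment).
    enum class DisplayMode { MOVE_IN, SCALE_UP }

    // First mode: move the element back to its initial position while its alpha
    // rises to 1. Second mode: enlarge it back to its initial size while its alpha
    // rises to 1.
    fun restoreElement(element: View, mode: DisplayMode, initialX: Float, initialY: Float) {
        when (mode) {
            DisplayMode.MOVE_IN -> element.animate()
                .x(initialX).y(initialY) // move back into the target interface
                .alpha(1f)               // transparency increases to 1 during the move
                .setDuration(300L)
                .start()
            DisplayMode.SCALE_UP -> element.animate()
                .scaleX(1f).scaleY(1f)   // enlarge back to the initial (target) size
                .alpha(1f)               // transparency increases to 1 while enlarging
                .setDuration(300L)
                .start()
        }
    }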
In one possible implementation, controlling the interface element to be restored into the target interface includes at least one of the following: (1) controlling the interface element to move from the current position to the initial position of the interface element in the target interface; (2) controlling the interface element to be enlarged from the current size to the initial size of the interface element in the target interface; (3) controlling the transparency of the interface element to increase from the current transparency to 1.
The manner of restoring the interface element into the target interface mirrors the manner of controlling the interface element to disappear, and is not described in detail here.
It should be noted that the step 308 and the step 306 may be executed simultaneously, or the step 306 may be executed first and then the step 308 is executed. The embodiment of the present application does not limit this.
For example, as shown in fig. 8 and fig. 9, the interface elements move back into the target interface from below the target interface, and their transparency gradually increases as they enter the target interface.
After the presentation of the content item is completed, the video resource may be released from memory and read from the disk again for the next presentation.
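Purely as an illustration of this release-and-reload behaviour (the cache class and file reference are assumptions, not part of the embodiment):

    import java.io.File

    // Drop the in-memory copy of the video after the presentation and read it from
    // disk again before the next presentation.
    class VideoCache(private val videoFile: File) {
        private var videoBytes: ByteArray? = null

        fun release() {
            videoBytes = null // let the in-memory copy be garbage collected
        }

        fun loadForNextPresentation(): ByteArray {
            val bytes = videoBytes ?: videoFile.readBytes() // re-read from disk if needed
            videoBytes = bytes
            return bytes
        }
    }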
According to the content item display method provided by the embodiment of the application, when a content item is displayed in the target interface, the interface elements in the target interface are controlled to disappear before the video in the content item is played, so that the user's attention to the content item is greatly increased and the display effect of the content item is improved.
In addition, after the video of the object finishes playing, the video can be moved to the target position, and after the video disappears at the target position, the description information of the object is displayed there, so that the video and the description information are linked, the user's attention is drawn to the description information, and the display effect of the content item is further improved.
Furthermore, an animation can be displayed at the position where the video disappears, and the description information of the object is displayed after the animation completes, so that the dynamically displayed information and the statically displayed information are visually linked, the user's perception is further enhanced, and the display effect of the content item is improved.
Fig. 10 is a schematic structural diagram of a content item presentation apparatus provided in an embodiment of the present application, and referring to fig. 10, the apparatus includes:
a display module 1001 configured to display a target interface, where the target interface includes interface elements;
an obtaining module 1002, configured to obtain a content item to be recommended when a content item display condition is met, where the content item includes a video of an object to be recommended;
the display module 1001 is further configured to control interface elements in the target interface to disappear;
the display module 1001 is further configured to play the video of the object to be recommended in the target interface after the interface element disappears.
As shown in fig. 11, in one possible implementation manner, the display module 1001 includes:
an obtaining unit 1011, configured to obtain a disappearing special effect parameter corresponding to a display mode based on the display mode of the interface element;
and the display unit 1021 is used for controlling the interface element to disappear based on the acquired disappearing special effect parameter.
In one possible implementation, the display unit 1021 is configured to perform any one of the following steps:
the display mode of the interface element is a first display mode, the disappearing special effect parameter corresponding to the first display mode is a first disappearing special effect parameter, the interface element is controlled to move out of the target interface based on the first disappearing special effect parameter, and the transparency of the interface element is controlled to be gradually reduced to 0 in the process of moving out of the target interface;
the display mode of the interface element is a second display mode, the disappearing special effect parameter corresponding to the second display mode is a second disappearing special effect parameter, the interface element is controlled to be gradually reduced until disappearing based on the second disappearing special effect parameter, and the transparency of the interface element is controlled to be reduced to 0 in the reducing process.
In one possible implementation, the display module 1001 is configured to perform at least one of the following:
controlling the interface element to move out of the target interface;
controlling the transparency of the interface element to gradually decrease until the transparency is 0;
and controlling the interface element to gradually shrink until the interface element disappears.
In a possible implementation manner, the display module 1001 is further configured to control the interface element to be restored into the target interface.
In one possible implementation, the display module 1001 includes:
an obtaining unit 1011, configured to obtain, based on a display mode of the interface element, a recovery special effect parameter corresponding to the display mode;
and a display unit 1021, configured to control the interface element to be restored into the target interface based on the acquired recovery special effect parameter.
In one possible implementation manner, the display unit 1021 is configured to perform any one of the following steps:
the display mode of the interface element is a first display mode, the recovery special effect parameter corresponding to the first display mode is a first recovery special effect parameter, the interface element is controlled to move back to the target interface based on the first recovery special effect parameter, and the transparency of the interface element is controlled to be increased to 1 in the process of moving back to the target interface;
the display mode of the interface element is a second display mode, the recovery special effect parameter corresponding to the second display mode is a second recovery special effect parameter, the interface element is controlled to be gradually enlarged to a target size based on the second recovery special effect parameter, in the enlargement process, the transparency of the interface element is controlled to be increased to 1, and the target size is the initial size of the interface element in the target interface.
In one possible implementation, the display module 1001 is further configured to perform at least one of the following:
controlling the interface element to move from a current position to an initial position of the interface element in the target interface;
controlling the interface element to be enlarged from the current size to the initial size of the interface element in the target interface;
and controlling the interface element to increase from the current transparency to 1.
In a possible implementation manner, the display module 1001 is further configured to display a last video frame of the video after the video is played; and controlling the video frame to move from the current position to the target position.
In a possible implementation manner, the display module 1001 is configured to control the video frame to move from the current position to the target position, and control the video frame to gradually shrink during the moving process.
In a possible implementation manner, the display module 1001 is configured to control the video frame to move from the current position to the target position, and in the moving process, control the video frame to gradually shrink until disappearing.
In one possible implementation, the content item further includes descriptive information of the object; the display module 1001 is further configured to control the video frame to disappear; and displaying the description information in the content item in the area to which the target position belongs.
In one possible implementation, the content item further includes a logo image of the object; the display module 1001 is further configured to control the video frame to disappear; displaying a logo image in the content item at the target location.
In one possible implementation manner, the display module 1001 is configured to perform any one of the following steps:
controlling the logo image to disappear after popping up at the target position;
and controlling the logo image to be restored after popping up at the target position.
In a possible implementation manner, the obtaining module 1002 is configured to, in response to a trigger operation on the target interface, asynchronously request the content item and load the content item into local storage; and to obtain the content item from local storage.
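A possible sketch of this asynchronous-request-then-read-locally flow is given below; the loader class, the fetch callback, and the cache file are assumptions rather than part of the embodiment:

    import java.io.File
    import kotlin.concurrent.thread

    // When the target interface is triggered, request the content item in the
    // background and write it to a local file; when the presentation condition is
    // met, read it back from that file.
    class ContentItemLoader(
        private val cacheFile: File,
        private val fetchContentItem: () -> ByteArray // e.g. an HTTP call in a real app
    ) {
        fun requestAsync() {
            thread {                          // asynchronous request, off the UI path
                val data = fetchContentItem()
                cacheFile.writeBytes(data)    // load the content item to local storage
            }
        }

        fun obtainLocally(): ByteArray? =
            if (cacheFile.exists()) cacheFile.readBytes() else null
    }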
In one possible implementation, the content item presentation condition includes at least one of:
the data type of the content item indicates that the presentation style of the content item is a target presentation style;
the current time is the presentation time indicated by the time configuration information of the content item, and the time configuration information includes the duration of waiting of the content item before presentation.
In one possible implementation, the interface element includes a view element and a tool element; the display module 1001 is configured to control a view element in the target interface to disappear, and keep a tool element in the target interface unchanged.
In one possible implementation, the apparatus further includes:
an updating module 1003, configured to update the state of the tool element to a non-triggerable state, where the tool element in the non-triggerable state does not respond to a trigger operation.
In a possible implementation manner, the updating module 1003 is further configured to update the state of the tool element to a triggerable state after the video playing is completed, where the tool element in the triggerable state responds to a triggering operation.
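A minimal sketch of toggling the tool elements between the non-triggerable and triggerable states could be the following; the list of views is an assumption used only for illustration:

    import android.view.View

    // Disable the tool elements while the video plays (they remain visible but do
    // not respond to trigger operations) and enable them again after playback.
    fun setToolElementsTriggerable(toolElements: List<View>, triggerable: Boolean) {
        for (tool in toolElements) {
            tool.isEnabled = triggerable   // non-triggerable tools ignore trigger operations
            tool.isClickable = triggerable
        }
    }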
In a possible implementation manner, the display module 1001 is configured to play the video in the target interface if a video playing condition is met;
the video playing condition includes at least one of:
the content item has been acquired;
the target interface is a top-level view, and the top-level view is the view presented to the user by the device;
tool elements in the target interface are in a non-triggerable state;
the target interface is inhibited from responding to a target gesture, the target gesture being a gesture to exit the target interface.
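Assuming, for illustration only, an implementation that requires all of the conditions listed above to hold before playback starts (the embodiment itself only requires at least one of them), the check could be sketched as follows; every field name is hypothetical:

    // State flags that a concrete implementation might keep.
    data class PlaybackState(
        val contentItemAcquired: Boolean,        // the content item has been acquired
        val targetInterfaceOnTop: Boolean,       // the target interface is the top-level view
        val toolElementsNonTriggerable: Boolean, // tool elements are in the non-triggerable state
        val exitGestureInhibited: Boolean        // the exit (target) gesture is inhibited
    )

    fun canPlayVideo(state: PlaybackState): Boolean =
        state.contentItemAcquired &&
        state.targetInterfaceOnTop &&
        state.toolElementsNonTriggerable &&
        state.exitGestureInhibited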
In one possible implementation, the target interface is an audio playing interface of any song; the display module 1001 is further configured to perform any one of the following steps:
pausing the playing of the song and playing the video in the target interface;
and playing the video in the target interface while playing the song.
Fig. 12 shows a block diagram of a terminal 1200 according to an exemplary embodiment of the present application. The terminal 1200 may be a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 1200 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, and so on.
The terminal 1200 includes: a processor 1201 and a memory 1202.
The processor 1201 may include one or more processing cores, such as a 4-core processor, an 8-core processor, or the like. The processor 1201 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1201 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1201 may be integrated with a GPU (Graphics Processing Unit) that is responsible for rendering and drawing content that the display screen needs to display. In some embodiments, the processor 1201 may further include an AI (Artificial Intelligence) processor for processing a computing operation related to machine learning.
Memory 1202 may include one or more computer-readable storage media, which may be non-transitory. Memory 1202 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 1202 is used to store at least one program code for execution by the processor 1201 to implement the content item presentation methods provided by the method embodiments herein.
In some embodiments, the terminal 1200 may further optionally include: a peripheral interface 1203 and at least one peripheral. The processor 1201, memory 1202, and peripheral interface 1203 may be connected by a bus or signal line. Various peripheral devices may be connected to peripheral interface 1203 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1204, display 1205, camera assembly 1206, audio circuitry 1207, positioning assembly 1208, and power supply 1209.
The peripheral interface 1203 may be used to connect at least one peripheral associated with I/O (Input/Output) to the processor 1201 and the memory 1202. In some embodiments, the processor 1201, memory 1202, and peripheral interface 1203 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1201, the memory 1202 and the peripheral device interface 1203 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 1204 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 1204 communicates with a communication network and other communication devices by electromagnetic signals. The radio frequency circuit 1204 converts an electric signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electric signal. Optionally, the radio frequency circuit 1204 comprises: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1204 may communicate with other terminals through at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: metropolitan area networks, various generation mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the rf circuit 1204 may further include NFC (Near Field Communication) related circuits, which are not limited in this application.
The display screen 1205 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1205 is a touch display screen, the display screen 1205 also has the ability to acquire touch signals on or over the surface of the display screen 1205. The touch signal may be input to the processor 1201 as a control signal for processing. At this point, the display 1205 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, the display 1205 may be one, providing the front panel of the terminal 1200; in other embodiments, the display 1205 can be at least two, respectively disposed on different surfaces of the terminal 1200 or in a folded design; in still other embodiments, the display 1205 may be a flexible display disposed on a curved surface or on a folded surface of the terminal 1200. Even further, the display screen 1205 may be arranged in a non-rectangular irregular figure, i.e., a shaped screen. The Display panel 1205 can be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or other materials.
Camera assembly 1206 is used to capture images or video. Optionally, camera assembly 1206 includes a front camera and a rear camera. The front camera is arranged on the front panel of the terminal, and the rear camera is arranged on the back of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 1206 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuitry 1207 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals into the processor 1201 for processing or inputting the electric signals into the radio frequency circuit 1204 to achieve voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided at different locations of terminal 1200. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1201 or the radio frequency circuit 1204 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 1207 may also include a headphone jack.
The positioning component 1208 is configured to locate a current geographic Location of the terminal 1200 to implement navigation or LBS (Location Based Service). The Positioning component 1208 can be a Positioning component based on the united states GPS (Global Positioning System), the chinese beidou System, the russian graves System, or the european union galileo System.
The power supply 1209 is used to provide power to various components within the terminal 1200. The power source 1209 may be alternating current, direct current, disposable or rechargeable. When the power source 1209 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal 1200 further includes one or more sensors 1210. The one or more sensors 1210 include, but are not limited to: an acceleration sensor 1211, a gyro sensor 1212, a pressure sensor 1213, a fingerprint sensor 1214, an optical sensor 1215, and a proximity sensor 1216.
The acceleration sensor 1211 can detect magnitudes of accelerations on three coordinate axes of the coordinate system established with the terminal 1200. For example, the acceleration sensor 1211 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1201 may control the display screen 1205 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1211. The acceleration sensor 1211 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1212 may detect a body direction and a rotation angle of the terminal 1200, and the gyro sensor 1212 may collect a 3D motion of the user on the terminal 1200 in cooperation with the acceleration sensor 1211. The processor 1201 can implement the following functions according to the data collected by the gyro sensor 1212: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 1213 may be disposed on the side frames of terminal 1200 and/or underlying display 1205. When the pressure sensor 1213 is disposed on the side frame of the terminal 1200, the user's holding signal of the terminal 1200 can be detected, and the processor 1201 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1213. When the pressure sensor 1213 is disposed at a lower layer of the display screen 1205, the processor 1201 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 1205. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1214 is used for collecting a fingerprint of the user, and the processor 1201 identifies the user according to the fingerprint collected by the fingerprint sensor 1214, or the fingerprint sensor 1214 identifies the user according to the collected fingerprint. When the user identity is identified as a trusted identity, the processor 1201 authorizes the user to perform relevant sensitive operations, including unlocking a screen, viewing encrypted information, downloading software, paying, changing settings, and the like. The fingerprint sensor 1214 may be provided on the front, back, or side of the terminal 1200. When a physical button or vendor Logo is provided on the terminal 1200, the fingerprint sensor 1214 may be integrated with the physical button or vendor Logo.
The optical sensor 1215 is used to collect the ambient light intensity. In one embodiment, the processor 1201 may control the display brightness of the display screen 1205 according to the ambient light intensity collected by the optical sensor 1215. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1205 is increased; when the ambient light intensity is low, the display brightness of the display screen 1205 is turned down. In another embodiment, the processor 1201 may also dynamically adjust the shooting parameters of the camera assembly 1206 based on the ambient light intensity collected by the optical sensor 1215.
The proximity sensor 1216, also called a distance sensor, is disposed on the front panel of the terminal 1200. The proximity sensor 1216 is used to collect the distance between the user and the front surface of the terminal 1200. In one embodiment, when the proximity sensor 1216 detects that the distance between the user and the front surface of the terminal 1200 gradually decreases, the processor 1201 controls the display screen 1205 to switch from the bright-screen state to the screen-off state; when the proximity sensor 1216 detects that the distance between the user and the front surface of the terminal 1200 gradually increases, the processor 1201 controls the display screen 1205 to switch from the screen-off state to the bright-screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 12 is not intended to be limiting of terminal 1200 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
Fig. 13 is a schematic structural diagram of a server 1300 according to an embodiment of the present application. The server 1300 may vary greatly in configuration or performance, and may include one or more processors (CPUs) 1301 and one or more memories 1302, where the memory 1302 stores at least one program code, and the at least one program code is loaded and executed by the processor 1301 to implement the methods provided by the above method embodiments. Of course, the server may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface for input and output, and may further include other components for implementing device functions, which are not described in detail here.
The server 1300 is configured to perform the steps performed by the server in the above method embodiments.
In an exemplary embodiment, a computer-readable storage medium, such as a memory, is also provided, which includes program code executable by a processor in a computer device to perform the content item presentation method of the above embodiments. For example, the computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer program or a computer program product is also provided, which comprises computer program code, which, when executed by a terminal, causes the terminal to implement the content item presentation method in the above-described embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, and the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (24)

1. A method of content item presentation, the method comprising:
displaying a target interface, the target interface comprising interface elements;
under the condition that a content item display condition is met, acquiring a content item to be recommended, wherein the content item at least comprises a video of an object to be recommended;
controlling interface elements in the target interface to disappear;
playing a video in the content item in the target interface after the interface element disappears.
2. The method of claim 1, wherein the controlling of interface elements in the target interface to disappear comprises:
acquiring a disappearing special effect parameter corresponding to the display mode based on the display mode of the interface element;
and controlling the interface element to disappear based on the acquired disappearing special effect parameter.
3. The method according to claim 2, wherein the controlling the interface element to disappear based on the obtained disappearance special effect parameter comprises any one of the following steps:
the display mode of the interface element is a first display mode, the disappearing special effect parameter corresponding to the first display mode is a first disappearing special effect parameter, the interface element is controlled to move out of the target interface based on the first disappearing special effect parameter, and the transparency of the interface element is controlled to be gradually reduced to 0 in the process of moving out of the target interface;
the display mode of the interface element is a second display mode, the disappearing special effect parameter corresponding to the second display mode is a second disappearing special effect parameter, the interface element is controlled to be gradually reduced until disappearing based on the second disappearing special effect parameter, and the transparency of the interface element is controlled to be reduced to 0 in the reducing process.
4. The method of claim 1, wherein the controlling of interface elements in the target interface to disappear comprises at least one of:
controlling the interface element to move out of the target interface;
controlling the transparency of the interface element to gradually decrease until the transparency is 0;
and controlling the interface element to gradually shrink until the interface element disappears.
5. The method of claim 1, wherein after the interface element disappears, and after the video in the content item is played in the target interface, the method further comprises:
and controlling the interface element to be restored into the target interface.
6. The method of claim 5, wherein the controlling the interface element to revert into the target interface comprises:
acquiring a recovery special effect parameter corresponding to the display mode based on the display mode of the interface element;
and controlling the interface element to be restored into the target interface based on the acquired recovery special effect parameter.
7. The method according to claim 6, wherein the controlling the interface element to be restored into the target interface based on the acquired recovery special effect parameter comprises any one of the following steps:
the display mode of the interface element is a first display mode, the recovery special effect parameter corresponding to the first display mode is a first recovery special effect parameter, the interface element is controlled to move back to the target interface based on the first recovery special effect parameter, and the transparency of the interface element is controlled to be increased to 1 in the process of moving back to the target interface;
the display mode of the interface element is a second display mode, the recovery special effect parameter corresponding to the second display mode is a second recovery special effect parameter, the interface element is controlled to be gradually enlarged to a target size based on the second recovery special effect parameter, in the enlargement process, the transparency of the interface element is controlled to be increased to 1, and the target size is the initial size of the interface element in the target interface.
8. The method of claim 5, wherein the controlling the interface element to revert into the target interface comprises at least one of:
controlling the interface element to move from a current position to an initial position of the interface element in the target interface;
controlling the interface element to be enlarged from the current size to the initial size of the interface element in the target interface;
and controlling the interface element to increase from the current transparency to 1.
9. The method of claim 1, further comprising:
after the video is played, displaying the last video frame of the video;
and controlling the video frame to move from the current position to the target position.
10. The method of claim 9, wherein the controlling the video frame to move from the current position to the target position comprises:
and controlling the video frame to move from the current position to the target position, and controlling the video frame to gradually shrink in the moving process.
11. The method of claim 10, wherein the controlling the video frame to move from the current position to the target position and to gradually shrink during the moving comprises:
and controlling the video frame to move from the current position to the target position, and controlling the video frame to gradually shrink until the video frame disappears in the moving process.
12. The method of claim 9, wherein the content item further comprises descriptive information of the object; after the controlling the video frame to move from the current position to the target position, the method further comprises:
controlling the video frame to disappear;
and displaying the description information in the content item in the area to which the target position belongs.
13. The method of claim 9, wherein the content item further comprises a logo image of the object, and wherein after controlling the video frame to move from a current position to a target position, the method further comprises:
controlling the video frame to disappear;
displaying a logo image in the content item at the target location.
14. The method of claim 13, wherein displaying a logo image in the content item at the target location comprises any one of:
controlling the logo image to disappear after popping up at the target position;
and controlling the logo image to be restored after popping up at the target position.
15. The method of claim 1, further comprising:
responding to the trigger operation of the target interface, asynchronously requesting to acquire the content item, and loading the content item to the local;
the obtaining of the content item to be recommended includes:
the content item is retrieved from the local.
16. The method of claim 1, wherein the content item presentation condition comprises at least one of:
the data type of the content item indicates that the presentation style of the content item is a target presentation style;
the current time is the presentation time indicated by the time configuration information of the content item, and the time configuration information includes the duration of waiting of the content item before presentation.
17. The method of claim 1, wherein the interface elements include view elements and tool elements; the controlling of the interface elements in the target interface to disappear comprises:
and controlling the view elements in the target interface to disappear, and keeping the tool elements in the target interface unchanged.
18. The method of claim 17, further comprising:
and updating the state of the tool element to be in a non-triggerable state, wherein the tool element in the non-triggerable state does not respond to a trigger operation.
19. The method of claim 18, further comprising:
after the video playing is completed, updating the state of the tool element to a triggerable state, wherein the tool element in the triggerable state responds to a trigger operation.
20. The method of claim 1, wherein the playing the video of the content item in the target interface comprises:
under the condition that a video playing condition is met, playing the video in the target interface;
the video playing condition includes at least one of:
the content item has been acquired;
the target interface is a top-level view, and the top-level view is the view presented to the user by the device;
tool elements in the target interface are in a non-triggerable state;
the target interface is inhibited from responding to a target gesture, the target gesture being a gesture to exit the target interface.
21. The method of claim 1, wherein the target interface is an audio playback interface for any song; the playing the video in the content item in the target interface comprises any one of the following steps:
pausing the playing of the song and playing the video in the target interface;
and playing the video in the target interface while playing the song.
22. A content item presentation apparatus, characterized in that the apparatus comprises:
a display module to display a target interface, the target interface including interface elements;
the device comprises an acquisition module, a recommendation module and a recommendation module, wherein the acquisition module is used for acquiring a content item to be recommended under the condition that a content item display condition is met, and the content item at least comprises a video of an object to be recommended;
the display module is further used for controlling interface elements in the target interface to disappear;
the display module is further configured to play the video in the content item in the target interface after the interface element disappears.
23. A terminal, characterized in that the terminal comprises one or more processors and one or more memories having stored therein at least one program code, which is loaded and executed by the one or more processors to implement the operations executed by the content item presentation method according to any one of claims 1 to 21.
24. A computer-readable storage medium having stored therein at least one program code, which is loaded and executed by a processor to perform operations executed by a content item presentation method according to any one of claims 1 to 21.
CN202110959710.7A 2021-08-20 2021-08-20 Content item display method, device, terminal and storage medium Pending CN113626127A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110959710.7A CN113626127A (en) 2021-08-20 2021-08-20 Content item display method, device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110959710.7A CN113626127A (en) 2021-08-20 2021-08-20 Content item display method, device, terminal and storage medium

Publications (1)

Publication Number Publication Date
CN113626127A true CN113626127A (en) 2021-11-09

Family

ID=78386903

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110959710.7A Pending CN113626127A (en) 2021-08-20 2021-08-20 Content item display method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN113626127A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100269030A1 (en) * 2009-04-17 2010-10-21 Jokaroo Entertainment Inc. Method and device for temporally displaying advertising content on a webpage
US20170060405A1 (en) * 2015-08-28 2017-03-02 Facebook, Inc. Systems and methods for content presentation
CN108008886A (en) * 2017-11-28 2018-05-08 上海量明科技发展有限公司 The method, apparatus and system of advertisement are exported on a display screen
CN108965980A (en) * 2018-07-20 2018-12-07 腾讯科技(深圳)有限公司 Recommendation display methods, device, terminal and storage medium
CN110221748A (en) * 2019-05-23 2019-09-10 厦门美柚信息科技有限公司 Show the method, device and mobile terminal of page parallax
CN110889060A (en) * 2019-11-05 2020-03-17 北京三快在线科技有限公司 Webpage display method and device, computer equipment and storage medium
CN111221450A (en) * 2020-01-02 2020-06-02 杭州网易云音乐科技有限公司 Information display method and device, electronic equipment and storage medium
WO2020173199A1 (en) * 2019-02-27 2020-09-03 北京市商汤科技开发有限公司 Display method and device, electronic device and storage medium
CN112073405A (en) * 2020-09-03 2020-12-11 中国平安财产保险股份有限公司 Webpage data loading method and device, computer equipment and storage medium


Similar Documents

Publication Publication Date Title
CN109977333B (en) Webpage display method and device, computer equipment and storage medium
WO2022088884A1 (en) Page display method and terminal
CN108449641B (en) Method, device, computer equipment and storage medium for playing media stream
CN109660855B (en) Sticker display method, device, terminal and storage medium
CN108737897B (en) Video playing method, device, equipment and storage medium
CN112181572A (en) Interactive special effect display method and device, terminal and storage medium
CN109327608B (en) Song sharing method, terminal, server and system
CN109144346B (en) Song sharing method and device and storage medium
CN109275013B (en) Method, device and equipment for displaying virtual article and storage medium
CN111368114B (en) Information display method, device, equipment and storage medium
WO2023000677A1 (en) Content item display method and apparatus
CN114116053B (en) Resource display method, device, computer equipment and medium
CN113395566B (en) Video playing method and device, electronic equipment and computer readable storage medium
CN112667835A (en) Work processing method and device, electronic equipment and storage medium
CN112383808A (en) Video playing method, device, terminal and storage medium
CN110662105A (en) Animation file generation method and device and storage medium
CN111083526B (en) Video transition method and device, computer equipment and storage medium
CN112131422A (en) Expression picture generation method, device, equipment and medium
CN112257006A (en) Page information configuration method, device, equipment and computer readable storage medium
CN109614563B (en) Method, device and equipment for displaying webpage and storage medium
CN113204672B (en) Resource display method, device, computer equipment and medium
CN113377270B (en) Information display method, device, equipment and storage medium
CN113032590B (en) Special effect display method, device, computer equipment and computer readable storage medium
CN112770177B (en) Multimedia file generation method, multimedia file release method and device
CN110992268B (en) Background setting method, device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination