CN115905374A - Application function display method and device, terminal and storage medium


Info

Publication number
CN115905374A
Authority
CN
China
Prior art keywords: function, interface, background, application, content
Prior art date
Legal status: Pending
Application number
CN202110821148.1A
Other languages
Chinese (zh)
Inventor
吴启亮
郭瑄
吴雷
陈玮彤
徐思敏
曾兰
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202110821148.1A
Publication of CN115905374A

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application disclose a method, an apparatus, a terminal and a storage medium for displaying an application function, and belong to the technical field of application programs. The method includes: in response to an application start operation, displaying a first function introduction interface, where the first function introduction interface includes a function interaction object corresponding to a first application function, an interface background of the first function introduction interface matches the object content of the function interaction object, and the object content is determined based on a user portrait; in response to a trigger operation on the function interaction object, displaying a function display interface of the first application function, where the display content in the function display interface matches the object content; and in response to a trigger operation on a background interaction object in the interface background, displaying an information display interface, where the information display interface includes object information of the background interaction object, and the background interaction object matches the object content. The scheme of the embodiments of the present application can improve the user's understanding of application functions and increase the usage rate of application functions.

Description

Application function display method and device, terminal and storage medium
Technical Field
The embodiment of the application relates to the technical field of application programs, in particular to a method, a device, a terminal and a storage medium for displaying an application function.
Background
Application programs (Apps) of all kinds bring great convenience to people's daily life, and their product functions are becoming increasingly rich and diversified. When an application is used for the first time, it typically introduces its functions to the user through a function introduction interface.
In the related art, when a user opens an App for the first time after downloading it, a function introduction interface is displayed before the function use interface. The function introduction interface usually presents the core functions of the App in the form of text, pictures and the like, so that the user can learn about the application functions before entering the function use interface and using the App further.
However, the manner of introducing application functions in the above scheme is monotonous and the content lacks richness, so the user gains only a limited understanding of the application functions, and the usage rate of the application functions is low.
Disclosure of Invention
The embodiments of the present application provide a method, an apparatus, a terminal and a storage medium for displaying an application function, which can introduce application functions according to user preference, so as to improve the user's understanding of the application functions and help increase their usage rate. The technical scheme is as follows:
in one aspect, an embodiment of the present application provides a method for displaying an application function, where the method includes:
in response to an application start operation, displaying a first function introduction interface, where the first function introduction interface includes a function interaction object corresponding to a first application function, an interface background of the first function introduction interface matches the object content of the function interaction object, and the object content is determined based on a user portrait;
in response to a trigger operation on the function interaction object, displaying a function display interface of the first application function, where the display content in the function display interface matches the object content; and
in response to a trigger operation on a background interaction object in the interface background, displaying an information display interface, where the information display interface includes object information of the background interaction object, and the background interaction object matches the object content.
On the other hand, an embodiment of the present application provides an application function display apparatus, where the apparatus includes:
a first display module, configured to display a first function introduction interface in response to an application start operation, where the first function introduction interface includes a function interaction object corresponding to a first application function, the interface background of the first function introduction interface matches the object content of the function interaction object, and the object content is determined based on a user portrait;
a second display module, configured to display a function display interface of the first application function in response to a trigger operation on the function interaction object, where the display content in the function display interface matches the object content; and
a third display module, configured to display an information display interface in response to a trigger operation on a background interaction object in the interface background, where the information display interface includes object information of the background interaction object, and the background interaction object matches the object content.
In another aspect, an embodiment of the present application provides a terminal, where the terminal includes a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or a set of instructions, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by the processor to implement the method for presenting application functions according to the foregoing aspect.
In another aspect, an embodiment of the present application provides a computer-readable storage medium, in which at least one instruction, at least one program, a code set, or a set of instructions is stored, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by a processor to implement the method for exposing application functions according to the above aspect.
In another aspect, embodiments of the present application provide a computer program product or a computer program, which includes computer instructions stored in a computer-readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions to enable the computer device to execute the application function presentation method provided by the aspect.
The technical scheme provided by the application can comprise the following beneficial effects:
In the embodiments of the present application, the object content to be displayed in the function introduction interface is determined based on the user portrait, and the function interaction object and an interface background matched with that object content are then displayed in the function introduction interface, so that personalized content is presented to different users and both the richness of the content and its degree of matching with the user are improved. In addition, compared with static function presentation through text and pictures, both the function interaction object in the function introduction interface and the background interaction object in the interface background support interaction, so that the user can get to know the application function through trigger operations in the function introduction interface, which further deepens the user's understanding of the application function and increases its usage rate.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and, together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic illustration of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 2 is a flowchart of a method for exposing application functionality provided by an exemplary embodiment of the present application;
FIG. 3 is an interface diagram of an application functionality exposure process shown in an exemplary embodiment of the present application;
FIG. 4 is a flowchart of a method for exposing application functionality provided by another exemplary embodiment of the present application;
FIG. 5 is a schematic diagram of an interface of a search function introduction interface in different scenarios;
FIG. 6 is a schematic diagram of an interface of a function introduction interface corresponding to different application functions;
FIG. 7 is an interface diagram illustrating a jump process in accordance with an illustrative embodiment;
FIG. 8 is an interface diagram illustrating a model interaction process, according to an exemplary embodiment;
FIG. 9 is an interface diagram illustrating a functionality introduction interface switching process in accordance with an illustrative embodiment;
FIG. 10 is a flow chart of a function interaction object and interface background display process;
FIG. 11 is a schematic diagram of a background switching process in a functionality introduction interface;
FIG. 12 is a block diagram of an application function presentation apparatus according to an exemplary embodiment of the present application;
FIG. 13 is a block diagram of a terminal according to an exemplary embodiment of the present application.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
Referring to fig. 1, a schematic diagram of an implementation environment provided by an exemplary embodiment of the present application is shown, where the implementation environment includes a terminal 110 and a server 120. Data communication between the terminal 110 and the server 120 is performed through a communication network. Optionally, the communication network may be a wired network or a wireless network, and may be at least one of a local area network, a metropolitan area network, and a wide area network.
The terminal 110 is an electronic device running an application program. The electronic device may be a mobile terminal such as a smart phone, a tablet computer or a laptop computer, or a terminal such as a desktop computer or a projection computer, which is not limited in this embodiment of the present application.
The application programs running in the terminal 110 may include a browser application, an information reading application, an instant messaging application, a social application, a video sharing application, a game application, a shopping application, a payment application, and the like, which is not limited in this embodiment.
In the embodiment of the present application, when the application function introduction condition is satisfied, in the process of running the application program, the terminal 110 firstly introduces the function of the application program through a plurality of function introduction interfaces, and displays the use interface of the application program after completing the function introduction. In some embodiments, the application function introduction condition may be a first use after the application is installed, or a first use after the application is updated.
The server 120 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a Network service, cloud communication, a middleware service, a domain name service, a security service, a Content Delivery Network (CDN), a big data and artificial intelligence platform, and the like. In this embodiment, the server 120 is a background server of an application program in the terminal 110.
In this embodiment, the server 120 is configured to provide the content displayed in the function introduction interface to the application program when the application program shows that interface. In some embodiments, the server 120 provides the application with content matching the application user based on the user portrait of that user; that is, the content provided by the server 120 may differ between application users, so that the application introduces its functions in a targeted manner according to user preference.
In one possible application scenario, the application program is a browser application and the function introduction interface is used to introduce the search function of the browser application. As shown in fig. 1, the terminal 110 provides the user portrait to the server 120; the server 120 queries the search term database 130 for a target search term matching the user portrait, searches the video database 140 for a target background video matching the target search term, and feeds the target search term and target background video back to the terminal 110. The terminal 110 uses the target background video as the interface background of the function introduction interface and introduces the function in that interface according to the target search term.
Further, the interface elements in the function introduction interface support interaction. When a trigger operation on an interface element is received, the terminal 110 obtains related information of the interface element from the server 120 and displays it, so that the user gains a deeper understanding of the application function and the probability that the user subsequently uses the application function is increased.
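The data flow just described can be sketched as follows in Kotlin; this is only an illustration of the terminal-server interaction, and every type and function name (UserPortrait, IntroContentServer, and so on) is an assumption rather than an API defined by this application.

```kotlin
// A minimal sketch of the Figure 1 flow; all names below are illustrative assumptions.
data class UserPortrait(val tags: Set<String>)                       // e.g. {"countryside", "farm"}
data class IntroContent(val searchTerm: String, val backgroundVideoUrl: String)

interface IntroContentServer {
    // Server: pick a search term matching the portrait from the search term database,
    // then a background video matching that term from the video database.
    fun fetchIntroContent(portrait: UserPortrait): IntroContent
}

class SearchIntroController(private val server: IntroContentServer) {
    // Terminal: on application start, request personalized content; the returned video
    // becomes the interface background and the search term becomes the object content
    // of the search control in the function introduction interface.
    fun onApplicationStart(portrait: UserPortrait): IntroContent =
        server.fetchIntroContent(portrait)
}
```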
It should be noted that the above application scenario is only exemplary. In other possible application scenarios, the introduction of other application functions may also be implemented through data interaction between the terminal 110 and the server 120, and the embodiment of the present application does not limit the specific application function being introduced.
For convenience of description, the following embodiments illustrate the application of the method for displaying the application function to the implementation environment shown in fig. 1.
Referring to fig. 2, a flowchart of a method for presenting application functions according to an exemplary embodiment of the present application is shown, where the method includes:
step 201, responding to application starting operation, displaying a first function introduction interface, wherein the first function introduction interface comprises a function interaction object corresponding to a first application function, an interface background of the first function introduction interface is matched with object content of the function interaction object, and the object content is determined based on a user portrait.
In a possible implementation manner, when the application program is started for the first time after installation or when the application program is started for the first time after update, the terminal displays the first function introduction interface. The first function introduction interface is an interface for introducing a first application function, and the first application function may be any one of a plurality of application functions to be introduced in an application program.
In the embodiment of the present application, the function introduction interface is composed of a function interaction object corresponding to the application function and an interface background. The function interaction object is an experience entrance of the application function: by triggering the function interaction object, the user can experience the application function introduced by the current function introduction interface without entering the function use interface. Optionally, the function interaction object may be a function control corresponding to the application function.
In addition, in the embodiment of the present application, the object content of the function interaction object is not uniform content (that is, the object content of the function interaction object corresponding to the same application function may differ between users), but is determined based on the user portrait, so that the object content conforms to the user's characteristics (such as identity characteristics or preference characteristics). Correspondingly, compared with displaying uniform content in the function introduction interface, object content displayed based on the user portrait better matches the user's preferences, which helps increase the user's attention to the function introduction interface and lets the user learn about the application function more effectively through it.
Optionally, the number of the function interaction objects in the function introduction interface is at least one, and the object contents of different function interaction objects are different.
Optionally, the content type of the object content is related to an application function. For example, the object content is content related to a search function, content related to a video function, or content related to a reading function, and the specific content type of the object content is not limited in this embodiment.
In addition, in order to further improve the attention degree of the function introduction interface, the interface background of the function introduction interface is matched with the object content of the function interaction object. Optionally, the interface background may be a static background or a dynamic background, where the static background may be a static picture matched with the object content, and the dynamic background may be a video or a moving picture matched with the object content, and the specific type of the interface background is not limited in this embodiment of the application.
Illustratively, as shown in fig. 3, when receiving a start operation of the browser application, the terminal displays a search function introduction interface 31 (for introducing a search function of the browser application), where the search function introduction interface 31 includes a search function interaction object 311 and an interface background related to search content "chicken".
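As a hedged illustration of the interface composition in step 201, the introduction interface can be modeled as a background matched to the object content with the function interaction object layered on top; the names below are assumptions, not elements defined by this application.

```kotlin
// Hypothetical data model of the interface shown in step 201.
data class FunctionInteractionObject(val functionId: String, val objectContent: String)

data class FunctionIntroInterface(
    val backgroundUrl: String,                         // matched picture or video
    val interactionObjects: List<FunctionInteractionObject>
)

// E.g. the search introduction interface of fig. 3, whose background relates to
// the search content "chicken".
fun buildSearchIntro(searchTerm: String, backgroundVideoUrl: String) =
    FunctionIntroInterface(
        backgroundUrl = backgroundVideoUrl,
        interactionObjects = listOf(
            FunctionInteractionObject(functionId = "search", objectContent = searchTerm)
        )
    )
```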
Step 202: in response to a trigger operation on the function interaction object, display a function display interface of the first application function, where the display content in the function display interface matches the object content.
The function interaction object in the embodiment of the application supports interaction, and a user can experience an application function by triggering the function interaction object. Correspondingly, when the triggering operation of the function interaction object is received, the terminal displays a function display interface of the application function, and the display content in the function display interface is determined and obtained based on the application function and the object content in the function interaction object.
Regarding the presentation form of the function presentation interface, in one possible implementation, the function presentation interface is displayed in a floating window on the upper layer of the function introduction interface, and the function introduction interface can be displayed again by closing the floating window. Of course, the function display interface may also adopt other display forms, and the embodiment of the present application is not limited thereto.
In addition, the function display interface is used for displaying the result of triggering the function interaction object; it is not the function use interface of the application function, that is, the function display interface does not provide the complete functionality of the function use interface.
Illustratively, as shown in fig. 3, when a click operation on the search function interaction object 311 is received, the terminal displays the function display interface 32 in a floating window on the upper layer of the search function introduction interface 31, and the function display interface 32 presents the search result of the search term "What did the chicken evolve from?" contained in the search function interaction object 311.
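A minimal sketch of the trigger handling in step 202, under assumed names: tapping the function interaction object opens a floating function display interface whose content is derived from the object content, and closing it restores the introduction interface.

```kotlin
// Illustrative sketch; types and the derivation of the display content are assumptions.
sealed interface Screen
data class IntroScreen(val backgroundUrl: String, val objectContent: String) : Screen
data class FunctionDemoScreen(val under: IntroScreen, val displayContent: String) : Screen

fun onInteractionObjectTapped(intro: IntroScreen): Screen =
    // The display content is derived from the application function and the object
    // content, e.g. search results for the displayed search term; it is a
    // demonstration, not the full function use interface.
    FunctionDemoScreen(under = intro, displayContent = "search results for: ${intro.objectContent}")

fun onFloatingWindowClosed(demo: FunctionDemoScreen): Screen = demo.under
```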
Step 203: in response to a trigger operation on a background interaction object in the interface background, display an information display interface, where the information display interface includes object information of the background interaction object, and the background interaction object matches the object content.
In the embodiment of the present application, in addition to the function interaction object, the background interaction object in the interface background also supports interaction, which further strengthens the user's understanding and impression of the application function. When a trigger operation on a background interaction object in the interface background is received, the terminal displays the object information of the triggered background interaction object in an information display interface. The background interaction object in the interface background matches the object content, that is, the background interaction object also conforms to the user's characteristics, which increases the probability that the user triggers it.
Optionally, the number of the background interactive objects in the interface background is at least one, and the object information displayed in the information display interface corresponding to different background interactive objects may be the same or different.
Regarding the presentation form of the information presentation interface, in a possible implementation manner, the information presentation interface is displayed on the upper layer of the function introduction interface (a covering layer is arranged between the information presentation interface and the function introduction interface), and the function introduction interface can be restored by closing the information presentation interface. Of course, the information display interface may also adopt other display forms, and the embodiment of the present application does not limit this.
Optionally, the object information displayed in the information display interface may be text information, picture information, audio/video information, or a three-dimensional model, and the like, and the specific type of the object information is not limited in the embodiment of the present application.
Illustratively, as shown in fig. 3, the interface background includes a background interactive object 312 (chicken) related to the search term, when a click operation on the background interactive object 312 is received, the terminal displays an information display interface 33 on the upper layer of the search function introduction interface 31, and the information display interface 33 displays related information of "chicken".
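A corresponding sketch of step 203, again under assumed names: tapping a background interaction object fetches its object information from the server and shows it in an overlaid information display interface.

```kotlin
// Illustrative sketch; the server interface and the overlay callback are assumptions.
data class BackgroundInteractiveObject(val name: String)
data class ObjectInfo(val title: String, val description: String, val modelUrl: String? = null)

interface ObjectInfoServer {
    fun fetchObjectInfo(objectName: String): ObjectInfo
}

fun onBackgroundObjectTapped(
    tapped: BackgroundInteractiveObject,
    server: ObjectInfoServer,
    showInfoOverlay: (ObjectInfo) -> Unit
) {
    // E.g. tapping the outlined "chicken" in fig. 3 shows text, pictures or a
    // three-dimensional model about it; closing the overlay restores the intro interface.
    showInfoOverlay(server.fetchObjectInfo(tapped.name))
}
```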
In summary, in the embodiment of the present application, the object content to be displayed in the function introduction interface is determined based on the user portrait, and the function interaction object and an interface background matched with that object content are displayed accordingly, so that the function introduction interface presents personalized content to different users and both the richness of the content and its degree of matching with the user are improved. In addition, compared with static function presentation through text and pictures, both the function interaction object in the function introduction interface and the background interaction object in the interface background support interaction, so that the user can get to know the application function through trigger operations in the function introduction interface, which further deepens the user's understanding of the application function and increases its usage rate.
In a possible implementation manner, in order to let the user see more intuitively which background interactive objects in the interface background can be interacted with, the terminal marks the background interactive objects, which is described below through an exemplary embodiment.
Referring to fig. 4, a flowchart of a method for presenting application functions provided by another exemplary embodiment of the present application is shown, where the method includes:
Step 401: in response to an application start operation, acquire the object content and the interface background based on the user portrait.
Regarding the manner of obtaining the user portrait, in one possible embodiment, when the application program is run for the first time after installation, if the application has permission (subject to user authorization) to obtain application data of other applications (for example, other applications developed by the same developer), the application obtains the user portrait from those other applications.
In other possible embodiments, when the application program is run for the first time after being updated, the application program uses the user portrait it has constructed itself, which is built based on the user's historical behavior. Of course, the user portrait may also be obtained in other manners (for example, the user portrait is stored in the server, and the server obtains it directly according to the terminal identifier), which is not limited in this embodiment of the application.
In some embodiments, the server is provided with a content database and a background database, where the content database includes candidate object contents for the function interaction object and the background database includes candidate interface backgrounds, and the terminal obtains, from the server and according to the user portrait, object content and an interface background that match the user's characteristics.
Optionally, each candidate object content in the content database has a corresponding content tag. When the object content is acquired based on the user portrait, the server matches the user tags in the user portrait against the content tags and selects the object content based on the tag matching result.
In an illustrative example, for the search function of the browser application, the search term database of the server includes candidate search terms and their content tags; the correspondence between candidate search terms and content tags is shown in Table 1.
Table 1: correspondence between candidate search terms and content tags (table provided as an image)
When the obtained user portrait contains tags such as "rural" and "farm", the server determines the candidate search term "What did the chicken evolve from?" as the search term to be fed back.
Optionally, each candidate interface background in the background database has a corresponding background tag, and the server matches the content tag of the object content against the background tags, thereby determining the interface background corresponding to the object content based on the tag matching result. The interface background corresponding to the object content may be determined by the server in real time, or determined by the server in advance, in which case the correspondence between object content and interface background is stored.
In an illustrative example, when the candidate object content is a search term and the candidate interface background is a video, the correspondence between candidate search terms, candidate videos and video tags (i.e., background tags) is shown in Table 2.
Table 2: correspondence between candidate search terms, candidate videos and video tags

Candidate search term | Candidate video | Video tags
What did the chicken evolve from? | Video A | chicken, countryside, farm, chicken farm
What should be noted when raising pigs? | Video B | pig, countryside, pig farm
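The tag matching illustrated by Tables 1 and 2 can be sketched as follows; the scoring rule (largest tag overlap) and the tag sets for Table 1 are assumptions, since the embodiment only states that selection is based on the tag matching result.

```kotlin
// Illustrative sketch of selecting a search term and its matching background video.
data class CandidateTerm(val term: String, val contentTags: Set<String>)
data class CandidateVideo(val term: String, val videoId: String, val videoTags: Set<String>)

fun pickSearchTerm(userTags: Set<String>, candidates: List<CandidateTerm>): CandidateTerm? =
    candidates.maxByOrNull { (it.contentTags intersect userTags).size }
        ?.takeIf { (it.contentTags intersect userTags).isNotEmpty() }

fun pickBackgroundVideo(term: CandidateTerm, videos: List<CandidateVideo>): CandidateVideo? =
    videos.firstOrNull { it.term == term.term }

fun main() {
    // Content tags assumed by analogy with the video tags in Table 2.
    val terms = listOf(
        CandidateTerm("What did the chicken evolve from?", setOf("chicken", "countryside", "farm")),
        CandidateTerm("What should be noted when raising pigs?", setOf("pig", "countryside", "pig farm"))
    )
    val videos = listOf(
        CandidateVideo("What did the chicken evolve from?", "Video A",
            setOf("chicken", "countryside", "farm", "chicken farm")),
        CandidateVideo("What should be noted when raising pigs?", "Video B",
            setOf("pig", "countryside", "pig farm"))
    )
    // A user portrait carrying countryside/farm-style tags selects the chicken term and Video A.
    val picked = pickSearchTerm(setOf("countryside", "farm"), terms)
    println("${picked?.term} -> ${picked?.let { pickBackgroundVideo(it, videos)?.videoId }}")
}
```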
In one possible implementation, in response to an application start operation, the terminal sends a content and background acquisition instruction to the server based on the user portrait (the instruction may include the user portrait itself or only a user portrait identifier, in which case the server queries locally based on the identifier), instructing the server to feed back object content and an interface background that match the user's characteristics based on the user portrait.
Illustratively, as shown in fig. 5, when the user portrait indicates that the user belongs to a sinking-market scenario, the terminal displays a first search function introduction interface 51, where the first search function introduction interface 51 includes a first function interaction object 511 related to the sinking-market scenario and a first interface background 512; when the user portrait indicates that the user belongs to an urban scenario, the terminal displays a second search function introduction interface 52, where the second search function introduction interface 52 includes a second function interaction object 521 related to the urban scenario and a second interface background 522.
Step 402: display the interface background in the first function introduction interface, and display the function interaction object on the upper layer of the interface background based on the object content.
Further, the terminal generates a function interaction object based on the object content and the application function to be introduced, so that the function interaction object is displayed on the upper layer of the interface background, and a function introduction interface is obtained. It should be noted that, besides displaying the function interaction object, the upper layer of the interface background may also display other interface elements, such as a function introduction text, a skip control, and the like, which is not limited in this embodiment.
The form of the function interaction object is different for different application functions, and in a possible implementation, when the first application function is a search function, the function interaction object is a search control containing a search word (i.e., object content); when the first application function is a reading function, the function interaction object is a browsing control containing reading content; when the first application function is a video function, the function interaction object is a play control containing video content.
Illustratively, as shown in fig. 6, for the search function, a search control 611 containing a search term is displayed in the search function introduction interface 61, and the first interface background 612 is a short video related to the search term; for the reading function, a reading control 621 containing reading content is displayed in the reading function introduction interface 62, and the second interface background 622 is a short video related to the reading content.
It should be noted that, this embodiment is only schematically described by taking the above application function and the function interaction object corresponding to the application function as an example, in other possible embodiments, a developer may also set the function interaction object according to other possible application functions, and this embodiment does not limit this.
In step 403, the background interactive objects in the interface background are marked.
Regarding the specific way of marking the background interactive object, in a possible implementation manner, the server identifies the background interactive object in the interface background in advance according to the object content matched with the interface background, and determines the marking information corresponding to the background interactive object; and when the interface background is provided for the terminal subsequently, the marking information corresponding to the background interaction object in the interface background is fed back to the terminal. Correspondingly, after the terminal acquires the marking information (used for indicating the display position of the background interactive object in the interface background), the terminal marks the background interactive object in the interface background based on the marking information.
The labeling information may be position information of an object contour of the background interactive object, and the terminal labels the object contour of the background interactive object according to the labeling information.
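A sketch of marking based on server-provided annotation information; the outline representation and the draw callback are assumptions, since the embodiment only states that the annotation information indicates the display position of the background interactive object.

```kotlin
// Illustrative sketch: the server ships, with the background, the display position
// (here an outline polygon) of each background interactive object, and the terminal draws it.
data class Point(val x: Float, val y: Float)
data class MarkInfo(val objectName: String, val outline: List<Point>)

fun markBackgroundObjects(
    marks: List<MarkInfo>,
    drawOutline: (List<Point>) -> Unit
) {
    // For each interactive object in the interface background, draw its outline
    // so the user can see that it responds to a trigger operation.
    marks.forEach { drawOutline(it.outline) }
}
```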
In another possible implementation, the step of determining the display position of the background interactive object may also be performed by the terminal, and optionally, this step includes the following sub-steps:
firstly, extracting content keywords of the object content.
In some embodiments, the terminal extracts keywords from the obtained object content to obtain content keywords, so as to label objects related to the content keywords in the interface background in the following. The content keywords extracted from the object content by the terminal are entity nouns, and the number of the content keywords is at least one.
Optionally, if the object content is text content, the terminal performs keyword extraction on the text content to obtain content keywords; if the object content is multimedia content (such as pictures or audios and videos), the terminal extracts keywords from the description information of the multimedia content to obtain content keywords.
Secondly, object recognition is carried out on the interface background based on the content keywords, and the recognized background interaction objects are marked.
Furthermore, the terminal identifies the interface background based on the extracted content keywords, and determines the object matched with the content keywords as a background interactive object, so as to label the background interactive object. The terminal can identify the object of the interface background through the neural network model and match the identified object with the content keyword, so as to determine the background interactive object and the display position of the background interactive object in the interface background.
Optionally, the manner of labeling the background interactive object may include labeling an object outline, highlighting an object, and the like, which is not limited in this embodiment.
It should be noted that, when a plurality of content keywords are extracted, the terminal performs object identification based on each content keyword, and the background interactive objects matched with different content keywords adopt the same or different marking manners, and the number and specific marking manners of the background interactive objects are not limited in this embodiment.
Schematically, as shown in fig. 5, in the sinking-market scenario, the terminal extracts the content keyword "chicken" from the object content "What did the chicken evolve from?", performs object recognition on the first interface background 512 based on this keyword, determines the "chicken" in the first interface background 512 as the first background interactive object 513, and labels its outline; in the urban scenario, the terminal extracts the content keyword "zebra crossing" from the object content "Who invented the zebra crossing?", performs object recognition on the second interface background 522 based on this keyword, determines the "zebra crossing" in the second interface background 522 as the second background interactive object 523, and labels its outline.
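The terminal-side alternative (keyword extraction followed by object recognition) can be sketched as below; the keyword extractor is a crude placeholder and the recognizer is abstracted away, since the embodiment only notes that a neural network model may be used. All names are illustrative.

```kotlin
// Illustrative sketch of determining background interactive objects on the terminal.
data class DetectedObject(val label: String, val boundingBox: List<Int>)

interface ObjectRecognizer {
    fun detect(backgroundFrame: ByteArray): List<DetectedObject>
}

// Placeholder extraction: keep entity nouns that appear in the object content.
fun extractContentKeywords(objectContent: String, entityNouns: Set<String>): Set<String> =
    entityNouns.filter { objectContent.contains(it, ignoreCase = true) }.toSet()

fun findBackgroundInteractiveObjects(
    objectContent: String,
    backgroundFrame: ByteArray,
    recognizer: ObjectRecognizer,
    entityNouns: Set<String>
): List<DetectedObject> {
    val keywords = extractContentKeywords(objectContent, entityNouns).map { it.lowercase() }.toSet()
    // Recognized objects whose label matches a content keyword become the background
    // interactive objects; the caller then outlines or highlights them.
    return recognizer.detect(backgroundFrame).filter { it.label.lowercase() in keywords }
}
```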
Step 404: in response to a trigger operation on the function interaction object, display a function display interface of the first application function, where the display content in the function display interface matches the object content.
The step 202 may be referred to in the implementation manner of this step, and this embodiment is not described herein again.
Illustratively, as shown in fig. 7, when a click operation is received on the search control 711 in the search function introduction interface 71, the terminal displays a search function presentation interface 72, and the search function presentation interface 72 includes presentation contents matched with the search terms.
Step 405: in response to a trigger operation on the jump control in the function display interface, jump to and display the function use interface of the first application function.
In order to facilitate the user to continue to view the content of the object of interest or continue to actually use the application function, in one possible implementation manner, a jump control is arranged in the function display interface displayed after the function interaction object is triggered, and the jump control is used for triggering a jump to the function use interface of the application function.
For example, when the function display interface is used for displaying a search function, the terminal skips to display the search interface after the user triggers the skip control; when the function display interface is used for displaying the reading function, the terminal skips to display the reading interface after the user triggers the skip control; and when the function display interface is used for displaying the video function, the terminal skips to display the video interface after the user triggers the skip control.
In some embodiments, the functional use interface displayed after the jump control is triggered also contains the content related to the object content. For example, when the object content is a search term, the functional use interface displayed after triggering the jump control is a search result interface searched according to the search term.
Optionally, the jump control is located in a fixed area of a page displayed on the function application interface, for example, located at the top of the page or located at the bottom of the page.
Illustratively, as shown in fig. 7, when a click operation on the search control 711 is received, the terminal displays the search function presentation interface 72, and the user can slide up and down in the search function presentation interface 72 to view the complete content. When the user scrolls to the bottom of the page, a jump button 721 is displayed in the search function presentation interface 72; after the user clicks the jump button 721, the terminal jumps to and displays the search interface 73, so that the user can use the search function.
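A sketch of the jump logic in step 405 under assumed names: the jump control leads to the function use interface and, for the search function, carries the object content (the search term) so that the displayed page is the corresponding search result interface.

```kotlin
// Illustrative sketch; routes and the pre-populated query are assumptions.
data class FunctionUseInterface(val route: String, val initialQuery: String? = null)

fun onJumpControlTapped(functionId: String, objectContent: String): FunctionUseInterface =
    when (functionId) {
        "search"  -> FunctionUseInterface(route = "search", initialQuery = objectContent)
        "reading" -> FunctionUseInterface(route = "reading")
        "video"   -> FunctionUseInterface(route = "video")
        else      -> FunctionUseInterface(route = functionId)
    }
```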
Step 406: in response to a trigger operation on a background interactive object in the interface background, display an information display interface, where the information display interface includes object information of the background interactive object, and the background interactive object matches the object content.
The step 203 may be referred to in the implementation manner of this step, and this embodiment is not described herein again.
Illustratively, as shown in fig. 8, when a click operation on a background interactive object 712 in the search function introduction interface 71 is received, the terminal displays an information presentation interface 74 corresponding to "chicken".
Step 407: in response to an interactive operation on the three-dimensional model in the information display interface, adjust the three-dimensional model based on the interactive operation, where the adjustment includes at least one of zooming out, zooming in and rotating.
In order to let the user get to know the background interactive object more intuitively, a three-dimensional model of the background interactive object is displayed in the information display interface. Optionally, a three-dimensional model library is provided in the server, and when a trigger operation on the background interactive object is received, the terminal obtains the three-dimensional model corresponding to the background interactive object from the server and displays it.
Furthermore, the three-dimensional model in the information display interface supports interaction, and a user can adjust the size and the angle of the three-dimensional model through an interaction gesture, so that a background interaction object can be observed more comprehensively.
Illustratively, as shown in fig. 8, the information presentation interface 74 includes text introduction information 741 of the background interactive object 712 and a three-dimensional model 742 of the background interactive object 712. The user may scale or rotate the three-dimensional model 742 through an interactive gesture.
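A sketch of the model interaction in step 407; the gesture parameters and the clamping range are illustrative assumptions.

```kotlin
// Illustrative sketch: a pinch gesture scales the model and a drag gesture rotates it.
data class ModelTransform(val scale: Float, val yawDegrees: Float, val pitchDegrees: Float)

fun applyPinch(t: ModelTransform, pinchFactor: Float): ModelTransform =
    t.copy(scale = (t.scale * pinchFactor).coerceIn(0.5f, 3.0f))      // zoom out / zoom in

fun applyDrag(t: ModelTransform, dxDegrees: Float, dyDegrees: Float): ModelTransform =
    t.copy(
        yawDegrees = (t.yawDegrees + dxDegrees) % 360f,               // rotate horizontally
        pitchDegrees = (t.pitchDegrees + dyDegrees).coerceIn(-90f, 90f)
    )
```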
In this embodiment, the terminal explicitly labels the background interactive object in the interface background based on the labeling information or the object identification result, so as to prompt the user to interact with the background interactive object, which is beneficial to improving the trigger rate of the background interactive object and deepening the understanding degree of the user on the application function.
In addition, the jump control is arranged in the function display interface, so that the user can further use the application function by triggering the jump control, and the use efficiency of the application function is improved; by displaying the three-dimensional model of the background interactive object in the information display interface, a user can more fully know the background interactive object in the interactive process of the three-dimensional model, the interest degree of the user in the application function is improved, and the utilization rate of the application function is further improved.
In general, an application usually has a plurality of application functions to be introduced, so that after the terminal displays the first function introduction interface, the function introduction interface needs to be switched when the interface switching condition is met. The interface switching condition can be automatically triggered by an application or manually triggered by a user.
Optionally, the interface switching condition automatically triggered by the application may be that the display duration of the first function introduction interface reaches a duration threshold (for example, 10 s).
Optionally, when the interface switching condition is manually triggered by the user, in response to the interface switching operation, the terminal displays a second function introduction interface, where a second application function corresponding to the second function introduction interface is different from the first application function. It should be noted that the second function introduction interface also includes an interface background and a function interaction object related to the second application function, and the specific display process of the second function introduction interface may refer to the foregoing embodiment, which is not described herein again.
The interface switching operation may be a sliding operation (for example, sliding left and right), a long-time pressing operation (for example, pressing a designated area of the interface for a long time), a clicking operation (for example, clicking a switching control in the interface), and the like, which is not limited in this embodiment.
Illustratively, as shown in fig. 9, in the case of displaying the search function introduction interface 91, when a slide operation is received on the interface, the terminal switches to display the reading function introduction interface 92. Of course, the user may also trigger the terminal to switch back to the search function introduction interface 91 through a reverse sliding operation.
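The switching between function introduction interfaces can be sketched as below under assumed names; a swipe advances to the next application function to be introduced, and the same transition could be driven by the display-duration condition.

```kotlin
// Illustrative sketch of switching the function introduction interface on a swipe.
enum class SwipeDirection { LEFT, RIGHT }

fun switchIntroInterface(functionIds: List<String>, currentIndex: Int, swipe: SwipeDirection): Int =
    when (swipe) {
        SwipeDirection.LEFT  -> (currentIndex + 1).coerceAtMost(functionIds.lastIndex)
        SwipeDirection.RIGHT -> (currentIndex - 1).coerceAtLeast(0)
    }

// E.g. with functionIds = ["search", "reading"], swiping left on the search
// introduction interface switches to the reading introduction interface.
```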
In some embodiments, the object contents fed back by the server based on the user portrait include at least two items, and different object contents correspond to different interface backgrounds. When the terminal displays a plurality of function interaction objects based on these object contents, one of the function interaction objects is set to a triggerable state, and the interface background matched with that function interaction object is displayed. For example, the server feeds back a first search term, a second search term, a third search term, and a short video matched with each search term (i.e., 3 short videos in total); when the terminal sets the search control corresponding to the first search term to a triggerable state, the short video corresponding to the first search term is used as the interface background. Optionally, as shown in fig. 10, step 402 may include the following sub-steps:
Step 402A: display the interface background corresponding to the first object content in the first function introduction interface.
Optionally, the first object content is an object content with the highest matching degree with the user portrait in the at least two object contents, that is, the terminal preferentially displays an interface background with the highest matching degree with the user feature. Of course, in other possible implementations, the first object content may be any one of at least two object contents, which is not limited in this embodiment.
Illustratively, as shown in fig. 11, the terminal displays the first interface background 1113 corresponding to the first search term "What did the chicken evolve from?".
Step 402B: display at least two function interaction objects on the upper layer of the interface background based on the at least two items of object content.
Further, the terminal displays a function interaction object of each item of object content on the interface background based on the first application function and each item of object content. Illustratively, as shown in fig. 11, the terminal displays a first search control 1111 corresponding to the first search word and a second search control 1112 corresponding to the second search word on a top layer of a first interface background 1113.
Step 402C: set the function interaction object corresponding to the first object content to a triggerable state.
In a possible implementation manner, only one function interaction object in the plurality of function interaction objects displayed on the function introduction interface is in a triggerable state at the same time, and the function interaction object in the triggerable state is matched with the current interface background. Wherein, the triggerable state indicates that the user can perform interactive operation on the function interaction object. Alternatively, the triggerable state may be represented by highlighting, zooming in, adding controls, and the like.
Illustratively, as shown in FIG. 11, the terminal sets the first search control 1111 to the triggerable state via search button 1115 to ensure that the first search term matches the first interface background 1113.
In the subsequent process, the user can trigger the function interaction object in the triggerable state to trigger and display the corresponding function display interface. It should be noted that, when the trigger operation for the function interaction object in the non-triggerable state is received, the terminal does not display the function display interface.
Optionally, when there are at least two items of object content, the interface background of the function introduction interface supports switching, and in response to meeting a switching condition, the terminal sets the function interaction object corresponding to the second object content to be in a triggerable state, and displays the interface background corresponding to the second object content in the first function introduction interface.
In some embodiments, the switching condition may include at least one of a display duration condition and a manual trigger condition. The display duration condition refers to that the display duration of the current interface background reaches a duration threshold (for example, 3 s); the manual trigger condition refers to that a gesture operation (such as a sliding-up operation) for triggering interface background switching is received.
In a possible implementation manner, when the background of the interface is a video, when the video playing corresponding to the first object content is finished, the terminal sets the functional interaction object corresponding to the second object content to be in a triggerable state, and plays the video corresponding to the second object content.
In another possible implementation manner, when the interface background is a picture, the terminal presets a picture display duration corresponding to the interface background, and when the picture display duration is reached, the terminal switches the function interaction object in the triggerable state and switches the interface background.
Illustratively, as shown in FIG. 11, when the first search control 1111 is in the triggerable state, the search function introduction interface 1110 displays a first interface background 1113 (first short video); when the first short video playback ends, the terminal switches the second search control 1112 to a triggerable state and displays a second interface background 1114 (second short video) on the search function introduction interface 1110.
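Steps 402A to 402C and the background switching described above can be sketched as follows under assumed names: only one function interaction object is triggerable at a time, and the end of the current background video advances both the triggerable state and the interface background to the next object content.

```kotlin
// Illustrative sketch of the multi-content introduction interface.
data class ObjectContentItem(val content: String, val backgroundVideoUrl: String)

data class MultiContentIntroState(val items: List<ObjectContentItem>, val triggerableIndex: Int) {
    val currentBackground: String get() = items[triggerableIndex].backgroundVideoUrl
}

fun onBackgroundVideoEnded(state: MultiContentIntroState): MultiContentIntroState =
    // Switch the triggerable state to the next function interaction object and
    // show the interface background matched with its object content.
    state.copy(triggerableIndex = (state.triggerableIndex + 1) % state.items.size)

fun canTrigger(state: MultiContentIntroState, tappedIndex: Int): Boolean =
    // A tap on a function interaction object that is not triggerable is ignored.
    tappedIndex == state.triggerableIndex
```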
Referring to fig. 12, a block diagram of a device for presenting application functions according to an exemplary embodiment of the present application is shown. The apparatus may include:
the first display module 1201 is configured to display a first function introduction interface in response to an application start operation, where the first function introduction interface includes a function interaction object corresponding to a first application function, an interface background of the first function introduction interface is matched with object content of the function interaction object, and the object content is determined based on a user portrait;
a second display module 1202, configured to display a function display interface of the first application function in response to a trigger operation on the function interaction object, where display content in the function display interface is matched with the object content;
a third display module 1203, configured to display an information display interface in response to a trigger operation on a background interaction object in the interface background, where the information display interface includes object information of the background interaction object, and the background interaction object is matched with the object content.
Optionally, the first display module 1201 includes:
an obtaining unit, configured to obtain the object content and the interface background based on the user representation in response to the application start operation;
the display unit is used for displaying the interface background in the first function introduction interface and displaying the function interaction object on the upper layer of the interface background based on the object content;
and the marking unit is used for marking the background interaction objects in the interface background.
Optionally, the marking unit is configured to:
acquiring annotation information, wherein the annotation information is used for indicating the display position of the background interactive object in the interface background;
and marking the background interaction object in the interface background based on the marking information.
Optionally, the marking unit is configured to:
extracting content keywords of the object content;
and performing object recognition on the interface background based on the content keywords, and marking the recognized background interaction object.
Optionally, the object contents are at least two items, and different object contents correspond to different interface backgrounds;
the display unit is used for:
displaying an interface background corresponding to first object content in the first function introduction interface;
displaying at least two function interaction objects on the upper layer of the interface background based on at least two items of the object contents;
and setting the function interaction object corresponding to the first object content into a triggerable state.
Optionally, the apparatus further comprises:
and the switching module is used for setting the function interaction object corresponding to the second object content into a triggerable state in response to the switching condition being met, and displaying the interface background corresponding to the second object content in the first function introduction interface.
Optionally, the interface background is a video;
the switching module is configured to:
and in response to the end of the video playing corresponding to the first object content, setting the functional interaction object corresponding to the second object content to be in a triggerable state, and playing the video corresponding to the second object content.
Optionally, the function display interface includes a jump control;
the device further comprises:
and the fourth display module is used for responding to the triggering operation of the jump control in the function display interface and jumping to display the function use interface of the first application function.
Optionally, the information display interface includes a three-dimensional model of the background interactive object;
the device further comprises:
and the model interaction module is used for responding to the interaction operation of the three-dimensional model in the information display interface and adjusting the three-dimensional model based on the interaction operation, and the adjustment mode of the three-dimensional model comprises at least one of zooming out, zooming in and rotating.
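A rough sketch of the model interaction module is shown below, assuming the interaction operation has already been reduced to a scale factor and a rotation delta; the limits and values are illustrative only.

```kotlin
// Sketch of the model interaction module: scale and rotation are adjusted from gestures.
// The gesture-to-delta mapping and the clamp limits are stand-ins for real touch handling.
data class ModelState(var scale: Float = 1f, var rotationDeg: Float = 0f)

fun applyInteraction(state: ModelState, scaleDelta: Float, rotationDelta: Float) {
    state.scale = (state.scale * scaleDelta).coerceIn(0.5f, 3f) // zoom out / zoom in within limits
    state.rotationDeg = (state.rotationDeg + rotationDelta) % 360f
}

fun main() {
    val model = ModelState()
    applyInteraction(model, scaleDelta = 1.5f, rotationDelta = 45f) // pinch out plus drag
    println(model) // ModelState(scale=1.5, rotationDeg=45.0)
}
```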
Optionally, the apparatus further comprises:
and the fifth display module is used for responding to interface switching operation and displaying a second function introduction interface, wherein the second application function corresponding to the second function introduction interface is different from the first application function.
Optionally, when the first application function is a search function, the function interaction object is a search control containing a search term;
when the first application function is a reading function, the function interaction object is a browsing control containing reading content;
and when the first application function is a video function, the function interaction object is a playing control containing video content.
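The mapping from function type to interaction object could be sketched as follows; the control classes are assumptions used only to make the correspondence concrete.

```kotlin
// Sketch of choosing the interaction object by function type; the control types are assumptions.
sealed interface FunctionInteraction
data class SearchControl(val term: String) : FunctionInteraction
data class BrowseControl(val excerpt: String) : FunctionInteraction
data class PlayControl(val videoTitle: String) : FunctionInteraction

fun interactionFor(function: String, content: String): FunctionInteraction = when (function) {
    "search" -> SearchControl(content)
    "reading" -> BrowseControl(content)
    "video" -> PlayControl(content)
    else -> error("Unknown function: $function")
}

fun main() {
    println(interactionFor("search", "nearby museums"))
}
```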
In summary, in the embodiment of the present application, the object content to be displayed in the function introduction interface is determined based on the user portrait, so that a function interaction object and an interface background matched with the object content are displayed in the function introduction interface, realizing a function introduction interface that presents personalized content to different users and improving the richness and matching degree of its content. In addition, compared with static function presentation through text and pictures, in the embodiment of the present application both the function interaction object in the function introduction interface and the background interaction object in the interface background support interaction, so that a user can learn about the application function through trigger operations in the function introduction interface, which deepens the user's understanding of the application function and further improves the utilization rate of the application function.
It should be noted that: in practical applications, the above function distribution may be completed by different function modules according to needs, that is, the internal structure of the apparatus is divided into different function modules, so as to complete all or part of the above described functions. In addition, the apparatus and method embodiments provided in the above embodiments belong to the same concept, and specific implementation processes thereof are described in detail in the method embodiments, which are not described herein again.
Referring to fig. 13, a block diagram of a terminal 1300 according to an exemplary embodiment of the present application is shown. The terminal 1300 may be a portable mobile terminal such as a smartphone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), or an MP4 player (Moving Picture Experts Group Audio Layer IV). The terminal 1300 may also be referred to by other names such as user equipment or portable terminal.
In general, terminal 1300 includes: a processor 1301 and a memory 1302.
Processor 1301 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 1301 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1301 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1301 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing content that the display screen needs to display. In some embodiments, processor 1301 may further include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
The memory 1302 may include one or more computer-readable storage media, which may be tangible and non-transitory. The memory 1302 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in memory 1302 is used to store at least one instruction for execution by processor 1301 to implement a method as provided by embodiments of the present application.
In some embodiments, the terminal 1300 may optionally further include a peripheral interface 1303 and at least one peripheral. Specifically, the peripherals include at least one of: a radio frequency circuit 1304, a touch display 1305, a camera assembly 1306, an audio circuit 1307, a positioning component 1308, and a power supply 1309.
Peripheral interface 1303 may be used to connect at least one peripheral associated with I/O (Input/Output) to processor 1301 and memory 1302. In some embodiments, processor 1301, memory 1302, and peripheral interface 1303 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1301, the memory 1302, and the peripheral device interface 1303 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 1304 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuitry 1304 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1304 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals. Optionally, the radio frequency circuit 1304 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 1304 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1304 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The touch display 1305 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. The touch display 1305 also has the capability to collect touch signals on or above its surface. A touch signal may be input to the processor 1301 as a control signal for processing. The touch display 1305 is used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one touch display 1305, disposed on the front panel of the terminal 1300; in other embodiments, there may be at least two touch displays 1305, respectively disposed on different surfaces of the terminal 1300 or in a folded design; in still other embodiments, the touch display 1305 may be a flexible display disposed on a curved or folded surface of the terminal 1300. Furthermore, the touch display 1305 may be arranged in a non-rectangular irregular pattern, that is, an irregularly shaped screen. The touch display 1305 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
The camera assembly 1306 is used to capture images or video. Optionally, the camera assembly 1306 includes a front camera and a rear camera. Generally, the front camera is used for video calls or selfie shooting, and the rear camera is used for photo or video shooting. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, and a wide-angle camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting and VR (Virtual Reality) shooting functions. In some embodiments, the camera assembly 1306 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash, and can be used for light compensation at different color temperatures.
The audio circuit 1307 is used to provide an audio interface between the user and the terminal 1300. The audio circuit 1307 may include a microphone and a speaker. The microphone is used to collect sound waves from the user and the environment, convert them into electrical signals, and input the electrical signals to the processor 1301 for processing, or to the radio frequency circuit 1304 for voice communication. For stereo capture or noise reduction purposes, there may be multiple microphones, each disposed at a different location of the terminal 1300. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 1301 or the radio frequency circuit 1304 into sound waves. The speaker may be a traditional membrane speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can convert an electrical signal into sound waves audible to humans, or into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuit 1307 may also include a headphone jack.
The positioning component 1308 is used to locate the current geographic position of the terminal 1300 for navigation or LBS (Location Based Service). The positioning component 1308 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
Power supply 1309 is used to provide power to various components in terminal 1300. The power supply 1309 may be alternating current, direct current, disposable or rechargeable batteries. When the power source 1309 comprises a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 1300 also includes one or more sensors 1310. The one or more sensors 1310 include, but are not limited to: acceleration sensor 1311, gyro sensor 1312, pressure sensor 1313, fingerprint sensor 1314, optical sensor 1315, and proximity sensor 1316.
The acceleration sensor 1311 can detect the magnitude of acceleration on three coordinate axes of the coordinate system established with the terminal 1300. For example, the acceleration sensor 1311 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 1301 may control the touch display screen 1305 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1311. The acceleration sensor 1311 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1312 may detect the body direction and the rotation angle of the terminal 1300, and the gyro sensor 1312 may cooperate with the acceleration sensor 1311 to acquire a 3D motion of the user with respect to the terminal 1300. Processor 1301, based on the data collected by gyroscope sensor 1312, may perform the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensor 1313 may be disposed on a side bezel of terminal 1300 and/or underlying touch display 1305. When the pressure sensor 1313 is provided on the side frame of the terminal 1300, a user's grip signal on the terminal 1300 can be detected, and left-right hand recognition or shortcut operation can be performed based on the grip signal. When the pressure sensor 1313 is disposed on the lower layer of the touch display panel 1305, it is possible to control the operability control on the UI interface according to the pressure operation of the user on the touch display panel 1305. The operability control comprises at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 1314 is used for collecting the fingerprint of the user so as to identify the identity of the user according to the collected fingerprint. When the identity of the user is identified as a trusted identity, the processor 1301 authorizes the user to perform relevant sensitive operations, including unlocking a screen, viewing encrypted information, downloading software, paying, changing settings, and the like. The fingerprint sensor 1314 may be disposed on the front, back, or side of the terminal 1300. When a physical button or vendor Logo is provided on the terminal 1300, the fingerprint sensor 1314 may be integrated with the physical button or vendor Logo.
The optical sensor 1315 is used to collect the ambient light intensity. In one embodiment, the processor 1301 may control the display brightness of the touch display 1305 according to the ambient light intensity collected by the optical sensor 1315. Specifically, when the ambient light intensity is high, the display brightness of the touch display 1305 is increased; when the ambient light intensity is low, the display brightness of the touch display 1305 is decreased. In another embodiment, the processor 1301 may also dynamically adjust the shooting parameters of the camera assembly 1306 according to the ambient light intensity collected by the optical sensor 1315.
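As an informal illustration of this brightness adjustment, the sketch below maps an ambient-light reading in lux to a clamped brightness value; the scaling and limits are assumptions, not values taken from the disclosure.

```kotlin
// Sketch of ambient-light-driven brightness: brighter surroundings raise the screen brightness.
// The divisor and the clamp range are illustrative assumptions.
fun targetBrightness(ambientLux: Float): Float =
    (ambientLux / 1000f).coerceIn(0.2f, 1.0f)

fun main() {
    println(targetBrightness(50f))   // dim room    -> 0.2
    println(targetBrightness(800f))  // bright room -> 0.8
}
```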
The proximity sensor 1316, also known as a distance sensor, is typically disposed on the front face of the terminal 1300. The proximity sensor 1316 is used to collect the distance between the user and the front face of the terminal 1300. In one embodiment, when the proximity sensor 1316 detects that the distance between the user and the front face of the terminal 1300 gradually decreases, the processor 1301 controls the touch display 1305 to switch from the bright-screen state to the dark-screen state; when the proximity sensor 1316 detects that the distance between the user and the front face of the terminal 1300 gradually increases, the processor 1301 controls the touch display 1305 to switch from the dark-screen state to the bright-screen state.
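The proximity-driven screen switching can be illustrated with the small sketch below; the 5 cm threshold is an assumption made only for the example.

```kotlin
// Sketch of the proximity-driven screen state: a shrinking distance darkens the screen,
// a growing distance brings it back. The threshold value is an assumption.
fun screenStateFor(previousDistanceCm: Float, currentDistanceCm: Float): String =
    when {
        currentDistanceCm < previousDistanceCm && currentDistanceCm < 5f -> "dark screen"
        currentDistanceCm > previousDistanceCm -> "bright screen"
        else -> "unchanged"
    }

fun main() {
    println(screenStateFor(10f, 2f))  // phone raised to the ear -> dark screen
    println(screenStateFor(2f, 10f))  // phone moved away        -> bright screen
}
```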
Those skilled in the art will appreciate that the configuration shown in fig. 13 is not intended to be limiting with respect to terminal 1300 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be employed.
In an embodiment of the present application, a computer-readable storage medium is further provided, where at least one instruction is stored in the storage medium, and the at least one instruction is loaded and executed by a processor to implement the application function presentation method according to the above aspect.
According to an aspect of the application, a computer program product or computer program is provided, comprising computer instructions, the computer instructions being stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and executes the computer instructions, so that the computer device executes the presentation method of the application functions provided in the various alternative implementations of the above aspects.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements that have been described above and shown in the drawings, and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (14)

1. A method for displaying application functions, the method comprising:
responding to application starting operation, displaying a first function introduction interface, wherein the first function introduction interface comprises a function interaction object corresponding to a first application function, the interface background of the first function introduction interface is matched with the object content of the function interaction object, and the object content is determined and obtained based on the user portrait;
responding to the triggering operation of the function interaction object, and displaying a function display interface of the first application function, wherein the display content in the function display interface is matched with the object content;
responding to the triggering operation of the background interaction object in the interface background, and displaying an information display interface, wherein the information display interface comprises the object information of the background interaction object, and the background interaction object is matched with the object content.
2. The method of claim 1, wherein displaying a first functionality introduction interface in response to the application launch operation comprises:
in response to the application starting operation, acquiring the object content and the interface background based on the user portrait;
displaying the interface background in the first function introduction interface, and displaying the function interaction object on the upper layer of the interface background based on the object content;
marking the background interaction object in the interface background.
3. The method of claim 2, wherein said marking the background interaction objects in the interface background comprises:
acquiring annotation information, wherein the annotation information is used for indicating the display position of the background interactive object in the interface background;
and marking the background interaction object in the interface background based on the marking information.
4. The method of claim 2, wherein the tagging the background interaction object in the interface background comprises:
extracting content keywords of the object content;
and carrying out object recognition on the interface background based on the content keywords, and marking the recognized background interaction objects.
5. The method according to claim 2, wherein the object contents are at least two items, and different object contents correspond to different interface backgrounds;
the displaying the interface background in the first function introduction interface and the function interaction object on the upper layer of the interface background based on the object content comprises:
displaying an interface background corresponding to first object content in the first function introduction interface;
displaying at least two functional interaction objects on the interface background upper layer based on at least two items of the object contents;
and setting the function interaction object corresponding to the first object content to be in a triggerable state.
6. The method of claim 5, further comprising:
and in response to the switching condition being met, setting the function interaction object corresponding to the second object content into a triggerable state, and displaying an interface background corresponding to the second object content in the first function introduction interface.
7. The method of claim 6, wherein the interface background is a video;
the step of setting the function interaction object corresponding to the second object content to be in a triggerable state in response to the switching condition being met, and displaying an interface background corresponding to the second object content in the first function introduction interface includes:
and in response to the end of the video playing corresponding to the first object content, setting the functional interaction object corresponding to the second object content to be in a triggerable state, and playing the video corresponding to the second object content.
8. The method according to any one of claims 1 to 7, wherein the function presentation interface comprises a jump control;
after the function display interface of the first application function is displayed in response to the triggering operation of the function interaction object, the method further comprises:
and responding to the triggering operation of the jump control in the function display interface, and jumping to display the function use interface of the first application function.
9. The method according to any one of claims 1 to 7, wherein the information presentation interface comprises a three-dimensional model of the background interactive object;
after the information display interface is displayed in response to the triggering operation of the background interaction object in the interface background, the method further comprises the following steps:
and responding to the interactive operation of the three-dimensional model in the information display interface, and adjusting the three-dimensional model based on the interactive operation, wherein the adjustment mode of the three-dimensional model comprises at least one of zooming out, zooming in and rotating.
10. The method of any one of claims 1 to 7, wherein after displaying the first function introduction interface, the method further comprises:
and responding to interface switching operation, and displaying a second function introduction interface, wherein a second application function corresponding to the second function introduction interface is different from the first application function.
11. The method according to any one of claims 1 to 7,
when the first application function is a search function, the function interaction object is a search control containing search words;
when the first application function is a reading function, the function interaction object is a browsing control containing reading content;
and when the first application function is a video function, the function interaction object is a playing control containing video content.
12. An apparatus for displaying application functions, the apparatus comprising:
the first display module is used for responding to application starting operation and displaying a first function introduction interface, wherein the first function introduction interface comprises a function interaction object corresponding to a first application function, the interface background of the first function introduction interface is matched with the object content of the function interaction object, and the object content is determined and obtained based on the user portrait;
the second display module is used for responding to the triggering operation of the function interaction object and displaying a function display interface of the first application function, and display content in the function display interface is matched with the object content;
and the third display module is used for responding to the triggering operation of the background interaction object in the interface background and displaying an information display interface, wherein the information display interface comprises the object information of the background interaction object, and the background interaction object is matched with the object content.
13. A terminal, characterized in that it comprises a processor and a memory in which at least one instruction, at least one program, set of codes or set of instructions is stored, which is loaded and executed by the processor to implement a method of presentation of application functions as claimed in any one of claims 1 to 11.
14. A computer-readable storage medium, having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement a presentation method of an application function according to any one of claims 1 to 11.
CN202110821148.1A 2021-07-20 2021-07-20 Application function display method and device, terminal and storage medium Pending CN115905374A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110821148.1A CN115905374A (en) 2021-07-20 2021-07-20 Application function display method and device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110821148.1A CN115905374A (en) 2021-07-20 2021-07-20 Application function display method and device, terminal and storage medium

Publications (1)

Publication Number Publication Date
CN115905374A true CN115905374A (en) 2023-04-04

Family

ID=86478172

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110821148.1A Pending CN115905374A (en) 2021-07-20 2021-07-20 Application function display method and device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN115905374A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117270717A (en) * 2023-09-06 2023-12-22 北京酷讯科技有限公司 Man-machine interaction method, device, equipment and storage medium based on user interface


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (country: HK; legal event code: DE; document number: 40084269)