CN114153361A - Interface display method, device, terminal and storage medium


Info

Publication number
CN114153361A
Authority
CN
China
Prior art keywords
interface
displaying
eyeball
target
target application
Prior art date
Legal status
Granted
Application number
CN202010927582.3A
Other languages
Chinese (zh)
Other versions
CN114153361B (en)
Inventor
田元
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202010927582.3A
Publication of CN114153361A
Application granted
Publication of CN114153361B
Legal status: Active

Classifications

    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/013: Eye tracking input arrangements
    • G06F3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides an interface display method, an interface display device, a terminal, and a storage medium, belonging to the field of terminal technologies. The method comprises the following steps: displaying a first interface, wherein the first interface is any interface of a target application; detecting the use of the target application to obtain usage information of the target application; and in response to the usage information meeting a target condition, displaying a second interface, wherein the second interface is used for displaying indication information, and the indication information is used for guiding the user to take a rest. With this technical scheme, the user can be guided to take a rest, the user's fatigue is effectively relieved, and human-computer interaction efficiency is improved.

Description

Interface display method, device, terminal and storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to an interface display method and apparatus, a terminal, and a storage medium.
Background
With the development of terminal technology, terminals are installed with various types of applications, and users spend increasingly long periods using them. However, using an application for a long time may cause eye fatigue, which harms the user's health.
At present, in order to prevent a user from using an application for too long, an interface is provided for setting a rest reminder. Through this interface, the user can set how long the application may be used cumulatively before a reminder is issued; for example, if the user sets 2 hours, the user is reminded to take a rest after cumulatively using the application for two hours.
The problem with this technical scheme is that the user is merely reminded to take a rest: the user's fatigue is not effectively relieved, the function has little practical effect, and human-computer interaction efficiency is low.
Disclosure of Invention
The embodiment of the application provides an interface display method, an interface display device, a terminal and a storage medium, which can guide a user to have a rest, effectively relieve the fatigue condition of the user and improve the human-computer interaction efficiency. The technical scheme is as follows:
in one aspect, an interface display method is provided, and the method includes:
displaying a first interface, wherein the first interface is any interface of a target application;
detecting the use condition of the target application to obtain the use information of the target application;
and in response to the usage information meeting a target condition, displaying a second interface, wherein the second interface is used for displaying indication information, and the indication information is used for guiding the user to take a rest.
In another aspect, an interface display apparatus is provided, the apparatus including:
the first display module is used for displaying a first interface, and the first interface is any interface of the target application;
the detection module is used for detecting the use condition of the target application to obtain the use information of the target application;
and the second display module is used for displaying a second interface in response to the usage information meeting a target condition, wherein the second interface is used for displaying indication information, and the indication information is used for guiding the user to take a rest.
In an optional implementation manner, the detection module is configured to determine the single-use duration of the target application according to a use operation on the target application, and to send the single-use duration to a server; the server determines the accumulated use duration of the target application within a target time range based on the single-use duration, and the accumulated use duration is taken as the usage information of the target application.
In an optional implementation manner, the detection module is configured to perform facial expression recognition according to an acquired facial image, and use a recognition result of the facial expression recognition as the use information of the target application, where the recognition result is used to indicate whether a user is in a fatigue state.
In an optional implementation manner, the second display module is configured to display an indication graph on the second interface in response to the usage information meeting a target condition, where the indication graph is used for indicating an eyeball action; perform eyeball tracking according to the collected face image to obtain the real-time position of the eyeball; and display a progress graph on the periphery of the indication graph according to the real-time position of the eyeball, where the progress graph is used for indicating the progress of executing the eyeball action.
In an optional implementation manner, the second display module is further configured to display a next indication graph in response to the progress graph indicating that the eyeball action has been executed completely, where the next indication graph indicates an eyeball action different from that indicated by the current indication graph.
In an optional implementation manner, the second interface further displays the remaining number of times of executing the eyeball action;
the second display module is further configured to update the remaining execution times in response to the progress graph indicating that the eyeball action has been executed, and to zero the progress of executing the eyeball action indicated by the progress graph.
In an optional implementation, the apparatus further includes:
the second display module is further used for displaying an action setting control on the second interface;
and the third display module is used for responding to the triggering operation of the action setting control and displaying an action setting interface, and the action setting interface is used for setting the eyeball action indicated by the indication graph.
In an optional implementation manner, the second display module is configured to display a first eyeball action on the second interface in response to the usage information meeting a target condition; determine an eyeball movement track according to the collected face image; and in response to the eyeball movement track being the same as the first eyeball action, sequentially display at least one interface, where the at least one interface is used for displaying eyeball actions different from the first eyeball action.
In an optional implementation manner, the second interface further displays the remaining number of times of execution of the first eyeball action;
the second display module is further configured to update the remaining execution times, and in response to the remaining execution times being zero, to sequentially display the at least one interface of the target application.
In an optional implementation, the apparatus further includes:
the fourth display module is used for displaying a target condition setting interface;
and the request sending module is used for responding to the setting operation detected on the target condition setting interface and sending a target condition setting request to a server, wherein the target condition setting request carries the target condition.
In an optional implementation manner, the second display module is further configured to display a delayed rest control on the second interface, and in response to a trigger operation on the delayed rest control, to display the second interface again after delaying a target duration.
In an optional implementation, the apparatus further includes:
the fifth display module is used for responding to the quitting operation and displaying a progress display interface, the progress display interface is used for displaying a plurality of execution progress controls, and one execution progress control is used for displaying the execution progress of one eyeball action;
and the sixth display module is used for responding to the trigger operation of the execution progress control corresponding to any eyeball action and displaying an interface comprising the eyeball action.
In another aspect, a terminal is provided, where the terminal includes a processor and a memory, where the memory is used to store at least one program code, and the at least one program code is loaded and executed by the processor to implement the operations performed in the interface display method in the embodiments of the present application.
In another aspect, a computer-readable storage medium is provided, in which at least one program code is stored, and the at least one program code is loaded and executed by a processor to implement the operations performed in the interface display method in the embodiment of the present application.
In another aspect, a computer program product or a computer program is provided, the computer program product or the computer program comprising computer program code, the computer program code being stored in a computer readable storage medium. The processor of the terminal reads the computer program code from the computer-readable storage medium, and the processor executes the computer program code, so that the computer device performs the interface display method provided in the above-described aspects or various alternative implementations of the aspects.
The technical scheme provided by the embodiment of the application has the following beneficial effects:
the embodiment of the application provides an interface display method in which the usage information of a target application is obtained by detecting the use of the target application, so that when the usage information meets the target condition, whichever interface of the target application is displayed is changed to a second interface displaying indication information that guides the user to take a rest. In this way, the user can be guided to rest, the user's fatigue is effectively relieved, and human-computer interaction efficiency is improved.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description show merely some embodiments of the present application, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is a schematic diagram of an implementation environment of an interface display method provided according to an embodiment of the present application;
FIG. 2 is a flow chart of an interface display method according to an embodiment of the present application;
FIG. 3 is a flow chart of another interface display method provided in accordance with an embodiment of the present application;
FIG. 4 is a schematic illustration of a second interface provided in accordance with an embodiment of the present application;
FIG. 5 is a schematic illustration of a second interface provided in accordance with an embodiment of the present application;
FIG. 6 is a schematic illustration of another second interface provided in accordance with an embodiment of the present application;
FIG. 7 is a schematic illustration of another second interface provided in accordance with an embodiment of the present application;
FIG. 8 is a flow chart of another interface display method provided in accordance with an embodiment of the present application;
FIG. 9 is a schematic diagram of a second interface and action setting interface provided in accordance with an embodiment of the present application;
FIG. 10 is a schematic illustration of another second interface provided in accordance with an embodiment of the present application;
FIG. 11 is a schematic illustration of a third interface provided in accordance with an embodiment of the present application;
FIG. 12 is a schematic view of another second interface provided in accordance with an embodiment of the present application;
FIG. 13 is a schematic diagram of a progress-displaying interface provided in accordance with an embodiment of the present application;
FIG. 14 is a block diagram of an interface display apparatus provided in accordance with an embodiment of the present application;
fig. 15 is a block diagram of a terminal according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
An implementation environment of the interface display method provided by the embodiment of the present application is described below. Fig. 1 is a schematic diagram of an implementation environment of an interface display method according to an embodiment of the present application. Referring to fig. 1, the implementation environment includes a terminal 101 and a server 102.
The terminal 101 and the server 102 can be directly or indirectly connected through wired or wireless communication, and the application is not limited herein.
Optionally, the terminal 101 is a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, or the like, but is not limited thereto. The terminal 101 is installed and operated with an application program. The application program may be a social application program, an information application program, an education application program, and the like, which is not limited in this embodiment of the application. Illustratively, the terminal 101 is a terminal used by a user, and a user account is registered in an application running in the terminal.
Optionally, the server 102 is an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, a CDN (Content Delivery Network), and big data and artificial intelligence platforms. The server 102 is used to provide background services for the application. Optionally, the server 102 undertakes the primary computing work and the terminal 101 the secondary computing work; or the server 102 undertakes the secondary computing work and the terminal 101 the primary computing work; or the server 102 and the terminal 101 cooperate using a distributed computing architecture.
Optionally, the server 102 comprises an access server, an application background server, and a database server. The access server is used for providing the terminal 101 with access service. The application background server is used for providing background services for the application program, and there can be one or more such servers. When there are multiple application background servers, at least two of them provide different services, and/or at least two of them provide the same service, for example in a load-balancing manner, which is not limited in the embodiment of the present application. The database server is used for storing the single-use durations of the application program uploaded by the terminal.
Optionally, the terminal 101 generally refers to one of multiple terminals, and the embodiment of the present application is illustrated by the terminal 101. Those skilled in the art will appreciate that the number of terminals described above may be greater or fewer. For example, the number of the terminals may be only one, or several tens or hundreds of the terminals, or more. The number of terminals and the type of the device are not limited in the embodiments of the present application.
Fig. 2 is a flowchart of an interface display method according to an embodiment of the present application, and as shown in fig. 2, the interface display method is described in the embodiment of the present application by taking an application to a terminal as an example. The interface display method comprises the following steps:
201. the terminal displays a first interface, and the first interface is any interface of the target application.
In this embodiment of the application, the target application is any application running in the terminal, and the first interface is any interface of the target application.
202. And the terminal detects the use condition of the target application to obtain the use information of the target application.
In the embodiment of the application, the terminal can detect the use condition of the target application by the user, such as the accumulated use duration of the target application by the user, the single use duration for the target application, the fatigue degree when the target application is used by the user, and the like.
203. And the terminal responds to the use information meeting the target condition and displays a second interface, wherein the second interface is used for displaying indication information, and the indication information is used for guiding the user to take a rest.
In the embodiment of the application, the target condition differs according to the usage information. When the usage information meets the target condition, the terminal displays indication information on the second interface for guiding the user to take a rest, so that the user can be effectively guided to relax and the user's fatigue state is relieved. The indication information can take the form of a dynamic image, text plus graphics, or a static graphic. Optionally, the indication information can also be audio indication information. Optionally, when displaying the indication information, the terminal can also play corresponding audio, which is not limited in this embodiment of the application. Optionally, the indication information can instruct the user to rest in various ways, such as resting the eyes, the cervical spine, or the brain, which is not limited in the embodiment of the present application.
The embodiment of the application provides an interface display method in which the usage information of a target application is obtained by detecting the use of the target application, so that when the usage information meets the target condition, whichever interface of the target application is displayed is changed to a second interface displaying indication information that guides the user to take a rest. In this way, the user can be guided to rest, the user's fatigue is effectively relieved, and human-computer interaction efficiency is improved.
Fig. 3 is a flowchart of another interface display method provided according to an embodiment of the present application, and as shown in fig. 3, in the embodiment of the present application, an example in which indication information is applied to a terminal and used for guiding a user to rest eyes is described. The method comprises the following steps:
301. the terminal displays a first interface, and the first interface is any interface of the target application.
In this embodiment of the application, the target application is any application running in the terminal, and the first interface is any interface of the target application. The terminal can take the application currently being used by the user as the target application.
302. And the terminal detects the use condition of the target application to obtain the use information of the target application.
In the embodiment of the application, the terminal can detect the use condition of the target application by the user, such as the accumulated use duration of the target application by the user, the single use duration for the target application, the fatigue degree when the target application is used by the user, and the like.
Optionally, the usage information is used to indicate an accumulated usage duration of the target application, and the usage duration of the target application can be accumulated by the terminal and also can be accumulated by the server.
In an optional implementation manner, the terminal determines the single-use duration of the target application according to a use operation on the target application, and accumulates the single-use durations to determine the accumulated use duration of the target application within the target time range. The use operation on the target application includes starting the target application, switching the target application from background running to foreground running, switching it from foreground running to background running, closing the target application, and the like. The target time range is 4 hours, 12 hours, 24 hours, or the like, which is not limited in the embodiments of the present application.
For example, the target application runs in the background, the terminal starts timing when the target application running in the background is switched to run in the foreground, the terminal finishes timing when the target application running in the foreground is switched to run in the background or the target application is closed, and the timing duration is taken as the single-use duration of the target application by the terminal.
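As a concrete illustration of the single-use timing described above, the following is a minimal Swift sketch; it assumes an iOS app using the standard lifecycle notifications, and names such as UsageTracker and upload(singleUseDuration:) are hypothetical, not taken from the patent.

```swift
import UIKit

// Illustrative sketch of the single-use timing described above; class and
// method names are hypothetical, not taken from the patent.
final class UsageTracker {
    private var foregroundEnteredAt: Date?
    private var observers: [NSObjectProtocol] = []

    init() {
        let center = NotificationCenter.default
        // Timing starts when the target application is switched to the
        // foreground (or started).
        observers.append(center.addObserver(
            forName: UIApplication.didBecomeActiveNotification,
            object: nil, queue: .main) { [weak self] _ in
                self?.foregroundEnteredAt = Date()
            })
        // Timing ends when the application is switched to the background;
        // the elapsed interval is one single-use duration.
        observers.append(center.addObserver(
            forName: UIApplication.didEnterBackgroundNotification,
            object: nil, queue: .main) { [weak self] _ in
                guard let self = self, let start = self.foregroundEnteredAt else { return }
                self.foregroundEnteredAt = nil
                self.upload(singleUseDuration: Date().timeIntervalSince(start))
            })
    }

    private func upload(singleUseDuration: TimeInterval) {
        // Send the single-use duration to the server, which accumulates it
        // within the target time range (e.g. 24 hours).
    }
}
```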
In an optional implementation manner, the terminal determines the single-use duration of the target application according to a use operation on the target application and then sends the single-use duration to the server, and the server determines the accumulated use duration of the target application within the target time range based on the single-use durations. Correspondingly, when the server determines that the accumulated use duration of the target application within the target time range has reached the time threshold, the server can send a rest instruction to the terminal, and the terminal determines, based on the rest instruction, that the accumulated use duration of the target application has reached the time threshold. Optionally, when the accumulated use duration of the target application within the target time range is close to the time threshold, the server can send an early-warning instruction to the terminal; when the user next starts to use the target application, the terminal sends a timing request to the server based on the early-warning instruction, and the terminal and the server time together until the accumulated use duration of the target application reaches the time threshold. This avoids the problem that, when a single use of the target application lasts too long, the server cannot effectively determine whether the accumulated use duration has reached the time threshold.
In an optional implementation manner, the terminal determines the current starting time of the target application according to the starting operation of the target application, then the terminal sends the current starting time to the server, and the server determines the accumulated use duration of the target application within the target time range based on the current starting time. The starting time is the starting time of the target application.
In an optional implementation manner, the terminal can perform facial expression recognition according to the collected facial image, with the recognition result of the facial expression recognition taken as the usage information of the target application; the recognition result indicates whether the user is in a fatigue state.
Optionally, in response to the recognition result indicating that the user is in a tired state, the terminal displays a second interface. In the embodiment of the present application, a case where the terminal displays the second interface when the cumulative usage time of the target application reaches the time threshold is taken as an example for description.
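To make the fatigue-recognition branch concrete, here is a hedged Swift sketch; the patent names no model or framework, so the Core ML classifier FatigueClassifier and its labels are purely hypothetical stand-ins for any facial-expression model that outputs a fatigue judgment.

```swift
import Vision
import CoreML

// Hypothetical sketch: `FatigueClassifier` stands in for any Core ML model
// trained to label a face image as "fatigued" / "not fatigued"; the patent
// names no concrete model.
func recognizeFatigue(in faceImage: CGImage,
                      completion: @escaping (Bool) -> Void) {
    guard let model = try? VNCoreMLModel(for: FatigueClassifier().model) else {
        completion(false)
        return
    }
    let request = VNCoreMLRequest(model: model) { request, _ in
        let top = (request.results as? [VNClassificationObservation])?.first
        // The top label is the recognition result used as usage information.
        completion(top?.identifier == "fatigued")
    }
    let handler = VNImageRequestHandler(cgImage: faceImage, options: [:])
    try? handler.perform([request])
}
```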
303. And in response to the cumulative use duration of the target application reaching the time threshold, the terminal displays an indication graph on the second interface, wherein the indication graph is used for indicating the eyeball action.
In this embodiment, the target condition is a time threshold. Within the target time range, if the accumulated use duration of the target application reaches the time threshold, the usage information meets the target condition, and the terminal can display the second interface and display an indication graph on it, so as to guide the user to perform the corresponding eyeball action according to the indication graph, such as blinking, moving the eyeball 360 degrees along a circular path, moving the eyeball 360 degrees along a quadrilateral path, or moving the eyeball along a figure-eight path. The terminal can display the second interface in various ways.
In an optional implementation manner, in response to that the accumulated usage time of the target application reaches the time threshold, the terminal can switch the currently displayed first interface to the second interface, and then the terminal displays the indication graph on the second interface.
In an optional implementation manner, in response to that the cumulative usage time of the target application reaches the time threshold, the terminal can display the second interface in an overlaying manner on the currently displayed first interface, the transparency of the second interface is gradually reduced, and then the terminal displays the indication graph on the second interface. Wherein the second interface is capable of being displayed in a floating layer from completely transparent to opaque. Optionally, when the transparency of the second interface is gradually decreased, the second interface can display a countdown, and when the countdown is finished, the second interface changes to the opaque state.
For example, referring to fig. 4, fig. 4 is a schematic view of a second interface provided according to an embodiment of the present application. As shown in fig. 4, the second interface is displayed semi-transparently on the first interface of a social application, showing an eye graphic in a semi-transparent state together with the countdown in seconds. The dotted lines represent the translucent state.
It should be noted that the second interface can be an interface of the target application or an interface provided by the operating system running on the terminal, which is not limited in the embodiment of the present application.
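A minimal Swift sketch of the floating-layer behavior described above, assuming a UIKit app; the view and function names are illustrative only, not taken from the patent.

```swift
import UIKit

// Overlay the second interface on the first as a floating layer that fades
// from completely transparent to opaque while a countdown runs.
func presentRestOverlay(on firstInterface: UIView, countdownSeconds: Int = 3) {
    let secondInterface = UIView(frame: firstInterface.bounds)
    secondInterface.backgroundColor = .white
    secondInterface.alpha = 0.0  // start completely transparent

    let countdownLabel = UILabel(frame: secondInterface.bounds)
    countdownLabel.textAlignment = .center
    countdownLabel.text = "\(countdownSeconds)"
    secondInterface.addSubview(countdownLabel)
    firstInterface.addSubview(secondInterface)

    var remaining = countdownSeconds
    Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { timer in
        remaining -= 1
        countdownLabel.text = remaining > 0 ? "\(remaining)" : ""
        if remaining <= 0 { timer.invalidate() }
    }

    // Transparency gradually decreases until the layer is opaque when the
    // countdown ends; the indication graph is then drawn on this layer.
    UIView.animate(withDuration: TimeInterval(countdownSeconds)) {
        secondInterface.alpha = 1.0
    }
}
```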
304. And the terminal tracks the eyeballs according to the collected face images to obtain the real-time positions of the eyeballs.
In the embodiment of the application, when the user performs the eyeball motion according to the guidance, the terminal can acquire the face image based on the image acquisition device, such as a camera, and perform eyeball tracking on the eyeball of the user based on the acquired face image, so as to obtain the real-time position of the eyeball of the user.
In an optional implementation manner, the terminal can acquire depth information from the acquired face image, and perform eyeball tracking based on the depth information.
For example, the following description takes an example in which the terminal performs TrueDepth imaging (depth imaging) with a camera and captures depth images carrying depth information to realize eyeball tracking. First, a plurality of face images are shot through the TrueDepth camera; then eyeball tracking is performed on the basis of these face images, and the real-time position of the eyeball in each face image is determined, where the real-time position can represent the position at which the eyeball is gazing on the indication graph.
The following briefly describes how to capture face images suitable for eyeball tracking with Apple's AVFoundation framework. First, a camera is selected as the capture device, such as the dual camera via the .builtInDualCamera device type or the TrueDepth camera via the .builtInTrueDepthCamera device type. Then, before shooting, the relevant configuration is performed on an AVCapturePhotoOutput object. When depth capture is ready, a depth image can be requested along with the color image by creating an AVCapturePhotoSettings object, selecting the color image format, and enabling depth data delivery on both the settings and the output; the photo is then captured by calling capturePhoto(with:delegate:) with those settings. After capture, the photo output calls the method of the delegate object, which receives each generated image and its related data in the form of an AVCapturePhoto object.
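The flow above maps onto AVFoundation roughly as follows; this is a hedged sketch (session setup checks and error handling omitted), not code from the patent.

```swift
import AVFoundation

// Sketch of TrueDepth photo capture with depth data, following the flow
// described above. Production code should check canAddInput/canAddOutput.
final class DepthCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    private let session = AVCaptureSession()
    private let photoOutput = AVCapturePhotoOutput()

    func configure() throws {
        // Select the TrueDepth camera as the capture device.
        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .video,
                                                   position: .front) else { return }
        session.beginConfiguration()
        session.addInput(try AVCaptureDeviceInput(device: device))
        session.addOutput(photoOutput)
        // Enable depth delivery on the output before requesting it per photo.
        photoOutput.isDepthDataDeliveryEnabled =
            photoOutput.isDepthDataDeliverySupported
        session.commitConfiguration()
        session.startRunning()
    }

    func captureDepthPhoto() {
        // Request a depth map along with the color image.
        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled =
            photoOutput.isDepthDataDeliverySupported
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    // The photo output calls this delegate method with an AVCapturePhoto
    // carrying both the color image and its depth data.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard let depthData = photo.depthData else { return }
        _ = depthData // feed the depth map into the eyeball-tracking step
    }
}
```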
305. And the terminal displays a progress graph on the periphery of the indication graph according to the real-time position of the eyeball, wherein the progress graph is used for indicating the progress of executing the eyeball action.
In the embodiment of the application, after the terminal determines the real-time position of the eyeball, the terminal can display a progress graph on the periphery of the displayed indication graph according to the real-time position of the eyeball to indicate the current execution progress; through the progress graph, the user can determine whether the executed eyeball action is correct and what eyeball action to execute next.
In an alternative implementation manner, the terminal can guide the user to execute the eye movement for multiple times, and correspondingly, the second interface further displays the remaining execution times of the eye movement. In response to the progress indication graph indicating that the eyeball motion is performed completely, the terminal can update the remaining number of times of execution, and then the terminal zeroes the progress indicated by the progress graph for performing the eyeball motion.
For example, referring to fig. 5, fig. 5 is a schematic view of another second interface provided according to an embodiment of the present application. As shown in fig. 5, the second interface displays an indication graph with a dotted circle; the arrow displayed in the indication graph indicates the starting point and moving direction of the eyeball action, and the numeral 1 displayed in the indication graph indicates that the remaining execution count of the eyeball action is 1, that is, after the current execution, the action needs to be performed 1 more time. The vertical line displayed on the periphery of the indication graph is the progress graph; at this moment, the execution progress of the eyeball action is about one quarter of the total progress. When the execution progress of the eyeball action reaches one hundred percent, the terminal updates the remaining execution count of the eyeball action to 0 and at the same time zeroes the execution progress of the eyeball action.
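One way to turn the tracked real-time eyeball position into such a progress value is sketched below in Swift; the angle-sweep approach and all names are illustrative assumptions for a circular indication graph, not the patent's prescribed method.

```swift
import CoreGraphics

// Maps the angle swept by the tracked eyeball position around the circular
// indication graph to a progress value in [0, 1]. Names are hypothetical.
final class EyeActionProgress {
    private let center: CGPoint          // center of the indication graph
    private var sweptAngle: CGFloat = 0  // accumulated rotation, in radians
    private var lastAngle: CGFloat?

    init(center: CGPoint) { self.center = center }

    /// Feed the real-time eyeball position obtained from eyeball tracking;
    /// returns the progress of one full circular eyeball action.
    func update(eyePosition: CGPoint) -> CGFloat {
        let angle = atan2(eyePosition.y - center.y, eyePosition.x - center.x)
        if let last = lastAngle {
            var delta = angle - last
            // Unwrap across the -pi/pi boundary.
            if delta > .pi { delta -= 2 * .pi }
            if delta < -.pi { delta += 2 * .pi }
            sweptAngle += delta
        }
        lastAngle = angle
        return min(abs(sweptAngle) / (2 * .pi), 1.0)
    }

    /// Zero the progress once the action has been executed, as in the text.
    func reset() { sweptAngle = 0; lastAngle = nil }
}
```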
In an optional implementation manner, after the terminal displays the indication graph on the second interface, the terminal may further display an action setting control on the second interface, and in response to a trigger operation of the action setting control, the terminal displays an action setting interface, where the action setting interface is used to set an eyeball action indicated by the indication graph.
In an optional implementation manner, after the terminal displays the indication graph on the second interface of the target application, the terminal may further display a delayed rest control on the second interface, and in response to a trigger operation on the delayed rest control, the terminal displays the second interface again after delaying a target duration. The target duration is 3 minutes, 5 minutes, 10 minutes, or the like, which is not limited in the embodiments of the present application. By providing the delayed rest control, when the user is using the target application for relatively important activities, such as meetings, work communication, or video chat, those activities are not interrupted.
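The delayed-rest behavior amounts to re-presenting the second interface after the target duration; a minimal sketch using Grand Central Dispatch, with an assumed callback name:

```swift
import Foundation

// When the delayed rest control is triggered, dismiss the second interface
// and show it again after the target duration (e.g. 5 minutes).
func delayRest(targetDuration: TimeInterval = 5 * 60,
               showSecondInterface: @escaping () -> Void) {
    DispatchQueue.main.asyncAfter(deadline: .now() + targetDuration) {
        showSecondInterface()
    }
}
```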
306. In response to the progress graph indicating that the eyeball action has been executed completely, the terminal displays a next indication graph, where the next indication graph indicates an eyeball action different from that indicated by the current indication graph.
In the embodiment of the application, the terminal can guide the user to execute different eyeball actions; after the eyeball action indicated by the currently displayed indication graph has been executed, the terminal can display the next indication graph to guide the user to execute another eyeball action.
For example, referring to fig. 6, fig. 6 is a schematic view of another second interface provided according to an embodiment of the present application. As shown in fig. 6, the second interface displays an indication graph with a dotted rounded rectangle, which is the next indication graph after the one shown in fig. 5. The arrow displayed in the indication graph indicates the starting point and moving direction of the eyeball action, the numeral 1 displayed in the indication graph indicates that the remaining execution count of the eyeball action is 1, and the vertical line displayed on the periphery of the indication graph is the progress graph; at this moment, the execution progress of the eyeball action is about one quarter of the total progress.
It should be noted that, after the user performs all the eye movements, the terminal can also display an end prompt on the second interface, which is used for prompting the user that all the eye movements have been completed, and then the terminal displays the first interface.
For example, referring to fig. 7, fig. 7 is a schematic view of another second interface provided according to an embodiment of the present application. As shown in fig. 7, the second interface displays a symbol of a pair of eyes surrounded by a circle, which indicates that the user has completed all the eyeball actions.
It should be noted that, the foregoing steps 301 to 306 are optional implementations of the interface display method provided in the embodiment of the present application. Accordingly, there are other alternative implementations.
In an optional implementation manner, before the terminal displays the indication graph on the second interface of the target application, a target condition setting interface may also be displayed, and in response to the setting operation detected on the target condition setting interface, the terminal may send a target condition setting request to the server, where the target condition setting request carries the target condition. Optionally, the terminal may further store the target condition set by the user in a local storage space of the terminal, which is not limited in this embodiment of the present application.
The embodiment of the application provides an interface display method in which the usage information of a target application is obtained by detecting the use of the target application, so that when the usage information meets the target condition, whichever interface of the target application is displayed is changed to a second interface displaying indication information that guides the user to take a rest. In this way, the user can be guided to rest, the user's fatigue is effectively relieved, and human-computer interaction efficiency is improved.
Fig. 8 is a flowchart of another interface display method provided in the embodiment of the present application, and as shown in fig. 8, the application to a terminal is taken as an example in the embodiment of the present application for description. The method comprises the following steps:
801. the terminal displays a first interface, and the first interface is any interface of the target application.
This step is referred to as step 301, and is not described herein again.
802. And the terminal detects the use condition of the target application to obtain the use information of the target application.
The step is referred to as step 302, and is not described herein again.
803. In response to the accumulated use duration of the target application reaching the time threshold, the terminal displays a first eyeball action on the second interface.
In this embodiment of the application, the target condition is a time threshold. Within the target time range, if the accumulated use duration of the target application reaches the time threshold, the usage information meets the target condition, and the terminal can display the second interface and display the first eyeball action on it. Optionally, the first eyeball action is a blink, a horizontal eyeball movement, a vertical eyeball movement, an oblique eyeball movement, or the like, which is not limited in the embodiment of the present application.
In an optional implementation manner, in response to that the cumulative usage time of the target application reaches the time threshold, the terminal can switch the currently displayed first interface to the second interface, and then the terminal displays the first eyeball action on the second interface.
In an optional implementation manner, in response to the accumulated use duration of the target application reaching the time threshold, the terminal can display the second interface overlaid on the currently displayed first interface with gradually decreasing transparency, and then the terminal displays the first eyeball action on the second interface. The second interface can be displayed as a floating layer that changes from completely transparent to opaque. Optionally, while the transparency of the second interface gradually decreases, the second interface can display a countdown, and when the countdown ends, the second interface becomes opaque.
It should be noted that the usage time period of the target application can be accumulated by the terminal, and can also be accumulated by the server. The second interface can also be an interface provided by a system running in the terminal, which is not limited in the embodiment of the present application. See step 303 for details, which are not described herein.
It should be noted that, before the terminal displays the first eyeball action on the second interface of the target application, the terminal may further display an action setting control on the second interface, and in response to a trigger operation on the action setting control, display an action setting interface, where the action setting interface is used to set the eyeball actions displayed on the second interface and the at least one interface. The action setting interface comprises two parts: the first part shows the eyeball actions to be executed this time, and the second part shows the optional eyeball actions. By dragging the control corresponding to an eyeball action, the user can adjust the execution order of the eyeball actions and choose which eyeball actions to execute this time. Of course, the action setting interface can take other display forms, which is not limited in the embodiment of the application.
For example, referring to fig. 9, fig. 9 is a schematic diagram of a second interface and an action setting interface provided according to an embodiment of the present application. As shown in fig. 9, the second interface displays an action setting control; when the user triggers the action setting control, the terminal displays the action setting interface, which comprises two parts: the first part shows the eyeball actions to be executed this time, and the second part shows the optional eyeball actions. By dragging the control corresponding to an eyeball action, the user can adjust the execution order of the eyeball actions and choose which eyeball actions to execute this time.
804. And the terminal determines the moving track of the eyeballs according to the collected face images.
In the embodiment of the application, after displaying the first eyeball action, the terminal can collect a plurality of face images with an image collection device, such as a camera, and determine the user's eyeball movement track based on the collected face images. The eyeball movement track is the line connecting the eyeball positions in the plurality of face images.
It should be noted that the terminal may determine the eyeball movement track by the eyeball tracking method shown in step 304, by performing eyeball recognition on each of the plurality of face images, or by other methods, which is not limited in this embodiment of the application.
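As an illustration of the comparison in the next step, the following Swift sketch checks whether a track of eyeball positions matches a horizontal left-to-right first eyeball action; the thresholds and function name are assumptions, not values from the patent.

```swift
import CoreGraphics

// Connect the eyeball positions recognized in the captured face images into
// a track and compare it with the expected first eyeball action (here, a
// horizontal left-to-right movement). Thresholds are illustrative.
func matchesHorizontalLeftToRight(track: [CGPoint],
                                  minDistance: CGFloat = 80,
                                  maxVerticalDrift: CGFloat = 20) -> Bool {
    guard let first = track.first, let last = track.last else { return false }
    let horizontal = last.x - first.x            // must move right far enough
    let drift = track.map { abs($0.y - first.y) }.max() ?? 0
    return horizontal >= minDistance && drift <= maxVerticalDrift
}
```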
805. In response to the eyeball movement track being the same as the first eyeball action, the terminal sequentially displays at least one interface, where the at least one interface is used for displaying eyeball actions different from the first eyeball action.
In this embodiment of the application, if the eyeball movement track determined by the terminal is the same as the first eyeball action (for example, the first eyeball action is a horizontal eyeball movement from left to right and the track determined by the terminal is a horizontal line from left to right), the terminal determines that the track matches the first eyeball action and can display a third interface, which is the next interface after the second interface and displays an eyeball action different from the first eyeball action. The terminal then collects face images again and re-determines the eyeball movement track; if the determined track is the same as the second eyeball action displayed on the third interface, the terminal displays the interface after the third interface, and so on, until all interfaces containing eyeball actions have been displayed.
In an optional implementation manner, the terminal may guide the user to execute the first eyeball action multiple times, and accordingly the remaining execution count of the first eyeball action is displayed on the second interface. Before sequentially displaying the at least one interface, the terminal updates the remaining count; in response to the remaining execution count being zero, the step of displaying the at least one interface is executed. In response to the remaining execution count not being zero, the eyeball movement track is determined again according to the collected face images, and in response to the track being the same as the first eyeball action, the remaining execution count is updated again.
For example, referring to fig. 10, fig. 10 is a schematic view of another second interface provided according to an embodiment of the present application. As shown in fig. 10, the first eyeball action moves the eyeball from left to right, and the current remaining execution count is 4; that is, after this execution, the action needs to be performed 3 more times.
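The remaining-count bookkeeping described above can be captured in a few lines; this Swift sketch uses assumed names and treats the display of the subsequent interfaces as a callback.

```swift
// Sketch of the remaining-count flow: each matched track decrements the
// count shown on the second interface; reaching zero triggers the display
// of the at least one subsequent interface. All names are illustrative.
final class FirstActionSession {
    private(set) var remainingExecutions: Int
    private let showNextInterfaces: () -> Void

    init(times: Int, showNextInterfaces: @escaping () -> Void) {
        self.remainingExecutions = times
        self.showNextInterfaces = showNextInterfaces
    }

    /// Call when the determined eyeball movement track matches the first
    /// eyeball action.
    func trackMatched() {
        remainingExecutions -= 1
        if remainingExecutions == 0 {
            showNextInterfaces()
        }
    }
}
```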
In an alternative implementation, the terminal can use different background colors when displaying two adjacent interfaces.
For example, referring to fig. 11, fig. 11 is a schematic view of a third interface provided according to an embodiment of the present application. As shown in fig. 11, the second eye movement is that the eye moves from top right to bottom left, the current remaining number of times is 5, and the background color of the third interface is black, while the background color of the second interface shown in fig. 10 is white.
In an alternative implementation, when the first eye movement is blinking, the second interface also displays closed-eye time and text for relaxing the user.
For example, referring to fig. 12, fig. 12 is a schematic view of another second interface provided according to an embodiment of the present application. As shown in fig. 12, the first eye movement is blinking, and 6 seconds are displayed on the second interface, indicating the remaining eye-closing time, and a text for relaxing the user, i.e., "keep calm when closing the eye", is also displayed, as shown in (a). Two seconds later, as shown in (b), 4 seconds are displayed on the second interface, and another text: "although you are curious:) relax and close the eyes".
In an optional implementation manner, the user can quit the current interface while executing any eyeball action. In response to the quit operation, the terminal displays a progress display interface, which is used for displaying the execution progress control of each eyeball action; in response to a trigger operation on the execution progress control corresponding to any eyeball action, the terminal displays the interface including that eyeball action. Optionally, the display order of the execution progress controls in the progress display interface is the same as the execution order of the eyeball actions set in the action setting interface.
For example, referring to fig. 13, fig. 13 is a schematic diagram of a progress display interface provided according to an embodiment of the present application. As shown in fig. 13, the progress display interface displays a plurality of execution progress controls corresponding to the eyeball actions, and black parts of the execution progress controls are used for indicating the execution progress.
It should be noted that, after the user performs all the eye movements, the terminal can also display an end prompt on the second interface, so as to prompt the user that all the eye movements have been completed. Referring to step 306, further description is omitted here.
It should be noted that, before displaying the first eyeball action on the second interface of the target application, the terminal may further display a time threshold setting interface, and in response to a setting operation detected on the time threshold setting interface, send a time threshold setting request to the server, where the time threshold setting request carries the time threshold. Optionally, the terminal may also store the time threshold set by the user in the local storage space of the terminal, which is not limited in this embodiment of the present application. The time threshold setting interface is one form of the target condition setting interface.
In an optional implementation manner, after the terminal displays the first eyeball action on the second interface of the target application, the terminal can also display a delayed rest control on the second interface, and in response to a trigger operation on the delayed rest control, the terminal displays the second interface again after delaying the target duration. The target duration is 3 minutes, 5 minutes, 10 minutes, or the like, which is not limited in the embodiments of the present application. By providing the delayed rest control, when the user is using the target application for relatively important activities, such as meetings, work communication, or video chat, those activities are not interrupted.
The embodiment of the application provides an interface display method in which the usage information of a target application is obtained by detecting the use of the target application, so that when the usage information meets the target condition, whichever interface of the target application is displayed is changed to a second interface displaying indication information that guides the user to take a rest. In this way, the user can be guided to rest, the user's fatigue is effectively relieved, and human-computer interaction efficiency is improved.
Fig. 14 is a block diagram of an interface display device according to an embodiment of the present application. The apparatus is used for executing the steps executed by the interface display method, and referring to fig. 14, the apparatus includes: a first display module 1401, a detection module 1402 and a second display module 1403.
A first display module 1401, configured to display a first interface, where the first interface is any interface of a target application;
a detection module 1402, configured to detect a usage situation of the target application to obtain usage information of the target application;
a second display module 1403, configured to display a second interface in response to the usage information meeting a target condition, where the second interface is used to display indication information, and the indication information is used to guide the user to take a rest.
In an optional implementation manner, the detecting module 1402 is configured to determine a single-use duration of the target application according to a use operation of the target application; and sending the single-use duration to a server, determining the accumulated use duration of the target application in a target time range by the server based on the single-use duration, and taking the accumulated use duration as the use information of the target application.
In an optional implementation manner, the detection module 1402 is configured to perform facial expression recognition according to the collected facial image, and use a recognition result of the facial expression recognition as the usage information of the target application, where the recognition result is used to indicate whether the user is in a fatigue state.
In an alternative implementation manner, the second display module 1403 is configured to display an indication graph on the second interface in response to the usage information meeting a target condition, where the indication graph is used for indicating an eyeball action; perform eyeball tracking according to the collected face image to obtain the real-time position of the eyeball; and display a progress graph on the periphery of the indication graph according to the real-time position of the eyeball, where the progress graph is used for indicating the progress of executing the eyeball action.
In an alternative implementation manner, the second display module 1403 is further configured to display a next indication graph in response to the progress graph indicating that the eyeball action has been executed completely, where the next indication graph indicates an eyeball action different from that indicated by the current indication graph.
In an optional implementation manner, the second interface further displays the remaining number of times of executing the eyeball action;
the second display module 1403 is further configured to update the remaining execution times in response to the progress graph indicating that the eyeball action has been executed, and to zero the progress of executing the eyeball action indicated by the progress graph.
In an optional implementation, the apparatus further includes:
the second display module 1403 is further configured to display an action setting control on the second interface;
and the third display module is used for responding to the triggering operation of the action setting control and displaying an action setting interface, and the action setting interface is used for setting the eyeball action indicated by the indication graph.
In an optional implementation manner, the second display module 1403 is configured to: display a first eyeball action on the second interface in response to the usage information meeting a target condition; determine an eyeball movement track according to the collected face image; and, in response to the eyeball movement track being the same as the first eyeball action, sequentially display at least one interface, each of which is used for displaying an eyeball action different from the first eyeball action.
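A minimal sketch of the trajectory comparison, assuming both the tracked eyeball movement track and the stored first eyeball action are resampled to the same number of points; the mean-distance threshold is an illustrative assumption.

    import kotlin.math.hypot

    data class GazePoint(val x: Double, val y: Double)

    // Returns true when the tracked movement is close enough to the displayed
    // first eyeball action to count as "the same", at which point the terminal
    // advances to the next interface with a different eyeball action.
    fun trackMatchesAction(
        track: List<GazePoint>,
        action: List<GazePoint>,
        threshold: Double = 0.15
    ): Boolean {
        if (track.size != action.size) return false  // assume equal resampling
        val meanDistance = track.zip(action) { a, b -> hypot(a.x - b.x, a.y - b.y) }.average()
        return meanDistance < threshold
    }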
In an optional implementation manner, the second interface further displays the remaining number of times of execution of the first eyeball action;
the second display module 1403 is further configured to update the remaining execution times, and, in response to the remaining execution times being zero, sequentially display the at least one interface of the target application.
In an optional implementation, the apparatus further includes:
the fourth display module is used for displaying a target condition setting interface;
and the request sending module is used for responding to the setting operation detected on the target condition setting interface and sending a target condition setting request to the server, wherein the target condition setting request carries the target condition.
In an optional implementation manner, the second display module 1403 is further configured to display a delayed rest control on the second interface, and, in response to a trigger operation on the delayed rest control, display the second interface again after a delay of a target duration.
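A sketch of the delayed rest behaviour, assuming the target duration is expressed in minutes; the single-thread scheduler is an illustrative choice.

    import java.util.concurrent.Executors
    import java.util.concurrent.TimeUnit

    class DelayedRest(private val showSecondInterface: () -> Unit) {
        private val scheduler = Executors.newSingleThreadScheduledExecutor()

        // A trigger operation on the delayed rest control hides the second
        // interface and schedules it to be displayed again after the target
        // duration has elapsed.
        fun onDelayControlTriggered(targetMinutes: Long) {
            val task = Runnable { showSecondInterface() }
            scheduler.schedule(task, targetMinutes, TimeUnit.MINUTES)
        }
    }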
In an optional implementation, the apparatus further includes:
the fifth display module is used for displaying a progress display interface in response to a quit operation, where the progress display interface is used for displaying a plurality of execution progress controls, and each execution progress control is used for displaying the execution progress of one eyeball action;
and the sixth display module is used for displaying an interface including the eyeball action in response to a trigger operation on the execution progress control corresponding to any eyeball action.
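A sketch of the progress display interface, with one execution progress control per eyeball action; the plain-text rendering and the names EyeAction and renderProgressInterface are illustrative only.

    data class EyeAction(val name: String, val done: Int, val total: Int)

    fun renderProgressInterface(actions: List<EyeAction>, open: (EyeAction) -> Unit) {
        actions.forEachIndexed { i, action ->
            // Each printed line stands in for one execution progress control.
            println("[$i] ${action.name}: ${action.done}/${action.total} repetitions")
        }
        // A trigger operation on control i would call open(actions[i]) to
        // display the interface that includes that eyeball action.
    }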
The embodiment of the application provides an interface display method: when the accumulated use duration of a target application reaches a time threshold, an indication graphic for indicating an eyeball action is displayed on the second interface to guide the user to relax the eyes, and eyeball tracking is then performed on the collected face images to determine the real-time position of the eyeball.
It should be noted that, in practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the interface display device provided in the above embodiment belongs to the same concept as the interface display method embodiments; for its specific implementation process, see the method embodiments, which are not described herein again.
Fig. 15 is a block diagram of a terminal 1500 according to an embodiment of the present application. The terminal 1500 may be a portable mobile terminal such as: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 1500 may also be referred to as user equipment, a portable terminal, a laptop terminal, a desktop terminal, or by other names.
In general, terminal 1500 includes: a processor 1501 and memory 1502.
Processor 1501 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1501 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), or a PLA (Programmable Logic Array). The processor 1501 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, also referred to as a Central Processing Unit (CPU), and the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1501 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1501 may also include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
The memory 1502 may include one or more computer-readable storage media, which may be non-transitory. The memory 1502 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in the memory 1502 is used to store at least one program code for execution by the processor 1501 to implement the interface display method provided by the method embodiments herein.
In some embodiments, the terminal 1500 may further include: a peripheral interface 1503 and at least one peripheral. The processor 1501, memory 1502, and peripheral interface 1503 may be connected by buses or signal lines. Various peripheral devices may be connected to peripheral interface 1503 via buses, signal lines, or circuit boards. Specifically, the peripheral device includes: at least one of a radio frequency circuit 1504, a display 1505, a camera assembly 1506, an audio circuit 1507, a positioning assembly 1508, and a power supply 1509.
The peripheral interface 1503 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 1501 and the memory 1502. In some embodiments, the processor 1501, memory 1502, and peripheral interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502, and the peripheral interface 1503 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.
The radio frequency circuit 1504 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 1504 communicates with communication networks and other communication devices via electromagnetic signals, converting electrical signals into electromagnetic signals for transmission, or converting received electromagnetic signals into electrical signals. Optionally, the radio frequency circuit 1504 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1504 can communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1504 may also include NFC (Near Field Communication) related circuits, which is not limited in this application.
The display screen 1505 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1505 is a touch display screen, the display screen 1505 also has the ability to capture touch signals on or over its surface. The touch signal may be input to the processor 1501 as a control signal for processing. In this case, the display screen 1505 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1505, disposed on the front panel of the terminal 1500; in other embodiments, there may be at least two display screens 1505, each disposed on a different surface of the terminal 1500 or in a folded design; in still other embodiments, the display screen 1505 may be a flexible display disposed on a curved surface or a folded surface of the terminal 1500. The display screen 1505 may even be configured in a non-rectangular irregular pattern, that is, an irregularly-shaped screen. The display screen 1505 may be an LCD (Liquid Crystal Display) screen, an OLED (Organic Light-Emitting Diode) screen, or the like.
The camera assembly 1506 is used to capture images or video. Optionally, the camera assembly 1506 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the terminal, and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fused shooting functions. In some embodiments, the camera assembly 1506 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash, and can be used for light compensation at different color temperatures.
The audio circuitry 1507 may include a microphone and a speaker. The microphone is used for collecting sound waves of the user and the environment, converting the sound waves into electrical signals, and inputting the electrical signals to the processor 1501 for processing, or to the radio frequency circuit 1504 to realize voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location of the terminal 1500. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 1501 or the radio frequency circuit 1504 into sound waves. The speaker may be a traditional diaphragm speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can convert an electrical signal into sound waves audible to humans, or into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuitry 1507 may also include a headphone jack.
The positioning component 1508 is used to locate the current geographic position of the terminal 1500 to implement navigation or LBS (Location Based Service). The positioning component 1508 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 1509 is used to supply power to the various components in the terminal 1500. The power supply 1509 may be an alternating current supply, a direct current supply, a disposable battery, or a rechargeable battery. When the power supply 1509 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery, charged through a wired line, or a wireless rechargeable battery, charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal 1500 also includes one or more sensors 1510. The one or more sensors 1510 include, but are not limited to: acceleration sensor 1511, gyro sensor 1512, pressure sensor 1513, fingerprint sensor 1514, optical sensor 1515, and proximity sensor 1516.
The acceleration sensor 1511 may detect the magnitude of acceleration on three coordinate axes of the coordinate system established with the terminal 1500. For example, the acceleration sensor 1511 may be used to detect components of the gravitational acceleration in three coordinate axes. The processor 1501 may control the display screen 1505 to display the user interface in a landscape view or a portrait view based on the gravitational acceleration signal collected by the acceleration sensor 1511. The acceleration sensor 1511 may also be used for acquisition of motion data of a game or a user.
The gyroscope sensor 1512 can detect the body direction and the rotation angle of the terminal 1500, and the gyroscope sensor 1512 and the acceleration sensor 1511 cooperate to collect the 3D motion of the user on the terminal 1500. The processor 1501 may implement the following functions according to the data collected by the gyro sensor 1512: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensor 1513 may be disposed on a side frame of terminal 1500 and/or underneath display 1505. When the pressure sensor 1513 is disposed on the side frame of the terminal 1500, the holding signal of the user to the terminal 1500 may be detected, and the processor 1501 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1513. When the pressure sensor 1513 is disposed at a lower layer of the display screen 1505, the processor 1501 controls the operability control on the UI interface in accordance with the pressure operation of the user on the display screen 1505. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 1514 is configured to capture a fingerprint of the user, and the processor 1501 identifies the user based on the fingerprint captured by the fingerprint sensor 1514, or the fingerprint sensor 1514 identifies the user based on the captured fingerprint. Upon recognizing that the user's identity is a trusted identity, the processor 1501 authorizes the user to perform relevant sensitive operations including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings, etc. The fingerprint sensor 1514 may be disposed on the front, back, or side of the terminal 1500. When a physical key or vendor Logo is provided on the terminal 1500, the fingerprint sensor 1514 may be integrated with the physical key or vendor Logo.
The optical sensor 1515 is used to collect ambient light intensity. In one embodiment, the processor 1501 may control the display brightness of the display screen 1505 based on the ambient light intensity collected by the optical sensor 1515: when the ambient light intensity is high, the display brightness of the display screen 1505 is increased; when the ambient light intensity is low, the display brightness of the display screen 1505 is decreased. In another embodiment, the processor 1501 may also dynamically adjust the shooting parameters of the camera assembly 1506 based on the ambient light intensity collected by the optical sensor 1515.
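A minimal sketch of such a brightness mapping, assuming a linear ramp between two illustrative lux bounds; none of the constants come from the embodiments.

    // Maps ambient light intensity (lux) to a display brightness in [0.1, 1.0]:
    // brighter surroundings give a brighter screen, dimmer surroundings a dimmer one.
    fun brightnessFor(ambientLux: Double, minLux: Double = 10.0, maxLux: Double = 1000.0): Double {
        val t = ((ambientLux - minLux) / (maxLux - minLux)).coerceIn(0.0, 1.0)
        return 0.1 + 0.9 * t
    }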
The proximity sensor 1516, also known as a distance sensor, is typically provided on the front panel of the terminal 1500. The proximity sensor 1516 is used to collect the distance between the user and the front surface of the terminal 1500. In one embodiment, when the proximity sensor 1516 detects that the distance between the user and the front surface of the terminal 1500 gradually decreases, the processor 1501 controls the display screen 1505 to switch from the bright screen state to the dark screen state; when the proximity sensor 1516 detects that this distance gradually increases, the processor 1501 controls the display screen 1505 to switch from the dark screen state back to the bright screen state.
Those skilled in the art will appreciate that the configuration shown in fig. 15 does not constitute a limitation of terminal 1500, and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components may be employed.
The embodiment of the present application further provides a computer-readable storage medium, where the computer-readable storage medium is applied to a terminal, and at least one program code is stored in the computer-readable storage medium, and the at least one program code is loaded and executed by a processor to implement the operations executed by the terminal in the interface display method according to the above embodiment.
Embodiments of the present application also provide a computer program product or a computer program comprising computer program code stored in a computer readable storage medium. The processor of the terminal reads the computer program code from the computer-readable storage medium, and the processor executes the computer program code, so that the terminal performs the interface display method provided in the above-described various alternative implementations.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (15)

1. An interface display method, characterized in that the method comprises:
displaying a first interface, wherein the first interface is any interface of a target application;
detecting the use condition of the target application to obtain the use information of the target application;
and in response to the use information meeting a target condition, displaying a second interface, wherein the second interface is used for displaying indication information, and the indication information is used for guiding the user to take a rest.
2. The method according to claim 1, wherein the detecting the usage of the target application to obtain the usage information of the target application comprises:
determining the single-use duration of the target application according to the use operation of the target application;
and sending the single-use duration to a server, determining the accumulated use duration of the target application in a target time range by the server based on the single-use duration, and taking the accumulated use duration as the use information of the target application.
3. The method according to claim 1, wherein the detecting the usage of the target application to obtain the usage information of the target application comprises:
and carrying out facial expression recognition according to the collected facial image, and taking a recognition result of the facial expression recognition as the use information of the target application, wherein the recognition result is used for indicating whether the user is in a fatigue state.
4. The method of claim 1, wherein displaying a second interface in response to the usage information satisfying a target condition comprises:
displaying an indication graph on the second interface in response to the usage information meeting a target condition, wherein the indication graph is used for indicating an eyeball action;
carrying out eyeball tracking according to the collected face image to obtain the real-time position of the eyeball;
and displaying a progress graph on the periphery of the indication graph according to the real-time position of the eyeball, wherein the progress graph is used for indicating the progress of executing the eyeball action.
5. The method according to claim 4, wherein after displaying a progress graphic at a periphery of the indication graphic according to the real-time position of the eyeball, the method further comprises:
and in response to the progress graph indicating that the eyeball action has been executed completely, displaying a next indication graph, wherein the eyeball action indicated by the next indication graph is different from the eyeball action indicated by the indication graph.
6. The method according to claim 4, wherein the second interface further displays the remaining number of executions of the eyeball action;
after the progress graph is displayed on the periphery of the indication graph according to the real-time position of the eyeball, the method further comprises the following steps:
updating the remaining execution times in response to the progress graph indicating that the eyeball action has been executed completely;
and zeroing the progress, indicated by the progress graph, of executing the eyeball action.
7. The method of claim 4, wherein after displaying the second interface in response to the usage information satisfying a target condition, the method further comprises:
displaying an action setting control on the second interface;
and responding to the triggering operation of the action setting control, and displaying an action setting interface, wherein the action setting interface is used for setting the eyeball action indicated by the indication graph.
8. The method of claim 1, wherein displaying a second interface in response to the usage information satisfying a target condition comprises:
in response to the usage information satisfying a target condition, displaying a first eyeball action on the second interface;
determining an eyeball movement track according to the collected face image;
and in response to the eyeball movement track being the same as the first eyeball action, sequentially displaying at least one interface, wherein the at least one interface is used for displaying an eyeball action different from the first eyeball action.
9. The method of claim 8, wherein the second interface further displays a remaining number of executions of the first eyeball action;
before the sequentially displaying at least one interface of the target application, the method further includes:
updating the remaining execution times;
and in response to the remaining execution times being zero, performing the step of sequentially displaying the at least one interface of the target application.
10. The method of claim 1, wherein prior to displaying the second interface in response to the usage information satisfying a target condition, the method further comprises:
displaying a target condition setting interface;
and responding to the setting operation detected on the target condition setting interface, and sending a target condition setting request to a server, wherein the target condition setting request carries the target condition.
11. The method of claim 1, wherein after displaying the second interface in response to the usage information satisfying a target condition, the method further comprises:
displaying a delayed rest control on the second interface;
and in response to a triggering operation on the delayed rest control, displaying the second interface again after a delay of a target duration.
12. The method of claim 1, further comprising:
in response to a quit operation, displaying a progress display interface, wherein the progress display interface is used for displaying a plurality of execution progress controls, and one execution progress control is used for displaying the execution progress of one eyeball action;
and in response to a trigger operation on the execution progress control corresponding to any eyeball action, displaying an interface comprising the eyeball action.
13. An interface display apparatus, the apparatus comprising:
the first display module is used for displaying a first interface, and the first interface is any interface of the target application;
the detection module is used for detecting the use condition of the target application to obtain the use information of the target application;
and the second display module is used for displaying a second interface in response to the use information meeting the target condition, wherein the second interface is used for displaying indication information, and the indication information is used for guiding the user to take a rest.
14. A computer device, characterized in that the computer device comprises a processor and a memory, the memory being used for storing at least one piece of program code, wherein the at least one piece of program code is loaded and executed by the processor to perform the interface display method of any one of claims 1 to 12.
15. A storage medium, used for storing at least one program code, wherein the at least one program code is used for performing the interface display method of any one of claims 1 to 12.
CN202010927582.3A 2020-09-07 2020-09-07 Interface display method, device, terminal and storage medium Active CN114153361B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010927582.3A CN114153361B (en) 2020-09-07 2020-09-07 Interface display method, device, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010927582.3A CN114153361B (en) 2020-09-07 2020-09-07 Interface display method, device, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN114153361A true CN114153361A (en) 2022-03-08
CN114153361B CN114153361B (en) 2023-08-22

Family

ID=80460791

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010927582.3A Active CN114153361B (en) 2020-09-07 2020-09-07 Interface display method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN114153361B (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106502859A (en) * 2016-10-11 2017-03-15 北京小米移动软件有限公司 The method and device of control terminal equipment
CN106533735A (en) * 2016-10-11 2017-03-22 北京奇虎科技有限公司 Mobile terminal use behavior monitoring method and device, server and system
WO2019016406A1 (en) * 2017-07-21 2019-01-24 Stecnius Ug (Haftungsbeschraenkt) Method for guiding sequences of movements and training device for guiding sequences of movements
WO2019144814A1 (en) * 2018-01-26 2019-08-01 维沃移动通信有限公司 Display screen control method and mobile terminal
CN108888487A (en) * 2018-05-22 2018-11-27 深圳奥比中光科技有限公司 A kind of eyeball training system and method
US20200026523A1 (en) * 2018-06-26 2020-01-23 Bryan Allen Young System and method for limiting maximum run time for an application
CN109885362A (en) * 2018-11-30 2019-06-14 努比亚技术有限公司 Terminal and its eyeshield control method and computer readable storage medium
CN111281762A (en) * 2018-12-07 2020-06-16 广州幻境科技有限公司 Vision rehabilitation training method and system
CN109656504A (en) * 2018-12-11 2019-04-19 北京锐安科技有限公司 Screen eye care method, device, terminal and storage medium
CN109828731A (en) * 2018-12-18 2019-05-31 维沃移动通信有限公司 A kind of searching method and terminal device
CN109771952A (en) * 2018-12-28 2019-05-21 努比亚技术有限公司 Based reminding method, terminal and computer readable storage medium based on game fatigue strength
CN110007758A (en) * 2019-03-26 2019-07-12 维沃移动通信有限公司 A kind of control method and terminal of terminal

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115291783A (en) * 2022-06-30 2022-11-04 中国第一汽车股份有限公司 Interface operation method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN114153361B (en) 2023-08-22

Similar Documents

Publication Publication Date Title
CN112911182B (en) Game interaction method, device, terminal and storage medium
CN108803896B (en) Method, device, terminal and storage medium for controlling screen
CN110992493A (en) Image processing method, image processing device, electronic equipment and storage medium
CN112044065B (en) Virtual resource display method, device, equipment and storage medium
CN110933452B (en) Method and device for displaying lovely face gift and storage medium
CN108172176B (en) Page refreshing method and device for ink screen
CN108848405B (en) Image processing method and device
CN110956580A (en) Image face changing method and device, computer equipment and storage medium
CN110275655B (en) Lyric display method, device, equipment and storage medium
CN109831817B (en) Terminal control method, device, terminal and storage medium
CN110769120A (en) Method, device, equipment and storage medium for message reminding
CN111158575B (en) Method, device and equipment for terminal to execute processing and storage medium
CN110152309B (en) Voice communication method, device, electronic equipment and storage medium
CN111857938A (en) Management method and device of popup view, terminal and storage medium
CN114153361B (en) Interface display method, device, terminal and storage medium
CN109819308B (en) Virtual resource acquisition method, device, terminal, server and storage medium
CN111986700A (en) Method, device, equipment and storage medium for triggering non-contact operation
CN112860046A (en) Method, apparatus, electronic device and medium for selecting operation mode
CN115904079A (en) Display equipment adjusting method, device, terminal and storage medium
CN112967261B (en) Image fusion method, device, equipment and storage medium
CN114595019A (en) Theme setting method, device and equipment of application program and storage medium
CN108881715B (en) Starting method and device of shooting mode, terminal and storage medium
CN113824902A (en) Method, device, system, equipment and medium for determining time delay of infrared camera system
CN109561215B (en) Method, device, terminal and storage medium for controlling beautifying function
CN112015612A (en) Method and device for acquiring stuck information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant