CN114153361B - Interface display method, device, terminal and storage medium - Google Patents


Info

Publication number
CN114153361B
CN114153361B
Authority
CN
China
Prior art keywords
interface
target application
displaying
target
eyeball
Prior art date
Legal status
Active
Application number
CN202010927582.3A
Other languages
Chinese (zh)
Other versions
CN114153361A (en)
Inventor
田元
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202010927582.3A priority Critical patent/CN114153361B/en
Publication of CN114153361A publication Critical patent/CN114153361A/en
Application granted granted Critical
Publication of CN114153361B publication Critical patent/CN114153361B/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013: Eye tracking input arrangements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides an interface display method, apparatus, terminal, and storage medium, belonging to the technical field of terminals. The method comprises the following steps: displaying a first interface, the first interface being any interface of a target application; detecting usage of the target application to obtain use information of the target application; and in response to the use information meeting a target condition, displaying a second interface, the second interface being used for displaying indication information that guides the user to rest. This technical solution guides the user to rest, effectively relieves user fatigue, and improves human-computer interaction efficiency.

Description

Interface display method, device, terminal and storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to an interface display method, an apparatus, a terminal, and a storage medium.
Background
With the development of terminal technology, terminals are installed with various types of applications, and users spend more and more time using them. However, prolonged use of an application can cause eye fatigue, which is detrimental to the user's health.
Currently, to prevent a user from using an application for too long, an interface is provided for setting rest reminders. Through this interface, the user sets a cumulative usage duration after which a rest reminder is issued; for example, if the user sets 2 hours, the user is reminded to rest once the accumulated usage time reaches two hours.
The problem with this technical solution is that it merely reminds the user to rest and cannot effectively relieve the user's fatigue; the function is of limited effect, and the human-computer interaction efficiency is low.
Disclosure of Invention
The embodiment of the application provides an interface display method, an interface display device, a terminal and a storage medium, which can guide a user to rest, effectively relieve fatigue of the user and improve human-computer interaction efficiency. The technical scheme is as follows:
in one aspect, there is provided an interface display method, the method including:
displaying a first interface, wherein the first interface is any interface of a target application;
detecting the use condition of the target application to obtain the use information of the target application;
and in response to the use information meeting a target condition, displaying a second interface, wherein the second interface is used for displaying indication information, and the indication information is used for guiding a user to rest.
In another aspect, there is provided an interface display device, the device comprising:
the first display module is used for displaying a first interface, wherein the first interface is any interface of a target application;
the detection module is used for detecting the use condition of the target application to obtain the use information of the target application;
the second display module is used for displaying a second interface in response to the use information meeting a target condition, wherein the second interface is used for displaying indication information, and the indication information is used for guiding a user to rest.
In an optional implementation manner, the detection module is configured to determine a single-use duration of the target application according to a use operation of the target application; and sending the single-use duration to a server, and determining the accumulated use duration of the target application in a target time range by the server based on the single-use duration, wherein the accumulated use duration is used as the use information of the target application.
In an optional implementation manner, the detection module is configured to perform facial expression recognition according to the collected facial image, and use a recognition result of the facial expression recognition as the usage information of the target application, where the recognition result is used to indicate whether the user is in a fatigue state.
In an optional implementation manner, the second display module is configured to display an indication graph on the second interface in response to the use information meeting a target condition, where the indication graph is used to indicate an eyeball action; perform eyeball tracking according to the acquired face image to obtain the real-time position of the eyeball; and display a progress graph on the periphery of the indication graph according to the real-time position of the eyeball, where the progress graph is used to indicate the progress of executing the eyeball action.
In an optional implementation manner, the second display module is further configured to display a next indication graph in response to the progress graph indicating that the eyeball action has been executed, where the next indication graph indicates an eyeball action different from that indicated by the current indication graph.
In an optional implementation manner, the second interface further displays the remaining execution times of the eyeball action;
the second display module is further used for updating the remaining execution times in response to the progress graph indicating that the eyeball action has been executed; and zeroing the progress of executing the eyeball action indicated by the progress graph.
In an alternative implementation, the apparatus further includes:
The second display module is further used for displaying an action setting control on the second interface;
and the third display module is used for responding to the triggering operation of the action setting control and displaying an action setting interface, and the action setting interface is used for setting the eyeball action indicated by the indication graph.
In an alternative implementation, the second display module is configured to display a first eyeball action on the second interface in response to the use information meeting a target condition; determine an eyeball movement track according to the acquired face image; and in response to the eyeball movement track being the same as the first eyeball action, sequentially display at least one interface, where the at least one interface is used for displaying eyeball actions different from the first eyeball action.
In an alternative implementation manner, the second interface also displays the remaining execution times of the first eyeball action;
the second display module is further configured to update the remaining execution times; and in response to the remaining execution times reaching zero, sequentially display at least one interface of the target application.
In an alternative implementation, the apparatus further includes:
The fourth display module is used for displaying a target condition setting interface;
and the request sending module is used for responding to the setting operation detected at the target condition setting interface and sending a target condition setting request to the server, wherein the target condition setting request carries the target condition.
In an optional implementation manner, the second display module is further configured to display a delay rest control on the second interface; and responding to the triggering operation of the delay rest control, and displaying the second interface after delaying the target duration.
In an alternative implementation, the apparatus further includes:
the fifth display module is used for responding to the exit operation and displaying a progress display interface, wherein the progress display interface is used for displaying a plurality of execution progress controls, and one execution progress control is used for displaying the execution progress of one eyeball action;
and the sixth display module is used for displaying an interface including the eyeball action in response to a triggering operation on the execution progress control corresponding to any eyeball action.
In another aspect, a terminal is provided, the terminal including a processor and a memory for storing at least one piece of program code, the at least one piece of program code being loaded and executed by the processor to implement operations performed in the interface display method in an embodiment of the present application.
In another aspect, a computer readable storage medium having stored therein at least one piece of program code loaded and executed by a processor to implement operations performed in an interface display method according to an embodiment of the present application is provided.
In another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer program code, the computer program code being stored in a computer readable storage medium. The processor of the terminal reads the computer program code from the computer readable storage medium, and the processor executes the computer program code to cause the computer device to perform the interface display method provided in the above aspects or various alternative implementations of the aspects.
The technical scheme provided by the embodiment of the application has the beneficial effects that:
the embodiment of the application provides an interface display method in which the use condition of a target application is detected to obtain its use information, so that when the use information meets a target condition, the currently displayed interface of the target application is changed into a second interface showing indication information that guides the user to rest, thereby guiding the user to rest, effectively relieving user fatigue, and improving human-computer interaction efficiency.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic view of an implementation environment of an interface display method according to an embodiment of the present application;
FIG. 2 is a flow chart of an interface display method according to an embodiment of the present application;
FIG. 3 is a flow chart of another interface display method provided according to an embodiment of the present application;
FIG. 4 is a schematic illustration of a second interface provided in accordance with an embodiment of the present application;
FIG. 5 is a schematic illustration of a second interface provided in accordance with an embodiment of the present application;
FIG. 6 is a schematic illustration of another second interface provided in accordance with an embodiment of the present application;
FIG. 7 is a schematic illustration of another second interface provided in accordance with an embodiment of the present application;
FIG. 8 is a flow chart of another interface display method provided according to an embodiment of the present application;
FIG. 9 is a schematic diagram of a second interface and action setup interface provided in accordance with an embodiment of the present application;
FIG. 10 is a schematic illustration of another second interface provided in accordance with an embodiment of the present application;
FIG. 11 is a schematic illustration of a third interface provided in accordance with an embodiment of the present application;
FIG. 12 is a schematic illustration of another second interface provided in accordance with an embodiment of the present application;
FIG. 13 is a schematic diagram of a progress presentation interface provided in accordance with an embodiment of the present application;
FIG. 14 is a block diagram of an interface display device provided according to an embodiment of the present application;
fig. 15 is a block diagram of a terminal according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail with reference to the accompanying drawings.
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of apparatus and methods consistent with aspects of the application as detailed in the accompanying claims.
The following describes an implementation environment of the interface display method provided by the embodiment of the present application. Fig. 1 is a schematic view of an implementation environment of an interface display method according to an embodiment of the present application. Referring to fig. 1, the implementation environment includes a terminal 101 and a server 102.
The terminal 101 and the server 102 can be directly or indirectly connected through wired or wireless communication, and the present application is not limited herein.
Optionally, the terminal 101 is a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, etc., but is not limited thereto. The terminal 101 has an application installed and running. The application may be a social application, an informative application, an educational application, etc., to which embodiments of the application are not limited. Illustratively, the terminal 101 is a terminal used by a user, and a user account is logged into an application running in the terminal.
Alternatively, the server 102 is a stand-alone physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server that provides basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs (content delivery networks), big data, and artificial intelligence platforms. The server 102 is used to provide background services for applications. Alternatively, the server 102 may undertake primary computing work and the terminal 101 secondary computing work; or the server 102 may undertake secondary computing work and the terminal 101 primary computing work; or a distributed computing architecture may be used for collaborative computing between the server 102 and the terminal 101.
Optionally, the server 102 includes an access server, an application background server, and a database. The access server is used to provide access services to the terminal 101. The application background server is used to provide background services for the applications, and there may be one or more application background servers. When there are multiple application background servers, at least two of them provide different services and/or at least two provide the same service, for example in a load-balancing manner, which is not limited by the embodiments of the present application. The database is used to store the single-use durations of applications uploaded by the terminal.
The terminal 101 may refer broadly to one of a plurality of terminals; the embodiment of the present application is illustrated with the terminal 101 only as an example. Those skilled in the art will recognize that the number of terminals may be greater or smaller: there may be only one terminal, or tens, hundreds, or more. The embodiments of the present application do not limit the number of terminals or the device types.
Fig. 2 is a flowchart of an interface display method according to an embodiment of the present application, and as shown in fig. 2, the embodiment of the present application is described by taking application to a terminal as an example. The interface display method comprises the following steps:
201. The terminal displays a first interface, wherein the first interface is any interface of the target application.
In the embodiment of the present application, the target application is any application running in the terminal, and the first interface is any interface of the target application.
202. And the terminal detects the use condition of the target application to obtain the use information of the target application.
In the embodiment of the application, the terminal can detect the use condition of the user on the target application, such as the accumulated use duration of the user on the target application, the single use duration of the target application, the fatigue degree of the user when using the target application and the like.
203. And the terminal responds to the use information meeting the target condition, and displays a second interface, wherein the second interface is used for displaying indication information, and the indication information is used for guiding the user to rest.
In the embodiment of the present application, the target condition differs according to the type of use information. When the use information meets the target condition, the terminal displays the second interface to present the indication information for guiding the user to rest, so that the user can be effectively guided to relax, and the user's fatigue state is relieved. The indication information can take the form of a dynamic image, text plus graphics, or a static graphic. Optionally, the indication information can also be audio indication information. Optionally, while displaying the indication information, the terminal may also play corresponding audio, which is not limited by the embodiment of the present application. Optionally, the indication information can guide the user to rest in multiple ways, such as guiding the user to rest the eyes, the cervical vertebra, or the brain, which is not limited by the embodiment of the present application.
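As an illustration of this step, the target-condition check can be sketched as follows. All function and field names, and the two-hour default threshold, are assumptions for illustration and are not part of the embodiment:

```python
# Minimal sketch of the target-condition check: the condition differs with
# the kind of use information (accumulated duration vs. fatigue recognition).
# Names and the 7200-second threshold are illustrative assumptions.

def meets_target_condition(usage_info: dict, time_threshold_s: float = 7200.0) -> bool:
    """Return True when the use information satisfies the target condition."""
    if "cumulative_duration_s" in usage_info:
        return usage_info["cumulative_duration_s"] >= time_threshold_s
    if "fatigue_detected" in usage_info:
        return bool(usage_info["fatigue_detected"])
    return False

def on_usage_update(usage_info: dict) -> str:
    # Switch from the first interface to the second (rest-guidance) interface
    # when the target condition is met.
    return "second_interface" if meets_target_condition(usage_info) else "first_interface"
```

The dispatch on the kind of use information mirrors the two detection modes described above: a duration compared against a time threshold, or a recognition result indicating a fatigue state.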
The embodiment of the application provides an interface display method in which the use condition of a target application is detected to obtain its use information, so that when the use information meets a target condition, the currently displayed interface of the target application is changed into a second interface showing indication information that guides the user to rest, thereby guiding the user to rest, effectively relieving user fatigue, and improving human-computer interaction efficiency.
Fig. 3 is a flowchart of another interface display method according to an embodiment of the present application. As shown in fig. 3, this embodiment is described taking as an example indication information that guides the user to rest the eyes. The method comprises the following steps:
301. the terminal displays a first interface, wherein the first interface is any interface of the target application.
In the embodiment of the present application, the target application is any application running in the terminal, and the first interface is any interface of the target application. The terminal can take the application program currently being used by the user as the target application.
302. And the terminal detects the use condition of the target application to obtain the use information of the target application.
In the embodiment of the application, the terminal can detect the use condition of the user on the target application, such as the accumulated use duration of the user on the target application, the single use duration of the target application, the fatigue degree of the user when using the target application and the like.
Optionally, the usage information is used to indicate a cumulative usage time of the target application, and the usage time of the target application can be accumulated by the terminal and can also be accumulated by the server.
In an alternative implementation manner, the terminal determines a single-use duration of the target application according to usage operations on the target application, and accumulates the single-use durations to determine the accumulated use duration of the target application within a target time range. Usage operations on the target application include starting the target application, switching the target application running in the background to the foreground, switching the target application running in the foreground to the background, closing the target application, and the like. The target time range is, for example, 4 hours, 12 hours, or 24 hours, which is not limited by the embodiment of the present application.
For example, the target application is running in the background, when the target application running in the background is switched to the foreground, the terminal starts timing, when the target application running in the foreground is switched to the background or the target application is closed, the terminal finishes timing, and the timing duration is taken as the single use duration of the target application by the terminal.
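The single-use timing in this example can be sketched as follows; the class and method names are illustrative assumptions, not terminology from the embodiment:

```python
import time

class SingleUseTimer:
    """Times one foreground session of the target application.

    Timing starts when the application is launched or switched to the
    foreground, and ends when it is switched to the background or closed;
    the elapsed time is taken as the single-use duration.
    (Illustrative sketch; names are assumptions.)
    """
    def __init__(self, clock=time.monotonic):
        self._clock = clock   # injectable clock, so the sketch is testable
        self._start = None

    def on_foreground(self) -> None:
        # Target application launched or switched to the foreground: start timing.
        self._start = self._clock()

    def on_background(self) -> float:
        # Target application switched to the background or closed: stop timing
        # and return the single-use duration in seconds.
        if self._start is None:
            return 0.0
        duration = self._clock() - self._start
        self._start = None
        return duration
```

The returned duration would then be accumulated locally or reported to the server, per the two alternatives described in the text.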
In an alternative implementation manner, the terminal determines the single-use duration of the target application according to usage operations on the target application and sends the single-use duration to the server, and the server determines the accumulated use duration of the target application within the target time range based on the single-use durations. Correspondingly, when the server determines that the accumulated use duration of the target application within the target time range reaches the time threshold, the server sends a rest instruction to the terminal, and the terminal determines, based on the rest instruction, that the accumulated use duration of the target application has reached the time threshold. Optionally, when the accumulated use duration of the target application within the target time range approaches the time threshold, the server can send an early-warning instruction to the terminal. Based on the early-warning instruction, when the user next starts to use the target application, the terminal sends a timing request to the server, and the terminal and the server time jointly until the accumulated use duration of the target application reaches the time threshold. In this way, the problem that the server cannot effectively determine whether the accumulated use duration of the target application has reached the time threshold because a single use lasts too long can be avoided.
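The server-side accumulation, with a rest instruction at the threshold and an early-warning instruction shortly before it, can be sketched as follows. The 90% early-warning fraction and all names are illustrative assumptions; the embodiment does not specify when "approaching the threshold" begins:

```python
class UsageAccumulator:
    """Server-side accumulation of reported single-use durations within a
    target time range. Returns the instruction to send back to the terminal.
    (Illustrative sketch; the warn fraction is an assumption.)"""

    def __init__(self, threshold_s: float, warn_fraction: float = 0.9):
        self.threshold_s = threshold_s
        self.warn_s = threshold_s * warn_fraction
        self.total_s = 0.0

    def add(self, single_use_s: float) -> str:
        """Accumulate one reported single-use duration."""
        self.total_s += single_use_s
        if self.total_s >= self.threshold_s:
            return "rest"           # accumulated duration reached the threshold
        if self.total_s >= self.warn_s:
            return "early_warning"  # near the threshold: time jointly with terminal
        return "none"
```

On receiving `"early_warning"`, the terminal would begin joint timing with the server, avoiding the overlong-single-use problem described above.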
In an alternative implementation manner, the terminal determines the current starting time of the target application according to the starting operation of the target application, then the terminal sends the current starting time to the server, and the server determines the accumulated using time of the target application in the target time range based on the current starting time. The starting time is the starting time of the target application.
In an optional implementation manner, the terminal can perform facial expression recognition according to the collected facial image, and the recognition result of the facial expression recognition is used as the use information of the target application, and is used for indicating whether the user is in a fatigue state or not.
Optionally, in response to the identification result indicating that the user is in a tired state, the terminal displays a second interface. The embodiment of the application takes the case that the terminal displays the second interface when the accumulated use time of the target application reaches the time threshold value as an example.
In response to the accumulated use duration of the target application reaching the time threshold, the terminal displays an indication graph on the second interface, the indication graph being used to indicate an eyeball action.
In the embodiment of the present application, when the target condition is a time threshold, if the accumulated use duration of the target application within the target time range reaches the time threshold, the use information meets the target condition. The terminal can then display the second interface and show an indication graph on it, so as to guide the user to perform the corresponding eyeball action according to the indication graph, such as blinking, rotating the eyeball 360 degrees along a circle, rotating the eyeball 360 degrees along a quadrilateral, or moving the eyeball in a figure-eight pattern. The terminal can display the second interface in several ways.
In an alternative implementation, in response to the accumulated usage time of the target application reaching the time threshold, the terminal can switch the currently displayed first interface to a second interface, and then the terminal displays the indication graphic on the second interface.
In an alternative implementation, in response to the accumulated usage time of the target application reaching the time threshold, the terminal can overlay the second interface on the first interface currently displayed, the transparency of the second interface gradually decreases, and then the terminal displays the indication graphic on the second interface. Wherein the second interface can be displayed in a floating layer from completely transparent to opaque. Alternatively, the second interface can be displayed with a countdown as the transparency of the second interface gradually decreases, and the second interface changes to an opaque state when the countdown is over.
For example, referring to fig. 4, fig. 4 is a schematic diagram of a second interface according to an embodiment of the present application. As shown in fig. 4, the second interface is displayed in a semi-transparent manner on the first interface of the social application, the second interface displaying eyes in a semi-transparent state and countdown seconds. Wherein the broken line indicates a semitransparent state.
It should be noted that, the second interface may be an interface in the target application, or may be an interface provided by a system running in the terminal, which is not limited in the embodiment of the present application.
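The gradual decrease in transparency of the overlaid second interface during the countdown can be sketched as a simple opacity ramp; the linear curve and the three-second default countdown are illustrative assumptions, since the embodiment does not specify either:

```python
def overlay_opacity(elapsed_s: float, countdown_s: float = 3.0) -> float:
    """Opacity of the second-interface floating layer: 0.0 (fully transparent)
    when the countdown starts, rising to 1.0 (opaque) when the countdown ends,
    i.e. the transparency gradually decreases.
    (Illustrative sketch; a linear ramp is an assumption.)"""
    if countdown_s <= 0:
        return 1.0
    # Clamp to [0, 1] so the layer stays opaque after the countdown is over.
    return min(max(elapsed_s / countdown_s, 0.0), 1.0)
```

The countdown seconds shown in fig. 4 would be derived from the same elapsed time that drives this ramp.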
304. And the terminal performs eyeball tracking according to the acquired face image to obtain the real-time eyeball position.
In the embodiment of the application, while the user performs the eyeball action as guided, the terminal can capture face images through an image-capture device such as a camera and track the user's eyeballs based on the captured face images, thereby obtaining the real-time position of the user's eyeballs.
In an alternative implementation, the terminal can obtain depth information from the acquired face image, and perform eye tracking based on the depth information.
For example, take a terminal that performs TrueDepth imaging (depth imaging) through its camera, capturing depth images that carry depth information, to implement eye tracking. First, a plurality of face images are captured through the TrueDepth camera; then eye tracking is performed based on these face images, and the real-time position of the eyes in each face image is determined, where the real-time position can represent the point in the indication graphic that the eyes are watching.
The following briefly describes how to capture face images that enable eye tracking, using Apple's AVFoundation photo-capture API as an example. A capture device is first selected, such as the dual camera via the builtInDualCamera device type or the TrueDepth camera via the builtInTrueDepthCamera device type. The relevant configuration is then performed on an AVCapturePhotoOutput before shooting, enabling depth data delivery. Once depth capture is ready, a depth image can be requested alone or together with a color image: an AVCapturePhotoSettings object is created, the format of the color image is selected, depth capture and depth output are enabled, and capturePhoto(with:delegate:) is called on the photo output. After capture, the photo output invokes the delegate method photoOutput(_:didFinishProcessingPhoto:error:), which receives each generated image and its related data in the form of an AVCapturePhoto object.
305. The terminal displays a progress graphic on the periphery of the indication graphic according to the real-time position of the eyeball, where the progress graphic is used to indicate the progress of performing the eyeball action.
In the embodiment of the application, after determining the real-time position of the eyeball, the terminal can display a progress graphic on the periphery of the displayed indication graphic according to that real-time position, so as to indicate the current execution progress; through the progress graphic, the user can determine whether the performed eyeball action is correct and which eyeball action needs to be performed next.
In an alternative implementation, the terminal can guide the user to perform the eyeball action multiple times, and correspondingly, the second interface also displays the remaining number of executions of the eyeball action. In response to the progress graphic indicating that the eyeball action has been completed, the terminal can update the remaining number of executions and then reset the progress indicated by the progress graphic to zero.
For example, referring to fig. 5, fig. 5 is a schematic diagram of another second interface provided according to an embodiment of the present application. As shown in fig. 5, the second interface displays a dashed-circle indication graphic; the arrow displayed in the indication graphic indicates the starting point and movement direction of the eyeball action, and the numeral 1 displayed in the indication graphic indicates that the remaining number of executions of the eyeball action is 1, that is, after the current execution completes, the action needs to be performed once more. The vertical line displayed on the periphery of the indication graphic is the progress graphic; here the execution progress of the eyeball action is about one quarter of the total. When the execution progress reaches one hundred percent, the terminal updates the remaining number of executions to 0 and simultaneously resets the execution progress to zero.
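The bookkeeping for the remaining execution count and the progress reset described above can be sketched as follows (an illustrative Python model; the class and attribute names are assumptions, not part of the embodiment):

```python
class EyeActionProgress:
    """Models one indication graphic: the execution progress shown by the
    progress graphic and the remaining number of executions shown on the
    second interface."""

    def __init__(self, repetitions: int):
        self.remaining = repetitions  # remaining number of executions
        self.progress = 0.0           # fraction of the action completed

    def update(self, progress: float) -> None:
        """Update with the progress inferred from the real-time eye position."""
        self.progress = progress
        if self.progress >= 1.0:   # the progress graphic indicates completion
            self.remaining -= 1    # update the remaining execution count
            self.progress = 0.0    # then reset the indicated progress to zero

    @property
    def done(self) -> bool:
        """True once every required execution has been completed."""
        return self.remaining <= 0
```

Calling update with each newly inferred progress value reproduces the behavior of fig. 5: the count decrements and the progress graphic returns to zero each time an execution completes.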
In an alternative implementation, after displaying the indication graphic on the second interface, the terminal can also display an action setting control on the second interface; in response to a trigger operation on the action setting control, the terminal displays an action setting interface, which is used for setting the eyeball action indicated by the indication graphic.
In an alternative implementation, after displaying the indication graphic on the second interface of the target application, the terminal can also display a delayed-rest control on the second interface; in response to a trigger operation on the delayed-rest control, the terminal displays the second interface again after delaying for a target duration. The target duration is 3 minutes, 5 minutes, 10 minutes, or the like, which is not limited in the embodiment of the present application. By providing the delayed-rest control, the user is not interrupted when using the target application for important activities such as meetings, work communication, or video chat.
306. In response to the progress graphic indicating that the eyeball action has been completed, the terminal displays a next indication graphic, where the eyeball action indicated by the next indication graphic is different from that indicated by the current indication graphic.
In the embodiment of the application, the terminal can guide the user to perform different eyeball actions; after the eyeball action indicated by the currently displayed indication graphic has been completed, the terminal can display the next indication graphic to guide the user to perform another eyeball action.
For example, referring to fig. 6, fig. 6 is a schematic diagram of another second interface provided according to an embodiment of the present application. As shown in fig. 6, the second interface displays an indication graphic in the form of a dashed rounded rectangle, which is the next indication graphic after the one shown in fig. 5. The arrow displayed in the indication graphic indicates the starting point and movement direction of the eyeball action, the numeral 1 indicates that the remaining number of executions of the eyeball action is 1, and the vertical line displayed on the periphery of the indication graphic is the progress graphic; at this moment, the execution progress of the eyeball action is about one quarter of the total.
After the user performs all the eyeball movements, the terminal can also display an end prompt on the second interface to prompt the user that all the eyeball movements are completed, and then the terminal displays the first interface.
For example, referring to fig. 7, fig. 7 is a schematic diagram of another second interface provided according to an embodiment of the present application. As shown in fig. 7, the second interface displays a check number surrounded by eyes and circles, indicating that the user has completed all eye movements.
After the user performs all the eyeball movements, the terminal can display the first interface again.
It should be noted that steps 301 to 306 are one optional implementation of the interface display method provided in the embodiment of the present application; other alternative implementations exist accordingly.
In an alternative implementation, before the terminal displays the indication graphic on the second interface of the target application, the terminal can also display a target condition setting interface, and in response to a setting operation detected at the target condition setting interface, the terminal can send a target condition setting request to the server, where the target condition setting request carries the target condition. Optionally, the terminal may further store the target condition set by the user in a local storage space of the terminal, which is not limited by the embodiment of the present application.
The embodiment of the application provides an interface display method that detects the usage of a target application to obtain its usage information, so that when the usage information meets a target condition, the currently displayed interface of the target application is changed to a second interface showing indication information that guides the user to rest, thereby guiding the user to rest, effectively relieving user fatigue, and improving human-computer interaction efficiency.
Fig. 8 is a flowchart of another interface display method according to an embodiment of the present application. As shown in fig. 8, in the embodiment of the application, the method is described using its application to a terminal as an example. The method comprises the following steps:
801. the terminal displays a first interface, wherein the first interface is any interface of the target application.
This step is similar to step 301 and will not be described in detail here.
802. And the terminal detects the use condition of the target application to obtain the use information of the target application.
This step is similar to step 302 and will not be described in detail here.
803. In response to the accumulated usage time of the target application reaching the time threshold, the terminal displays a first eyeball action on the second interface.
In the embodiment of the application, the target condition is a time threshold; if, within the target time range, the accumulated usage time of the target application reaches the time threshold, the usage information meets the target condition, and the terminal can display a second interface with the first eyeball action displayed on it. Optionally, the first eyeball action is blinking, horizontal eye movement, vertical eye movement, oblique eye movement, etc., which is not limited in the embodiment of the present application.
In an alternative implementation, in response to the accumulated usage time of the target application reaching the time threshold, the terminal can switch the currently displayed first interface to the second interface, and then display the first eyeball action on the second interface.
In an alternative implementation, in response to the accumulated usage time of the target application reaching the time threshold, the terminal can overlay the second interface on the currently displayed first interface, with the transparency of the second interface gradually decreasing, and then display the first eyeball action on the second interface. The second interface can be displayed as a floating layer that transitions from fully transparent to opaque. Alternatively, a countdown can be displayed while the transparency of the second interface gradually decreases, with the second interface becoming opaque when the countdown ends.
The usage duration of the target application can be accumulated by the terminal or by the server. The second interface can also be an interface provided by the operating system running on the terminal, which is not limited in the embodiment of the present application. See step 303 for details not described here.
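The server-side accumulation of single-use durations within the target time range can be sketched as follows (a Python illustration; the 24-hour window and the session representation are assumptions, not part of the embodiment):

```python
from datetime import datetime, timedelta


def accumulate_usage(sessions, now, window=timedelta(hours=24)):
    """Sums the single-use durations (in seconds) of the target application
    for sessions that ended within the target time range, yielding the
    accumulated usage time used as the usage information.

    sessions: iterable of (session_end_time, duration_seconds) pairs.
    """
    start = now - window
    return sum(seconds for end_time, seconds in sessions if end_time >= start)
```

The same logic applies whether the terminal or the server does the accumulation; only where the sessions list is kept differs.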
Before the first eyeball action is displayed on the second interface of the target application, the terminal can also display an action setting control on the second interface and, in response to a trigger operation on the action setting control, display an action setting interface, which is used for setting the eyeball actions displayed on the second interface and on the at least one subsequent interface. The action setting interface comprises two parts: the first part contains the eyeball actions to be performed this time, and the second part contains the optional eyeball actions. By dragging the controls corresponding to the eyeball actions, the user can adjust the execution order of the eyeball actions and select which eyeball actions are performed this time. Of course, the action setting interface can have other display forms, which is not limited in the embodiment of the present application.
For example, referring to fig. 9, fig. 9 is a schematic diagram of a second interface and an action setting interface according to an embodiment of the present application. As shown in fig. 9, the second interface displays an action setting control, and when the user triggers the action setting control, the terminal displays an action setting interface, where the action setting interface includes two parts, the first part is an eyeball action executed this time, and the second part is an optional eyeball action. The user can adjust the execution sequence of the eyeball actions and which eyeball actions are executed this time by dragging the control corresponding to the eyeball actions.
804. And the terminal determines an eyeball movement track according to the acquired face image.
In the embodiment of the application, after displaying the first eyeball action, the terminal can capture a plurality of face images through an image-capture device such as a camera and determine the user's eye movement track based on the captured face images. The eye movement track is the connecting line of the eyeball positions in the plurality of face images.
It should be noted that the terminal may determine the eye movement track using the eye-tracking method shown in step 304, by performing eye recognition on the plurality of face images, or by other methods, which is not limited in the embodiment of the present application.
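As a minimal sketch of this step, the track can be reduced to the overall displacement of the eyeball positions across the face images and compared with the guided action (a Python illustration; the thresholds and the coordinate convention, with x growing to the right, are assumptions):

```python
def eye_movement_track(eye_positions):
    """The eye movement track connects the eyeball positions detected in
    successive face images; here it is reduced to the overall displacement
    from the first detected position to the last."""
    (x0, y0), (xn, yn) = eye_positions[0], eye_positions[-1]
    return (xn - x0, yn - y0)


def matches_left_to_right(track, min_dx=0.5, max_dy=0.2):
    """A track counts as a left-to-right horizontal movement if it moves
    mostly rightward with little vertical drift (assumed thresholds)."""
    dx, dy = track
    return dx >= min_dx and abs(dy) <= max_dy
```

A production implementation would compare the full polyline against the guided path rather than just its endpoints, but the endpoint displacement already captures the horizontal left-to-right example used in the text.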
805. In response to the eye movement track being the same as the first eyeball action, the terminal sequentially displays at least one interface, each of which displays an eyeball action different from the first eyeball action.
In the embodiment of the application, if the eye movement track determined by the terminal is the same as the first eyeball action (for example, the first eyeball action is a horizontal movement from left to right and the determined track is a left-to-right horizontal line), the terminal can display a third interface, which is the next interface after the second interface and displays an eyeball action different from the first. The terminal then captures face images again and determines the eye movement track again; if the determined track is the same as the second eyeball action displayed on the third interface, the terminal displays the interface after the third interface, and so on, until all interfaces containing eyeball actions have been displayed.
In an alternative implementation, the terminal can guide the user to perform the first eyeball action multiple times, and correspondingly, the second interface also displays the remaining number of executions of the first eyeball action. Before sequentially displaying the at least one interface, the terminal can update the remaining count; in response to the remaining number of executions being zero, the terminal performs the step of displaying the at least one interface; in response to the remaining number being non-zero, the terminal again determines an eye movement track from newly captured face images and updates the remaining count whenever the track is the same as the first eyeball action.
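The sequential flow of steps 803 to 805, with repetitions per action, can be sketched as follows (a Python illustration; the perform callback stands in for capturing face images and matching the determined track, and is an assumption):

```python
def run_guided_actions(actions, repetitions, perform):
    """Displays each interface's eyeball action in turn: the next interface
    is shown only after the current action's track has been matched the
    required number of times. perform(action) returns True when the
    determined eye movement track matches the displayed action."""
    shown = []
    for action in actions:       # second interface, third interface, ...
        remaining = repetitions  # remaining execution count for this action
        while remaining > 0:
            if perform(action):  # track matches the displayed action
                remaining -= 1   # update the remaining execution count
        shown.append(action)
    return shown
```

With a callback that always matches, the interfaces are visited strictly in order, mirroring the progression from fig. 10 to fig. 11.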
For example, referring to fig. 10, fig. 10 is a schematic diagram of another second interface provided according to an embodiment of the present application. As shown in fig. 10, the first eyeball action is moving the eyeball from left to right, and the current remaining number of executions is 4, that is, after the current execution completes, the action needs to be performed 3 more times.
In an alternative implementation, the terminal can use different background colors when displaying two adjacent interfaces.
For example, referring to fig. 11, fig. 11 is a schematic diagram of a third interface according to an embodiment of the present application. As shown in fig. 11, the second eyeball action is moving the eyeball from right to left, the current remaining number of executions is 5, and the background color of the third interface is black, whereas the background color of the second interface shown in fig. 10 is white.
In an alternative implementation, when the first eyeball action is blinking, the second interface also displays a remaining eye-closure time and text that helps the user relax.
For example, referring to fig. 12, fig. 12 is a schematic diagram of another second interface provided according to an embodiment of the present application. As shown in fig. 12 (a), the first eyeball action is blinking, and the second interface displays the number 6, representing the remaining eye-closure time in seconds, together with relaxing text such as "calmly close your eyes". After two seconds, as shown in fig. 12 (b), the second interface displays the number 4 and another text, such as "though you are curious :), relax and keep your eyes closed".
In an optional implementation, the user can exit the current interface while any eyeball action is being performed; in response to the exit operation, the terminal displays a progress display interface, which displays an execution progress control for each eyeball action; in response to a trigger operation on the execution progress control corresponding to any eyeball action, the terminal displays the interface including that eyeball action. Optionally, the display order of the execution progress controls in the progress display interface is the same as the execution order of the eyeball actions set in the action setting interface.
For example, referring to fig. 13, fig. 13 is a schematic diagram of a progress display interface according to an embodiment of the present application. As shown in fig. 13, the progress display interface displays execution progress controls corresponding to a plurality of eyeball actions, and black portions of the execution progress controls are used for indicating execution progress.
After the user performs all the eyeball movements, the terminal can also display an end prompt on the second interface to prompt the user that all the eyeball movements are completed. See step 306, which is not described in detail herein.
Before the first eyeball action is displayed on the second interface of the target application, the terminal can also display a time threshold setting interface and, in response to a setting operation detected at the time threshold setting interface, send a time threshold setting request carrying the time threshold to the server. Optionally, the terminal may also store the time threshold set by the user in a local storage space of the terminal, which is not limited in the embodiment of the present application. The time threshold setting interface is a target condition setting interface.
In an alternative implementation, after the first eyeball action is displayed on the second interface of the target application, the terminal can also display a delayed-rest control on the second interface; in response to a trigger operation on the delayed-rest control, the terminal displays the second interface again after delaying for a target duration. The target duration is 3 minutes, 5 minutes, 10 minutes, or the like, which is not limited in the embodiment of the present application. By providing the delayed-rest control, the user is not interrupted when using the target application for important activities such as meetings, work communication, or video chat.
The embodiment of the application provides an interface display method that detects the usage of a target application to obtain its usage information, so that when the usage information meets a target condition, the currently displayed interface of the target application is changed to a second interface showing indication information that guides the user to rest, thereby guiding the user to rest, effectively relieving user fatigue, and improving human-computer interaction efficiency.
Fig. 14 is a block diagram of an interface display device according to an embodiment of the present application. The device is used for executing the steps when the interface display method is executed, and referring to fig. 14, the device comprises: a first display module 1401, a detection module 1402 and a second display module 1403.
A first display module 1401, configured to display a first interface, where the first interface is any interface of a target application;
a detection module 1402, configured to detect a usage situation of the target application, to obtain usage information of the target application;
the second display module 1403 is configured to display a second interface in response to the usage information meeting a target condition, where the second interface is configured to display indication information, and the indication information is configured to instruct a user to take a rest.
In an alternative implementation, the detection module 1402 is configured to determine a single-use duration of the target application according to a usage operation on the target application, and to send the single-use duration to a server, which determines, based on the single-use duration, the accumulated usage time of the target application within a target time range as the usage information of the target application.
In an optional implementation manner, the detection module 1402 is configured to perform facial expression recognition according to the collected facial image, and use a recognition result of the facial expression recognition as the usage information of the target application, where the recognition result is used to indicate whether the user is in a fatigue state.
In an alternative implementation, the second display module 1403 is configured to display an indication graphic on the second interface in response to the usage information meeting a target condition, where the indication graphic is used to indicate an eyeball action; according to the acquired face image, eye tracking is carried out to obtain the real-time position of the eyeball; and displaying a progress pattern on the periphery of the indication pattern according to the real-time position of the eyeball, wherein the progress pattern is used for indicating the progress of executing the eyeball action.
In an alternative implementation, the second display module 1403 is further configured to display a next indication graph in response to the progress graph indicating that the eyeball motion is performed, where the next indication graph is different from the eyeball motion indicated by the indication graph.
In an alternative implementation, the second interface also displays the remaining number of executions of the eye movement;
the second display module 1403 is further configured to update the remaining execution times in response to the progress pattern indicating that the eyeball motion is executed; and setting the progress indicated by the progress graph to zero for executing the eyeball action.
In an alternative implementation, the apparatus further includes:
the second display module 1403 is further configured to display an action setting control on the second interface;
And the third display module is used for responding to the triggering operation of the action setting control and displaying an action setting interface, wherein the action setting interface is used for setting the eyeball action indicated by the indication graph.
In an alternative implementation, the second display module 1403 is configured to display a first eye movement on the second interface in response to the usage information meeting a target condition; according to the acquired face image, determining an eyeball movement track; and in response to the eye movement track being the same as the first eye movement, sequentially displaying at least one interface for displaying eye movements different from the first eye movement.
In an alternative implementation, the second interface also displays the remaining execution times of the first eye movement;
the second display module 1403 is further configured to update the remaining execution times; and in response to the remaining execution times being zero, sequentially displaying at least one interface of the target application.
In an alternative implementation, the apparatus further includes:
the fourth display module is used for displaying a target condition setting interface;
and the request sending module is used for responding to the setting operation detected at the target condition setting interface and sending a target condition setting request to the server, wherein the target condition setting request carries the target condition.
In an alternative implementation, the second display module 1403 is further configured to display a delay rest control on the second interface; and responding to the triggering operation of the delay rest control, and displaying the second interface after delaying the target duration.
In an alternative implementation, the apparatus further includes:
the fifth display module is used for responding to the exit operation, displaying a progress display interface, wherein the progress display interface is used for displaying a plurality of execution progress controls, and one execution progress control is used for displaying the execution progress of one eyeball action;
and the sixth display module is used for displaying, in response to a trigger operation on the execution progress control corresponding to any eyeball action, the interface comprising that eyeball action.
The embodiment of the application provides an interface display method in which, when the accumulated usage of the target application reaches a time threshold, an indication graphic indicating an eyeball action is displayed on the second interface to guide the user to relax the eyes; eye tracking is then performed on the captured face images to determine the real-time position of the eyeballs, and the execution progress can be displayed according to the eye movement track, ensuring that the user performs the relevant actions as guided, effectively relieving the user's eye fatigue and improving human-computer interaction efficiency.
It should be noted that: the interface display device provided in the above embodiment is only exemplified by the above division of each functional module when an application program is running, and in practical application, the above functional allocation may be performed by different functional modules according to needs, i.e., the internal structure of the device is divided into different functional modules, so as to perform all or part of the functions described above. In addition, the interface display device and the interface display method embodiment provided in the foregoing embodiments belong to the same concept, and specific implementation processes of the interface display device and the interface display method embodiment are detailed in the method embodiment, which is not described herein again.
Fig. 15 is a block diagram of a terminal 1500 according to an embodiment of the present application. The terminal 1500 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 1500 can also be referred to as a user device, a portable terminal, a laptop terminal, a desktop terminal, and the like.
In general, the terminal 1500 includes: a processor 1501 and a memory 1502.
The processor 1501 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1501 may be implemented in at least one hardware form among a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1501 may also include a main processor and a coprocessor: the main processor, also called a CPU (Central Processing Unit), processes data in the awake state, while the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 1501 may be integrated with a GPU (Graphics Processing Unit) responsible for rendering the content to be displayed on the display screen. In some embodiments, the processor 1501 may also include an AI (Artificial Intelligence) processor for handling computing operations related to machine learning.
Memory 1502 may include one or more computer-readable storage media, which may be non-transitory. Memory 1502 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 1502 is used to store at least one program code for execution by processor 1501 to implement the interface display method provided by the method embodiments of the present application.
In some embodiments, the terminal 1500 may further optionally include: a peripheral interface 1503 and at least one peripheral device. The processor 1501, memory 1502 and peripheral interface 1503 may be connected by a bus or signal lines. The individual peripheral devices may be connected to the peripheral device interface 1503 via a bus, signal lines, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1504, a display screen 1505, a camera assembly 1506, audio circuitry 1507, a positioning assembly 1508, and a power supply 1509.
The peripheral interface 1503 may be used to connect at least one I/O (Input/Output) related peripheral device to the processor 1501 and the memory 1502. In some embodiments, the processor 1501, the memory 1502, and the peripheral interface 1503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1501, the memory 1502, and the peripheral interface 1503 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 1504 is configured to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 1504 communicates with communication networks and other communication devices via electromagnetic signals, converting electrical signals into electromagnetic signals for transmission and converting received electromagnetic signals into electrical signals. Optionally, the radio frequency circuit 1504 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1504 may communicate with other terminals via at least one wireless communication protocol, including but not limited to the World Wide Web, metropolitan area networks, intranets, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1504 may also include NFC (Near Field Communication) related circuits, which is not limited in the present application.
The display screen 1505 is used to display a UI (User Interface), which may include graphics, text, icons, video, and any combination thereof. When the display screen 1505 is a touch display screen, it also has the ability to collect touch signals at or above its surface. The touch signal may be input to the processor 1501 as a control signal for processing. At this point, the display screen 1505 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1505, disposed on the front panel of the terminal 1500; in other embodiments, there may be at least two display screens 1505, disposed on different surfaces of the terminal 1500 or in a folded design; in still other embodiments, the display screen 1505 may be a flexible display disposed on a curved or folded surface of the terminal 1500. The display screen 1505 may even be arranged in a non-rectangular irregular pattern, i.e., a shaped screen, and may be made of materials such as LCD (Liquid Crystal Display) or OLED (Organic Light-Emitting Diode).
The camera assembly 1506 is used to capture images or video. Optionally, the camera assembly 1506 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on the rear surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth camera, a wide-angle camera, and a telephoto camera, so that the main camera can be fused with the depth camera to realize a background blurring function, or fused with the wide-angle camera to realize panoramic and VR (Virtual Reality) shooting or other fusion shooting functions. In some embodiments, the camera assembly 1506 may also include a flash, which can be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash and can be used for light compensation under different color temperatures.
The audio circuit 1507 may include a microphone and a speaker. The microphone is used to collect sound waves from the user and the environment, convert them into electrical signals, and input them to the processor 1501 for processing or to the radio frequency circuit 1504 for voice communication. For stereo acquisition or noise reduction, a plurality of microphones may be disposed at different parts of the terminal 1500. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 1501 or the radio frequency circuit 1504 into sound waves. The speaker may be a conventional thin-film speaker or a piezoelectric ceramic speaker. A piezoelectric ceramic speaker can convert electrical signals not only into sound waves audible to humans but also into sound waves inaudible to humans, for ranging and other purposes. In some embodiments, the audio circuit 1507 may also include a headphone jack.
The positioning component 1508 is used to determine the current geographic location of the terminal 1500 to enable navigation or LBS (Location Based Service). The positioning component 1508 may be based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 1509 is used to supply power to the various components in the terminal 1500. The power supply 1509 may use alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 1509 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery (charged through a wired line) or a wireless rechargeable battery (charged through a wireless coil). The rechargeable battery may also support fast-charging technology.
In some embodiments, the terminal 1500 also includes one or more sensors 1510. The one or more sensors 1510 include, but are not limited to: acceleration sensor 1511, gyroscope sensor 1512, pressure sensor 1513, fingerprint sensor 1514, optical sensor 1515, and proximity sensor 1516.
The acceleration sensor 1511 may detect the magnitude of acceleration on the three axes of the coordinate system established with the terminal 1500. For example, the acceleration sensor 1511 may be used to detect the components of gravitational acceleration along the three axes. The processor 1501 may control the display screen 1505 to display the user interface in a landscape or portrait view based on the gravitational acceleration signal acquired by the acceleration sensor 1511. The acceleration sensor 1511 may also be used to collect motion data for games or for the user.
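The landscape/portrait decision from the gravity components can be sketched in a few lines. This is only a toy illustration (the function name and axis conventions are assumptions, not from the patent); a real system would add hysteresis and a z-axis check so a device lying flat keeps its last orientation.

```python
def orientation_from_gravity(gx: float, gy: float) -> str:
    """Choose a view orientation from gravity components (m/s^2) along
    the device's x axis (short edge) and y axis (long edge)."""
    if abs(gy) >= abs(gx):
        # Gravity runs mostly along the long edge: the device is upright.
        return "portrait"
    return "landscape"
```

Held upright, gravity lies mostly along y and the sketch returns `"portrait"`; turned on its side, gravity shifts to x and it returns `"landscape"`.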
The gyroscope sensor 1512 may detect the body orientation and rotation angle of the terminal 1500, and may cooperate with the acceleration sensor 1511 to collect the user's 3D motion of the terminal 1500. Based on the data collected by the gyroscope sensor 1512, the processor 1501 may implement functions such as motion sensing (e.g., changing the UI according to a tilting operation by the user), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1513 may be disposed on a side frame of the terminal 1500 and/or in a layer under the display screen 1505. When disposed on the side frame, it can detect the user's grip signal on the terminal 1500, from which the processor 1501 performs left/right-hand recognition or quick operations. When disposed under the display screen 1505, the processor 1501 controls operability controls on the UI according to the user's pressure operations on the display screen 1505. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
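A press under the display might first be classified by pressure magnitude before being dispatched to a UI control. The function and threshold below are illustrative assumptions, not values from the patent:

```python
def classify_press(pressure: float, light_max: float = 0.4) -> str:
    """Classify a normalized pressure reading in [0, 1] from the sensor
    layer under the display. The threshold is illustrative, not calibrated."""
    if pressure <= 0.0:
        return "none"
    return "light" if pressure <= light_max else "deep"
```

A dispatcher could then route a "light" press to an ordinary tap handler and a "deep" press to a quick-action menu.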
The fingerprint sensor 1514 is used to collect the user's fingerprint; either the processor 1501 identifies the user from the fingerprint collected by the fingerprint sensor 1514, or the fingerprint sensor 1514 itself identifies the user from the collected fingerprint. Upon recognizing the user's identity as trusted, the processor 1501 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings. The fingerprint sensor 1514 may be disposed on the front, back, or side of the terminal 1500. When a physical key or vendor logo is provided on the terminal 1500, the fingerprint sensor 1514 may be integrated with it.
The optical sensor 1515 is used to collect the ambient light intensity. In one embodiment, the processor 1501 may control the display brightness of the display screen 1505 based on the ambient light intensity collected by the optical sensor 1515: when the ambient light intensity is high, the display brightness is turned up; when it is low, the brightness is turned down. In another embodiment, the processor 1501 may also dynamically adjust the shooting parameters of the camera assembly 1506 based on the ambient light intensity collected by the optical sensor 1515.
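One plausible brightness policy is a clamped linear ramp from ambient lux to a brightness level. Real devices use tuned (often logarithmic) curves, so the function name, range, and constants here are assumptions for illustration only:

```python
def display_brightness(ambient_lux: float,
                       min_level: int = 10,
                       max_level: int = 255,
                       max_lux: float = 1000.0) -> int:
    """Map ambient light intensity (lux) to a display brightness level,
    linearly ramped and clamped to [min_level, max_level]."""
    lux = max(0.0, min(ambient_lux, max_lux))
    return round(min_level + (max_level - min_level) * lux / max_lux)
```

Darkness maps to the floor level, anything at or above `max_lux` saturates at the ceiling, and brighter rooms always get an equal or higher level in between.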
A proximity sensor 1516, also referred to as a distance sensor, is typically provided on the front panel of the terminal 1500. The proximity sensor 1516 is used to collect the distance between the user and the front of the terminal 1500. In one embodiment, when the proximity sensor 1516 detects a gradual decrease in the distance between the user and the front of the terminal 1500, the processor 1501 controls the display 1505 to switch from the on-screen state to the off-screen state; when the proximity sensor 1516 detects that the distance between the user and the front surface of the terminal 1500 gradually increases, the processor 1501 controls the display screen 1505 to switch from the off-screen state to the on-screen state.
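Because the proximity reading jitters near any single threshold, this on/off switching is usually implemented with hysteresis. The sketch below uses assumed distances (3 cm / 5 cm) and names of my own choosing, not values from the patent:

```python
class ProximityScreenController:
    """Switch the screen off when the user is near and back on when far,
    with separate thresholds so jitter cannot flicker the display."""

    def __init__(self, near_cm: float = 3.0, far_cm: float = 5.0):
        self.near_cm = near_cm
        self.far_cm = far_cm
        self.screen_on = True

    def on_distance(self, distance_cm: float) -> bool:
        if self.screen_on and distance_cm < self.near_cm:
            self.screen_on = False   # face is close: turn the screen off
        elif not self.screen_on and distance_cm > self.far_cm:
            self.screen_on = True    # face moved away: turn the screen on
        return self.screen_on
```

A reading between the two thresholds leaves the screen in its current state, which is the point of the hysteresis band.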
Those skilled in the art will appreciate that the structure shown in FIG. 15 does not constitute a limitation on the terminal 1500; more or fewer components than shown may be included, certain components may be combined, or a different arrangement of components may be employed.
Embodiments of the present application also provide a computer-readable storage medium applied to the terminal. The computer-readable storage medium stores at least one piece of program code, which is loaded and executed by a processor to implement the operations performed by the terminal in the interface display method of the above embodiments.
Embodiments of the present application also provide a computer program product or computer program comprising computer program code stored in a computer-readable storage medium. The processor of the terminal reads the computer program code from the computer-readable storage medium and executes it, so that the terminal performs the interface display method provided in the various alternative implementations described above.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk, or an optical disk.
The foregoing describes only preferred embodiments of the present application and is not intended to limit it; the scope of protection of the application is defined by the appended claims.

Claims (14)

1. An interface display method, characterized in that the method comprises:
displaying a first interface, wherein the first interface is any interface of a target application;
determining a single-use duration of the target application according to a use operation on the target application, wherein the use operation on the target application comprises starting the target application, switching the target application running in the background to the foreground, switching the target application running in the foreground to the background, and closing the target application; a terminal starts timing when the target application is started or when the target application running in the background is switched to the foreground, finishes timing when the target application is closed or when the target application running in the foreground is switched to the background, and takes the timed duration as the single-use duration of the target application;
sending the single-use duration to a server, wherein the server determines the accumulated usage duration of the target application within a target time range based on the single-use duration, the accumulated usage duration being used as the usage information of the target application;
in response to a received rest instruction, determining that the accumulated usage duration of the target application reaches a duration threshold and that the usage information satisfies a target condition; the rest instruction is sent to the terminal by the server upon determining that the accumulated usage duration of the target application within the target time range reaches the duration threshold; wherein the server is further configured to send an early-warning instruction to the terminal when the accumulated usage duration of the target application within the target time range approaches the duration threshold, so that, based on the early-warning instruction, the terminal sends a timing request to the server when the user starts to use the target application, and the terminal and the server time together until the accumulated usage duration of the target application reaches the duration threshold;
and in response to the usage information meeting the target condition, displaying a second interface, wherein the second interface is used for displaying indication information, and the indication information is used for guiding the user to rest.
2. The method according to claim 1, wherein the method further comprises:
performing facial expression recognition according to an acquired facial image, and using the recognition result of the facial expression recognition as the usage information of the target application, wherein the recognition result is used for indicating whether the user is in a fatigue state.
3. The method of claim 1, wherein the displaying a second interface in response to the usage information meeting the target condition comprises:
in response to the usage information meeting the target condition, displaying an indication graph on the second interface, wherein the indication graph is used for indicating an eyeball action;
performing eye tracking according to the acquired facial image to obtain the real-time position of the eyeball;
and displaying a progress graph on the periphery of the indication graph according to the real-time position of the eyeball, wherein the progress graph is used for indicating the progress of executing the eyeball action.
4. The method according to claim 3, wherein after displaying a progress graph on the periphery of the indication graph according to the real-time position of the eyeball, the method further comprises:
and in response to the progress graph indicating that the eyeball action has been executed, displaying a next indication graph, wherein the eyeball action indicated by the next indication graph is different from the eyeball action indicated by the indication graph.
5. The method according to claim 3, wherein the second interface further displays a remaining number of executions of the eyeball action;
after displaying a progress graph on the periphery of the indication graph according to the real-time position of the eyeball, the method further comprises:
updating the remaining number of executions in response to the progress graph indicating that the eyeball action has been executed;
and resetting to zero the progress of executing the eyeball action indicated by the progress graph.
6. The method according to claim 3, wherein after displaying the second interface in response to the usage information meeting the target condition, the method further comprises:
displaying an action setting control on the second interface;
and responding to the triggering operation of the action setting control, displaying an action setting interface, wherein the action setting interface is used for setting the eyeball action indicated by the indication graph.
7. The method of claim 1, wherein the displaying a second interface in response to the usage information meeting the target condition comprises:
in response to the usage information meeting the target condition, displaying a first eyeball action on the second interface;
determining an eyeball movement track according to the acquired facial image;
and in response to the eyeball movement track being the same as the first eyeball action, sequentially displaying at least one interface, wherein the at least one interface is used for displaying eyeball actions different from the first eyeball action.
8. The method of claim 7, wherein the second interface further displays a remaining number of executions of the first eyeball action;
before sequentially displaying the at least one interface of the target application, the method further comprises:
updating the remaining number of executions;
and in response to the remaining number of executions being zero, performing the step of sequentially displaying the at least one interface of the target application.
9. The method of claim 1, wherein the method further comprises, prior to displaying a second interface in response to the usage information meeting the target condition:
displaying a target condition setting interface;
and responding to the setting operation detected at the target condition setting interface, and sending a target condition setting request to a server, wherein the target condition setting request carries the target condition.
10. The method of claim 1, wherein after displaying a second interface in response to the usage information meeting the target condition, the method further comprises:
displaying a delay rest control on the second interface;
and in response to a triggering operation on the delay rest control, displaying the second interface again after a delay of a target duration.
11. The method according to claim 1, wherein the method further comprises:
in response to an exit operation, displaying a progress display interface, wherein the progress display interface is used for displaying a plurality of execution progress controls, and each execution progress control is used for displaying the execution progress of one eyeball action;
and in response to a triggering operation on the execution progress control corresponding to any eyeball action, displaying an interface comprising the eyeball action.
12. An interface display device, the device comprising:
the first display module is used for displaying a first interface, wherein the first interface is any interface of a target application;
the detection module is used for determining a single-use duration of the target application according to a use operation on the target application, wherein the use operation on the target application comprises starting the target application, switching the target application running in the background to the foreground, switching the target application running in the foreground to the background, and closing the target application; the terminal starts timing when the target application is started or when the target application running in the background is switched to the foreground, finishes timing when the target application is closed or when the target application running in the foreground is switched to the background, and takes the timed duration as the single-use duration of the target application; the single-use duration is sent to a server, and the server determines the accumulated usage duration of the target application within a target time range based on the single-use duration, the accumulated usage duration being used as the usage information of the target application;
the second display module is used for, in response to a received rest instruction, determining that the accumulated usage duration of the target application reaches a duration threshold and that the usage information satisfies a target condition; the rest instruction is sent to the terminal by the server upon determining that the accumulated usage duration of the target application within the target time range reaches the duration threshold; wherein the server is further configured to send an early-warning instruction to the terminal when the accumulated usage duration of the target application within the target time range approaches the duration threshold, so that, based on the early-warning instruction, the terminal sends a timing request to the server when the user starts to use the target application, and the terminal and the server time together until the accumulated usage duration of the target application reaches the duration threshold; and in response to the usage information meeting the target condition, displaying a second interface, wherein the second interface is used for displaying indication information, and the indication information is used for guiding the user to rest.
13. A computer device, comprising a processor and a memory, wherein the memory is used for storing at least one piece of program code, and the at least one piece of program code is loaded and executed by the processor to perform the interface display method of any one of claims 1 to 11.
14. A storage medium, storing at least one piece of program code, wherein the at least one piece of program code is used to perform the interface display method of any one of claims 1 to 11.
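The single-use timing in claim 1 (start timing on launch or a background-to-foreground switch, stop on close or a foreground-to-background switch) can be sketched as follows. The class and method names are illustrative, not from the claims, and a clock function is injected so the logic is testable:

```python
import time


class UsageTimer:
    """Accumulate one single-use duration per foreground session."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock
        self._started_at = None

    def on_foreground(self):
        # App launched, or switched from background to foreground: start timing.
        self._started_at = self._clock()

    def on_background(self) -> float:
        # App closed, or switched from foreground to background: the timed
        # duration is the single-use duration to report.
        duration = self._clock() - self._started_at
        self._started_at = None
        return duration
```

The returned duration would then be sent to the server, which accumulates it over the target time range and compares the total against the duration threshold.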
CN202010927582.3A 2020-09-07 2020-09-07 Interface display method, device, terminal and storage medium Active CN114153361B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010927582.3A CN114153361B (en) 2020-09-07 2020-09-07 Interface display method, device, terminal and storage medium


Publications (2)

Publication Number Publication Date
CN114153361A CN114153361A (en) 2022-03-08
CN114153361B true CN114153361B (en) 2023-08-22

Family

ID=80460791

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010927582.3A Active CN114153361B (en) 2020-09-07 2020-09-07 Interface display method, device, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN114153361B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115291783A (en) * 2022-06-30 2022-11-04 中国第一汽车股份有限公司 Interface operation method and device, electronic equipment and storage medium

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106502859A (en) * 2016-10-11 2017-03-15 北京小米移动软件有限公司 The method and device of control terminal equipment
CN106533735A (en) * 2016-10-11 2017-03-22 北京奇虎科技有限公司 Mobile terminal use behavior monitoring method and device, server and system
CN108888487A (en) * 2018-05-22 2018-11-27 深圳奥比中光科技有限公司 A kind of eyeball training system and method
WO2019016406A1 (en) * 2017-07-21 2019-01-24 Stecnius Ug (Haftungsbeschraenkt) Method for guiding sequences of movements and training device for guiding sequences of movements
CN109656504A (en) * 2018-12-11 2019-04-19 北京锐安科技有限公司 Screen eye care method, device, terminal and storage medium
CN109771952A (en) * 2018-12-28 2019-05-21 努比亚技术有限公司 Based reminding method, terminal and computer readable storage medium based on game fatigue strength
CN109828731A (en) * 2018-12-18 2019-05-31 维沃移动通信有限公司 A kind of searching method and terminal device
CN109885362A (en) * 2018-11-30 2019-06-14 努比亚技术有限公司 Terminal and its eyeshield control method and computer readable storage medium
CN110007758A (en) * 2019-03-26 2019-07-12 维沃移动通信有限公司 A kind of control method and terminal of terminal
WO2019144814A1 (en) * 2018-01-26 2019-08-01 维沃移动通信有限公司 Display screen control method and mobile terminal
CN111281762A (en) * 2018-12-07 2020-06-16 广州幻境科技有限公司 Vision rehabilitation training method and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200026523A1 (en) * 2018-06-26 2020-01-23 Bryan Allen Young System and method for limiting maximum run time for an application


Also Published As

Publication number Publication date
CN114153361A (en) 2022-03-08


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant