CN114237119A - Display screen control method and device - Google Patents

Display screen control method and device

Info

Publication number
CN114237119A
CN114237119A
Authority
CN
China
Prior art keywords
display screen
control instruction
target
eye
target object
Prior art date
Legal status
Pending
Application number
CN202111547767.2A
Other languages
Chinese (zh)
Inventor
王鹏飞
聂利波
熊剑
Current Assignee
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai and Zhuhai Lianyun Technology Co Ltd
Priority to CN202111547767.2A
Publication of CN114237119A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/04 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042 - Programme control other than numerical control, i.e. in sequence controllers or logic controllers using digital processors
    • G05B19/0423 - Input/output
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 - Program-control systems
    • G05B2219/20 - Pc systems
    • G05B2219/25 - Pc structure of the system
    • G05B2219/25257 - Microcontroller

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a display screen control method and device. The method comprises: acquiring eye action information of a target object; determining a target control instruction corresponding to the eye action information based on a preset mapping relation between eye actions and display screen control instructions; and controlling the display screen to execute the operation corresponding to the target control instruction. The method and device address the technical problems in the related art that the control of display screens in smart home devices is not intelligent enough and the user experience is poor.

Description

Display screen control method and device
Technical Field
The application relates to the technical field of intelligent equipment control, in particular to a display screen control method and device.
Background
With the development of technology, it has become increasingly common for air conditioner controllers and control panels to use digital tubes or LED screens for display, mainly because a screen display offers more advantages than ordinary LED indicator lamps, such as displaying digits and symbols, which indicator lamps cannot do. However, LED screen displays also have drawbacks: their brightness is high and wastes electric energy, and when the unit is installed in a bedroom, the bright screen may disturb the user's rest at night, so the applicability is relatively poor. When a user wants to turn the LED screen off, the user has to operate a remote controller to force the screen off; likewise, lighting the screen again also requires the remote controller. The user therefore has to operate the screen actively, which is cumbersome, places an extra burden on elderly users in particular, is not user-friendly, and results in a poor user experience.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the present application provide a display screen control method and device, so as to at least solve the technical problems in the related art that the control of a display screen in smart home equipment is not intelligent enough and the user experience is poor.
According to an aspect of an embodiment of the present application, there is provided a display screen control method including: acquiring eye movement information of a target object; determining a target control instruction corresponding to the eye action information based on a mapping relation between preset eye actions and display screen control instructions; and controlling a display screen to execute the operation corresponding to the target control instruction.
Optionally, the display screen is a display screen on the household equipment; the display screen control instruction at least comprises: the first control instruction is used for controlling the display screen to be lightened at preset brightness, and the second control instruction is used for controlling the display screen to be extinguished.
Optionally, when the target control instruction is the first control instruction, controlling the display screen to be lit at the preset brightness, and displaying the running state information of the household equipment; when the display screen is in a lighting state and the target control instruction is the second control instruction, judging whether the running state of the household equipment is normal or not based on the running state information; if the household equipment runs normally, controlling the display screen to be turned off; and if the household equipment runs abnormally, refusing to execute the second control instruction, and controlling the display screen to display the abnormal state information of the household equipment.
Optionally, when the display screen is in a lighting state and the home equipment operates normally, if the eye movement information of the target object is not detected within a second preset time period, the display screen is controlled to be turned off.
Optionally, before acquiring eye movement information of a target object, detecting a first distance between the target object and the display screen; and when the first distance is smaller than a preset distance threshold, acquiring the eye movement information of the target object.
Optionally, the obtaining of the eye movement information of the target object includes: acquiring continuous multi-frame facial images of the target object within a first preset time period; and identifying the multi-frame facial images based on an image identification algorithm to obtain the eye action information of the target object.
Optionally, the home equipment stores a plurality of mapping relationships set by a plurality of objects, where the plurality of objects correspond to the plurality of mapping relationships one to one, and the determining, based on a mapping relationship between a preset eye action and a display screen control instruction, a target control instruction corresponding to the eye action information includes: acquiring target identity information of the target object; determining a target mapping relation corresponding to the target object from a plurality of mapping relations based on the target identity information; and determining a target control instruction corresponding to the eye action information based on the target mapping relation.
According to another aspect of the embodiments of the present application, there is also provided a display screen control apparatus, including: the acquisition module is used for acquiring the eye action information of the target object; the determining module is used for determining a target control instruction corresponding to the eye action information based on a mapping relation between preset eye actions and display screen control instructions; and the control module is used for controlling the display screen to execute the operation corresponding to the target control instruction.
According to another aspect of the embodiments of the present application, a non-volatile storage medium is further provided, where the non-volatile storage medium includes a stored program, and when the program runs, the device where the non-volatile storage medium is located is controlled to execute the above display screen control method.
According to another aspect of the embodiments of the present application, there is also provided a processor, configured to execute a program, where the program executes the display screen control method described above.
In the embodiment of the application, the smart home device may acquire the eye movement information of the target object, determine the target control instruction corresponding to the eye movement information based on the mapping relationship between the preset eye movement and the display screen control instruction, and control the display screen to execute the operation corresponding to the target control instruction. The display screen can be controlled to execute the corresponding control instruction by pre-establishing the mapping relation between different eye actions and the display screen control instruction and identifying the eye action information of the user, so that the technical problems that the control of the display screen in the intelligent household equipment is not intelligent enough and the user experience is poor in the related technology are solved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a flow chart illustrating a display screen control method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a display screen control apparatus according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
For a better understanding of the embodiments of the present application, some of the terms or expressions appearing in the course of describing the embodiments of the present application are to be interpreted as follows:
image recognition: image recognition is an important field of artificial intelligence, and refers to a technology for recognizing targets and objects in various modes by processing, analyzing and understanding images by using a computer. The specific process generally comprises the following steps: 1. acquiring information; 2. information preprocessing is carried out; 3. extracting and selecting features; 4. designing a classifier and a classification decision.
A commonly used image recognition library is OpenCV, an open-source, cross-platform computer vision and machine learning software library that runs on Linux, Windows, Android and Mac OS. It is composed of a series of C functions and a small number of C++ classes, provides interfaces for languages such as Python, Ruby and MATLAB, is lightweight and efficient, and implements many common algorithms in image processing and computer vision.
Example 1
In accordance with an embodiment of the present application, a display screen control method is provided. It should be noted that the steps shown in the flowchart of the drawings may be performed in a computer system, such as one executing a set of computer-executable instructions, and that although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from the one here.
Fig. 1 is a schematic flowchart of an alternative display screen control method according to an embodiment of the present application, and as shown in fig. 1, the method at least includes steps S102-S106, where:
step S102, eye movement information of the target object is acquired.
In some optional embodiments of the present application, before the eye action information of the target object is acquired, a first distance between the target object and the display screen may be detected; the eye action information of the target object is acquired only when the first distance is smaller than a preset distance threshold.
The display screen here is mainly the display screen on a smart home device, which may be a smart air conditioner, a smart refrigerator, a smart television or the like. The preset distance threshold is the effective distance within which the user can control the smart home device's display screen with eye actions; it can be set by the user according to personal habits and is not specifically limited here.
For example, suppose the distance threshold the user sets for the smart air conditioner's display screen is 2 m. When the user wants to control the display screen with eye actions, the user needs to move within 2 m of it; the smart air conditioner then detects that the first distance between the user and the display screen is smaller than the preset distance threshold and can acquire the user's eye action information with its front-facing camera. When the detected first distance is larger than the preset distance threshold, the user is considered to have no intention of controlling the display screen, and the eye action information is not acquired.
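This proximity gate can be captured in a few lines. The sketch below is illustrative only and assumes a 2 m threshold as in the example; the distance reading and the eye-action capture are passed in as callables, since the application does not specify which sensor or camera interface is used.

```python
# Minimal sketch of the distance-gated capture; read_distance_m and
# capture_eye_action are hypothetical hooks, not interfaces defined here.
DISTANCE_THRESHOLD_M = 2.0  # user-configurable effective control distance


def maybe_capture_eye_action(read_distance_m, capture_eye_action):
    """Capture eye-action information only when the user is close enough."""
    first_distance = read_distance_m()
    if first_distance < DISTANCE_THRESHOLD_M:
        return capture_eye_action()
    # Too far away: treat as no intention to control the screen.
    return None
```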
In some optional embodiments of the present application, when obtaining eye movement information of a target object, continuous multi-frame facial images of the target object within a first preset time period may be collected, and then the multi-frame facial images are identified based on an image identification algorithm to obtain the eye movement information of the target object.
For example, to acquire the user's eye action information, the smart air conditioner may use its front-facing camera to capture continuous facial image frames of the user over 5 s and feed them into a pre-trained eye-action recognition model, which may recognize the user's eye actions based on an OpenCV algorithm, such as keeping the eyes open for 3 s, keeping the eyes closed for 3 s, or blinking twice within 2 s. Different eye actions can then be used to control the smart air conditioner's display screen to perform different operations.
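As a rough illustration of this recognition step, the sketch below classifies a 5 s burst of frames into the three eye actions mentioned above using OpenCV Haar cascades. It is only a stand-in for the pre-trained eye-action model described in the application; the frame rate, cascade choice and action names are assumptions.

```python
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")


def eyes_open(frame):
    """Return True if at least one open eye is detected in the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[y:y + h // 2, x:x + w]  # eyes sit in the upper half of the face
        if len(eye_cascade.detectMultiScale(roi, 1.1, 5)) > 0:
            return True
    return False


def classify_eye_action(frames, fps=10):
    """Map ~5 s of frames to one of the eye actions used in the examples."""
    states = [eyes_open(f) for f in frames]
    max_open = max_closed = run = blinks = 0
    for i, cur in enumerate(states):
        if i > 0 and states[i - 1] != cur:
            if states[i - 1] and not cur:  # open -> closed edge, counted as a blink
                blinks += 1
            run = 0
        run += 1
        if cur:
            max_open = max(max_open, run)
        else:
            max_closed = max(max_closed, run)
    if max_closed >= 3 * fps:
        return "eyes_closed_3s"
    if max_open >= 3 * fps:
        return "eyes_open_3s"
    if blinks >= 2:                        # window-level approximation of "twice within 2 s"
        return "blink_twice_2s"
    return "no_action"
```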
And step S104, determining a target control instruction corresponding to the eye action information based on the mapping relation between the preset eye action and the display screen control instruction.
And step S106, controlling the display screen to execute the operation corresponding to the target control instruction.
In some optional embodiments of the present application, the display screen control instruction mainly includes: the first control instruction is used for controlling the display screen to be lightened at preset brightness, and the second control instruction is used for controlling the display screen to be extinguished. The preset brightness may include a plurality of different brightnesses, such as a common 100% brightness (i.e., the display screen is fully lighted) and a common 50% brightness (i.e., the display screen is half lighted), so that the first control instruction may be used to control lighting of the display screen, and may also be used to control adjustment of the brightness of the display screen.
In order to implement the purpose of controlling the display screen to execute different operations by using different eye actions, a mapping relationship between the eye actions and the display screen control instructions may be pre-established, and the embodiment of the present application provides several selectable mapping relationships as follows:
Mapping relation one: keeping the eyes open for 3 s corresponds to fully lighting the display screen; blinking twice within 2 s corresponds to half-lighting the display screen; keeping the eyes closed for 3 s corresponds to turning the display screen off.
Taking control of a smart air conditioner's display screen as an example: with the screen off, when the smart air conditioner detects that the user keeps the eyes open for 3 s, the user is considered to be watching the display screen and wanting to control it, so the display screen can be fully lit; when the user is detected blinking twice within 2 s, the user is considered to want the display screen lit but not at full brightness, so the display screen can be half-lit; with the screen lit, when the user is detected keeping the eyes closed for 3 s, the user is considered to no longer want to use the display screen, so the display screen can be turned off.
Mapping relation two: the brightness of the display screen is controlled mainly by how long the user gazes at it; keeping the eyes open for 2 s corresponds to half-lighting the display screen; keeping the eyes open for 4 s corresponds to fully lighting the display screen; keeping the eyes closed for 3 s corresponds to turning the display screen off.
Still taking the smart air conditioner's display screen as an example: with the screen off, when the smart air conditioner detects that the user keeps the eyes open for 4 s, the user is considered to want to control the display screen, so it can be fully lit; when the user is detected keeping the eyes open for 2 s, the user is considered to want the display screen lit but not at full brightness, so it can be half-lit; with the screen lit, when the user is detected keeping the eyes closed for 3 s, the user is considered to no longer want to use the display screen, so it can be turned off.
It should be noted that the above mapping relations are only illustrative; in actual use, the user may flexibly set the mapping relation between eye actions and display screen control instructions according to personal habits, and the mapping relation is not specifically limited here.
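One straightforward way to hold such a user-configurable mapping is a lookup table. The sketch below encodes mapping relation one; the action keys, instruction names and brightness values are illustrative assumptions, not terms defined by the application.

```python
# First control instruction: light the screen at a preset brightness.
# Second control instruction: turn the screen off.
FIRST_CONTROL = "light_screen"
SECOND_CONTROL = "turn_off"

# Mapping relation one from the text: eye action -> (instruction, brightness).
MAPPING_ONE = {
    "eyes_open_3s":   (FIRST_CONTROL, 1.0),    # fully light the display screen
    "blink_twice_2s": (FIRST_CONTROL, 0.5),    # half-light the display screen
    "eyes_closed_3s": (SECOND_CONTROL, None),  # turn the display screen off
}


def target_instruction(eye_action, mapping=MAPPING_ONE):
    """Return (instruction, brightness), or None for an unmapped eye action."""
    return mapping.get(eye_action)
```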
Considering that in a home scenario several family members may need to control the smart home device's display screen, a single shared mapping relation would be monotonous and impersonal for each member; the smart home device can therefore store a personalized mapping relation set by each family member.
Specifically, in some optional embodiments of the present application, a plurality of mapping relationships set by a plurality of objects are stored in the home equipment, where the plurality of objects correspond to the plurality of mapping relationships one to one, and when determining a target control instruction corresponding to the eye movement information based on the mapping relationships, target identity information of a target object may be obtained first, then, based on the target identity information, a target mapping relationship corresponding to the target object is determined from the plurality of mapping relationships, and then, based on the target mapping relationship, the target control instruction corresponding to the eye movement information is determined.
For example, the smart air conditioner stores the mapping relation between eye actions and display screen control instructions set by each family member. In practice, when the smart air conditioner captures a facial image of a target family member to recognize that member's eye action information, it can also determine the member's identity through a face recognition algorithm; once the identity is determined, the target mapping relation corresponding to that member can be selected from the stored mapping relations, and the target control instruction corresponding to the member's eye action is determined based on that mapping relation. Suppose the first family member set mapping relation one and the second family member set mapping relation two. When the smart air conditioner detects the first family member blinking twice within 2 s, it can determine that the target control instruction is to half-light the display screen and then controls the display screen to be half-lit. When it detects the second family member blinking twice within 2 s, there is no control instruction corresponding to that eye action for the second member, so the action is treated as invalid; the smart air conditioner performs no operation and may generate prompt information to remind the second family member that an invalid eye action was entered.
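A sketch of this per-member lookup is given below: the device first identifies the member, then resolves the eye action against that member's own table. The identify_member hook stands in for whatever face-recognition routine the device uses, and the member IDs and abridged tables are assumptions.

```python
# Per-member mapping tables (abridged): member A uses mapping relation one,
# member B has not mapped the double blink to anything.
MAPPINGS_BY_MEMBER = {
    "member_a": {"blink_twice_2s": ("light_screen", 0.5)},
    "member_b": {},
}


def resolve_instruction(face_image, eye_action, identify_member):
    """Return (instruction, prompt); prompt is set when the action is invalid."""
    member_id = identify_member(face_image)       # face recognition (assumed hook)
    mapping = MAPPINGS_BY_MEMBER.get(member_id, {})
    instruction = mapping.get(eye_action)
    if instruction is None:
        # Unmapped action: perform no operation and prompt that the input was invalid.
        return None, "invalid eye action for this user"
    return instruction, None
```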
In some optional embodiments of the present application, when the display screen is controlled to execute an operation corresponding to the target control instruction, if the target control instruction is the first control instruction, the display screen may be controlled to be lit at a preset brightness, and the running state information of the home equipment is displayed; when the display screen is in a lighting state and the target control instruction is the second control instruction, whether the running state of the household equipment is normal or not can be judged firstly based on the running state information; if the household equipment runs normally, controlling the display screen to be turned off; and if the household equipment runs abnormally, refusing to execute the second control instruction, and controlling the display screen to display abnormal state information of the household equipment.
For example, with the smart air conditioner's display screen off, if the smart air conditioner detects a family member keeping the eyes open for 3 s, it can determine from the mapping relation that the target control instruction is the first control instruction for fully lighting the display screen; the display screen is then controlled to display at 100% brightness and to show the smart air conditioner's running state information. If it subsequently detects the family member keeping the eyes closed for 3 s, it can determine that the target control instruction is the second control instruction for turning the display screen off; the smart air conditioner first checks its own running state, and if it is running normally, it turns the display screen off in response to the second control instruction. If the smart air conditioner is running abnormally, it refuses to turn off the display screen and instead controls it to display the abnormal state information, so as to remind the user to resolve the fault in time.
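The execution step with this abnormal-state guard can be sketched as follows. The device object and its screen and running-state accessors are assumed interfaces used only for illustration, not the application's actual API.

```python
FIRST_CONTROL = "light_screen"   # light at the preset brightness
SECOND_CONTROL = "turn_off"      # turn the display screen off


def execute(instruction, brightness, device):
    """Drive the display screen, refusing to turn it off when the device is abnormal."""
    if instruction == FIRST_CONTROL:
        device.screen.light(brightness)              # e.g. 1.0 for full, 0.5 for half
        device.screen.show(device.running_state().summary)
    elif instruction == SECOND_CONTROL:
        state = device.running_state()
        if state.is_normal:
            device.screen.turn_off()
        else:
            # Refuse the second control instruction; show the abnormal-state info instead.
            device.screen.show(state.abnormal_info)
```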
In some optional embodiments of the present application, when the display screen is in a lit state and the home equipment operates normally, if the eye movement information of the target object is not detected within the second preset time period, the display screen is controlled to be turned off.
It can be understood that, from an energy-saving perspective, if the home device receives no control instruction from the user for a long time, the display screen can be turned off automatically; the second preset time period can be set by the user, for example to 10 min. For instance, if the smart air conditioner is running normally with the display screen lit, and no eye action information from the user is detected within 10 min, it automatically turns the display screen off to save power.
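A minimal sketch of this idle timeout is shown below, assuming a 10 min second preset time period; the detection and screen hooks are placeholders rather than interfaces defined by the application.

```python
import time

IDLE_TIMEOUT_S = 10 * 60  # second preset time period (user-settable, 10 min here)


def idle_watchdog(detect_eye_action, turn_off_screen, poll_s=1.0):
    """Turn the screen off after IDLE_TIMEOUT_S without any detected eye action."""
    last_seen = time.monotonic()
    while True:
        if detect_eye_action():
            last_seen = time.monotonic()
        elif time.monotonic() - last_seen >= IDLE_TIMEOUT_S:
            turn_off_screen()
            return
        time.sleep(poll_s)
```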
In the embodiment of the application, the smart home device may acquire the eye movement information of the target object, determine the target control instruction corresponding to the eye movement information based on the mapping relationship between the preset eye movement and the display screen control instruction, and control the display screen to execute the operation corresponding to the target control instruction. The display screen can be controlled to execute the corresponding control instruction by pre-establishing the mapping relation between different eye actions and the display screen control instruction and identifying the eye action information of the user, so that the technical problems that the control of the display screen in the intelligent household equipment is not intelligent enough and the user experience is poor in the related technology are solved.
Example 2
According to an embodiment of the present application, there is also provided a display screen control apparatus for implementing the display screen control method in embodiment 1, as shown in fig. 2, the apparatus at least includes an obtaining module 20, a determining module 22 and a control module 24, where:
an obtaining module 20, configured to obtain eye movement information of the target object.
In some optional embodiments of the present application, before the eye action information of the target object is acquired, a first distance between the target object and the display screen may be detected; the eye action information of the target object is acquired only when the first distance is smaller than a preset distance threshold.
The display screen here is mainly the display screen on a smart home device, which may be a smart air conditioner, a smart refrigerator, a smart television or the like. The preset distance threshold is the effective distance within which the user can control the smart home device's display screen with eye actions; it can be set by the user according to personal habits and is not specifically limited here.
For example, suppose the distance threshold the user sets for the smart air conditioner's display screen is 2 m. When the user wants to control the display screen with eye actions, the user needs to move within 2 m of it; the smart air conditioner then detects that the first distance between the user and the display screen is smaller than the preset distance threshold and can acquire the user's eye action information with its front-facing camera. When the detected first distance is larger than the preset distance threshold, the user is considered to have no intention of controlling the display screen, and the eye action information is not acquired.
In some optional embodiments of the present application, when obtaining eye movement information of a target object, continuous multi-frame facial images of the target object within a first preset time period may be collected, and then the multi-frame facial images are identified based on an image identification algorithm to obtain the eye movement information of the target object.
For example, to acquire the user's eye action information, the smart air conditioner may use its front-facing camera to capture continuous facial image frames of the user over 5 s and feed them into a pre-trained eye-action recognition model, which may recognize the user's eye actions based on an OpenCV algorithm, such as keeping the eyes open for 3 s, keeping the eyes closed for 3 s, or blinking twice within 2 s. Different eye actions can then be used to control the smart air conditioner's display screen to perform different operations.
The determining module 22 is configured to determine a target control instruction corresponding to the eye movement information based on a mapping relationship between a preset eye movement and a display screen control instruction;
and the control module 24 is used for controlling the display screen to execute the operation corresponding to the target control instruction.
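The three modules can be pictured as a thin pipeline, as in the sketch below; the class and method names are illustrative assumptions rather than the application's actual structure.

```python
class DisplayScreenControlApparatus:
    """Wires the obtaining, determining and control modules together."""

    def __init__(self, obtaining_module, determining_module, control_module):
        self.obtaining = obtaining_module      # acquires eye-action information
        self.determining = determining_module  # maps an eye action to a target instruction
        self.control = control_module          # makes the display screen execute it

    def handle_once(self):
        eye_action = self.obtaining.get_eye_action()
        instruction = self.determining.to_instruction(eye_action)
        if instruction is not None:
            self.control.execute(instruction)
```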
In some optional embodiments of the present application, the display screen control instruction mainly includes: the first control instruction is used for controlling the display screen to be lightened at preset brightness, and the second control instruction is used for controlling the display screen to be extinguished. The preset brightness may include a plurality of different brightnesses, such as a common 100% brightness (i.e., the display screen is fully lighted) and a common 50% brightness (i.e., the display screen is half lighted), so that the first control instruction may be used to control lighting of the display screen, and may also be used to control adjustment of the brightness of the display screen.
In order to implement the purpose of controlling the display screen to execute different operations by using different eye actions, a mapping relationship between the eye actions and the display screen control instructions may be pre-established, and the embodiment of the present application provides several selectable mapping relationships as follows:
Mapping relation one: keeping the eyes open for 3 s corresponds to fully lighting the display screen; blinking twice within 2 s corresponds to half-lighting the display screen; keeping the eyes closed for 3 s corresponds to turning the display screen off.
Mapping relation two: the brightness of the display screen is controlled mainly by how long the user gazes at it; keeping the eyes open for 2 s corresponds to half-lighting the display screen; keeping the eyes open for 4 s corresponds to fully lighting the display screen; keeping the eyes closed for 3 s corresponds to turning the display screen off.
It should be noted that the above mapping relations are only illustrative; in actual use, the user may flexibly set the mapping relation between eye actions and display screen control instructions according to personal habits, and the mapping relation is not specifically limited here.
Considering that in a home scenario several family members may need to control the smart home device's display screen, a single shared mapping relation would be monotonous and impersonal for each member; the smart home device can therefore store a personalized mapping relation set by each family member.
Specifically, in some optional embodiments of the present application, a plurality of mapping relationships set by a plurality of objects are stored in the home equipment, where the plurality of objects correspond to the plurality of mapping relationships one to one, and when determining a target control instruction corresponding to the eye movement information based on the mapping relationships, target identity information of a target object may be obtained first, then, based on the target identity information, a target mapping relationship corresponding to the target object is determined from the plurality of mapping relationships, and then, based on the target mapping relationship, the target control instruction corresponding to the eye movement information is determined.
For example, the smart air conditioner stores the mapping relation between eye actions and display screen control instructions set by each family member. In practice, when the smart air conditioner captures a facial image of a target family member to recognize that member's eye action information, it can also determine the member's identity through a face recognition algorithm; once the identity is determined, the target mapping relation corresponding to that member can be selected from the stored mapping relations, and the target control instruction corresponding to the member's eye action is determined based on that mapping relation. Suppose the first family member set mapping relation one and the second family member set mapping relation two. When the smart air conditioner detects the first family member blinking twice within 2 s, it can determine that the target control instruction is to half-light the display screen and then controls the display screen to be half-lit. When it detects the second family member blinking twice within 2 s, there is no control instruction corresponding to that eye action for the second member, so the action is treated as invalid; the smart air conditioner performs no operation and may generate prompt information to remind the second family member that an invalid eye action was entered.
In some optional embodiments of the present application, when the display screen is controlled to execute an operation corresponding to the target control instruction, if the target control instruction is the first control instruction, the display screen may be controlled to be lit at a preset brightness, and the running state information of the home equipment is displayed; when the display screen is in a lighting state and the target control instruction is the second control instruction, whether the running state of the household equipment is normal or not can be judged firstly based on the running state information; if the household equipment runs normally, controlling the display screen to be turned off; and if the household equipment runs abnormally, refusing to execute the second control instruction, and controlling the display screen to display abnormal state information of the household equipment.
For example, with the smart air conditioner's display screen off, if the smart air conditioner detects a family member keeping the eyes open for 3 s, it can determine from the mapping relation that the target control instruction is the first control instruction for fully lighting the display screen; the display screen is then controlled to display at 100% brightness and to show the smart air conditioner's running state information. If it subsequently detects the family member keeping the eyes closed for 3 s, it can determine that the target control instruction is the second control instruction for turning the display screen off; the smart air conditioner first checks its own running state, and if it is running normally, it turns the display screen off in response to the second control instruction. If the smart air conditioner is running abnormally, it refuses to turn off the display screen and instead controls it to display the abnormal state information, so as to remind the user to resolve the fault in time.
In some optional embodiments of the present application, when the display screen is in a lit state and the home equipment operates normally, if the eye movement information of the target object is not detected within the second preset time period, the display screen is controlled to be turned off.
It can be understood that, from an energy-saving perspective, if the home device receives no control instruction from the user for a long time, the display screen can be turned off automatically; the second preset time period can be set by the user, for example to 10 min. For instance, if the smart air conditioner is running normally with the display screen lit, and no eye action information from the user is detected within 10 min, it automatically turns the display screen off to save power.
It should be noted that, in the embodiment of the present application, each module in the display screen control apparatus corresponds to the implementation steps of the display screen control method in embodiment 1 one to one, and since the detailed description is already performed in embodiment 1, some details that are not shown in this embodiment may refer to embodiment 1, and are not described herein again.
Example 3
According to an embodiment of the present application, there is also provided a nonvolatile storage medium including a stored program, wherein, when the program is executed, a device in which the nonvolatile storage medium is located is controlled to execute the display screen control method in embodiment 1.
According to an embodiment of the present application, there is also provided a processor, wherein the processor is configured to execute a program, and the display screen control method in embodiment 1 is executed when the program is executed.
Specifically, when the program runs, the following steps can be executed: acquiring eye movement information of a target object; determining a target control instruction corresponding to the eye action information based on a mapping relation between preset eye actions and display screen control instructions; and controlling the display screen to execute the operation corresponding to the target control instruction.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present application, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, a division of a unit may be a division of a logic function, and an actual implementation may have another division, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or may not be executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is only a preferred embodiment of the present application and it should be noted that those skilled in the art can make several improvements and modifications without departing from the principle of the present application, and these improvements and modifications should also be considered as the protection scope of the present application.

Claims (10)

1. A display screen control method is characterized by comprising the following steps:
acquiring eye movement information of a target object;
determining a target control instruction corresponding to the eye action information based on a mapping relation between preset eye actions and display screen control instructions;
and controlling a display screen to execute the operation corresponding to the target control instruction.
2. The method according to claim 1, wherein the display screen is a display screen on a household device; the display screen control instruction at least comprises: the first control instruction is used for controlling the display screen to be lightened at preset brightness, and the second control instruction is used for controlling the display screen to be extinguished.
3. The method of claim 2, wherein controlling a display to perform an operation corresponding to the target control instruction comprises:
when the target control instruction is the first control instruction, controlling the display screen to be lightened at the preset brightness, and displaying the running state information of the household equipment;
when the display screen is in a lighting state and the target control instruction is the second control instruction, judging whether the running state of the household equipment is normal or not based on the running state information;
if the household equipment runs normally, controlling the display screen to be turned off;
and if the household equipment runs abnormally, refusing to execute the second control instruction, and controlling the display screen to display the abnormal state information of the household equipment.
4. The method of claim 3, further comprising:
and when the display screen is in a lighting state and the household equipment runs normally, if the eye action information of the target object is not detected within a second preset time period, controlling the display screen to be extinguished.
5. The method of claim 1, wherein prior to obtaining eye movement information of a target object, the method further comprises:
detecting a first distance between the target object and the display screen;
and when the first distance is smaller than a preset distance threshold, acquiring the eye movement information of the target object.
6. The method of claim 1, wherein obtaining eye movement information of the target object comprises:
acquiring continuous multi-frame facial images of the target object within a first preset time period;
and identifying the multi-frame facial images based on an image identification algorithm to obtain the eye action information of the target object.
7. The method according to claim 2, wherein a plurality of mapping relationships set by a plurality of objects are stored in the home equipment, wherein the plurality of objects correspond to the plurality of mapping relationships one to one, and the determining of the target control instruction corresponding to the eye movement information based on the mapping relationship between the preset eye movement and the display screen control instruction comprises:
acquiring target identity information of the target object;
determining a target mapping relation corresponding to the target object from a plurality of mapping relations based on the target identity information;
and determining a target control instruction corresponding to the eye action information based on the target mapping relation.
8. A display screen control apparatus, comprising:
the acquisition module is used for acquiring the eye action information of the target object;
the determining module is used for determining a target control instruction corresponding to the eye action information based on a mapping relation between preset eye actions and display screen control instructions;
and the control module is used for controlling the display screen to execute the operation corresponding to the target control instruction.
9. A non-volatile storage medium, characterized in that the non-volatile storage medium includes a stored program, wherein, when the program runs, a device in which the non-volatile storage medium is located is controlled to execute the display screen control method according to any one of claims 1 to 7.
10. A processor, characterized in that the processor is configured to run a program, wherein the program is configured to execute the display screen control method according to any one of claims 1 to 7 when running.
CN202111547767.2A 2021-12-16 2021-12-16 Display screen control method and device Pending CN114237119A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111547767.2A CN114237119A (en) 2021-12-16 2021-12-16 Display screen control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111547767.2A CN114237119A (en) 2021-12-16 2021-12-16 Display screen control method and device

Publications (1)

Publication Number Publication Date
CN114237119A true CN114237119A (en) 2022-03-25

Family

ID=80757562

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111547767.2A Pending CN114237119A (en) 2021-12-16 2021-12-16 Display screen control method and device

Country Status (1)

Country Link
CN (1) CN114237119A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150186720A1 (en) * 2013-12-27 2015-07-02 Utechzone Co., Ltd. Authentication system controlled by eye open and eye closed state, handheld control apparatus thereof and computer readable recoding media
CN105630135A (en) * 2014-10-27 2016-06-01 中兴通讯股份有限公司 Intelligent terminal control method and device
CN105892642A (en) * 2015-12-31 2016-08-24 乐视移动智能信息技术(北京)有限公司 Method and device for controlling terminal according to eye movement
US20170192500A1 (en) * 2015-12-31 2017-07-06 Le Holdings (Beijing) Co., Ltd. Method and electronic device for controlling terminal according to eye action
CN106406535A (en) * 2016-09-29 2017-02-15 深圳天珑无线科技有限公司 Operation method and device for mobile device, and mobile device
CN106444415A (en) * 2016-12-08 2017-02-22 湖北大学 Smart home control method and system
CN108146370A (en) * 2017-12-15 2018-06-12 北京汽车集团有限公司 Control method for vehicle and device
CN110554768A (en) * 2018-05-31 2019-12-10 努比亚技术有限公司 intelligent wearable device control method and device and computer readable storage medium

Similar Documents

Publication Publication Date Title
CN108006889B (en) Air conditioner control method and device
CN108900781B (en) Light supplement control method and device for image acquisition device and image acquisition system
CN103096124B (en) Auxiliary focusing method and auxiliary focusing device
CN106878780A (en) It is capable of the intelligent TV set and its control system and control method of Intelligent adjustment brightness
CN104899489A (en) Information processing apparatus, information processing method, eyewear terminal, and authentication system
CN110196557B (en) Equipment control method, device, mobile terminal and storage medium
US10755537B1 (en) Implementing deterrent protocols in response to detected security events
CN108882439A (en) A kind of illumination control method and lighting apparatus
CN106941588B (en) Data processing method and electronic equipment
CN206698484U (en) The control device and lighting device of lighting apparatus
US10382734B2 (en) Electronic device and color temperature adjusting method
CN110381293A (en) Video monitoring method, device and computer readable storage medium
CN110174936A (en) Control method, device and the equipment of wear-type visual device
CN104110787A (en) Method and system for controlling air conditioner
KR20150088614A (en) Apparatus and method for output control accoridng to environment in electronic device
CN108881730A (en) Image interfusion method, device, electronic equipment and computer readable storage medium
CN111667798A (en) Screen adjusting method and device
CN111147935A (en) Control method of television, intelligent household control equipment and storage medium
CN114237119A (en) Display screen control method and device
US11423762B1 (en) Providing device power-level notifications
CN108965837A (en) A kind of projection environment brightness control method, system and terminal device
CN107273096B (en) One-key rapid dimming method and system
CN108984140A (en) A kind of display control method and system
CN113177524A (en) Control method and system of access control equipment and electronic equipment
CN107969057A (en) Indoor light control method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination