CN112286350A - Equipment control method and device, electronic equipment, electronic device and processor - Google Patents

Equipment control method and device, electronic equipment, electronic device and processor

Info

Publication number
CN112286350A
CN112286350A (application CN202011167977.4A)
Authority
CN
China
Prior art keywords
target
area
user
control
watching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011167977.4A
Other languages
Chinese (zh)
Inventor
徐鹏飞
苗岑岑
冯玲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN202011167977.4A priority Critical patent/CN112286350A/en
Publication of CN112286350A publication Critical patent/CN112286350A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 Eye tracking input arrangements

Abstract

The invention discloses a device control method and apparatus, an electronic device, an electronic apparatus and a processor. The method includes: determining, within a control area of a target device, a target gaze area in which the user's eye gaze point is located; generating a control instruction based on the function indication information contained in the target gaze area; and controlling the target device to perform a target operation corresponding to the control instruction. The invention thus solves the technical problems of user inconvenience and low control efficiency caused by the limitations of device control methods in the prior art.

Description

Equipment control method and device, electronic equipment, electronic device and processor
Technical Field
The invention relates to the field of device control, and in particular to a device control method and apparatus, an electronic device, an electronic apparatus and a processor.
Background
Smart mobile devices are becoming increasingly widespread and human-computer interaction methods continue to evolve. In the prior art, traditional device control methods include remote controls, voice and gestures, but each has limitations. For example, controlling an air conditioner with a remote control requires first finding the remote control; voice command recognition is inaccurate and lacks a unified standard; and gesture control requires learning complex gestures.
Because existing device control methods are limited in these ways, they are inconvenient for users and make a good human-computer interaction experience difficult to achieve.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiments of the invention provide a device control method and apparatus, an electronic device, an electronic apparatus and a processor, which at least solve the technical problems that existing device control methods are limited, inconvenient for users, and low in control efficiency.
According to one aspect of an embodiment of the present invention, there is provided a device control method including: determining, within a control area of a target device, a target gaze area in which the user's eye gaze point is located; generating a control instruction based on the function indication information contained in the target gaze area; and controlling the target device to perform a target operation corresponding to the control instruction.
Optionally, determining the target gaze area within the control area of the target device includes: detecting, with an eye tracking sensor arranged on the target device, an initial gaze area in which the user's eye gaze point is located within a target range; acquiring a gaze point offset of the user's eye gaze point relative to the initial gaze area based on the rotation direction and rotation angle of the user's eyes; and determining the target gaze area from the initial gaze area and the gaze point offset.
Optionally, the method further includes: after the user's eye gaze point moves from the initial gaze area to a current gaze area, recording a first gaze duration of the gaze point in the current gaze area; when the first gaze duration does not exceed a first preset threshold, continuing to detect the next gaze area; and when the first gaze duration exceeds the first preset threshold, locking the current gaze area as the target gaze area.
Optionally, the method further includes: adding marker information to the target gaze area.
Optionally, generating the control instruction based on the function indication information contained in the target gaze area includes: detecting whether the target gaze area contains a function icon; and generating the control instruction corresponding to the function icon based on preset control information, where the preset control information includes marks of a plurality of function icons, each of which corresponds to a different control instruction.
Optionally, the method further includes: acquiring user characteristic information; and setting user permission information based on the user characteristic information, where the user permission information is used to verify the user's authority to control the target device.
Optionally, the method further includes: recording a second gaze duration of the user's eye gaze point in the initial gaze area; and when the second gaze duration exceeds a second preset threshold, acquiring the rotation direction and rotation angle.
According to another aspect of the embodiments of the present invention, there is also provided a device control apparatus, including: a determining module, configured to determine, within a control area of a target device, a target gaze area in which the user's eye gaze point is located; a generating module, configured to generate a control instruction based on the function indication information contained in the target gaze area; and a control module, configured to control the target device to perform a target operation corresponding to the control instruction.
According to another aspect of the embodiments of the present invention, there is also provided a non-volatile storage medium in which a computer program is stored, wherein the computer program is configured to perform the above device control method when run.
According to another aspect of the embodiments of the present invention, there is also provided a processor for running a program, wherein the program is configured to perform the above device control method when run.
According to another aspect of the embodiments of the present invention, there is also provided an electronic apparatus, including a memory and a processor, wherein the memory stores a computer program and the processor is configured to run the computer program to perform the device control method.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, including an electronic apparatus and an eye tracking sensor, wherein the electronic apparatus includes a memory and a processor, the memory stores a computer program, the processor is configured to run the computer program to perform the device control method, and the eye tracking sensor is used for tracking the target gaze area in which the user's eye gaze point is located.
In the embodiments of the invention, a device control approach based on eye tracking technology is adopted: a target gaze area in which the user's eye gaze point is located is determined within a control area of a target device; a control instruction is generated based on the function indication information contained in the target gaze area; and the target device is controlled to perform the target operation corresponding to the control instruction. This improves the efficiency and convenience with which a user controls the device, achieving more accurate and reliable human-computer interaction and higher control efficiency, and thereby solves the technical problems that existing device control methods are limited, inconvenient for users, and low in control efficiency.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
Fig. 1 is a flowchart of a device control method according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of an alternative scenario of a device control method according to an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of a device control apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
In accordance with an embodiment of the present invention, an embodiment of a device control method is provided. It should be noted that the steps illustrated in the flowchart of the drawings may be executed in a computer system, such as one running a set of computer-executable instructions, and that, although a logical order is shown in the flowchart, the steps may in some cases be performed in an order different from that shown or described here.
Fig. 1 is a flowchart of a device control method according to an embodiment of the present invention. As shown in fig. 1, the method includes the following steps:
Step S102, determining, within a control area of a target device, a target gaze area in which the user's eye gaze point is located;
Step S104, generating a control instruction based on the function indication information contained in the target gaze area;
Step S106, controlling the target device to perform a target operation corresponding to the control instruction.
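The three steps can be pictured with the following minimal sketch, which is purely illustrative and not part of the original disclosure; the data types, the mapping table and the device interface (GazeArea, PRESET_CONTROL_INFO, device.execute) are assumed names.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GazeArea:
    """A region of the control panel (illustrative assumption)."""
    name: str                             # e.g. "temp_up"
    function_icon: Optional[str] = None   # function indication information, if any

# Hypothetical preset control information: icon mark -> control instruction (step S104).
PRESET_CONTROL_INFO = {
    "temp_up": "INCREASE_TEMPERATURE",
    "temp_down": "DECREASE_TEMPERATURE",
    "power": "TOGGLE_POWER",
}

def control_device(target_gaze_area: GazeArea, device) -> None:
    """Steps S104-S106: generate an instruction from the gazed icon and have the device execute it."""
    icon = target_gaze_area.function_icon
    if icon is None:
        return  # the gazed region carries no function indication information
    instruction = PRESET_CONTROL_INFO.get(icon)
    if instruction is not None:
        device.execute(instruction)  # step S106: the target device performs the target operation
```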
In the embodiments of the invention, a device control approach based on eye tracking technology is adopted: a target gaze area in which the user's eye gaze point is located is determined within a control area of a target device; a control instruction is generated based on the function indication information contained in the target gaze area; and the target device is controlled to perform the target operation corresponding to the control instruction. This improves the efficiency and convenience with which a user controls the device, achieving more accurate and reliable human-computer interaction and higher control efficiency, and thereby solves the technical problems that existing device control methods are limited, inconvenient for users, and low in control efficiency.
Optionally, the target device may be a household appliance such as an air conditioner, television, refrigerator or washing machine; office equipment such as a printer or projector; or medical equipment, public facilities and the like.
Optionally, the control area may be a control panel on the target device; the control panel can identify the user's eye gaze point and determine the target gaze area in which it is located. The function indication information may be a function icon.
In an alternative embodiment, determining the target gaze area within the control area of the target device comprises:
Step S202, detecting, with an eye tracking sensor arranged on the target device, an initial gaze area in which the user's eye gaze point is located within a target range;
Step S204, acquiring a gaze point offset of the user's eye gaze point relative to the initial gaze area based on the rotation direction and rotation angle of the user's eyes;
Step S206, determining the target gaze area from the initial gaze area and the gaze point offset.
In this embodiment, an eye tracking sensor is disposed in the control area. The eye tracking sensor is an image acquisition device with eye recognition capability, based on eyeball capture technology, and optionally includes an infrared camera, an image processor, a viewpoint calculation core and a distance sensor. The infrared camera captures the user's face and the corresponding eye positions, the image processor analyzes the acquired images, the viewpoint calculation core computes the movement direction and movement angle of the eyes, and the distance sensor measures the distance between the user and the target device.
The target range may be preset at the factory, but because user conditions such as eyesight vary, the user may also customize the target range, as long as the user can clearly see the function prompt information on the control panel of the target device.
In the above embodiment, the eye tracking sensor detects the initial gaze area in which the user's eye gaze point is located within the target range, that is, the gaze area in which the base coordinates are located.
In the embodiment of the application, an initial gaze area is established from the eye position when the user looks at the eye tracking sensor, for example based on the position of the sensor itself; this initial gaze area serves as the base coordinates. When the user's eyes rotate, the gaze point offset relative to the initial gaze area is calculated from the rotation direction and rotation angle of the eyes, and the target gaze area is then determined from the initial gaze area and the gaze point offset.
Taking an air conditioner as the target device, the user first looks at the area in which the eye tracking sensor on the air conditioner is located, i.e. the initial gaze area. When the air conditioner is being controlled, the eye tracking sensor detects a user within the target range, and the image processor processes an image of the user gazing at the initial gaze area, recognizes the user's eye movement and detects the point of attention; the focus position of the eyes can be obtained either in this way or by directly detecting the user's pupils and calculating the focus position, from which the gaze point offset is obtained.
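One way the rotation direction, rotation angle and measured user distance could be turned into a gaze point offset on the panel plane is the simple geometric sketch below; the small-angle pinhole-style model, the axis convention and the function names are assumptions made for illustration, not the patented algorithm.

```python
import math
from typing import Tuple

def gaze_point_offset(rotation_angle_deg: float,
                      rotation_direction_deg: float,
                      user_distance_m: float) -> Tuple[float, float]:
    """Offset (dx, dy), in metres on the panel plane, of the gaze point relative to the
    initial gaze area, under an assumed simple geometric model."""
    magnitude = user_distance_m * math.tan(math.radians(rotation_angle_deg))
    dx = magnitude * math.cos(math.radians(rotation_direction_deg))  # 0 deg = panel x-axis (assumption)
    dy = magnitude * math.sin(math.radians(rotation_direction_deg))
    return dx, dy

def target_gaze_point(initial_point: Tuple[float, float],
                      offset: Tuple[float, float]) -> Tuple[float, float]:
    """Step S206: the target gaze area is the initial gaze area shifted by the gaze point offset."""
    return initial_point[0] + offset[0], initial_point[1] + offset[1]
```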
In an optional embodiment, the method further includes:
Step S302, after the user's eye gaze point moves from the initial gaze area to a current gaze area, recording a first gaze duration of the gaze point in the current gaze area;
Step S304, when the first gaze duration does not exceed a first preset threshold, continuing to detect the next gaze area;
Step S306, when the first gaze duration exceeds the first preset threshold, locking the current gaze area as the target gaze area.
In the above optional embodiment, whether the user intends to operate on the current gaze area is determined from the current gaze area on which the eye gaze point is focused and from whether the first gaze duration in that area exceeds a first preset threshold (for example, 0.2 to 0.5 s).
Taking an air conditioner as the target device, as shown in fig. 2, each function icon (e.g. function key) on the display panel of the air conditioner occupies an area of predetermined size, and the icons are located at different positions. When the user gazes at a particular function icon, a color block representing the eye gaze point moves, following the rotation of the user's eyes, to the area in which that icon is located. If the first gaze duration in the current gaze area exceeds the first preset threshold, the current gaze area is locked as the target gaze area; if it does not, the next gaze area of the user's eye gaze point continues to be detected.
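The dwell-time test of steps S302-S306 amounts to the small gate sketched below; the 0.3 s value is just one point inside the 0.2-0.5 s range mentioned above, and read_current_area is an assumed callable returning the identifier of the currently gazed area.

```python
import time

FIRST_PRESET_THRESHOLD_S = 0.3  # one value within the 0.2-0.5 s range given above

def lock_target_area(read_current_area, threshold_s: float = FIRST_PRESET_THRESHOLD_S):
    """Sample the gazed area repeatedly; lock it as the target gaze area once the gaze
    has stayed on the same area for longer than the first preset threshold."""
    current_area = read_current_area()
    dwell_start = time.monotonic()
    while True:
        area = read_current_area()
        if area != current_area:
            # The gaze moved on before the threshold was reached: continue with the next area (step S304).
            current_area = area
            dwell_start = time.monotonic()
        elif time.monotonic() - dwell_start >= threshold_s:
            return current_area  # step S306: lock the current gaze area as the target gaze area
```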
In an optional embodiment, the method further includes:
Step S402, adding marker information to the target gaze area.
In the embodiment of the present application, as shown in fig. 2, after the current gaze area is locked as the target gaze area, the target gaze area is indicated by a marker color block whose color differs from the background color of the control panel.
In an optional embodiment, generating the control instruction based on the function indication information contained in the target gaze area includes:
Step S502, detecting whether the target gaze area contains a function icon;
Step S504, generating the control instruction corresponding to the function icon based on preset control information, where the preset control information includes marks of a plurality of function icons, each of which corresponds to a different control instruction.
It should be noted that, taking an air conditioner as the target device, as shown in fig. 2, each function icon (e.g. function key) on the display panel occupies a predetermined area, and because the icons are located at different positions, the user's eyes rotate in real time as the user gazes at different icons.
When it is detected that the user's eye gaze point has moved from the initial gaze area to the current gaze area, and the first gaze duration in the current gaze area exceeds the first preset threshold, the current gaze area is locked as the target gaze area, the user is determined to have selected the function icon in that area, and the air conditioner executes the control instruction corresponding to that icon.
Still taking an air conditioner as the target device, if the first gaze duration of the user's eye gaze point in the area of the temperature-raising function icon on the control panel exceeds the first preset threshold of 0.5 s, the control instruction corresponding to that icon raises the set temperature from the current value of 10 °C to 11 °C, and the temperature of 10 °C shown on the control panel is updated to 11 °C.
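The temperature example could be realized with a lookup of the preset control information like the sketch below; the icon identifiers, the 1 °C step and the toy AirConditioner class are illustrative assumptions.

```python
class AirConditioner:
    """Toy model of the controlled device, used only to illustrate steps S502-S504."""
    def __init__(self, temperature_c: float = 10.0):
        self.temperature_c = temperature_c

    def increase_temperature(self) -> None:
        self.temperature_c += 1.0  # e.g. 10 °C -> 11 °C, as in the example above

    def decrease_temperature(self) -> None:
        self.temperature_c -= 1.0

# Preset control information: each function icon mark corresponds to a different control instruction.
ICON_TO_INSTRUCTION = {
    "icon_temp_up": AirConditioner.increase_temperature,
    "icon_temp_down": AirConditioner.decrease_temperature,
}

def execute_icon(ac: AirConditioner, icon_mark: str) -> None:
    """Generate and execute the control instruction for the function icon in the target gaze area."""
    instruction = ICON_TO_INSTRUCTION.get(icon_mark)
    if instruction is not None:
        instruction(ac)
        print(f"Control panel now shows {ac.temperature_c:.0f} °C")  # panel display update
```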
In an optional embodiment, the method further includes:
Step S602, acquiring user characteristic information;
Step S604, setting user permission information based on the user characteristic information, where the user permission information is used to verify the user's authority to control the target device.
Optionally, the user characteristic information may include: user facial feature information, user voiceprint feature information, user fingerprint feature information, user eye iris feature information, and the like.
In the embodiment of the application, the user can enter the user characteristic information in advance; user permission information for the target device is then set based on that information, so that the user's authority to control the target device can be verified and mis-operation while using the target device is avoided.
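A minimal sketch of how pre-entered user characteristic information could gate control authority follows; the stored record format, the user identifiers and the matching step are assumptions.

```python
# Hypothetical store of user characteristic information entered in advance (step S602).
AUTHORIZED_USERS = {
    "user_a": {"iris_template": "enrolled-template-bytes", "can_control": True},
    "guest":  {"iris_template": "enrolled-template-bytes", "can_control": False},
}

def verify_control_authority(user_id: str) -> bool:
    """Check the user permission information before any gaze command is executed (step S604)."""
    record = AUTHORIZED_USERS.get(user_id)
    return bool(record and record["can_control"])

def guarded_execute(user_id: str, execute_command) -> None:
    """Run the command only for users whose permission information allows control of the target device."""
    if verify_control_authority(user_id):
        execute_command()
    # Otherwise the command is ignored, which helps avoid mis-operation by unauthorized users.
```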
In an optional embodiment, the method further includes:
Step S702, recording a second gaze duration of the user's eye gaze point in the initial gaze area;
Step S704, when the second gaze duration exceeds a second preset threshold, acquiring the rotation direction and rotation angle.
Optionally, the second preset threshold is 1 to 1.5 s.
For example, when the second gaze duration of the user's eye gaze point in the initial gaze area exceeds a second preset threshold of 1.5 s, it is determined that the user intends to operate the target device; the rotation direction and rotation angle of the user's eyes are then obtained, and from them the gaze point offset of the eye gaze point relative to the initial gaze area can be calculated.
Still taking an air conditioner as the target device, when the user begins to control the air conditioner, the eye tracking sensor on the air conditioner captures the user's eye gaze point (i.e. the eye positions) through its infrared camera. A Cartesian coordinate system is established with the midpoint of the line connecting the two pupil centers as the origin, giving the coordinates of the two pupil positions, which are continuously output to the eye tracking sensor. The sensor in turn establishes a coordinate system with itself as the origin (the initial gaze area), represents the user's eye gaze point in that coordinate system, and establishes the association between the initial gaze area and the current gaze area.
When the user's eyes rotate, that is, when the eye gaze point moves, the rotation direction and rotation angle are calculated and mapped onto the function icons of the air conditioner. The eye tracking sensor computes the eye gaze point through the viewpoint calculation core and obtains the gaze point offset relative to the initial gaze area, so that the corresponding function control can be executed. For example, if the user gazes at a function icon on the control panel of the air conditioner for longer than the first preset threshold of 0.5 s, it is determined that this icon controls the operating mode of the air conditioner and the operating mode is switched accordingly; in this way the user controls the air conditioner through eye rotation alone.
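The coordinate construction described above, a frame whose origin is the midpoint of the line joining the two pupil centres, re-expressed in a frame centred on the eye tracking sensor (the initial gaze area), can be sketched as follows; the 2-D frame convention and the function names are assumptions.

```python
from typing import Tuple

Point = Tuple[float, float]

def pupil_midpoint(left_pupil: Point, right_pupil: Point) -> Point:
    """Origin of the user-side frame: midpoint of the line connecting the two pupil centres."""
    return ((left_pupil[0] + right_pupil[0]) / 2.0,
            (left_pupil[1] + right_pupil[1]) / 2.0)

def to_sensor_frame(point: Point, sensor_origin: Point) -> Point:
    """Re-express a point in the frame whose origin is the eye tracking sensor
    (the initial gaze area), so the gaze point can be related to the panel layout."""
    return point[0] - sensor_origin[0], point[1] - sensor_origin[1]

# Example: pupils at (-0.03, 0.0) and (0.03, 0.0) m give a midpoint at (0.0, 0.0),
# which the sensor tracks frame by frame as the eyes rotate.
```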
Example 2
According to an embodiment of the present invention, there is also provided an embodiment of an electronic device. Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present invention; as shown in fig. 3, the electronic device includes: an electronic apparatus 30, comprising a memory in which a computer program is stored and a processor configured to run the computer program to perform the device control method, and an eye tracking sensor 32 for tracking the target gaze area in which the user's eye gaze point is located.
In the embodiments of the invention, a device control approach based on eye tracking technology is adopted: a target gaze area in which the user's eye gaze point is located is determined within a control area of a target device; a control instruction is generated based on the function indication information contained in the target gaze area; and the target device is controlled to perform the target operation corresponding to the control instruction. This improves the efficiency and convenience with which a user controls the device, achieving more accurate and reliable human-computer interaction and higher control efficiency, and thereby solves the technical problems that existing device control methods are limited, inconvenient for users, and low in control efficiency.
It should be noted that, reference may be made to the relevant description in embodiment 1 for alternative or preferred embodiments of this embodiment, and details are not described here again.
Example 3
According to an embodiment of the present invention, there is further provided an apparatus embodiment for implementing the device control method. Fig. 4 is a schematic structural diagram of a device control apparatus according to an embodiment of the present invention; as shown in fig. 4, the device control apparatus includes a determining module 40, a generating module 42 and a control module 44, wherein:
the determining module 40 is configured to determine, within a control area of a target device, a target gaze area in which the user's eye gaze point is located; the generating module 42 is configured to generate a control instruction based on the function indication information contained in the target gaze area; and the control module 44 is configured to control the target device to perform a target operation corresponding to the control instruction.
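Purely as an illustration of the structure in fig. 4, the three modules could be grouped into one small class as below; the area objects, their contains method and the preset control information mapping are assumed for the sketch.

```python
class DeviceControlApparatus:
    """Illustrative grouping of the determining, generating and control modules."""

    def determine_target_gaze_area(self, gaze_point, control_area):
        """Determining module 40: locate the gaze area of the control area that contains the gaze point."""
        return next((area for area in control_area if area.contains(gaze_point)), None)

    def generate_instruction(self, target_area, preset_control_info):
        """Generating module 42: map the area's function icon to its control instruction."""
        if target_area is None or target_area.function_icon is None:
            return None
        return preset_control_info.get(target_area.function_icon)

    def control(self, device, instruction):
        """Control module 44: have the target device perform the corresponding target operation."""
        if instruction is not None:
            device.execute(instruction)
```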
In the embodiments of the invention, a device control approach based on eye tracking technology is adopted: a target gaze area in which the user's eye gaze point is located is determined within a control area of a target device; a control instruction is generated based on the function indication information contained in the target gaze area; and the target device is controlled to perform the target operation corresponding to the control instruction. This improves the efficiency and convenience with which a user controls the device, achieving more accurate and reliable human-computer interaction and higher control efficiency, and thereby solves the technical problems that existing device control methods are limited, inconvenient for users, and low in control efficiency.
It should be noted that the above modules may be implemented in software or in hardware; in the latter case, for example, the modules may all be located in the same processor, or distributed across different processors in any combination.
The determining module 40, the generating module 42 and the control module 44 correspond to steps S102 to S106 in Embodiment 1; the modules cover the same examples and application scenarios as the corresponding steps, but are not limited to the disclosure of Embodiment 1. These modules may be implemented in a computer terminal as part of an apparatus.
It should be noted that, reference may be made to the relevant description in embodiment 1 for alternative or preferred embodiments of this embodiment, and details are not described here again.
The device control apparatus may further include a processor and a memory, wherein the determining module 40, the generating module 42, the control module 44, and the like are stored in the memory as program units, and the processor executes the program units stored in the memory to implement corresponding functions.
The processor includes one or more cores, and a core calls the corresponding program unit from the memory. The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory such as read-only memory (ROM) or flash memory (flash RAM); the memory includes at least one memory chip.
According to an embodiment of the present application, there is also provided an embodiment of a non-volatile storage medium. Optionally, in this embodiment, the non-volatile storage medium stores a program which, when run, controls the device in which the storage medium is located to execute any of the above device control methods.
Optionally, in this embodiment, the non-volatile storage medium may be located in any computer terminal of a group of computer terminals in a computer network, or in any mobile terminal of a group of mobile terminals.
Optionally, when the program is executed, the device in which the non-volatile storage medium is located is controlled to: determine, within a control area of a target device, a target gaze area in which the user's eye gaze point is located; generate a control instruction based on the function indication information contained in the target gaze area; and control the target device to perform a target operation corresponding to the control instruction.
Optionally, when the program is executed, the device in which the non-volatile storage medium is located is controlled to: detect, with an eye tracking sensor arranged on the target device, an initial gaze area in which the user's eye gaze point is located within a target range; acquire a gaze point offset of the user's eye gaze point relative to the initial gaze area based on the rotation direction and rotation angle of the user's eyes; and determine the target gaze area from the initial gaze area and the gaze point offset.
Optionally, when the program is executed, the device in which the non-volatile storage medium is located is controlled to: after the user's eye gaze point moves from the initial gaze area to a current gaze area, record a first gaze duration of the gaze point in the current gaze area; when the first gaze duration does not exceed a first preset threshold, continue to detect the next gaze area; and when the first gaze duration exceeds the first preset threshold, lock the current gaze area as the target gaze area.
Optionally, when the program is executed, the device in which the non-volatile storage medium is located is controlled to: add marker information to the target gaze area.
Optionally, when the program is executed, the device in which the non-volatile storage medium is located is controlled to: detect whether the target gaze area contains a function icon; and generate the control instruction corresponding to the function icon based on preset control information, where the preset control information includes marks of a plurality of function icons, each of which corresponds to a different control instruction.
Optionally, when the program is executed, the device in which the non-volatile storage medium is located is controlled to: acquire user characteristic information; and set user permission information based on the user characteristic information, where the user permission information is used to verify the user's authority to control the target device.
Optionally, when the program is executed, the device in which the non-volatile storage medium is located is controlled to: record a second gaze duration of the user's eye gaze point in the initial gaze area; and when the second gaze duration exceeds a second preset threshold, acquire the rotation direction and rotation angle.
According to an embodiment of the present application, there is also provided an embodiment of a processor. Optionally, in this embodiment, the processor is configured to run a program which, when run, performs any of the above device control methods.
According to an embodiment of the present application, there is also provided an embodiment of an electronic apparatus, including a memory and a processor, where the memory stores a computer program, and the processor is configured to execute the computer program to perform any one of the above device control methods.
According to an embodiment of the present application, there is also provided an embodiment of a computer program product which, when executed on a data processing device, is adapted to execute a program initialized with the steps of any of the above device control methods.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the present invention, and such improvements and modifications shall also fall within the protection scope of the present invention.

Claims (12)

1. A device control method, characterized by comprising:
determining, within a control area of a target device, a target gaze area in which a user's eye gaze point is located;
generating a control instruction based on function indication information contained in the target gaze area;
and controlling the target device to perform a target operation corresponding to the control instruction.
2. The device control method of claim 1, wherein determining the target gaze area within the control area of the target device comprises:
detecting, with an eye tracking sensor arranged on the target device, an initial gaze area in which the user's eye gaze point is located within a target range;
acquiring a gaze point offset of the user's eye gaze point relative to the initial gaze area based on the rotation direction and rotation angle of the user's eyes;
and determining the target gaze area from the initial gaze area and the gaze point offset.
3. The device control method according to claim 2, characterized in that the method further comprises:
after the user's eye gaze point moves from the initial gaze area to a current gaze area, recording a first gaze duration of the gaze point in the current gaze area;
when the first gaze duration does not exceed a first preset threshold, continuing to detect the next gaze area;
and when the first gaze duration exceeds the first preset threshold, locking the current gaze area as the target gaze area.
4. The device control method according to claim 1, characterized in that the method further comprises:
adding marker information to the target gaze area.
5. The device control method according to claim 1, wherein the function indication information is a function icon, and generating the control instruction based on the function indication information contained in the target gaze area comprises:
detecting whether the target gaze area contains the function icon;
and generating the control instruction corresponding to the function icon based on preset control information, wherein the preset control information comprises marks of a plurality of function icons, each of which corresponds to a different control instruction.
6. The device control method according to claim 1, characterized in that the method further comprises:
acquiring user characteristic information;
and setting user permission information based on the user characteristic information, wherein the user permission information is used to verify the user's authority to control the target device.
7. The device control method according to claim 2, characterized in that the method further comprises:
recording a second gaze duration of the user's eye gaze point in the initial gaze area;
and when the second gaze duration exceeds a second preset threshold, acquiring the rotation direction and rotation angle.
8. A device control apparatus, characterized by comprising:
a determining module, configured to determine, within a control area of a target device, a target gaze area in which a user's eye gaze point is located;
a generating module, configured to generate a control instruction based on function indication information contained in the target gaze area;
and a control module, configured to control the target device to perform a target operation corresponding to the control instruction.
9. A non-volatile storage medium, characterized in that a computer program is stored in the storage medium, wherein the computer program is arranged to execute the device control method of any one of claims 1 to 7 when running.
10. A processor for running a program, wherein the program is arranged to perform the device control method of any one of claims 1 to 7 when running.
11. An electronic apparatus comprising a memory and a processor, wherein the memory has stored therein a computer program, and the processor is configured to execute the computer program to perform the device control method of any one of claims 1 to 7.
12. An electronic device, characterized by comprising: an electronic apparatus and an eye tracking sensor, wherein the electronic apparatus comprises a memory having a computer program stored therein and a processor configured to run the computer program to perform the device control method according to any one of claims 1 to 7, and the eye tracking sensor is used for tracking a target gaze area in which a user's eye gaze point is located.
CN202011167977.4A 2020-10-27 2020-10-27 Equipment control method and device, electronic equipment, electronic device and processor Pending CN112286350A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011167977.4A CN112286350A (en) 2020-10-27 2020-10-27 Equipment control method and device, electronic equipment, electronic device and processor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011167977.4A CN112286350A (en) 2020-10-27 2020-10-27 Equipment control method and device, electronic equipment, electronic device and processor

Publications (1)

Publication Number Publication Date
CN112286350A true CN112286350A (en) 2021-01-29

Family

ID=74373450

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011167977.4A Pending CN112286350A (en) 2020-10-27 2020-10-27 Equipment control method and device, electronic equipment, electronic device and processor

Country Status (1)

Country Link
CN (1) CN112286350A (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104391574A (en) * 2014-11-14 2015-03-04 京东方科技集团股份有限公司 Sight processing method, sight processing system, terminal equipment and wearable equipment
CN107003744A (en) * 2016-12-01 2017-08-01 深圳前海达闼云端智能科技有限公司 Viewpoint determines method, device, electronic equipment and computer program product
CN106598259A (en) * 2016-12-28 2017-04-26 歌尔科技有限公司 Input method and input unit for head-mounted equipment and VR head-mounted equipment
CN109992096A (en) * 2017-12-29 2019-07-09 北京亮亮视野科技有限公司 Activate intelligent glasses functional diagram calibration method
CN108279778A (en) * 2018-02-12 2018-07-13 上海京颐科技股份有限公司 User interaction approach, device and system
CN110825228A (en) * 2019-11-01 2020-02-21 腾讯科技(深圳)有限公司 Interaction control method and device, storage medium and electronic device
CN111399658A (en) * 2020-04-24 2020-07-10 Oppo广东移动通信有限公司 Calibration method and device for eyeball fixation point, electronic equipment and storage medium
CN111596760A (en) * 2020-04-30 2020-08-28 维沃移动通信有限公司 Operation control method and device, electronic equipment and readable storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112783330A (en) * 2021-03-16 2021-05-11 展讯通信(上海)有限公司 Electronic equipment operation method and device and electronic equipment
CN115406051A (en) * 2022-08-29 2022-11-29 珠海格力电器股份有限公司 Vision-based air conditioner control method and device, air conditioner panel and air conditioner

Similar Documents

Publication Publication Date Title
AU2017287619B2 (en) Method and apparatus for identity recognition
US20190104340A1 (en) Intelligent Terminal Control Method and Intelligent Terminal
JP2020530631A (en) Interaction locating methods, systems, storage media, and smart devices
CN108241434B (en) Man-machine interaction method, device and medium based on depth of field information and mobile terminal
US10075629B2 (en) Electronic device for capturing images while user looks directly at camera
WO2017113668A1 (en) Method and device for controlling terminal according to eye movement
CN105303091A (en) Eyeball tracking technology based privacy protection method and system
CN108681399B (en) Equipment control method, device, control equipment and storage medium
CN110489952A (en) Identity authentication method, device and user equipment
CN112286350A (en) Equipment control method and device, electronic equipment, electronic device and processor
CN105320871A (en) Screen unlocking method and screen unlocking apparatus
CN105159505A (en) Interface operation method and terminal
CN107622246B (en) Face recognition method and related product
KR102392437B1 (en) Reflection-based control activation
CN107239222B (en) Touch screen control method and terminal device
WO2020160165A1 (en) Multi-factor authentication for virtual reality
CN110895934A (en) Household appliance control method and device
CN107092350A (en) A kind of remote computer based system and method
CN105320261A (en) Control method for mobile terminal and mobile terminal
CN111988493A (en) Interaction processing method, device, equipment and storage medium
CN110825228B (en) Interactive control method and device, storage medium and electronic device
US20200042105A1 (en) Information processing apparatus, information processing method, and recording medium
CN111596760A (en) Operation control method and device, electronic equipment and readable storage medium
CN113759748A (en) Intelligent home control method and system based on Internet of things
CN106060383A (en) Image obtaining method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210129)