WO2022224189A1 - Method and system for controlling objects of a housing unit by eye-gaze tracking - Google Patents

Info

Publication number
WO2022224189A1
Authority
WO
WIPO (PCT)
Prior art keywords
terminal
IoT
housing unit
dimensional coordinates
eye
Prior art date
Application number
PCT/IB2022/053736
Other languages
French (fr)
Inventor
Francesco ALOTTO
Matteo DEL GIUDICE
Edoardo PATTI
Anna OSELLO
Andrea Acquaviva
Original Assignee
Politecnico Di Torino
Priority date
Filing date
Publication date
Application filed by Politecnico Di Torino filed Critical Politecnico Di Torino
Publication of WO2022224189A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013: Eye tracking input arrangements

Abstract

The present invention relates to a method for controlling at least one object (191) of a housing unit (190) by eye-gaze tracking, the object (191) being operatively connected to an IoT device (181), the method comprising: - an acquisition phase, wherein a terminal (110) acquires from a server (150) a virtual model (290) of the housing unit (190) or at least a part thereof, the virtual model (290) being created in accordance with the BIM technique, and wherein the terminal (110) acquires from the server (150) a set of IoT services (281) of said at least one IoT device (181), the set of IoT services (281) being associated with at least one object (191) and with the virtual model (290), which includes said at least one object (191); - a first output phase, wherein output means (115) of the terminal (110) create a first three-dimensional representation of said at least one object (191) on the basis of the virtual model (290), the first three-dimensional representation comprising the set of IoT services (281); - a selection phase, wherein input means (120) receive at least one command from a user by eye-gaze tracking, in order to select at least one IoT service of the set of IoT services (281); - a transmission phase, wherein the terminal (110) transmits to the server (150), via the communication means (130), at least one control signal (140) for controlling at least one IoT device (181), the control signal (140) corresponding to the selected IoT service; - a second output phase, wherein the output means (115) of the terminal (110) create a second three-dimensional representation of the object (191) on the basis of the virtual model (290) and on the basis of the selected IoT service.

Description

METHOD AND SYSTEM FOR CONTROLLING OBJECTS OF A HOUSING UNIT BY EYE-GAZE TRACKING
DESCRIPTION
The present invention relates to a method for controlling objects of a housing unit by eye-gaze tracking, in accordance with the preamble of claim 1. In particular, described herein is a method for controlling objects of a housing unit by eye-gaze tracking, along with the associated system and terminal.
The present invention can be used for improving the self-sufficiency of users who have lost their motor functions, permitting them to autonomously control objects of a housing unit, such as, for example, doors and windows, household appliances, electric systems, heating and/or cooling systems, water systems, and so forth, e.g. within home and/or hospital environments.
The present invention is particularly helpful for users suffering from degenerative diseases involving motor neurons, wherein a pathological mechanism leads to progressive loss of motor functions, resulting in a strongly limiting condition that, because of the inhibition of the motor neurons’ functionality, makes such users non-self-sufficient.
One of the most important problems that are faced by individuals affected by neurocognitive degradation is communication inability. In fact, motor neurons provide control over the body’s voluntary muscles: loss of neuron-cell activation results in inhibition of the individual’s vocal ability.
Individuals suffering from this disease will also gradually lose the functions of the muscles of the limbs, and therefore will not be able to use common interfaces, such as a mouse, a keyboard or voice interfaces, for interacting with, for example, personal computers. In order to make up for the loss of vocal and motor ability, systems are employed which implement interfaces specially dedicated to this category of users, such as, for example, eye-gaze control systems or brain-wave control systems, which make it possible to replace, at least partly, the lost motor functions.
Eye-gaze tracking systems permit tracking the user’s gaze, since eye movement remains unaffected even at advanced stages of the disease.
With the terminals currently known in the art, which utilize an interface consisting of an eye-gaze tracking system, letters can be typed based on the direction of the user’s gaze. Via a suitable menu showing available commands, many modern eye-gaze tracking systems also allow, in addition to typing text, using an infrared control to manage a small number of devices of the housing unit, such as, for example, electric switches, doors and windows, equipped with an infrared receiver, provided that such controllable devices have been properly configured. Thus the user can, via a gaze-controllable interface, generate sounds, browse the Internet, write emails and, in a few cases, control such devices equipped with an infrared receiver.
The systems currently known in the art suffer from a number of drawbacks, which will be illustrated below.
A first drawback is limited user independence, since the current systems do not allow for global management of the infrared devices of the housing unit, but only local, short-range wireless interaction with such devices.
A second drawback comes from the fact that the infrared devices often require the intervention of skilled technicians for their configuration, since such devices are not smart enough to provide automatic configuration.
A further drawback of the known systems lies in the fact that such systems are not easily scalable, i.e. they do not permit a simple integration of different devices, such as, for example, devices for smart environment control, medical devices and so forth, which support, for example, the Wi-Fi, Bluetooth, UMTS, LTE and 5G communication protocols.
It is therefore one object of the present invention to solve these and other problems of the prior art, and particularly to provide a method, and the associated system and terminal, for controlling objects of a housing unit by eye-gaze tracking, in order to improve the independence of a user affected by motor disorders, e.g. within home and/or hospital environments.
It is another object of the present invention to provide a method, and the associated system and terminal, for controlling objects of a housing unit by eye-gaze tracking, which allow for global management of miscellaneous objects of a housing unit.
It is a further object of the present invention to provide a method, and the associated system and terminal, for controlling objects of a housing unit by eye-gaze tracking, wherein the objects of a housing unit can be easily configured and managed by the user without requiring the intervention of a skilled technician.
It is yet another object of the present invention to provide a method, and the associated system and terminal, for controlling objects of a housing unit by eye-gaze tracking, which allow for easy integration of heterogeneous objects of a housing unit.
In brief, the invention described herein consists of a method, and the associated system and terminal, for controlling objects of a housing unit by eye-gaze tracking via interaction between such objects and smart devices in accordance with the IoT (Internet of Things) paradigm.
Further advantageous features of the present invention are set out in the appended claims, which are an integral part of the present description.
The invention will now be described in detail through some non-limiting exemplary embodiments thereof, with particular reference to the annexed drawings, wherein:
- Figure 1 schematically shows a system comprising at least one terminal for controlling objects of a housing unit by eye-gaze tracking, according to one embodiment of the present invention;
- Figure 2 schematically shows a block diagram of the terminal of Figure 1;
- Figure 3 shows an illustrative flow chart of a method for controlling, by eye-gaze tracking, the objects of the housing unit shown in Figure 1;
- Figure 4 shows an illustrative flow chart of a method of eye-gaze tracking, in accordance with the present embodiment of the invention.
Figure 1 schematically shows a system 100 for controlling at least one object 191 of a housing unit 190 by eye-gaze tracking, said system 100 comprising a server 150, at least one terminal 110, and at least one IoT device 181 operatively connected to said object 191. The terminal 110, the server 150 and every IoT device 181 communicate with one another over a network 155 in accordance with the IoT paradigm, e.g. by means of one or more of the following interfaces: Wi-Fi, Bluetooth, UMTS, LTE, 5G.
The housing unit 190 comprises at least one object 191, and may generally comprise a plurality of objects 191, which may also be different from one another. In one embodiment of the present invention, the object 191 may comprise a fixture of the housing unit 190, like a door, a window, a shutter, a roller shutter, etc. In another embodiment of the invention, the object 191 may comprise a space of said housing unit 190, like a room, a hall, a store-room, etc.
In another embodiment of the present invention, the object 191 may comprise a household appliance of said housing unit 190. In a further embodiment of the invention, the object 191 may comprise a component of a system of said housing unit 190, e.g. the housing unit 190 may comprise at least one of the following systems: water system; air-conditioning system; heating system; electric system; surveillance system; alarm system. Depending on the system type, the component may be an electric switch, a valve, a tap, etc.
The object 191 is operatively connected to at least one IoT device 181, such that it is possible to interact with such object. Said IoT device 181 may comprise sensor means and/or actuator means: the sensor means are adapted to measure at least one value of one or more physical parameters that characterize a first operating state of the object 191, and may comprise, for example, thermal, acoustic, optical or mechanical sensors, accelerometers, gyroscopes, barometers, etc. The actuator means are adapted to produce a second operating state of said object 191, and may comprise, for example, servo mechanisms driven by electric motors or by electromechanical devices.
For example, the object 191 may have been designed to natively operate in accordance with the IoT paradigm, e.g. a smart appliance (smart refrigerator, smart washing machine), a smart electromedical device, a smart electric control unit for managing the electric system of the household unit 190, capable of connecting to the network 155, and so forth. In such a case, the first operating state of the object 191 will be characterized by the sensor means of the object 191 itself: for example, if the object 191 is a smart washing machine, the sensor means can determine the presence of the correct water pressure at the inlet of the smart washing machine, i.e. of the object 191; on the other hand, the actuator means may comprise, for example, an electrovalve and/or a pump of the smart washing machine, which can be suitably activated in order to carry out a wash cycle, thus producing the second operating state of the smart washing machine, i.e. of the object 191.
Alternatively, the object 191 may have been designed to not natively operate in accordance with the IoT paradigm, e.g. a common fixture; in such a case, at least one IoT device, e.g. a proximity sensor, may be installed on said object 191 to characterize the first operating state of the fixture, i.e. of the object 191. Said first operating state may correspond to, for example, the closed or open condition of the fixture. Another IoT device, comprising actuator means consisting of servo mechanisms, may be installed on the fixture to cause it to open or close, thus producing the second operating state of the fixture, i.e. of the object 191.
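The retrofitting scenario above can be sketched in Python as a fixture whose first operating state is read by a proximity-sensor service and whose second operating state is produced by a servo-actuator service. The class and method names are illustrative assumptions; the patent does not prescribe an implementation.

```python
class RetrofittedFixture:
    """Sketch of a non-native IoT object (e.g. a common fixture) retrofitted
    with a proximity sensor and a servo actuator. All names are illustrative."""

    def __init__(self):
        # First operating state, as characterized by the proximity sensor
        self.state = "closed"

    def read_state(self):
        """Service of the sensor-equipped IoT device: report open/closed."""
        return self.state

    def actuate(self, target_state):
        """Service of the actuator-equipped IoT device: drive the servo
        mechanisms to produce the second operating state."""
        if target_state not in ("open", "closed"):
            raise ValueError("unsupported state: " + target_state)
        self.state = target_state
        return self.state
```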
In another embodiment of the invention, the object 191 may be a space (a room) of the housing unit 190; in such a case, at least one IoT device may be installed, for example, on a wall of the room, e.g. an IoT control unit comprising sensor means for measuring the environmental characteristics of the object 191, so as to characterize the first operating state of the object 191 itself. Another IoT device, e.g. a smart air-conditioner, may be installed in the room to adjust a temperature and/or humidity value in such space, thus producing the second operating state of the space, i.e. of the object 191.
The server 150 is adapted to manage the system 100 by allowing information to be exchanged between at least one terminal 110 and at least one IoT device 181. In one embodiment, the server 150 may comprise a memory, an input/output interface, a communication module and a processor operatively connected to one another.
The memory of the server 150 is adapted to internally store instructions and information, or parts thereof, necessary for implementing the method according to the present invention, which will be hereafter described in detail with reference to Figures 3 and 4. Such information may comprise, for example, a virtual model 290 of said housing unit 190, or at least a part thereof, and a set of IoT services 281 of at least one IoT device 181 operatively connected to at least one object 191.
The virtual model 290 is created by using the Building Information Modeling (BIM) technique. BIM is a method that, starting from a three-dimensional model of the housing unit 190, provides a virtual reconstruction of the housing unit 190 through the use of virtual objects that are equivalent to the objects 191. Such virtual objects are not just simple geometries, but can relate to one another. Let us consider, for example, a wall of the housing unit 190: in accordance with the BIM paradigm, it has a geometry, i.e. structural information such as thickness, height and conductance, as well as various relations with the other objects 191. If the wall is modified, the entire model of the housing unit will be modified as well, taking into account the changes made to such wall. Therefore, the BIM model contains many types of information that make it much more informative than a common three-dimensional model. Thanks to the great deal of information contained therein, its function is not limited to the stages of designing and building the housing unit 190, but continues throughout the life cycle of the latter. Said virtual model 290, to be understood as a database of objects and their properties, can be manipulated to include information about the operating state of the objects 191 by means of the IoT devices 181 operatively connected to the objects 191.
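As a minimal sketch of the wall example above, a BIM virtual object can be thought of as a database record holding geometry, structural properties, relations to other objects, and a slot for the IoT-reported operating state. Every field name and identifier below is invented for illustration; the patent does not specify a schema.

```python
# Hypothetical record for one virtual object of the virtual model 290.
wall = {
    "object_id": "WALL-03",                       # illustrative identifier
    "geometry": {"thickness_m": 0.30, "height_m": 2.70},
    "conductance_W_m2K": 0.35,                    # structural property
    "relations": ["ROOM-01", "WINDOW-01"],        # objects adjoining this wall
    "operating_state": None,                      # later filled from IoT sensor data
}

def set_operating_state(virtual_object, state):
    """Manipulate the model record to include the IoT-reported state."""
    virtual_object["operating_state"] = state
    return virtual_object
```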
As previously described, the memory of the server 150 can store the set of IoT services 281 of at least one IoT device 181 operatively connected to at least one object 191. In the present description, an IoT service comprises one or more functions executable by each IoT device 181, e.g. functions concerning the measurements of the sensor means of the IoT device 181, functions for controlling the actuator means of the IoT device 181, or communication functions of the IoT device 181. An IoT device 181 may have one or more IoT services, i.e. the set of IoT services 281, associable with the object 191 operatively connected to the IoT device 181. The set of IoT services 281 may comprise a service of measuring the first operating state of the object 191 and/or a service of producing the second operating state of the object 191.
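The notion of a set of IoT services per device, described above, can be sketched as a simple data structure. The window example and the split between measuring and actuating services follow the text; the concrete identifiers and field names are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class IoTService:
    """One function executable by an IoT device."""
    service_id: str      # unique alphanumerical identifier (illustrative values below)
    description: str
    kind: str            # "sensor" (measure first state) or "actuator" (produce second state)

@dataclass
class IoTDevice:
    device_id: str
    services: list = field(default_factory=list)   # the set of IoT services

# A window served by a sensing device and an actuating device
window_sensor = IoTDevice("dev-1", [
    IoTService("SVC-STATE", "report open/closed state of the window", "sensor"),
])
window_actuator = IoTDevice("dev-2", [
    IoTService("SVC-OPEN", "open the window", "actuator"),
    IoTService("SVC-CLOSE", "close the window", "actuator"),
])
```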
The input/output interface of the server 150 makes it possible to interface, or connect, an administrator user for managing and configuring the server 150. The input/output interface of the server 150 may comprise, for example, a keyboard, a screen, a network adapter, etc. The communication module of the server 150 provides communication between at least one terminal 110 and at least one IoT device 181 over the network 155 (Figure 1). The communication module of the server 150 may comprise, for example, an Ethernet interface, a WiFi interface, a mobile network interface (GSM, UMTS, LTE, 5G), etc.
In another embodiment of the invention, all the operations executed by the server 150 may be carried out by means of a plurality of computers operatively connected to one another over a geographically distributed computer network.
Figure 2 schematically shows a block diagram of the terminal 110 for controlling at least one object 191 of the housing unit 190 by eye-gaze tracking, in accordance with the present embodiment of the invention. Said terminal 110 comprises communication means 130, input means 120, output means 115, storage means 125 and processing means 135; such means can be operatively connected to one another via a communication bus 201.
The communication means 130 provide communication between the terminal 110 and the server 150 over the network 155 (Figure 1). The communication means 130 may comprise, for example, an Ethernet interface, a WiFi interface, a mobile network interface (GSM, UMTS, LTE, 5G), etc.
The terminal 110 is adapted to acquire from the server 150, via the communication means 130, the virtual model 290 of the housing unit 190 or at least a part thereof, such model being created in accordance with the Building Information Modeling (BIM) technique. The terminal 110 is adapted to acquire from the server 150 the set of IoT services 281 of at least one IoT device 181, such set of IoT services 281 being associated with at least one object 191, wherein the virtual model 290 is acquired in such a way as to include said at least one object 191.
The input means 120 allow the user to send commands to the terminal 110 by means of an eye-gaze tracking system comprising, for example, an infrared (IR) camera. In this manner, the user can control with his/her eyes a pointer displayed on the output means 115 to select the IoT services 281 associated with the object 191 operatively connected to the IoT device 181. The terminal 110 is adapted to select, via the input means 120, at least one IoT service of the set of IoT services 281 upon receiving at least one command from the user, issued by the latter by eye-gaze tracking. The terminal 110 is adapted to transmit to the server 150, via the communication means 130, at least one control signal for said at least one IoT device 181, wherein the control signal corresponds to at least one selected IoT service. The control signal may be implemented in accordance with available network and IoT technologies known to those skilled in the art; for example, the control signal may be formatted in compliance with a predefined XML schema and transported over the network 155 in compliance with the TCP/IP protocol.
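A control signal of the kind just described could be serialized as follows. The element and attribute names are assumptions, since the patent only states that a predefined XML schema is used and that transport occurs over TCP/IP.

```python
import xml.etree.ElementTree as ET

def build_control_signal(object_id: str, service_id: str) -> bytes:
    """Serialize a control signal as XML (illustrative schema)."""
    root = ET.Element("controlSignal")
    ET.SubElement(root, "object", id=object_id)    # target object 191
    ET.SubElement(root, "service", id=service_id)  # selected IoT service
    return ET.tostring(root, encoding="utf-8")

# "WINDOW-01" is a hypothetical object identifier; "ID1" follows the
# window closing service identifier used as an example in the description.
payload = build_control_signal("WINDOW-01", "ID1")
# The payload could then be sent to the server over a TCP socket, e.g.:
#   with socket.create_connection((server_host, server_port)) as s:
#       s.sendall(payload)
```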
The input means 120 may further comprise conventional input devices such as, for example, a keyboard, a mouse, a touchscreen, a video camera, a microphone, etc. In addition, the input means may comprise means for tracking the indoor location of the terminal 110, so that the objects 191 of the housing unit 190 can be located within a predefined distance from the terminal 110 and sorted based on their distance from the terminal 110.
The output means 115 allow the user to view the information outputted by the terminal 110, and may comprise a screen, a viewer, a loudspeaker, etc. The terminal 110 is adapted to provide, via the output means 115, a first three-dimensional representation of at least one object 191 based on the virtual model 290, wherein the first three-dimensional representation comprises the set of IoT services 281. For example, the output means 115 make it possible to represent the virtual model 290 in such a way that the user can navigate through the virtually re-created housing unit 190.
Upon receiving a command from the user, e.g. via the above-described eye-gaze tracking system, the terminal 110 is adapted to provide, via the output means 115, a second three- dimensional representation of at least one object 191 based on the virtual model 290 and based on at least one selected IoT service. In one embodiment of the invention, the output means 115 may represent animations on the corresponding virtual objects of the object 191; for example, if the object 191 is a light which is off and which is then turned on upon reception of the command from the user, the output means 115 may generate an animation (light going from off to on) for the virtual object corresponding to the object 191.
The storage means 125 are adapted to store the information and the instructions of the terminal 110, in accordance with the present embodiment of the invention, and may comprise, for example, a flash-type solid-state memory. Such information may comprise the data coming from the server 150, e.g. the virtual model 290, or at least a part thereof, and the set of IoT services 281 of at least one IoT device 181 associated with at least one object 191 included in the virtual model 290. The instructions stored in the storage means 125 will be described in detail later on with reference to the flow charts of Figure 3 and Figure 4.
The processing means 135 are adapted to process the information and the instructions stored in the storage means 125, and may comprise, for example, a multicore ARM processor, an Arduino microcontroller, and so on. The processing means 135 can co-operate with the communication means 130, the input means 120, the output means 115 and the storage means 125 over the communication bus 201.
In one embodiment of the invention, the output means 115 comprise a screen, and the input means 120 comprise at least one IR camera to provide the terminal 110 with eye-gaze tracking functionality, wherein:
• the IR camera is adapted to acquire the user’s ocular movements and is adapted to generate a first sequence of two-dimensional coordinates corresponding to the ocular movements on the screen;
    • the terminal 110 is adapted to sample the first sequence of two-dimensional coordinates according to a predetermined criterion, thereby obtaining a second sequence of two-dimensional coordinates which is shorter than the first sequence of two-dimensional coordinates;
• the terminal 110 is adapted to determine a third sequence of two-dimensional coordinates by interpolating pairs of adjacent two-dimensional coordinates of the second sequence of two-dimensional coordinates;
• the terminal 110 is adapted to determine a fourth sequence of two-dimensional coordinates by removing from the third sequence of two-dimensional coordinates every two-dimensional coordinate whose distance from an adjacent two-dimensional coordinate exceeds a predefined length value;
• the screen is adapted to display a pointer whose position on the screen is determined by the two-dimensional coordinates of the fourth sequence of two-dimensional coordinates;
• the terminal 110 is adapted to select at least one IoT service of the set of IoT services 281 when the pointer remains on at least one IoT service longer than a predefined time value.
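The sampling, interpolation, outlier-removal and dwell-selection steps listed above can be sketched in Python as follows. The concrete choices are illustrative assumptions: decimation by a factor of two (an example given later in the description), midpoint interpolation, and a consecutive-sample count standing in for the predefined time value.

```python
import math

def decimate(points):
    """Sampling step: keep every other sample to shorten the sequence."""
    return points[::2]

def interpolate(points):
    """Interpolation step: insert the midpoint between adjacent samples."""
    out = []
    for (x1, y1), (x2, y2) in zip(points, points[1:]):
        out.append((x1, y1))
        out.append(((x1 + x2) / 2, (y1 + y2) / 2))
    out.append(points[-1])
    return out

def drop_jumps(points, max_dist):
    """Outlier-removal step: discard samples farther than max_dist
    from their predecessor (e.g. saccadic jumps)."""
    out = [points[0]]
    for p in points[1:]:
        if math.dist(out[-1], p) <= max_dist:
            out.append(p)
    return out

def dwell_select(points, region, min_samples):
    """Selection step: a service is selected when the pointer stays inside
    its on-screen region (x0, y0, x1, y1) for more than min_samples
    consecutive samples."""
    x0, y0, x1, y1 = region
    run = 0
    for (x, y) in points:
        run = run + 1 if x0 <= x <= x1 and y0 <= y <= y1 else 0
        if run > min_samples:
            return True
    return False
```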
With reference to Figure 3, the following will describe a method for controlling, by eye-gaze tracking, the objects 191 of the housing unit 190 in accordance with the present embodiment of the invention.
At step 300 an initialization phase is performed, wherein the terminal 110 is initialized. During this phase, for example, the processing means 135 execute all those operations which are necessary for activating the communication means 130, the input means 120, the output means 115 and the storage means 125.
At step 310 an acquisition phase is carried out, wherein the terminal 110 acquires from the server 150, via the communication means 130, the virtual model 290 of said housing unit 190 or at least a part thereof, the virtual model 290 being created in accordance with the Building Information Modeling (BIM) technique. In addition, the terminal 110 acquires from the server 150 the set of IoT services 281 of at least one IoT device 181, the set of IoT services 281 being associated with at least one object 191, and the acquired virtual model 290 including at least one object 191. For example, the virtual model 290 may include information compliant with the BIM paradigm about a window, i.e. the object 191. To such window a first IoT device comprising sensor means and a second IoT device comprising actuator means may be connected. The first IoT device can communicate to the server 150 the open or closed state of the window, i.e. the first operating state of the object 191. The server 150 can update the virtual model 290 by including the first operating state of the window; the virtual model 290 is then acquired by the terminal 110 during this phase. The terminal 110 acquires also the set of IoT services 281 associated with the window, e.g. the window closing IoT service and the window opening IoT service. The association between the IoT services of the set of IoT services 281 and the object 191 can be made by means of one or more unique alphanumerical identifiers; for example, the window closing service may have a first identifier ID1 and the window opening service may have a second identifier ID2 which can, through an association table, be associated with the window, the latter being also identified by means of a predefined alphanumerical object identifier.
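The association table just described can be sketched as a simple mapping. The identifiers ID1 and ID2 come from the example in the text; the object identifier "WINDOW-01" is invented for illustration.

```python
# Hypothetical association table: IoT service identifier -> object identifier.
ASSOCIATION_TABLE = {
    "ID1": "WINDOW-01",   # window closing IoT service
    "ID2": "WINDOW-01",   # window opening IoT service
}

def services_for(object_id):
    """Return all IoT service identifiers associated with a given object."""
    return sorted(sid for sid, oid in ASSOCIATION_TABLE.items() if oid == object_id)
```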
At step 320 a first output phase is carried out, wherein output means 115 create a first three-dimensional representation of at least one object 191 on the basis of the virtual model 290, wherein the first three-dimensional representation comprises the set of IoT services 281. During this phase, for example, the screen of the terminal 110, i.e. the output means 115, displays a three-dimensional view of the window, i.e. the virtual object corresponding to the real window, which is represented in accordance with the first operating state. Along with the representation of the virtual window, the set of IoT services 281 associated with the window, i.e. the window opening and closing services, are also shown; for example, the user can display the set of IoT services 281 in a pull-down menu.
At step 330 a selection phase is carried out, wherein the input means 120 receive at least one command from the user by eye-gaze tracking, for selecting at least one IoT service of said set of IoT services 281. For example, the user may select, by eye-gaze tracking, the window closing IoT service from the pull-down menu. In one embodiment of the invention, eye-gaze tracking may be effected in accordance with the steps shown in Figure 4, which will be described in more detail below.
At step 340 a transmission phase is carried out, wherein the terminal 110 transmits to the server 150, via the communication means 130, at least one control signal 140 for controlling said at least one IoT device 181, the control signal 140 corresponding to at least one selected IoT service. During this phase, for example, the terminal 110 may send the window closing command to the server 150, which may then relay the command to the second IoT device that controls the actuator means in order to close the window, thus producing the second operating state of the object 191. The control signal may be compliant with available network and IoT technologies known to those skilled in the art; for example, the control signal may be formatted in accordance with a predefined XML schema and transported over the network 155 in accordance with the TCP/IP protocol. During this phase, the server 150 and/or the terminal 110 can update the virtual model 290 with the current state of the window, i.e. of the object 191 for which the IoT service has been selected.
At step 350 a second output phase is carried out, wherein the output means 115 create a second three-dimensional representation of at least one object 191 on the basis of the virtual model 290 and on the basis of at least one selected IoT service. During this phase, for example, the screen, i.e. the output means 115, may show a three-dimensional view of the window being closed, e.g. by means of a suitable animation stored in the storage means 125 and associated with the identifier ID1 of the window closing IoT service.
At step 360 the terminal 110 checks if control of the objects 191 of the housing unit 190 should be continued, e.g. based on a suitable user command. If so, step 310 will be executed; otherwise, the process will go to step 370.
At step 370 the terminal 110 executes a termination phase, wherein all the operations necessary for completing the acquisition phase, the first output phase, the selection phase, the transmission phase and the second output phase are carried out. During this step, the terminal 110 may signal its own inoperative state, e.g. by means of luminous indicators, such as LEDs included in the terminal 110 itself.
With reference to Figure 4, the following will describe an exemplary eye-gaze tracking method in accordance with a further embodiment of the invention, wherein the output means 115 comprise a screen and the input means 120 comprise at least one IR camera. The values inputted to the eye-gaze tracking system are arrays of two values corresponding to the two coordinates of the point indicated by the user’s gaze relative to the screen. Several checks are performed on such points to ensure system robustness. The eye-gaze tracking method may comprise:
- a checking step 400, in which a preliminary check phase is carried out. During this step, the terminal 110 verifies the presence of the user in front of the terminal 110, the cleanliness of the lens of the IR camera, etc.;
- a first step 410, in which the IR camera acquires the user’s ocular movements and generates a first sequence of two-dimensional coordinates corresponding to the ocular movements on said screen;
- a second step 420, in which the first sequence of two-dimensional coordinates is sampled according to a predetermined sampling criterion, thereby obtaining a second sequence of two-dimensional coordinates which is shorter than the first sequence of two-dimensional coordinates. The sampling criterion may be, for example, the alternate elimination of values contained in the first sequence of two-dimensional coordinates, so as to obtain the second sequence of two-dimensional coordinates with half the number of elements compared with the first sequence. This will advantageously make the position of the pointer less sensitive to noise generated, for example, by saccadic movements of the user’s eyes;
- a third step 430, in which a third sequence of two-dimensional coordinates is determined by interpolating pairs of adjacent two-dimensional coordinates of the second sequence of two-dimensional coordinates. This will advantageously make the curve representing the position of the pointer on the screen as smooth as possible;
- a fourth step 440, in which a fourth sequence of two-dimensional coordinates is determined by removing from the third sequence of two-dimensional coordinates every two-dimensional coordinate whose distance from an adjacent two-dimensional coordinate exceeds a predefined length value, e.g. empirically defined on the basis of the movements of the user’s eyes. This will advantageously make the curve representing the position of the pointer on the screen as regular as possible;
- a fifth step 450, in which a pointer is represented on the screen and positioned in accordance with the two-dimensional coordinates of the fourth sequence of two-dimensional coordinates;
- a sixth step 460, in which said at least one IoT service of said set of IoT services 281 is selected when the pointer remains on said at least one IoT service longer than a predefined time value, e.g. empirically defined on the basis of the movements of the user’s eyes;
- a verification step 470, in which the terminal 110 verifies if it is necessary to continue tracking the user’s gaze, e.g. based on a predetermined position of the user’s gaze maintained for a predefined time. If so, step 410 will be executed; otherwise, the process will go to step 480;
- a termination step 480, in which the terminal 110 executes all the operations necessary for completing the checking step and said first, second, third, fourth, fifth and sixth steps.
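The processing of steps 420 to 460 can be sketched as follows. This is a hypothetical illustration, not part of the patent: the function names, the rectangle-based target model and the concrete parameter values are assumptions introduced solely to make the pipeline concrete.

```python
# Hypothetical sketch of the gaze-processing pipeline of steps 420-460:
# downsample, interpolate, filter out distance outliers, dwell-select.
from math import dist  # Euclidean distance, Python 3.8+


def downsample(points):
    """Step 420: alternate elimination -- keep every other sample,
    halving the sequence length to reduce noise sensitivity."""
    return points[::2]


def interpolate(points):
    """Step 430: insert the midpoint between each pair of adjacent
    coordinates to smooth the pointer trajectory."""
    out = []
    for a, b in zip(points, points[1:]):
        out.append(a)
        out.append(((a[0] + b[0]) / 2, (a[1] + b[1]) / 2))
    if points:
        out.append(points[-1])
    return out


def remove_jumps(points, max_len):
    """Step 440: drop every coordinate farther than max_len from the
    previously kept coordinate (e.g. saccade-induced spikes)."""
    if not points:
        return []
    kept = [points[0]]
    for p in points[1:]:
        if dist(kept[-1], p) <= max_len:
            kept.append(p)
    return kept


def dwell_select(samples, targets, dwell_time):
    """Step 460: a target (IoT service) is selected once the pointer
    stays inside its screen rectangle longer than dwell_time seconds.

    samples: ordered list of (t, (x, y)) pairs, t in seconds.
    targets: dict mapping a service name to an (x0, y0, x1, y1) rectangle.
    """
    entered = {}  # service name -> time the pointer entered its rectangle
    for t, (x, y) in samples:
        for name, (x0, y0, x1, y1) in targets.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                entered.setdefault(name, t)
                if t - entered[name] >= dwell_time:
                    return name
            else:
                entered.pop(name, None)  # pointer left: reset the dwell timer
    return None
```

In this sketch the length threshold of step 440 and the dwell time of step 460 correspond to the empirically defined values mentioned in the description; they would be tuned to the individual user's eye movements.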
In another embodiment of the invention, the above-described method may be implemented, for example, by means of a computer program product readable by at least one terminal 110, such as, for example, a smartphone, a tablet or a laptop, in order to execute the method of the present invention. The computer program product, e.g. a CD, a flash memory, a magnetic tape, etc., may comprise a set of instructions that, when loaded into the memory of the terminal 110, are adapted to execute the method according to the present invention, such instructions being encoded in accordance with a programming language that can be interpreted by the terminal 110.

The advantages of the present invention are apparent from the above description.
The present invention advantageously provides a method, and the associated system and terminal, for controlling objects of a housing unit by eye-gaze tracking, which, through a virtual representation of the objects of such housing unit, improve the self-sufficiency of a user affected by motor disorders within home and/or hospital environments.
A further advantage of the present invention lies in the fact that it provides a method, and the associated system and terminal, for controlling objects of a housing unit by eye-gaze tracking, which permit integrating the BIM paradigm and the IoT paradigm, thereby advantageously allowing for global management of heterogeneous objects of a housing unit. Due to integration of the BIM paradigm and the IoT paradigm, the method and the associated system and terminal for controlling objects of a housing unit by eye-gaze tracking of the present invention advantageously allow the user to easily configure and manage the objects of the housing unit without requiring the intervention of a skilled technician.
A further advantage of the present invention lies in the fact that it provides a method, and the associated system and terminal, for controlling objects of a housing unit by eye-gaze tracking which, through integration of the BIM paradigm and the IoT paradigm, allow for easy (plug-and-play) integration of heterogeneous objects of a housing unit.
Of course, without prejudice to the principle of the present invention, the forms of embodiment and the implementation details may be extensively varied from those described and illustrated herein merely by way of non-limiting example, without however departing from the protection scope of the present invention as set out in the appended claims.

Claims

1. Method for controlling at least one object (191) of a housing unit (190) by eye-gaze tracking, said object (191) being operatively connected to at least one IoT device (181), said method comprising:
- an acquisition phase, wherein a terminal (110) acquires from a server (150), via communication means (130), a virtual model (290) of said housing unit (190) or at least a part thereof, said virtual model (290) being created in accordance with the Building Information Modeling (BIM) technique, and wherein said terminal (110) acquires from said server (150) a set of IoT services (281) of said at least one IoT device (181), said set of IoT services (281) being associated with said at least one object (191), and said acquired virtual model (290) including said at least one object (191);
- a first output phase, wherein output means (115) of said terminal (110) create a first three-dimensional representation of said at least one object (191) on the basis of said virtual model (290), said first three-dimensional representation comprising said set of IoT services (281);
- a selection phase, wherein input means (120) receive at least one command from a user by eye-gaze tracking, in order to select at least one IoT service of said set of IoT services (281);
- a transmission phase, wherein said terminal (110) transmits to said server (150), via said communication means (130), at least one control signal (140) for controlling said at least one IoT device (181), said at least one control signal (140) corresponding to said at least one selected IoT service;
- a second output phase, wherein said output means (115) of said terminal (110) create a second three-dimensional representation of said at least one object (191) on the basis of said virtual model (290) and on the basis of said at least one selected IoT service.
2. Method according to claim 1, wherein said object (191) comprises a door or a window of said housing unit (190).
3. Method according to claim 1, wherein said object (191) comprises a room of said housing unit (190).
4. Method according to claim 1, wherein said object (191) comprises a household appliance of said housing unit (190).
5. Method according to claim 1, wherein said object (191) comprises a component of a system of said housing unit (190).
6. Method according to claim 5, wherein said system of said housing unit (190) comprises at least one of the following systems: water system; air-conditioning system; heating system; electric system; surveillance system; alarm system.
7. Method according to one or more of claims 1 to 6, wherein said at least one IoT device (181) comprises sensor means and/or actuator means.
8. Method according to one or more of claims 1 to 7, wherein said sensor means measure at least one value of one or more physical parameters that characterize a first operating state of said object (191).
9. Method according to one or more of claims 1 to 8, wherein said actuator means produce a second operating state of said object (191).
10. Method according to one or more of claims 1 to 9, wherein said set of IoT services (281) comprises a service for measuring said first operating state of said object (191), and/or a service for producing said second operating state of said object (191).
11. Method according to one or more of claims 1 to 10, wherein said output means (115) comprise a screen and said input means (120) comprise at least one IR camera, and wherein said eye-gaze tracking comprises:
• a first step in which said IR camera acquires the user’s ocular movements and generates a first sequence of two-dimensional coordinates corresponding to the ocular movements on said screen;
• a second step in which said first sequence of two-dimensional coordinates is sampled according to a predetermined sampling criterion, thereby obtaining a second sequence of two-dimensional coordinates which is shorter than the first sequence of two-dimensional coordinates;
• a third step in which a third sequence of two-dimensional coordinates is determined by interpolating pairs of adjacent two-dimensional coordinates of said second sequence of two-dimensional coordinates;
• a fourth step in which a fourth sequence of two-dimensional coordinates is determined by removing from the third sequence of two-dimensional coordinates every two-dimensional coordinate whose distance from an adjacent two-dimensional coordinate exceeds a predefined length value;
• a fifth step in which a pointer is represented on said screen and positioned in accordance with the two-dimensional coordinates of said fourth sequence of two-dimensional coordinates;
• a sixth step in which said at least one IoT service of said set of IoT services (281) is selected when said pointer remains on said at least one IoT service longer than a predefined time value.
12. System (100) for controlling at least one object (191) of a housing unit (190) by eye-gaze tracking, said system (100) comprising a server (150), at least one terminal (110) and at least one IoT device (181) operatively connected to said object (191), said system (100) being adapted to implement the method according to one or more of claims 1 to 11.
13. System (100) according to claim 12, wherein said at least one terminal (110), said server (150) and each IoT device (181) are adapted to communicate with one another via a network (155) according to the IoT paradigm.
14. Terminal (110) for controlling at least one object (191) of a housing unit (190) by eye-gaze tracking, said terminal (110) comprising communication means (130), input means (120), output means (115), storage means (125) and processing means (135) adapted to implement the method according to one or more of claims 1 to 11.
15. Computer program product adapted to be read by at least one terminal (110), said computer program product comprising a set of instructions which, when loaded into the memory of said terminal (110), are adapted to execute the method according to one or more of claims 1 to 11.
PCT/IB2022/053736 2021-04-23 2022-04-21 Method and system for controlling objects of a housing unit by eye-gaze tracking WO2022224189A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IT102021000010415A IT202100010415A1 (en) 2021-04-23 2021-04-23 METHOD AND SYSTEM FOR CHECKING OBJECTS OF A HOUSING UNIT THROUGH EYE TRACKING
IT102021000010415 2021-04-23

Publications (1)

Publication Number Publication Date
WO2022224189A1

Family

ID=77021924

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2022/053736 WO2022224189A1 (en) 2021-04-23 2022-04-21 Method and system for controlling objects of a housing unit by eye-gaze tracking

Country Status (2)

Country Link
IT (1) IT202100010415A1 (en)
WO (1) WO2022224189A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3572914A2 (en) * 2018-05-24 2019-11-27 TMRW Foundation IP & Holding S.A.R.L. Two-way real-time 3d interactive operations of real-time 3d virtual objects within a real-time 3d virtual world representing the real world
US20200304375A1 (en) * 2019-03-19 2020-09-24 Microsoft Technology Licensing, Llc Generation of digital twins of physical environments

Also Published As

Publication number Publication date
IT202100010415A1 (en) 2022-10-23

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22723771; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 22723771; Country of ref document: EP; Kind code of ref document: A1)