WO2023150849A1 - Device and system for controlling electronic interfaces - Google Patents

Device and system for controlling electronic interfaces

Info

Publication number
WO2023150849A1
WO2023150849A1 (PCT/BR2023/050045)
Authority
WO
WIPO (PCT)
Prior art keywords
user
head
movements
controlling electronic
electronic interfaces
Prior art date
Application number
PCT/BR2023/050045
Other languages
English (en)
Portuguese (pt)
Inventor
Adriano RABELO ASSIS
Henrique PIRES FRANCO LATORRE
Original Assignee
Tix Tecnologia Assistiva Ltda
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tix Tecnologia Assistiva Ltda filed Critical Tix Tecnologia Assistiva Ltda
Publication of WO2023150849A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61FFILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F4/00Methods or devices enabling patients or disabled persons to operate an apparatus or a device not forming part of the body 
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01HELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
    • H01H3/00Mechanisms for operating contacts
    • H01H3/02Operating parts, i.e. for operating driving mechanism by a mechanical force external to the switch
    • H01H3/14Operating parts, i.e. for operating driving mechanism by a mechanical force external to the switch adapted for operation by a part of the human body other than the hand, e.g. by foot
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01HELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
    • H01H35/00Switches operated by change of a physical condition

Definitions

  • the present invention concerns a new device and system that captures and processes head movements and facial gestures through infrared and inertial motion sensors, controlling electronic interfaces over Bluetooth on any device equipped with an operating system, regardless of brand or model. It is a hybrid, fully configurable solution for controlling electronic interfaces that replaces computational peripherals such as controllers, mice and keyboards. It is particularly useful as a device adapted to people with disabilities who cannot use common computational peripherals to control equipment with electronic interfaces, such as computers, tablets, smartphones and even televisions. Furthermore, the equipment is substantially compact and can easily be attached to any headgear, such as headsets, glasses, headbands or caps.
  • the present invention, which dispenses with external accessories, webcams and hand or arm movements, is more versatile, customizable and efficient than other products on the market: it does not depend on ambient conditions such as lighting quality, can be attached to any headgear, is not limited to a single device or operating system, and can be adapted to the needs, limitations or disabilities of each user.
  • Software-based solutions include programs that use front cameras built into the device to be controlled, such as a computer, to capture users' head movements and transform them into movement equivalent to that of a screen cursor.
  • Hardware-based solutions have physical components equipped with electronic sensors (whether bioelectric, such as electromyography and electroencephalography, or motion-based, such as infrared, distance, gyroscope or accelerometer sensors) whose function is to capture intentional movements of the user's head and/or face and transform them into commands for the devices to be controlled.
  • Hybrid solutions are those that combine hardware, housing one or more of the aforementioned sensors or a camera with infrared lighting, with software that interprets and processes the detected head movements of the user.
  • Patent application No. PI 1004279-2: a device whose technical approach is electromyography, implemented in a band on the user's forehead to capture movements through piezoelectric sensors, whose signals are later processed by the control system.
  • Patent application No. BR 10 2018 071412: representative of a device and method to capture, process and classify eye movements, head movements and brain activity through electroencephalography.
  • Patent application No. BR 10 2014 025534 6: a system composed of a camera, positioned in front of the user's lower lip, accompanied by a cursor-control application that moves the cursor in the electronic interface.
  • Patent application No. WO2010064987 describes a device that uses a "mouse ball" to capture the movements of the mouth, tongue and lips to select keyboard commands and a headset in the form of an arc or strap around the forehead.
  • Patent application No. WO2015159108A2 describes glasses that detect the movements of the user's head and transform them into cursor movements on the screen of an electronic device, through connection via USB.
  • Patent application No. WO2020044363A1: representative of wireless glasses with mouse and keyboard functions.
  • the device includes an eyeglass frame with lenses to cover the wearer's eyes, a sensor to detect eye blinking and a sensor to detect head movements.
  • the connection is made wirelessly via Bluetooth.
  • a mouse click is detected by means of a wink.
  • the head motion sensor has an accelerometer and a gyroscope.
  • Patent application No. KR20210067149A: representative of a glasses-mouse system in which the mouse cursor is operated based on the rotational movement of the wearer's head.
  • the glasses capture head movement by means of acceleration and gyroscope sensors located on the left side of the frame, and communication with the computer is carried out using I²C technology.
  • Patent Application No. US4486630: representative of a head accessory operated by movements of the chin and eyebrows, which allows the use of computers and video game consoles by people with limited movement. The capture of movements is performed through electrical connections, more specifically through electrodes that feed signals to a computer.
  • the present invention differs from those described in items “i”, “ii” and “iii”, as it does not use bioelectric capture but infrared and inertial sensors. It also differs from the invention described in item “viii”, which captures movement electrically by means of electrodes.
  • the present invention differs from that described in item "v", as the latter captures the movements of the user's head through the combination of two different types of sensors: (i) a digital level sensor, used to detect the vertical movement of the head, and (ii) a digital compass, responsible for detecting the lateral inclination of the head; whereas the present invention uses infrared motion and inertial sensors.
  • the present invention is also distinguished from the inventions described in items “v” and “viii”: the first is limited to devices with a USB connection and the second to computers and video game consoles, while the invention described here can connect to any device that has Bluetooth connectivity and an operating system.
  • the present invention is not composed of a glasses frame, being a device that can be attached to any head or face accessory of the user.
  • the present invention differs, as can be seen from the explanation below, by not limiting the capture of the user's movements to one or more fixed focal points, which in the case of invention “vi” are the head and eyes, and in the case of invention “viii” are the chin and eyebrows.
  • the variety of focal points used by the present invention is one of the characteristics that makes it innovative in the face of the assistive technology market.
  • the device that is the object of the present application is the only one that can be attached to any head accessory, including those already owned by the user, and this versatility extends to the technology used in the invention.
  • This set of claimed adaptable characteristics therefore allows the device to be adapted to the different motor limitations of users, who can choose which part of the face the sensors will focus on (for example: eyes, mouth, chin, eyebrow, cheek or jaw), simply by positioning the device and configuring the optional web system according to their preferences.
  • the way in which the embedded system, together with the configurations of the web system, allows the operating characteristics of the invention to be modified (the operating modes, the characterization of gestures and the operating parameters) without modifying its constructive characteristics is the main difference between the aforementioned invention and other products on the market: in other solutions, the firmware is written to rigidly execute the same instructions whenever it faces the same set of circumstances. On those devices, for example, if the person tilts the head towards the right shoulder, the non-adaptive firmware will always send the same mouse command to the connected device (such as moving the pointer to the right).
  • in the present invention, this same gesture can make the device issue different commands, such as right-clicking, scrolling down the screen, moving the pointer to the right or even doing nothing, depending on the configuration previously chosen by the user.
  • This characteristic of versatility also allows the invention to be used as a control tool for other machines equipped with an operating system, in applications that can go far beyond accessibility, such as the control of intelligent vehicles and drones, for example.
  • the device that is the object of this patent was created to solve numerous problems in the current state of the art, mainly regarding versatility and adaptation to the particular conditions of use of this type of equipment, including the disability and/or aesthetic preferences of each user. The invention therefore goes beyond the known assistive technologies in the field of human-computer interaction, guaranteeing the versatility of its device in terms of aesthetics, practical use and technological function.
  • the present device adopts an inertial electronic sensor, which records variations in acceleration in any direction. Since acceleration is a physical quantity independent of factors such as ambient lighting, light does not interfere with the detection of the user's movements, unlike software-based, hardware-based and hybrid solutions that use cameras to capture the user's movements.
  • Hybrid and hardware solutions are divided into sensors embedded in wearable accessories and dedicated systems with infrared lighting and specialized software; even so, they have one or more of the following characteristics, which make their use less convenient for their users:
  • they depend on additional external accessories (known as triggers) for the user to perform some specific functions, such as mouse clicks; examples are bite keys (special buttons made to be positioned between the person's teeth) or physical buttons triggered with head strikes (e.g. patent US4486630);
  • Figure 1. Cabinet of the invention attached to a glasses frame (side opposite to the user's face), indicating the items:
  • Figure 2. Cabinet of the invention attached to a glasses frame (side facing the user's face), indicating the items:
  • d ABS plastic cabinet in the form of a rectangular prism, with dimensions of 50mm in length by 20mm in height and 18mm in width, with walls between 1mm and 5mm thick, custom-made to house the PCB;
  • e Flexible flap: rectangular ABS plastic cover, only 0.3mm thick;
  • f Flexible steel wire, 2.5mm in diameter, covered by a plastic sheath, forming a rod that connects the removable clip “k” to the cabinet “d”;
  • g Space between the flaps of the mobile clip where the wire described in “f” is attached;
  • q Cap brim.
  • FIGS. 6 and 7. Interfaces of the web system, in which the user of the invention can adjust it, according to his preferences and limitations.
  • the invention is a system composed of a device, an embedded system and a web configuration system.
  • the device has a microcontroller chip that operates as its "brain". This microcontroller runs an embedded system (Figure 9) in which all the operations the device must perform, and how they must be performed, are programmed.
  • the device contains a motion sensor with 6 axes.
  • when the device is in operation and properly attached to a head garment or to the user's face, the motion sensor continuously measures acceleration and rotational orientation on each of its 6 axes, at an approximate rate of 1,000 measurements per second.
  • This sequence of instantaneous measurements is converted into digital values and constantly sent from the motion sensor to the microcontroller, which runs a sophisticated embedded system (Figure 9) responsible for digitally processing these values, in real time, to determine the instantaneous variation of the measurements on each axis.
  • This embedded system combines, at all times, the values sent by the motion sensor to determine whether there is a significant change in the position and orientation of the device (and therefore whether the position of the user's head is changing). If so, the embedded system calculates in which direction the change has taken place and at what speed, and determines how far (in pixels) and in which direction the mouse pointer should shift across the screen.
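As an illustration of the displacement calculation described above, the following is a minimal sketch of how gyroscope rates sampled at roughly 1,000 Hz could be mapped to pixel displacements. This is not the patented firmware; the gain and dead-zone names and values are assumptions chosen for the example.

```python
DEADZONE_DPS = 2.0      # rotations slower than this (deg/s) are treated as noise
GAIN_PX_PER_DEG = 20.0  # assumed gain: pointer pixels per degree of head rotation

def pointer_delta(yaw_rate_dps, pitch_rate_dps, dt_s):
    """Map instantaneous head rotation rates to a pointer displacement.

    yaw_rate_dps / pitch_rate_dps: gyroscope readings in degrees per second.
    dt_s: time since the previous sample (e.g. 0.001 s at ~1,000 Hz).
    Returns (dx, dy) in whole pixels.
    """
    def axis(rate_dps):
        # "significant change" test: below the dead zone, emit no motion
        if abs(rate_dps) < DEADZONE_DPS:
            return 0
        # degrees moved this sample (rate * dt) scaled into pixels
        return round(rate_dps * dt_s * GAIN_PX_PER_DEG)
    # yaw (left/right turn) -> horizontal, pitch (up/down nod) -> vertical
    return axis(yaw_rate_dps), axis(pitch_rate_dps)
```

In a configurable system such as the one described, `GAIN_PX_PER_DEG` and `DEADZONE_DPS` would be exactly the kind of parameter the user can retune per their mobility.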
  • the microcontroller then creates a digital packet with this data (direction and displacement in pixels), respecting the format specified by the digital protocol for human interface devices (the HID protocol).
  • This packet is encapsulated within another digital data packet determined by the Bluetooth communication protocol and, finally, the data is transmitted via radio frequency to the connected device, where the operating system (e.g. Windows, Android, iOS, Linux) executes the received command and moves the pointer on the screen.
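The text does not publish the device's actual HID report layout, but the standard boot-protocol mouse report that such commands commonly follow packs a button bitmask plus signed relative displacements into four bytes. A hypothetical sketch of that packing step:

```python
import struct

def hid_mouse_report(dx, dy, buttons=0, wheel=0):
    """Pack a 4-byte boot-protocol style mouse report: buttons, dx, dy, wheel.

    dx, dy, wheel are signed 8-bit relative values; buttons is a bitmask
    (bit 0 = left, bit 1 = right, bit 2 = middle). The Bluetooth stack would
    then wrap this report in its own transport packet before transmission.
    """
    return struct.pack("<Bbbb", buttons & 0x07, dx, dy, wheel)
```

For example, a 5-pixel rightward, 3-pixel upward move with no buttons pressed becomes the bytes `00 05 FD 00` (0xFD is -3 as a signed byte).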
  • the embedded system is able to identify the position of the user's head and thus identify predetermined patterns in these movements, recognizing common head gestures such as the repetitive nodding up and down (Figure 5 - v) that signals “yes”, the nodding from side to side (Figure 5 - u) that signals “no”, among other movements indicated in the GESTURE CHARACTERIZATION, below.
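One plausible way to recognize the "yes" nod described above (repetitive rotation with alternating direction) is to count sign alternations of pitch-rate peaks; the threshold and alternation count below are illustrative assumptions, not values from the patent.

```python
def detect_nod(pitch_rates, threshold=30.0, min_alternations=3):
    """Detect a repetitive up/down nod in a window of pitch-rate samples.

    pitch_rates: gyroscope pitch readings (deg/s) over a short time window.
    A nod is flagged when peaks above `threshold` alternate in sign at
    least `min_alternations` times (up, down, up, down, ...).
    """
    signs = []
    for rate in pitch_rates:
        if abs(rate) >= threshold:
            s = 1 if rate > 0 else -1
            if not signs or signs[-1] != s:   # record only sign changes
                signs.append(s)
    return len(signs) - 1 >= min_alternations
```

A side-to-side "no" gesture would use the same logic on the yaw axis.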
  • the embedded system is responsible for instructing the microcontroller to process, in real time, the information sent by the sensors (variations in acceleration measured by the motion sensor and variations in the reflection of infrared light measured by the infrared sensor) and, based on these values, identify whether the user is making any intentional head movement or facial gesture.
  • the microcontroller sends the corresponding mouse commands to the connected device via Bluetooth.
  • the innovation of the present invention in relation to the state of the art was to develop an adaptive system, allowing the user to change the way the system itself identifies movements and gestures, and what to do when they are recognized.
  • the technical effect of the system applied to the device lies in the usability of the present invention, which interprets head movements and facial gestures in different ways, enabling its use by people with different degrees of head and face mobility.
  • a user unable to move his head quickly can change the behavior of the invention so that a short head movement triggers a large displacement of the cursor, compensating for his physical disability.
  • a person who has trouble blinking their eyes vigorously can change the way the device works so that it detects more subtle movements of their eyelid.
  • the invention allows the user to assign frequent commands to certain head or face gestures, using them as shortcuts to increase productivity. For example, if a person needs to use the right mouse button frequently during work, he can configure the device to perform a right click whenever he tilts his head towards the right shoulder (Figure 5 - r), since this is a quick movement to make.
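The gesture-to-command assignment described above can be pictured as a lookup table that user configuration overrides. The gesture and command names below are hypothetical labels for illustration only.

```python
# Assumed default assignments; the real defaults are not disclosed in the text.
DEFAULT_MAP = {
    "tilt_right": "right_click",
    "tilt_left":  "left_click",
    "nod_yes":    "scroll_down",
}

def command_for(gesture, user_map=None):
    """Resolve a recognized gesture to the command the user configured,
    falling back to the defaults; unmapped gestures yield None (do nothing)."""
    table = {**DEFAULT_MAP, **(user_map or {})}
    return table.get(gesture)
```

This captures the key behavioral point of the patent: the same physical gesture can trigger different commands purely through configuration, with no firmware change.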
  • the device is also capable of executing any command possible on a conventional keyboard, from a simple key (for example, “Enter”), to a combination of keys (for example, “Ctrl + C”).
  • the command can be freely chosen by the user and assigned to any of the facial or head gestures identifiable by the invention, these choices being made through the optional web configuration system.
  • adjustments to the parameters of the invention may be intended to increase the comfort of use or even be fundamental to enable a person with very particular motor characteristics to actually be able to use a computer ( Figure 3 - 1) or a cell phone.
  • the basic technology involved for the connection between the web system and the device is the Bluetooth communication protocol, that is, the same principle already used to pair the device with the interface to be controlled. What changes are the types of data packets sent and received by the invention when the web system is running.
  • the embedded system of the invention has a limited number of parameters that can be changed by the user.
  • the web configuration system works as a friendly interface for the user to make adjustments in its operation.
  • the web system ( Figure 8) sends a data packet to the invention requesting the current values of its parameters.
  • the embedded system of the invention recognizes this request and sends a data packet containing the current value of each configurable parameter as a response ( Figure 9).
  • when the web system is running on a device paired with the invention via Bluetooth, such as a cell phone, the program is able to display on the screen the current settings of the invention in use.
  • the web system sends a data packet to the device containing a command instructing it to replace the value of that parameter with the new value selected by the user.
  • the embedded system recognizes this request and makes the change, saving the new values in the microcontroller's non-volatile memory. In this way, the new operating parameters of the device remain in effect even if the device is turned off or its battery is completely discharged.
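A minimal sketch of the read/write parameter exchange described above, assuming a hypothetical packet layout of one parameter-id byte followed by one value byte (the real protocol, parameter ids and names are not disclosed in the text):

```python
import struct

# Hypothetical parameter table: id -> (name, current value)
PARAMS = {1: ("pointer_gain", 20), 2: ("ir_sensitivity", 50)}

def encode_params():
    """Build the firmware's response to a 'read all parameters' request:
    one (id, value) pair per parameter, each packed as two unsigned bytes."""
    return b"".join(struct.pack("<BB", pid, val)
                    for pid, (_, val) in sorted(PARAMS.items()))

def apply_set(packet):
    """Apply a 'set parameter' packet (id byte + value byte).

    In the actual firmware the new value would be written to the
    microcontroller's non-volatile memory so it survives power loss.
    """
    pid, val = struct.unpack("<BB", packet)
    name, _ = PARAMS[pid]
    PARAMS[pid] = (name, val)
    return name, val
```

The web interface is then just a friendly front end over these two packet types.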
  • a microcontroller of at least 8 bits, in the form of an integrated circuit (microchip), capable of running embedded systems at a speed of 48 MHz, with integrated radio-frequency communication capability in the 2.4 GHz band, compatible with the Bluetooth Low Energy 5.1 communication protocol.
  • the microcontroller is responsible for running the embedded system with the pre-programmed instructions in which a system for processing the signals coming from the sensors is implemented, the “translation” of these signals into mouse commands and the sending (via Bluetooth) of these commands to the devices to be controlled.
  • the adaptive embedded system which is the technology that allows the present invention to interpret the movements of each person differently.
  • the process takes place through the device's firmware, an embedded system that instructs the microcontroller on how to calculate the values measured by the motion sensor and by the infrared sensor in order to identify patterns that represent predetermined intentional gestures - for example, the pattern of repetitive movement of the head vertically representing an affirmative gesture characterized by repetitive rotation with alternation of direction ( Figure 5 - v) -, in addition to sending, via Bluetooth, to the connected device, the control commands corresponding to the identified gestures.
  • because the aforementioned system is adaptive, it allows the reconfiguration, on demand, of both the criteria for identifying movements and gestures and the mouse (or even keyboard) commands to be sent to the computer (Figure 3 - t) or mobile device. As a consequence of carrying out these actions, the general behavior of the system is altered without, however, altering any of its constructive characteristics.
  • the technical effect of the embedded system of the present invention lies in the possibility of the program to reconfigure, on demand, its movement identification thresholds and several other parameters, thus changing the device's behavior according to each user.
  • a 6-axis MEMS (Micro Electro-Mechanical System) motion sensor (hereinafter “motion sensor”), integrating a 3-axis accelerometer and a 3-axis gyroscope, integrated into a single microchip.
  • an infrared light emitting and detecting integrated circuit, capable of emitting a beam of non-visible light pulses and measuring variations in the reflected light received, thus making it possible to detect variations in the distance between the sensor and the surfaces illuminated by the beam.
  • the sensor is also capable of measuring visible ambient light (such as sunlight) and eliminating the noise and electromagnetic interference it causes, increasing accuracy and allowing the detection of variations of less than 1 millimeter in the distance between the sensor and the surface illuminated by the infrared beam.
  • the invention must be attached to the garment that the person is wearing over the head or face, in such a way that the opening of the device's infrared sensor (Figure 2 - m) is pointed towards the region of the face that the user intentionally wants to move to perform the clicks. Therefore, if the person wishes to make clicks by blinking the eyes, the opening of the sensor ( Figure 2 - m) must be pointed towards the eyelid ( Figure 5 - s2) of the user when, for example, the glasses ( Figure 3 - a) are positioned on the face.
  • the distance between the sensor aperture and the facial skin needs to be between 10mm and 50mm.
  • the infrared light beam emitted by the sensor will focus on the eyelid.
  • the variation in the distance between the eyelid and the sensor opening causes a corresponding change in the amount of infrared light reflected back to the sensor, which numerically registers this oscillation and passes this information to the microcontroller, whose embedded system tries to recognize patterns in these variations and identifies whether they correspond to the user's intention to perform a mouse click.
  • the device sends the command equivalent to a mouse click to the device to which it is paired via Bluetooth.
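One plausible implementation of the click-recognition step above is a simple relative-deviation threshold over consecutive reflection samples; the patent does not disclose its actual pattern-recognition criteria, so the thresholds here are assumptions.

```python
def detect_click(samples, baseline, delta=0.15, min_samples=3):
    """Flag an intentional click from reflected-infrared readings.

    samples: successive reflection intensity readings from the IR sensor.
    baseline: the resting reflection level (eyelid/face not moving).
    A click is flagged when readings deviate from the baseline by more
    than `delta` (as a fraction) for `min_samples` consecutive samples,
    i.e. the monitored facial region moved toward or away from the sensor.
    """
    run = 0
    for s in samples:
        if abs(s - baseline) / baseline > delta:
            run += 1
            if run >= min_samples:
                return True
        else:
            run = 0   # brief noise spikes reset the streak
    return False
```

The `delta` parameter plays the role of the configurable "sensitivity" mentioned later: a smaller value lets the device react to subtler eyelid or eyebrow movements.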
  • any other intentional facial gesture can be captured (such as a smile, cheek contraction or eyebrow lift), simply by positioning the infrared sensor opening towards the part of the face you want to move, respecting the distance between 10mm and 50mm.
  • in order to capture more subtle facial movements, such as a slight blink of the eyelid (Figure 5 - s2), eyebrow movement (Figure 5 - s1) or even pupil movement (Figure 5 - s3), the sensor must be closer (respecting the 10mm minimum distance).
  • the sensor can be further away, but not exceeding 50mm.
  • the sensitivity parameter of the infrared sensor can be changed through the invention's optional configuration web system ( Figure 6).
  • a Lithium polymer battery (Li-po) with integrated charge and discharge control circuit and short circuit protection.
  • the internal rechargeable battery is composed of Lithium polymer cells, which have the property of storing electrical charges, discharging these charges in the form of electrical energy when connected to circuits and, upon receiving an electrical current supplied by an external source in reverse direction, store these charges again, being able to execute this cycle hundreds of times.
  • the nominal voltage of the battery is 3.7V and it has at least 350 mAh of charge capacity (enough to keep the invention working for more than 30 hours of continuous use). It is recharged when a DC voltage of 5V is applied between its terminals, which can be done safely by connecting any standard 5V source to the invention, via cable, through the micro-USB or USB type-C connector.
  • the micro-USB or USB type-C connector is for connecting a standard USB cable for battery charging to any standard 5V outlet or USB port, with access to the connector through the opening ( Figure 1 - c) in the case.
  • An RGB LED (light-emitting diode) indicator (capable of independently turning on a red, a green and a blue light, or any combination of these colors, within the same package), serving in practice as a visual indicator that can light up in any color of the visible light spectrum.
  • the purpose of this LED is to provide the user with a visual signal that the invention is working, as described below. The light can be seen through the component ( Figure 2 - n) present in the cabinet.
  • a compact direct-current vibration motor, popularly known as a “vibracall” for being present inside cell phones of all brands and models.
  • the vibration motor is a small DC rotary motor that moves an unbalanced load attached to its axis of rotation.
  • the load imbalance causes the motor's entire mechanical assembly to oscillate at the same frequency as its rotation, producing an intentional vibration.
  • the purpose of this motor is to provide the user with haptic signaling of the device's operation, as described below.
  • An integrated sound emitter capable of emitting sounds in the spectrum from 1 kHz to 4kHz.
  • the “buzzer” is a small integrated circuit capable of emitting sounds in the spectrum from 1 kHz to 4 kHz, and its objective is to provide the user with a sound signal of the operating state of the invention.
  • by “operating state” is meant what the invention is performing, or is capable of performing, at any given moment. Operating states can be indicated by light signaling (RGB LED), sound signaling (buzzer) and/or haptic signaling (vibracall).
  • the invention can assume the following operating states:
    a) Off: light off, buzzer silent, vibracall stopped;
    b) Off, but battery being recharged via USB cable: light off, buzzer silent, vibracall stopped;
    c) On, not connected via Bluetooth to any device: light flashing, alternating between green and red, buzzer silent, vibracall stopped;
    d) On, not connected via Bluetooth to any device, battery being recharged via USB cable: light flashing red, buzzer silent, vibracall stopped;
    e) On, connected via Bluetooth to a device, activated (that is, able to move the mouse pointer and click when requested by the user): light on green, buzzer silent, vibracall stopped;
    f) On, connected via Bluetooth to a device, temporarily disabled by user choice (i.e. cursor movement and mouse clicks are temporarily suspended, a feature used when the person wants to move the head or make facial gestures without causing commands on screen): light off, buzzer beeps 2 times in a row, vibracall signals 2 times in a row;
    g) On, connected via Bluetooth to a device, recently reactivated by user choice: light on green, buzzer beeps 1 time, vibracall signals 1 time;
    h) On, connected via Bluetooth to a device, activated, clicking the left mouse button: light on green, buzzer signals while the click lasts, vibracall signals at the beginning of the click;
    i) On, connected via Bluetooth to a device, activated, clicking the right mouse button: light on green, buzzer beeps three times in a row, vibracall signals once at the beginning of the click;
    j) On, connected via Bluetooth to a device, activated, scrolling the mouse vertically upwards: light on green, buzzer signals 1 time at the beginning of scrolling, vibracall signals 1 time at the beginning of scrolling.
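The state-to-signal relationships listed above can be summarized as a lookup table; the state identifiers below are illustrative labels (not from the text), covering the first few states as an example.

```python
# Each state maps to a (led, buzzer, vibracall) signal tuple, mirroring the
# patent's states (a) through (e). Labels are hypothetical shorthand.
STATE_SIGNALS = {
    "off":                  ("off",             "silent", "stopped"),  # (a)
    "off_charging":         ("off",             "silent", "stopped"),  # (b)
    "on_unpaired":          ("blink_green_red", "silent", "stopped"),  # (c)
    "on_unpaired_charging": ("blink_red",       "silent", "stopped"),  # (d)
    "on_paired_active":     ("green",           "silent", "stopped"),  # (e)
}

def signals_for(state):
    """Return the (led, buzzer, vibracall) outputs for an operating state."""
    return STATE_SIGNALS[state]
```

In firmware this table would drive the RGB LED, buzzer and vibration motor whenever the device transitions between states.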
  • buttons intended to turn the invention on and off.
  • the button is activated by means of a flexible flap ( Figure 1 - e) on the cabinet.
  • PCB printed circuit board
  • the pushbutton is soldered to one end of the board, with the key perpendicular to the surface of the PCB. With the components assembled, the electronic board reaches a maximum height of 11 mm.
  • a hollow ABS plastic cabinet (Figure 1 - d), in the shape of a rectangular prism, with dimensions of 50mm in length by 20mm in height and 18mm in width, with walls between 1mm and 5mm thick, custom-made to house the printed circuit board with all the electronic components mentioned above.
  • the case has an opening at the rear, with which the button soldered on the end of the PCB aligns when the board is fully inserted.
  • the opening is closed by a rectangular cover made of ABS plastic, only 0.3 mm thick and therefore flexible. When glued over the opening, this cover seals the case, covering the button key while still allowing it to be actuated: pressing the rectangle flexes its surface towards the inside of the case.
  • the plastic case also has a square opening (Figure 2 - m), 2 mm on a side, positioned on one of its surfaces, designed to let the infrared light beam out of the sensor while allowing the reflected light back in.
  • a round hole, 1 mm in diameter ( Figure 2 - n) through which the RGB LED light can be seen when turned on.
  • a longitudinal rail (Figure 2 - l) along which the base of the removable clip (Figure 2 - o) of the device slides.
  • a removable clip (Figure 1 - k), made of two ABS plastic pieces connected by a flexible steel wire 2.5 mm in diameter covered by a plastic sheath.
  • the length of the wire can be 30 mm (to mount the device on the arms (Figure 1 - i) of eyeglass frames (Figure 1 - a)) or 80 mm (to facilitate mounting the device on other garments, such as caps, hats or headbands).
  • the plastic piece ( Figure 2 - o) present at one end of the flexible rod ( Figure 2 - f) has a trapezoidal and rectangular prismatic shape, 25 mm long, 4 mm high and 5 mm wide.
  • this piece can be slid into the longitudinal rail (Figure 2 - l) of the plastic case, allowing the clip (Figure 2 - k) to be easily fitted to or detached from the case (Figure 2 - d) of the device.
  • the plastic piece present at the other end of the flexible rod is a clip, made of ABS plastic in the shape of a paper clip, with dimensions of 30x24x15 mm, which serves to attach the set to the user's garment (eyeglass arm (Figure 2 - i), cap brim (Figure 4 - q), etc.).
  • the device can be attached, for example, to any eyeglass frame, with or without lenses (Figure 2 - j), by aligning the plastic clip with the right arm (Figure 2 - i) of the frame, close to the junction between the arm and the front of the frame, pressing its tabs (Figure 2 - h) against each other to open its lower part and fitting the arm into this opening (Figure 2 - g), where it is secured by pressure.
  • the opening (Figure 2 - m) of the device's infrared sensor will be aligned with the user's right eyelid (Figure 5 - s2) when the glasses are positioned on the face, capturing the blink of the eye as the user's means of clicking. If the person prefers to move another part of the face to perform this command, such as twitching the cheek, it is enough to flex the steel rod (Figure 2 - f) so that the sensor opening points to that region.
  • the clip of the invention can also be attached to other supports, such as the rigid brim (Figure 4 - q) of a cap (Figure 4 - p) or hat, headsets or hair bands, and capture the user's preferred facial gesture, simply by flexing the rod (Figure 4 - f) so as to point the infrared sensor at the desired facial muscle.
  • the device described here is an electronic accessory that is light and flexible enough to be attachable to any existing garment, respecting the user's style preferences and ensuring the dignity of not having to use an indiscreet, uncomfortable or awkward accessibility feature.
  • the optional configuration system runs over the internet (Figures 6 and 7), that is, it does not need to be downloaded and can be accessed from any internet browser on any computer or mobile device.
  • the device with the embedded system does not depend on the web system to function; the latter serves as a convenient interface that allows the user to modify the behavior of the device to their liking, if deemed necessary.
  • the intention of the configuration system is to make the same device adapt to people with different movement capabilities or different types of interfaces, allowing it to meet a wide spectrum of conditions without the need for different versions of the invention on the market, since products based purely on hardware are, by definition, immutable in their physical characteristics.
  • software-based products, on the other hand, can always be reprogrammed and updated, and this is the non-obvious characteristic explored to implement this resource, enhancing the present invention, which is characterized as a hybrid system.
  • OPERATION MODES: The main ways in which the invention can interpret the user's head movements and facial gestures are detailed below and are collectively referred to as OPERATION MODES: a) STANDARD MODE: In this mode, the invention makes the mouse pointer mimic, in real time, the movements of the user's head. Thus, if the person moves their head upwards, the invention makes the pointer on the screen rise simultaneously; likewise, if the head turns to the left, the invention moves the pointer to the left at the same time, always precisely following the head displacement and the speed of the user's movement. b) VIRTUAL JOYSTICK MODE: In this mode, the invention waits for the user to point, with their head, in the direction they want the pointer to move.
  • the invention starts moving the cursor across the screen, in the indicated direction, progressively accelerating the pointer and stopping it when the person returns the head to the resting (neutral) position.
  • if the user wants to move the pointer upwards, it is enough to lift the chin slightly until the head leaves the “neutral” position, causing the pointer to start accelerating upwards.
  • the displacement is interrupted when the head returns to the “neutral” position (the position in which the user's head is kept at rest during the invention's calibration period is considered “neutral”, which must be done in the first seconds of device operation when this mode is selected).
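The VIRTUAL JOYSTICK behaviour described above can be sketched as a per-update function: the pointer accelerates in the direction the head deviates from the calibrated neutral position and stops inside a dead zone. The thresholds, axis convention and acceleration constant below are illustrative assumptions, not values from the patent:

```python
# Sketch of VIRTUAL JOYSTICK MODE. Head deviation from neutral
# (in degrees) drives a progressively accelerating pointer; the
# pointer stops when the head returns to the neutral position.
DEAD_ZONE_DEG = 3.0   # deviation below this counts as "neutral" (assumed)
ACCEL = 1.5           # pixels/update of added speed (assumed)

def joystick_step(pitch_deg: float, yaw_deg: float, speed: float):
    """One update: return (dx, dy, new_speed) for the pointer.
    Positive pitch = chin lifted (pointer up); positive yaw = head
    turned right (pointer right). Screen y grows downwards."""
    if abs(pitch_deg) < DEAD_ZONE_DEG and abs(yaw_deg) < DEAD_ZONE_DEG:
        return 0.0, 0.0, 0.0          # head at rest: pointer stops
    speed += ACCEL                    # progressive acceleration
    dx = speed if yaw_deg > DEAD_ZONE_DEG else (-speed if yaw_deg < -DEAD_ZONE_DEG else 0.0)
    dy = -speed if pitch_deg > DEAD_ZONE_DEG else (speed if pitch_deg < -DEAD_ZONE_DEG else 0.0)
    return dx, dy, speed
```

Calling this once per sensor reading reproduces the described behaviour: the longer the head stays out of the dead zone, the faster the pointer travels.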
  • c) SINGLE AXIS MODE: In this mode, the invention allows a user who can only move their head along one axis (only vertically or only horizontally) to move the cursor to any point on the screen. For example, if a user can only move the head vertically, initially, when moving the head up or down, the pointer will move correspondingly on the vertical axis. Then, by keeping the head at rest for a few moments, the invention emits a sound or vibration signal.
  • the pointer will move correspondingly, but on the horizontal axis (left/right).
  • after a new rest interval, the invention emits a new sound (or vibration) signal and the pointer returns to moving vertically when the head moves again.
  • the rest interval that the device waits to change the axis of displacement of the pointer can be adjusted at the user's discretion through the optional web configuration system of the invention.
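The axis-switching rule of SINGLE AXIS MODE can be sketched as a small state machine whose rest interval is the user-configurable parameter mentioned above. The class, default interval and return convention are assumptions for illustration:

```python
# Sketch of SINGLE AXIS MODE: keeping the head at rest for the
# configured interval toggles the axis the head movement drives.
class SingleAxisMode:
    def __init__(self, rest_interval_s: float = 1.5):  # default is assumed
        self.rest_interval_s = rest_interval_s
        self.axis = "vertical"        # mode starts on the vertical axis
        self.rest_elapsed = 0.0

    def update(self, head_moving: bool, dt: float) -> bool:
        """Advance by dt seconds; return True when the axis switches
        (the device would then emit a sound or vibration signal)."""
        if head_moving:
            self.rest_elapsed = 0.0   # movement resets the rest timer
            return False
        self.rest_elapsed += dt
        if self.rest_elapsed >= self.rest_interval_s:
            self.axis = "horizontal" if self.axis == "vertical" else "vertical"
            self.rest_elapsed = 0.0
            return True
        return False
```

Exposing `rest_interval_s` as a settable field mirrors how the web configuration system could adjust the rest interval per user.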
  • d) CLICK ONLY MODE: Users totally incapable of moving their heads can use the invention solely to detect the intention to execute mouse clicks (without moving the pointer), through the built-in infrared sensor, which captures facial gestures such as the blinking of the eyes or the twitching of the cheeks.
  • the user will be able to use specialized programs and applications for cell phones and computers that offer a feature known as “scanning”, in which the options available on the screen are highlighted alternately, one at a time, automatically, and the user “confirms” with a blink (or smile, or any facial gesture detectable by the invention) when the desired option is highlighted.
  • the invention allows the user to assign the execution of special mouse commands (right click, double click, sustained click, vertical scrolling, etc.) and even keyboard commands (for example, shortcut key combinations for copying, pasting, cutting, deleting, etc.) to the performance of predetermined gestures with the head or face. Gestures can also be used to change the behavior of the invention itself without necessarily causing actions on the computer or mobile device (for example, turning the device off or on without the need to press the physical button). These commands and gestures are explained below under the designation CHARACTERIZATION OF GESTURES: a) AFFIRMATIVE GESTURE: Repeated movement of the head up and down, alternately.
  • the number of repetitions and the execution time of the movement can be freely changed by the user through the optional web configuration system of the invention.
  • b) NEGATIVE GESTURE: Movement of the head to the left and to the right, alternately.
  • the number of repetitions and the execution time of the movement can be freely changed by the user through the optional web configuration system of the invention.
  • c) LAY YOUR HEAD TO THE RIGHT: Tilt the head towards the right shoulder.
  • the minimum angle of inclination of the head to the right can be freely changed by the user through the optional web configuration system of the invention.
  • d) LAY YOUR HEAD TO THE LEFT: Tilt the head towards the left shoulder.
  • the minimum angle of inclination of the head to the left can be freely changed by the user through the optional web configuration system of the invention.
  • e) SHORT FACIAL GESTURE: Perform a facial gesture detectable by the infrared sensor of the invention (wink, cheek twitch, smile, etc.) within a time interval shorter than a predetermined threshold.
  • the maximum time of sustaining the facial gesture can be freely changed by the user through the optional web configuration system of the invention.
  • f) LONG FACIAL GESTURE: Maintain a facial gesture detectable by the infrared sensor of the invention (wink, cheek contraction, smile, etc.) for a time interval longer than a predetermined threshold. For the correct identification of the intention of the gesture, the minimum time of sustaining the facial gesture can be freely changed by the user through the optional web configuration system of the invention.
  • the user can assign commands for the device to execute, according to his preference, simply configuring them in the optional web system of the invention.
  • the commands that can be configured are listed below: a) LEFT MOUSE BUTTON CLICK (DEFAULT CLICK): By default, assigned to the SHORT FACIAL GESTURE. b) SCROLL DOWN: By default, assigned to LAY YOUR HEAD TO THE RIGHT. c) SCROLL UP: By default, assigned to LAY YOUR HEAD TO THE LEFT. d) TEMPORARILY DISABLE THE INVENTION (SLEEP): By default, assigned to the NEGATIVE GESTURE while the invention is ON.
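The default assignments above can be represented as a gesture-to-command table that the web configuration system would overwrite. The key and value names below are illustrative, not identifiers from the patent:

```python
# Hypothetical table of the default gesture-to-command bindings
# listed above; the optional web system would replace entries.
DEFAULT_BINDINGS = {
    "short_facial_gesture": "left_click",
    "lay_head_right": "scroll_down",
    "lay_head_left": "scroll_up",
    "negative_gesture": "sleep",   # applies only while the device is ON
}

def command_for(gesture: str, bindings: dict = DEFAULT_BINDINGS):
    """Return the command bound to a gesture, or None if unbound."""
    return bindings.get(gesture)
```

Because the mapping is data rather than code, the same firmware can serve users with very different movement capabilities.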
  • the invention is able to switch between its operating modes (STANDARD, VIRTUAL JOYSTICK, SINGLE AXIS or CLICK ONLY) in full operation.
  • the change of operating mode can be assigned to one of the detectable gestures, so that, upon identifying the gesture, the invention immediately switches to the next operating mode in the sequence shown above. For example, if the user configures the LONG FACIAL GESTURE to change the operation mode and the device is in STANDARD mode when this gesture is identified, it will start to operate as a VIRTUAL JOYSTICK.
  • on subsequent occurrences, the device switches to SINGLE AXIS mode, then to CLICK ONLY mode, and back to STANDARD mode, each time the gesture assigned to the mode change is performed.
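The cyclic mode change described above can be sketched in a few lines; the mode names are paraphrases of the patent's operation modes:

```python
# Sketch of cycling through the operation modes in the order given
# above, each time the gesture assigned to the mode change (e.g.
# the LONG FACIAL GESTURE) is identified.
MODES = ["standard", "virtual_joystick", "single_axis", "click_only"]

def next_mode(current: str) -> str:
    """Return the operation mode that follows `current` in the cycle."""
    return MODES[(MODES.index(current) + 1) % len(MODES)]
```

The modulo wrap-around brings the device back to STANDARD mode after CLICK ONLY, matching the described sequence.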
  • the user can change some operating parameters of the device to suit his needs or preferences.
  • the invention will perform the action of clicking the RIGHT mouse button instead of the LEFT one, for example.
  • the sensitivity can be increased to allow identification of the subtlest eyelid movements (in the case of people who have a weaker blink), or decreased so as to capture only stronger, more intentional blinks (ignoring, for example, unintentional light blinks). d) NUMBER OF REPETITIONS OF AFFIRMATIVE AND NEGATIVE GESTURES: Changes the number of repetitions of head movements in the vertical (affirmative) or horizontal (negative) direction needed to identify these gestures.
  • e) TIME FOR EXECUTION OF AFFIRMATIVE AND NEGATIVE MOVEMENTS: Changes the minimum times within which the user needs to perform the number of repetitions of affirmative or negative head movements for these gestures to be identified.
  • f) MINIMUM ANGLE OF INCLINATION OF THE HEAD TO LEFT AND RIGHT: Changes the minimum angles of inclination of the user's head to the left and to the right for these gestures to be identified.
  • g) MAXIMUM TIME FOR IDENTIFICATION OF THE SHORT FACIAL GESTURE: Changes the maximum time within which the infrared sensor needs to identify the beginning and end of the movement of the part of the face directly exposed under the beam of infrared light emitted by the device, for it to be considered a short facial gesture.
  • h) MINIMUM TIME FOR IDENTIFICATION OF THE LONG FACIAL GESTURE: Changes the minimum time for which the infrared sensor needs to identify the sustained contraction of the part of the face directly exposed under the beam of infrared light emitted by the device, for it to be considered a long facial gesture.
  • i) INTENSITY OF SOUND SIGNALS: Changes the volume of the sound signals emitted by the device.
  • j) INTENSITY OF VIBRATION SIGNALS: Changes the intensity of the signaling vibrations emitted by the invention.
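The adjustable parameters above can be gathered into a single configuration record that the optional web system would edit and push to the device. Field names and default values below are illustrative assumptions:

```python
# Sketch of the user-adjustable parameters described above collected
# in one configuration record; names and defaults are assumed.
from dataclasses import dataclass

@dataclass
class DeviceConfig:
    repetitions_affirmative: int = 2    # nods needed to detect the gesture
    repetitions_negative: int = 2       # shakes needed to detect the gesture
    gesture_window_s: float = 1.0       # time allowed for the repetitions
    min_tilt_angle_deg: float = 20.0    # head-lay detection threshold
    short_gesture_max_s: float = 0.4    # short facial gesture limit
    long_gesture_min_s: float = 0.8     # long facial gesture limit
    sound_volume: int = 5               # intensity of sound signals
    vibration_intensity: int = 5        # intensity of vibration signals
```

A flat record like this is straightforward to serialize over Bluetooth when the web system updates the device's behavior.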


Abstract

The invention concerns a computing device and system and an assistive technology for detecting and processing head movements and facial expressions, by means of infrared and inertial motion sensors that enable the control of electronic interfaces. It relates to a compact, wireless, adaptable device that can be attached to any headwear accessory, dispensing with external accessories, and that works with any equipment provided with an operating system, as long as the device to be controlled offers a Bluetooth connection. It is a hybrid solution composed of an embedded system for identifying the user's movements and processing control instructions; a web system for parameterizing the capture of the user's movements; and a device. The invention is suited to people who, owing to reduced mobility, cannot use common computer peripherals, fostering independence and professional, educational and recreational inclusion.
PCT/BR2023/050045 2022-02-09 2023-02-09 Dispositif et système pour le contrôle d'interfaces électroniques WO2023150849A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
BR102022002441A BR102022002441A2 (pt) 2022-02-09 2022-02-09 Dispositivo e sistema para controle de interfaces eletrônicas
BR1020220024413 2022-02-09

Publications (1)

Publication Number Publication Date
WO2023150849A1 true WO2023150849A1 (fr) 2023-08-17

Family

ID=81893564

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/BR2023/050045 WO2023150849A1 (fr) 2022-02-09 2023-02-09 Dispositif et système pour le contrôle d'interfaces électroniques

Country Status (2)

Country Link
BR (1) BR102022002441A2 (fr)
WO (1) WO2023150849A1 (fr)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4486630A (en) * 1983-03-11 1984-12-04 Fetchko John E Device for use by quadri-plegics to operate a computer, video game or the like by the use of movements of the jaw and eyebrows
WO2010064987A1 (fr) * 2008-12-03 2010-06-10 Brusell Dental As Dispositif permettant de transférer des commandes de souris et de clavier
BRPI1004279A2 (pt) * 2010-05-05 2012-02-14 Unicamp dispositivo assistivo de interface homem-máquina e equipamento controlados pelo mesmo
US8351773B2 (en) * 2007-01-05 2013-01-08 Invensense, Inc. Motion sensing and processing on mobile devices
US20140045547A1 (en) * 2012-08-10 2014-02-13 Silverplus, Inc. Wearable Communication Device and User Interface
WO2015159108A2 (fr) * 2014-04-07 2015-10-22 Nanousis Milto Mouvement d'un curseur à partir de lunettes commandant une souris
EP2945045A1 (fr) * 2014-05-16 2015-11-18 Samsung Electronics Co., Ltd Dispositif électronique et procédé permettant de lire de la musique dans un dispositif électronique
US20150370333A1 (en) * 2014-06-19 2015-12-24 Thalmic Labs Inc. Systems, devices, and methods for gesture identification
BR102014025534A2 (pt) * 2014-10-13 2016-05-17 Fundação Amazônica De Amparo A Pesquisa E Desenvolvimento Tecnológico Des sistema de utilização de um computador e de controle e acionamento de dispositivos pelo computador
US20180070166A1 (en) * 2016-09-06 2018-03-08 Apple Inc. Wireless Ear Buds
WO2020044363A1 (fr) * 2018-08-31 2020-03-05 Indian Council Of Medical Research (Icmr) Dispositif de souris et de clavier sans fil monté sur lunettes pour une personne handicapée
BR102018071412A2 (pt) * 2018-10-17 2020-04-28 Limesoft Equipamentos Eletronicos Ltda dispositivo e método para captar, processar e classificar movimentos oculares, movimentos com a cabeça e atividade cerebral.
KR20210067149A (ko) * 2019-11-29 2021-06-08 김지영 안경형 마우스 시스템


Also Published As

Publication number Publication date
BR102022002441A2 (pt) 2022-06-07

Similar Documents

Publication Publication Date Title
US9013264B2 (en) Multipurpose controller for electronic devices, facial expressions management and drowsiness detection
US10191558B2 (en) Multipurpose controllers and methods
USD800118S1 (en) Wearable artificial intelligence data processing, augmented reality, virtual reality, and mixed reality communication eyeglass including mobile phone and mobile computing via virtual touch screen gesture control and neuron command
US10353460B2 (en) Eye and head tracking device
US20190265802A1 (en) Gesture based user interfaces, apparatuses and control systems
CN105745568B (zh) 用于在可头戴式设备上执行多触摸操作的系统和方法
US9563258B2 (en) Switching method and electronic device
CN108761795A (zh) 一种穿戴式设备
US11481037B2 (en) Multipurpose controllers and methods
KR20130059827A (ko) 동공인식을 이용한 안경 카메라
WO2020073967A1 (fr) Procédé de commutation d'écran horizontal et vertical, dispositif portable et appareil ayant une fonction de stockage
US10444831B2 (en) User-input apparatus, method and program for user-input
KR20070043469A (ko) 장애인을 위한 마우스 동작 인식 시스템
US11145304B2 (en) Electronic device and control method
US11340736B2 (en) Image display device, image display method, and image display program
WO2023150849A1 (fr) Dispositif et système pour le contrôle d'interfaces électroniques
US20160034252A1 (en) Smart device control
KR20190131737A (ko) 외부 객체에 대한 접촉 상태에 기반한 가이드 정보를 제공하는 웨어러블 전자 장치 및 그 동작 방법
CN103124336A (zh) 具有可调角度的内置摄像头的电视机
US11211067B2 (en) Electronic device and control method
CN111160165A (zh) 一种自适应式姿态纠错检测方法及装置
CN103576340A (zh) 一种具有鼠标功能的眼镜
KR102381542B1 (ko) 실시간 거북목 자세의 판별을 위한 알고리즘을 포함하는 시스템, 상기 시스템과 연동하는 반응형 거치대 및 이들의 제어방법
CN210383032U (zh) 一种可旋转的led化妆镜
US10824850B2 (en) Body information analysis apparatus capable of indicating shading-areas

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23752170

Country of ref document: EP

Kind code of ref document: A1