WO2017104869A1 - Eye brain interface (EBI) device and control method thereof - Google Patents
Eye brain interface (EBI) device and control method thereof
- Publication number: WO2017104869A1 (PCT/KR2015/013894)
- Authority: WO (WIPO, PCT)
- Prior art keywords: user, frequency, gaze, visual object, brain
Classifications
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/04842—Selection of displayed objects or displayed text elements
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F1/163—Wearable computers, e.g. on a belt
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/0024—Remote monitoring of patients using telemetry, characterised by features of the telemetry system for multiple sensor units attached to the patient, e.g. using a body or personal area network
- G06F2203/0381—Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer
Definitions
- the present invention relates to an eye brain interface device controlled according to a user's gaze and brain waves, and a control method thereof, and more particularly, to a calibration method of an eye brain interface device.
- the brain is the central nervous system organ responsible for receiving and processing stimuli, and plays a central role in mental activities such as memory and judgment, as well as in motor functions and emotional responses.
- the frontal lobe is located in the anterior part of the cerebrum and is responsible for the movement of the body according to thoughts, plans, and judgments.
- in the frontal lobe, there is a group of neurons that plays an important role, Broca's area, which can perform more complex functions than the other cerebral regions.
- the prefrontal cortex, which is the widest part of the frontal lobe, is known as the region that distinguishes humans from other animals; it synthesizes information from the sensory system and gives rise to higher-order mental activity.
- the concept of BCI (brain-computer interface) technology was first introduced at a UCLA laboratory in the US in 1973, and the technology remained in the R&D and testing phase until the mid-2000s.
- with the release of various headset-type EEG devices such as Emotiv's EPOC, InteraXon's Muse, and NeuroSky's MindWave, BCI is rapidly developing and becoming practical.
- the present specification provides a method for simultaneously calibrating EEG and gaze in an EBI system.
- the present disclosure provides a method for more accurately and efficiently calibrating the EEG and gaze in the EBI system.
- the present disclosure provides a method for acquiring an iris pattern in a gaze tracking calibration process.
- the present disclosure provides a method for measuring and recalibrating a stress index of a user.
- the present disclosure provides a method of controlling the EBI system using the user's brain waves and gaze, based on the calibration result.
- a control method according to an embodiment of the present invention may include: providing an eye-brain calibration (EBC) interface for calibrating gaze and brain waves together, wherein the EBC interface includes a visual object and instructs the user to stare at the visual object in a specific cognitive state; acquiring the user's gaze and brain waves for the visual object included in the EBC interface; mapping the visual object and the gaze of the user; and mapping the brain waves of the user to the specific cognitive state instructed to the user.
- the mapping of the user's gaze may be a step of mapping the on-screen position coordinates of the visual object to the position coordinates of the user's gaze.
- the EBC interface may sequentially and/or alternately provide a first visual object indicating the first cognitive state and a second visual object indicating the second cognitive state.
- the first cognitive state may be a cognitive state including at least one of concentration and selection
- the second cognitive state may be a cognitive state including at least one of rest and search.
- the mapping of the user's brain waves may include: obtaining first raw data for brain waves in the first cognitive state and second raw data for brain waves in the second cognitive state; frequency-converting the first and second raw data; and setting classification criteria for the first and second cognitive states based on frequency characteristics of the frequency-converted first and second raw data.
- the setting of the classification criteria for the first and second cognitive states may include: extracting a frequency amplitude for each frequency band of a preset range from the frequency-converted first and second raw data; obtaining Fisher's Ratio for each frequency band using the extracted frequency amplitudes; selecting a first frequency band having the highest Fisher's Ratio and a second frequency band having the next highest Fisher's Ratio; and setting the first and second frequency bands as the classification criteria for the first and second cognitive states.
- the Fisher's Ratio may be a value calculated based on the mean and variance of the frequency amplitudes in the frequency-converted first raw data and the mean and variance of the frequency amplitudes in the frequency-converted second raw data.
- the frequency bands of the preset range may correspond to the δ wave band, θ wave band, α wave band, or β wave band of the brain waves.
- the EBC interface may induce the EEG of the user to a specific frequency band by adjusting the frequency at which the visual object blinks.
- the EBC interface may induce the user's brain waves into the alpha wave range by adjusting the blink frequency of the visual object within about 8–13 Hz, and may induce the user's brain waves into the beta wave range by adjusting the blink frequency of the visual object within about 13–30 Hz.
- the method may further include: obtaining an iris image of the user's eye; and encoding the iris image.
- the encoding of the iris image may include: separating the obtained iris image into a plurality of images; arranging the separated images in one direction; and converting the images arranged in one direction into a single two-dimensional image.
- a slave device for measuring gaze and brain waves according to another embodiment of the present invention may include: a gaze tracking unit for tracking the user's gaze; an EEG sensing unit for sensing the user's brain waves; a communication unit for performing communication with a host device; and a processor controlling the gaze tracking unit, the EEG sensing unit, and the communication unit.
- here, the host device is a device that provides an eye-brain calibration (EBC) interface for simultaneously calibrating gaze and brain waves, wherein the EBC interface includes a visual object and instructs the user to stare at the visual object in a specific cognitive state.
- the processor may acquire the gaze and the brain wave of the user and transmit the gaze and the brain wave of the user to the host device.
- a host device according to another embodiment of the present invention, controlled based on gaze and brain waves, may include: a display unit for displaying an image; a communication unit for communicating with a slave device; and a processor controlling the display unit and the communication unit, wherein the processor provides an eye-brain calibration (EBC) interface for simultaneously calibrating gaze and brain waves, and wherein the EBC interface includes a visual object and instructs the user to stare at it in a specific cognitive state.
- when mapping the user's gaze, the processor may map the on-screen position coordinates of the visual object to the position coordinates of the user's gaze.
- the EBC interface may sequentially and/or alternately provide a first visual object indicating the first cognitive state and a second visual object indicating the second cognitive state.
- the first cognitive state may be a cognitive state of concentration or selection
- the second cognitive state may be a cognitive state of rest or search.
- when mapping the user's brain waves, the processor may acquire first raw data for the brain waves in the first cognitive state and second raw data for the brain waves in the second cognitive state, frequency-convert the first and second raw data, extract a frequency amplitude for each frequency band of a preset range from the frequency-converted first and second raw data, obtain Fisher's Ratio for each frequency band using the extracted frequency amplitudes, select a first frequency band having the highest Fisher's Ratio and a second frequency band having the next highest Fisher's Ratio, and set the first and second frequency bands as the classification criteria for the first and second cognitive states.
- the processor may acquire the brain waves of the user in real time, and classify the brain waves of the user in real time according to the classification criteria.
- the EBC interface may induce the EEG of the user to a specific frequency band by adjusting the frequency at which the visual object blinks.
- according to the present invention, since an EBC interface capable of simultaneously calibrating EEG and gaze is provided, the user can calibrate EEG and gaze more easily and quickly at the same time.
- in addition, according to the present invention, since the cognitive states of the EEG are distinguished using the frequency characteristics of the EEG, the cognitive states can be distinguished more accurately.
- the iris pattern may be utilized as user authentication information.
- FIG. 1 is a diagram illustrating an eye-brain interface (EBI) system according to an exemplary embodiment of the present invention.
- FIG. 2 is a block diagram of a host device and a slave device according to an embodiment of the present invention.
- FIG. 3 is a diagram illustrating various embodiments of a slave device.
- FIG. 5 is a block diagram of an EBI device in accordance with an embodiment of the present invention.
- FIG. 6 illustrates embodiments of an EBC interface.
- FIG. 7 is a diagram illustrating an embodiment of data acquired according to an EBC interface according to an embodiment of the present invention.
- FIG. 8 illustrates an EBI system for performing gaze calibration according to an embodiment of the present invention.
- FIG. 9 illustrates an EBI system for acquiring an iris pattern of a user according to an embodiment of the present invention.
- FIG. 10 is a flowchart illustrating a method of classifying a user's brain waves in an EBI system according to an embodiment of the present invention.
- FIG. 14 illustrates various applications of an EBI system in accordance with an embodiment of the present invention.
- FIG. 15 is a flowchart illustrating a control method of an EBI system according to an embodiment of the present invention.
- FIG. 1 is a diagram illustrating an eye-brain interface (EBI) system according to an exemplary embodiment of the present invention.
- FIG. 2 is a block diagram of a host device and a slave device according to an embodiment of the present invention.
- referring to FIGS. 1 and 2, an eye-brain interface (EBI) system may include a host device 150 and a slave device 100.
- the slave device 100 may represent various types of wearable devices that can be worn by a user.
- the slave device 100 may represent a device that contacts / wears on a user's body part such as a head mounted display (HMD), a headset, a smart ring, a smart watch, an earset, an earphone, and the like.
- the slave device 100 may include at least one sensor to sense the biosignal of the user through a body part of the user.
- the biosignal may represent various signals generated from the user's body, such as pulse, blood pressure, and brain waves, according to the user's conscious and/or unconscious (e.g., breathing, heartbeat, metabolism, etc.) activity.
- the slave device 100 may sense the brain wave of the user as the biosignal of the user and transmit the sensing result to the host device 150.
- the host device 150 may represent a device operating based on a sensing result of the biosignal received from the slave device 100.
- the host device 150 may be various electronic devices that receive a biosignal sensing result of the user from the slave device 100 and perform various operations based on the received sensing result.
- the host device 150 may be, for example, various electronic devices such as a TV, a smartphone, a tablet PC, a smart car, a PC, a laptop, and the like.
- the EBI system includes a slave device 100 and a host device 150 to provide a control scheme based on a biosignal of a user.
- the system directly senses the user's intention by sensing the user's biosignal and is controlled accordingly.
- the EBI system thus provides the user with a more convenient control method that directly reflects the user's intention.
- the configuration of the slave device 100 and the host device 150 will be described in more detail.
- the slave device 100 may include a position marker unit 120, a gaze tracking unit 130, an EEG sensing unit 110, a sensor unit 260, a communication unit 250, and a processor 240.
- the position marker unit 120 may include at least one light emitting device (eg, an infrared LED) that emits light.
- the host device 150 may track the position marker unit of the slave device 100 in real time, and may thereby detect the position of the user wearing the slave device 100, the distance between the host device 150 and the user, their relative position, and the like (hereinafter, the 'user's position').
- a plurality of light emitting elements may be positioned in the position marker unit 120, spaced apart from each other by a predetermined distance.
- the host device 150 may detect the relative distance between the host device 150 and the user by tracking the light emitting elements of the position marker unit 120 and measuring the separation distance between them in real time. For example, when the position marker unit 120 moves away from the host device 150, the separation distance between the light emitting elements as measured by the host device 150 decreases, and when the position marker unit 120 approaches the host device 150, the measured separation distance increases.
- the host device 150 may calculate the relative distance between the host device 150 and the user from the ratio between the separation distance measured in real time and the known actual separation distance between the light emitting elements.
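As a rough illustration of this ratio-based distance estimate, the sketch below assumes a simple pinhole-camera model; the marker spacing, focal length, and function names are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of the marker-based distance estimate described above,
# assuming a pinhole-camera model; constants are illustrative only.
MARKER_SPACING_MM = 50.0   # known physical distance between the IR LEDs
FOCAL_LENGTH_PX = 800.0    # assumed camera focal length in pixels

def estimate_distance_mm(measured_spacing_px: float) -> float:
    """Distance grows as the apparent LED spacing in the image shrinks."""
    return FOCAL_LENGTH_PX * MARKER_SPACING_MM / measured_spacing_px

# e.g. LEDs appear 40 px apart -> user is about 1 m from the host camera
print(estimate_distance_mm(40.0))  # 1000.0 (mm)
```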
- the position marker unit 120 for tracking the user's position may be included in the slave device 100 in various forms, and the host device 150 may detect the user's position based on the position and size of the position marker unit 120 and on the number, positions, and separation distances of the light emitting elements included therein.
- the gaze tracking unit 130 may track the gaze of the user.
- the gaze tracking unit 130 may be provided in the slave device 100 to be positioned around the eyes of the user to track the eyes of the user (eye movement) in real time.
- the eye tracking unit 130 may include a light emitting device (eg, an infrared LED) that emits light and a camera sensor that receives (or senses) the light emitted from the light emitting device.
- the gaze tracking unit 130 may photograph light reflected from the user's eyes with a camera sensor and transmit the photographed image to the processor 240 (video analysis method).
- in addition to the above-described video analysis method, the gaze tracking unit 130 may track the user's gaze using a contact lens method (gaze tracking using light reflected from a mirror built into a contact lens, or using the magnetic field of a coil built into a contact lens) or a sensor attachment method (gaze tracking using the electric field generated by eye movement, with sensors attached around the eyes).
- the EEG sensing unit 110 may sense the EEG of the user.
- the EEG sensing unit 110 may include at least one electroencephalography (EEG) sensor, magnetoencephalography (MEG) sensor, and/or near-infrared spectroscopy (NIRS) sensor.
- the brain wave sensing unit 110 may be provided at a body (eg, head) contact position where the brain wave of the user may be measured when the user wears the slave device 100, and measure the brain wave of the user.
- the EEG sensing unit 110 measures electrical/optical signals of various frequencies generated from the contacted body part of the user, which vary according to the activation state of the brain.
- since EEG is a biosignal that differs from user to user, simply extracting the user's brain waves and analyzing them against a uniform standard is not accurate enough to distinguish the user's current cognitive state. Therefore, in order to accurately measure the user's cognitive state based on the EEG, the present disclosure provides a method of calibrating the EEG according to the current cognitive state for each user. A more detailed description thereof will be given later with reference to FIGS. 11 to 14.
- the sensor unit 260 may include at least one sensing means and may sense the surrounding environment of the device 100 using the sensing means. In addition, the sensor unit 260 may transmit the sensing result to the processor. In particular, in the present specification, the sensor unit may sense the movement, motion, and the like of the slave device 100 and transmit the sensing result to the processor 240.
- as sensing means, the sensor unit 260 may include an inertial measurement unit (IMU) sensor, a gravity sensor, a geomagnetic sensor, a motion sensor, a gyro sensor, an accelerometer, a magnetometer, an inclination sensor, an altitude sensor, an infrared sensor, an illuminance sensor, a global positioning system (GPS) sensor, and the like.
- the sensor unit 260 collectively refers to the various sensing means described above.
- the sensor unit 260 may sense various user inputs and environment of the device, and may transmit a sensing result to allow the processor to perform an operation accordingly.
- the above-described sensing means may be included in the slave device 100 as a separate element or integrated into at least one or more elements.
- the communication unit 250 may communicate with an external device using various protocols, and may transmit / receive data through the communication unit 250.
- the communication unit 250 may connect to a network by wire or wirelessly to transmit / receive various signals and / or data.
- the slave device 100 may perform pairing with the host device 150 using the communication unit 250.
- the slave device 100 may transmit / receive various signals / data with the host device 150 using the communication unit 250.
- the processor 240 may control the position marker unit 120, the eye tracking unit 130, the brain wave sensing unit 110, the sensor unit 260, and the communication unit 250.
- the processor 240 may control transmission / reception of signals (or data) between the above-described units.
- the processor 240 may transmit a sensing result received from at least one sensor provided in the slave device 100 to the host device 150.
- the sensing result may refer to raw data obtained by using at least one sensor included in the slave device 100 or data processed through a predetermined algorithm.
- processor 240 may perform various operations for calibrating the user's gaze and brain waves, which will be described in detail later with reference to FIGS. 6 to 13.
- the slave device 100 may optionally include some of the configuration units shown in FIGS. 1 and 2, and may further include various units required for the purpose and operation of the device, such as a memory unit, a camera unit, and a power supply unit.
- the host device 150 may include a camera unit 140, a display unit 210, a communication unit 230, and a processor 220.
- the camera unit 140 may photograph the position marker unit 120 of the slave device 100.
- the camera unit 140 may capture the position marker unit 120 of the slave device 100 to obtain a captured image of the position marker unit 120.
- the camera unit 140 may transmit the acquired captured image to the processor 220, and the processor 220 may process the captured image to acquire a position of a user wearing the slave device 100.
- the processor 220 may acquire the user's position by analyzing the position and size of the position marker unit 120 and the number, positions, and separation distances of the light emitting elements included therein.
- the camera unit 140 may be configured as a wide angle camera having an angle of view of about 60 degrees or more.
- when the camera unit 140 is configured as a general camera (a camera having an angle of view of less than 60 degrees), it can track the user's position within about 60 degrees left and right in front of the host device 150, and within a range of about 60 to 90 cm between the slave device 100 and the host device 150.
- when the camera unit 140 is configured as a wide angle camera (a camera having an angle of view of 60 degrees or more), it can track the user's position within about 170 degrees left and right in front of the host device 150, and up to a distance of about 3 m between the slave device 100 and the host device 150.
- the camera unit 140 of the present invention can be configured as a wide angle camera to obtain more accurate position data of the user.
- the display unit 210 may display an image.
- the image may represent a still image, a moving image, text, a virtual reality (VR) image, an augmented reality (AR) image, or various other visual expressions including these, which may be displayed on the screen.
- the display unit 210 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a 3D display, and a transparent OLED (TOLED) display.
- the display unit 210 may be made of metal foil, very thin glass, or a plastic substrate.
- as the plastic substrate, a PC substrate, a PET substrate, a PES substrate, a PI substrate, a PEN substrate, an AryLite substrate, or the like may be used.
- the communication unit 230 may communicate with an external device using various protocols, and may transmit/receive data through the communication unit 230.
- the communication unit 230 may connect to a network by wire or wirelessly to transmit / receive various signals and / or data.
- the host device 150 may perform pairing with the slave device 100 using the communication unit 230. In addition, the host device 150 may transmit / receive various signals / data with the slave device 100 using the communication unit 230.
- the processor 220 may control the camera unit 140, the display unit 210, and the communication unit 230.
- the processor 220 may control transmission / reception of signals (or data) between the above-described units.
- the processor 220 may perform various commands (or operations) corresponding to the sensing result received from the slave device 100. For example, when the user's gaze coordinates are received as the sensing result, the processor 220 may execute a command for selecting a visual object (e.g., an icon) at the specific position on the display unit 210 mapped to the gaze coordinates. In addition, when user EEG data corresponding to the 'focused' state is received as the sensing result, the processor 220 may execute a command for executing the selected visual object (e.g., executing an application corresponding to the selected icon).
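The select-then-execute behavior just described can be pictured with a small dispatch routine; the icon layout, state labels, and function names below are hypothetical illustrations, not part of the patent.

```python
# Sketch of a host-side dispatch loop: gaze coordinates pick the on-screen
# object, and a "focused" EEG classification triggers its execution.
from dataclasses import dataclass

@dataclass
class Icon:
    name: str
    x: int
    y: int
    w: int = 64
    h: int = 64

    def contains(self, gx: int, gy: int) -> bool:
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h

def dispatch(icons, gaze_xy, eeg_state):
    """Select the icon under the gaze; execute it when the EEG says 'focused'."""
    gx, gy = gaze_xy
    selected = next((i for i in icons if i.contains(gx, gy)), None)
    if selected and eeg_state == "focused":
        print(f"executing {selected.name}")  # e.g. launch the application
    return selected

dispatch([Icon("mail", 0, 0), Icon("web", 100, 0)], (110, 30), "focused")
```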
- in order to perform such operations, gaze calibration for mapping the user's gaze coordinates to positions on the display needs to be performed first.
- likewise, EEG calibration for mapping a specific cognitive state of the user to EEG of a specific frequency characteristic also needs to be performed first.
- to this end, the present invention can provide an EBC (Eye Brain Calibration) interface for simultaneously calibrating the user's gaze and brain waves, which will be described later in detail with reference to FIGS. 6 to 13.
- the host device 150 may optionally include some of the configuration units shown in FIGS. 1 and 2, and may further include various units required for the purpose and operation of the device, such as a sensor unit, a memory unit, and a power supply unit.
- each unit included in the host device 150 and the slave device 100 is illustrated separately, but the unit of the slave device 100 may be included in the host device 150.
- the unit of the host device 150 may be included in the slave device 100.
- the eye tracking unit of the slave device 100 may be included in the host device 150 instead of the slave device 100 according to an embodiment.
- the above-described processors 220 and 240 may be embedded in a device (slave or host) or may be implemented in an independent form outside the device (not shown).
- the processors 220 and 240 may exist in the form of an external processor that is easy to carry by a user.
- the user may connect the external processors 220 and 240 to a specific device as needed, and the device to which the external processors 220 and 240 are connected may be a slave or a host device 100 or 150.
- the external processors 220 and 240 may process various data (particularly, data about the user's biosignals) with a predetermined algorithm so that the connected device can function as a slave or host device 100 or 150.
- the connected device should be provided with a unit capable of sensing a user's biosignal.
- FIG. 2 is a block diagram according to an exemplary embodiment, in which the blocks marked separately represent logical elements of hardware of the slave / host devices 100 and 150. Accordingly, the above-described elements of the slave / host devices 100 and 150 may be mounted in one chip or in a plurality of chips according to the design of each device.
- FIG. 3 is a diagram illustrating various embodiments of a slave device.
- the slave device may be implemented as various form factors.
- the slave device 100-1 may be implemented in the form of a headset.
- the EEG sensing unit 110-1 of the slave device 100-1 may be located at a contact portion of the head and / or forehead of the user, and may sense the brain wave of the user from the head and / or forehead.
- the gaze tracking unit 130 may be positioned around the eyes of the user, and may track the gaze of the user in real time.
- the sensor unit 110-2 may be located in the main body of the slave device 100-1, and may track a user's head position (movement, movement, etc.) in real time.
- the configuration units included in the slave device 100-1 may be included in the main body of the slave device 100-1.
- the slave device 100-2 may be implemented in the form of an earset.
- the EEG sensing unit 110-1 of the slave device 100-2 may be located at a portion that is inserted into the user's ear (e.g., the inner ear), and may sense the user's brain waves from within the ear.
- a speaker unit (not shown) that outputs sound may be located at a portion that is inserted into the user's ear together with the EEG sensing unit 110-1.
- the gaze tracking unit 130 may be positioned around the eyes of the user, and may track the gaze of the user in real time.
- the component units included in the slave device 100-2 may be included in the main body of the slave device 100-2.
- the slave device 100 may be implemented in various form factors so as to sense the gaze / brain wave of the user, and is not limited to the embodiment illustrated in this drawing.
- the EBI device 400 may represent a device in which the slave device 100 and the host device 150 described above with reference to FIGS. 1 to 3 are integrated into one device. Therefore, the EBI device 400 may directly sense the biosignal and perform various operations based on the sensing result.
- the EBI device 400 may be configured in the form of a wearable device that can be worn on a user's body.
- the EBI device 400 may include an EEG sensing unit 500, a gaze tracking unit 510, a communication unit 530, a display unit 540, and a processor 520. Since the description of the units included in the EBI device 400 overlaps with the description above in FIG. 2, the following description will focus on the differences.
- the EEG sensing unit 500 may sense the EEG of the user.
- the EEG sensing unit 500 may include at least one electroencephalography (EEG) sensor and/or magnetoencephalography (MEG) sensor.
- the brain wave sensing unit 500 may be provided at a body (eg, head) contact position where the user's brain wave may be measured when the user wears the EBI device, and measure the brain wave of the user.
- the gaze tracking unit 510 may track the eyes of the user.
- the gaze tracking unit 510 may be provided in the EBI device 400 to be positioned around the eyes of the user to track the eyes of the user (eye movement) in real time.
- the gaze tracking unit 510 may include a light emitting device (eg, an infrared LED) that emits light and a camera sensor that receives (or senses) the light emitted from the light emitting device.
- the communication unit 530 may communicate with an external device using various protocols, and may transmit / receive data through the communication unit 530.
- the communication unit 530 may connect to a network by wire or wirelessly to transmit / receive various signals and / or data.
- the display unit 540 may display an image.
- the image may represent a still image, a moving image, text, a virtual reality (VR) image, an augmented reality (AR) image, or various other visual expressions including these, which may be displayed on the screen.
- the processor 520 may control the EEG sensing unit 500, the gaze tracking unit 510, the communication unit 530, and the display unit 540.
- the processor 520 may control transmission / reception of signals (or data) between the above-described units.
- the processor 520 may perform various operations corresponding to the sensing result received from the EEG sensing unit 500 and / or the gaze tracking unit 510.
- the EBI device 400 may optionally include some of the component units shown in FIG. 5, and may further include various units required for the purpose and operation of the device 400, such as a sensor unit, a memory unit, and a power supply unit.
- FIG. 5 is a block diagram according to an exemplary embodiment, in which the blocks marked separately represent logical elements of hardware elements of the EBI device 400.
- the elements of the EBI device 400 described above may be mounted on one chip or on multiple chips, depending on the design of each device.
- the EBI system may provide an EBC (Eye Brain Calibration) interface for calibrating the user's eyes and brain waves, and the user's eyes and the brain waves can be simultaneously calibrated through the EBC interface.
- the present invention is not limited thereto, and according to an exemplary embodiment, the EBI system may perform calibration for only one of the EEG and the gaze of the user through the EBC interface.
- a slave / host device and an EBI device will be collectively referred to as an EBI system. Therefore, the description of the EBI system described below may be applied to the slave and the host device when the EBI system includes the slave and the host device, and may be applied to the EBI device when the EBI system includes the EBI device.
- FIG. 6 illustrates embodiments of an EBC interface.
- An EBI system may simultaneously perform calibration for a user's gaze and brain waves through an EBC interface.
- the EBI system can provide the user with an EBC interface that induces a specific cognitive state while simultaneously inducing the gaze to move to a specific point on the screen.
- as the EBC interface, the EBI system may sequentially display a plurality of visual objects located at different points, and instruct the user to stare at the displayed visual objects in sequence.
- for example, the EBI system may instruct the user to stare at a certain visual object in a focused cognitive state, and to stare at another visual object in a resting (or simply staring/defocused) cognitive state.
- the EBI system can alternately display a plurality of visual objects with different visual effects (eg, color, size, shape, contrast, flicker, etc.).
- for example, the EBI system may alternately and sequentially display red and blue visual objects, instructing the user to stare at the red object in a focused cognitive state and at the blue object in a resting cognitive state.
- in addition, the EBI system may instruct the user to stare with a cognitive state of search while moving the eyes from one visual object to the next.
- the EBI system may or may not directly guide (or display) the gaze path from the particular visual object to the next visual object.
- the EBI system may acquire the user's gaze coordinates for a particular visual object and at the same time acquire the brain waves of the user looking at the visual object.
- the EBI system may acquire the gaze coordinates of the user following the gaze path, and may also acquire the EEG of the user looking at the gaze path.
- the EBI system can only acquire the brainwaves of the user.
- the EBI system may map the on-screen coordinates of the specific visual object and the acquired gaze coordinates to each other.
- the EBI system may map the cognitive state instructed to the user with respect to the particular visual object with the acquired brain waves of the user. This enables the EBI system to calibrate eye and brain waves simultaneously and easily through one interface.
- the EBC interface may sequentially display a plurality of visual objects one by one (or a predetermined number at a time), in a specified form (e.g., polygons, circles, etc.) or in an unspecified (random) form, and may alternately display visual objects having different visual effects (e.g., color, shape, size, flickering, contrast, etc.).
- the EBC interface may display a plurality of visual objects at the same time and sequentially indicate visual objects that the user should stare by giving a visual effect to a particular visual object.
- the EBC interface may also indicate a user's cognitive state through visual effects given to the corresponding visual object.
- the EBC interface may adjust the frequency at which the visual objects blink to induce the user's brain to a specific cognitive state.
- flicker frequencies of about 8–13 Hz are known to help guide brain waves into the alpha (α) wave region corresponding to the resting (or searching) state.
- accordingly, the EBC interface can impart a visual effect such that a particular visual object blinks at a frequency of about 8–13 Hz to induce a 'rest' cognitive state. The user may thus be induced into the 'rest' state simply by staring at the visual object, and the EBI system may extract the user's brain waves and map them to the resting cognitive state.
- similarly, flicker frequencies of about 13–30 Hz are known to help guide brain waves into the beta (β) wave region corresponding to the focused (or awake, selecting, etc.) state.
- accordingly, the EBC interface can impart a visual effect such that a particular visual object blinks at a frequency of about 13–30 Hz to induce a 'focus' cognitive state. The user may thus be induced into the 'focus' state simply by staring at the visual object, and the EBI system may extract the user's brain waves and map them to the focused cognitive state.
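A minimal sketch of such blink-frequency control, assuming a simple frame-based renderer; the 60 Hz frame rate and the function name are illustrative assumptions, not details from the patent.

```python
# Square-wave flicker of a visual object at a chosen blink frequency.
FRAME_RATE_HZ = 60

def is_visible(frame_index: int, blink_hz: float) -> bool:
    """The object is shown for the first half of each blink period."""
    period_frames = FRAME_RATE_HZ / blink_hz
    return (frame_index % period_frames) < (period_frames / 2)

# ~10 Hz flicker to bias toward the alpha band (rest),
# ~20 Hz flicker to bias toward the beta band (focus).
for frame in range(12):
    print(frame, is_visible(frame, blink_hz=10.0), is_visible(frame, blink_hz=20.0))
```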
- the EBC interface may calibrate the user's gaze and brain waves simultaneously by inducing the user's gaze and a specific cognitive state to a specific point on the screen in various ways.
- the EBC interface may also acquire the user's iris pattern. Like a fingerprint, the iris pattern differs for each user, so it can be usefully used as user authentication information.
- the EBI system that has completed gaze/brain-wave calibration can use the user's brain waves as control information for an execution command, and the user's gaze as control information for the position of the execution command. For example, if the user concentrates while staring at a particular icon, the EBI system may perform a command to execute the icon the user stared at.
- FIG. 7 is a diagram illustrating an embodiment of data acquired according to an EBC interface according to an embodiment of the present invention.
- the EBI system may simultaneously acquire data regarding the gaze and the brain wave of the user through the EBC interface.
- the EBI system may further obtain data regarding the user's iris while providing an EBC interface.
- the data thus obtained can be processed by some algorithm.
- in particular, since EEG patterns differ for each user according to cognitive states such as concentration, defocus, and search, the data needs to be processed through a specific algorithm to more clearly distinguish the EEG of each cognitive state.
- FIG. 8 illustrates an EBI system for performing gaze calibration according to an embodiment of the present invention.
- the EBI system assumes that the pupil is located at specific coordinates (Xs, Ys) when the user looks at a specific point (Xp, Yp) on the screen, and can infer the relation between the two spaces through multivariate linear regression or the like.
- the EBI system may instruct the user to look at a specific point (Xp, Yp) on the screen via the EBC interface.
- the EBI system may obtain the gaze-tracking image of the user through the gaze tracking unit, and obtain the gaze coordinates (Xs, Ys) of the user from the captured image.
- the gaze coordinate of the user may be a relative coordinate determined based on the center of the eye (or the pupil of the eye).
- the EBI system may map a gaze coordinate of the user with a specific point on the screen.
- the EBI system may map a gaze coordinate and a specific point on the screen by using Equation 1 below.
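Equation 1 is not reproduced in this extracted text. As a sketch of what such a mapping can look like, the example below fits a second-order polynomial regression from pupil coordinates to screen coordinates; the feature set, calibration points, and function names are illustrative assumptions, not the patent's actual equation.

```python
# Hedged sketch: fit screen coords (Xp, Yp) from pupil coords (Xs, Ys) with a
# second-order polynomial regression, a common choice for gaze calibration.
import numpy as np

def features(xs, ys):
    # design matrix [1, x, y, xy, x^2, y^2] for each calibration sample
    return np.column_stack([np.ones_like(xs), xs, ys, xs * ys, xs**2, ys**2])

# calibration data: pupil coords recorded while the user stared at known points
pupil = np.array([[0.1, 0.2], [0.5, 0.2], [0.9, 0.2],
                  [0.1, 0.8], [0.5, 0.8], [0.9, 0.8], [0.5, 0.5]])
screen = np.array([[100, 100], [640, 100], [1180, 100],
                   [100, 620], [640, 620], [1180, 620], [640, 360]])

A = features(pupil[:, 0], pupil[:, 1])
coef, *_ = np.linalg.lstsq(A, screen, rcond=None)  # one column per screen axis

def gaze_to_screen(xs, ys):
    return features(np.atleast_1d(xs), np.atleast_1d(ys)) @ coef

print(gaze_to_screen(0.5, 0.5))  # close to [640, 360]
```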
- the EBI system may further use data regarding the user's head position (obtained using the position marker unit, camera unit, and/or sensor unit) to obtain more accurate gaze coordinates.
- in other words, the EBI system can additionally acquire data about the user's head position in order to detect the user's gaze position more accurately, and this can be used as additional data for precise gaze tracking.
- the EBI system may additionally acquire the iris pattern data of the user while tracking the eyes of the user, which will be described later with reference to FIG. 9.
- FIG. 9 illustrates an EBI system for acquiring an iris pattern of a user according to an embodiment of the present invention.
- the EBI system may not only track the gaze of the user in real time through the gaze tracking unit, but also acquire an iris pattern of the user. Like the fingerprint, since the iris pattern is different for each user, the EBI system can use the iris pattern as user authentication information. For example, the EBI system may utilize the iris pattern as various user authentication information such as user login authentication information, payment authentication information, and security information.
- the EBI system may set the iris region as an ROI (Region Of Interest) in the infrared image of the user's eye obtained using the gaze tracking unit, and extract it separately.
- the EBI system may separate the separated ROI image into a plurality of images, and then arrange the separated plurality of images in one direction.
- the EBI system may perform an encoding operation that converts the images arranged in one direction into a single two-dimensional image (for example, a two-dimensional barcode or QR code), thereby obtaining a user-specific iris pattern.
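A toy sketch of these three encoding steps (separate, arrange in one direction, convert to one 2D pattern); the strip count, downsampling, and binarization details are illustrative assumptions, not the patent's algorithm.

```python
import numpy as np

def encode_iris(roi: np.ndarray, n_strips: int = 8, code_cols: int = 32) -> np.ndarray:
    """roi: 2D grayscale iris region (the ROI cut out of the eye image)."""
    h = roi.shape[0] - roi.shape[0] % n_strips
    strips = np.split(roi[:h], n_strips)   # 1) separate into a plurality of images
    band = np.hstack(strips)               # 2) arrange them in one direction
    # 3) downsample the band and binarize into one compact 2D code image
    cols = np.array_split(band, code_cols, axis=1)
    code = np.stack([c.mean(axis=1) for c in cols], axis=1)
    return (code > code.mean()).astype(np.uint8)

iris_roi = np.random.default_rng(0).integers(0, 256, size=(64, 128)).astype(float)
print(encode_iris(iris_roi).shape)  # (8, 32) binary pattern
```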
- the EBI system may acquire an iris pattern from a single infrared image, but in order to obtain a more accurate iris pattern of the user, it may acquire the iris pattern by combining infrared images of gazes directed in various directions.
- the user's iris pattern becomes less accurate as more of the iris is covered by the eyelids, the corners of the eyes, and light reflections. Accordingly, the EBI system may acquire infrared images of the user's eye for various gaze directions, obtain an iris pattern from each image, and combine the obtained iris patterns into one iris pattern.
- the EBI system can distinguish the user's iris pattern with a high probability.
- FIG. 10 is a flowchart illustrating a method of classifying a user's brain waves in an EBI system according to an embodiment of the present invention.
- FIGS. 11 to 13 show data obtained by performing specific steps of the flowchart.
- the EBI system may acquire raw data regarding brain waves of a user using an EEG sensing unit (S1010).
- the EBI system may instruct the user to adopt various cognitive states (for example, selection/search/concentration/rest) through the above-described EBC interface, and sense the brain waves of each cognitive state to obtain raw data.
- the raw EEG data regarding the cognitive states of selection/search were as shown in FIG. 11(a).
- the raw EEG data regarding the cognitive states of concentration/rest were as shown in FIG. 12(a).
- referring to FIG. 11(a), the brain waves in the search state showed sharper changes than the brain waves in the selection state.
- referring to FIG. 12(a), it was found that the brain waves in the resting state and the brain waves in the concentrated state are difficult to clearly distinguish with the naked eye.
- an EEG is a signal formed as a combination of various sine-wave-like signals, and has characteristics in the frequency domain that distinguish specific cognitive states. Therefore, in order to more clearly distinguish the EEG according to cognitive state, an FFT may be performed on the raw data (S1020). In this case, Equation 2 below may be used.
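Equation 2 is not reproduced in this extracted text; assuming it is the standard discrete Fourier transform that the FFT computes, it would take the form:

```latex
% assumed form of Equation 2: the discrete Fourier transform of N raw samples x_n
X_k = \sum_{n=0}^{N-1} x_n \, e^{-i 2\pi k n / N}, \qquad k = 0, 1, \ldots, N-1
```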
- graphs of the FFT-converted raw data are shown in FIGS. 11(b) and 12(b).
- the brain waves in the search and selection states showed a large difference in the frequency band of about 0 to 10 Hz.
- referring to FIG. 12(b), the brain waves in the concentrated and resting states differed greatly in the frequency band of about 10–20 Hz.
- the EBI system may extract a frequency amplitude for each frequency band of the EEG from the sample data converted into the frequency domain region (S1030).
- the frequency bands that can be extracted are the δ wave (0–4 Hz), θ wave (4–8 Hz), α wave (8–13 Hz), and β wave (13–30 Hz), which may be subdivided into Low α (8–10 Hz), High α (10–13 Hz), Low β (13–20 Hz), and High β (20–30 Hz).
- the frequency size can be extracted for each of the divided bands, and the frequency size for each band can be applied to an algorithm for extracting the features of EEG.
- EEG shows different patterns for different users for the same stimulus. Therefore, in order to accurately process the EEG data for each user, a calibration process of the EEG for each user is required.
- to this end, an algorithm for extracting the frequency characteristics of the EEG according to the cognitive state of each user and setting a criterion for classifying the cognitive states of the EEG may be applied.
- in the present specification, Fisher's Ratio is used.
- Fisher's Ratio is a method of measuring the discriminative power between data groups, and the equation for obtaining this is as shown in Equation 3 below.
- m1 represents the mean of one data group among two groups
- m2 represents the mean of the remaining data groups
- v1 represents the variance of one data group
- v2 represents the variance of the other data groups.
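Equation 3 itself is not reproduced in this extracted text; given the variables defined above, the standard form of Fisher's Ratio for two groups is:

```latex
% assumed form of Equation 3: Fisher's Ratio between two data groups
\mathrm{Fisher's\ Ratio} = \frac{(m_1 - m_2)^2}{v_1 + v_2}
```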
- the average and the variance may be calculated using the frequency size extracted for each frequency band.
- for example, m1 and v1 may correspond to the mean and variance of the frequency amplitudes of the FFT-converted raw data acquired in the search (or concentration) state, and m2 and v2 may correspond to the mean and variance of the frequency amplitudes of the FFT-converted raw data acquired in the selection (or rest) state, respectively.
- Fisher's Ratio can be used to measure the discriminative power between two normal distributions. More specifically, by using Fisher's Ratio to find the frequency bands in which the amplitude difference between the user's specific cognitive states (e.g., selection, concentration, rest, search, etc.) is maximized, it is possible to find the optimal frequency bands for distinguishing specific cognitive states for each user.
- the Fisher's Ratio of the EEG is compared across frequency bands, and the two frequency bands with the highest and next-highest Fisher's Ratio may be selected as the characteristic frequency bands for distinguishing the cognitive states (S1050). In other words, Fisher's Ratio extracts the two characteristic frequency bands that most strongly affect each cognitive state; the larger the Fisher's Ratio, the higher the accuracy in distinguishing the cognitive states.
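The pipeline from raw epochs to the two selected bands (steps S1020 through S1050) might look like the following sketch; the sampling rate, epoch handling, and band subdivision are assumptions layered on the band edges given in the text.

```python
# Illustrative sketch: FFT the raw EEG of two cognitive states, take per-band
# amplitudes, and pick the two bands with the highest Fisher's Ratio.
import numpy as np

FS = 256  # assumed sampling rate (Hz)
BANDS = {"delta": (0, 4), "theta": (4, 8), "low_a": (8, 10), "high_a": (10, 13),
         "low_b": (13, 20), "high_b": (20, 30)}

def band_amplitudes(epoch: np.ndarray) -> dict:
    """Mean FFT magnitude of one EEG epoch in each predefined band."""
    spec = np.abs(np.fft.rfft(epoch))
    freqs = np.fft.rfftfreq(len(epoch), d=1 / FS)
    return {b: spec[(freqs >= lo) & (freqs < hi)].mean() for b, (lo, hi) in BANDS.items()}

def fisher_ratio(a: np.ndarray, b: np.ndarray) -> float:
    return (a.mean() - b.mean()) ** 2 / (a.var() + b.var())

def select_feature_bands(epochs_state1, epochs_state2):
    """Return the two bands that best separate the two cognitive states."""
    f1 = {b: np.array([band_amplitudes(e)[b] for e in epochs_state1]) for b in BANDS}
    f2 = {b: np.array([band_amplitudes(e)[b] for e in epochs_state2]) for b in BANDS}
    scores = {b: fisher_ratio(f1[b], f2[b]) for b in BANDS}
    return sorted(scores, key=scores.get, reverse=True)[:2]

rng = np.random.default_rng(0)
rest = [rng.normal(size=FS) for _ in range(20)]     # stand-ins for recorded epochs
focus = [rng.normal(size=FS) * 1.5 for _ in range(20)]
print(select_feature_bands(rest, focus))
```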
- Fig. 11 (c) shows Fisher's Ratio calculated from brain waves in the selection / search state
- Fig. 12 (c) shows Fisher's Ratio calculated from brain waves in the concentrated / rest state.
- referring to FIG. 11(c), the selection/search states were distinguished from each other in the frequency bands of about 0–5 Hz and about 5–10 Hz.
- the concentrated / relaxed states have characteristics that are distinguished from each other in a frequency band of about 0 to 5 Hz and a frequency band of about 10 to 20 Hz.
- FIGS. 11(d) and 12(d) are graphs plotting, in a two-dimensional region, the amplitudes of the feature frequency bands extracted through Fisher's Ratio for the selection/search/concentration/rest cognitive states. Referring to FIGS. 11(d) and 12(d), it was confirmed that data of the same cognitive state cluster at a specific position.
- next, the EBI system may apply a classification model to determine which group newly acquired data belongs to (S1060). That is, the EBI system may apply a classification model for determining which cognitive state a newly acquired EEG belongs to.
- the EBI system may apply SVM (Support Vector Machine) as the classification model. SVM is considered to have better generalization capability and performance than other classification models.
- the EBI system may distinguish (or classify) newly acquired EEG data in real time through the SVM for each cognitive state based on a feature acquired using Fisher's Ratio (S1070).
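In sketch form, steps S1060 and S1070 could pair the two Fisher-selected band amplitudes with an SVM classifier; the patent names SVM but no library, so the scikit-learn usage and all data below are illustrative assumptions.

```python
# Train an SVM on calibration epochs, then classify newly acquired EEG.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# 2D feature vectors: amplitudes in the two bands selected via Fisher's Ratio
X_rest = rng.normal(loc=[1.0, 3.0], scale=0.3, size=(40, 2))
X_focus = rng.normal(loc=[2.5, 1.5], scale=0.3, size=(40, 2))
X = np.vstack([X_rest, X_focus])
y = np.array(["rest"] * 40 + ["focus"] * 40)

clf = SVC(kernel="rbf").fit(X, y)            # train on calibration epochs

new_epoch_features = np.array([[2.4, 1.6]])  # newly acquired EEG, same features
print(clf.predict(new_epoch_features))       # -> ['focus']
```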
- the Fisher's Ratio and SVM techniques, which extract the features of the frequency bands, were able to distinguish the cognitive state of the brain waves with about 80% accuracy.
- conventionally, specific criteria and methods for calibrating a user's brain waves had not been established, so the accuracy of controlling a device using only the user's brain waves was low.
- however, since the cognitive state of the user's EEG can be distinguished more accurately through the calibration method proposed by the present invention, the user can precisely control the device according to his or her intention using the EEG alone.
- Each step of the flowchart of FIG. 10 may be performed by at least one device included in the EBI system, respectively.
- when the EBI system consists of one EBI device, the steps of the flowchart of FIG. 10 may be performed by the EBI device.
- when the EBI system includes a slave device and a host device, some of the steps of the flowchart of FIG. 10 may be performed by the slave device, and the remaining steps may be performed by the host device.
- FIG. 13 is a diagram illustrating an embodiment of a recalibration method according to an embodiment of the present invention.
- the EBI system may map / classify data on newly acquired gaze and brain waves based on the calibration result, and perform various commands corresponding to the mapping / classification state. For example, the EBI system may map a user's gaze and a specific icon on the screen based on the calibration result. In addition, when the EBI system additionally acquires EEG data classified into a concentrated (or selected) state while looking at the icon, the EBI system may perform a command for selecting and executing the icon.
- the EBI system performs a mapping / classification operation of the newly acquired data based on the calibration result, and performs a command corresponding to the mapped / classified data.
- the environment at the time of calibration may be different from the current environment, or the accuracy of the calibration result may be reduced due to the change of the user or the user's environment.
- in this case, the EBI system needs to perform recalibration.
- Recalibration of an EBI system can be triggered in various embodiments.
- the recalibration of the EBI system can be triggered directly from the user.
- the EBI system may perform recalibration when it receives a user input for commanding recalibration.
- the user input may represent various types of input such as a voice, a touch, a gesture, a motion, and a motion of the user.
- alternatively, the recalibration of the EBI system may be automatically triggered by measuring the user's stress index. If the device does not operate in accordance with the user's EEG and intention (i.e., in case of malfunction), the user's stress index may increase. Therefore, when the user's stress index falls outside a predetermined threshold (TH) range, the EBI system may determine that recalibration is necessary and perform it.
- Beta wave and gamma wave among the brain waves of the user are known to be associated with the stress index. Accordingly, the EBI system may measure gamma and beta waves of the user EEG in real time, and perform recalibration when a specific wave is out of a predetermined threshold range.
- in addition, the EBI system can measure, in real time, vital signs known to be associated with the stress index, such as heart rate and blood pressure, and trigger recalibration based on the measurement results.
- when recalibration is triggered, the EBI system may provide the EBC interface to the user again.
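A toy check for this trigger logic; the band-power inputs and threshold ranges below are illustrative assumptions, not values from the patent.

```python
# Trigger recalibration when a stress-linked EEG band leaves its expected range.
def needs_recalibration(beta_power: float, gamma_power: float,
                        beta_range=(0.5, 2.0), gamma_range=(0.2, 1.0)) -> bool:
    beta_ok = beta_range[0] <= beta_power <= beta_range[1]
    gamma_ok = gamma_range[0] <= gamma_power <= gamma_range[1]
    return not (beta_ok and gamma_ok)

if needs_recalibration(beta_power=2.7, gamma_power=0.6):
    print("stress index out of range -> provide the EBC interface again")
```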
- FIG. 14 illustrates various applications of an EBI system in accordance with an embodiment of the present invention.
- the EBI system may be applied to various technical fields such as a drone control technology field, a home network technology field, an education field, a portable device technology field, a vehicle control technology field, an entertainment field, and the like.
- the host device may be the drone 140-1 and the slave device may be the wearable device 100.
- the user wears the slave device to control the drone through brain waves and gaze.
- When the wearable device 100 is a headset worn on the user's head, the user may control the movement of the drone 140-1 through head movement.
- When the user moves the head forward/backward/left/right, the drone may move forward/backward/left/right following the movement of the user's head.
- When the user wearing the wearable device 100 concentrates while looking at the drone 140-1, the moving speed of the drone 140-1 may increase; when the user looks at the drone and relaxes, the drone 140-1 may stop and hold its position without moving.
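- Purely for illustration, the head-pose and concentration behavior described above could be reduced to a mapping like the sketch below; the pose inputs and velocity-command structure are assumptions, not an actual drone API.

```python
def drone_command(head_pitch, head_roll, eeg_state, base_speed=1.0):
    """Map head pose and classified EEG state to a drone velocity command.

    head_pitch > 0 means the head is tilted forward; head_roll > 0, tilted right.
    eeg_state is the classifier output, e.g. "focused" or "rest" (assumed labels).
    """
    if eeg_state == "rest":
        return {"vx": 0.0, "vy": 0.0}  # relaxed gaze: stop and hold position
    # Concentration increases speed; otherwise move at the base speed.
    speed = base_speed * (2.0 if eeg_state == "focused" else 1.0)
    return {
        "vx": speed if head_pitch > 0 else -speed if head_pitch < 0 else 0.0,
        "vy": speed if head_roll > 0 else -speed if head_roll < 0 else 0.0,
    }
```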
- the drone 140-1 may operate based on various biosignals of the user.
- The EBI system can also be applied to vehicle control technology.
- For example, the EBI system may be applied to the control of various vehicles such as cars, airplanes, and bicycles.
- In this case, the vehicle 140-4 may be the host device, and the wearable device 100 worn on the user's body may be the slave device.
- In the home network field, the various home devices 140-2 located in the home may be the host device, and the wearable device 100 worn on the user's body may be the slave device.
- The user may control the home devices simply by looking at a specific home device 140-2 while wearing the wearable device 100 and issuing a specific command through the EEG.
- For example, the light bulb 140-2 may be turned on or off.
- In the education field, the various educational devices 140-3 may be the host device, and the wearable device 100 worn on the user's body may be the slave device.
- The EBI system may measure the user's concentration and track in real time how focused the user currently is. The EBI system may thereby help improve learning efficiency, for example by recommending that the user later review the parts of the material studied while concentration was low.
- In addition, the EBI system may be applied to various other technical fields.
- The EBI system may be applied to any field in which control using a user's biosignals is applicable, and is not limited to the above-described embodiments.
- When only one of the EEG and the gaze is required as the control signal, the EBI system may perform calibration only for the required signal. That is, depending on the embodiment, the EBI system may calibrate the EEG and the gaze simultaneously, or calibrate only one of the two.
- FIG. 15 is a flowchart illustrating a control method of an EBI system according to an embodiment of the present invention.
- The descriptions of the above-described embodiments apply equally to this flowchart. Therefore, description overlapping with the foregoing is omitted below.
- First, the EBI system may provide an EBC interface (S1510). More specifically, the EBI system may provide an EBC interface for simultaneously calibrating the user's gaze and brain waves.
- The EBC interface may include at least one visual object and may instruct the user to gaze at the visual object in a specific cognitive state.
- the EBC interface may be provided to the user in various embodiments, as described above with respect to FIG. 6.
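- One detail of the EBC interface, spelled out in claims 9 and 10 below, is that the blinking frequency of the visual object can be tuned to induce the user's brain waves toward a target band (about 8-13 Hz for alpha, about 13-30 Hz for beta). A toy flicker scheduler under that reading, with `toggle_visual_object` as a hypothetical display hook, could look like this.

```python
import time

def flicker(toggle_visual_object, target="alpha", duration_s=5.0):
    """Blink a visual object at a representative frequency for the target band."""
    hz = {"alpha": 10.0, "beta": 20.0}[target]  # representative frequencies (assumed)
    half_period = 1.0 / (2.0 * hz)              # toggle on/off once per half period
    t_end = time.monotonic() + duration_s
    visible = True
    while time.monotonic() < t_end:
        toggle_visual_object(visible)           # show or hide the visual object
        visible = not visible
        time.sleep(half_period)
```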
- Next, the EBI system may acquire the user's gaze and brain waves (S1520). More specifically, the EBI system may acquire the user's gaze and EEG with respect to the EBC interface using the gaze tracking unit and the EEG sensing unit.
- the EBI system may map the user's gaze with the visual object provided by the EBC interface (S1530).
- the EBI system may mutually map the position coordinates of the visual object and the gaze coordinates of the user.
- Here, the EBI system may map the position of the visual object and the gaze of the user through multivariate linear regression, as described above with reference to FIG. 8.
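- A least-squares sketch of this multivariate linear regression step (S1530) is shown below, assuming the raw gaze is summarized as a small feature vector such as pupil-center coordinates; the feature choice is an assumption for illustration.

```python
import numpy as np

def fit_gaze_mapping(gaze_features, target_coords):
    """Fit a linear map from raw gaze features to calibration-target screen coords.

    gaze_features: (N, d) array of raw gaze samples (e.g. pupil-center x, y).
    target_coords: (N, 2) array of on-screen (x, y) positions of the visual objects.
    Returns W of shape (d + 1, 2), including a bias row.
    """
    X = np.hstack([gaze_features, np.ones((len(gaze_features), 1))])  # append bias term
    W, *_ = np.linalg.lstsq(X, target_coords, rcond=None)             # least squares
    return W

def map_gaze(W, gaze_feature):
    """Predict the on-screen (x, y) point for one raw gaze feature vector."""
    x = np.append(gaze_feature, 1.0)
    return x @ W
```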
- the EBI system may map the brain wave of the user to a specific cognitive state indicated by the EBC interface (S1540).
- the EBI system may set the classification criteria for classifying the specific cognitive state by acquiring the raw data for the specific cognitive state and processing the acquired data through a predetermined algorithm.
- An embodiment of a preset algorithm for setting classification criteria is as described above with reference to FIG. 10.
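- Read together with claims 5 to 7 below, that algorithm can be sketched as follows: frequency-transform raw EEG windows from the two cognitive states, compute per-band amplitudes, rank the bands by Fisher's ratio, and keep the top two bands as the classification criteria. The band edges, sampling rate, and the standard Fisher's-ratio form (mean1 - mean2)^2 / (var1 + var2) are assumptions consistent with, but not dictated by, the text.

```python
import numpy as np

FS = 250  # EEG sampling rate in Hz (assumed)
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_amplitudes(windows, fs=FS):
    """(N, T) raw EEG windows -> per-window mean spectral amplitude for each band."""
    spectra = np.abs(np.fft.rfft(windows, axis=1))
    freqs = np.fft.rfftfreq(windows.shape[1], d=1.0 / fs)
    return {band: spectra[:, (freqs >= lo) & (freqs < hi)].mean(axis=1)
            for band, (lo, hi) in BANDS.items()}

def select_bands(state1_windows, state2_windows):
    """Return the two bands that best separate the two cognitive states."""
    a1, a2 = band_amplitudes(state1_windows), band_amplitudes(state2_windows)
    # Fisher's ratio per band: (mean1 - mean2)^2 / (var1 + var2)
    fisher = {b: (a1[b].mean() - a2[b].mean()) ** 2 / (a1[b].var() + a2[b].var())
              for b in BANDS}
    ranked = sorted(fisher, key=fisher.get, reverse=True)
    return ranked[:2]  # highest and second-highest bands become the criteria
```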
- The order of steps S1530 and S1540 may be changed, and new steps may be added or some steps deleted depending on the embodiment.
- In addition, the EBI system may acquire an iris image from the user's eyes, code the obtained iris image, and use it as user authentication information, as described above with reference to FIG. 9.
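- Following the reading in claims 11 and 12 below (separate the iris image into plural sub-images, arrange them in one direction, and merge them into a single two-dimensional image), a toy version of the coding step might be the following; the four-way split is an illustrative assumption.

```python
import numpy as np

def code_iris(iris_image, n_slices=4):
    """Turn an (H, W) grayscale iris image into a single strip-shaped 2-D code.

    Separates the image into sub-images, arranges them in one direction,
    and converts them into one 2-D image, per the claimed coding steps.
    """
    slices = np.array_split(iris_image, n_slices, axis=0)  # separate into sub-images
    h = min(s.shape[0] for s in slices)                    # equalize slice heights
    strip = np.hstack([s[:h] for s in slices])             # arrange in one direction
    return strip                                           # single 2-D code image
```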
- Each step of this flowchart may be performed by at least one device included in an EBI system. If the EBI system includes one EBI device, the steps shown in this flowchart may be performed by one EBI device. If the EBI system includes a slave device and a host device, some of the steps shown in this flowchart may be performed by the slave device and the others by the host device.
- steps S1510, S1530, and S1540 may be performed by the host device, and step S1520 may be performed by the slave device.
- For example, the host device may receive (or request and receive) from the slave device the data resulting from step S1520, and perform steps S1530 and S1540 based on the received data.
- In this way, the device performing each step of the flowchart may be changed flexibly, and data or signals may be transmitted/received between devices to perform each step. Therefore, in the present specification, when the EBI system is composed of a plurality of devices, the data required to perform a specific step may be regarded as being transmitted/received between those devices, even where this is not stated explicitly.
- Although the drawings have been described separately for convenience, the embodiments described in the respective drawings may be combined to implement new embodiments.
- Furthermore, the device described herein is not limited to the configurations and methods of the embodiments above; all or some of the embodiments may be selectively combined so that various modifications can be made.
Claims (20)
- 1. A method of calibrating an eye-brain interface (EBI) system controlled based on eye gaze and brain waves, the method comprising: providing an eye-brain calibration (EBC) interface for calibrating the gaze and the brain waves together, wherein the EBC interface includes a visual object and instructs a user to gaze at the visual object in a specific cognitive state; acquiring the user's gaze and brain waves with respect to the visual object included in the EBC interface; mapping the visual object and the user's gaze; and mapping the specific cognitive state indicated to the user and the user's brain waves.
- 2. The method of claim 1, wherein mapping the user's gaze comprises mutually mapping the on-screen position coordinates of the visual object and the position coordinates of the user's gaze.
- 3. The method of claim 1, wherein the EBC interface sequentially and/or alternately provides a first visual object indicating a first cognitive state and a second visual object indicating a second cognitive state.
- 4. The method of claim 3, wherein the first cognitive state is a cognitive state including at least one of concentration and selection, and the second cognitive state is a cognitive state including at least one of rest and search.
- 5. The method of claim 3, wherein mapping the user's brain waves comprises: acquiring first raw data for brain waves in the first cognitive state and second raw data for brain waves in the second cognitive state; frequency-transforming the first and second raw data; and setting classification criteria for the first and second cognitive states based on frequency characteristics of the frequency-transformed first and second raw data.
- 6. The method of claim 5, wherein setting the classification criteria for the first and second cognitive states comprises: extracting frequency amplitudes of frequency bands in a preset range from the frequency-transformed first and second raw data; obtaining a Fisher's ratio for each frequency band using the extracted frequency amplitudes; selecting a first frequency band having the highest Fisher's ratio and a second frequency band having the second-highest Fisher's ratio; and setting the first and second frequency bands as the classification criteria for the first and second cognitive states.
- 7. The method of claim 6, wherein the Fisher's ratio is a value calculated based on the mean and variance of the frequency amplitudes in the frequency-transformed first raw data and the mean and variance of the frequency amplitudes in the frequency-transformed second raw data.
- 8. The method of claim 5, wherein the frequency bands in the preset range correspond to the δ-wave band, θ-wave band, α-wave band, or β-wave band of the brain waves.
- 9. The method of claim 1, wherein the EBC interface induces the user's brain waves into a specific frequency band by adjusting a frequency at which the visual object blinks.
- 10. The method of claim 9, wherein the EBC interface induces the user's brain waves into the alpha-wave range by adjusting the blinking frequency of the visual object to a range of approximately 8 to 13 Hz, and induces the user's brain waves into the beta-wave range by adjusting the blinking frequency of the visual object to a range of approximately 13 to 30 Hz.
- 11. The method of claim 1, further comprising: acquiring an iris image from the user's eyes; and coding the iris image.
- 12. The method of claim 11, wherein coding the iris image comprises: separating the acquired iris image into a plurality of images; arranging the separated plurality of images in one direction; and converting the images arranged in one direction into a single two-dimensional image.
- 13. A slave device for measuring eye gaze and brain waves, comprising: a gaze tracking unit configured to track a user's gaze; an EEG sensing unit configured to sense the user's brain waves; a communication unit configured to communicate with a host device; and a processor configured to control the gaze tracking unit, the EEG sensing unit, and the communication unit, wherein the host device is a device that provides an eye-brain calibration (EBC) interface for simultaneously calibrating the gaze and the brain waves, the EBC interface including a visual object and instructing the user to gaze at the visual object in a specific cognitive state, and wherein, upon receiving a calibration start signal from the host device, the processor acquires the user's gaze and brain waves together and transmits the user's gaze and brain waves to the host device.
- 14. A host device controlled based on eye gaze and brain waves, comprising: a display unit configured to display an image; a communication unit configured to communicate with a slave device; and a processor configured to control the display unit and the communication unit, wherein the processor: provides an eye-brain calibration (EBC) interface for simultaneously calibrating the gaze and the brain waves, the EBC interface including a visual object and instructing a user to gaze at the visual object in a specific cognitive state; requests and receives the user's gaze and brain waves from the slave device; maps the visual object and the user's gaze; and maps the specific cognitive state indicated to the user and the user's brain waves.
- 15. The host device of claim 14, wherein, when mapping the user's gaze, the processor mutually maps the on-screen position coordinates of the visual object and the position coordinates of the user's gaze.
- 16. The host device of claim 14, wherein the EBC interface sequentially and/or alternately provides a first visual object indicating a first cognitive state and a second visual object indicating a second cognitive state.
- 17. The host device of claim 16, wherein the first cognitive state is a cognitive state of concentration or selection, and the second cognitive state is a cognitive state of rest or search.
- 18. The host device of claim 16, wherein, when mapping the user's brain waves, the processor: acquires first raw data for brain waves in the first cognitive state and second raw data for brain waves in the second cognitive state; frequency-transforms the first and second raw data; extracts frequency amplitudes of frequency bands in a preset range from the frequency-transformed first and second raw data; obtains a Fisher's ratio for each frequency band using the extracted frequency amplitudes; selects a first frequency band having the highest Fisher's ratio and a second frequency band having the second-highest Fisher's ratio; and sets the first and second frequency bands as classification criteria for the first and second cognitive states.
- 19. The host device of claim 16, wherein the processor acquires the user's brain waves in real time and classifies the brain waves acquired in real time according to the classification criteria in real time.
- 20. The host device of claim 14, wherein the EBC interface induces the user's brain waves into a specific frequency band by adjusting a frequency at which the visual object blinks.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201580085723.9A CN108700931A (zh) | 2015-12-17 | 2015-12-17 | 眼睛-大脑接口(ebi)系统及其控制方法 |
PCT/KR2015/013894 WO2017104869A1 (ko) | 2015-12-17 | 2015-12-17 | 아이 브레인 인터페이스(ebi) 장치 및 그 제어 방법 |
JP2018551726A JP6664512B2 (ja) | 2015-12-17 | 2015-12-17 | アイブレインインターフェースシステムのキャリブレーション方法、及びシステム内のスレーブデバイス、ホストデバイス |
US15/740,298 US10481683B2 (en) | 2015-12-17 | 2015-12-17 | Eye-brain interface (EBI) system and method for controlling same |
EP15910792.9A EP3392739B1 (en) | 2015-12-17 | 2015-12-17 | Eye-brain interface (ebi) system and method for controlling same |
US16/601,418 US10860097B2 (en) | 2015-12-17 | 2019-10-14 | Eye-brain interface (EBI) system and method for controlling same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/KR2015/013894 WO2017104869A1 (ko) | 2015-12-17 | 2015-12-17 | 아이 브레인 인터페이스(ebi) 장치 및 그 제어 방법 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/740,298 A-371-Of-International US10481683B2 (en) | 2015-12-17 | 2015-12-17 | Eye-brain interface (EBI) system and method for controlling same |
US16/601,418 Continuation US10860097B2 (en) | 2015-12-17 | 2019-10-14 | Eye-brain interface (EBI) system and method for controlling same |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017104869A1 true WO2017104869A1 (ko) | 2017-06-22 |
Family
ID=59056842
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2015/013894 WO2017104869A1 (ko) | 2015-12-17 | 2015-12-17 | 아이 브레인 인터페이스(ebi) 장치 및 그 제어 방법 |
Country Status (5)
Country | Link |
---|---|
US (2) | US10481683B2 (ko) |
EP (1) | EP3392739B1 (ko) |
JP (1) | JP6664512B2 (ko) |
CN (1) | CN108700931A (ko) |
WO (1) | WO2017104869A1 (ko) |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11317861B2 (en) | 2013-08-13 | 2022-05-03 | Sync-Think, Inc. | Vestibular-ocular reflex test and training system |
US9958939B2 (en) * | 2013-10-31 | 2018-05-01 | Sync-Think, Inc. | System and method for dynamic content delivery based on gaze analytics |
JP6525010B2 (ja) * | 2014-08-05 | 2019-06-05 | ソニー株式会社 | 情報処理装置及び情報処理方法、並びに画像表示システム |
US10359842B2 (en) * | 2014-09-16 | 2019-07-23 | Ricoh Company, Limited | Information processing system and information processing method |
CN105528084A (zh) * | 2016-01-21 | 2016-04-27 | 京东方科技集团股份有限公司 | 一种显示控制装置及其控制方法、显示控制系统 |
WO2018164960A1 (en) * | 2017-03-07 | 2018-09-13 | Cornell University | Sensory evoked response based attention evaluation systems and methods |
US10877647B2 (en) * | 2017-03-21 | 2020-12-29 | Hewlett-Packard Development Company, L.P. | Estimations within displays |
CN108733203A (zh) * | 2017-04-20 | 2018-11-02 | 上海耕岩智能科技有限公司 | 一种眼球追踪操作的方法和装置 |
CN107105369A (zh) * | 2017-06-29 | 2017-08-29 | 京东方科技集团股份有限公司 | 声音定向切换装置及显示系统 |
CN109938727A (zh) * | 2017-12-20 | 2019-06-28 | 中国科学院深圳先进技术研究院 | 非人灵长类动物三维视觉刺激实验系统和方法 |
US10861215B2 (en) * | 2018-04-30 | 2020-12-08 | Qualcomm Incorporated | Asynchronous time and space warp with determination of region of interest |
CN109464239A (zh) * | 2019-01-09 | 2019-03-15 | 浙江强脑科技有限公司 | 基于脑波控制的智能轮椅 |
CN109846477B (zh) * | 2019-01-29 | 2021-08-06 | 北京工业大学 | 一种基于频带注意力残差网络的脑电分类方法 |
USD927005S1 (en) * | 2019-02-28 | 2021-08-03 | Helius Medical, Inc | Non-invasive neurostimulation device |
USD916300S1 (en) * | 2019-02-28 | 2021-04-13 | Helius Medical, Inc | Non-invasive neurostimulation device |
US11786694B2 (en) | 2019-05-24 | 2023-10-17 | NeuroLight, Inc. | Device, method, and app for facilitating sleep |
US10854012B1 (en) * | 2019-05-29 | 2020-12-01 | Dell Products, L.P. | Concealing loss of distributed simultaneous localization and mapping (SLAM) data in edge cloud architectures |
EP3991013A1 (en) * | 2019-06-28 | 2022-05-04 | Sony Group Corporation | Method, computer program and head-mounted device for triggering an action, method and computer program for a computing device and computing device |
CN111290580B (zh) * | 2020-02-13 | 2022-05-31 | Oppo广东移动通信有限公司 | 基于视线追踪的校准方法及相关装置 |
US20230065296A1 (en) * | 2021-08-30 | 2023-03-02 | Facebook Technologies, Llc | Eye-tracking using embedded electrodes in a wearable device |
WO2024040360A1 (zh) * | 2022-08-26 | 2024-02-29 | 吕馨 | Brain reality |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004152046A (ja) | 2002-10-31 | 2004-05-27 | Oki Electric Ind Co Ltd | 利用者認証方法および生体情報記録装置、利用者認証装置、利用者認証システム並びにチケット発行装置 |
JP2006136464A (ja) | 2004-11-11 | 2006-06-01 | Nec Commun Syst Ltd | 携帯電話機及び携帯電話機用プログラム |
JP4537901B2 (ja) * | 2005-07-14 | 2010-09-08 | 日本放送協会 | 視線測定装置および視線測定プログラム、ならびに、視線校正データ生成プログラム |
WO2008056492A1 (fr) | 2006-11-06 | 2008-05-15 | Panasonic Corporation | Dispositif de réglage d'un procédé d'identification d'ondes cérébrales et procédé |
WO2008059878A1 (fr) * | 2006-11-15 | 2008-05-22 | Panasonic Corporation | Dispositif d'ajustement pour un procédé d'identification d'ondes cérébrales, procédé d'ajustement et programme informatique |
CN101681201B (zh) * | 2008-01-25 | 2012-10-17 | 松下电器产业株式会社 | 脑波接口系统、脑波接口装置、方法 |
CN102542243A (zh) * | 2010-12-17 | 2012-07-04 | 北京理工大学 | 一种基于lbp图像和分块编码的虹膜特征提取方法 |
WO2012133185A1 (ja) | 2011-03-31 | 2012-10-04 | 独立行政法人理化学研究所 | 脳波解析装置、脳波解析方法、プログラム、及び記録媒体 |
US20120257035A1 (en) * | 2011-04-08 | 2012-10-11 | Sony Computer Entertainment Inc. | Systems and methods for providing feedback by tracking user gaze and gestures |
KR20140011204A (ko) * | 2012-07-18 | 2014-01-28 | 삼성전자주식회사 | 컨텐츠 제공 방법 및 이를 적용한 디스플레이 장치 |
US9699433B2 (en) * | 2013-01-24 | 2017-07-04 | Yuchen Zhou | Method and apparatus to produce re-focusable vision with detecting re-focusing event from human eye |
WO2015047032A1 (ko) | 2013-09-30 | 2015-04-02 | 삼성전자 주식회사 | 생체 신호에 기초하여 컨텐츠를 처리하는 방법, 및 그에 따른 디바이스 |
EP3392739B1 (en) | 2015-12-17 | 2022-04-20 | Looxid Labs Inc. | Eye-brain interface (ebi) system and method for controlling same |
2015
- 2015-12-17 EP EP15910792.9A patent/EP3392739B1/en active Active
- 2015-12-17 WO PCT/KR2015/013894 patent/WO2017104869A1/ko active Application Filing
- 2015-12-17 CN CN201580085723.9A patent/CN108700931A/zh active Pending
- 2015-12-17 JP JP2018551726A patent/JP6664512B2/ja not_active Expired - Fee Related
- 2015-12-17 US US15/740,298 patent/US10481683B2/en not_active Expired - Fee Related
2019
- 2019-10-14 US US16/601,418 patent/US10860097B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20120124772A (ko) * | 2011-05-04 | 2012-11-14 | 경북대학교 산학협력단 | 사용자 집중도 분석장치 및 방법 |
KR20130015488A (ko) * | 2011-08-03 | 2013-02-14 | 동국대학교 산학협력단 | 인터페이스 시스템 및 방법 |
US20130307771A1 (en) * | 2012-05-18 | 2013-11-21 | Microsoft Corporation | Interaction and management of devices using gaze detection |
EP2685351A1 (en) * | 2012-07-10 | 2014-01-15 | Thomson Licensing | Method for calibration free gaze tracking using low cost camera |
US20140347265A1 (en) * | 2013-03-15 | 2014-11-27 | Interaxon Inc. | Wearable computing apparatus and method |
Non-Patent Citations (1)
Title |
---|
See also references of EP3392739A4 * |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10481683B2 (en) | 2015-12-17 | 2019-11-19 | Looxid Labs Inc. | Eye-brain interface (EBI) system and method for controlling same |
US10860097B2 (en) | 2015-12-17 | 2020-12-08 | Looxid Labs, Inc. | Eye-brain interface (EBI) system and method for controlling same |
CN111629653A (zh) * | 2017-08-23 | 2020-09-04 | 神经股份有限公司 | 具有高速眼睛跟踪特征的大脑-计算机接口 |
JP2020532031A (ja) * | 2017-08-23 | 2020-11-05 | ニューラブル インコーポレイテッド | 高速視標追跡機能を有する脳−コンピュータインタフェース |
US11972049B2 (en) | 2017-08-23 | 2024-04-30 | Neurable Inc. | Brain-computer interface with high-speed eye tracking features |
WO2019053399A1 (en) * | 2017-09-13 | 2019-03-21 | Sandeep Kumar Chintala | SYSTEM AND METHOD FOR CONTACTLESS CONTROL OF A TECHNICAL FIELD OF APPARATUS |
US10606260B2 (en) * | 2017-09-27 | 2020-03-31 | Intel IP Corporation | Ocular navigation of unmanned aerial vehicle |
CN111542800A (zh) * | 2017-11-13 | 2020-08-14 | 神经股份有限公司 | 具有对于高速、精确和直观的用户交互的适配的大脑-计算机接口 |
JP2021502659A (ja) * | 2017-11-13 | 2021-01-28 | ニューラブル インコーポレイテッド | 高速、正確及び直観的なユーザ対話のための適合を有する脳−コンピュータインターフェース |
JP2019129914A (ja) * | 2018-01-29 | 2019-08-08 | 富士ゼロックス株式会社 | 情報処理装置、情報処理システム及びプログラム |
JP2019129913A (ja) * | 2018-01-29 | 2019-08-08 | 富士ゼロックス株式会社 | 情報処理装置、情報処理システム及びプログラム |
US11412974B2 (en) | 2018-01-29 | 2022-08-16 | Agama-X Co., Ltd. | Information processing apparatus, information processing system, and non-transitory computer readable medium |
JP2021529368A (ja) * | 2018-06-21 | 2021-10-28 | インターナショナル・ビジネス・マシーンズ・コーポレーションInternational Business Machines Corporation | 理学療法用の仮想環境 |
JP7289082B2 (ja) | 2018-06-21 | 2023-06-09 | インターナショナル・ビジネス・マシーンズ・コーポレーション | 理学療法用の仮想環境 |
Also Published As
Publication number | Publication date |
---|---|
CN108700931A (zh) | 2018-10-23 |
JP2019506691A (ja) | 2019-03-07 |
US10481683B2 (en) | 2019-11-19 |
JP6664512B2 (ja) | 2020-03-13 |
EP3392739A4 (en) | 2019-08-28 |
US10860097B2 (en) | 2020-12-08 |
US20200057495A1 (en) | 2020-02-20 |
EP3392739B1 (en) | 2022-04-20 |
US20180196511A1 (en) | 2018-07-12 |
EP3392739A1 (en) | 2018-10-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2017104869A1 (ko) | 아이 브레인 인터페이스(ebi) 장치 및 그 제어 방법 | |
US10820850B2 (en) | Systems and methods for measuring reactions of head, eyes, eyelids and pupils | |
JP6929644B2 (ja) | 注視によるメディア選択及び編集のためのシステム並びに方法 | |
JP6184989B2 (ja) | 目の動きをモニターするバイオセンサ、コミュニケーター及びコントローラー並びにそれらの使用方法 | |
KR102029219B1 (ko) | 뇌 신호를 추정하여 사용자 의도를 인식하는 방법, 그리고 이를 구현한 헤드 마운트 디스플레이 기반 뇌-컴퓨터 인터페이스 장치 | |
KR101723841B1 (ko) | 아이 브레인 인터페이스(ebi) 장치 및 그 제어 방법 | |
US11081015B2 (en) | Training device, training method, and program | |
US10948988B1 (en) | Contextual awareness based on eye motion tracking by an eye-mounted system | |
WO2018080202A1 (ko) | 머리 착용형 디스플레이 장치 및 그의 제어 방법 | |
US20220293241A1 (en) | Systems and methods for signaling cognitive-state transitions | |
WO2018207959A1 (ko) | 이미지 처리 장치 및 방법 | |
TW201816545A (zh) | 虛擬實境頭戴式裝置 | |
CN113995416A (zh) | 用于显示眼镜中的用户界面的装置和方法 | |
US20170242482A1 (en) | Training device, corresponding area specifying method, and program | |
WO2016111421A1 (ko) | 눈 영상에 기반한 사용자 인터페이스를 제공하기 위한 방법, 시스템 및 비일시성의 컴퓨터 판독 가능한 기록 매체 | |
US20220236795A1 (en) | Systems and methods for signaling the onset of a user's intent to interact | |
US20220191296A1 (en) | Devices, systems, and methods for modifying features of applications based on predicted intentions of users | |
KR20220162566A (ko) | 사용자의 스트레스 완화를 위해 상호작용하는 가상동물을 제공하는 전자장치 및 이의 제어방법 | |
CN116830064A (zh) | 用于预测交互意图的系统和方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15910792; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2018551726; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | WWE | Wipo information: entry into national phase | Ref document number: 2015910792; Country of ref document: EP |
| | ENP | Entry into the national phase | Ref document number: 2015910792; Country of ref document: EP; Effective date: 20180717 |