KR101728707B1 - Method and program for controlling electronic device by wearable glass device - Google Patents

Method and program for controlling electronic device by wearable glass device

Info

Publication number
KR101728707B1
Authority
KR
South Korea
Prior art keywords
glass
wearable device
electronic device
user
control
Prior art date
Application number
KR1020150042941A
Other languages
Korean (ko)
Other versions
KR20160024733A (en)
Inventor
한성철
엄정한
김진영
이경현
김대중
김석기
유철현
Original Assignee
넥시스 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 넥시스 주식회사
Priority to PCT/KR2015/007914 (WO2016018063A2)
Publication of KR20160024733A
Application granted
Publication of KR101728707B1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04Q: SELECTING
    • H04Q 9/00: Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013: Eye tracking input arrangements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16: Sound input; Sound output
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04Q: SELECTING
    • H04Q 2209/00: Arrangements in telecontrol or telemetry systems
    • H04Q 2209/40: Arrangements in telecontrol or telemetry systems using a wireless architecture

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a method and a program for controlling indoor electronic devices using a glass-type wearable device.
A method of controlling an indoor electronic device using a glass-type wearable device according to an embodiment of the present invention includes: measuring the current indoor position of the glass-type wearable device (S100); measuring the direction of the user's gaze by recognizing the azimuth or elevation angle of the glass-type wearable device (S110); determining the electronic device located in the gaze direction at the measured indoor position as the control target (S120); receiving a control command for the electronic device from the user (S130); and transmitting the control command to the electronic device via wireless communication (S140). According to the present invention, an electronic device in the home can be controlled from a distance, removing the inconvenience of having to walk over to the device in order to operate it directly.

Description

TECHNICAL FIELD [0001] The present invention relates to a method and a control program for controlling indoor electronic devices using a glass-type wearable device.

More particularly, the present invention relates to a system, method, and control program in which a user wearing a glass-type wearable device selects an indoor electronic device and controls it through the wearable device.

Recently, wearable devices have been emerging. They have appeared in the form of glasses linked to a smartphone, and some forms that can operate independently of a smartphone are also appearing.

Recently, with the emergence of the Internet of Things (IoT), many methods of controlling electronic devices connected over a wireless network with a smartphone have appeared. However, controlling indoor electronic devices with a smartphone has the disadvantage that the user must always carry the smartphone in order to control the devices connected to the wireless network. Also, when a smartphone is used, the ways of selecting an electronic device and of entering a control command may be limited.

SUMMARY OF THE INVENTION It is an object of the present invention to provide an indoor electronic device control method and control program using a glass-type wearable device, in which the user selects the electronic device to be controlled and inputs a control command through the glass-type wearable device, so that the desired electronic device can be controlled.

According to an embodiment of the present invention, there is provided a method of controlling an indoor electronic device using a glass-type wearable device, the method comprising: measuring the current indoor position of the glass-type wearable device; measuring the direction of the user's gaze, the gaze direction being the direction in which the face of the user wearing the glass-type wearable device is facing; determining as the control target the electronic device located in the gaze direction at the measured indoor position, by referring to an indoor map that contains the location of each electronic device; receiving, by the glass-type wearable device, a control command for the electronic device from the user; and transmitting, by the glass-type wearable device, the control command to the electronic device via wireless communication.
According to an embodiment of the present invention, the method may further include acquiring, by the glass-type wearable device, the eyeball gaze direction through the second camera, the eyeball gaze direction being the direction in which the user's pupil is looking and the second camera being a camera provided at one side of the glass-type wearable device to capture the user's eye. The control target determination step may then determine the control target by applying the acquired eyeball gaze direction to the measured face direction.
According to an embodiment of the present invention, the method may further include, when one or more electronic devices are determined to be control target candidates, displaying a list of the candidate electronic devices on the screen of the glass-type wearable device, and receiving from the user a selection of a specific electronic device from the list.
According to an embodiment of the present invention, in the control command receiving step, the glass-type wearable device may receive the control command through one or more of eye-blink pattern recognition, head-movement pattern recognition, touch input, voice input recognition, and hand-gesture recognition.
According to an embodiment of the present invention, the glass-type wearable device may receive the control result of the electronic device according to the control command and notify the user of the result.
According to an embodiment of the present invention, the method may further include receiving an input method or input pattern from the user, and setting, at the user's request, the received input method or input pattern as a specific control command for a specific electronic device.

An electronic device control program for a glass-like wearable device according to another embodiment of the present invention is combined with hardware to execute the above-mentioned electronic device control method, and is stored in a medium.

According to the present invention as described above, the following various effects are obtained.

First, according to the present invention, an electronic device in the home can be controlled from a distance, eliminating the inconvenience of having to walk over to the device in order to operate it directly. For example, a user lying in bed no longer has to get up and go to the light switch to turn off the light.

Second, if each electronic device in the home can be connected to wireless communication, it can be controlled with the glass-type wearable device without a separate remote control for each device.

Third, the electronic device can be controlled by a simple operation such as an eye-blink pattern or a head-movement pattern, and the user can select the device to be controlled simply by looking at it. For example, if the user lying in bed wishes to turn off the audio that is playing, the user may gaze at the audio while wearing the glass-type wearable device and input the eye-blink pattern corresponding to audio off.

1 is an internal configuration diagram of a glass-type wearable device system according to an embodiment of the present invention.
2 is a flowchart of a method of controlling an indoor electronic device using a glass-type wearable device according to an embodiment of the present invention.
3 is a flowchart of a method of controlling an indoor electronic device using a glass-like wearable device by front image analysis according to an embodiment of the present invention.
4 is a flowchart of a method of controlling an indoor electronic device using a glass-like wearable device by voice command recognition according to an embodiment of the present invention.
5 is a perspective view of a glass-type wearable device according to an embodiment of the present invention.
FIG. 6 is an exemplary diagram showing electronic devices and control commands recognized on a display unit of a glass-type wearable device according to an embodiment of the present invention.
7 is an internal configuration diagram of an indoor electronic device control system using a glass-type wearable device according to an embodiment of the present invention.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The advantages and features of the present invention, and the manner of achieving them, will become apparent from the embodiments described hereinafter in conjunction with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art, and the invention is defined only by the scope of the claims. Like reference numerals refer to like elements throughout the specification.

Unless defined otherwise, all terms (including technical and scientific terms) used herein have the meanings commonly understood by one of ordinary skill in the art to which this invention belongs. Also, terms that are commonly used and defined in dictionaries are not to be interpreted ideally or excessively unless explicitly defined otherwise.

The terminology used herein is for the purpose of describing embodiments and is not intended to limit the present invention. In this specification, the singular form includes the plural form unless otherwise specified. The terms "comprises" and/or "comprising" used in the specification do not exclude the presence or addition of one or more elements other than the stated elements.

FIG. 1 is an internal configuration diagram of a glass-type wearable device system according to an embodiment of the present invention. FIG. 2 is a flowchart of a method of controlling an indoor electronic device using a glass-type wearable device according to an embodiment of the present invention. FIG. 3 is a flowchart of a method of controlling an indoor electronic device using a glass-type wearable device by front image analysis according to an embodiment of the present invention. FIG. 4 is a flowchart of a method of controlling an indoor electronic device using a glass-type wearable device by voice command recognition according to an embodiment of the present invention. FIG. 5 is a perspective view of a glass-type wearable device according to an embodiment of the present invention. FIG. 6 is an exemplary diagram showing the recognized electronic devices and control commands on the display unit of the glass-type wearable device according to an embodiment of the present invention. FIG. 7 is an internal configuration diagram of an indoor electronic device control system using a glass-type wearable device according to an embodiment of the present invention.

FIGS. 1 to 7 show a glass-type wearable device system 100, a user input unit 110, an application 111, a keyboard 112, a voice input unit 113, a touch pad 114, a GPS signal unit 115, a short-range communication unit 116, a camera unit 120, a first camera 121, a second camera 122, a third camera 123, a sensing unit 130, a gyro sensor 131, an acceleration sensor 132, a pressure sensor 133, an iris recognition sensor 134, a heartbeat detection sensor 135, an electromyogram sensor 136, an information processing unit (control unit) 210, a voice recognition unit 220, a situation evaluation module 230, a voice-to-text conversion module 240, a wireless communication unit 250, a memory 260, an interface unit 270, an output unit 300, a display unit 310, a sound output unit 320, an alarm unit 330, a haptic module 340, and a wireless access point 400.

1 is an internal configuration diagram of a glass-type wearable device system according to an embodiment of the present invention.

Referring to FIG. 1, the glass-type wearable device system 100 may include all or part of a user input unit 110, an application 111, a keyboard 112, a voice input unit 113, a touch pad 114, a GPS signal unit 115, a short-range communication unit 116, a camera unit 120, a first camera 121, a second camera 122, a third camera 123, a sensing unit 130, a gyro sensor 131, an acceleration sensor 132, a pressure sensor 133, an iris recognition sensor 134, a heartbeat detection sensor 135, an electromyogram sensor 136, an information processing unit 210, a voice recognition unit 220, a situation evaluation module 230, a voice-to-text conversion module 240, a wireless communication unit 250, a memory 260, an interface unit 270, an output unit 300, a display unit 310, a sound output unit 320, an alarm unit 330, and a haptic module 340. The glass-type wearable device system 100 may further include other additional components.

The camera 120 is for inputting video or image signals and may be provided according to the configuration of the device. The camera 120 processes image frames such as still images or moving images obtained by the image sensor in the video communication mode or the photographing mode. The processed image frame can be displayed on the display unit 310, stored in the memory 260, or transmitted externally through the wireless communication unit 250. When an image or video signal is used as an input for information processing, it is transmitted to the control unit 210.

The camera unit 120 may include one or more cameras according to the direction or purpose of the image to be captured. The first camera 121 is provided at one side of the glass-type wearable device to capture a front image. The second camera 122 may be provided at one side of the glass-type wearable device to acquire an image or video in the eyeball direction. The third camera 123 is disposed at the rear or side of the glass-type wearable device and can acquire a rear or lateral image or video.

The voice input unit 113 is for inputting voice signals and may include a microphone and the like. The microphone receives an external acoustic signal in a communication mode, a recording mode, a voice recognition mode, and the like, and processes it into electrical voice data. In the call mode, the processed voice data can be converted into a form that can be transmitted to a mobile communication base station through the mobile communication module and output. Various noise-canceling algorithms may be used to remove the noise generated while the microphone receives the external acoustic signal.

The user input unit 110 generates key input data that the user inputs to control the operation of the device. The user input unit 110 may include a key pad, a keyboard, a dome switch, a touch pad (resistive/capacitive), a jog wheel, a jog switch, a finger mouse, and the like. In particular, when the touch pad forms a mutual layer structure with the display unit 310 described later, it can be called a touch screen.

The sensing unit 130 senses the current state of the device such as the open / close state of the device, the position of the device, the presence or absence of the user, and generates a sensing signal for controlling the operation of the device. In addition, the sensing unit 130 may function as an input unit for receiving an input signal for information processing of a device, and may perform various sensing functions such as recognition of connection to an external device.

The sensing unit 130 may include a proximity sensor, a pressure sensor 133, a motion sensor, a fingerprint recognition sensor, an iris recognition sensor 134, a heartbeat detection sensor 135, a skin temperature sensor, a skin resistance sensor, and the like.

The proximity sensor detects the presence of an approaching or nearby object without mechanical contact. The proximity sensor can detect a nearby object using a change in an alternating or static magnetic field, or the rate of change of capacitance. Two or more proximity sensors may be provided depending on the configuration.

The pressure sensor 133 can detect whether pressure is applied to the device, the magnitude of the pressure, and the like. The pressure sensor 133 may be installed in a part of the device where pressure detection is needed, depending on the use environment. When the pressure sensor 133 is installed on the display unit 310, a touch input through the display unit 310 and a pressure touch input, in which a greater pressure than an ordinary touch is applied, can be distinguished. In addition, the magnitude of the pressure applied to the display unit 310 during a pressure touch input can be determined from the signal output by the pressure sensor 133.

The motion sensor includes at least one of an acceleration sensor 132, a gyro sensor 131, and a geomagnetic sensor, and detects the position and movement of the device using the sensor. The acceleration sensor 132, which can be used for a motion sensor, is a device that converts an acceleration change in one direction into an electric signal and is widely used along with the development of MEMS (micro-electromechanical systems) technology. Further, the gyro sensor 131 is a sensor for measuring the angular velocity, and can sense the direction of rotation with respect to the reference direction.

The heartbeat detection sensor 135 measures the change in the optical blood flow according to the change in the thickness of the blood vessel caused by the heartbeat. The skin temperature sensor measures the skin temperature as the resistance value changes in response to the temperature change. The skin resistance sensor measures the skin's electrical resistance.

The iris recognition sensor 134 recognizes a person using the iris information of the eye, which has characteristics unique to each person. The human iris is fully formed around 18 months after birth, and the circular iris pattern raised near the inner edge of the iris remains almost unchanged once formed and differs from person to person. Iris recognition therefore applies information technology to security using these distinct iris characteristics: it is an authentication method developed to identify people by analyzing the shape and color of the iris and the morphology of the retinal capillaries.

The iris recognition sensor 134 encodes the iris pattern and converts it into a video signal for comparison and determination. The general operating principle is as follows. First, when the user's eye is aligned with the mirror located at the center of the iris recognizer at a certain distance, the infrared camera adjusts the focus through a zoom lens. After the camera captures the user's iris as an image, the iris recognition algorithm analyzes the iris pattern in the iris region to generate an iris code unique to the user. Finally, a comparison search is performed as soon as the iris code is registered in the database.

Distance sensors include two-point distance measurement, triangulation (infrared or natural light), and ultrasonic types. As in the conventional triangulation principle, light from the object to be measured along two paths is reflected by a rectangular prism onto two image sensors, and the distance is obtained when the relative positions match. There are a passive method using natural light and a method that emits infrared rays. The ultrasonic method transmits ultrasonic waves with sharp directivity toward the object and measures the time until the reflected wave returns from the object to find the distance; a piezoelectric element is used as the receiving sensor.

The Doppler radar is a radar that uses the Doppler effect of a wave, that is, the phase change of the reflected wave. Doppler radars include a continuous-wave radar, which transmits and receives a sinusoidal wave that is not pulse-modulated, and a pulse radar, which uses a pulse-modulated square wave as the electromagnetic signal waveform.

In the continuous-wave radar, the modulation frequency is relatively high in order to obtain good Doppler frequency filter performance, so it is not suitable for long-range use, but it has the feature that the motion of a human body or a vehicle can be reproduced as a stable sound by selecting the Doppler frequency in the audible band. The pulse radar measures the distance to the target from the time between pulse transmission and reception of the reflected echo. There is also a method referred to as pulse compression radar that performs frequency or phase modulation within the transmitted pulse width.

The output unit 300 is for outputting an audio signal, a video signal, or an alarm signal. The output unit 300 may include a display unit 310, an audio output module, an alarm unit 330, a haptic module 340, and the like.

The display unit 310 displays and outputs information processed in the device. For example, when the device is in the call mode, a UI (User Interface) or GUI (Graphic User Interface) associated with the call is displayed. When the device is in the video communication mode or the photographing mode, the captured or received image can be displayed individually or simultaneously, and the UI and the GUI are displayed.

Meanwhile, as described above, when the display unit 310 and the touch pad have a mutual layer structure to constitute a touch screen, the display unit 310 can be used as an input device in addition to the output device. If the display unit 310 is configured as a touch screen, it may include a touch screen panel, a touch screen panel controller, and the like.

In addition, the display unit 310 may include at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, and a three-dimensional (3D) display. There may be two or more display units 310 depending on the implementation of the device. For example, the device may include an external display unit 310 and an internal display unit 310 at the same time.

The display unit 310 may be implemented as a head-up display (HUD), a head-mounted display (HMD), or the like. An HMD (Head Mounted Display) is an image display device worn on the head like glasses that lets the user view large images. A HUD (Head Up Display) is an image display device that projects a virtual image onto glass within the user's field of view.

The audio output unit 320 outputs audio data received from the wireless communication unit or stored in the memory 260 in a call signal reception mode, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the sound output module 320 outputs sound signals related to functions performed in the device, for example, call signal reception tones, message reception tones, and the like. The sound output module 320 may include a speaker, a buzzer, and the like.

The alarm unit 330 outputs a signal notifying the occurrence of an event in the device. Examples of events occurring in the device include reception of a call signal, reception of a message, and input of a key signal. The alarm unit 330 outputs the notification signal in a form other than an audio or video signal, for example as vibration. The alarm unit 330 may output a signal when a call signal or a message is received. Also, when a key signal is input, the alarm unit 330 can output a signal as feedback to the key signal input. The user can recognize the occurrence of an event through the signal output by the alarm unit 330. A signal notifying the occurrence of an event may also be output through the display unit 310 or the sound output unit.

The haptic module 340 generates various tactile effects that the user can feel. A typical example of the haptic effect generated by the haptic module 340 is a vibration effect. When the haptic module 340 generates vibration with a haptic effect, the intensity and pattern of the vibration generated by the haptic module 340 can be converted, and the different vibrations may be synthesized and output or sequentially output.

The wireless communication unit 250 may include a broadcast receiving module, a mobile communication module, a wireless Internet module, a short distance communication module, and a GPS module.

The broadcast receiving module receives at least one of a broadcast signal and broadcast related information from an external broadcast management server through a broadcast channel. At this time, the broadcast channel may include a satellite channel, a terrestrial channel, and the like. The broadcast management server may refer to a server for generating and transmitting at least one of a broadcast signal and broadcast related information and a server for receiving at least one of the generated broadcast signal and broadcast related information and transmitting the broadcast signal to the terminal.

The broadcast-related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information can also be provided through a mobile communication network, in which case it can be received by the mobile communication module. Broadcast-related information can exist in various forms.

The broadcast receiving module receives a broadcast signal using various broadcast systems, and can receive a digital broadcast signal using a digital broadcast system. In addition, the broadcast receiving module may be configured to be suitable for all broadcasting systems that provide broadcast signals as well as the digital broadcasting system. The broadcast signal and / or broadcast related information received through the broadcast receiving module may be stored in the memory 260.

The mobile communication module transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. Here, the wireless signal may include various types of data according to a voice call signal, a video call signal, or a text / multimedia message transmission / reception.

The wireless Internet module refers to a module for wireless Internet access, and the wireless Internet module can be embedded in the device or attached externally. Wireless Internet technologies such as WLAN (Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), LTE (Long Term Evolution), LTE-A (Long Term Evolution-Advanced), and the like can be used.

The short-range communication module 116 is a module for short-range communication. Beacon, Bluetooth, Radio Frequency Identification (RFID), infrared data association (IrDA), Ultra Wideband (UWB), ZigBee and the like can be used as a short distance communication technology.

The GPS (Global Positioning System) module 115 receives position information from a plurality of GPS satellites.

The memory 260 may store a program for the processing and control of the control unit 210 and may temporarily store input or output data (e.g., messages, still images, moving images, etc.).

The memory 260 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (e.g., SD or XD memory), RAM, and ROM. The device may also operate a web storage that performs the storage function of the memory on the Internet.

The memory 260 may also be referred to as the storage unit 260 hereinafter.

The interface unit 270 serves as an interface to all external devices connected to the device. Examples of external devices connected to the device include a wired/wireless headset, an external charger, a wired/wireless data port, card sockets for a memory card, a Subscriber Identification Module (SIM) card or a User Identity Module (UIM) card, audio I/O (input/output) jacks, video I/O (input/output) jacks, and earphones. The interface unit 270 may receive data or power from such an external device and deliver it to the components inside the device, and may transmit data inside the device to the external device.

The control unit 210 typically controls the operation of each unit to control the overall operation of the device. For example, voice communication, data communication, video communication, and the like. In addition, the control unit 210 performs a function of processing data for multimedia reproduction. In addition, it performs a function of processing data input from the input unit or the sensing unit 130.

In addition, the control unit 210 performs face detection and face recognition. That is, the control unit 210 may include a face detection module and a face recognition module. The face detection module can extract only the face region from the camera image acquired by the camera unit 120. For example, the face detection module can extract the face region by recognizing feature elements of the face such as the eyes, nose, and mouth. The face recognition module extracts feature information from the extracted face region to generate a template, and can recognize the face by comparing the template with the face information data in a face database.

In addition, the control unit 210 may extract and recognize characters within an image or video acquired by the camera unit 120. That is, the control unit 210 may include a character recognition module. An optical character recognition (OCR) method can be applied as the character recognition method of the character recognition module. The OCR method converts the image of a document written or printed by a person, obtained by image scanning, into a form such as character codes that a computer can edit, and can be implemented in software. For example, in the OCR method, a plurality of standard pattern characters prepared in advance are compared with an input character, and the character most similar to a standard pattern character is selected. If the character recognition module includes standard pattern characters of various languages, printed characters of various languages can be read. Such a method is referred to as the pattern matching approach among OCR methods; however, the OCR method is not limited to it, and various methods can be applied. Furthermore, the character recognition method of the character recognition module is not limited to OCR, and various methods capable of recognizing already-printed offline characters can be applied.

In addition, the control unit 210 may recognize the eyeball gaze direction based on the eye-direction image or video acquired by the second camera 122. That is, the control unit 210 may include a gaze analysis module that performs gaze-direction recognition. The user's face direction and eyeball gaze direction are measured and then combined to determine the direction the user is actually looking at. The face direction refers to the direction the user's face is facing and can be measured by the gyro sensor 131 or the acceleration sensor 132 of the sensing unit 130. The eyeball gaze direction is the direction in which the user's pupil is looking and can be determined by the gaze analysis module. The gaze analysis module can detect the motion of the pupil through analysis of a real-time camera image and calculate the gaze direction based on the fixed reflection position on the cornea. For example, the center of the pupil and the position of the corneal reflection produced by the illumination can be extracted through image processing, and the gaze position can be calculated from the positional relationship between them.
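
For illustration, below is a minimal pupil-center/corneal-reflection (PCCR) sketch of the gaze-offset calculation described above. The pixel coordinates and the calibration constant are assumptions for the example; a real gaze analysis module would calibrate the scale per user.

```python
def estimate_gaze_offset(pupil_center, glint_center, scale_deg_per_px=0.05):
    """Estimate the eye's angular offset from straight ahead.

    PCCR sketch: the vector from the corneal glint (reflection of the
    illumination) to the pupil center is assumed to be roughly proportional
    to the angular gaze offset. `scale_deg_per_px` is a hypothetical
    per-user calibration constant.
    """
    dx = pupil_center[0] - glint_center[0]
    dy = pupil_center[1] - glint_center[1]
    yaw_offset = dx * scale_deg_per_px      # horizontal offset in degrees
    pitch_offset = -dy * scale_deg_per_px   # image y axis grows downward
    return yaw_offset, pitch_offset

# Example: pupil 12 px to the right of the glint -> ~0.6 degrees to the right
print(estimate_gaze_offset((320, 240), (308, 240)))
```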

The control unit 210 may be expressed as an information processing unit 210 hereinafter.

The power supply unit receives external power and internal power under the control of the controller 210, and supplies power necessary for operation of the respective components.

The speech recognition unit 220 recognizes verbally meaningful content from speech by automatic means. Specifically, a speech waveform is input, a word or word sequence is identified, and the meaning is extracted. The process is largely divided into speech analysis, phoneme recognition, word recognition, sentence analysis, and semantic extraction. The voice recognition unit 220 may further include a voice evaluation module that compares a stored voice with the input voice. The voice recognition unit 220 may also include a voice-to-text conversion module 240 that converts the input voice into text or converts text into voice.

The EEG signal generation unit generates an EEG-synchronizing signal having a frequency and waveform for synchronizing human brain waves. That is, it synchronizes the EEG by transmitting vibrations at the EEG frequency to the skull. An electroencephalogram (EEG) refers to the flow of electricity that occurs when cranial nerve signals are transmitted. Brain waves appear as very slow delta waves during sleep, as fast beta waves during activity, and as alpha waves of intermediate rate during meditation. The EEG signal generation unit can therefore induce alpha waves and theta waves, producing effects such as learning assistance and improved mental concentration.

Hereinafter, an indoor electronic device control system, a control method, and a control program using a glass-like wearable device according to embodiments of the present invention will be described with reference to the drawings.

2 is a flowchart of a method of controlling an indoor electronic device using a glass-type wearable device according to an embodiment of the present invention.

Referring to FIG. 2, a method of controlling an indoor electronic device using a glass-type wearable device according to an embodiment of the present invention includes: measuring the current indoor position of the glass-type wearable device (S100); measuring the direction of the user's gaze by recognizing the azimuth or elevation angle of the glass-type wearable device (S110); determining the electronic device located in the gaze direction at the measured indoor position (S120); receiving a control command for the electronic device from the user (S130); and transmitting the control command to the electronic device through wireless communication (S140). The indoor electronic device control method using the glass-type wearable device according to an embodiment of the present invention is described below in order.
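
For orientation, the flow S100 to S140 can be summarized as pseudocode. The sketch below uses a hypothetical stub object standing in for the wearable device's interfaces; none of the method names come from the patent itself.

```python
class GlassWearableStub:
    """Hypothetical stand-in for the glass-type wearable device; each method
    mirrors one step of the flow (S100-S140)."""
    def measure_indoor_position(self):              # S100: indoor positioning
        return (1.0, 1.0, 1.6)
    def read_orientation(self):                     # S110: azimuth / elevation of the face
        return (20.0, 15.0)
    def find_device_on_map(self, pos, az, el):      # S120: look up device on the indoor map
        return "living_room_light"
    def receive_control_command(self):              # S130: blink / head / touch / voice input
        return "power_off"
    def send_over_wireless(self, target, command):  # S140: relay via wireless communication
        print(f"sending {command!r} to {target!r}")

def control_indoor_device(device):
    position = device.measure_indoor_position()                       # S100
    azimuth, elevation = device.read_orientation()                    # S110
    target = device.find_device_on_map(position, azimuth, elevation)  # S120
    command = device.receive_control_command()                        # S130
    device.send_over_wireless(target, command)                        # S140

control_indoor_device(GlassWearableStub())
```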

First, the glass-type wearable device measures its current indoor position (S100). A variety of indoor positioning methods can be applied to the indoor position recognition of the glass-type wearable device; the method is not limited to those described below, and various methods can be applied.

A measurement method using an indoor wireless communication network such as Wi-Fi or beacons can be applied for indoor positioning. The glass-type wearable device 100 receives the wireless communication signals, recognizes the intensity, direction, and type of each signal, and can measure the indoor position based on them.
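
As an illustration of signal-strength-based indoor positioning, one of several possible approaches mentioned above, here is a minimal weighted-centroid sketch. The beacon coordinates, path-loss constants, and RSSI values are all hypothetical.

```python
# Hypothetical beacon map: beacon id -> known indoor (x, y) position in metres
BEACONS = {"b1": (0.0, 0.0), "b2": (5.0, 0.0), "b3": (0.0, 5.0)}

def rssi_to_distance(rssi_dbm, tx_power=-59, n=2.0):
    """Log-distance path-loss model: rough distance in metres from RSSI."""
    return 10 ** ((tx_power - rssi_dbm) / (10 * n))

def estimate_position(scans):
    """Weighted-centroid position estimate from {beacon_id: rssi} measurements."""
    weights, x_sum, y_sum = 0.0, 0.0, 0.0
    for beacon_id, rssi in scans.items():
        if beacon_id not in BEACONS:
            continue
        w = 1.0 / max(rssi_to_distance(rssi), 0.1)  # nearer beacons weigh more
        bx, by = BEACONS[beacon_id]
        x_sum += w * bx
        y_sum += w * by
        weights += w
    return (x_sum / weights, y_sum / weights) if weights else None

print(estimate_position({"b1": -55, "b2": -70, "b3": -75}))
```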

Also, as shown in FIG. 6, a method may be applied in which feature elements are extracted from the user's forward image or video acquired by the first camera 121, which is provided at one side of the glass-type wearable device 100 to capture the front view, and the position of the glass-type wearable device 100 is determined on an indoor map containing those feature elements.

Next, the glass-type wearable device recognizes the azimuth or elevation angle and measures the direction of the user's gaze (S110). The gaze direction refers to the direction in which the user's face is turned to look at a specific electronic device, and this face direction coincides with the direction the glass-type wearable device is facing. The glass-type wearable device 100 may measure the azimuth or elevation angle using the gyro sensor 131, the geomagnetic sensor, the acceleration sensor 132, or the like. The direction corresponding to the measured elevation and azimuth angles corresponds to the direction of the user's gaze. In step S120, the electronic device located in the gaze direction at the measured indoor position is determined. That is, the glass-type wearable device 100 recognizes the electronic device located in the gaze direction from the recognized current indoor position: the azimuth and elevation angles measured at the current position are used to determine the direction in which the electronic device to be controlled is located, and the electronic device lying in that direction is then identified on the stored indoor map.
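
A minimal sketch of step S120 under assumed data structures: the indoor map is represented as a hypothetical dictionary of device coordinates, and any device whose bearing from the measured position matches the measured azimuth and elevation within a tolerance is taken as a control-target candidate.

```python
import math

# Hypothetical indoor map: device name -> (x, y, z) position in metres
INDOOR_MAP = {
    "living_room_light": (2.0, 4.0, 2.4),
    "audio": (4.5, 1.0, 1.0),
    "tv": (0.5, 4.5, 1.2),
}

def devices_in_gaze(position, azimuth_deg, elevation_deg, tolerance_deg=10.0):
    """Return devices whose direction from `position` matches the measured
    azimuth/elevation within `tolerance_deg`."""
    px, py, pz = position
    matches = []
    for name, (mx, my, mz) in INDOOR_MAP.items():
        dx, dy, dz = mx - px, my - py, mz - pz
        az = math.degrees(math.atan2(dx, dy)) % 360          # bearing to device
        el = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # elevation to device
        az_diff = abs((az - azimuth_deg + 180) % 360 - 180)
        if az_diff <= tolerance_deg and abs(el - elevation_deg) <= tolerance_deg:
            matches.append(name)
    return matches

print(devices_in_gaze((1.0, 1.0, 1.6), azimuth_deg=20.0, elevation_deg=15.0))
# -> ['living_room_light']
```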

Next, a control command for the electronic device is received from the user (S130). The control command may be received by recognizing an eye-blink pattern captured by the second camera 122, by recognizing a head-movement pattern with the motion sensor, by a touch input on the touch pad 114, by the user's voice input, or by recognizing the user's hand gesture, among other methods.

The input control command is transmitted to the electronic device through wireless communication (S140). The glass-type wearable device can transmit the control command using a communication method that connects directly to the electronic device. For example, the glass-type wearable device can transmit the control command to the identified electronic device by Wi-Fi Direct or Bluetooth.

In addition, the glass-type wearable device 100 may be connected to an in-home wireless network such as Wi-Fi (WLAN) to transmit the control command to the electronic device. That is, the glass-type wearable device 100 connects to the wireless network to which one or more electronic devices are connected, transmits the control command to the wireless access point 400 over that network, and the wireless access point 400 relays the control command to the electronic device to be controlled.
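
A minimal sketch of relaying the command through the wireless access point 400. The JSON message format, host address, and port are assumptions made for the example; the patent only specifies that the command is sent over the in-home wireless network.

```python
import json
import socket

def send_control_command(target_device, command,
                         ap_host="192.168.0.1", ap_port=9000):
    """Send a control command to the wireless access point, which relays it
    to the selected electronic device (hypothetical protocol)."""
    message = json.dumps({"target": target_device, "command": command}).encode()
    with socket.create_connection((ap_host, ap_port), timeout=2.0) as conn:
        conn.sendall(message)
        # The access point may report back the control result; this call
        # raises socket.timeout if nothing is returned within 2 seconds.
        ack = conn.recv(1024)
    return ack.decode(errors="replace")
```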

In addition, the glass-type wearable device 100, when worn by the user, can be automatically connected to the wireless communication network when the user enters the home. That is, the glass-type wearable device 100 may actively detect the in-home wireless network upon entry and connect to it automatically, so that the user can immediately control electronic devices with the glass-type wearable device 100.

The method may further include acquiring the eyeball gaze direction by the glass-type wearable device, and the control target determination step (S120) may determine the control target by applying the eyeball gaze direction to the measured face direction. Since the user wearing the glass-type wearable device 100 does not look only straight ahead, the eyeball direction of the user needs to be taken into account. Therefore, eye tracking is performed with the second camera 122 to recognize the user's eyeball gaze direction, and the recognized eyeball direction is applied to the gaze direction given by the measured azimuth and elevation angles, so that the exact position of the electronic device the user is looking at can be recognized.

The method may further include, when one or more electronic devices are determined to be control target candidates, displaying the list of candidate electronic devices on the screen of the glass-type wearable device, and receiving from the user a selection of a specific electronic device from the list. A plurality of electronic devices may lie in the direction the user is looking, and they may be located close to each other even when the eye-tracking gaze direction is reflected, in which case the glass-type wearable device cannot tell which of the recognized electronic devices the user wants to control. In such a case, the plurality of candidate electronic devices must be presented to the user so that one can be selected. Therefore, the glass-type wearable device 100 can display the determined list of one or more electronic devices on the screen, and then receive the user's selection of a specific electronic device from the list to determine the electronic device to be controlled. For example, the glass-type wearable device can display the plurality of electronic devices on the screen together with numbers, and the user can select the specific electronic device to control by touch input, voice input, blink input, hand-gesture input, or the like.
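
A minimal sketch of this candidate-list disambiguation; the selection callback stands in for whichever input modality (touch, voice, blink, head movement) the user actually uses, and the device names are placeholders.

```python
def choose_target(candidates, read_user_choice):
    """If several devices fall in the gaze direction, list them with numbers
    and let the user pick one."""
    if len(candidates) == 1:
        return candidates[0]
    for idx, name in enumerate(candidates, start=1):
        print(f"{idx}. {name}")           # on the device this goes to the HUD display
    choice = read_user_choice(len(candidates))  # returns an index in 1..N
    return candidates[choice - 1]

# Example with a stubbed selection input that always picks the first entry
print(choose_target(["tv", "audio"], lambda n: 1))  # -> "tv"
```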

The method may further include receiving the control result of the electronic device according to the control command and notifying the user. That is, the glass-type wearable device 100 can receive the control result from the electronic device that received the control command and notify the user. The notification may be provided by displaying it on the screen through the display unit 310, by sound output, or the like.

The method may further include storing an input method or input pattern corresponding to a specific control command of the electronic device. The glass-type wearable device 100 may receive an input pattern such as a hand-gesture pattern, an eye-blink pattern, or a head-movement pattern, and receive the corresponding control command from the user, thereby setting the correspondence between the input pattern and the control command.

In addition, the user may set and store an input method for each control command. For example, when the user wishes to control the lighting, a blink pattern of closing the left eye may be stored as the command to turn the light off and a blink pattern of closing the right eye as the command to turn it on. Also, while audio is playing, the command to skip to the next track can be stored by assigning it to a head-movement direction. In this way, the user can configure the desired command types, and the command input methods and patterns can be set according to the user's own characteristics.
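
A minimal sketch of storing user-defined input patterns against control commands, as described above. The pattern strings and command names are hypothetical placeholders.

```python
# Hypothetical per-user mapping of input patterns to device control commands.
USER_COMMAND_MAP = {}

def register_command(device, pattern, command):
    """Associate an input pattern (e.g. a blink or head-movement pattern code)
    with a control command for a specific device."""
    USER_COMMAND_MAP[(device, pattern)] = command

def resolve_command(device, pattern):
    """Look up the control command the user assigned to this pattern."""
    return USER_COMMAND_MAP.get((device, pattern))

register_command("audio", "blink:left-eye-close", "power_off")
register_command("audio", "head:nod-right", "next_track")
print(resolve_command("audio", "head:nod-right"))  # -> "next_track"
```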

3 is a flowchart of a method of controlling an indoor electronic device using a glass-like wearable device by front image analysis according to an embodiment of the present invention.

Referring to FIG. 3, a method of controlling an indoor electronic device using a glass-type wearable device according to another embodiment of the present invention includes: acquiring an image corresponding to the user's gaze direction (S200); determining the electronic device in the image as the control target through image analysis (S210); receiving a control command for the electronic device from the user (S220); and transmitting the received control command to the electronic device via wireless communication (S230). The method is described below in order; detailed descriptions of the steps already described above are omitted.

First, an image corresponding to the user's gaze direction is acquired (S200). That is, the first camera 121 of the glass-type wearable device 100 acquires an image in the direction of the user's face (i.e., the gaze or front direction), as shown in FIG. 6.

Next, the electronic device in the image is determined as the control target through image analysis (S210). For example, the glass-type wearable device can recognize the electronic device located at the center of the image as the control target. In general, the electronic device the user wants to control is positioned at the center of the forward image acquired by the first camera 121, so the electronic device located at the center of the acquired image can be recognized as the control target. The control unit 210 may analyze the image to determine which electronic device is located at the center and select it as the control target.
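
A minimal sketch of picking the control target as the detected device closest to the image center. The detection boxes are assumed to come from some object detector, which is outside the scope of this sketch.

```python
def device_at_image_center(detections, image_size):
    """Return the label of the detected device whose bounding-box center lies
    closest to the image center. `detections` is a list of
    (label, (x, y, w, h)) boxes from a hypothetical detector."""
    cx, cy = image_size[0] / 2, image_size[1] / 2

    def distance_sq(det):
        _, (x, y, w, h) = det
        bx, by = x + w / 2, y + h / 2
        return (bx - cx) ** 2 + (by - cy) ** 2

    return min(detections, key=distance_sq)[0] if detections else None

dets = [("tv", (100, 80, 200, 150)), ("audio", (500, 300, 120, 90))]
print(device_at_image_center(dets, (640, 480)))  # -> "tv"
```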

In addition, when the glass-type wearable device captures an image including the user's hand gesture, it identifies the hand-gesture region of the user in the image and can determine the electronic device corresponding to that region as the control target. For example, the electronic device in the direction in which the user's finger points can be determined as the control target by the glass-type wearable device.

Further, for example, the glass-type wearable device 100 can determine the electronic device enclosed by a specific user gesture (for example, a circle formed with the fingers) as the control target. That is, the control unit 210 recognizes the user's specific hand gesture, extracts the image region enclosed by the recognized gesture, and recognizes the electronic device the user wants to control through analysis of the extracted region.
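
A minimal sketch of the gesture-based selection: given the bounding box of the recognized hand gesture, return the detected device whose center falls inside it. The gesture box and detections are hypothetical inputs; gesture recognition itself is not shown.

```python
def device_in_gesture_region(detections, gesture_box):
    """Return the detected device whose bounding-box center falls inside the
    region enclosed by the recognized hand gesture (e.g. a finger circle)."""
    gx, gy, gw, gh = gesture_box
    for label, (x, y, w, h) in detections:
        bx, by = x + w / 2, y + h / 2
        if gx <= bx <= gx + gw and gy <= by <= gy + gh:
            return label
    return None

dets = [("tv", (100, 80, 200, 150)), ("audio", (500, 300, 120, 90))]
print(device_in_gesture_region(dets, (480, 280, 200, 160)))  # -> "audio"
```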

A control command for the electronic device is then received from the user (S220).

The input control command is then transmitted to the electronic device through wireless communication (S230).

The method may further include acquiring the eyeball gaze direction by the glass-type wearable device.

In addition, the control target determination step (S210) may extract the electronic device in the area of the image corresponding to the eyeball gaze direction and determine it as the control target. That is, the glass-type wearable device 100 recognizes the user's eyeball gaze direction and can calculate the gaze point in the image corresponding to that direction. The glass-type wearable device 100 can then extract the electronic device located at the gaze point and determine it as the control target.

Since the user wearing the glass-type wearable device 100 does not look only straight ahead, the eyeball direction of the user needs to be taken into account. Accordingly, eye tracking is performed with the second camera 122 to recognize the user's eyeball direction, and the recognized eyeball direction can be applied to the measured azimuth and elevation angles to recognize the exact position of the electronic device.

The method may further include, when one or more electronic devices are determined to be control target candidates, displaying the list of candidate electronic devices on the screen and receiving from the user a selection of a specific electronic device from the list.

The method may further include receiving the control result of the electronic device according to the control command and informing the user of the control result.

The method may further include storing an input method or an input pattern corresponding to a specific control command of the electronic device.

4 is a flowchart of a method of controlling an indoor electronic device using a glass-like wearable device by voice command recognition according to an embodiment of the present invention.

Referring to FIG. 4, a method of controlling an indoor electronic device using a glass-type wearable device according to another embodiment of the present invention includes: receiving from the user a voice command containing a selection of the electronic device to be controlled and the corresponding control command (S300); analyzing the voice command to determine the electronic device to be controlled and the control command (S310); and transmitting the control command to the selected electronic device through wireless communication (S320). The method is described below in order; detailed descriptions of the steps already described above are omitted.

First, a voice command containing the selection of the electronic device to be controlled and the control command is received from the user (S300). The voice input unit 113 of the glass-type wearable device 100 receives the user's voice command, which includes the name of the control target and the control command.

Next, the voice command is analyzed to determine the electronic device to be controlled and the control command (S310). That is, the voice recognition unit 220 of the glass-type wearable device 100 performs speech recognition on the input voice command, identifies the electronic device corresponding to the control target in the command, and identifies the desired control command. For example, the voice input unit 113 receives the user's voice command "Turn off the living room light." The voice recognition unit 220 interprets the voice command, recognizes that the electronic device to be controlled is the living room light, and recognizes that the desired control command is "off". However, the manner in which the glass-type wearable device 100 recognizes the control target and the control command from the user's voice command is not limited to this, and various methods can be applied.
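
A minimal keyword-matching sketch of step S310. The device and command vocabularies are hypothetical, and a real implementation would rely on the voice recognition unit 220 rather than plain string matching over already-transcribed text.

```python
# Hypothetical vocabularies; the patent only states that the spoken sentence
# contains the target device name and the desired control command.
DEVICE_KEYWORDS = {"living room light": "living_room_light",
                   "audio": "audio", "tv": "tv"}
COMMAND_KEYWORDS = {"turn off": "power_off", "turn on": "power_on",
                    "off": "power_off", "on": "power_on"}

def parse_voice_command(text):
    """Find which device and which command the recognized sentence refers to."""
    text = text.lower()
    device = next((d for k, d in DEVICE_KEYWORDS.items() if k in text), None)
    command = next((c for k, c in COMMAND_KEYWORDS.items() if k in text), None)
    return device, command

print(parse_voice_command("Turn off the living room light"))
# -> ('living_room_light', 'power_off')
```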

The control command is then transmitted to the selected electronic device through wireless communication (S320).

The method may further include receiving the control result of the electronic device according to the control command and informing the user of the control result.

The method may further include storing an input method or an input pattern corresponding to a specific control command of the electronic device.

Also, as shown in FIG. 6, the display unit 310 may display the selected electronic device and the control command so that the user can confirm whether the selection command and the control command of the electronic device are properly input.

The control command is transmitted to the selected electronic device to control the electronic device (S400). The electronic device performs information processing on the received control command to perform control.

The method may further include receiving the control result of the electronic device according to the control command and notifying the user. To inform the user whether the control command was received by the electronic device and whether the desired operation was performed properly, the electronic device transmits the execution result of the control command to the glass-type wearable device 100, and the glass-type wearable device 100 processes the received result information and notifies the user. The notification may be given by visually displaying the control result on the display unit 310, by informing the user by voice through the sound output unit 320, or the like.

7 is an internal configuration diagram of an indoor electronic device control system using the glass-type wearable device 100 according to an embodiment of the present invention. In FIG. 7, detailed descriptions of the components described above are omitted.

Referring to FIG. 7, an indoor electronic device control system using a glass-type wearable device 100 according to another embodiment of the present invention includes a glass-type wearable device 100; And a wireless access point (400).

The glass-type wearable device 100 receives the selection command for the electronic device to be controlled and the control command, connects to the wireless access point, and transmits the input selection command and control command through wireless communication. To this end, the glass-type wearable device 100 includes a wireless communication unit 250; a control unit 210; and a user input unit 110.

The user input unit 110 performs a function of receiving an electronic device selection command and a control command to be controlled by the glass-type wearable device 100.

The control unit 210 determines the electronic device the user wants to control and the desired control command based on the selection command and control command entered through the user input unit 110. In addition, the control unit 210 performs the information processing needed to transmit the control command to the selected electronic device through the wireless communication unit 250.

The wireless communication unit 250 connects the glass-type wearable device 100 to the wireless access point and transmits the input selection command and control command of the electronic device through wireless communication. It may also receive indoor wireless communication signals to recognize the real-time indoor position of the glass-type wearable device 100.

The system may further include a first camera 121. The first camera 121 is provided at one side of the glass-type wearable device 100 and acquires a forward image or video, which is used to recognize the electronic device the user is watching. The control unit 210 recognizes the electronic device located at the center of the image or video input by the first camera 121, or the electronic device indicated by a specific hand gesture of the user. The first camera 121 may also acquire a forward image or video for real-time indoor positioning of the glass-type wearable device 100; the control unit 210 may extract feature elements from the acquired image or video and determine the indoor position based on the position, size, and the like of those feature elements.
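
A minimal sketch of the "device at the center of the forward image" idea follows, assuming an upstream object detector has already produced labelled bounding boxes; the detector itself and the box format are assumptions made only for illustration.

```python
def pick_center_device(detections, frame_width, frame_height):
    """Return the detected device whose bounding-box center is closest to the image center.

    detections: list of (label, x_min, y_min, x_max, y_max) in pixel coordinates.
    """
    cx, cy = frame_width / 2.0, frame_height / 2.0

    def distance_to_center(det):
        _, x0, y0, x1, y1 = det
        bx, by = (x0 + x1) / 2.0, (y0 + y1) / 2.0
        return (bx - cx) ** 2 + (by - cy) ** 2

    return min(detections, key=distance_to_center)[0] if detections else None

# The TV box straddles the image center, so it is selected over the lamp.
detections = [("tv", 500, 300, 780, 460), ("lamp", 40, 80, 120, 260)]
print(pick_center_device(detections, 1280, 720))  # -> tv
```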

The system may further include a motion sensor and a second camera 122. The motion sensor recognizes a movement pattern of the user's head, which can be used to input a selection command or a control command for the electronic device. The motion sensor also serves to recognize the direction the user is looking: the geomagnetic sensor, the gyro sensor 131, and the like measure the azimuth angle, and the acceleration sensor 132 measures the elevation angle, so that the direction in which the user is looking from the current indoor position can be recognized.
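
To make the geometry concrete: the azimuth and elevation define a viewing ray from the measured indoor position, and the mapped device closest to that ray can be taken as the control target. The sketch below is a simplified three-dimensional version with an illustrative angular threshold; none of the names or values come from the patent.

```python
import math

def gaze_vector(azimuth_deg, elevation_deg):
    """Unit vector for a gaze given azimuth (from north, clockwise) and elevation."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return (math.sin(az) * math.cos(el), math.cos(az) * math.cos(el), math.sin(el))

def find_gazed_device(position, azimuth_deg, elevation_deg, device_map, max_angle_deg=10.0):
    """Return the device from the indoor map whose direction best matches the gaze ray."""
    gx, gy, gz = gaze_vector(azimuth_deg, elevation_deg)
    best, best_angle = None, max_angle_deg
    for name, (dx, dy, dz) in device_map.items():
        vx, vy, vz = dx - position[0], dy - position[1], dz - position[2]
        norm = math.sqrt(vx * vx + vy * vy + vz * vz)
        if norm == 0:
            continue
        cos_angle = max(-1.0, min(1.0, (vx * gx + vy * gy + vz * gz) / norm))
        angle = math.degrees(math.acos(cos_angle))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

# Indoor map: device name -> (x, y, z) in metres; the user stands at the origin looking north.
device_map = {"tv": (0.0, 3.0, 1.0), "air_conditioner": (2.5, 0.0, 2.2)}
print(find_gazed_device((0.0, 0.0, 1.5), azimuth_deg=0.0, elevation_deg=-9.0,
                        device_map=device_map))  # -> tv
```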

The second camera 122 is provided at one side of the glass-type wearable device 100 and acquires an image or video in the eyeball direction. The second camera 122 recognizes the user's blinking pattern, which can be used to input a selection command or a control command for the electronic device. In addition, because the second camera 122 acquires an eyeball image or video, the glass-type wearable device 100 can perform eye tracking and thereby determine the user's gaze direction more accurately by taking the eyeball direction into account.
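
The blink-pattern input described here could, for instance, be reduced to classifying each detected blink by its duration and matching the resulting sequence against registered patterns. The 0.3-second boundary and the command table below are arbitrary illustrative choices, not values from the patent.

```python
def classify_blinks(blink_durations_s, long_threshold_s=0.3):
    """Turn a sequence of measured blink durations into 'short'/'long' symbols."""
    return ["long" if d >= long_threshold_s else "short" for d in blink_durations_s]

# A pattern table mapping blink sequences to device control commands (illustrative only).
BLINK_COMMANDS = {
    ("short", "short"): "power_toggle",
    ("short", "long"): "volume_down",
    ("long", "long"): "power_off",
}

symbols = classify_blinks([0.12, 0.45])      # one short blink, one long blink
print(BLINK_COMMANDS.get(tuple(symbols)))    # -> volume_down
```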

The wireless access point 400 receives the selection command and the control command of the electronic device from the glass-type wearable device 100 via wireless communication and transmits the control command to the selected electronic device.

The method of controlling an indoor electronic device using a glass-type wearable device according to an embodiment of the present invention may be implemented as a program (or application) to be executed in combination with the glass-type wearable device 100, which is hardware, and stored in a medium.

For the glass-type wearable device 100 to read the program and execute the methods implemented as the program, the above-described program may include code written in a computer language such as C, C++, JAVA, or machine language that the processor (CPU) of the glass-type wearable device 100 can read through its device interface. Such code may include function code defining the functions needed to execute the methods, and execution-procedure-related control code needed for the processor of the glass-type wearable device 100 to execute those functions in a predetermined order. The code may further include memory-reference code indicating at which location (address) of the internal or external memory of the glass-type wearable device 100 the additional information or media required for the processor to perform the functions should be referenced. When the processor of the glass-type wearable device 100 needs to communicate with a remote computer or server to execute the functions, the code may further include communication-related code specifying, through the communication module of the glass-type wearable device 100, how to communicate with the remote computer or server and what information or media should be transmitted or received during communication.

The storage medium does not mean a medium that stores data for a short time, such as a register, cache, or memory, but a medium that stores data semi-permanently and can be read by a device. Specific examples of the storage medium include, but are not limited to, ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage devices. That is, the program may be stored in various recording media on various servers that the glass-type wearable device 100 can access, or in various recording media on the user's glass-type wearable device 100. The medium may also be distributed over network-connected computer systems so that computer-readable code is stored in a distributed manner.

According to the present invention as described above, the following various effects are obtained.

First, according to the present invention, an electronic device in the home can be controlled from a remote location, eliminating the inconvenience of having to walk over to the electronic device to operate it directly. For example, the user no longer has to get out of bed and go to the light switch just to turn off the light.

Second, as long as each electronic device in the home can be connected to wireless communication, it can be controlled with the glass-type wearable device without a separate remote control for each device.

Third, the electronic device can be controlled by a simple action such as a blinking pattern or a head movement pattern, and the user can select the device to be controlled simply by looking at it. For example, a user who wants to turn off the audio playing while lying in bed can gaze at the audio system while wearing the glass-type wearable device and input the eye-blinking pattern corresponding to audio off.

100: system 110: user input
111: Application 112: Keyboard
113: voice input unit 114: touch pad
115: GPS signal unit 116: Local area communication
120: camera unit 121: first camera
122: second camera 123: third camera
130: sensing unit 131: gyro sensor
132: acceleration sensor 133: pressure sensor
134: iris recognition sensor 135: heart rate detection sensor
136: EMG sensor
210: control unit 220: voice recognition unit
230: situation evaluation module 240: voice-to-text conversion module
250: wireless communication unit 260: memory
270:
300: output unit 310: display unit
320: Acoustic output unit 330:
340: Haptic module
400: wireless access point

Claims (11)

A method of controlling an indoor electronic device using a glass-type wearable device, the method comprising:
a step in which the glass-type wearable device measures its current indoor position;
a step in which the glass-type wearable device measures a gaze direction of the user, wherein the gaze direction is the direction of the face of the user wearing the glass-type wearable device;
a control target determination step in which the glass-type wearable device determines, as a control target, the electronic device located in the gaze direction from the measured current indoor position by using an indoor map, wherein the indoor map includes information on the locations of the electronic devices in the indoor space;
a control command receiving step in which the glass-type wearable device receives a control command for the electronic device from the user; and
a step in which the glass-type wearable device transmits the control command to the electronic device via wireless communication.
The method according to claim 1,
further comprising a gaze direction acquiring step in which the glass-type wearable device acquires a gaze direction through a second camera, wherein the gaze direction is a direction in which the pupil of the user gazes and the second camera is a camera for acquiring an image in the direction in which the pupil of the user gazes,
wherein the control target determination step includes calculating the direction in which the control target is located by reflecting the acquired gaze direction in the measured gaze direction on the basis of the measured current indoor position.
delete
delete
delete
delete
The method according to claim 1,
wherein, when one or more electronic devices are determined as the control target, the method further comprises:
displaying, by the glass-type wearable device, a list of the one or more electronic devices determined as the control target on a screen; and
receiving, by the glass-type wearable device, the user's selection of a specific electronic device from the electronic devices in the list.
The method according to claim 1,
wherein, in the control command receiving step, the glass-type wearable device receives the control command through at least one of eye-blink recognition, eye movement pattern recognition, touch operation input, voice input recognition, and hand gesture recognition of the user.
The method according to claim 1,
further comprising the step of the glass-type wearable device receiving the control result according to the control command and providing the control result to the user.
The method according to claim 1,
further comprising:
a step in which the glass-type wearable device receives an input method or an input pattern from the user; and
a step in which the glass-type wearable device sets the received input method or input pattern as a specific control command for a specific electronic device according to a request of the user.
A program for controlling an indoor electronic device using a glass-type wearable device, the program being stored in a medium to execute the method according to any one of claims 1, 2, and 7 to 10 in combination with a glass-type wearable device that is hardware.
KR1020150042941A 2014-07-30 2015-03-27 Method and program for controlling electronic device by wearable glass device KR101728707B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2015/007914 WO2016018063A2 (en) 2014-07-30 2015-07-29 Information-processing system and method using wearable device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20140110608 2014-08-25
KR1020140110608 2014-08-25

Publications (2)

Publication Number Publication Date
KR20160024733A KR20160024733A (en) 2016-03-07
KR101728707B1 true KR101728707B1 (en) 2017-04-20

Family ID=55540183

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150042941A KR101728707B1 (en) 2014-07-30 2015-03-27 Method and program for controlling electronic device by wearable glass device

Country Status (1)

Country Link
KR (1) KR101728707B1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10437343B2 (en) * 2017-01-06 2019-10-08 Samsung Electronics Co., Ltd. Augmented reality control of internet of things devices
KR102411124B1 (en) * 2017-10-27 2022-06-21 삼성전자주식회사 Electronic device and method for performing task using external electronic device in electronic device
KR102190458B1 (en) * 2019-01-30 2020-12-11 재단법인대구경북과학기술원 Educational apparatus and method for experiencing brain machine interface technology
CN112987580B (en) * 2019-12-12 2022-10-11 华为技术有限公司 Equipment control method and device, server and storage medium

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005261728A (en) * 2004-03-19 2005-09-29 Fuji Xerox Co Ltd Line-of-sight direction recognition apparatus and line-of-sight direction recognition program
US20130069985A1 (en) * 2011-09-21 2013-03-21 Google Inc. Wearable Computer with Superimposed Controls and Instructions for External Device

Also Published As

Publication number Publication date
KR20160024733A (en) 2016-03-07

Similar Documents

Publication Publication Date Title
CN106471419B (en) Management information is shown
CN112507799A (en) Image identification method based on eye movement fixation point guidance, MR glasses and medium
KR102056221B1 (en) Method and apparatus For Connecting Devices Using Eye-tracking
EP2891954B1 (en) User-directed personal information assistant
KR101684264B1 (en) Method and program for the alarm of bus arriving by wearable glass device
US10571715B2 (en) Adaptive visual assistive device
KR101661555B1 (en) Method and program for restricting photography of built-in camera of wearable glass device
CN112181152A (en) Advertisement push management method, equipment and application based on MR glasses
JP2010061265A (en) Person retrieval and registration system
KR20160017593A (en) Method and program for notifying emergency exit by beacon and wearable glass device
KR101728707B1 (en) Method and program for controlling electronic device by wearable glass device
KR20160015142A (en) Method and program for emergency reporting by wearable glass device
KR20160026310A (en) System and method for curling coaching by wearable glass device
KR102251710B1 (en) System, method and computer readable medium for managing content of external device using wearable glass device
JP2020077271A (en) Display unit, learning device, and method for controlling display unit
KR20160011302A (en) System and method for digital image processing by wearable glass device
KR101629758B1 (en) Method and program with the unlock system of wearable glass device
KR20160024140A (en) System and method for identifying shop information by wearable glass device
KR20160025150A (en) System and method for social dating service by wearable glass device
KR20160023226A (en) System and method for exploring external terminal linked with wearable glass device by wearable glass device
KR20160016216A (en) System and method for real-time forward-looking by wearable glass device
KR20160016149A (en) System and method for preventing drowsiness by wearable glass device
EP3255927B1 (en) Method and device for accessing wi-fi network
KR101661556B1 (en) Method and program for confirmation of identity by wearable glass device
KR20160025203A (en) System and method for billiard coaching by wearable glass device

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E90F Notification of reason for final refusal
E701 Decision to grant or registration of patent right