WO2016018063A2 - Information processing system and method using a wearable device - Google Patents

Information processing system and method using a wearable device

Info

Publication number
WO2016018063A2
Authority
WO
WIPO (PCT)
Prior art keywords
wearable device
user
type wearable
information
glass
Prior art date
Application number
PCT/KR2015/007914
Other languages
English (en)
Korean (ko)
Other versions
WO2016018063A3 (fr)
Inventor
한성철
엄정한
김진영
이경현
김대중
김석기
유철현
김주천
김주원
Original Assignee
넥시스 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020150042547A external-priority patent/KR20160017593A/ko
Priority claimed from KR1020150042550A external-priority patent/KR20160015142A/ko
Priority claimed from KR1020150042943A external-priority patent/KR20160015143A/ko
Priority claimed from KR1020150042941A external-priority patent/KR101728707B1/ko
Application filed by 넥시스 주식회사
Publication of WO2016018063A2 publication Critical patent/WO2016018063A2/fr
Publication of WO2016018063A3 publication Critical patent/WO2016018063A3/fr

Definitions

  • The present invention relates to an information processing system and method using a wearable device, and more particularly, to a system and method for processing data acquired during the activity of a user wearing a wearable device.
  • The head mounted display device is mainly a safety-glasses or helmet type device that shows a screen in front of the eyes, and was developed to realize virtual reality.
  • Glass-type wearable devices generally have a small display, such as a liquid crystal display, installed at a position close to both eyes to project an image.
  • Wearable devices have been developed for use in space development, nuclear reactors, military institutions, and medical institutions, and various developments for games and the like are in progress.
  • A glass type wearable device according to the prior art is disclosed in US Pat. No. 8,427,396.
  • Although various types of glass wearable devices have been researched and released, methods or services that wearers can conveniently utilize in daily life are very limited, and a UI (User Interface) or UX (User Experience) suitable for glass wearable devices has hardly been developed. In addition, glass-type wearable devices that can perform most functions by themselves, as well as glass-type wearable devices used in conjunction with mobile terminals such as smartphones, have recently been developed, so services suited to the characteristics of the glass-type wearable device, which differ from those of the mobile terminal, are needed.
  • One object of the present invention is to provide services suitable for a glass-type wearable device that its user or wearer can utilize.
  • Another object of the present invention is to provide a system and method for providing an escape route or performing emergency contact with the outside using a glass type wearable device.
  • The present invention also provides a system and method for controlling an electronic device desired by a user through wireless communication, by selecting the electronic device to be controlled through a glass type wearable device and inputting a control command.
  • An information processing method using a wearable device includes: receiving event occurrence location information in a building; calculating an escape route by determining safe emergency exit location information based on the event occurrence location information; generating guide information along the calculated escape route; and providing the guide information to the user, wherein the event occurrence location information is location information at which a specific event requiring evacuation of the user in a specific building has occurred.
  • An information processing method using a wearable device includes: obtaining a voice input or an operation input of a user; determining an emergency situation by recognizing the voice input or operation input; and transmitting emergency notification information to an emergency contact counterpart via wireless communication.
  • An information processing method using a wearable device may include: measuring, by the glass type wearable device, a current indoor location; measuring a gaze direction of the user by recognizing an azimuth angle or elevation angle of the glass type wearable device; determining the electronic device located in the gaze direction from the measured current indoor location as a control object; receiving a control command for the electronic device from the user; and transmitting the input control command to the electronic device through wireless communication.
  • An information processing method using a wearable device includes: acquiring an image corresponding to a gaze direction of a user; determining the electronic device in the image as a control object through image analysis; receiving a control command for the electronic device from the user; and transmitting the input control command to the electronic device through wireless communication.
  • An information processing method using a wearable device includes: receiving, from a user, a voice command including an electronic device selection command and a control command; analyzing the voice command to determine the electronic device to be controlled and the control command; and transmitting the input control command to the selected electronic device through wireless communication.
  • An information processing method using a wearable device includes: receiving current location information; receiving real-time firing specification information from a shell direction recognition system; calculating an expected impact point based on the firing specification information and range information; and displaying the expected impact point on a map and providing it to the user.
  • The glass-type wearable device provides people with a safe path that avoids the incident location obtained through a wireless communication signal, thereby reducing casualties caused by accidents.
  • The route guidance is displayed on the display unit of the glass wearable device located in front of the user's eyes, so that the user can easily find the emergency exit or evacuation site.
  • A user in an emergency may perform emergency contact without additional manipulation.
  • Crimes or accidents can be mitigated by responding to emergency situations early.
  • An emergency contact can be made to a counterpart who can appropriately respond to the emergency situation that the user encounters.
  • Sixth, according to the present invention, electronic devices in the home can be controlled from a distance, eliminating the inconvenience of the user having to move in order to control a desired electronic device. For example, the inconvenience of having to walk to the light switch in order to turn off the light while lying in bed can be eliminated.
  • Since each electronic device in the home can be connected via wireless communication, there is the advantage that the devices can be controlled using the glass-type wearable device without the remote control of each individual device.
  • The electronic device can be controlled by a simple input such as a blinking pattern or a movement pattern, and the user can select the control-target electronic device simply by looking at the device he or she wants to control.
  • For example, if the user wants to turn off the audio system playing while lying in bed, the user may stare at the audio system while wearing the glass wearable device and input the blinking pattern corresponding to turning the audio off.
  • The expected impact point of the shell is calculated in real time and displayed to the user of the glass type wearable device, so that the shell can be fired exactly at the desired position in battle.
  • Since the shell can be fired exactly at the desired position, the killing power of the artillery is increased.
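  • For illustration only (the patent does not disclose a ballistic model), the expected-impact-point calculation could be sketched with a flat-ground, no-drag approximation: given the muzzle velocity, elevation, and azimuth from the firing specification information, the classic range formula yields the landing coordinates. All names and constants below are assumptions.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def expected_impact(x0, y0, v0, elevation_deg, azimuth_deg):
    """No-drag, flat-ground impact point for a shell fired from (x0, y0).
    Azimuth is measured clockwise from north (+y axis)."""
    el = math.radians(elevation_deg)
    az = math.radians(azimuth_deg)
    rng = v0 ** 2 * math.sin(2 * el) / G  # classic range formula
    return (x0 + rng * math.sin(az), y0 + rng * math.cos(az))

# 300 m/s at 45 degrees elevation, fired due east: maximum range ~9174 m.
print(expected_impact(0.0, 0.0, v0=300.0, elevation_deg=45.0, azimuth_deg=90.0))
```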
  • FIG. 1 is a block diagram of a glass type wearable device system according to an exemplary embodiment of the present invention.
  • FIG. 2 is an exemplary view of a glass type wearable device related to one embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a connection relationship between a glass type wearable device, an external server, and an external device according to an exemplary embodiment of the present invention.
  • FIG. 4 is a flowchart illustrating a method of providing an escape route using a glass type wearable device according to an embodiment of the present invention.
  • FIG. 5 is an exemplary view showing guide information on a display unit of a glass type wearable device according to an embodiment of the present invention.
  • FIG. 6 is an internal configuration diagram of a system for providing an escape route using a beacon and a glass type wearable device according to an embodiment of the present invention.
  • FIG. 7 is an internal configuration diagram of a system for providing an escape route using a control server, a beacon, and a glass type wearable device according to an embodiment of the present invention.
  • FIG. 8 is a flowchart illustrating an emergency contact method using a glass type wearable device according to an embodiment of the present invention.
  • FIG. 9 is a diagram illustrating a connection relationship between a glass type wearable device and an emergency contact counterpart according to an embodiment of the present invention.
  • FIG. 10 is a flowchart illustrating a method for controlling an indoor electronic device using a glass type wearable device by measuring indoor positioning and gaze direction according to an embodiment of the present invention.
  • FIG. 11 is a flowchart illustrating a method for controlling an indoor electronic device using a glass type wearable device by analyzing a front image according to an embodiment of the present disclosure.
  • FIG. 12 is a flowchart illustrating a method for controlling an indoor electronic device using a glass type wearable device by voice command recognition according to an embodiment of the present invention.
  • FIG. 13 is an exemplary view showing an electronic device and a control command recognized in a display unit of a glass type wearable device according to an embodiment of the present invention.
  • FIG. 14 is a block diagram of an indoor electronic device control system using a glass type wearable device according to an embodiment of the present invention.
  • FIG. 15 is an internal configuration diagram of a shell direction recognition system according to an embodiment of the present invention.
  • FIG. 16 is a flowchart illustrating a method for providing an expected impact point of a shell using a glass type wearable device according to an embodiment of the present invention.
  • FIG. 17 is a diagram illustrating a connection relationship between a shell direction recognition system and a glass type wearable device according to an exemplary embodiment of the present invention.
  • FIG. 18 is an exemplary view showing an expected impact point on a display unit of a glass type wearable device according to an embodiment of the present invention.
  • FIG. 1 is a block diagram of a glass type wearable device system 10 according to an embodiment of the present invention.
  • The glass type wearable device system 10 may include all or part of an input unit 100, a user input unit 110, a keyboard 111, a touch pad 112, a camera unit 120, a first camera 121, a second camera 122, a third camera 123, a sensing unit 130, a gyro sensor 131, an acceleration sensor 132, a pressure sensor 133, an iris recognition sensor 134, a heart rate detection sensor 135, an EMG sensor 136, a voice input unit 140, a control unit 210, a voice recognition unit 220, an interface unit 230, a voice-to-text conversion module 240, a wireless communication unit 250, a memory 260, an output unit 300, a display unit 310, a sound output unit 320, an alarm unit 330, and a haptic module 340.
  • The components shown in FIG. 1 are not essential, so a glass type wearable device with more or fewer components may be implemented.
  • FIG. 2 is an exemplary view of a glass type wearable device related to one embodiment of the present invention.
  • the components may be provided inside or on one side of the glass type wearable device as shown in FIG. 2.
  • the input unit 100 is for inputting an audio signal, a video signal, a user's manipulation signal, a biosignal, and the like.
  • the input unit 100 includes a user input unit 110, a camera unit 120, a sensing unit 130, and a voice input unit 140.
  • the user input unit 110 generates key input data input by the user for controlling the operation of the device.
  • the user input unit 110 may include a keypad, a keyboard 111, a dome switch, a touch pad (static pressure / capacitance) 112, a jog wheel, a jog switch, a finger mouse, and the like.
  • When the touch pad forms a mutual layer structure with the display unit 310, described later, it may be referred to as a touch screen.
  • the camera 120 is for inputting a video signal or an image signal, and two or more cameras 120 may be provided according to a configuration aspect of the device.
  • the camera 120 processes image frames such as still images or moving images obtained by an image sensor in a video call mode or a photographing mode.
  • the processed image frame may be displayed on the display 310.
  • the image frame processed by the camera 120 may be stored in the memory 260 or transmitted to the outside through the wireless communication unit 250.
  • The image signal and the video signal may also be transferred to the control unit 210.
  • the camera unit 120 may include one or more cameras according to the direction or purpose of the captured image.
  • the first camera 121 may be provided at one side of the glass type wearable device to capture an image of the front side.
  • the second camera 122 may be provided at one side of the glass type wearable device to acquire an image or an image in an eyeball direction.
  • the third camera 123 may be provided at the rear or side of the glass type wearable device 10 to acquire an image or an image of the rear or side.
  • The sensing unit 130 generates a sensing signal for controlling the operation of the device by detecting the current state of the device, such as whether the user is wearing the glass type wearable device 10 or the position of the device.
  • the sensing unit 130 may perform a function of an input unit receiving an input signal for information processing of the device, and may perform various sensing functions such as whether an external device is connected or not.
  • The sensing unit 130 may include various sensors, such as a proximity sensor, a pressure sensor 133, a motion sensor, a fingerprint recognition sensor, an iris recognition sensor 134, a heart rate detection sensor 135, a skin temperature sensor, a skin resistance sensor, and an electrocardiogram sensor.
  • the proximity sensor can detect the presence or absence of an approaching object or an object present in the vicinity without mechanical contact.
  • the proximity sensor can detect a proximity object by using a change in an alternating magnetic field or a change in a static magnetic field, or by using a change rate of capacitance.
  • Two or more proximity sensors may be provided according to the configuration aspect.
  • the pressure sensor 133 may detect whether pressure is applied to the device, the magnitude of the pressure, and the like.
  • The pressure sensor 133 may be installed at a portion of the device requiring the detection of pressure according to the use environment. If the pressure sensor 133 is installed in the display 310, a touch input through the display 310 and a pressure touch input, in which a greater pressure than the touch input is applied, can be distinguished according to the signal output from the pressure sensor 133. In addition, the magnitude of the pressure applied to the display 310 during a pressure touch input can be known from the signal output from the pressure sensor 133.
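  • A minimal sketch of how the touch / pressure-touch distinction just described could be made from the pressure sensor 133 output; the threshold values are illustrative assumptions, not from the disclosure.

```python
TOUCH_MIN = 0.2   # illustrative thresholds, normalized sensor units
PRESS_MIN = 0.6

def classify_input(pressure: float) -> str:
    """Distinguish an ordinary touch from a pressure-touch input
    based on the pressure sensor signal level."""
    if pressure >= PRESS_MIN:
        return "pressure touch"
    if pressure >= TOUCH_MIN:
        return "touch"
    return "none"

print(classify_input(0.75))  # 'pressure touch'
```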
  • the motion sensor includes one or more of sensors such as a gyro sensor 131, an acceleration sensor 132, and a geomagnetic sensor, and detects the position or movement of the device using the same.
  • the acceleration sensor 132 that can be used for a motion sensor is a device that converts an acceleration signal in one direction into an electrical signal, and is widely used with the development of micro-electromechanical systems (MEMS) technology.
  • The gyro sensor 131 is a sensor for measuring angular velocity, and may sense the direction rotated relative to a reference direction.
  • The heart rate detection sensor 135 measures the change in light transmitted through blood vessels as their thickness changes with the heartbeat.
  • the skin temperature sensor measures the skin temperature as the resistance value changes in response to the temperature change.
  • Skin resistance sensors measure the electrical resistance of the skin.
  • the iris recognition sensor 134 performs a function of recognizing a person using iris information of an eye having unique characteristics for each person.
  • The human iris is fully formed about 18 months after birth, and the circular iris pattern raised near its inner edge remains almost unchanged once determined. Iris recognition is therefore a security authentication technology that informatizes the characteristics of the iris, which differ from person to person. In other words, it is an authentication method developed as a means of identifying people by analyzing the shape and color of the iris and the morphology of the retinal capillaries.
  • The iris recognition sensor 134 encodes the iris pattern captured as an image signal and performs comparison and judgment against registered codes.
  • The general operation principle is as follows. First, when the user's eye is aligned with the mirror at the center of the iris recognizer at a certain distance, an infrared camera adjusts the focus through a zoom lens. Then the iris camera captures the user's iris as an image, and the iris recognition algorithm analyzes the iris contrast patterns by area to generate a unique iris code. Finally, as soon as the iris code is registered in the database, a comparison search is performed.
  • the iris recognition sensor 134 may be provided inside the second camera 122 disposed in the eye direction, and in this case, the second camera 122 may perform a function of the iris recognition sensor.
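  • The iris-code comparison described above amounts to a bitwise distance test between the enrolled code and the probe code. A toy sketch, with the match threshold an illustrative assumption:

```python
def hamming_distance(code_a: bytes, code_b: bytes) -> float:
    """Fraction of differing bits between two equal-length iris codes."""
    assert len(code_a) == len(code_b)
    diff = sum(bin(a ^ b).count("1") for a, b in zip(code_a, code_b))
    return diff / (8 * len(code_a))

def is_match(code_a: bytes, code_b: bytes, threshold: float = 0.32) -> bool:
    """Accept the identity if the codes differ in fewer than `threshold`
    of their bits (threshold is illustrative)."""
    return hamming_distance(code_a, code_b) < threshold

enrolled = bytes([0b10110010, 0b01100101])
probe    = bytes([0b10110011, 0b01100101])
print(is_match(enrolled, probe))  # True: only 1 of 16 bits differ
```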
  • The distance sensor may use a two-point distance measurement method, a triangulation method (infrared type, natural light type), or an ultrasonic method.
  • In the triangulation method, as in the conventional triangulation principle, light from the measured object arriving along two paths is reflected by a right-angle prism and made incident on two image sensors, and the distance between the two points is indicated when the relative positions match.
  • the ultrasonic method is a method in which the distance is measured by transmitting an ultrasonic wave having a sharp directivity to the object to be measured and receiving a reflected wave from the object to be measured.
  • the receiving sensor uses a piezoelectric element.
  • the Doppler radar is a radar that utilizes a phase change of the reflected wave, that is, a Doppler effect of the wave.
  • the doppler radar includes a continuous wave radar that transmits and receives a sine wave which is not pulse modulated, and a pulse radar that uses pulse modulated radio waves as an electromagnetic wave signal waveform.
  • Continuous wave radar is unsuitable for long-range radar because the modulating frequency must be relatively high in order to obtain good Doppler frequency filter performance.
  • the pulse radar measures the distance to the target by the time from pulse transmission to reflection echo reception.
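  • Both the ultrasonic method and the pulse radar reduce to the same time-of-flight arithmetic: distance = propagation speed × round-trip time / 2. A small sketch:

```python
SPEED_OF_SOUND = 343.0        # m/s in air at ~20 °C
SPEED_OF_LIGHT = 299_792_458  # m/s

def tof_distance(round_trip_s: float, speed: float) -> float:
    """Distance from emitter to target for a reflected pulse:
    the wave travels out and back, so divide by two."""
    return speed * round_trip_s / 2

print(tof_distance(0.01, SPEED_OF_SOUND))   # ultrasonic: 1.715 m
print(tof_distance(2e-6, SPEED_OF_LIGHT))   # pulse radar: ~299.8 m
```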
  • the voice input unit 140 is for inputting a voice signal and may include a microphone.
  • The microphone receives an external sound signal in a call mode, a recording mode, a voice recognition mode, etc., and processes it into electrical voice data.
  • the processed voice data may be converted into a form transmittable to the mobile communication base station through the wireless communication unit 250 and output in the call mode.
  • various noise canceling algorithms may be used to remove noise generated while receiving an external sound signal.
  • the output unit 300 is for outputting an audio signal, an image signal, a video signal or an alarm signal.
  • the output unit 300 may include a display unit 310, a sound output unit 320, an alarm unit 330, and a haptic module 340.
  • The display 310 displays and outputs information processed by the device. For example, when the device is in a call mode, it displays a user interface (UI) or a graphic user interface (GUI) related to the call. When the device is in a video call mode or a photographing mode, it may display captured or received images individually or simultaneously, together with the UI or GUI.
  • the display unit 310 may be used as an input device in addition to the output device. If the display unit 310 is configured as a touch screen, the display unit 310 may include a touch screen panel and a touch screen panel controller.
  • The display unit 310 may include at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode, a flexible display, and a three-dimensional (3D) display.
  • two or more display units 310 may exist according to the implementation form of the device. For example, the external display unit 310 and the internal display unit 310 may be simultaneously provided in the device.
  • the display unit 310 may be implemented as a head up display (HUD), a head mounted display (HMD), or the like.
  • a head up display (HUD) is an image display device for projecting a virtual image onto glass in a user's visible area.
  • the sound output unit 320 outputs audio data received from the wireless communication unit or stored in the memory 260 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like.
  • the sound output module 320 outputs a sound signal related to a function performed in the device, for example, a call signal reception sound and a message reception sound.
  • the sound output module 320 may include a speaker, a buzzer, and the like.
  • the alarm unit 330 outputs a signal for notifying occurrence of an event of the device. Examples of events occurring in the device include call signal reception, message reception, and key signal input.
  • the alarm unit 330 outputs a signal for notifying occurrence of an event in a form other than an audio signal or a video signal. For example, the signal may be output in the form of vibration.
  • The alarm unit 330 may output a signal to notify that a call signal or a message has been received.
  • Also, when a key signal is input, the alarm unit 330 may output a signal as feedback to the key signal input. The user may recognize the occurrence of an event through the signal output from the alarm unit 330.
  • the signal for notifying the event occurrence in the device may also be output through the display 310 or the sound output unit 320.
  • the haptic module 340 generates various haptic effects that a user can feel.
  • a representative example of the haptic effect generated by the haptic module 340 is a vibration effect.
  • When the haptic module 340 generates vibration as a haptic effect, the intensity and pattern of the vibration can be varied, and different vibrations may be combined and output together or output sequentially.
  • the wireless communication unit 250 may include a broadcast receiving module, a mobile communication module, a wireless internet module, a short range communication module, and a location information module.
  • the broadcast receiving module receives at least one of a broadcast signal and broadcast related information from an external broadcast management server through a broadcast channel.
  • the broadcast channel may include a satellite channel, a terrestrial channel, and the like.
  • the broadcast management server may mean a server that generates and transmits at least one of a broadcast signal and broadcast related information, or a server that receives at least one of the pre-generated broadcast signal and broadcast related information and transmits the same to a terminal.
  • the broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider.
  • the broadcast related information may also be provided through a mobile communication network, and in this case, may be received by the mobile communication module.
  • Broadcast related information may exist in various forms.
  • the broadcast receiving module may receive a broadcast signal using various broadcast systems, and receive a digital broadcast signal using a digital broadcast system.
  • the broadcast receiving module may be configured to be suitable for all broadcast systems providing broadcast signals as well as such digital broadcast systems.
  • the broadcast signal and / or broadcast related information received through the broadcast receiving module may be stored in the memory 260.
  • the mobile communication module transmits and receives a radio signal with at least one of a base station, an external terminal, and a server on a mobile communication network.
  • the wireless signal may include various types of data according to voice call signal, video call signal, or text / multimedia message transmission and reception.
  • the wireless internet module refers to a module for wireless internet access, and the wireless internet module may be embedded or external to the device.
  • Wireless Internet technologies such as Wireless LAN (Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), Long Term Evolution (LTE), and Long Term Evolution-Advanced (LTE-A) may be used.
  • the short range communication module refers to a module for short range communication.
  • Beacon, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, etc. may be used as a short range communication technology.
  • the location information module refers to a module that receives the positioning signal and measures the position of the glass type wearable device.
  • the position information module may correspond to a Global Position System (GPS) module, and the positioning signal may correspond to a GPS signal.
  • the Global Position System (GPS) module receives position information from a plurality of GPS satellites.
  • The memory 260 may store a program for the processing and control of the controller 210, and may perform a function of temporarily storing input or output data (e.g., a message, a still image, a video, etc.).
  • The memory 260 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), and RAM.
  • the device may operate a web storage that performs a storage function of a memory on the Internet.
  • the memory 260 may be referred to as a storage 260 hereinafter.
  • the interface unit 230 serves as an interface with all external devices connected to the device. Examples of external devices that connect to the device include wired / wireless headsets, external chargers, wired / wireless data ports, memory cards, card sockets such as Subscriber Identification Module (SIM) or User Identity Module (UIM) cards, Audio I / O (Input / Output) terminals, video I / O (Input / Output) terminals, earphones, and the like.
  • the interface unit 230 may receive data from such an external device or receive power and transfer the power to each component inside the device, and allow the data inside the device to be transmitted to the external device.
  • The controller 210 typically controls the operation of each unit and thus controls the overall operation of the device. For example, it performs related control and processing for voice calls, data communications, and the like. In addition, the controller 210 processes data for multimedia reproduction, as well as data received from the input unit or the sensing unit 130.
  • the controller 210 performs face detection and face recognition for face recognition. That is, the controller 210 may include a face detection module and a face recognition module for face recognition.
  • the face detection module may extract only the face area from the camera image acquired by the camera unit 120. For example, the face detection module extracts a face region by recognizing feature elements in the face such as eyes, nose, and mouth.
  • the face recognition module may generate a template by extracting feature information from the extracted face region, and recognize a face by comparing a template with face information data in a face database.
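  • A schematic sketch of this detect-then-recognize pipeline, using OpenCV's bundled Haar cascade for face detection and a simple nearest-template comparison for recognition. The actual modules of the device are not disclosed; this is one conventional realization.

```python
import cv2

# Face detection: Haar cascade shipped with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def extract_face(frame_bgr):
    """Return the first detected face region as a 64x64 grayscale template,
    or None if no face is found (stands in for the face detection module)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    return cv2.resize(gray[y:y + h, x:x + w], (64, 64))

def recognize(template, face_db):
    """Nearest stored template by normalized correlation (stands in for the
    face recognition module); face_db maps name -> 64x64 template."""
    def score(name):
        return cv2.matchTemplate(template, face_db[name],
                                 cv2.TM_CCOEFF_NORMED)[0][0]
    return max(face_db, key=score)

# Usage (with a camera frame and a pre-built template database):
#   name = recognize(extract_face(frame), face_db)
```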
  • the controller 210 may perform a function of extracting and recognizing a character from an image or an image acquired by the camera unit 120. That is, the controller 210 may include a character recognition module for character recognition.
  • Optical character recognition (OCR) may be applied as the character recognition method of the character recognition module.
  • The OCR method converts a typeface image of a document, obtained by image scanning, of text handwritten by a person or printed by a machine, into a format such as computer code that can be edited on a computer, and can be implemented in software.
  • the OCR method may compare several standard pattern letters and input characters prepared in advance and select the most similar to the standard pattern letters as the corresponding letters.
  • Since the character recognition module includes standard pattern letters of various languages, it can read printed characters in various languages. This method is called the pattern matching method among OCR methods; the OCR method is not limited thereto, and various other methods may be applied. Furthermore, the character recognition method of the character recognition module is not limited to OCR, and various methods of recognizing characters already printed offline may be applied.
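  • The pattern matching method described above can be sketched as comparing a binarized glyph against stored standard pattern letters and choosing the closest match. The 3x3 glyphs below are toy stand-ins for real standard patterns:

```python
import numpy as np

# Toy "standard pattern letters": 3x3 binary glyphs (illustrative only).
PATTERNS = {
    "I": np.array([[0, 1, 0], [0, 1, 0], [0, 1, 0]]),
    "L": np.array([[1, 0, 0], [1, 0, 0], [1, 1, 1]]),
    "T": np.array([[1, 1, 1], [0, 1, 0], [0, 1, 0]]),
}

def recognize_glyph(glyph):
    """Pick the standard pattern with the fewest differing pixels."""
    return min(PATTERNS, key=lambda ch: np.count_nonzero(PATTERNS[ch] != glyph))

scanned = np.array([[1, 1, 1], [0, 1, 0], [0, 1, 0]])
print(recognize_glyph(scanned))  # 'T'
```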
  • The controller 210 may perform a function of recognizing the gaze direction based on an eyeball image acquired by the second camera 122. That is, the controller 210 may include a gaze analysis module that performs gaze direction recognition. After measuring the user's facing direction and eye-gaze direction, the direction in which the user is actually looking can be determined by combining the two.
  • The facing direction refers to the direction of the user's face and may be measured by the gyro sensor 131 or the acceleration sensor 132 of the sensing unit 130.
  • The eye-gaze direction is the direction viewed by the user's pupil and may be grasped by the gaze analysis module.
  • The gaze analysis module may detect the movement of the pupil through real-time analysis of the camera image and calculate the direction of the gaze based on a fixed position reflected on the cornea. For example, through image processing, the center of the pupil and the position of the corneal reflection of the illumination may be extracted, and the gaze position may be calculated from their positional relationship.
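  • A toy sketch of the pupil-center / corneal-glint calculation just described: the gaze point is estimated from the vector between the two extracted positions. The gain and screen-center values stand in for a per-user calibration and are assumptions:

```python
import numpy as np

def gaze_point(pupil_px, glint_px, gain=(12.0, 9.0), screen_center=(640, 360)):
    """Map the pupil-center-to-corneal-glint vector (pixels in the eye
    camera image) to a point of regard on the display. `gain` and
    `screen_center` would come from a per-user calibration."""
    dx, dy = np.subtract(pupil_px, glint_px)
    return (screen_center[0] + gain[0] * dx,
            screen_center[1] + gain[1] * dy)

print(gaze_point(pupil_px=(98, 61), glint_px=(95, 60)))  # (676.0, 369.0)
```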
  • the power supply unit receives an external power source and an internal power source under the control of the controller 210 to supply power for operation of each component.
  • the speech recognition unit 220 performs a function of identifying linguistic meaning content from the speech by automatic means. Specifically, the process of identifying a word or word sequence and extracting meaning by inputting a speech waveform is classified into five categories: speech analysis, phoneme recognition, word recognition, sentence interpretation, and meaning extraction.
  • the voice recognition unit 220 may further include a voice evaluation module for comparing whether the stored voice and the input voice are the same.
  • the voice recognition unit 220 may further include a voice-to-text conversion module 240 for converting an input voice into text or converting a text into voice.
  • FIG. 3 is a diagram illustrating a connection relationship between the glass type wearable device 10, the external server 20, and the external device 30 according to an embodiment of the present invention.
  • the glass type wearable device 10 may perform all the processing for information processing therein, but the external server 20 may perform some of the information processing. Accordingly, the glass type wearable device 10 may transmit the data acquired through the input unit 100 or data on which some information processing is performed to the external server 20 as information processing request data. The glass type wearable device 10 may receive information processing result data performed by the external server 20 through wireless communication. The glass type wearable device 10 may provide the received information processing result data to the user through the output unit 300 in various ways. The external server 20 may be different according to a service performed by the glass type wearable device 10.
  • The glass type wearable device 10 may provide the information processing result data to the user through its own output unit 300, or may provide it using the external device 30. That is, when the glass type wearable device 10 performs the entire information processing process, the external device 30 may output the information processing result data received from the glass type wearable device 10. In addition, when the external server 20 receives information processing request data from the glass type wearable device 10 and performs part of the information processing, the external device 30 may output the information processing result data received from the external server 20.
  • The external device 30 may include various devices such as a smartphone, a tablet PC, a smart TV, and an output unit provided in a vehicle (for example, a display unit provided in the vehicle glass or an in-vehicle sound output unit).
  • The glass type wearable device 10 may receive a wireless communication signal transmitted from the external device 30 (for example, a beacon signal transmitted from a beacon tag, which is a wireless communication tag).
  • the glass type wearable device 10 may perform information processing using the received wireless communication signal.
  • FIG. 4 is a flowchart illustrating a method of providing an escape route using a glass type wearable device according to an exemplary embodiment of the present invention.
  • The method for providing an escape route using the glass type wearable device includes: receiving event occurrence location information in a building (S100); calculating an escape route by determining safe emergency exit location information based on the event occurrence location information (S110); generating guide information along the calculated escape route (S120); and providing the guide information to the user (S130).
  • First, the glass type wearable device receives event occurrence location information in a building (S100).
  • The event occurrence location information is location information at which a specific event requiring evacuation of the user has occurred in a specific building.
  • The event occurrence location may be identified through the operation of a sprinkler in each zone, the operation of a fire alarm, a power outage sensor, a heat detector, an emergency alarm after a security guard's recognition, or the failure of a surveillance camera or communication equipment. For example, if a sprinkler in a specific zone operates due to the smoke generated by a fire, that zone is determined to be the event occurrence place.
  • the glass type wearable device 10 receives event occurrence position information through wireless communication.
  • the glass type wearable device 10 may actively sense the beacon signal.
  • the beacon 40 may be attached to various locations of the building, in particular may be included in the emergency exit notification light.
  • the glass type wearable device 10 may receive a beacon signal including event occurrence location information at any position through beacons attached to various places inside a building.
  • the glass type wearable device 10 calculates an escape route by determining safe emergency exit location information on the basis of the event occurrence location information (S110).
  • The emergency exit location information is information on the locations of the emergency exits in the building; it may be stored in the glass type wearable device 10, or it may be included in the beacon signal together with the event occurrence location information and transmitted to the glass type wearable device 10.
  • The emergency exit may be a passage for evacuating to the outside or, if evacuation to the outside is not possible, a path to a place such as a roof where the user can safely evacuate and wait for rescue.
  • The control unit 210 of the glass type wearable device 10 determines an emergency exit to which the user can safely evacuate, based on the event occurrence location and the current location of the user. Thereafter, the glass type wearable device 10 calculates an escape route along which the user can safely evacuate to the identified emergency exit.
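  • A minimal sketch of this escape-route calculation, assuming the building is modeled as a graph of zones (all zone names hypothetical): breadth-first search finds the shortest path to an emergency exit while excluding zones flagged by the event occurrence location information.

```python
from collections import deque

def escape_route(graph, start, exits, blocked):
    """Shortest path from `start` to any zone in `exits`, avoiding
    `blocked` zones (e.g., where a sprinkler fired).
    `graph` maps each zone to its adjacent zones."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        zone = path[-1]
        if zone in exits:
            return path                       # safe route found
        for nxt in graph[zone]:
            if nxt not in visited and nxt not in blocked:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None                               # no safe exit reachable

# Example: corridor graph; fire detected in zone "B".
graph = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"],
         "D": ["B", "C", "EXIT1"], "EXIT1": ["D"]}
print(escape_route(graph, "A", {"EXIT1"}, blocked={"B"}))
# ['A', 'C', 'D', 'EXIT1']
```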
  • the glass type wearable device 10 generates guide information along the calculated escape route (S120).
  • the guide information may include moving route information, a moving route switching direction, a distance remaining until the moving route switching, and precautions when moving.
  • Based on the escape route, the control unit 210 of the glass type wearable device 10 generates guide information including the direction to the emergency exit, the distance to a branch point, and the like.
  • the guide information is provided to the user (S130).
  • As the method of informing the user of the guide information, a method of displaying the guide information on the display unit 310, a method of notifying the user of the guide information by sound output, or a method of notifying the user of the guide information through the intensity and direction of vibration may be applied.
  • the moving path switching direction and the distance remaining until the moving path switching may be displayed on the display unit 310 of the glass type wearable device 10.
  • Since the escape information is included in the guide information sequentially, after each change of the movement path the user may request the next guide information from the control unit 210 through the user input unit 110, confirm the guide information in order, and move along the escape route.
  • In the method of notifying the user by displaying the guide information on the display unit 310, the current moving direction may be measured through a motion sensor such as the gyro sensor 131, and a notification may be provided to the user by determining whether the user's moving direction corresponds to the escape route.
  • the method may further include receiving real-time event progress status information through wireless communication, wherein the guide information providing step (S130) may provide the user with the event occurrence location information or the event progress status information.
  • the event progress status information is information about a situation in which the event is being processed or a situation in which the event is in progress.
  • The event progress information may mean the progress of a fire (i.e., the range of the building to which the fire has spread), the degree to which the fire has been extinguished, or whether firefighters have been deployed.
  • The event progress status information may be analyzed by an external server based on at least one of the positions or number of sprinklers in operation, the locations of cameras or communication equipment in the building, or the time at which they failed, and may be received through wireless communication.
  • The glass type wearable device 10 displays the event occurrence information on a map shown on the display 310, together with the situation in which the event is being handled. If the event is being handled safely, the user can be aware of the handling situation, which can prevent secondary accidents caused by hasty evacuation.
  • The glass type wearable device 10 may further perform a step of measuring the current position through indoor positioning, and the escape route calculation step (S110) may calculate an appropriate emergency exit and escape route based on the indoor location information and the event occurrence location information.
  • As a method of recognizing the indoor location of the user, a method of measuring the current location of the user through communication between a wireless communication device such as the beacon 40 and the glass type wearable device 10 may be applied.
  • For example, the glass type wearable device 10 may receive three distinguishable beacon signals and survey its location based on the strengths of the received beacon signals and the positions of the beacons 40 that transmitted them.
  • The position measurement method using the beacon 40 and mutual communication is not limited thereto, and various methods, such as a method using the direction of the beacon signal, may be applied.
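  • One conventional way to realize the three-beacon positioning described above: convert each beacon's received signal strength (RSSI) to a distance with a log-distance path-loss model, then solve the trilateration equations by least squares. The tx_power and path-loss exponent values are illustrative assumptions:

```python
import numpy as np

def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    """Log-distance path-loss model: `tx_power` is the RSSI at 1 m."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def trilaterate(beacons, distances):
    """Least-squares position from >=3 known beacon positions (x, y)
    and the estimated distances to them."""
    (x0, y0), d0 = beacons[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(beacons[1:], distances[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos  # estimated (x, y)

beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]  # known beacon positions, m
rssi = [-65.0, -72.0, -70.0]                      # received signal strengths
print(trilaterate(beacons, [rssi_to_distance(r) for r in rssi]))
```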
  • Alternatively, the first camera 121 may be used to recognize a characteristic element, such as a characteristic terrain or a feature of the surroundings, and the current location of the user may be recognized by comparing it with indoor map information stored in the glass type wearable device 10 or received wirelessly from an external server.
  • the first camera 121 is a camera provided at one side of the glass type wearable device 10 to acquire an image or an image of the front side. Accordingly, the first camera 121 acquires the front image or the image, extracts the characteristic element in the image or the image, and grasps the characteristic element from the indoor map information to determine the current position.
  • the direction that the user is currently facing may be recognized through the feature element position in the front image or the image.
  • While moving, the user may be provided from the glass type wearable device 10 with appropriate guide information corresponding to the current location.
  • the method of determining the indoor location of the user is not limited thereto, and various methods of implementing indoor location recognition may be applied.
  • the glass type wearable device 10 may calculate an appropriate emergency exit and escape route based on the indoor location information and the event occurrence location information. Through this, the glass type wearable device 10 may guide an emergency exit and an escape route that best fits the location of the user, thereby assisting the user to safely escape.
  • the method may further include transmitting real-time location information of the user to an external device through wireless communication.
  • By transmitting the position of the user recognized by the glass type wearable device 10 to an external terminal through wireless communication, people outside can grasp the position of the user in the building, so that the user can be easily rescued.
  • The method may further include identifying, by the glass type wearable device 10, the building location information; and requesting and receiving a map of the inside of the building from an external server based on the building location information.
  • A building interior map is required. Accordingly, the glass type wearable device 10 may grasp the building location information through a GPS signal to identify the building whose internal map is to be received. Thereafter, the glass type wearable device 10 may request and receive the internal map of the building corresponding to the location information from an external server.
  • FIG. 6 is a block diagram of the emergency exit notification system using the beacon 40 and the glass-type wearable device 10 according to an embodiment of the present invention.
  • The emergency exit notification system using the beacon 40 and the glass type wearable device 10 may include a wireless communication unit 250; a control unit 210; and an output unit 300.
  • the wireless communication unit 250 performs a function of receiving event occurrence place information from the beacon 40. In addition, when the glass type wearable device 10 does not store the emergency exit location information, the wireless communication unit 250 may receive the emergency exit location information. In addition, the wireless communication unit 250 performs a function of exchanging information with an external terminal. That is, the wireless communication unit 250 may transmit the measured indoor location information of the user to an external terminal.
  • The controller 210 recognizes the event occurrence place and the emergency exit location information, calculates the escape route, and generates guide information according to the escape route. In addition, the controller 210 may perform a function of recognizing the current location of the user by a method such as mutual communication with the beacon 40 or recognition of characteristic elements in front by the first camera 121. For example, the controller 210 may measure the current location of the user by using one or more of the beacon signals received by the wireless communication unit 250.
  • the output unit 300 performs a function of informing the user of the guide information.
  • The output unit 300 may include a display unit 310 for visually displaying the guide information, a sound output unit 320 for notifying the user of the guide information by sound, and a vibration alarm unit for informing the user of the guide information through the strength and direction of vibration.
  • The emergency exit notification system using the beacon 40 and the glass type wearable device 10 may also include an external server 20; a beacon 40; and a glass type wearable device 10.
  • In FIG. 7, a detailed description of the previously described components will be omitted.
  • the external server 20 recognizes the occurrence place of the event, calculates the escape route, and performs the function of setting the movement route guide information to be transmitted for each beacon 40.
  • the beacon 40 performs a function of generating a beacon signal corresponding to the guide information set by the external server 20.
  • the glass type wearable device 10 receives the beacon signal and performs a function of informing the user of guide information.
  • The glass type wearable device 10 includes a wireless communication unit 250; a control unit 210; and an output unit 300.
  • the wireless communication unit 250 receives a beacon signal from the beacon 40 and performs a function of exchanging information with the outside.
  • the controller 210 performs a function of performing information processing according to an output form based on the received beacon signal.
  • the output unit 300 performs a function of informing the user of the guide information.
  • the external server 20 recognizes the occurrence place of the event, calculates the escape route, and sets the movement route guide information to be transmitted for each beacon 40.
  • the guide information may include a moving path switching direction, a distance remaining until the moving path switching, and precautions when moving.
  • The beacon 40 generates a beacon signal corresponding to the guide information set by the external server 20, and the wireless communication unit 250 of the glass type wearable device 10 receives the beacon signal. Thereafter, the control unit 210 processes the received beacon signal according to the output form, and the output unit 300 receives the processed information and notifies the user of the route guide information. As the user moves, beacon signals may be received to provide guide information appropriate to the user's location.
  • controller 210 may measure the current location of the user by using one or more of the beacon signals received by the wireless communication unit 250.
  • FIG. 8 is a flowchart illustrating an emergency contact method using a glass type wearable device according to an exemplary embodiment of the present invention.
  • The emergency contact method using a glass type wearable device may include: obtaining a voice input or an operation input of a user (S200); determining an emergency situation by recognizing the voice input or operation input (S210); and transmitting emergency notification information to an emergency contact counterpart via wireless communication (S220).
  • the glass type wearable device 10 receives a user's voice input or an operation input (S200).
  • the glass type wearable device 10 may receive a user voice through the voice input unit 140.
  • The voice input may be a scream or a designated emergency signal phrase. For example, in a crime situation, it may be a scream or a cry for help such as 'Help me!'.
  • The glass type wearable device 10 may also receive surrounding voices according to the user's settings. For example, if the user has a disease and falls down frequently, a question asked by a nearby person to check the user's condition may be received as the voice input.
  • the glass type wearable device 10 may acquire motion data of a user by a motion sensor.
  • The motion input may correspond to an unexpected movement of the user or to a stored specific movement pattern.
  • a sudden movement of the user may correspond to an abnormal movement pattern due to a writhing action when the user is kidnapped.
  • the glass type wearable device 10 determines the emergency situation by recognizing the voice input or the operation input (S210).
  • The voice recognition unit 220 grasps the linguistic meaning or tone of the voice input, and the controller 210 recognizes whether it corresponds to an emergency based on this.
  • The control unit 210 compares the user's motion information recognized by the motion sensor with the stored motion pattern information, and if it is recognized as pattern information corresponding to an emergency situation, it can be determined to be an emergency situation.
  • the controller 210 may determine an emergency situation.
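  • A toy sketch of this emergency decision step: a keyword check over the recognized speech and a threshold on accelerometer magnitude stand in for the voice recognition unit 220 and the motion-pattern comparison. The phrases and threshold are illustrative assumptions:

```python
EMERGENCY_PHRASES = {"help me", "help", "fire", "살려줘"}  # illustrative
FALL_THRESHOLD_G = 2.5                                     # illustrative

def is_emergency(recognized_text: str, accel_g: float) -> bool:
    """True if the recognized speech contains an emergency phrase or the
    acceleration magnitude suggests a fall or struggle."""
    text = recognized_text.lower()
    voice_hit = any(p in text for p in EMERGENCY_PHRASES)
    motion_hit = accel_g > FALL_THRESHOLD_G
    return voice_hit or motion_hit

print(is_emergency("Help me!", accel_g=1.0))  # True (voice)
print(is_emergency("hello", accel_g=3.1))     # True (motion)
```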
  • the glass type wearable device 10 transmits the emergency notification information to the emergency contact counterpart via wireless communication (S220).
  • The controller 210 commands the wireless communication unit 250 to perform the emergency contact. Accordingly, the wireless communication unit 250 performs the emergency contact to the outside.
  • the emergency contact may include a text message, a phone call or a push notification service to a predetermined contact.
  • the push notification service may be, for example, a push service for notifying a message to users who use the same program in a smartphone.
  • The predetermined contact may be a friend, a guardian, or the police, but is not limited thereto.
  • The contact information may be obtained from the memory of the glass type wearable device, or may be received from an external server through wireless communication.
  • the emergency contact method is not limited to the described method, and various methods that may be performed by the wireless communication unit 250 of the glass type wearable device may be applied.
  • the method may further include measuring a current location of the user, wherein the emergency contact performing step S220 may include the measured current location information in the emergency notification information.
  • the current location information of the user may be grasped by the GPS module of the glass type wearable device 10.
  • the method may further include determining an emergency classification corresponding to the voice input or the operation input; And selecting a specific designated agency or a specific acquaintance as the emergency contact counterpart according to the emergency classification.
  • the glass type wearable device 10 may classify and store data for quickly performing emergency contact in the storage unit 260.
  • The data stored in the storage unit 260 may include designated agencies, their contact points, and the voice input or operation input data corresponding to each emergency. Accordingly, the storage unit 260 may classify the user's voice inputs or operation inputs according to emergency situations, and store, for each emergency situation, the designated agency and its contact information.
  • The control unit 210 may grasp the emergency classification from the user's voice input or operation input, and select a designated agency or acquaintance to contact according to the emergency classification. Thereafter, the control unit 210 may transmit a command signal to the wireless communication unit 250 to perform the emergency contact to the contact point of the designated agency or acquaintance, and the wireless communication unit 250 may perform the emergency contact to that contact point.
  • the contact number is not limited to a phone number, and may include an address or the like for transmitting data by various wireless communication methods.
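  • The classification-to-counterpart selection described above might be stored as a simple lookup table, as sketched below (all agencies and contact addresses are hypothetical):

```python
CONTACTS = {  # emergency class -> (counterpart, address); hypothetical values
    "crime":  ("police", "tel:112"),
    "fire":   ("fire department", "tel:119"),
    "health": ("guardian", "tel:+82-10-0000-0000"),
}
DEFAULT = ("guardian", "tel:+82-10-0000-0000")

def select_counterpart(emergency_class: str):
    """Pick the emergency contact counterpart for a classified emergency."""
    return CONTACTS.get(emergency_class, DEFAULT)

print(select_counterpart("fire"))  # ('fire department', 'tel:119')
```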
  • the method may further include performing real-time video recording or real-time audio recording of the front side.
  • The emergency contact performing step (S220) may transmit the captured video or the recorded voice through wireless communication. That is, when performing the emergency contact, the glass type wearable device 10 may also transmit a real-time image captured by the first camera 121 or real-time voice obtained by the voice input unit through wireless communication.
  • Through this, the other party (e.g., a friend, guardian, firefighter, or police officer) who has received the emergency contact can recognize the user's emergency situation, so that the other party can take appropriate action according to the emergency.
  • In addition, the emergency contact performing step (S220) may include determining the emergency classification that occurred to the user by analyzing the real-time image. For example, when a real-time image of falling onto the floor is acquired, the glass type wearable device 10 may recognize that the user has fallen down and classify the emergency as one caused by a physical abnormality.
  • In addition, in the emergency notification information transmission step (S220), the glass type wearable device may transmit a signal that unspecified counterparts within a certain distance can actively recognize, so that they become aware of the user's emergency situation.
  • That is, the glass type wearable device 10 may load the emergency notification information onto a wireless communication signal capable of active sensing, such as a beacon signal, and broadcast it to the surroundings.
  • In addition, the method may further include outputting, by voice, an emergency response method corresponding to the user's emergency to the outside.
  • That is, the glass type wearable device 10 may output a first aid method corresponding to the user's emergency as voice, thereby notifying the surrounding people.
  • In addition, the method may further include receiving a biosignal of the user from an external wearable device or acquiring the biosignal by the glass type wearable device itself, and the emergency determination step (S210) may determine the user's emergency situation by reflecting the biosignal.
  • To accurately grasp the user's emergency situation, user biometric information such as heart rate and electrocardiogram is required. Therefore, the glass type wearable device 10 may receive a biosignal from another wearable device capable of measuring biosignals, or may directly measure the biosignal, and grasp the user's situation by reflecting the biosignal. For example, when a real-time image of suddenly dropping to the floor is obtained, the biosignal may be reflected to determine whether the user has merely tripped or has actually collapsed, and the user's emergency situation may be grasped accordingly.
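  • A minimal sketch of such a fusion rule is shown below; the heart-rate thresholds and class names are hypothetical illustrations of the idea rather than the claimed method.
```python
def classify_fall(image_says_fell, heart_rate_bpm, resting_bpm=70):
    """Hypothetical fusion rule: the real-time image suggests the user dropped
    to the floor; the biosignal decides between a stumble and a collapse."""
    if not image_says_fell:
        return "no_emergency"
    # A strongly abnormal heart rate alongside a detected fall suggests collapse.
    if heart_rate_bpm < 40 or heart_rate_bpm > 140:
        return "collapse_suspected"   # emergency due to a physical abnormality
    if abs(heart_rate_bpm - resting_bpm) < 20:
        return "stumble_suspected"    # likely a mere fall, not a medical event
    return "monitor_further"

print(classify_fall(True, 35))  # -> "collapse_suspected"
```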
  • FIG. 10 is a flowchart illustrating a method for controlling an indoor electronic device using a glass type wearable device by measuring an indoor position and a gaze direction according to an embodiment of the present invention.
  • the method for controlling an indoor electronic device using the glass type wearable device 10 may include: measuring, by the glass type wearable device, a current indoor position (S300); measuring a gaze direction of the user by recognizing the azimuth angle or the high and low angle of the glass type wearable device (S310); determining an electronic device located in the gaze direction from the measured current indoor location (S320); receiving a control command for the electronic device from the user (S330); and transmitting the input control command to the electronic device through wireless communication (S340).
  • the indoor electronic device control method using the glass type wearable device according to an embodiment of the present invention will be described in order.
  • the glass type wearable device 10 measures a current indoor location (S300).
  • Various indoor positioning methods may be applied to indoor location recognition of the glass type wearable device 10.
  • the indoor positioning method is not limited to the method described below, and various methods may be applied.
  • a method of measuring using an indoor wireless communication network such as Wi-Fi or beacon may be applied.
  • the glass type wearable device 10 may receive the wireless communication signal to recognize the strength, direction, type, and the like of the signal, and thereby measure an indoor location.
  • In addition, a method may be applied in which a feature element is extracted from the front image or video acquired by the first camera 121 provided on one side of the glass type wearable device 10, and the position of the glass type wearable device 10 on the indoor map is determined based on the extracted feature element.
  • the glass type wearable device 10 recognizes an azimuth angle or a high and low angle to measure a direction of attention of the user (S310).
  • the gaze direction refers to a direction that the face of the user faces to look at a specific electronic device, and the face direction corresponds to the direction that the glass wearable device is facing.
  • the glass type wearable device 10 may perform azimuth or high and low angle measurements using the gyro sensor 131, the geomagnetic sensor, and the acceleration sensor 132.
  • the direction corresponding to the measured high and low angle and azimuth angle corresponds to the user's gaze direction.
  • the glass type wearable device 10 determines an electronic device located in the gaze direction at the measured current indoor location (S320). That is, the glass type wearable device 10 recognizes an electronic device located in a direction viewed from the recognized current indoor location.
  • the glass type wearable device 10 may determine the direction in which the electronic device to be controlled is located by applying the azimuth and the high and low angle measured with respect to the recognized current position, and may then identify the electronic device located in that direction on the stored indoor map.
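  • As a non-limiting illustration, the following Python sketch casts a ray from the measured indoor position along the measured azimuth and high and low angles and returns the mapped device nearest that ray; the device map, the coordinate convention (x east, y north, z up), and the angular tolerance are hypothetical.
```python
import math

# Hypothetical indoor map: device name -> (x, y, z) position in meters.
DEVICE_MAP = {
    "living_room_lamp": (4.0, 3.0, 2.0),
    "audio": (1.0, 5.0, 1.0),
    "tv": (4.0, 0.5, 1.2),
}

def device_in_gaze(position, azimuth_deg, elevation_deg, max_angle_deg=10.0):
    """Return the mapped device closest to the ray cast from `position`
    along the measured azimuth (clockwise from north) and elevation."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    gaze = (math.cos(el) * math.sin(az),   # east component
            math.cos(el) * math.cos(az),   # north component
            math.sin(el))                  # up component
    best, best_angle = None, max_angle_deg
    for name, dev in DEVICE_MAP.items():
        vec = tuple(d - p for d, p in zip(dev, position))
        norm = math.sqrt(sum(c * c for c in vec))
        if norm == 0.0:
            continue
        cos_a = sum(g * c for g, c in zip(gaze, vec)) / norm
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        if angle < best_angle:             # within tolerance and closest so far
            best, best_angle = name, angle
    return best

print(device_in_gaze((4.0, 0.0, 1.5), 0.0, 10.0))  # -> "living_room_lamp"
```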
  • Thereafter, the glass type wearable device 10 receives a control command for the electronic device from the user (S330). Various methods may be used to receive the control command: recognizing a blink of the eye acquired by the second camera 122, recognizing a motion pattern with the motion sensor, a touch operation on the touch unit 112, a voice input of the user, or recognizing the user's hand gesture with the first camera 121.
  • the glass type wearable device 10 transmits the input control command to the electronic device through wireless communication (S340).
  • the glass type wearable device 10 may transmit a control command by a communication method directly connecting to the electronic device.
  • the glass type wearable device 10 may transmit a control command to an electronic device identified by a Wi-Fi Direct method or a Bluetooth method.
  • the glass type wearable device 10 may be connected to a home wireless communication network such as Wi-Fi (WLAN) and transmit a control command to an electronic device. That is, the glass type wearable device 10 is also connected to a wireless communication network to which at least one electronic device is connected, and the glass type wearable device 10 issues a control command to a wireless access point 50 using the wireless communication network. In addition, the wireless access point 50 may transmit the control command to the electronic device to be controlled.
  • In this case, the glass type wearable device 10 may be automatically connected to the wireless communication network. That is, the wireless communication network is actively sensed so that the device connects automatically, and the user can control the electronic devices with the glass type wearable device 10 immediately upon entering the home.
  • In addition, the method may further include obtaining, by the glass type wearable device, an eyeball gaze direction, and the control target determining step (S320) may determine as the control target the electronic device located in the direction obtained by correcting the measured gaze direction with the eyeball gaze direction, based on the measured current indoor position. Since the user wears the glass type wearable device 10 and does not look only straight ahead, the eyeball gaze direction of the user needs to be considered. Accordingly, eye tracking is performed by the second camera 122 to recognize the user's eyeball gaze direction, and the exact location of the electronic device can be recognized by applying the recognized eyeball gaze direction to the gaze direction according to the measured azimuth and high and low angles.
  • The method may further include: when one or more electronic devices are determined as control targets, displaying, by the glass type wearable device, the list of the one or more determined electronic devices; and selecting, by the user, a specific electronic device from the electronic devices in the list. Since a plurality of electronic devices may be located in the direction that the user wearing the glass type wearable device 10 watches, and the plurality of electronic devices may be located close to each other even when the eyeball gaze direction is reflected, the glass type wearable device may be unable to determine which electronic device the user wants to control among the recognized plurality of electronic devices. In this case, it is necessary to present the plurality of candidate electronic devices to the user so that the user can select one.
  • the glass type wearable device 10 may display the determined one or more electronic device list on the screen and provide the same. Thereafter, the glass type wearable device 10 may select a specific electronic device from the electronic devices in the list and determine the electronic device to be controlled by the user. For example, the glass type wearable device may display a list of a plurality of electronic devices together with a number on the screen, and a touch input, a voice input, a blinking eye input, a hand gesture input, and a motion input corresponding to a specific number from a user. And the like to select a specific electronic device to be controlled.
  • the method may further include providing a notification to a user by receiving the electronic device control result according to the control command. That is, the glass type wearable device 10 may receive a wireless communication result of the control from the electronic device that receives the control command, and may notify the user.
  • the manner of providing a notification to the user may include a method of displaying on the screen by the display unit 310, a method of notifying by a voice output, and the like.
  • the method may further include storing an input method or an input pattern corresponding to a specific control command of the electronic device.
  • the glass type wearable device 10 may receive an input pattern such as a hand gesture pattern, an eye blink pattern, or a moving pattern from a user, select a control command corresponding thereto, and set a corresponding relationship between the input pattern and the control command.
  • That is, the user may set and store a different input method for each control command. For example, when the user wants to control the audio, a command to turn the audio off may be set to a blinking pattern of closing the left eye, and a command to turn the audio on may be set to a blinking pattern of closing the right eye.
  • In addition, a command to skip to the next song during audio reproduction can be set to a specific head-movement pattern and stored. Through this, the user can set commands in the form he or she wants, and the command input method and pattern can be set according to the characteristics of the user.
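  • A minimal Python sketch of such user-defined bindings is shown below; the pattern encoding and the command names are hypothetical and stand in for whatever representation the memory actually stores.
```python
# Hypothetical user-defined bindings between input patterns and control
# commands, as the memory would hold them after the setup step above.
user_bindings = {
    ("blink", "left_eye"): ("audio", "power_off"),
    ("blink", "right_eye"): ("audio", "power_on"),
    ("head_motion", "tilt_right"): ("audio", "next_track"),
}

def bind(pattern, command):
    """Store a new correspondence between an input pattern and a command."""
    user_bindings[pattern] = command

def resolve(pattern):
    """Look up the control command the user assigned to this input pattern."""
    return user_bindings.get(pattern)

# The user re-binds a double blink of the right eye to 'lights on'.
bind(("blink", "double_right"), ("lighting", "power_on"))
print(resolve(("blink", "left_eye")))  # -> ('audio', 'power_off')
```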
  • FIG. 11 is a flowchart illustrating a method for controlling an indoor electronic device using a glass type wearable device by analyzing a front image according to an embodiment of the present invention.
  • a method of controlling an indoor electronic device using a glass type wearable device may include: obtaining an image corresponding to a gaze direction of a user (S400); determining the electronic device in the image as a control target through image analysis (S410); receiving a control command for the electronic device from the user (S420); and transmitting the input control command to the electronic device through wireless communication (S430).
  • the indoor electronic device control method using the glass type wearable device according to an embodiment of the present invention will be described in order. Hereinafter, detailed description of the above-described steps will be omitted.
  • the glass type wearable device 10 obtains an image corresponding to the direction of attention of the user (S400).
  • That is, the first camera 121 of the glass type wearable device 10 acquires an image in the direction that the user's face is facing (i.e., the gaze direction or forward direction).
  • Next, the glass type wearable device 10 determines the electronic device in the image as a control target through image analysis (S410). For example, the glass type wearable device may recognize the electronic device located in the center of the image as the control target. In general, since the electronic device that the user wants to control will be located in the center of the front image acquired by the first camera 121, the electronic device located in the center of the acquired image can be recognized as the control target.
  • the controller 210 may analyze the image of the electronic device to determine which electronic device is located at the center of the image, and select the electronic device as a control target.
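  • As an illustration under the assumption that an object detector supplies labeled bounding boxes, the following Python sketch picks the device whose box center lies nearest the image center; the detector output format is hypothetical.
```python
def pick_center_device(detections, image_width, image_height):
    """Given hypothetical detector output [(name, (x1, y1, x2, y2)), ...],
    return the device whose bounding-box center lies nearest the image center."""
    cx, cy = image_width / 2.0, image_height / 2.0
    best, best_dist = None, float("inf")
    for name, (x1, y1, x2, y2) in detections:
        bx, by = (x1 + x2) / 2.0, (y1 + y2) / 2.0
        dist = (bx - cx) ** 2 + (by - cy) ** 2   # squared distance is enough
        if dist < best_dist:
            best, best_dist = name, dist
    return best

boxes = [("tv", (100, 200, 500, 450)), ("audio", (900, 300, 1100, 500))]
print(pick_center_device(boxes, 1280, 720))  # -> "tv"
```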
  • In addition, when the glass type wearable device 10 captures an image including the user's hand gesture, the glass type wearable device 10 may identify the hand gesture area included in the image, extract the electronic device corresponding to the image region indicated by the hand gesture, and determine it as the control target.
  • the glass type wearable device may determine the electronic device in the direction indicated by the user's finger as the control target.
  • the glass type wearable device 10 may determine an electronic device included in a specific gesture of the user (for example, a circular gesture made by collecting a finger) as a control target. That is, the controller 210 may recognize a specific hand gesture of the user and extract an electronic device image part included in the recognized hand gesture.
  • the image of the extracted electronic device may be analyzed to recognize the electronic device to be controlled.
  • the glass type wearable device 10 receives a control command of the electronic device from the user (S420).
  • the glass type wearable device 10 transmits the input control command to the electronic device through wireless communication (S430).
  • the method may further include acquiring, by the glass type wearable device, an eyeball direction.
  • In this case, the electronic device in the region corresponding to the eyeball gaze direction may be extracted from the image and determined as the control target. That is, the glass type wearable device 10 may recognize the eyeball gaze direction of the user and calculate the gaze point corresponding to that direction in the image. Thereafter, the glass type wearable device 10 may extract the electronic device located at the gaze point and determine it as the control target. Since the user wears the glass type wearable device 10 and does not look only straight ahead, the eyeball gaze direction of the user needs to be considered. Accordingly, eye tracking by the second camera 122 may be performed to recognize the eyeball gaze direction of the user, and the correct gaze point in the image may be recognized by applying the recognized eyeball gaze direction.
  • the method may further include: when the one or more electronic devices are determined to be controlled, displaying the list of one or more electronic devices determined to be controlled by the glass type wearable device; And selecting a specific electronic device from the electronic devices in the list by the user.
  • the method may further include receiving an electronic device control result according to the control command and notifying the user.
  • the method may further include storing an input method or an input pattern corresponding to a specific control command of the electronic device.
  • FIG. 12 is a flowchart illustrating a method for controlling an indoor electronic device using a glass type wearable device by voice command recognition according to an embodiment of the present invention.
  • Referring to FIG. 12, the method may include: receiving, from the user, a voice command including an electronic device selection command and a control command (S500); determining the electronic device to be controlled and the control command by analyzing the voice command (S510); and transmitting the control command to the selected electronic device through wireless communication (S520).
  • the indoor electronic device control method using the glass type wearable device according to an embodiment of the present invention will be described in order. Hereinafter, detailed description of the above-described steps will be omitted.
  • the glass type wearable device 10 receives an electronic device selection command and a voice command corresponding to the control command from the user (S500).
  • the voice input unit 140 of the glass type wearable device 10 receives a voice command of a user including the name of the control target and the control command.
  • Next, the glass type wearable device 10 analyzes the voice command to determine the electronic device to be controlled and the control command (S510). That is, the voice recognition unit 220 of the glass type wearable device 10 performs voice recognition on the input voice command, identifies the electronic device corresponding to the control target in the voice command, and grasps the control command to be transmitted to that electronic device.
  • the voice input unit 140 receives a user's voice command “turn off the living room lamp.”
  • the voice recognition unit 220 interprets the voice command to recognize that the electronic device corresponding to the control object is a living room lamp, and recognizes that the control command desired by the user is off.
  • the manner in which the glass type wearable device 10 receives the user's voice command and recognizes the control target and the control command is not limited thereto, and various methods may be applied.
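  • One such method can be sketched as a simple keyword match over the recognized text, as below; the device and command vocabularies are hypothetical, and a deployed voice recognition unit 220 would use a full speech and language-understanding engine.
```python
# Hypothetical vocabularies for illustration only.
KNOWN_DEVICES = {"living room lamp", "audio", "tv"}
KNOWN_COMMANDS = {"turn off": "power_off", "turn on": "power_on"}

def parse_voice_command(text):
    """Split a recognized utterance into (control target, control command)."""
    text = text.lower()
    target = next((d for d in KNOWN_DEVICES if d in text), None)
    command = next((c for phrase, c in KNOWN_COMMANDS.items() if phrase in text), None)
    return target, command

# -> ('living room lamp', 'power_off')
print(parse_voice_command("Turn off the living room lamp"))
```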
  • the glass type wearable device 10 transmits the input control command to the selected electronic device through wireless communication (S520).
  • the method may further include receiving a control result of the electronic device according to the control command and notifying the user.
  • the method may further include storing an input method or an input pattern corresponding to a specific control command of the electronic device.
  • the display unit 310 may display the selected electronic device and a control command so that the user may check whether the selection command and the control command of the electronic device are properly input.
  • In the step of receiving the electronic device control result and notifying the user, the electronic device transmits the execution result according to the control command to the glass type wearable device 10, and the glass type wearable device 10 processes the received result information and notifies the user.
  • the notification method may include a method of visually displaying the control result information on the display 310, a method of notifying the user by voice through the sound output unit 320, and the like.
  • FIG. 14 is a block diagram of an indoor electronic device control system using the glass type wearable device 10 according to an exemplary embodiment of the present invention. Hereinafter, detailed description of the previously described configurations will be omitted.
  • An indoor electronic device control system using the glass type wearable device 10 may include: a glass type wearable device 10; and a wireless access point 50.
  • the glass type wearable device 10 performs the functions of receiving an electronic device selection command and a control command for control, and transmitting the input selection command and control command through wireless communication to the electronic device connected to a wireless access point. To this end, the glass type wearable device 10 includes: a wireless communication unit 250; a control unit 210; and a user input unit 110.
  • the user input unit 110 performs the function of receiving the selection command for the electronic device that the user wants to control and the corresponding control command.
  • the controller 210 determines an electronic device to be controlled by the user based on the selection command and the control command input by the user input unit 110 and grasps a desired control command. In addition, the controller 210 performs information processing to transmit a control command to the electronic device selected through the wireless communication unit 250.
  • the wireless communication unit 250 connects the glass type wearable device 10 with the wireless access point, and transmits the input selection command and control command of the electronic device through wireless communication. It may also perform a function of receiving an indoor wireless communication signal.
  • The system may further include a first camera 121.
  • the first camera 121 is a camera provided at one side of the glass type wearable device 10 to acquire an image or image in front of the user.
  • That is, the first camera 121 performs the function of acquiring an image or video of the front of the user in order to recognize the electronic device that the user watches.
  • the controller 210 may recognize an electronic device located in the center of the image or image input by the first camera 121 or may recognize the electronic device present in a specific hand gesture of the user.
  • the first camera 121 performs a function of acquiring a front image or an image for real time indoor positioning of the glass type wearable device 10.
  • the controller 210 may extract a feature element from the front image or the image acquired by the first camera 121, and determine the indoor location based on the location, size, etc. of the feature element.
  • In addition, the system may further include: a motion sensor; and a second camera 122.
  • the motion sensor recognizes a user's moving pattern and performs a function of inputting a selection command or a control command of the electronic device. In addition, the motion sensor performs a function of recognizing the direction the user looks.
  • That is, since the geomagnetic sensor, the gyro sensor 131, and the like measure the azimuth angle, and the acceleration sensor 132 measures the high and low angle, the direction that the user looks at from the current indoor location can be recognized.
  • the second camera 122 is a camera provided on one side of the glass type wearable device 10 to acquire an image or an image in an eyeball direction.
  • the second camera 122 recognizes the user's blink pattern and performs a function of inputting a selection command or a control command of the electronic device.
  • In addition, the second camera 122 acquires an image or video in the eyeball direction, and the glass type wearable device 10 performs eye tracking, so that the gaze direction of the user, taking the eyeball direction into account, can be accurately determined.
  • the wireless access point 50 receives a selection command and a control command of the electronic device from the glass type wearable device 10 through wireless communication, and transmits the control command to the selected electronic device.
  • FIG. 15 is an internal configuration diagram of the shell direction recognition system 60 according to an embodiment of the present invention.
  • The shell direction recognition system 60 according to an embodiment of the present invention includes: a shooting specification measuring unit 610; a GPS module 620; a wireless communication unit 630; and a second control unit 640.
  • the shooting specification measuring unit 610 performs a function of measuring the shooting specifications of the gun.
  • An elevation sensor 611 performs a function of recognizing the elevation of the gun.
  • the elevation sensor 611 uses a gravity measurement method.
  • the gravity measurement method measures tilt relative to gravity; for example, a method of measuring the earth's gravity through a micro electro mechanical system (MEMS) device, or a method of measuring the inclination according to the conduction level of a conductive liquid, may be applied.
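  • As a non-limiting illustration of the gravity measurement method, the following Python sketch derives the elevation angle from a MEMS accelerometer at rest, assuming the sensor's x axis points along the barrel; the axis convention and the example readings are hypothetical.
```python
import math

def elevation_from_accelerometer(ax, ay, az):
    """Estimate the barrel's elevation angle (degrees) from a MEMS
    accelerometer at rest, where (ax, ay, az) is the sensed gravity
    vector in m/s^2 and x is assumed to point along the barrel."""
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

# Gravity projects 4.9 m/s^2 onto the barrel axis -> about 30 degrees.
print(elevation_from_accelerometer(4.9, 0.0, 8.49))
```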
  • the azimuth sensor 612 recognizes the azimuth of the gun.
  • the azimuth sensor 612 may be made of a gyro compass.
  • the gyro compass obtains direction because the axis of its high-speed rotating gyroscope comes to point along the rotation axis of the earth (the true north direction) under the influence of the earth's rotation.
  • the azimuth sensor 612 may be made of an electronic compass.
  • the electronic compass is a device that obtains the magnetic north direction by measuring the earth's magnetic field using a small device using a micro electro mechanical systems (MEMS) technology.
  • In addition, the azimuth sensor 612 may be formed of a plurality of GPS modules. This is a method of measuring the bearing (direction) by obtaining two GPS position values and comparing the two sets of location information with each other.
  • the technology for measuring the azimuth is not limited thereto, and various techniques for measuring the azimuth may be applied.
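  • For the plurality-of-GPS-modules method, the bearing between two fixes can be computed with the standard forward-azimuth formula, sketched below in Python; the antenna placement and the coordinates are hypothetical.
```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Forward azimuth (degrees clockwise from true north) from GPS fix 1
    to GPS fix 2 -- the quantity a two-antenna azimuth sensor would derive."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360.0

# Hypothetical antennas at the breech and the muzzle of the gun.
print(bearing_deg(37.5665, 126.9780, 37.5670, 126.9785))
```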
  • the GPS module 620 recognizes the position of the gun. Since the location where the shell will land can be calculated based on the position of the gun together with the measured shooting specifications and the range information, the GPS module 620 recognizes the position of the gun so that the expected impact location can be determined.
  • the second control unit 640 performs a function of processing data in order to transmit the shooting specifications measured from the shooting specification measuring unit 610 through the wireless communication unit 630.
  • the wireless communication unit 630 performs a function of exchanging information with the glass type wearable device. That is, the wireless communication unit 630 transmits the shooting specification information measured by the shooting specification measuring unit 610 to an external device such as the glass type wearable device 10 through wireless communication.
  • The system may further include a power supply unit.
  • the power supply unit supplies power to drive the system.
  • the power supply unit may be charged wirelessly or by wire, or may use a replaceable battery.
  • The system may further include a shelling recognition unit.
  • the shell recognition unit recognizes whether or not the shell is fired.
  • the shell recognition unit may include an acceleration sensor, a vibration sensor, an acoustic sensor, a smoke sensor, and the like.
  • the acceleration sensor is an element that converts an acceleration change in one direction into an electrical signal, and recognizes whether the shell has been fired by measuring the acceleration due to recoil at the moment of firing.
  • the acoustic sensor recognizes the explosion sound generated when the shell is fired, thereby recognizing whether firing has occurred.
  • the smoke sensor recognizes the smoke generated by the explosive charge when the shell is fired, thereby determining whether firing has occurred.
  • the method of recognizing the shelling is not limited thereto, and may be implemented by various sensors capable of identifying characteristics of the shell firing.
  • In summary, in the shell direction recognition system 60, the shooting specification measuring unit 610 measures the elevation angle and the azimuth angle as the shooting specifications, and the GPS module 620 measures the current position of the gun.
  • the second control unit 640 performs the information processing required to transmit the measured information via wireless communication, and the wireless communication unit 630 transmits it to the glass type wearable device.
  • the second controller 640 may receive the terrain information and the range information through the wireless communication unit 630 to perform the predicted landing location calculation.
  • An anticipated bombardment position display system using a glass type wearable device according to an embodiment of the present invention includes: a wireless communication unit 250; a memory 260; a first control unit 210; and a display unit 310.
  • the wireless communication unit 250 performs a function of receiving shooting specification information from the shell direction recognition system 60. In addition, the wireless communication unit 250 may perform a function of receiving terrain information from an external server. In addition, the wireless communication unit 250 may perform a function of receiving the current position information of the gun from the shell direction recognition system 60. Data reception by the glass type wearable device 10 from the shell direction recognition system 60 is shown in FIG. 17.
  • the first control unit 210 performs a function of calculating the expected impact location based on the shooting specification information received by the wireless communication unit 250 and the range information stored in the memory 260.
  • the range is the distance that the shell flies upon firing. That is, the range information refers to various information such as the firing speed of the shell required for calculating the reach of the shell according to the elevation.
  • That is, the first controller 210 obtains the current position detected by the GPS module 620 of the shell direction recognition system 60 or by the GPS module of the bombardment position display system, and calculates, based on the elevation in the received shooting specification information, the range the shell reaches at that elevation.
  • In addition, the first controller 210 calculates, by applying the azimuth angle in the received shooting specification information, in which direction from the current position the shell will fly by the calculated range.
  • the first controller 210 may calculate the predicted impact location by reflecting the terrain information (for example, a map including information such as a contour map or a building height). That is, since the position where the shell can reach may vary according to the terrain characteristics such as the height of the mountain, the first controller 210 may calculate the predicted landing area in consideration of the terrain information.
  • the terrain information may be received from an external server by the wireless communication unit 250 or stored in the memory 260 of the glass type wearable device 10.
  • the terrain information may further include real-time acquisition information received by the wireless communication unit 250.
  • the real-time acquisition information may include an image of the current operation area obtained by the reconnaissance plane. Therefore, the first control unit 210 may determine the situation of the building, the forest, the enemy's position and the like through the image transmitted in real time, and calculate the impact point in consideration of this.
  • the real-time acquisition information is not limited thereto, and may include various information obtained in real time with respect to the battlefield situation.
  • the memory 260 stores range information for each type of gun.
  • In addition, the memory 260 may perform a function of storing a topographic map (for example, a map including information such as contour lines or building heights).
  • the display unit 310 displays the predicted impact location on a map and provides it to the user.
  • an embodiment of the present invention may include a user input unit 110.
  • the user input unit 110 performs a function of inputting the bombardment location information desired by the user.
  • the bombardment location information is location information of the hit point of the shell desired by the user, for example, the latitude and longitude of the location may correspond.
  • the voice input unit 140 and the voice recognition unit 220 may be included.
  • the voice input unit 140 performs a function of receiving the bombardment location information as a voice command of the user, and the voice recognition unit 220 performs a function of identifying the bombardment location information from the input voice command.
  • an embodiment of the present invention may include a first camera 121.
  • the first camera 121 is a camera provided at one side of the front part of the glass type wearable device to acquire an image or an image of the front of the user.
  • the first camera 121 performs a function of acquiring an image or an image to receive the bombardment location information.
  • That is, the first camera 121 obtains an image including the bombardment location information and transmits it to the first controller 210; the first controller 210 extracts the characters corresponding to the bombardment location information in the image and can recognize the location information (latitude and longitude) from the extracted characters.
  • The system may further include a GPS module.
  • the GPS module performs the function of identifying the current location of the user. The position of the user firing the gun is within the error range of the position of the gun. Therefore, instead of receiving the current position information of the gun from the shell direction recognition system 60 through the wireless communication unit 250, the GPS module may measure the current position of the user and deliver it to the first control unit 210 for calculating the predicted impact location.
  • The system may further include an alarm unit 330.
  • the alarm unit 330 performs the function of notifying the user when the bombardment location information input through the user input unit 110 or the first camera 121 matches the predicted impact location information calculated by the first control unit 210.
  • the alarm unit 330 may include a vibration alarm unit 330 for notifying the user by vibration or a sound output unit 320 for notifying the user by sound output.
  • FIG. 16 is a flowchart illustrating a method for predicting shell impact using a glass type wearable device according to an exemplary embodiment of the present invention.
  • a method for providing an expected impact location for a shell using a glass type wearable device includes: receiving current location information (S600); receiving real-time shooting specification information from the shell direction recognition system (S610); calculating a predicted impact point based on the shooting specification information and the range information (S620); and displaying the expected impact point on a map and providing it to the user (S630).
  • the glass type wearable device 10 receives current location information (S600).
  • the receiving of the current location information may include the glass type wearable device receiving the current position information from the shell direction recognition system, or the glass type wearable device directly measuring the current position.
  • That is, the glass type wearable device 10 may receive the location information measured by the GPS module 620 of the shell direction recognition system, or the GPS module of the glass type wearable device 10 may measure the current position.
  • Real-time shooting specification information is received from the shell direction recognition system (S610).
  • That is, the glass type wearable device 10 receives the shooting specification information measured by the shooting specification measuring unit 610 from the shell direction recognition system 60 through wireless communication.
  • the shooting specification information may include elevation information and azimuth information of the gun.
  • The glass type wearable device 10 calculates an expected impact point based on the shooting specification information and the range information (S620). For example, the glass type wearable device 10 may calculate the range of the shell at the received elevation angle, and then, by applying the azimuth angle from the received shooting specification information, calculate in which direction from the current position the shell will fly by the calculated range. Through this, the expected impact point can be calculated.
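  • A minimal Python sketch of this calculation is shown below, using the ideal no-drag projectile range over flat terrain and a small-distance latitude/longitude approximation; the muzzle speed stands in for the stored range information and, like the example values, is hypothetical.
```python
import math

def expected_impact(lat, lon, elevation_deg, azimuth_deg, muzzle_speed=300.0):
    """Flat-earth, no-drag sketch of step S620: compute the ideal projectile
    range from the elevation angle, then project that range from the gun
    position along the azimuth."""
    g = 9.81
    theta = math.radians(elevation_deg)
    rng = muzzle_speed ** 2 * math.sin(2.0 * theta) / g  # ideal range, meters

    az = math.radians(azimuth_deg)
    d_north, d_east = rng * math.cos(az), rng * math.sin(az)

    # Small-distance conversion of the offset to latitude/longitude degrees.
    dlat = d_north / 111320.0
    dlon = d_east / (111320.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon

# Gun at hypothetical coordinates, 30 degree elevation, firing northeast.
print(expected_impact(37.5665, 126.9780, 30.0, 45.0))
```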
  • the predicted impact location calculation may be performed not only by the first control unit 210 of the predicted bombardment position display system 10 but also by the second control unit 640 of the shell direction recognition system 60, in which case the glass type wearable device 10 may receive the calculated predicted impact location information from the shell direction recognition system 60.
  • the predicted impact location calculation may be performed by the external server by transmitting the shooting specification information and the current location information to an external server.
  • the glass type wearable device may receive the predicted impact location information calculated by the external server through the wireless communication unit 250.
  • the predicted impact location calculation may calculate the impact location by reflecting the terrain information received by the glass type wearable device 10.
  • the terrain information may mean information on features of a terrain of a specific region or features located in a specific region.
  • the glass type wearable device 10 may receive terrain information data such as a map including contour maps or building heights from an external server through wireless communication.
  • In addition, the glass type wearable device 10 may acquire, through wireless communication, terrain information obtained in real time by a reconnaissance device (for example, a UAV).
  • the glass type wearable device 10 displays the expected impact point on a map and provides the same to the user (S630). That is, the predicted impact location is displayed on a map (for example, a map on which contour lines are displayed) and visually provided to the user through the display unit 310.
  • the predicted impact location calculation step (S620) may include: calculating a first predicted impact location based on the shooting specification information and the range information; receiving terrain information including the current location and the first predicted impact location; and calculating a second predicted impact location by reflecting the terrain information. Since the terrain information data can change in real time and the size of the data is very large, it is efficient to receive only the terrain information for the required area. Therefore, the glass type wearable device needs to calculate an approximate predicted impact location first and then request and receive the terrain information of the corresponding area.
  • That is, the glass type wearable device 10 may calculate a first predicted impact location based on the shooting specification information and the range information, i.e., the expected impact point assuming flat land without considering terrain information. Thereafter, the glass type wearable device 10 may request and receive terrain information including the current location and the first predicted impact location, and calculate a second predicted impact location by reflecting the terrain information. In the predicted impact location providing step (S630), the second predicted impact location may be displayed on the map as the predicted impact location.
  • the method may further include receiving the map data including the current location or the predicted impact location from an external server.
  • The method may further include obtaining, by the glass type wearable device 10, bombardment location information.
  • the glass type wearable device 10 may obtain the bombardment location information by receiving it from an external server through wireless communication, by recognizing characters corresponding to the bombardment location information in an image obtained by the first camera, or by recognizing the bombardment location information in input voice data.
  • the method may further include displaying the bombardment location information on the map. That is, the glass type wearable device 10 may visually display the bombardment location information together with the predicted impact location information on the map.
  • The method may further include notifying the user when the bombardment location information matches the predicted impact location information.
  • That is, the glass type wearable device may determine in real time whether the predicted impact location calculated from the real-time shooting specifications matches the location of the desired bombardment target.
  • When they match, the glass type wearable device 10 may inform the user that the bombardment location information and the predicted impact location coincide. This allows the user to fire the shell accurately at the desired impact area.
  • The information processing methods using the glass type wearable device according to the embodiments of the present invention described above may be implemented as a program (or an application) to be executed in combination with the glass type wearable device 10, which is hardware, and stored in a medium.
  • In order for the glass type wearable device 10 to read the program and execute the methods implemented as the program, the program may include code written in a computer language such as C, C++, JAVA, or machine language that the processor (CPU) of the glass type wearable device 10 can read through its device interface.
  • Such code may include functional code related to the functions that define what is necessary for executing the methods, and execution-procedure-related control code necessary for the processor of the glass type wearable device 10 to execute those functions according to a predetermined procedure.
  • Such code may further include memory-reference code indicating at which position (address) of the internal or external memory of the glass type wearable device 10 the additional information or media required for the processor to execute the functions should be referenced.
  • In addition, when the processor of the glass type wearable device 10 needs to communicate with a remote computer or server to execute the functions, the code may further include communication-related code specifying how to communicate with the remote computer or server using the communication module of the glass type wearable device 10, and what information or media should be transmitted and received during communication.
  • The storage medium is not a medium that stores data for a short time, such as a register, cache, or memory, but a medium that stores data semi-permanently and that can be read by a device.
  • examples of the storage medium include, but are not limited to, a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
  • the program may be stored in various recording media on various servers accessible by the glass type wearable device 10 or various recording media on the glass type wearable device 10 of the user.
  • the media may also be distributed over network coupled computer systems so that the computer readable code is stored in a distributed fashion.

Landscapes

  • Alarm Systems (AREA)
  • Navigation (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to an information processing system and method using a wearable device. According to one embodiment of the present invention, the information processing method using the wearable device comprises the steps of: receiving location information of an accident inside a building; calculating an evacuation route by determining location information of a safe emergency exit on the basis of the accident location information; generating guidance information according to the calculated evacuation route; and providing the guidance information to a user, wherein the accident location information includes location information on where a specific accident has occurred, which is required for a user to take refuge in a specific building. According to the present invention, the glass type wearable device has the advantageous effect of providing people with an essentially safe route on the basis of the accident location information acquired via a wireless communication signal, thereby reducing the number of casualties in the event of an accident.
PCT/KR2015/007914 2014-07-30 2015-07-29 System and method for processing information using a wearable device WO2016018063A2 (fr)

Applications Claiming Priority (16)

Application Number Priority Date Filing Date Title
KR10-2014-0097084 2014-07-30
KR10-2014-0097132 2014-07-30
KR20140097132 2014-07-30
KR20140097084 2014-07-30
KR10-2014-0101025 2014-08-06
KR20140101025 2014-08-06
KR20140110608 2014-08-25
KR10-2014-0110608 2014-08-25
KR1020150042547A KR20160017593A (ko) Method and program for providing an escape route using a glass type wearable device
KR10-2015-0042550 2015-03-26
KR10-2015-0042547 2015-03-26
KR1020150042550A KR20160015142A (ko) Emergency contact method and program using a glass type wearable device
KR1020150042943A KR20160015143A (ko) Shell direction recognition system, and method and program for providing an expected shell impact location using a glass type wearable device
KR10-2015-0042941 2015-03-27
KR10-2015-0042943 2015-03-27
KR1020150042941A KR101728707B1 (ko) Method and program for controlling indoor electronic devices using a glass type wearable device

Publications (2)

Publication Number Publication Date
WO2016018063A2 true WO2016018063A2 (fr) 2016-02-04
WO2016018063A3 WO2016018063A3 (fr) 2016-03-24

Family

ID=55218426

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/007914 2014-07-30 2015-07-29 System and method for processing information using a wearable device

Country Status (1)

Country Link
WO (1) WO2016018063A2 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107101633A (zh) * 2017-04-13 2017-08-29 清华大学 Smart wearable device capable of presenting evacuation instructions, and evacuation instruction presentation method
CN108846992A (zh) * 2018-05-22 2018-11-20 东北大学秦皇岛分校 Method and device capable of giving safety warnings to hearing-impaired people
CN113034843A (zh) * 2021-02-21 2021-06-25 深圳市九象数字科技有限公司 Wireless automated monitoring system for high formwork support

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100724967B1 (ko) * 2005-09-28 2007-06-04 삼성전자주식회사 Disaster broadcasting guidance system and method for providing a disaster broadcasting guidance service
JP2009036621A (ja) * 2007-08-01 2009-02-19 Denso Corp In-vehicle route guidance device
KR101018583B1 (ko) * 2010-07-14 2011-03-03 김현태 Intelligent firefighting disaster prevention system
KR101282669B1 (ko) * 2012-11-12 2013-07-12 (주)티엘씨테크놀로지 Smartware system for preventing workplace accidents
KR20140070940A (ko) * 2012-11-30 2014-06-11 주식회사 하나아이엔씨 Smart disaster prevention service platform


Also Published As

Publication number Publication date
WO2016018063A3 (fr) 2016-03-24

Similar Documents

Publication Publication Date Title
WO2016133269A1 (fr) Wearable device for generating an image signal, and system for controlling same
WO2018182217A1 (fr) Adaptive authentication method and electronic device supporting same
WO2018030799A1 (fr) Method for providing parking location information of a vehicle and electronic device therefor
KR101700395B1 (ko) Security drone and mobile security system using same
WO2019013517A1 (fr) Apparatus and method for voice command context
WO2019103212A1 (fr) Monitoring system for a smart IoT terminal on a ship using a communication network
US10755222B2 (en) Work management apparatus, work defect prevention program, and work defect prevention method
KR20160017593A (ko) Method and program for providing an escape route using a glass type wearable device
CN107113354A (zh) Communication system including a head mounted device
WO2018026142A1 (fr) Method for controlling the operation of an iris sensor and electronic device therefor
WO2016018063A2 (fr) System and method for processing information using a wearable device
WO2016021907A1 (fr) Information processing system and method using a wearable device
WO2018048130A1 (fr) Content playback method and electronic device supporting same
WO2016006920A1 (fr) System and method for processing information using a wearable device
KR20160007341A (ko) Bus arrival notification method for a glass type wearable device and program for a glass type wearable device using same
KR101728707B1 (ko) Method and program for controlling indoor electronic devices using a glass type wearable device
KR20160015142A (ko) Emergency contact method and program using a glass type wearable device
WO2016010328A1 (fr) Information processing system and method using a wearable device
JP7092344B2 (ja) Portable flight monitoring terminal, monitoring device, and monitoring method
WO2020246639A1 (fr) Method for controlling an augmented reality electronic device
KR101569880B1 (ko) Wearable device for generating an image signal and system for controlling same
WO2018097483A1 (fr) Motion information generating method and electronic device supporting same
KR20160015143A (ko) Shell direction recognition system, and method and program for providing an expected shell impact location using a glass type wearable device
KR20160053391A (ko) System and method for managing content in an external device using a glass type wearable device, and application for a glass type wearable device
KR101629758B1 (ko) Unlocking method and program for a glass type wearable device

Legal Events

Date Code Title Description
NENP Non-entry into the national phase in:

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15828123

Country of ref document: EP

Kind code of ref document: A2

122 Ep: pct application non-entry in european phase

Ref document number: 15828123

Country of ref document: EP

Kind code of ref document: A2