WO2016018063A2 - Information-processing system and method using wearable device - Google Patents

Information-processing system and method using wearable device

Info

Publication number
WO2016018063A2
WO2016018063A2 (PCT/KR2015/007914)
Authority
WO
WIPO (PCT)
Prior art keywords
wearable device
user
type wearable
information
glass
Application number
PCT/KR2015/007914
Other languages
French (fr)
Korean (ko)
Other versions
WO2016018063A3 (en)
Inventor
한성철
엄정한
김진영
이경현
김대중
김석기
유철현
김주천
김주원
Original Assignee
넥시스 주식회사
Priority claimed from KR1020150042547A (published as KR20160017593A)
Priority claimed from KR1020150042550A (published as KR20160015142A)
Priority claimed from KR1020150042943A (published as KR20160015143A)
Priority claimed from KR1020150042941A (published as KR101728707B1)
Application filed by 넥시스 주식회사
Publication of WO2016018063A2 publication Critical patent/WO2016018063A2/en
Publication of WO2016018063A3 publication Critical patent/WO2016018063A3/en


Definitions

  • The present invention relates to an information processing system and method using a wearable device, and more particularly, to a system and method for processing data acquired during the activities of a user wearing a wearable device.
  • A head mounted display is typically implemented as a goggle- or helmet-type device that places a screen in front of the eyes, and was developed to realize virtual reality.
  • Glass-type wearable devices generally have a small display, such as a liquid crystal display, installed at a position close to both eyes to project an image.
  • Wearable devices have been developed for use in space development, nuclear reactors, military institutions, and medical institutions, and various developments for games and the like are in progress.
  • A glass-type wearable device according to the prior art is disclosed in U.S. Patent No. 8,427,396.
  • Although various types of glass-type wearable devices have been researched and released, methods and services that wearers can conveniently use in daily life are very limited, and neither a UI (User Interface) nor a UX (User Experience) suited to glass-type wearable devices has been developed. In addition, glass-type wearable devices that can perform most functions by themselves, as well as glass-type wearable devices used in conjunction with mobile terminals such as smartphones, have recently been developed, so there is a need for services suited to the characteristics of glass-type wearable devices, which differ from those of mobile terminals.
  • One object of the present invention is to provide services suited to a glass-type wearable device that its user or wearer can utilize.
  • Another object of the present invention is to provide a system and method for providing an escape route or performing emergency contact to the outside using a glass-type wearable device.
  • The present invention also provides a system and method for controlling an electronic device desired by a user through wireless communication, by selecting the electronic device to be controlled through a glass-type wearable device and inputting a control command.
  • An information processing method using a wearable device includes: receiving event occurrence location information in a building; calculating an escape route by determining safe emergency exit location information based on the event occurrence location information; generating guide information along the calculated escape route; and providing the guide information to the user, wherein the event occurrence location information is location information at which a specific event requiring evacuation of the user in a specific building has occurred.
  • An information processing method using a wearable device includes: obtaining a voice input or an operation input of a user; determining an emergency situation by recognizing the voice input or operation input; and transmitting emergency notification information to an emergency contact counterpart via wireless communication.
  • An information processing method using a wearable device may include: measuring, by the glass type wearable device, a current indoor location; measuring a gaze direction of the user by recognizing an azimuth angle or an elevation angle with the glass type wearable device; determining the electronic device located in the gaze direction from the measured current indoor location as a control object; receiving a control command for the electronic device from the user; and transmitting the input control command to the electronic device through wireless communication.
  • An information processing method using a wearable device includes: acquiring an image corresponding to a gaze direction of the user; determining an electronic device in the image as a control object through image analysis; receiving a control command for the electronic device from the user; and transmitting the input control command to the electronic device through wireless communication.
  • An information processing method using a wearable device includes: receiving from a user a voice command containing an electronic device selection command and a control command; analyzing the voice command to determine the electronic device to be controlled and the control command; and transmitting the input control command to the selected electronic device through wireless communication.
  • An information processing method using a wearable device includes: receiving current location information; receiving real-time firing specification information from a shell direction recognition system; calculating a predicted point of impact based on the firing specification information and range information; and displaying the predicted point of impact on a map and providing it to the user.
  • First, the glass-type wearable device provides people with a substantially safe path that avoids the incident location obtained through a wireless communication signal, thereby reducing casualties caused by accidents.
  • Second, the route guidance is displayed on the display unit of the glass wearable device located in front of the user's eyes, so that the user can easily find the emergency exit or evacuation site.
  • Third, a user in an emergency can perform emergency contact without additional manipulation.
  • Fourth, damage from crimes or accidents can be reduced by responding to emergency situations early.
  • Fifth, an emergency contact can be made to a counterpart who can appropriately respond to the emergency situation that the user encounters.
  • Sixth, according to the present invention, electronic devices in the home can be controlled from a distance, eliminating the inconvenience of the user having to move in order to control a desired electronic device. For example, the inconvenience of having to walk to the light switch in order to turn off the light while lying in bed can be eliminated.
  • Seventh, since each electronic device in the home can be connected via wireless communication, there is the advantage that the devices can be controlled using the glass-type wearable device without a separate remote control for each electronic device.
  • Eighth, the electronic device can be controlled by a simple operation such as an eye-blink pattern or a movement pattern, and the user can select the electronic device to control simply by looking at it. For example, if the user wants to turn off the audio playing while lying in bed, the user may stare at the audio system while wearing the glass wearable device and input the eye-blink pattern corresponding to turning the audio off.
  • Ninth, the predicted point of impact of a shell is calculated in real time and displayed to the user of the glass type wearable device, so that shells can be fired at exactly the desired position in battle.
  • Tenth, since shells can be fired at exactly the desired position, the lethality of the artillery is increased.
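  • As a rough illustration of how a predicted point of impact could be computed from firing specifications, the sketch below applies ideal projectile kinematics (no air drag, flat terrain). It is a minimal sketch under those assumptions; the parameter names and the simple metre-to-degree conversion are illustrative, not taken from the patent.

```python
import math

def predicted_impact(lat, lon, muzzle_velocity, elevation_deg, azimuth_deg):
    """Estimate a shell's point of impact from firing specifications.

    Idealized ballistics (vacuum, flat terrain): a real fire-control
    system would also model drag, wind, and terrain elevation.
    """
    g = 9.81  # gravitational acceleration, m/s^2
    elev = math.radians(elevation_deg)
    # Range of an ideal projectile landing at the launch altitude.
    ground_range = muzzle_velocity ** 2 * math.sin(2 * elev) / g
    # Project the range along the firing azimuth (0 deg = north).
    d_north = ground_range * math.cos(math.radians(azimuth_deg))
    d_east = ground_range * math.sin(math.radians(azimuth_deg))
    # Small-distance conversion from metres to degrees of lat/lon.
    lat_impact = lat + d_north / 111_320.0
    lon_impact = lon + d_east / (111_320.0 * math.cos(math.radians(lat)))
    return lat_impact, lon_impact

# Example: 300 m/s muzzle velocity, 45 degree elevation, firing due east.
print(predicted_impact(37.5665, 126.9780, 300.0, 45.0, 90.0))
```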
  • FIG. 1 is a block diagram of a glass type wearable device system according to an exemplary embodiment of the present invention.
  • FIG. 2 is an exemplary view of a glass type wearable device related to one embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a connection relationship between a glass type wearable device, an external server, and an external device according to an exemplary embodiment of the present invention.
  • FIG. 4 is a flowchart illustrating a method of providing an escape route using a glass type wearable device according to an embodiment of the present invention.
  • FIG. 5 is an exemplary view showing guide information on a display unit of a glass type wearable device according to an embodiment of the present invention.
  • FIG. 6 is an internal configuration diagram of a system for providing an escape route using a beacon and a glass type wearable device according to an embodiment of the present invention.
  • FIG. 7 is an internal configuration diagram of a system for providing an escape route using a control server, a beacon, and a glass type wearable device according to an embodiment of the present invention.
  • FIG. 8 is a flowchart illustrating an emergency contact method using a glass type wearable device according to an embodiment of the present invention.
  • FIG. 9 is a diagram illustrating a connection relationship between a glass type wearable device and an emergency contact counterpart according to an embodiment of the present invention.
  • FIG. 10 is a flowchart illustrating a method for controlling an indoor electronic device using a glass type wearable device by measuring indoor positioning and gaze direction according to an embodiment of the present invention.
  • FIG. 11 is a flowchart illustrating a method for controlling an indoor electronic device using a glass type wearable device by analyzing a front image according to an embodiment of the present invention.
  • FIG. 12 is a flowchart illustrating a method for controlling an indoor electronic device using a glass type wearable device by voice command recognition according to an embodiment of the present invention.
  • FIG. 13 is an exemplary view showing an electronic device and a control command recognized in a display unit of a glass type wearable device according to an embodiment of the present invention.
  • FIG. 14 is a block diagram of an indoor electronic device control system using a glass type wearable device according to an embodiment of the present invention.
  • FIG. 15 is an internal configuration diagram of a shell direction recognition system according to an embodiment of the present invention.
  • FIG. 16 is a flowchart illustrating a method of providing a predicted shell impact point using a glass type wearable device according to an embodiment of the present invention.
  • FIG. 17 is a diagram illustrating a connection relationship between a shell direction recognition system and a glass type wearable device according to an exemplary embodiment of the present invention.
  • FIG. 18 is an exemplary view showing an expected impact point on a display unit of a glass type wearable device according to an embodiment of the present invention.
  • FIG. 1 is a block diagram of a glass type wearable device system 10 according to an embodiment of the present invention.
  • The glass type wearable device system 10 may include all or part of an input unit 100 (including a user input unit 110 with a keyboard 111 and a touch pad 112, a camera unit 120 with a first camera 121, a second camera 122, and a third camera 123, a sensing unit 130 with a gyro sensor 131, an acceleration sensor 132, a pressure sensor 133, an iris recognition sensor 134, a heart rate detection sensor 135, and an EMG sensor 136, and a voice input unit 140), a control unit 210, a voice recognition unit 220, an interface unit 230, a voice-to-text conversion module 240, a wireless communication unit 250, a memory 260, and an output unit 300 (including a display unit 310, a sound output unit 320, an alarm unit 330, and a haptic module 340).
  • The components shown in FIG. 1 are not essential, so a glass-type wearable device having more or fewer components may be implemented.
  • FIG. 2 is an exemplary view of a glass type wearable device related to one embodiment of the present invention.
  • the components may be provided inside or on one side of the glass type wearable device as shown in FIG. 2.
  • the input unit 100 is for inputting an audio signal, a video signal, a user's manipulation signal, a biosignal, and the like.
  • the input unit 100 includes a user input unit 110, a camera unit 120, a sensing unit 130, and a voice input unit 140.
  • the user input unit 110 generates key input data input by the user for controlling the operation of the device.
  • the user input unit 110 may include a keypad, a keyboard 111, a dome switch, a touch pad (static pressure / capacitance) 112, a jog wheel, a jog switch, a finger mouse, and the like.
  • When the touch pad forms a mutual layer structure with the display unit 310, which will be described later, it may be referred to as a touch screen.
  • the camera 120 is for inputting a video signal or an image signal, and two or more cameras 120 may be provided according to a configuration aspect of the device.
  • the camera 120 processes image frames such as still images or moving images obtained by an image sensor in a video call mode or a photographing mode.
  • the processed image frame may be displayed on the display 310.
  • the image frame processed by the camera 120 may be stored in the memory 260 or transmitted to the outside through the wireless communication unit 250.
  • The processed image signal and video signal are transferred to the control unit 210.
  • the camera unit 120 may include one or more cameras according to the direction or purpose of the captured image.
  • the first camera 121 may be provided at one side of the glass type wearable device to capture an image of the front side.
  • the second camera 122 may be provided at one side of the glass type wearable device to acquire an image or an image in an eyeball direction.
  • the third camera 123 may be provided at the rear or side of the glass type wearable device 10 to acquire an image or an image of the rear or side.
  • the sensing unit 130 generates a sensing signal for controlling the operation of the device by detecting the current state of the device, such as whether the user wears the glass-shaped wearable device 10 or the position of the device.
  • the sensing unit 130 may perform a function of an input unit receiving an input signal for information processing of the device, and may perform various sensing functions such as whether an external device is connected or not.
  • The sensing unit 130 includes a proximity sensor, a pressure sensor 133, a motion sensor, a fingerprint recognition sensor, an iris recognition sensor 134, a heart rate detection sensor 135, a skin temperature sensor, a skin resistance sensor, an electrocardiogram sensor, and various other sensors.
  • the proximity sensor can detect the presence or absence of an approaching object or an object present in the vicinity without mechanical contact.
  • the proximity sensor can detect a proximity object by using a change in an alternating magnetic field or a change in a static magnetic field, or by using a change rate of capacitance.
  • Two or more proximity sensors may be provided according to the configuration aspect.
  • the pressure sensor 133 may detect whether pressure is applied to the device, the magnitude of the pressure, and the like.
  • The pressure sensor 133 may be installed at a portion of the device requiring pressure detection according to the use environment. If the pressure sensor 133 is installed in the display 310, a touch input through the display 310 and a pressure touch input applying greater pressure than the touch input can be distinguished according to the signal output from the pressure sensor 133. The signal output from the pressure sensor 133 also indicates the magnitude of the pressure applied to the display 310 during a pressure touch input.
  • the motion sensor includes one or more of sensors such as a gyro sensor 131, an acceleration sensor 132, and a geomagnetic sensor, and detects the position or movement of the device using the same.
  • the acceleration sensor 132 that can be used for a motion sensor is a device that converts an acceleration signal in one direction into an electrical signal, and is widely used with the development of micro-electromechanical systems (MEMS) technology.
  • The gyro sensor 131 is a sensor for measuring angular velocity, and may sense the direction of rotation relative to a reference direction.
  • The heart rate detection sensor 135 measures changes in the amount of light passing through blood vessels as their thickness changes with the heartbeat.
  • the skin temperature sensor measures the skin temperature as the resistance value changes in response to the temperature change.
  • Skin resistance sensors measure the electrical resistance of the skin.
  • the iris recognition sensor 134 performs a function of recognizing a person using iris information of an eye having unique characteristics for each person.
  • The human iris is fully formed about 18 months after birth, and the circular iris pattern raised near the inner edge of the iris remains almost unchanged once determined. Iris recognition therefore applies security authentication technology by digitizing the characteristics of each person's distinct iris. In other words, it is an authentication method developed as a means of identifying people by analyzing the shape and color of the iris and the morphology of the retinal capillaries.
  • The iris recognition sensor 134 encodes the iris pattern captured as an image signal, compares it with registered iris codes, and determines the result.
  • The general operation principle is as follows. First, when the user looks at the mirror in the center of the iris recognizer from a certain distance, an infrared camera adjusts the focus through a zoom lens. Then the iris camera captures the user's iris as an image, and the iris recognition algorithm analyzes the iris contrast patterns by area to generate a unique iris code. Finally, as soon as the iris code is registered in the database, a comparison search is performed.
  • the iris recognition sensor 134 may be provided inside the second camera 122 disposed in the eye direction, and in this case, the second camera 122 may perform a function of the iris recognition sensor.
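  • The comparison search described above is commonly implemented by encoding the iris pattern as a fixed-length bit string and matching by Hamming distance. The sketch below shows only that matching step, assuming codes have already been generated; the match threshold of 0.32 is a conventional value from the iris-recognition literature, not a figure from this patent.

```python
def hamming_distance(code_a: bytes, code_b: bytes) -> float:
    """Fraction of differing bits between two equal-length iris codes."""
    assert len(code_a) == len(code_b)
    diff_bits = sum(bin(a ^ b).count("1") for a, b in zip(code_a, code_b))
    return diff_bits / (len(code_a) * 8)

def identify(probe: bytes, database: dict, threshold: float = 0.32):
    """Return the enrolled user whose iris code is closest to the probe,
    or None if no enrolled code is within the match threshold."""
    best_id, best_dist = None, 1.0
    for user_id, enrolled in database.items():
        d = hamming_distance(probe, enrolled)
        if d < best_dist:
            best_id, best_dist = user_id, d
    return best_id if best_dist <= threshold else None
```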
  • The distance sensor may use a two-point distance measurement method, a triangulation method (infrared type, natural light type), or an ultrasonic method.
  • In the triangulation method, as in the conventional triangulation principle, light from the measured object arriving along two paths is reflected by a right-angle prism onto two image sensors, and the distance between the two points is indicated when the relative positions match.
  • The ultrasonic method transmits an ultrasonic wave with sharp directivity toward the object to be measured and measures the distance from the reflected wave received back from the object.
  • the receiving sensor uses a piezoelectric element.
  • the Doppler radar is a radar that utilizes a phase change of the reflected wave, that is, a Doppler effect of the wave.
  • the doppler radar includes a continuous wave radar that transmits and receives a sine wave which is not pulse modulated, and a pulse radar that uses pulse modulated radio waves as an electromagnetic wave signal waveform.
  • Continuous wave radar is unsuitable for long-range radar because the modulation frequency must be relatively high in order to obtain adequate Doppler frequency filter performance.
  • The pulse radar measures the distance to the target by the time from pulse transmission to reception of the reflected echo.
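  • Both the ultrasonic sensor and the pulse radar reduce to the same time-of-flight arithmetic: distance = propagation speed × round-trip time / 2. A minimal sketch:

```python
def distance_from_echo(round_trip_s: float, wave_speed_m_s: float) -> float:
    """Distance to the target from the round-trip echo time."""
    return wave_speed_m_s * round_trip_s / 2.0

SPEED_OF_SOUND = 343.0          # m/s in air at about 20 C (ultrasonic)
SPEED_OF_LIGHT = 299_792_458.0  # m/s (pulse radar)

print(distance_from_echo(0.01, SPEED_OF_SOUND))   # ~1.7 m
print(distance_from_echo(2e-6, SPEED_OF_LIGHT))   # ~300 m
```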
  • the voice input unit 140 is for inputting a voice signal and may include a microphone.
  • The microphone receives an external sound signal in a call mode, a recording mode, a voice recognition mode, and the like, and processes it into electrical voice data.
  • the processed voice data may be converted into a form transmittable to the mobile communication base station through the wireless communication unit 250 and output in the call mode.
  • various noise canceling algorithms may be used to remove noise generated while receiving an external sound signal.
  • the output unit 300 is for outputting an audio signal, an image signal, a video signal or an alarm signal.
  • the output unit 300 may include a display unit 310, a sound output unit 320, an alarm unit 330, and a haptic module 340.
  • The display 310 displays and outputs information processed by the device. For example, when the device is in a call mode, it displays a user interface (UI) or graphic user interface (GUI) related to the call. When the device is in a video call mode or a photographing mode, the captured and/or received images may be displayed individually or simultaneously, together with the UI and GUI.
  • the display unit 310 may be used as an input device in addition to the output device. If the display unit 310 is configured as a touch screen, the display unit 310 may include a touch screen panel and a touch screen panel controller.
  • The display unit 310 may include at least one of a liquid crystal display, a thin film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, and a three-dimensional (3D) display.
  • two or more display units 310 may exist according to the implementation form of the device. For example, the external display unit 310 and the internal display unit 310 may be simultaneously provided in the device.
  • the display unit 310 may be implemented as a head up display (HUD), a head mounted display (HMD), or the like.
  • a head up display (HUD) is an image display device for projecting a virtual image onto glass in a user's visible area.
  • the sound output unit 320 outputs audio data received from the wireless communication unit or stored in the memory 260 in a call signal reception, a call mode or a recording mode, a voice recognition mode, a broadcast reception mode, and the like.
  • the sound output module 320 outputs a sound signal related to a function performed in the device, for example, a call signal reception sound and a message reception sound.
  • the sound output module 320 may include a speaker, a buzzer, and the like.
  • the alarm unit 330 outputs a signal for notifying occurrence of an event of the device. Examples of events occurring in the device include call signal reception, message reception, and key signal input.
  • the alarm unit 330 outputs a signal for notifying occurrence of an event in a form other than an audio signal or a video signal. For example, the signal may be output in the form of vibration.
  • For example, the alarm unit 330 may output a signal notifying reception when a call signal or a message is received. Also, when a key signal is input, the alarm unit 330 may output a signal as feedback to the key signal input. The user may recognize the occurrence of an event through the signal output by the alarm unit 330.
  • the signal for notifying the event occurrence in the device may also be output through the display 310 or the sound output unit 320.
  • the haptic module 340 generates various haptic effects that a user can feel.
  • a representative example of the haptic effect generated by the haptic module 340 is a vibration effect.
  • When the haptic module 340 generates vibration as a haptic effect, the intensity and pattern of the generated vibration can be varied, and different vibrations may be output in combination or in sequence.
  • the wireless communication unit 250 may include a broadcast receiving module, a mobile communication module, a wireless internet module, a short range communication module, and a location information module.
  • the broadcast receiving module receives at least one of a broadcast signal and broadcast related information from an external broadcast management server through a broadcast channel.
  • the broadcast channel may include a satellite channel, a terrestrial channel, and the like.
  • the broadcast management server may mean a server that generates and transmits at least one of a broadcast signal and broadcast related information, or a server that receives at least one of the pre-generated broadcast signal and broadcast related information and transmits the same to a terminal.
  • the broadcast related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider.
  • the broadcast related information may also be provided through a mobile communication network, and in this case, may be received by the mobile communication module.
  • Broadcast related information may exist in various forms.
  • the broadcast receiving module may receive a broadcast signal using various broadcast systems, and receive a digital broadcast signal using a digital broadcast system.
  • the broadcast receiving module may be configured to be suitable for all broadcast systems providing broadcast signals as well as such digital broadcast systems.
  • the broadcast signal and / or broadcast related information received through the broadcast receiving module may be stored in the memory 260.
  • the mobile communication module transmits and receives a radio signal with at least one of a base station, an external terminal, and a server on a mobile communication network.
  • the wireless signal may include various types of data according to voice call signal, video call signal, or text / multimedia message transmission and reception.
  • the wireless internet module refers to a module for wireless internet access, and the wireless internet module may be embedded or external to the device.
  • Wireless Internet technologies such as Wireless LAN (Wi-Fi), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), Long Term Evolution (LTE), and Long Term Evolution-Advanced (LTE-A) can be used.
  • the short range communication module refers to a module for short range communication.
  • Beacon, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, etc. may be used as a short range communication technology.
  • the location information module refers to a module that receives the positioning signal and measures the position of the glass type wearable device.
  • The location information module may correspond to a Global Positioning System (GPS) module, and the positioning signal may correspond to a GPS signal.
  • The Global Positioning System (GPS) module receives position information from a plurality of GPS satellites.
  • The memory 260 may store a program for the processing and control of the controller 210, and may perform a function of temporarily storing input or output data (e.g., messages, still images, videos, etc.).
  • The memory 260 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, SD or XD memory), and RAM.
  • the device may operate a web storage that performs a storage function of a memory on the Internet.
  • the memory 260 may be referred to as a storage 260 hereinafter.
  • The interface unit 230 serves as an interface with all external devices connected to the device. Examples of external devices that connect to the device include wired/wireless headsets, external chargers, wired/wireless data ports, card sockets for memory cards and Subscriber Identification Module (SIM) or User Identity Module (UIM) cards, audio I/O (input/output) terminals, video I/O terminals, earphones, and the like.
  • the interface unit 230 may receive data from such an external device or receive power and transfer the power to each component inside the device, and allow the data inside the device to be transmitted to the external device.
  • The controller 210 typically controls the operation of each unit and controls the overall operation of the device. For example, it performs control and processing related to voice calls, data communications, and the like. In addition, the controller 210 processes data for multimedia reproduction, as well as data received from the input unit or the sensing unit 130.
  • the controller 210 performs face detection and face recognition for face recognition. That is, the controller 210 may include a face detection module and a face recognition module for face recognition.
  • the face detection module may extract only the face area from the camera image acquired by the camera unit 120. For example, the face detection module extracts a face region by recognizing feature elements in the face such as eyes, nose, and mouth.
  • the face recognition module may generate a template by extracting feature information from the extracted face region, and recognize a face by comparing a template with face information data in a face database.
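  • As a concrete illustration of the face detection step, the sketch below uses the stock Haar cascade shipped with the opencv-python package to extract face regions from a camera frame. This is one possible realization, not necessarily the detector the patent contemplates; the template comparison of the face recognition module is omitted here.

```python
import cv2

# Stock frontal-face Haar cascade bundled with opencv-python.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_faces(frame):
    """Return face regions (x, y, w, h) found in a BGR camera frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

def crop_faces(frame):
    """Extract each detected face region, as the face detection module would,
    so the recognition module can compare it against the face database."""
    return [frame[y:y + h, x:x + w] for (x, y, w, h) in detect_faces(frame)]
```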
  • the controller 210 may perform a function of extracting and recognizing a character from an image or an image acquired by the camera unit 120. That is, the controller 210 may include a character recognition module for character recognition.
  • Optical character recognition (OCR) may be applied as the character recognition method of the character recognition module.
  • The OCR method converts the typeface image of a document, written by a person or printed by a machine and obtained by an image scan, into a format such as computer character codes that a computer can edit, and can be implemented in software.
  • The OCR method may compare the input characters against several standard pattern characters prepared in advance and select the standard pattern character most similar to the input as the corresponding character.
  • Since the character recognition module includes standard pattern characters for various languages, it can read printed characters in various languages. This approach is called the pattern matching method among OCR methods; the OCR method is not limited thereto, and various methods may be applied. Furthermore, the character recognition method of the character recognition module is not limited to OCR, and various methods for recognizing characters already printed offline may be applied.
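  • A minimal sketch of the pattern matching idea described above: each input glyph image is compared against prepared standard pattern characters, and the most similar pattern is selected. Normalized cross-correlation stands in here for whatever similarity measure a production OCR engine would use, and glyphs are assumed pre-normalized to the pattern size.

```python
import numpy as np

def similarity(glyph: np.ndarray, pattern: np.ndarray) -> float:
    """Normalized cross-correlation between a glyph image and a pattern.
    Both arrays are assumed scaled to the same height and width."""
    g = (glyph - glyph.mean()).ravel()
    p = (pattern - pattern.mean()).ravel()
    denom = np.linalg.norm(g) * np.linalg.norm(p)
    return float(g @ p / denom) if denom else 0.0

def recognize(glyph: np.ndarray, standard_patterns: dict) -> str:
    """Select the standard pattern character most similar to the glyph."""
    return max(standard_patterns,
               key=lambda ch: similarity(glyph, standard_patterns[ch]))
```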
  • The controller 210 may perform a function of recognizing the gaze direction based on an eyeball image acquired by the second camera 122. That is, the controller 210 may include a gaze analysis module that performs gaze direction recognition. After measuring the facing direction and the pupil direction of the user, the direction the user is looking can be determined by combining the two.
  • The facing direction refers to the direction of the user's face, and may be measured by the gyro sensor 131 or the acceleration sensor 132 of the sensing unit 130.
  • The pupil direction may be grasped by the gaze analysis module as the direction in which the user's pupil is looking.
  • The gaze analysis module may detect the movement of the pupil through analysis of real-time camera images, and apply a method of calculating the direction of the gaze based on the fixed position of light reflected from the cornea. For example, through image processing, the pupil center and the position of the corneal reflection of the illumination may be extracted, and the gaze position calculated from their positional relationship.
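  • A schematic version of that calculation: the offset between the pupil center and the corneal reflection (glint) of a fixed illuminator is mapped to a gaze angle. The linear mapping and its gain constant below are illustrative assumptions; real eye trackers calibrate this mapping per user.

```python
def gaze_angles(pupil_xy, glint_xy, gain_deg_per_px=0.12):
    """Approximate gaze direction from the pupil-centre / glint offset.

    Simple linear pupil-centre corneal-reflection (PCCR) model: the
    illuminator glint stays nearly fixed on the cornea, so the pupil's
    offset from it varies with eye rotation.
    """
    dx = pupil_xy[0] - glint_xy[0]
    dy = pupil_xy[1] - glint_xy[1]
    yaw = gain_deg_per_px * dx     # horizontal gaze angle, degrees
    pitch = -gain_deg_per_px * dy  # vertical angle (image y grows downward)
    return yaw, pitch

print(gaze_angles(pupil_xy=(312, 248), glint_xy=(300, 255)))
```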
  • the power supply unit receives an external power source and an internal power source under the control of the controller 210 to supply power for operation of each component.
  • The speech recognition unit 220 performs a function of identifying linguistic meaning from speech by automatic means. Specifically, the process of identifying a word or word sequence and extracting meaning from an input speech waveform is divided into five stages: speech analysis, phoneme recognition, word recognition, sentence interpretation, and meaning extraction.
  • the voice recognition unit 220 may further include a voice evaluation module for comparing whether the stored voice and the input voice are the same.
  • the voice recognition unit 220 may further include a voice-to-text conversion module 240 for converting an input voice into text or converting a text into voice.
  • FIG. 3 is a diagram illustrating a connection relationship between the glass type wearable device 10, the external server 20, and the external device 30 according to an embodiment of the present invention.
  • the glass type wearable device 10 may perform all the processing for information processing therein, but the external server 20 may perform some of the information processing. Accordingly, the glass type wearable device 10 may transmit the data acquired through the input unit 100 or data on which some information processing is performed to the external server 20 as information processing request data. The glass type wearable device 10 may receive information processing result data performed by the external server 20 through wireless communication. The glass type wearable device 10 may provide the received information processing result data to the user through the output unit 300 in various ways. The external server 20 may be different according to a service performed by the glass type wearable device 10.
  • The glass type wearable device 10 may provide the information processing result data to the user through its own output unit 300, or may provide it using the external device 30. That is, when the glass type wearable device 10 performs the entire information processing process, the external device 30 may output the information processing result data received from the glass type wearable device 10. In addition, when the external server 20 receives the information processing request data from the glass type wearable device 10 and performs some of the information processing, the external device 30 may output the information processing result data received from the external server 20.
  • The external device 30 may include various devices such as a smartphone, a tablet PC, a smart TV, and an output unit provided in a vehicle (for example, a display unit provided in the vehicle glass or an in-vehicle sound output unit).
  • the glass type wearable device 10 may receive a wireless communication signal (for example, a beacon signal transmitted from a beacon tag that is a wireless communication tag 30) transmitted from the external device 30.
  • the glass type wearable device 10 may perform information processing using the received wireless communication signal.
  • FIG. 4 is a flowchart illustrating a method of providing an escape route using a glass type wearable device according to an exemplary embodiment of the present invention.
  • The method for providing an escape route using the glass type wearable device includes: receiving event occurrence location information in a building (S100); calculating an escape route by determining safe emergency exit location information based on the event occurrence location information (S110); generating guide information along the calculated escape route (S120); and providing the guide information to the user (S130).
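  • Taken together, steps S100 to S130 can be pictured as a shortest-path search over the building's corridor graph, with nodes near the event occurrence location excluded before the search. The corridor graph, node names, and the choice of Dijkstra's algorithm below are illustrative; the patent does not prescribe a particular routing algorithm.

```python
import heapq

def escape_route(graph, start, exits, blocked):
    """Shortest safe path from `start` to the nearest emergency exit.

    graph   -- {node: {neighbor: distance_m}} corridor graph of the building
    exits   -- set of emergency-exit nodes
    blocked -- nodes near the event occurrence location, treated as unsafe
    """
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node in exits:                    # nearest safe exit reached
            path = [node]
            while path[-1] != start:
                path.append(prev[path[-1]])
            return list(reversed(path))
        if d > dist.get(node, float("inf")):
            continue                         # stale heap entry
        for nbr, w in graph.get(node, {}).items():
            if nbr in blocked:
                continue                     # avoid the event location
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr], prev[nbr] = nd, node
                heapq.heappush(heap, (nd, nbr))
    return None                              # no safe exit reachable

corridors = {"room": {"hall": 5}, "hall": {"room": 5, "exit_a": 12, "exit_b": 20},
             "exit_a": {}, "exit_b": {}}
print(escape_route(corridors, "room", {"exit_a", "exit_b"}, blocked={"exit_a"}))
```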
  • First, the glass type wearable device receives event occurrence location information in a building (S100).
  • the incident occurrence position information is position information on which a specific incident requiring evacuation of a user in a specific building occurs.
  • The incident location information may be identified through the operation of a sprinkler in each zone, the operation of a fire alarm, a power outage sensor, a heat detector, an emergency alarm triggered after recognition by a security guard, or the failure of a surveillance camera or communication equipment. For example, if a sprinkler in a specific zone operates due to smoke generated by a fire, that zone is determined to be the event occurrence place.
  • the glass type wearable device 10 receives event occurrence position information through wireless communication.
  • the glass type wearable device 10 may actively sense the beacon signal.
  • the beacon 40 may be attached to various locations of the building, in particular may be included in the emergency exit notification light.
  • the glass type wearable device 10 may receive a beacon signal including event occurrence location information at any position through beacons attached to various places inside a building.
  • the glass type wearable device 10 calculates an escape route by determining safe emergency exit location information on the basis of the event occurrence location information (S110).
  • The emergency exit location information is information on the locations of the emergency exits in the building; it may be stored in the glass type wearable device 10, or it may be included in the beacon signal together with the event occurrence location information and transmitted to the glass type wearable device 10.
  • the emergency exit may be a passage to evacuate to the outside, or, if it is not possible to evacuate to the outside, it may be a path to a place that can safely evacuate to a roof or the like and wait for rescue.
  • The control unit 210 of the glass type wearable device 10 determines an emergency exit to which the user can safely evacuate, based on the event occurrence location and the current location of the user. Thereafter, the glass type wearable device 10 calculates an escape route along which the user can safely evacuate to the identified emergency exit.
  • the glass type wearable device 10 generates guide information along the calculated escape route (S120).
  • the guide information may include moving route information, a moving route switching direction, a distance remaining until the moving route switching, and precautions when moving.
  • Based on the escape route, the control unit 210 of the glass type wearable device 10 generates guide information including the direction to the emergency exit, the distance to a branch point, and the like.
  • the guide information is provided to the user (S130).
  • As the method of informing the user of the guide information, a method of displaying the guide information on the display unit 310, a method of notifying the user of the guide information by sound output, or a method of notifying the user of the guide information through the intensity and direction of vibration may be applied.
  • the moving path switching direction and the distance remaining until the moving path switching may be displayed on the display unit 310 of the glass type wearable device 10.
  • Since the escape information is included in the guide information sequentially, the user may request the next guide information from the control unit 210 through the user input unit 110 after each change of the movement path, and move along the escape route while confirming the guide information in order.
  • In the method of notifying the user by displaying the guide information on the display unit 310, the current moving direction may be measured through a motion sensor such as the gyro sensor 131, and a notification may be provided to the user by determining whether the user's moving direction corresponds to the escape route.
  • the method may further include receiving real-time event progress status information through wireless communication, wherein the guide information providing step (S130) may provide the user with the event occurrence location information or the event progress status information.
  • the event progress status information is information about a situation in which the event is being processed or a situation in which the event is in progress.
  • The event progress information may mean the progress of the fire (i.e., the extent of the building to which the fire has spread), the degree to which the fire has been extinguished, or whether firefighters have been deployed.
  • The incident progress status information may be analyzed by an external server based on at least one of the position or number of sprinklers in operation, the location of cameras or communication equipment in the building, or the time at which they failed, and may be received through wireless communication.
  • The glass type wearable device 10 may display the event occurrence information on a map shown on the display 310, together with the situation in which the event is being handled. If the incident is being handled safely, the user can be aware of the handling situation, and secondary accidents caused by hasty evacuation can be prevented.
  • In addition, the method may further include measuring, by the glass-type wearable device 10, the current position through indoor positioning, wherein the escape route calculation step (S110) may calculate an appropriate emergency exit and escape route based on the indoor location information and the event occurrence location information.
  • As a method of recognizing the indoor location of the user, measuring the current location of the user through communication between a wireless communication device such as the beacon 40 and the glass type wearable device 10 may be applied.
  • For example, when the glass type wearable device 10 receives three distinguishable beacon signals, it can survey the current location based on the strength of each received beacon signal and the position of the beacon 40 that transmitted it.
  • the position measurement method using the beacon 40 and the mutual communication is not limited thereto, and various methods such as a method using the direction of the beacon signal may be applied.
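  • One way to realize the three-beacon positioning sketched above: convert each received signal strength (RSSI) to an estimated distance with a log-distance path-loss model, then solve the resulting circle equations by least squares. The path-loss constants below are typical illustrative values for BLE beacons, not figures from the patent.

```python
import numpy as np

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, path_loss_exp=2.0):
    """Log-distance path-loss model: estimated distance (m) from an RSSI
    reading; tx_power_dbm is the calibrated RSSI at 1 m."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def trilaterate(beacons, distances):
    """Least-squares position from three or more beacon positions (m)
    and estimated distances, linearized against the first beacon."""
    (x0, y0), d0 = beacons[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(beacons[1:], distances[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return tuple(sol)

# Example: three beacons at known indoor coordinates.
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 8.0)]
distances = [rssi_to_distance(r) for r in (-65.0, -72.0, -70.0)]
print(trilaterate(beacons, distances))
```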
  • Alternatively, the first camera 121 may be used to recognize characteristic elements, such as distinctive terrain or features of the surroundings, and the current location of the user may be recognized by comparison with indoor map information stored in the glass type wearable device 10 or received from an external server through wireless communication.
  • the first camera 121 is a camera provided at one side of the glass type wearable device 10 to acquire an image or an image of the front side. Accordingly, the first camera 121 acquires the front image or the image, extracts the characteristic element in the image or the image, and grasps the characteristic element from the indoor map information to determine the current position.
  • the direction that the user is currently facing may be recognized through the feature element position in the front image or the image.
  • the user may be provided with appropriate guide information corresponding to the current location while moving from the glass type wearable device 10.
  • the method of determining the indoor location of the user is not limited thereto, and various methods of implementing indoor location recognition may be applied.
  • the glass type wearable device 10 may calculate an appropriate emergency exit and escape route based on the indoor location information and the event occurrence location information. Through this, the glass type wearable device 10 may guide an emergency exit and an escape route that best fits the location of the user, thereby assisting the user to safely escape.
  • the method may further include transmitting real-time location information of the user to an external device through wireless communication.
  • By transmitting the position of the user recognized by the glass type wearable device 10 to an external terminal through wireless communication, people outside can grasp the position of the user in the building, so that the user can be rescued easily.
  • In addition, the method may further include: identifying, by the glass-type wearable device 10, building location information; and requesting and receiving an internal map of the building from an external server based on the building location information.
  • To calculate the escape route, an interior map of the building is required. Accordingly, the glass type wearable device 10 may grasp the building location information through a GPS signal to identify the building whose internal map is to be received. Thereafter, the glass type wearable device 10 may request and receive the internal map of the building corresponding to the location information from an external server.
  • FIG. 6 is a block diagram of the emergency exit notification system using the beacon 40 and the glass-type wearable device 10 according to an embodiment of the present invention.
  • The emergency exit notification system using the beacon 40 and the glass type wearable device 10 may include a wireless communication unit 250, a control unit 210, and an output unit 300.
  • the wireless communication unit 250 performs a function of receiving event occurrence place information from the beacon 40. In addition, when the glass type wearable device 10 does not store the emergency exit location information, the wireless communication unit 250 may receive the emergency exit location information. In addition, the wireless communication unit 250 performs a function of exchanging information with an external terminal. That is, the wireless communication unit 250 may transmit the measured indoor location information of the user to an external terminal.
  • The controller 210 recognizes the event occurrence place and the emergency exit location information, calculates an escape route, and generates guide information according to the escape route. In addition, the controller 210 may perform a function of recognizing the current location of the user through methods such as mutual communication with the beacon 40 or recognition of characteristic elements in front by the first camera 121. For example, the controller 210 may measure the current location of the user by using one or more of the beacon signals received by the wireless communication unit 250.
  • the output unit 300 performs a function of informing the user of the guide information.
  • The output unit 300 may include a display unit 310 for visually displaying the guide information, a sound output unit 320 for notifying the user of the guide information by sound, and a vibration alarm unit for informing the user of the guide information through the strength and direction of vibration.
  • The emergency exit notification system using the beacon 40 and the glass type wearable device 10 according to another embodiment includes an external server 20, a beacon 40, and a glass-shaped wearable device 10.
  • In FIG. 7, detailed description of the previously described components is omitted.
  • the external server 20 recognizes the occurrence place of the event, calculates the escape route, and performs the function of setting the movement route guide information to be transmitted for each beacon 40.
  • the beacon 40 performs a function of generating a beacon signal corresponding to the guide information set by the external server 20.
  • the glass type wearable device 10 receives the beacon signal and performs a function of informing the user of guide information.
  • The glass type wearable device 10 includes a wireless communication unit 250, a control unit 210, and an output unit 300.
  • the wireless communication unit 250 receives a beacon signal from the beacon 40 and performs a function of exchanging information with the outside.
  • the controller 210 performs a function of performing information processing according to an output form based on the received beacon signal.
  • the output unit 300 performs a function of informing the user of the guide information.
  • the external server 20 recognizes the occurrence place of the event, calculates the escape route, and sets the movement route guide information to be transmitted for each beacon 40.
  • the guide information may include a moving path switching direction, a distance remaining until the moving path switching, and precautions when moving.
  • The beacon 40 generates a beacon signal corresponding to the guide information set by the external server 20, and the wireless communication unit 250 of the glass type wearable device 10 receives the beacon signal. Thereafter, the control unit 210 processes the received beacon signal according to the output form, and the output unit 300 receives the processed information and notifies the user of the route guide information. As the user moves, beacon signals may be received to provide guide information appropriate to the user's location.
  • The controller 210 may measure the current location of the user by using one or more of the beacon signals received by the wireless communication unit 250.
  • FIG. 8 is a flowchart illustrating an emergency contact method using a glass type wearable device according to an exemplary embodiment of the present invention.
  • The emergency contact method using a glass-type wearable device may include: obtaining a voice input or an operation input of a user (S200); determining an emergency situation by recognizing the voice input or operation input (S210); and transmitting emergency notification information to an emergency contact counterpart via wireless communication (S220).
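  • Before walking through the steps, here is a compressed sketch of the S200-S220 flow: classify the recognized input against stored emergency patterns, then notify the contact registered for that classification. The phrase lists, contact table, and send_message placeholder are hypothetical stand-ins for the device's speech recognizer and wireless communication unit.

```python
# Hypothetical table: emergency class -> trigger phrases and registered contact.
EMERGENCY_CLASSES = {
    "crime":   {"phrases": ["help me", "save me"], "contact": "police@112"},
    "medical": {"phrases": ["are you okay", "wake up"], "contact": "ems@119"},
}

def classify_emergency(recognized_text: str):
    """S210: match recognized speech against stored emergency patterns."""
    text = recognized_text.lower()
    for cls, info in EMERGENCY_CLASSES.items():
        if any(phrase in text for phrase in info["phrases"]):
            return cls
    return None

def send_message(contact: str, payload: dict):
    """Placeholder for the wireless communication unit's transmit call."""
    print(f"-> {contact}: {payload}")

def handle_voice_input(recognized_text: str, location):
    """S200-S220: detect an emergency and notify the matching counterpart."""
    cls = classify_emergency(recognized_text)
    if cls is not None:
        send_message(EMERGENCY_CLASSES[cls]["contact"],
                     {"class": cls, "location": location,
                      "text": recognized_text})

handle_voice_input("HELP ME, somebody!", location=(37.5665, 126.9780))
```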
  • the glass type wearable device 10 receives a user's voice input or an operation input (S200).
  • the glass type wearable device 10 may receive a user voice through the voice input unit 140.
  • The voice input may be a scream or a designated emergency signal phrase. For example, in a criminal situation, a scream or a phrase asking for help, such as 'Help me!', may be input.
  • In addition, the glass type wearable device 10 may receive surrounding voices according to the user's settings. For example, if the user has a disease and frequently falls down, a question asked by a bystander to check the user's condition may be received as the voice input.
  • the glass type wearable device 10 may acquire motion data of a user by a motion sensor.
  • The operation input may correspond to an unexpected movement of the user or a stored specific movement pattern.
  • For example, a sudden movement of the user may correspond to an abnormal movement pattern caused by struggling when the user is being kidnapped.
  • the glass type wearable device 10 determines the emergency situation by recognizing the voice input or the operation input (S210).
  • The voice recognition unit 220 grasps the linguistic meaning or tone from the voice input, and the controller 210 recognizes whether it corresponds to an emergency based on this.
  • In addition, the control unit 210 compares the user's motion information recognized by the motion sensor with stored motion pattern information, and if it matches pattern information corresponding to an emergency situation, the controller 210 determines that an emergency situation has occurred.
  • the glass type wearable device 10 transmits the emergency notification information to the emergency contact counterpart via wireless communication (S220).
  • When an emergency is determined, the controller 210 commands the wireless communication unit 250 to perform the emergency contact. Accordingly, the wireless communication unit 250 performs the emergency contact to the outside.
  • the emergency contact may include a text message, a phone call or a push notification service to a predetermined contact.
  • the push notification service may be, for example, a push service for notifying a message to users who use the same program in a smartphone.
  • The predetermined contact may be a friend, a guardian, or the police, but is not limited thereto.
  • The contact information may be obtained from the memory of the glass type wearable device, or may be received from an external server through wireless communication.
  • the emergency contact method is not limited to the described method, and various methods that may be performed by the wireless communication unit 250 of the glass type wearable device may be applied.
  • the method may further include measuring a current location of the user, wherein the emergency contact performing step S220 may include the measured current location information in the emergency notification information.
  • the current location information of the user may be grasped by the GPS module of the glass type wearable device 10.
  • the method may further include determining an emergency classification corresponding to the voice input or the operation input; And selecting a specific designated agency or a specific acquaintance as the emergency contact counterpart according to the emergency classification.
  • the glass type wearable device 10 may classify and store data for quickly performing emergency contact in the storage unit 260.
  • The data stored in the storage unit 260 may include contact information of designated agencies and the voice input or operation input data corresponding to each emergency. Accordingly, the storage unit 260 may classify the user's voice inputs or operation inputs according to the emergency situation, and store, for each emergency situation, the designated agency and its contact information.
  • The control unit 210 may grasp the emergency classification corresponding to the user's voice input or operation input, and select a designated agency or acquaintance to be contacted according to the emergency classification. Thereafter, the control unit 210 may transmit a command signal to the wireless communication unit 250 to perform emergency contact to the contact point of the designated agency or acquaintance, and the wireless communication unit 250 may perform the emergency contact to that contact point.
  • the contact number is not limited to a phone number, and may include an address or the like for transmitting data by various wireless communication methods.
  • the method may further include performing real-time video recording or real-time audio recording of the front side.
  • the emergency contact performing step (S220) may transmit the captured video or the recorded voice through wireless communication. That is, the glass type wearable device 10 may transmit a real time image captured by the first camera 121 or a real time voice obtained by the voice input unit together with the wireless communication when performing an emergency contact.
  • the other party (e.g., a friend, guardian, firefighter, or police officer) who has received the emergency contact can thereby recognize the user's emergency and take appropriate action according to the situation.
  • the emergency contact step (S220) may include determining the emergency classification of the user by analyzing the real-time image. For example, when a real-time image of the user falling to the floor is acquired, the glass type wearable device 10 may recognize that the user has fallen and classify the emergency as one caused by a physical abnormality.
  • in the emergency notification information transmission step (S220), the device may transmit an actively recognizable signal so that unspecified counterparts within a certain distance can recognize the user's emergency.
  • For example, the glass type wearable device 10 may broadcast the emergency notification information to its surroundings in a wireless communication signal capable of active sensing, such as a beacon signal.
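  • A minimal sketch of packing emergency notification information into a short broadcast payload of the kind a beacon-style advertisement could carry. The field layout is an assumption; a real BLE advertisement has its own framing and size limits.

```python
# Hypothetical sketch: encoding/decoding a compact emergency broadcast frame.
import struct

def encode_emergency_beacon(user_id: int, emergency_code: int,
                            lat: float, lon: float) -> bytes:
    # 2-byte user id, 1-byte emergency code, two 4-byte floats (lat, lon).
    return struct.pack("<HBff", user_id, emergency_code, lat, lon)

def decode_emergency_beacon(payload: bytes):
    user_id, code, lat, lon = struct.unpack("<HBff", payload)
    return {"user": user_id, "code": code, "lat": lat, "lon": lon}

frame = encode_emergency_beacon(7, 2, 37.5665, 126.9780)
print(len(frame), decode_emergency_beacon(frame))  # 11-byte payload
```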
  • the method may further include outputting, by voice, an emergency response method corresponding to the user's emergency to the outside.
  • the glass type wearable device 10 may output, as voice, a first aid method corresponding to the user's emergency to notify the surrounding people.
  • the method may further include receiving a biosignal of the user from an external wearable device or acquiring the biosignal by the glass type wearable device itself, and the emergency determination step (S210) may determine the user's emergency situation by reflecting the biosignal.
  • To determine an emergency situation precisely, user biometric information such as heart rate and electrocardiogram may be required. Therefore, the glass type wearable device 10 may receive a biosignal from another wearable device capable of measuring biosignals, or may measure the biosignal directly, and reflect the user's biosignal when assessing the situation. For example, when a real-time image of the user suddenly falling to the floor is obtained, the biosignal may be used to determine whether the user merely lay down or actually collapsed.
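  • A minimal sketch of the fall-versus-lying-down decision above, combining a hypothetical image-based event with a heart-rate check. The event names and thresholds are illustrative assumptions.

```python
# Hypothetical sketch: combine an image-analysis event with a biosignal check.
def is_fall_emergency(image_event: str, heart_rate: float,
                      hr_low=40.0, hr_high=140.0) -> bool:
    if image_event != "body_down":
        return False
    # An abnormal heart rate alongside the fall suggests a medical emergency
    # rather than the user simply lying down.
    return heart_rate < hr_low or heart_rate > hr_high

print(is_fall_emergency("body_down", 35.0))   # True  -> likely emergency
print(is_fall_emergency("body_down", 72.0))   # False -> probably lay down
```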
  • FIG. 10 is a flowchart illustrating a method for controlling an indoor electronic device using a glass type wearable device by measuring indoor positioning and gaze direction according to an embodiment of the present invention.
  • the method for controlling an indoor electronic device using the glass type wearable device 10 may include: measuring, by the glass type wearable device, the current indoor location (S300); measuring the gaze direction of the user by recognizing the azimuth angle or elevation angle of the glass type wearable device (S310); determining the electronic device located in the gaze direction from the measured current indoor location (S320); receiving a control command of the electronic device from the user (S330); and transmitting the inputted control command to the electronic device through wireless communication (S340).
  • the indoor electronic device control method using the glass type wearable device according to an embodiment of the present invention will be described in order.
  • the glass type wearable device 10 measures a current indoor location (S300).
  • Various indoor positioning methods may be applied to indoor location recognition of the glass type wearable device 10.
  • the indoor positioning method is not limited to the method described below, and various methods may be applied.
  • a method of measuring using an indoor wireless communication network such as Wi-Fi or beacon may be applied.
  • the glass type wearable device 10 may receive the wireless communication signal to recognize the strength, direction, type, and the like of the signal, and thereby measure an indoor location.
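  • A minimal sketch of signal-strength-based indoor positioning: RSSI is converted to distance with a log-distance path-loss model and the position is trilaterated by least squares. The anchor positions, TX power, and path-loss exponent are illustrative assumptions.

```python
# Hypothetical sketch: RSSI -> distance -> least-squares trilateration.
import numpy as np

def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    # Log-distance path loss: rssi = tx_power - 10*n*log10(d)
    return 10 ** ((tx_power - rssi) / (10 * n))

def trilaterate(anchors, distances):
    """anchors: list of (x, y); distances: measured ranges to each anchor."""
    (x0, y0), d0 = anchors[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        A.append([2 * (xi - x0), 2 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return tuple(sol)

anchors = [(0, 0), (10, 0), (0, 10)]          # beacon positions in metres
rssi = [-65, -70, -75]                         # measured signal strengths
print(trilaterate(anchors, [rssi_to_distance(r) for r in rssi]))
```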
  • In addition, a method may be applied in which a feature element is extracted from the front image or video acquired by the first camera 121 provided on one side of the glass type wearable device 10, and the position of the glass type wearable device 10 on the indoor map is determined based on where that feature element appears.
  • the glass type wearable device 10 recognizes an azimuth angle or an elevation angle to measure the gaze direction of the user (S310).
  • the gaze direction refers to a direction that the face of the user faces to look at a specific electronic device, and the face direction corresponds to the direction that the glass wearable device is facing.
  • the glass type wearable device 10 may measure the azimuth angle and elevation angle using the gyro sensor 131, the geomagnetic sensor, and the acceleration sensor 132.
  • the direction corresponding to the measured elevation angle and azimuth angle corresponds to the user's gaze direction.
  • the glass type wearable device 10 determines an electronic device located in the gaze direction at the measured current indoor location (S320). That is, the glass type wearable device 10 recognizes an electronic device located in a direction viewed from the recognized current indoor location.
  • the glass type wearable device 10 may determine the direction in which the electronic device to be controlled is located by applying the azimuth and elevation angles measured from the recognized current position, and then identify the electronic device located in that direction on the stored indoor map.
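  • A minimal sketch of this lookup: from the measured indoor position and the head azimuth/elevation, a gaze ray is cast and the registered device closest to the ray (within an angular tolerance) is chosen. The device map and tolerance are illustrative assumptions.

```python
# Hypothetical sketch: pick the mapped device nearest the gaze ray.
import math

DEVICE_MAP = {           # assumed indoor map: device -> (x, y, z) in metres
    "living_room_lamp": (4.0, 2.0, 2.2),
    "audio":            (4.2, 1.0, 1.0),
    "tv":               (0.0, 5.0, 1.2),
}

def gaze_vector(azimuth_deg, elevation_deg):
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return (math.cos(el) * math.sin(az),   # x: east
            math.cos(el) * math.cos(az),   # y: north
            math.sin(el))                  # z: up

def device_in_gaze(position, azimuth_deg, elevation_deg, max_angle_deg=10.0):
    g = gaze_vector(azimuth_deg, elevation_deg)
    best, best_angle = None, max_angle_deg
    for name, p in DEVICE_MAP.items():
        v = tuple(pi - qi for pi, qi in zip(p, position))
        norm = math.sqrt(sum(c * c for c in v))
        cosang = sum(a * b for a, b in zip(g, v)) / norm
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cosang))))
        if angle < best_angle:
            best, best_angle = name, angle
    return best

print(device_in_gaze((3.0, 0.0, 1.6), azimuth_deg=27.0, elevation_deg=15.0))
# -> "living_room_lamp"
```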
  • the glass type wearable device 10 receives a control command of an electronic device from a user (S330). The control command may be input in various ways: by recognizing an eye-blink pattern acquired by the second camera 122; by recognizing a motion pattern with the motion sensor; by a touch operation on the touch unit 112; by the user's voice input; or by recognizing the user's hand gesture with the first camera 121.
  • the glass type wearable device 10 transmits the input control command to the electronic device through wireless communication (S340).
  • the glass type wearable device 10 may transmit a control command by a communication method directly connecting to the electronic device.
  • the glass type wearable device 10 may transmit a control command to an electronic device identified by a Wi-Fi Direct method or a Bluetooth method.
  • the glass type wearable device 10 may be connected to a home wireless communication network such as Wi-Fi (WLAN) and transmit a control command to the electronic device. That is, the glass type wearable device 10 connects to the wireless communication network to which at least one electronic device is connected, sends the control command to a wireless access point 50 over that network, and the wireless access point 50 relays the control command to the electronic device to be controlled.
  • the glass type wearable device 10 may be automatically connected to the wireless communication network.
  • the wireless communication network may be actively sensed so that the device connects automatically, allowing the user to control the electronic devices with the glass type wearable device 10 immediately upon entering the home.
  • the method may further include obtaining, by the glass type wearable device, the eyeball gaze direction, and the determining of the control object (S320) may determine, as the control target, the electronic device located in the direction obtained by applying the eyeball gaze direction to the head direction from the measured current indoor position. Since the user wears the glass type wearable device 10 and does not always look straight ahead, the eyeball gaze direction needs to be considered. Accordingly, eye tracking is performed by the second camera 122 to recognize the user's eyeball gaze direction, and the exact location of the electronic device can be recognized by applying the recognized eyeball gaze direction to the head direction given by the measured azimuth and elevation angles.
  • the method may further include: when one or more electronic devices are determined as control candidates, displaying the list of the candidate electronic devices on the glass type wearable device; and the user selecting a specific electronic device from the list. Since a plurality of electronic devices may be located in the direction the user gazes at, and may be located close to each other even when the eye tracking direction is reflected, the glass type wearable device may not be able to determine which of the recognized electronic devices the user wants to control. In this case, the candidate electronic devices need to be presented to the user for selection.
  • the glass type wearable device 10 may display the determined list of one or more electronic devices on the screen. Thereafter, the user may select a specific electronic device from the list, determining the control target. For example, the glass type wearable device may display the list of electronic devices with numbers on the screen, and receive a touch input, voice input, eye-blink input, hand gesture input, motion input, or the like corresponding to a specific number from the user to select the electronic device to be controlled.
  • the method may further include providing a notification to the user by receiving the electronic device control result according to the control command. That is, the glass type wearable device 10 may receive, through wireless communication, the control result from the electronic device that received the control command, and notify the user.
  • the manner of providing a notification to the user may include a method of displaying on the screen by the display unit 310, a method of notifying by a voice output, and the like.
  • the method may further include storing an input method or an input pattern corresponding to a specific control command of the electronic device.
  • the glass type wearable device 10 may receive an input pattern such as a hand gesture pattern, an eye blink pattern, or a moving pattern from a user, select a control command corresponding thereto, and set a corresponding relationship between the input pattern and the control command.
  • the user may set and store a different input method for each control command. For example, a command to turn off the audio may be set to a left-eye blink pattern, and a command to turn on the audio may be stored against a right-eye blink pattern.
  • In addition, a command to skip to the next song during audio playback may be set to a particular head movement and stored. In this way, the user can define commands in the form he or she wants, and the command input method and pattern can be set to suit the user's characteristics.
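  • A minimal sketch of such a per-user mapping from input patterns to control commands. The pattern names and commands are illustrative assumptions.

```python
# Hypothetical sketch: register and resolve user-defined input patterns.
USER_COMMAND_TABLE = {}

def register_pattern(device, pattern, command):
    USER_COMMAND_TABLE[(device, pattern)] = command

def resolve_command(device, pattern):
    return USER_COMMAND_TABLE.get((device, pattern))

register_pattern("audio", "blink_left",     "power_off")
register_pattern("audio", "blink_right",    "power_on")
register_pattern("audio", "head_nod_right", "next_track")

print(resolve_command("audio", "blink_left"))  # -> "power_off"
```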
  • FIG. 11 is a flowchart illustrating a method for controlling an indoor electronic device using a glass type wearable device by analyzing a front image according to an embodiment of the present disclosure.
  • a method of controlling an indoor electronic device using a glass type wearable device may include: obtaining an image corresponding to a gaze direction of a user (S400); determining the electronic device in the image as a control target through image analysis (S410); receiving a control command of the electronic device from the user (S420); and transmitting the inputted control command to the electronic device through wireless communication (S430).
  • the indoor electronic device control method using the glass type wearable device according to an embodiment of the present invention will be described in order. Hereinafter, detailed description of the above-described steps will be omitted.
  • the glass type wearable device 10 obtains an image corresponding to the direction of attention of the user (S400).
  • the first camera 121 of the glass type wearable device 10 acquires an image in a direction (ie, a viewing direction or a forward direction) of the face of the user.
  • the glass type wearable device 10 determines the electronic device in the image as a control target through the image analysis (S410). For example, the glass type wearable device may recognize the electronic device located at the center of the image as the control target. In general, since the electronic device the user wants to control will be located at the center of the front image or video acquired by the first camera 121, the electronic device located at the center of the acquired image can be recognized as the control target.
  • the controller 210 may analyze the image of the electronic device to determine which electronic device is located at the center of the image, and select the electronic device as a control target.
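  • A minimal sketch of the center-of-image heuristic, assuming a hypothetical object detector has already returned labeled bounding boxes for the front image.

```python
# Hypothetical sketch: pick the detected device nearest the image center.
def pick_center_device(detections, width, height):
    cx, cy = width / 2, height / 2
    def center_dist(det):
        (x1, y1, x2, y2) = det["box"]
        bx, by = (x1 + x2) / 2, (y1 + y2) / 2
        return (bx - cx) ** 2 + (by - cy) ** 2
    return min(detections, key=center_dist)["label"] if detections else None

dets = [{"label": "tv",    "box": (100, 80, 500, 320)},
        {"label": "audio", "box": (560, 300, 620, 400)}]
print(pick_center_device(dets, 640, 480))  # -> "tv"
```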
  • In addition, when the glass type wearable device 10 captures an image including the user's hand gesture, it may identify the hand gesture area in the image, extract the electronic device corresponding to that area, and determine it as the control target.
  • the glass type wearable device may determine the electronic device in the direction indicated by the user's finger as the control target.
  • the glass type wearable device 10 may determine an electronic device included in a specific gesture of the user (for example, a circular gesture made by collecting a finger) as a control target. That is, the controller 210 may recognize a specific hand gesture of the user and extract an electronic device image part included in the recognized hand gesture.
  • the image of the extracted electronic device may be analyzed to recognize the electronic device to be controlled.
  • the glass type wearable device 10 receives a control command of the electronic device from the user (S420).
  • the glass type wearable device 10 transmits the input control command to the electronic device through wireless communication (S430).
  • the method may further include acquiring, by the glass type wearable device, an eyeball direction.
  • the electronic device in the region corresponding to the eyeball gaze direction may be extracted from the image and determined as the control target. That is, the glass type wearable device 10 may recognize the user's eyeball gaze direction and calculate the gaze point corresponding to that direction in the image. Thereafter, the glass type wearable device 10 may extract the electronic device located at the gaze point and determine it as the control target. Since the user wears the glass type wearable device 10 and does not always look straight ahead, the eyeball gaze direction needs to be considered. Accordingly, eye tracking by the second camera 122 may be performed to recognize the eyeball gaze direction, and the exact position the user is looking at may be recognized by applying the recognized eyeball gaze direction to the acquired image.
  • the method may further include: when the one or more electronic devices are determined to be controlled, displaying the list of one or more electronic devices determined to be controlled by the glass type wearable device; And selecting a specific electronic device from the electronic devices in the list by the user.
  • the method may further include receiving an electronic device control result according to the control command and notifying the user.
  • the method may further include storing an input method or an input pattern corresponding to a specific control command of the electronic device.
  • FIG. 12 is a flowchart illustrating a method for controlling an indoor electronic device using a glass type wearable device by voice command recognition according to an embodiment of the present invention.
  • a method of controlling an indoor electronic device using the glass type wearable device may include: receiving, from the user, a voice command corresponding to an electronic device selection command and a control command (S500); analyzing the voice command to determine the electronic device to be controlled and the control command (S510); and transmitting the input control command to the selected electronic device through wireless communication (S520).
  • the indoor electronic device control method using the glass type wearable device according to an embodiment of the present invention will be described in order. Hereinafter, detailed description of the above-described steps will be omitted.
  • the glass type wearable device 10 receives an electronic device selection command and a voice command corresponding to the control command from the user (S500).
  • the voice input unit 140 of the glass type wearable device 10 receives a voice command of a user including the name of the control target and the control command.
  • the glass type wearable device 10 analyzes the voice command to determine the electronic device to be controlled and the control command (S510). That is, the voice recognition unit 220 of the glass type wearable device 10 performs voice recognition on the input voice command, identifies the electronic device corresponding to the control target in the voice command, and determines the control command to be transmitted to that electronic device.
  • the voice input unit 140 receives a user's voice command “turn off the living room lamp.”
  • the voice recognition unit 220 interprets the voice command to recognize that the electronic device corresponding to the control object is a living room lamp, and recognizes that the control command desired by the user is off.
  • the manner in which the glass type wearable device 10 receives the user's voice command and recognizes the control target and the control command is not limited thereto, and various methods may be applied.
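  • A minimal keyword-matching sketch of interpreting such a recognized voice command into a target device and control command. The keyword tables are illustrative assumptions; a real implementation would use proper speech and language processing.

```python
# Hypothetical sketch: map recognized text to (device, command) by keywords.
DEVICES  = {"living room lamp": "lamp_livingroom", "audio": "audio", "tv": "tv"}
COMMANDS = {"turn off": "power_off", "turn on": "power_on"}

def parse_voice_command(text):
    text = text.lower()
    device = next((d for k, d in DEVICES.items() if k in text), None)
    command = next((c for k, c in COMMANDS.items() if k in text), None)
    return device, command

print(parse_voice_command("Turn off the living room lamp"))
# -> ('lamp_livingroom', 'power_off')
```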
  • the glass type wearable device 10 transmits the input control command to the selected electronic device through wireless communication (S520).
  • the method may further include receiving a control result of the electronic device according to the control command and notifying the user.
  • the method may further include storing an input method or an input pattern corresponding to a specific control command of the electronic device.
  • the display unit 310 may display the selected electronic device and a control command so that the user may check whether the selection command and the control command of the electronic device are properly input.
  • That is, the electronic device transmits the execution result according to the control command to the glass type wearable device 10, and the glass type wearable device 10 processes the received result information and notifies the user.
  • the notification method may include a method of visually displaying the control result information on the display 310, a method of notifying the user by voice through the sound output unit 320, and the like.
  • FIG. 14 is a block diagram of an indoor electronic device control system using the glass type wearable device 10 according to an exemplary embodiment of the present invention. Hereinafter, detailed description of the previously described configuration will be omitted.
  • an indoor electronic device control system using the glass type wearable device 10 may include a glass type wearable device 10; And a wireless access point 50.
  • the glass type wearable device 10 performs the functions of receiving an electronic device selection command and a control command from the user, and transmitting the selection command and the control command through wireless communication to the electronic device connected to the wireless access point. That is, the glass type wearable device 10 includes a wireless communication unit 250; a control unit 210; and a user input unit 110.
  • the user input unit 110 performs the function of receiving the selection command and the control command for the electronic device that the user wants to control through the glass type wearable device 10.
  • the controller 210 determines an electronic device to be controlled by the user based on the selection command and the control command input by the user input unit 110 and grasps a desired control command. In addition, the controller 210 performs information processing to transmit a control command to the electronic device selected through the wireless communication unit 250.
  • the wireless communication unit 250 connects the glass type wearable device 10 with the wireless access point, and transmits the input selection command and control command of the electronic device through wireless communication. In addition, the wireless communication unit 250 may perform a function of receiving indoor wireless communication signals.
  • The system may further include a first camera 121.
  • the first camera 121 is a camera provided at one side of the glass type wearable device 10 to acquire an image or video in front of the user.
  • the first camera 121 performs the function of acquiring the image or video in front of the user so as to recognize the electronic device that the user gazes at.
  • the controller 210 may recognize an electronic device located in the center of the image or image input by the first camera 121 or may recognize the electronic device present in a specific hand gesture of the user.
  • the first camera 121 performs a function of acquiring a front image or an image for real time indoor positioning of the glass type wearable device 10.
  • the controller 210 may extract a feature element from the front image or the image acquired by the first camera 121, and determine the indoor location based on the location, size, etc. of the feature element.
  • In addition, the system may further include a motion sensor; and a second camera 122.
  • the motion sensor recognizes a user's moving pattern and performs a function of inputting a selection command or a control command of the electronic device. In addition, the motion sensor performs a function of recognizing the direction the user looks.
  • That is, the geomagnetic sensor, the gyro sensor 131, and the like measure the azimuth angle, and the acceleration sensor 132 measures the elevation angle, making it possible to recognize the direction the user looks at from the current indoor location.
  • the second camera 122 is a camera provided on one side of the glass type wearable device 10 to acquire an image or video in the eyeball direction.
  • the second camera 122 recognizes the user's blink pattern and performs a function of inputting a selection command or a control command of the electronic device.
  • the second camera 122 acquires an eyeball-direction image or video, and the glass type wearable device 10 performs eye tracking, so that the user's gaze direction, taking the eyeball direction into account, can be accurately determined.
  • the wireless access point 50 receives a selection command and a control command of the electronic device from the glass type wearable device 10 through wireless communication, and transmits the control command to the selected electronic device.
  • FIG. 15 is an internal configuration diagram of the shell direction recognition system 60 according to an embodiment of the present invention.
  • the shell direction recognition system 60 according to an embodiment of the present invention includes: a shooting specification measuring unit 610; a GPS module 620; a wireless communication unit 630; and a second control unit 640.
  • the shooting specification measuring unit 610 performs a function of measuring the shooting specifications of the gun.
  • An elevation sensor 611 performs a function of recognizing the elevation of the gun.
  • the elevation sensor 611 uses a gravity measurement method.
  • the gravity measurement method measures tilt with respect to gravity by sensing the earth's gravity through a micro electro mechanical system (MEMS) device and a conductive liquid; a method of measuring the inclination according to the conduction state of the liquid may be applied.
  • the azimuth sensor 612 recognizes the azimuth of the gun.
  • the azimuth sensor 612 may be made of a gyro compass.
  • the gyro compass obtains direction by means of a high-speed rotating gyroscope whose axis, under the influence of the earth's rotation, aligns with the earth's rotation axis (true north).
  • the azimuth sensor 612 may be made of an electronic compass.
  • the electronic compass is a device that obtains the magnetic north direction by measuring the earth's magnetic field using a small device using a micro electro mechanical systems (MEMS) technology.
  • In addition, the azimuth sensor 612 may be formed of a plurality of GPS modules; the north-referenced direction is measured by obtaining two GPS fixes and comparing the two sets of location information.
  • the technology for measuring the azimuth is not limited thereto, and various techniques for measuring the azimuth may be applied.
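  • A minimal sketch of the two-GPS azimuth method: the initial great-circle bearing between two antennas mounted along the barrel gives the barrel's azimuth. The coordinates are hypothetical.

```python
# Hypothetical sketch: initial bearing between two GPS fixes (standard formula).
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

# Rear antenna, then muzzle antenna (fixes a few metres apart).
print(round(bearing_deg(37.00000, 127.00000, 37.00003, 127.00004), 1))
```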
  • the GPS module 620 recognizes the position of the gun. Since the location where the shell will land can be calculated from the measured shooting specifications and the range of the gun, the GPS module 620 recognizes the position of the gun so that the landing location can be determined.
  • the second control unit 640 performs a function of processing data in order to transmit the shooting specifications measured from the shooting specification measuring unit 610 through the wireless communication unit 630.
  • the wireless communication unit 630 performs a function of exchanging information with the glass type wearable device. That is, the wireless communication unit 630 transmits the shooting specification information measured by the shooting specification measuring unit 610 to an external device such as the glass type wearable device 10 through wireless communication.
  • The system may further include a power supply unit.
  • the power supply unit supplies power to drive the system.
  • the power supply unit may be charged wirelessly or by wire, or its battery may be replaceable.
  • The system may further include a shell recognition unit.
  • the shell recognition unit recognizes whether or not the shell has been fired.
  • the shell recognition unit may include an acceleration sensor, a vibration sensor, an acoustic sensor, a smoke sensor, and the like.
  • the acceleration sensor is an element that converts an acceleration change in one direction into an electrical signal, and recognizes firing by measuring the acceleration due to recoil when the shell is fired.
  • the acoustic sensor recognizes firing by detecting the explosion sound generated when the shell is fired.
  • the smoke sensor recognizes firing by detecting the smoke generated by the propellant when the shell is fired.
  • the method of recognizing the shelling is not limited thereto, and may be implemented by various sensors capable of identifying characteristics of the shell firing.
  • In the shell direction recognition system 60, the shooting specification measuring unit 610 measures the elevation angle and the azimuth angle as the shooting specifications, and the GPS module 620 measures the current position of the gun.
  • the second control unit 640 processes the measured information for wireless transmission, and the wireless communication unit 630 transmits it to the glass type wearable device.
  • the second controller 640 may receive the terrain information and the range information through the wireless communication unit 630 to perform the predicted landing location calculation.
  • The anticipated bombardment position display system using a glass type wearable device according to an embodiment of the present invention includes: a wireless communication unit 250; a memory 260; a first control unit 210; and a display unit 310.
  • the wireless communication unit 250 performs a function of receiving shooting specification information from the shell direction recognition system 60. In addition, the wireless communication unit 250 may perform a function of receiving terrain information from an external server. In addition, the wireless communication unit 250 may perform a function of receiving the current position information of the gun from the shell direction recognition system 60. The glass type wearable device 10 receiving data from the shell direction recognition system 60 is shown in FIG. 17.
  • the first control unit 210 performs a function of calculating the expected impact location based on the shooting specification information received by the wireless communication unit 250 and the range information stored in the memory 260.
  • the range is the distance the shell flies upon firing. That is, the range information refers to the various information, such as the muzzle velocity of the shell, required for calculating the reach of the shell according to the elevation angle.
  • the first controller 210 identifies the current position from the GPS module 620 of the shell direction recognition system 60 or from the GPS module of the bombardment position display system, and, based on the elevation angle in the received shooting specification information, calculates the range the shell reaches at that elevation.
  • the first controller 210 then applies the azimuth angle of the received shooting specification information to calculate in which direction from the current position, and how far, the shell will fly.
  • the first controller 210 may calculate the predicted impact location by reflecting the terrain information (for example, a map including information such as contour lines or building heights). That is, since the position the shell can reach may vary according to terrain characteristics such as the height of mountains, the first controller 210 may calculate the predicted landing area in consideration of the terrain information.
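  • A minimal sketch of the flat-ground calculation described above: a vacuum-ballistics range from the elevation angle (a deliberate simplification that ignores drag and terrain), projected from the gun position along the measured azimuth. The muzzle velocity is an illustrative assumption.

```python
# Hypothetical sketch: vacuum range + great-circle projection of impact point.
import math

EARTH_R = 6371000.0   # metres
G = 9.80665

def flat_range(muzzle_velocity, elevation_deg):
    # Vacuum range formula: R = v^2 * sin(2*theta) / g  (no drag, flat ground)
    return muzzle_velocity ** 2 * math.sin(2 * math.radians(elevation_deg)) / G

def project(lat, lon, bearing_deg, distance_m):
    # Great-circle destination point from start, bearing and distance.
    p1, l1 = math.radians(lat), math.radians(lon)
    b, d = math.radians(bearing_deg), distance_m / EARTH_R
    p2 = math.asin(math.sin(p1) * math.cos(d) +
                   math.cos(p1) * math.sin(d) * math.cos(b))
    l2 = l1 + math.atan2(math.sin(b) * math.sin(d) * math.cos(p1),
                         math.cos(d) - math.sin(p1) * math.sin(p2))
    return math.degrees(p2), math.degrees(l2)

r = flat_range(muzzle_velocity=500.0, elevation_deg=45.0)   # ~25.5 km
print(round(r), project(37.0, 127.0, bearing_deg=30.0, distance_m=r))
```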
  • the terrain information may be received from an external server by the wireless communication unit 250 or stored in the memory 260 of the glass type wearable device 10.
  • the terrain information may further include real-time acquisition information received by the wireless communication unit 250.
  • the real-time acquisition information may include an image of the current operation area obtained by the reconnaissance plane. Therefore, the first control unit 210 may determine the situation of the building, the forest, the enemy's position and the like through the image transmitted in real time, and calculate the impact point in consideration of this.
  • the real-time acquisition information is not limited thereto, and may include various information obtained in real time with respect to the battlefield situation.
  • the memory 260 stores range information for each type of gun.
  • the memory 260 may also perform a function of storing terrain information (for example, a map including information such as contour lines or building heights).
  • the display unit 310 displays the predicted impact location on a map and shows the expected impact location to the user.
  • an embodiment of the present invention may include a user input unit 110.
  • the user input unit 110 performs a function of inputting the bombardment location information desired by the user.
  • the bombardment location information is the location information of the target point at which the user wants the shell to land; for example, the latitude and longitude of that location.
  • the voice input unit 140 and the voice recognition unit 220 may be included.
  • the voice input unit 140 performs a function of receiving the bombardment location information as a voice command of the user, and the voice recognition unit 220 performs a function of identifying the bombardment location information from the input voice command.
  • an embodiment of the present invention may include a first camera 121.
  • the first camera 121 is a camera provided at one side of the front part of the glass type wearable device to acquire an image or video of the front of the user.
  • the first camera 121 performs a function of acquiring an image or an image to receive the bombardment location information.
  • the first camera 121 obtains an image including the bombardment location information and transmits it to the first controller 210, and the first controller 210 extracts the characters corresponding to the bombardment location information from the image and recognizes the location information (latitude and longitude) from the extracted text.
  • The system may further include a GPS module.
  • the GPS module performs the function of identifying the user's current location. The position of the user firing the gun is within an error margin of the gun's position. Therefore, instead of receiving the current position of the gun from the shell direction recognition system 60 through the wireless communication unit 250, the GPS module may measure the user's current position and deliver it to the first control unit 210 for calculating the predicted impact location.
  • The system may further include an alarm unit 330.
  • the alarm unit 330 performs the function of notifying the user when the bombardment location information input through the user input unit 110 or the first camera 121 matches the predicted impact location information calculated by the first control unit 210.
  • the alarm unit 330 may include a vibration alarm unit 330 for notifying the user by vibration or a sound output unit 320 for notifying the user by sound output.
  • FIG. 16 is a flowchart illustrating a method for predicting shell impact using a glass type wearable device according to an exemplary embodiment of the present invention.
  • a method for providing an expected impact location for a shell using a glass type wearable device includes: receiving current location information (S600); Receiving real-time shooting specification information from the shell direction recognition system (S610); Calculating a predicted landing point based on the shooting specification information and the range information (S620); And displaying the expected impact point on a map and providing the same to the user (S630).
  • the glass type wearable device 10 receives current location information (S600).
  • the current position information may include the glass type wearable device receiving the current position information from the shell direction recognition system, or the glass type wearable device measuring the current position information.
  • For example, the glass type wearable device 10 may receive the location information measured by the GPS module 620 of the shell direction recognition system, or the GPS module of the glass type wearable device 10 may measure the current position.
  • Real-time shooting specification information is received from the shell direction recognition system (S610).
  • the glass type wearable device 10 receives, through wireless communication, the shooting specification information measured by the shooting specification measuring unit 610 of the shell direction recognition system 60.
  • the shooting specification information may include elevation information and azimuth information of the gun.
  • the glass type wearable device 10 calculates an expected landing point based on the shooting specification information and the range information (S620). For example, the glass type wearable device 10 may calculate, from the current position, the range of the shell at the received elevation angle, and then apply the azimuth angle from the received shooting specification information to determine in which direction from the current position, and how far, the shell will fly. Through this, the expected impact point can be calculated.
  • the predicted impact location calculation may be performed not only by the first control unit 210 of the predicted bombardment position display system but also by the second control unit 640 of the shell direction recognition system 60, in which case the glass type wearable device 10 may receive the calculated predicted impact location information from the shell direction recognition system 60.
  • the predicted impact location calculation may be performed by the external server by transmitting the shooting specification information and the current location information to an external server.
  • the glass type wearable device may receive the predicted impact location information calculated by the external server through the wireless communication unit 250.
  • the predicted impact location calculation may calculate the impact location by reflecting the terrain information received by the glass type wearable device 10.
  • the terrain information may mean information on features of a terrain of a specific region or features located in a specific region.
  • the glass type wearable device 10 may receive terrain information data such as a map including contour maps or building heights from an external server through wireless communication.
  • In addition, the glass type wearable device 10 may acquire, through wireless communication, terrain information obtained in real time by a reconnaissance device (for example, a UAV).
  • the glass type wearable device 10 displays the expected impact point on a map and provides the same to the user (S630). That is, the predicted impact location is displayed on a map (for example, a map on which contour lines are displayed) and visually provided to the user through the display unit 310.
  • the predicted impact location calculation step (S620) may include: calculating a first predicted impact location based on the shooting specification information and the range information; receiving terrain information covering the current location and the first predicted impact location; and calculating a second predicted impact location by reflecting the terrain information. Since terrain information can change in real time and the data is very large, it is efficient to receive only the terrain information for the required area. Therefore, the glass type wearable device needs to calculate an approximate predicted impact location first, and then request and receive the terrain information for the corresponding area.
  • the glass type wearable device 10 may calculate the first predicted impact location based on the shooting specification information and the range information. That is, the glass type wearable device 10 may calculate the first predicted impact location, which is the expected impact point over flat ground without considering terrain. Thereafter, the glass type wearable device 10 may request and receive the terrain information covering the current location and the first predicted impact location, and calculate the second predicted impact location by reflecting that terrain information. In the predicted impact location providing step (S630), the second predicted impact location may be displayed on the map as the predicted impact location.
  • the method may further include receiving the map data including the current location or the predicted impact location from an external server.
  • the method may further include obtaining, by the glass type wearable device 10, bombardment location information.
  • the glass type wearable device 10 may obtain the bombardment location information by receiving it from an external server through wireless communication, by recognizing characters corresponding to the bombardment location information in an image obtained by the first camera, or by recognizing the bombardment location information in input voice data.
  • the method may further include displaying the bombardment location information on the map. That is, the glass type wearable device 10 may visually display the bombardment location information together with the predicted impact location information on the map.
  • the method may further include notifying the user when the bombardment location information matches the predicted impact location information.
  • the glass type wearable device may determine in real time, based on the real-time shooting specifications, whether the predicted impact location calculated in real time matches the location of the desired bombardment target.
  • when they match, the glass type wearable device 10 may inform the user that the bombardment location information and the predicted impact location coincide. This allows the user to fire the shell accurately at the desired landing area.
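  • A minimal sketch of this on-target check: the haversine distance between the real-time predicted impact point and the desired bombardment location is compared against a tolerance. The 50 m tolerance is an illustrative assumption.

```python
# Hypothetical sketch: alert when predicted and desired impact points coincide.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    R = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def on_target(predicted, desired, tolerance_m=50.0):
    return haversine_m(*predicted, *desired) <= tolerance_m

if on_target((37.2291, 127.1523), (37.2293, 127.1525)):
    print("vibration + sound alert: shooting specifications on target")
```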
  • the information processing method using the glass type wearable device according to the embodiments of the present invention described above may be implemented as a program (or an application) to be executed in combination with the glass type wearable device 10, which is hardware, and stored in a medium.
  • In order for the glass type wearable device 10 to read the program and execute the methods implemented as a program, the program may include code written in a computer language such as C, C++, JAVA, or machine language that the processor (CPU) of the glass type wearable device 10 can read through its device interface.
  • such code may include functional code related to the functions that define what is necessary to execute the methods, and execution-procedure-related control code necessary for the processor of the glass type wearable device 10 to execute those functions according to a predetermined procedure.
  • such code may further include memory-reference code indicating at which location (address) of the internal or external memory of the glass type wearable device 10 the additional information or media required for the processor to execute the functions should be referenced.
  • In addition, when the processor of the glass type wearable device 10 needs to communicate with a remote computer or server to execute the functions, the code may further include communication-related code specifying how to communicate with the remote computer or server using the communication module of the glass type wearable device 10, and what information or media should be transmitted and received during the communication.
  • the storage medium is not a medium that stores data for a short moment, such as a register, a cache, or a volatile memory, but a medium that stores data semi-permanently and is readable by the device.
  • examples of the storage medium include, but are not limited to, a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
  • the program may be stored in various recording media on various servers accessible by the glass type wearable device 10 or various recording media on the glass type wearable device 10 of the user.
  • the media may also be distributed over network coupled computer systems so that the computer readable code is stored in a distributed fashion.


Abstract

The present invention concerns an information-processing system and method using a wearable device. According to one embodiment of the present invention, the information-processing method using the wearable device comprises the steps of: receiving in-building incident location information; computing an escape route by determining safe emergency-exit location information based on the incident location information; generating guidance information in accordance with the computed escape route; and providing the guidance information to a user. Here, the incident location information is location information about where a specific incident requiring the evacuation of users within a specific building has occurred. According to the present invention, the glasses-type wearable device provides people with a substantially safe route based on incident location information acquired via a wireless communication signal, making it possible to reduce casualties from incidents.

Description

A further object of the present invention is to provide a system and method that calculate an expected impact location based on the shooting specifications obtained by a shell direction recognition system and the range of the gun, and display it to the user of the glass type wearable device so that the shell can be fired accurately at the desired position.
An information processing method using a wearable device according to an embodiment of the present invention includes: receiving event occurrence location information within a building; calculating an escape route by determining safe emergency exit location information based on the event occurrence location information; generating guidance information along the calculated escape route; and providing the guidance information to the user, wherein the event occurrence location information is the location at which a specific event requiring the evacuation of users within a specific building has occurred.
An information processing method using a wearable device according to another embodiment of the present invention includes: obtaining a voice input or motion input of a user; determining an emergency situation by recognizing the voice input or motion input; and transmitting emergency notification information to an emergency contact counterpart through wireless communication.
An information processing method using a wearable device according to yet another embodiment of the present invention includes: measuring, by the glass type wearable device, the current indoor location; measuring the gaze direction of the user by recognizing the azimuth angle or elevation angle of the glass type wearable device; determining the electronic device located in the gaze direction from the measured current indoor location as the control target; receiving a control command of the electronic device from the user; and transmitting the input control command to the electronic device through wireless communication.
An information processing method using a wearable device according to yet another embodiment of the present invention includes: acquiring an image corresponding to the gaze direction of the user; determining the electronic device in the image as the control target through image analysis; receiving a control command of the electronic device from the user; and transmitting the input control command to the electronic device through wireless communication.
An information processing method using a wearable device according to yet another embodiment of the present invention includes: receiving, from the user, a voice command corresponding to an electronic device selection command and a control command; analyzing the voice command to determine the electronic device to be controlled and the control command; and transmitting the input control command to the selected electronic device through wireless communication.
An information processing method using a wearable device according to yet another embodiment of the present invention includes: receiving current location information; receiving real-time shooting specification information from a shell direction recognition system; calculating an expected impact location based on the shooting specification information and the range information; and displaying the expected impact location on a map and providing it to the user.
According to the present invention as described above, the following various effects are obtained.
First, the glass type wearable device provides people with a substantially safe route that avoids the incident location obtained through wireless communication signals, thereby reducing casualties caused by accidents.
Second, even when visibility is reduced by smoke, route guidance is displayed on the display unit of the glass type wearable device located in front of the user's eyes, so that the user can easily find the emergency exit or evacuation site.
Third, a user in an emergency can perform emergency contact without separate manipulation. As a result, crimes or accidents can be reduced by responding to emergencies early.
Fourth, since the glass type wearable device is by its nature always worn by the user, it can always be ready for the user's emergency.
Fifth, since voice inputs and motion inputs are classified by emergency type and the counterpart and contact point for emergency contact are designated in advance, the emergency contact can be made to a counterpart who can respond appropriately to the user's emergency.
Sixth, according to the present invention, electronic devices in the home can be controlled from a distance, eliminating the inconvenience of the user having to walk to the electronic device to control it. For example, the inconvenience of having to get up and go to the light switch to turn off the light while lying in bed can be eliminated.
Seventh, if each electronic device can be connected to the in-home wireless network, it can be controlled using the glass type wearable device without a dedicated remote control for each device.
Eighth, the electronic device can be controlled by simple actions such as eye-blink patterns or head-movement patterns, and the control target can be selected simply by gazing at the device the user wants to control, so electronic devices in the home can be controlled conveniently. For example, if the user lying in bed wants to turn off the audio that is playing, the user can simply gaze at the audio while wearing the glass type wearable device and input the eye-blink pattern corresponding to audio off.
Ninth, the expected impact location of the shell is calculated in real time and displayed to the user of the glass type wearable device, so that the shell can be fired accurately at the desired position in battle.
Tenth, by indicating the expected impact location as the azimuth angle and elevation angle change, the adjustment of azimuth and elevation is visualized, simplifying the setting of the gun's shooting specifications.
Eleventh, since the shell can be fired accurately at the desired position, the killing power of the artillery can be increased.
FIG. 1 is a block diagram of a glass-type wearable device system according to embodiments of the present invention.

FIG. 2 is an exemplary view of a glass-type wearable device according to embodiments of the present invention.

FIG. 3 is a diagram illustrating the connection relationship between a glass-type wearable device, an external server, and an external device according to embodiments of the present invention.

FIG. 4 is a flowchart of a method of providing an escape route using a glass-type wearable device according to an embodiment of the present invention.

FIG. 5 is an exemplary view showing guidance information on the display unit of a glass-type wearable device according to an embodiment of the present invention.

FIG. 6 is an internal configuration diagram of an escape route providing system using beacons and a glass-type wearable device according to an embodiment of the present invention.

FIG. 7 is an internal configuration diagram of an escape route providing system using a control server, beacons, and a glass-type wearable device according to an embodiment of the present invention.

FIG. 8 is a flowchart of an emergency contact method using a glass-type wearable device according to an embodiment of the present invention.

FIG. 9 is a diagram illustrating the connection relationship between a glass-type wearable device and emergency contact counterparts according to an embodiment of the present invention.

FIG. 10 is a flowchart of a method of controlling indoor electronic devices using a glass-type wearable device by indoor positioning and gaze-direction measurement according to an embodiment of the present invention.

FIG. 11 is a flowchart of a method of controlling indoor electronic devices using a glass-type wearable device by front-image analysis according to an embodiment of the present invention.

FIG. 12 is a flowchart of a method of controlling indoor electronic devices using a glass-type wearable device by voice command recognition according to an embodiment of the present invention.

FIG. 13 is an exemplary view showing a recognized electronic device and control commands on the display unit of a glass-type wearable device according to an embodiment of the present invention.

FIG. 14 is an internal configuration diagram of an indoor electronic device control system using a glass-type wearable device according to an embodiment of the present invention.

FIG. 15 is an internal configuration diagram of a shell direction recognition system according to an embodiment of the present invention.

FIG. 16 is a flowchart of a method of providing a predicted shell impact point using a glass-type wearable device according to an embodiment of the present invention.

FIG. 17 is a diagram illustrating the connection relationship between a shell direction recognition system and a glass-type wearable device according to an embodiment of the present invention.

FIG. 18 is an exemplary view showing a predicted point of impact on the display unit of a glass-type wearable device according to an embodiment of the present invention.
Advantages and features of the present invention, and methods of achieving them, will become apparent with reference to the embodiments described below in detail in conjunction with the accompanying drawings. The present invention is not, however, limited to the embodiments disclosed below and may be implemented in various different forms; these embodiments are provided only so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those of ordinary skill in the art to which the invention pertains, the invention being defined only by the scope of the claims. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. As used herein, the singular forms include the plural forms as well, unless the context clearly indicates otherwise. The terms "comprises" and/or "comprising," as used in this specification, do not preclude the presence or addition of one or more other components, steps, operations, and/or elements. Detailed descriptions of well-known functions and configurations that may unnecessarily obscure the subject matter of the present invention are omitted.
FIG. 1 is a block diagram of a glass-type wearable device system 10 according to embodiments of the present invention.

Referring to FIG. 1, the glass-type wearable device system 10 according to an embodiment of the present invention includes all or some of an input unit 100, a user input unit 110, a keyboard 111, a touch pad 112, a camera unit 120, a first camera 121, a second camera 122, a third camera 123, a sensing unit 130, a gyro sensor 131, an acceleration sensor 132, a pressure sensor 133, an iris recognition sensor 134, a heart rate detection sensor 135, an electromyography (EMG) sensor 136, a voice input unit 140, a control unit 210, a voice recognition unit 220, an interface unit 230, a voice-to-text conversion module 240, a wireless communication unit 250, a memory 260, an output unit 300, a display unit 310, a sound output unit 320, an alarm unit 330, and a haptic module 340. The components shown in FIG. 1 are not essential, so a glass-type wearable device having more or fewer components may be implemented.
FIG. 2 is an exemplary view of a glass-type wearable device according to embodiments of the present invention.

The components may be provided inside or on one side of the glass-type wearable device as shown in FIG. 2.

Hereinafter, the components will be described in order.
The input unit 100 is for inputting an audio signal, a video signal, a user's manipulation signal, a biosignal, and the like. The input unit 100 includes the user input unit 110, the camera unit 120, the sensing unit 130, and the voice input unit 140.

The user input unit 110 generates key input data that the user inputs to control the operation of the device. The user input unit 110 may include a keypad, the keyboard 111, a dome switch, the (resistive/capacitive) touch pad 112, a jog wheel, a jog switch, a finger mouse, and the like. In particular, when the touch pad forms a layered structure with the display unit 310 described below, it may be called a touch screen.
The camera 120 is for inputting a video signal or an image signal, and two or more cameras may be provided depending on the configuration of the device. The camera 120 processes image frames, such as still images or moving images, obtained by an image sensor in a video call mode or a photographing mode. The processed image frames may be displayed on the display unit 310, stored in the memory 260, or transmitted to the outside through the wireless communication unit 250. When an image signal or a video signal is used as an input for information processing, the camera 120 transfers the image signal or video signal to the control unit 210.

The camera unit 120 may include one or more cameras depending on the direction or purpose of the captured image. The first camera 121 may be provided at one side of the glass-type wearable device to capture an image of the front. The second camera 122 may be provided at one side of the glass-type wearable device to acquire a video or image in the direction of the eyeball. The third camera 123 may be provided at the rear or side of the glass-type wearable device 10 to acquire a video or image of the rear or side.
The sensing unit 130 detects the current state of the device, such as whether the user is wearing the glass-type wearable device 10 or the position of the device, and generates a sensing signal for controlling the operation of the device. The sensing unit 130 may also serve as an input unit that receives input signals for the information processing of the device, and may perform various sensing functions such as detecting whether an external device is connected.

The sensing unit 130 may include various sensors such as a proximity sensor, the pressure sensor 133, a motion sensor, a fingerprint recognition sensor, the iris recognition sensor 134, the heart rate detection sensor 135, a skin temperature sensor, a skin resistance sensor, and an electrocardiogram sensor.

The proximity sensor detects the presence or absence of an approaching object, or of an object in the vicinity, without mechanical contact. The proximity sensor may detect a nearby object using a change in an alternating magnetic field or a static magnetic field, or using the rate of change of capacitance. Two or more proximity sensors may be provided depending on the configuration.
The pressure sensor 133 may detect whether pressure is applied to the device, the magnitude of that pressure, and the like. The pressure sensor 133 may be installed at a part of the device where pressure detection is needed, depending on the usage environment. If the pressure sensor 133 is installed in the display unit 310, a touch input through the display unit 310 and a pressure-touch input, in which a greater pressure than an ordinary touch is applied, can be distinguished according to the signal output by the pressure sensor 133. The magnitude of the pressure applied to the display unit 310 during a pressure-touch input can also be determined from that signal.
The motion sensor includes one or more of the gyro sensor 131, the acceleration sensor 132, a geomagnetic sensor, and the like, and detects the position or movement of the device using them. The acceleration sensor 132, which may be used in the motion sensor, is an element that converts an acceleration change in one direction into an electrical signal and is widely used with the development of MEMS (micro-electromechanical systems) technology. The gyro sensor 131 measures angular velocity and can detect how far the device has rotated relative to a reference direction.
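As a rough illustration of that last point, a heading relative to a reference direction could be tracked by integrating the gyro's angular-velocity samples over time; the sampling interface below is a hypothetical stand-in for whatever the sensing unit actually exposes.

```python
def track_heading(gyro_samples, dt, initial_heading=0.0):
    """Integrate z-axis angular velocity (deg/s), sampled every `dt`
    seconds, into a heading relative to the initial reference direction."""
    heading = initial_heading
    for omega_z in gyro_samples:
        heading = (heading + omega_z * dt) % 360.0
    return heading

# 100 Hz samples of a device turning at about 90 deg/s for one second.
print(track_heading([90.0] * 100, dt=0.01))  # ~90 degrees from reference
```

In practice such a dead-reckoned heading drifts and would be corrected against the geomagnetic sensor, but the integration step is the core of detecting "how far the device has rotated."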
The heart rate detection sensor 135 measures the change in photoplethysmographic blood flow according to the change in blood vessel thickness caused by the heartbeat. The skin temperature sensor measures skin temperature from a resistance value that changes in response to temperature. The skin resistance sensor measures the electrical resistance of the skin.
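One way such a photoplethysmographic signal could be turned into a beats-per-minute figure is to count pulses in the optical waveform; the threshold-based peak counter below is an illustrative simplification, and the sample waveform is synthetic.

```python
def estimate_bpm(ppg, sample_rate_hz, threshold):
    """Estimate heart rate by counting rising threshold crossings
    (pulses) in a photoplethysmography waveform."""
    beats = 0
    above = False
    for sample in ppg:
        if sample > threshold and not above:
            beats += 1          # rising edge of a pulse
            above = True
        elif sample < threshold:
            above = False
    duration_min = len(ppg) / sample_rate_hz / 60.0
    return beats / duration_min

# A toy 10-second waveform with one pulse per second -> ~60 bpm.
wave = ([0.2] * 40 + [1.0] * 10) * 10
print(estimate_bpm(wave, sample_rate_hz=50, threshold=0.6))
```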
The iris recognition sensor 134 performs the function of recognizing a person using the iris information of the eye, which has characteristics unique to each person. The human iris is complete about 18 months after birth, and the circular iris pattern raised near the inner edge of the iris hardly changes once formed and differs from person to person. Iris recognition therefore turns the characteristics of each person's iris into information and applies it as a security authentication technology; that is, it is an authentication method developed as a means of identifying people by analyzing the shape and color of the iris and the morphology of the retinal capillaries.

The iris recognition sensor 134 encodes the iris pattern, converts it into an image signal, and performs a comparison; the general operating principle is as follows. First, when the user's eye is aligned at a certain distance with the mirror at the center of the iris recognizer, an infrared camera adjusts the focus through a zoom lens. The iris camera then images the user's iris, and the iris recognition algorithm analyzes the light-dark pattern of the iris by region to generate an iris code unique to the individual. Finally, a comparison search is performed as the iris code is registered in the database. The iris recognition sensor 134 may be provided inside the second camera 122 arranged toward the eyeball, in which case the second camera 122 may perform the function of the iris recognition sensor.
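The comparison-search step is commonly framed as a Hamming-distance test between binary iris codes; the sketch below follows that standard framing, with the code length and acceptance threshold chosen purely for illustration rather than taken from this disclosure.

```python
def hamming_distance(code_a: int, code_b: int, bits: int = 2048) -> float:
    """Fraction of differing bits between two binary iris codes."""
    return bin((code_a ^ code_b) & ((1 << bits) - 1)).count("1") / bits

def find_match(probe_code: int, database: dict, threshold: float = 0.32):
    """Return the enrolled identity whose code is closest to the probe,
    if the distance falls below the acceptance threshold."""
    best_id, best_dist = None, 1.0
    for identity, enrolled_code in database.items():
        dist = hamming_distance(probe_code, enrolled_code)
        if dist < best_dist:
            best_id, best_dist = identity, dist
    return best_id if best_dist <= threshold else None

# Toy 2048-bit codes: user A's probe differs from enrollment in 3 bits.
enrolled = {"user_a": (1 << 2048) - 1, "user_b": 0}
probe = ((1 << 2048) - 1) ^ 0b1011
print(find_match(probe, enrolled))   # -> "user_a"
```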
The distance sensor may use a two-point distance measurement method, a triangulation method (using infrared rays or natural light), an ultrasonic method, or the like. As in the conventional triangulation principle, light from the measured object arriving along two paths is reflected by a right-angle prism onto two image sensors, and the distance between the two points is indicated when their relative positions coincide. There is a passive method using natural light and a method of emitting infrared rays. The ultrasonic method determines the distance by transmitting a sharply directional ultrasonic wave toward the object and measuring the time until the reflected wave returns from it; a piezoelectric element is used as the receiving sensor.
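The ultrasonic time-of-flight relationship described here reduces to distance = (speed of sound × round-trip time) / 2. A minimal sketch, assuming the speed of sound in air at room temperature:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C (assumed)

def ultrasonic_distance(round_trip_seconds: float) -> float:
    """Distance to the object from the echo's round-trip time: the pulse
    travels to the target and back, so halve the total path length."""
    return SPEED_OF_SOUND * round_trip_seconds / 2.0

# An echo received 11.66 ms after transmission -> about 2 metres away.
print(ultrasonic_distance(0.01166))
```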
The Doppler radar is a type of radar that uses the phase change of the reflected wave, that is, the Doppler effect of the wave. Doppler radars include continuous-wave radars, which transmit and receive a sine wave that is not pulse-modulated, and pulse radars, which use a square wave pulse-modulated onto the electromagnetic signal waveform.

In a continuous-wave radar, the modulation frequency is taken relatively high to make it easy to obtain the performance of the Doppler frequency filter, so it is unsuitable for long-range radar; however, by selecting the Doppler frequency in the audible band, the movement of a person or vehicle can be reproduced as a stable sound. A pulse radar measures the distance to the target by the time from pulse transmission to reception of the reflected echo. There is also a method called pulse-compression radar, which applies frequency modulation or phase modulation within the transmitted pulse width.
The voice input unit 140 is for inputting a voice signal and may include a microphone. The microphone receives an external sound signal in a call mode, a recording mode, a voice recognition mode, or the like, and processes it into electrical voice data. In the call mode, the processed voice data may be converted into a form transmittable to a mobile communication base station through the wireless communication unit 250 and output. Various noise-cancellation algorithms may be used in the microphone to remove the noise generated while receiving the external sound signal.
The output unit 300 is for outputting an audio signal, an image signal, a video signal, an alarm signal, or the like. The output unit 300 may include the display unit 310, the sound output unit 320, the alarm unit 330, and the haptic module 340.

The display unit 310 displays and outputs information processed by the device. For example, when the device is in a call mode, it displays a UI (User Interface) or GUI (Graphic User Interface) related to the call. When the device is in a video call mode or a photographing mode, it may display the captured or received images individually or simultaneously, along with the UI or GUI.

As described above, when the display unit 310 and the touch pad form a layered structure constituting a touch screen, the display unit 310 may be used as an input device in addition to an output device. If the display unit 310 is configured as a touch screen, it may include a touch screen panel, a touch screen panel controller, and the like.

In addition, the display unit 310 may include at least one of a liquid crystal display, a thin-film-transistor liquid crystal display, an organic light-emitting diode, a flexible display, and a 3D display. Two or more display units 310 may exist depending on the implementation of the device; for example, an external display unit 310 and an internal display unit 310 may be provided in the device at the same time.

The display unit 310 may be implemented as a HUD (Head-Up Display), an HMD (Head-Mounted Display), or the like. An HMD is an image display device worn on the head like glasses that lets the wearer view large images. A HUD is an image display device that projects a virtual image onto glass within the user's visible area.
The sound output unit 320 outputs audio data received from the wireless communication unit or stored in the memory 260 during call signal reception, a call or recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 320 also outputs sound signals related to functions performed by the device, for example a call signal reception sound or a message reception sound. The sound output module 320 may include a speaker, a buzzer, and the like.

The alarm unit 330 outputs a signal for notifying the occurrence of an event of the device. Examples of events occurring in the device include call signal reception, message reception, and key signal input. The alarm unit 330 outputs signals for notifying event occurrence in forms other than an audio or video signal, for example as vibration. The alarm unit 330 may output a signal to notify that a call signal or a message has been received. When a key signal is input, the alarm unit 330 may output a signal as feedback on the key signal input. Through the signals output by the alarm unit 330, the user can recognize the occurrence of an event. A signal for notifying event occurrence in the device may also be output through the display unit 310 or the sound output unit 320.

The haptic module 340 generates various tactile effects that the user can feel. A representative example of the haptic effect generated by the haptic module 340 is a vibration effect. When the haptic module 340 generates vibration as a tactile effect, the intensity and pattern of the generated vibration can be varied, and different vibrations may be synthesized and output together or output sequentially.
The wireless communication unit 250 may include a broadcast receiving module, a mobile communication module, a wireless internet module, a short-range communication module, a location information module, and the like.

The broadcast receiving module receives at least one of a broadcast signal and broadcast-related information from an external broadcast management server through a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, and the like. The broadcast management server may mean a server that generates and transmits at least one of a broadcast signal and broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to a terminal.

The broadcast-related information may mean information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network, in which case it may be received by the mobile communication module. The broadcast-related information may exist in various forms.

The broadcast receiving module receives broadcast signals using various broadcast systems and may receive digital broadcast signals using a digital broadcast system. The broadcast receiving module may also be configured to be suitable not only for such digital broadcast systems but for all broadcast systems providing broadcast signals. The broadcast signal and/or broadcast-related information received through the broadcast receiving module may be stored in the memory 260.

The mobile communication module transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. The radio signals may include a voice call signal, a video call signal, or various forms of data according to text/multimedia message transmission and reception.

The wireless internet module refers to a module for wireless internet access and may be internal or external to the device. Wireless internet technologies such as WLAN (Wireless LAN, Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced) may be used.

The short-range communication module refers to a module for short-range communication. Beacon, Bluetooth, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), ZigBee, and the like may be used as short-range communication technologies.

The location information module refers to a module that receives positioning signals and measures the position of the glass-type wearable device. For example, the location information module may be a GPS (Global Positioning System) module, and the positioning signal may be a GPS signal. The GPS module receives position information from a plurality of GPS satellites.
The memory 260 may store programs for the processing and control of the control unit 210, and may also perform a function of temporarily storing input or output data (for example, messages, still images, moving images, and the like).

The memory 260 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card-type memory (for example, SD or XD memory), RAM, and ROM. The device may also operate web storage that performs the storage function of the memory on the internet. The memory 260 may hereinafter be referred to as the storage unit 260.
The interface unit 230 serves as an interface with all external devices connected to the device. Examples of external devices connected to the device include a wired/wireless headset, an external charger, a wired/wireless data port, a card socket for a memory card or a SIM (Subscriber Identification Module) or UIM (User Identity Module) card, an audio I/O (Input/Output) terminal, a video I/O terminal, and earphones. The interface unit 230 may receive data or power from such external devices and transfer it to each component inside the device, and may allow data inside the device to be transmitted to external devices.
The control unit 210 typically controls the operation of each of the above units and thus controls the overall operation of the device. For example, it performs the related control and processing for voice calls, data communication, and the like. The control unit 210 also performs the function of processing data for multimedia playback, and of processing the data received from the input unit or the sensing unit 130.

In addition, the control unit 210 performs face detection and face recognition. That is, the control unit 210 may include a face detection module and a face recognition module. The face detection module may extract only the face region from the camera image acquired by the camera unit 120; for example, it may extract the face region by recognizing facial features such as the eyes, nose, and mouth. The face recognition module may generate a template by extracting feature information from the extracted face region, and recognize the face by comparing the template with the face information data in a face database.
The control unit 210 may also perform the function of extracting and recognizing characters from the video or image acquired by the camera unit 120. That is, the control unit 210 may include a character recognition module, and optical character recognition (OCR) may be applied as its character recognition method. The OCR method converts the type image of a document, handwritten or machine-printed and obtainable by an image scan, into a format such as computer-editable character codes, and can be implemented in software. For example, the OCR method may compare the input character with several standard pattern characters prepared in advance and select the standard pattern character most similar to it as the recognized character. If the character recognition module has standard pattern characters for various languages, it can read printed type in various languages. This approach is called the pattern matching technique among OCR methods; the OCR method is not limited to it, and various other methods may be applied. The character recognition method of the character recognition module is likewise not limited to OCR, and various methods capable of recognizing already-printed offline characters may be applied.
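A bare-bones rendering of the pattern-matching idea: each glyph image is compared pixel by pixel against the stored standard patterns, and the closest pattern wins. The tiny binary bitmaps and the similarity measure below are illustrative stand-ins for real scanned glyphs, not the module's actual representation.

```python
def similarity(glyph, pattern):
    """Count matching pixels between two equally sized binary bitmaps."""
    return sum(g == p for row_g, row_p in zip(glyph, pattern)
               for g, p in zip(row_g, row_p))

def recognize(glyph, standard_patterns):
    """Pick the standard pattern character most similar to the glyph."""
    return max(standard_patterns,
               key=lambda ch: similarity(glyph, standard_patterns[ch]))

# Toy 3x3 binary patterns for two 'characters'.
patterns = {
    "I": [[0, 1, 0], [0, 1, 0], [0, 1, 0]],
    "L": [[1, 0, 0], [1, 0, 0], [1, 1, 1]],
}
scanned = [[0, 1, 0], [0, 1, 0], [0, 1, 1]]  # a noisy 'I'
print(recognize(scanned, patterns))           # -> "I"
```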
The control unit 210 may also perform the function of recognizing the gaze direction based on the eye-direction video or image acquired by the second camera 122. That is, the control unit 210 may include a gaze analysis module that performs gaze-direction recognition. The direction in which the user is looking can be determined by measuring the user's facing direction and gaze direction and then combining them. The facing direction means the direction of the user's face and may be measured by the gyro sensor 131 or the acceleration sensor 132 of the sensing unit 130. The gaze direction is the direction in which the user's pupil is looking and may be determined by the gaze analysis module. The gaze analysis module may detect the movement of the pupil through analysis of real-time camera images and calculate the direction of the gaze relative to a fixed position reflected on the cornea. For example, the positions of the pupil center and of the corneal reflection of the illumination may be extracted through image processing, and the gaze position calculated from their positional relationship.
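In the pupil-center/corneal-reflection approach just described, the point of regard can be approximated from the vector between the two extracted points after a per-user calibration. The linear mapping below is a common simplification of that positional relationship; the gain values and screen coordinates are assumed for illustration.

```python
def gaze_point(pupil_xy, glint_xy, gain=(12.0, 9.0),
               screen_center=(640, 360)):
    """Map the pupil-centre-to-corneal-reflection vector (in eye-camera
    pixels) to a point of regard on the display via a linear calibration."""
    dx = pupil_xy[0] - glint_xy[0]
    dy = pupil_xy[1] - glint_xy[1]
    return (screen_center[0] + gain[0] * dx,
            screen_center[1] + gain[1] * dy)

# Pupil centre slightly right of and below the corneal glint.
print(gaze_point(pupil_xy=(102, 84), glint_xy=(98, 80)))
# -> (688.0, 396.0): looking right of and below screen centre
```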
The power supply unit receives external power and internal power under the control of the control unit 210 and supplies the power needed for the operation of each component.

The voice recognition unit 220 performs the function of identifying linguistic meaning from speech by automatic means. Specifically, it is the process of identifying words or word sequences and extracting their meaning from an input speech waveform, and is broadly divided into five stages: speech analysis, phoneme recognition, word recognition, sentence interpretation, and meaning extraction. The voice recognition unit 220 may further include a voice evaluation module that compares whether a stored voice and an input voice are identical. The voice recognition unit 220 may also further include the voice-to-text conversion module 240, which converts input speech into text or converts text into speech.
FIG. 3 is a diagram illustrating the connection relationship between the glass-type wearable device 10, the external server 20, and the external device 30 according to embodiments of the present invention.

The glass-type wearable device 10 may perform the entire information processing procedure internally, but the external server 20 may instead perform part of the information processing. The glass-type wearable device 10 may therefore transmit data acquired through the input unit 100, or data on which partial information processing has been performed, to the external server 20 as information processing request data. The glass-type wearable device 10 may receive, through wireless communication, the information processing result data produced by the external server 20, and may provide the received result data to the user in various ways through the output unit 300. The external server 20 may differ according to the service performed by the glass-type wearable device 10.

The glass-type wearable device 10 may provide the information processing result data to the user through its own output unit 300, or may provide it using the external device 30. That is, when the glass-type wearable device 10 performs the entire information processing procedure, the external device 30 may output the information processing result data received from the glass-type wearable device 10. When the external server 20 receives the information processing request data from the glass-type wearable device 10 and performs part of the information processing, the external device 30 may output the information processing result data received from the external server 20. The external device 30 may include various devices such as a smartphone, a tablet PC, a smart TV, and an output unit provided inside a vehicle (for example, a display unit provided on the vehicle glass or an in-vehicle sound output unit).

The glass-type wearable device 10 may also receive a wireless communication signal transmitted from the external device 30 (for example, a beacon signal transmitted from a beacon tag as the wireless communication tag 30), and may perform information processing using the received wireless communication signal.
Hereinafter, a system and method for providing an escape route using a glass-type wearable device according to embodiments of the present invention will be described with reference to the drawings.

FIG. 4 is a flowchart of a method of providing an escape route using a glass-type wearable device according to a preferred embodiment of the present invention.
Referring to FIG. 4, the method of providing an escape route using a glass-type wearable device according to an embodiment of the present invention includes: receiving incident location information within a building (S100); determining safe emergency exit location information based on the incident location information and calculating an escape route (S110); generating guidance information along the calculated escape route (S120); and providing the guidance information to the user (S130). The steps of the method are described below in order.
The glass-type wearable device receives the incident location information within the building (S100). The incident location information is the location at which a specific incident requiring the evacuation of users in the building has occurred. The incident location may be identified through the operation of the sprinklers in each zone, the operation of a fire alarm, a power-outage sensor, a heat detector, an emergency alarm triggered after recognition by a security guard, a failure of surveillance cameras or communication equipment, and the like. For example, if smoke from a fire activates the sprinklers in a specific zone, that zone is determined to be the incident location. The glass-type wearable device 10 receives the incident location information through wireless communication. For example, when the wireless communication signal is a beacon signal transmitted by a beacon 40, the glass-type wearable device 10 may actively sense the beacon signal. The beacons 40 may be attached at various positions in the building and, in particular, may be incorporated in emergency exit indicator lights. The glass-type wearable device 10 can thus receive, at any position, a beacon signal containing the incident location information through the beacons attached at various places inside the building.
The glass-type wearable device 10 determines the safe emergency exit location information based on the incident location information and calculates an escape route (S110). The emergency exit location information is information on the positions of the emergency exits in the building; it may be stored in the glass-type wearable device 10, or it may be included in the beacon signal together with the incident location information and transmitted to the glass-type wearable device 10. The emergency exit may be a passage for evacuating to the outside or, when evacuation to the outside is impossible, a route to a place such as the roof where one can safely take shelter and wait for rescue. The control unit 210 of the glass-type wearable device 10 identifies an emergency exit through which safe evacuation is possible, based on the incident location and the user's current location, and then calculates an escape route by which the user can safely evacuate to the identified exit.
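One plausible way to realize S110 is to model the building as a graph of rooms, corridors, and exits, exclude every node near the incident location, and run a shortest-path search to the nearest remaining exit. The breadth-first search below (unit edge lengths, hypothetical node names) is a minimal sketch of that idea, not the calculation claimed here.

```python
from collections import deque

def escape_route(graph, start, exits, hazard_nodes):
    """Breadth-first search for the shortest corridor path from `start`
    to any emergency exit, skipping nodes near the incident location."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] in exits:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in visited and nxt not in hazard_nodes:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # no safe exit reachable

corridors = {
    "room_301": ["hall_3a"],
    "hall_3a": ["room_301", "hall_3b", "stair_east"],
    "hall_3b": ["hall_3a", "stair_west"],
    "stair_east": ["hall_3a"],
    "stair_west": ["hall_3b"],
}
# Fire detected by the east stairwell: route the user west instead.
print(escape_route(corridors, "room_301",
                   exits={"stair_east", "stair_west"},
                   hazard_nodes={"stair_east"}))
# -> ['room_301', 'hall_3a', 'hall_3b', 'stair_west']
```

The node sequence returned by such a search is exactly the raw material for the turn-by-turn guidance generated in the next step.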
The glass-type wearable device 10 generates guidance information along the calculated escape route (S120). The guidance information may include the route itself, the direction of the next turn, the distance remaining until that turn, precautions while moving, and the like. Based on the escape route, the control unit 210 of the glass-type wearable device 10 generates guidance information including, in order, the direction toward the emergency exit, the distance to each branch point, and so on.

The guidance information is provided to the user (S130). The guidance information may be delivered by displaying it on the display unit 310, by announcing it through sound output, or by conveying it through the intensity and direction of generated vibration, among other methods.
For example, as shown in FIG. 5, the direction of the next turn and the distance remaining until it may be displayed on the display unit 310 of the glass-type wearable device 10. Since the guidance information contains the escape route in order, after each turn the user may request the next piece of guidance from the control unit 210 through the user input unit 110 and move along the escape route while checking the guidance in sequence.

When displaying the guidance information on the display unit 310, the device may also measure the current direction of movement with a motion sensor such as the gyro sensor 131, determine whether the user's direction of movement matches the escape route, and notify the user accordingly.

The method may further include receiving real-time incident progress information through wireless communication, and the guidance provision step (S130) may provide the incident location information or the incident progress information to the user. The incident progress information is information on how the incident is being handled or how it is developing. For example, in the case of a fire, the incident progress information may indicate the spread of the fire (that is, the extent of the building the fire has reached), the degree to which the fire has been suppressed, or whether firefighters have been deployed. That is, when the incident is a fire, the incident progress information may be analyzed by an external server based on at least one of the positions or number of activated sprinklers and the positions or failure times of cameras or communication equipment in the building, and received by the glass-type wearable device through wireless communication. The glass-type wearable device 10 displays the incident location on the map shown on the display unit 310 together with the handling status, so that when the situation worsens people sense the urgency and evacuate quickly, and when the incident is being handled safely the user can recognize this, preventing secondary accidents caused by hasty evacuation.
The method may also include measuring, by the glass-type wearable device 10, its current position through indoor positioning, and the escape route calculation step (S110) may calculate an appropriate emergency exit and escape route based on the indoor location information and the incident location information. As a method of recognizing the user's indoor position, the current position may be surveyed through mutual communication between a wireless communication device such as the beacon 40 and the glass-type wearable device 10. For example, the glass-type wearable device 10 may receive three distinguishable beacon signals and survey the user's current position based on the strengths of the received beacon signals and the positions of the beacons 40 that transmitted them. The position survey method using communication with the beacons 40 is not, however, limited to this, and various methods such as using the directions of the beacon signals may be applied.
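A minimal sketch of the three-beacon idea: convert each received signal strength to an approximate distance with a log-distance path-loss model, then solve the linearized circle equations for the position. The path-loss constants, calibrated 1 m RSSI, and beacon coordinates are illustrative assumptions.

```python
import math

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, path_loss_n=2.0):
    """Log-distance path-loss model: distance in metres from RSSI.
    tx_power_dbm is the calibrated RSSI at 1 m (an assumed value)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_n))

def trilaterate(beacons, distances):
    """Solve for (x, y) from three beacon positions and ranges by
    linearizing the circle equations into a 2x2 linear system."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Beacons at known corners of a 10 m x 10 m floor area.
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [rssi_to_distance(r) for r in (-75.0, -75.0, -75.0)]
print(trilaterate(beacons, dists))  # roughly the middle of the floor
```

Because RSSI fluctuates indoors, a practical implementation would smooth the signal strengths over time before ranging, but the geometric step stays the same.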
As another method of recognizing the user's indoor position, the first camera 121 may be used to recognize characteristic elements such as distinctive terrain or landmarks nearby, and these may be compared against indoor map information stored in the glass-type wearable device 10 or received from an external server through wireless communication, to recognize the user's current position. The first camera 121 is the camera provided at one side of the glass-type wearable device 10 to acquire the front video or image. The first camera 121 thus acquires the front video or image, the characteristic elements within it are extracted, and the current position is determined by locating those elements in the indoor map information. The direction the user is currently facing can also be recognized from the positions of the characteristic elements within the front video or image.

By determining the user's indoor position, the user can be provided, while moving, with appropriate guidance information from the glass-type wearable device 10 corresponding to the current position. The method of determining the user's indoor position is not, however, limited to the above, and various methods of implementing indoor position recognition may be applied.

The glass-type wearable device 10 may then calculate an appropriate emergency exit and escape route based on the indoor location information and the incident location information. In this way, the glass-type wearable device 10 can guide the user to the emergency exit and escape route that best fit the user's position, helping the user escape safely.
The method may also include transmitting the user's real-time location information to an external device through wireless communication. By transmitting the user's recognized position to an external terminal through wireless communication, people outside can grasp the user's position within the building, making rescue easier.
또한, 글라스형 웨어러블 디바이스(10)가 건물 위치정보를 파악하는 단계; 및 상기 건물 위치정보를 바탕으로 상기 건물 내부지도를 외부서버에 요청하여 수신하는 단계;를 더 포함할 수 있다. 글라스형 웨어러블 디바이스(10)가 정확한 탈출경로 계산 및 탈출경로 안내를 수행하기 위해서는 건물 내부지도가 필요하다. 따라서, 글라스형 웨어러블 디바이스(10)는 내부지도를 제공받을 건물을 파악하기 위해 GPS신호 등을 통해 건물 위치정보를 파악할 수 있다. 그 후, 글라스형 웨어러블 디바이스(10)는 위치정보에 대응하는 건물의 내부지도를 외부서버에 요청하여 수신할 수 있다.In addition, the glass-type wearable device 10 to identify the building location information; And requesting and receiving an internal server map from the inside of the building based on the location information of the building. In order for the glass type wearable device 10 to perform accurate escape route calculation and escape route guidance, a building interior map is required. Accordingly, the glass type wearable device 10 may grasp building location information through a GPS signal to identify a building to receive an internal map. Thereafter, the glass type wearable device 10 may request and receive an internal map of a building corresponding to the location information from an external server.
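A minimal sketch of such a request, assuming a hypothetical HTTP map service: the endpoint URL, query parameters, and response shape below are all illustrative assumptions, since the disclosure only states that the map is requested from an external server based on building location.

```python
import requests

MAP_SERVER = "https://maps.example.com/api/indoor"  # hypothetical endpoint

def fetch_indoor_map(lat, lon):
    """Ask the external map server for the interior map of the building
    at the given GPS fix; the JSON layout is an assumed example."""
    resp = requests.get(MAP_SERVER, params={"lat": lat, "lon": lon}, timeout=5)
    resp.raise_for_status()
    return resp.json()  # e.g. {"building_id": ..., "floors": [...]}
```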
FIG. 6 is a configuration diagram of an emergency exit notification system using the beacons 40 and the glass-type wearable device 10 according to an embodiment of the present invention.

Referring to FIG. 6, an emergency exit notification system using the beacons 40 and the glass-type wearable device 10 according to another embodiment of the present invention includes a wireless communication unit 250, a control unit 210, and an output unit 300. In FIG. 6, detailed descriptions of the components already described are omitted.

The wireless communication unit 250 performs the function of receiving event occurrence location information from the beacons 40. In addition, when the glass-type wearable device 10 does not store emergency exit location information, the wireless communication unit 250 may receive the emergency exit location information. The wireless communication unit 250 also performs the function of exchanging information with an external terminal. That is, the wireless communication unit 250 may transmit the user's measured indoor location information to an external terminal.

The control unit 210 performs the functions of recognizing the event occurrence location and the emergency exit location information, calculating an escape route, and generating guidance information according to the escape route. In addition, the control unit 210 may perform the function of recognizing the user's current indoor position, for example through mutual communication with the beacons 40 or through recognition of characteristic elements ahead by the first camera 121. For example, the control unit 210 may measure the user's current position using one or more beacon signals received by the wireless communication unit 250.

The output unit 300 performs the function of notifying the user of the guidance information. The output unit 300 may include a display unit 310 that visually displays the guidance information, a sound output unit 320 that delivers the guidance information to the user by audio output, and a vibration alarm unit that conveys the guidance information to the user through the intensity and direction of generated vibration.
FIG. 7 is an internal configuration diagram of an emergency exit notification system using the external server 20, the beacons 40, and the glass-type wearable device 10 according to an embodiment of the present invention.

Referring to FIG. 7, an emergency exit notification system using the beacons 40 and the glass-type wearable device 10 according to yet another embodiment of the present invention includes the external server 20, the beacons 40, and the glass-type wearable device 10. In FIG. 7, detailed descriptions of the components already described are omitted.

The external server 20 performs the functions of recognizing the event occurrence location, calculating escape routes, and setting the movement route guidance information to be transmitted by each beacon 40. Each beacon 40 performs the function of generating a beacon signal corresponding to the guidance information set by the external server 20.

The glass-type wearable device 10 receives the beacon signal and performs the function of notifying the user of the guidance information. The glass-type wearable device 10 may include a wireless communication unit 250, a control unit 210, and an output unit 300. The wireless communication unit 250 receives the beacon signals from the beacons 40 and exchanges information with the outside. The control unit 210 processes the received beacon signal into a form suitable for output. The output unit 300 notifies the user of the guidance information.

The external server 20 recognizes the event occurrence location, calculates escape routes, and sets the movement route guidance information to be transmitted by each beacon 40. The guidance information may include the direction of the next turn on the movement route, the remaining distance to that turn, and precautions while moving. Each beacon 40 generates a beacon signal corresponding to the guidance information set by the external server 20, and the wireless communication unit 250 of the glass-type wearable device 10 receives the beacon signal. Thereafter, the control unit 210 processes the received beacon signal into a form suitable for output, and the output unit 300 receives the processed data and notifies the user of the movement route guidance information. As the user moves, the device receives the beacon signals, and the user is provided with guidance information appropriate to the user's position.
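A minimal way to realize this division of labor, sketched below under assumed identifiers and field names, is for the server to assign each beacon ID a guidance record (turn direction, remaining distance, caution), which the device looks up whenever it hears that beacon.

```python
# Guidance records the external server might assign per beacon after it has
# computed the escape routes (IDs, fields, and wording are assumptions).
guidance_by_beacon = {
    "beacon-3f-07": {"turn": "left", "distance_m": 12, "caution": "smoke ahead"},
    "beacon-3f-08": {"turn": "straight", "distance_m": 30, "caution": None},
}

def on_beacon_heard(beacon_id):
    """Device-side handler: map the received beacon ID to the guidance
    the server configured for it and format a message for the output unit."""
    info = guidance_by_beacon.get(beacon_id)
    if info is None:
        return None
    msg = f"Turn {info['turn']} in {info['distance_m']} m"
    if info["caution"]:
        msg += f" (caution: {info['caution']})"
    return msg

print(on_beacon_heard("beacon-3f-07"))  # Turn left in 12 m (caution: smoke ahead)
```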
In addition, the control unit 210 may measure the user's current position using one or more beacon signals received by the wireless communication unit 250.
Hereinafter, an emergency contact system, method, and program for emergencies using a glass-type wearable device according to embodiments of the present invention will be described with reference to the drawings.

FIG. 8 is a flowchart of an emergency contact method for emergencies using a glass-type wearable device according to a preferred embodiment of the present invention.

Referring to FIG. 8, an emergency contact method using a glass-type wearable device according to an embodiment of the present invention includes: obtaining a user's voice input or motion input (S200); recognizing the voice input or motion input and determining whether an emergency exists (S210); and transmitting emergency notification information to an emergency contact counterpart through wireless communication (S220). The emergency contact method using a glass-type wearable device according to an embodiment of the present invention is described below in order.
The glass-type wearable device 10 receives the user's voice input or motion input (S200). The glass-type wearable device 10 may receive the user's voice through the voice input unit 140. The voice input may be a scream or a designated emergency signal phrase. For example, in a crime situation, a scream or an emergency signal phrase requesting help, such as 'Help me!', may serve as the voice input.

In addition, the glass-type wearable device 10 may receive surrounding voices according to the user's settings. For example, if the user has a chronic illness and collapses frequently, the device may be set to receive the voice questions that people nearby ask to check the user's condition.

The glass-type wearable device 10 may acquire the user's movement data through a motion sensor. The motion input may correspond to a sudden, atypical movement of the user or a stored head-movement pattern. For example, if the user is being kidnapped, the sudden movement may correspond to an abnormal movement pattern caused by struggling.

The glass-type wearable device 10 recognizes the voice input or motion input and determines whether an emergency exists (S210). In the case of situation determination by voice input, the voice recognition unit 220 extracts the linguistic meaning or tone from the voice input, and based on this the control unit 210 recognizes whether it corresponds to an emergency. In the case of situation determination by motion input, the control unit 210 compares the user's movement information recognized by the motion sensor with stored movement pattern information, and determines that an emergency exists if the movement is recognized as pattern information corresponding to an emergency. In addition, if a movement pattern that does not correspond to a normal movement pattern of the user (for example, a walking pattern or a running pattern) is recognized, the control unit 210 may determine that an emergency exists.
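One simple reading of this motion test, sketched below, is a fall detector over accelerometer magnitudes: a near-free-fall dip followed shortly by a high-g impact is flagged as an emergency pattern. The thresholds and window lengths are assumed values, not figures from the disclosure.

```python
import math

FREE_FALL_G = 0.35   # assumed: magnitude well below 1 g suggests free fall
IMPACT_G = 2.5       # assumed: a spike above this suggests an impact
MAX_GAP = 10         # assumed: impact must follow the dip within 10 samples

def looks_like_fall(samples):
    """samples: [(ax, ay, az), ...] in units of g from the motion sensor.
    Returns True when a free-fall dip is followed closely by an impact."""
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in samples]
    for i, m in enumerate(mags):
        if m < FREE_FALL_G:
            if any(m2 > IMPACT_G for m2 in mags[i + 1:i + 1 + MAX_GAP]):
                return True
    return False
```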
As shown in FIG. 9, the glass-type wearable device 10 transmits the emergency notification information to the emergency contact counterpart through wireless communication (S220). When the control unit 210 recognizes that an emergency exists, it commands the wireless communication unit 250 to perform the emergency contact. Accordingly, the wireless communication unit 250 performs the emergency contact to the outside. The emergency contact may include sending a text message, making a phone call, or a push notification service to a pre-designated contact. Here, the push notification service may be, for example, a push service that delivers a message to users who use the same program on a smartphone. The pre-designated contact may be a friend, a guardian, or the police, but is not limited to these. The contact information may be obtained from the memory of the glass-type wearable device, or may be received from an external server through wireless communication. The emergency contact method is not limited to the methods described, and various methods that can be performed by the wireless communication unit 250 of the glass-type wearable device may be applied.

The method may further include a step of measuring the user's current position, and the emergency contact performing step (S220) may include the measured current location information in the transmitted emergency notification information. The user's current location information may be determined by the GPS module of the glass-type wearable device 10.

The method may further include: determining an emergency classification corresponding to the voice input or motion input; and selecting a specific designated organization or a specific acquaintance as the emergency contact counterpart according to the emergency classification.

The glass-type wearable device 10 may classify and store, in the storage unit 260, data for performing the emergency contact quickly. The data to be included in the storage unit 260 may include the designated organizations, their contact information, and the voice input or motion input data corresponding to each emergency. Accordingly, the storage unit 260 may classify the user's voice inputs or motion inputs by emergency type, and store the designated organization and its contact information in correspondence with each emergency type.

Accordingly, the control unit 210 may determine the emergency classification to which the user's voice input or motion input belongs, and select the designated organization or acquaintance to contact according to the emergency classification. Thereafter, the control unit 210 transmits a command signal to the wireless communication unit 250 to perform the emergency contact to the contact address of the designated organization or acquaintance, and the wireless communication unit 250 performs the emergency contact to that contact address. The contact address is not limited to a telephone number and may include an address to which data can be transmitted by various wireless communication methods.
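A minimal sketch of this classification-to-recipient lookup; the class names, trigger inputs, and numbers below are placeholders, not data from the disclosure.

```python
# Assumed classification tables: trigger -> emergency class -> contact.
CONTACTS = {
    "crime": {"name": "police", "to": "+82-112"},                 # placeholder
    "medical": {"name": "guardian", "to": "+82-10-0000-0000"},    # placeholder
}
TRIGGERS = {"help me!": "crime", "scream": "crime", "fall_pattern": "medical"}

def pick_recipient(recognized_input):
    """Map a recognized voice/motion input to its emergency class and then
    to the stored contact; returns None for inputs that are not triggers."""
    cls = TRIGGERS.get(recognized_input)
    return CONTACTS.get(cls) if cls else None
```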
The method may further include a step of performing real-time forward video recording or real-time audio recording, and the emergency contact performing step (S220) may transmit the recorded video or recorded audio through wireless communication. That is, when performing the emergency contact, the glass-type wearable device 10 may also transmit, through wireless communication, real-time video captured by the first camera 121 or real-time audio acquired by the voice input unit. By transmitting real-time video or audio together, the counterpart receiving the emergency contact (for example, a friend, guardian, firefighter, or police officer) can recognize the emergency the user is facing and respond appropriately to it.

In addition, the emergency contact performing step (S220) may include a step of analyzing the real-time video to determine the emergency classification that has occurred to the user. For example, when real-time video of a sudden drop toward the floor is acquired, the glass-type wearable device 10 may recognize that the user has collapsed and classify the emergency as a medical emergency due to a physical problem.

In addition, in the emergency notification information transmission step (S220), an active recognition signal may be transmitted so that unspecified counterparts within a certain distance can recognize the user's emergency. To receive help quickly in an emergency, it is desirable to request help from people located nearby. Accordingly, the glass-type wearable device 10 may include the emergency notification information in a wireless communication signal capable of active sensing, such as a beacon signal, and broadcast it to the surroundings. Through this, unspecified counterparts can receive the user's emergency notification information through their own external devices 30, so that help can be obtained quickly.

The method may further include a step of outputting, as audio to the outside, a first-aid procedure corresponding to the user's emergency. For example, when the user has collapsed, prompt first aid by people located nearby may be necessary. Accordingly, the glass-type wearable device 10 may generate, as audio output, a first-aid procedure matching the user's emergency and announce it to the people nearby.

The method may further include a step of receiving the user's biosignals from an external wearable device or of the glass-type wearable device acquiring the biosignals itself, and the emergency determination step (S210) may determine the user's emergency by taking the biosignals into account. To grasp the user's exact condition, the user's biometric information, such as heart rate and electrocardiogram, is needed. Accordingly, the glass-type wearable device 10 may receive biosignals from another wearable device capable of measuring them, or may measure the biosignals directly, and may determine the user's situation by reflecting the biosignals. For example, when real-time video of a sudden drop toward the floor is acquired, the device may take the biosignals into account to determine whether the user merely tripped or has actually collapsed.
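The sketch below combines a motion-based fall flag with a heart-rate reading to make the trip-versus-collapse distinction this passage describes; the heart-rate bounds are assumed for illustration only.

```python
def classify_event(fall_detected, heart_rate_bpm):
    """Fuse motion and biosignal evidence (assumed thresholds): a fall with
    an abnormal heart rate is treated as a collapse, a fall with a normal
    heart rate as a trip, and anything else as no emergency."""
    if not fall_detected:
        return "no_emergency"
    if heart_rate_bpm < 45 or heart_rate_bpm > 140:  # assumed abnormal range
        return "collapse_medical_emergency"
    return "trip_monitor_user"

print(classify_event(True, 38))   # collapse_medical_emergency
print(classify_event(True, 82))   # trip_monitor_user
```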
Hereinafter, an indoor electronic device control system and method using a glass-type wearable device according to embodiments of the present invention will be described with reference to the drawings.

FIG. 10 is a flowchart of a method of controlling indoor electronic devices using a glass-type wearable device by indoor positioning and gaze direction measurement according to an embodiment of the present invention.

Referring to FIG. 10, the method of controlling indoor electronic devices using the glass-type wearable device 10 according to an embodiment of the present invention includes: the glass-type wearable device measuring its current indoor position (S300); the glass-type wearable device recognizing an azimuth angle or elevation angle to measure the user's gaze direction (S310); determining the electronic device located in the gaze direction from the measured current indoor position (S320); receiving a control command for the electronic device from the user (S330); and transmitting the input control command to the electronic device through wireless communication (S340). The method of controlling indoor electronic devices using a glass-type wearable device according to an embodiment of the present invention is described below in order.

The glass-type wearable device 10 measures its current indoor position (S300). Various indoor positioning methods may be applied to the indoor location recognition of the glass-type wearable device 10. However, the indoor positioning method is not limited to the methods described below, and various methods may be applied.

As an indoor position measurement method, a method of measuring using an indoor wireless communication network, such as Wi-Fi or beacons, may be applied. The glass-type wearable device 10 may receive the wireless communication signals, recognize their strength, direction, type, and so on, and thereby measure its indoor position.

In addition, a method may be applied in which characteristic elements are extracted from the user's forward video or image acquired by the first camera 121, which is provided on one side of the glass-type wearable device 10 to acquire forward video or images, and the position of the glass-type wearable device 10 is determined on an indoor map containing those characteristic elements.

The glass-type wearable device 10 recognizes the azimuth angle or elevation angle to measure the user's gaze direction (S310). The gaze direction refers to the direction the user's face is turned toward in order to look at a specific electronic device, and this face direction coincides with the direction the glass-type wearable device is facing. The glass-type wearable device 10 may measure the azimuth angle or elevation angle using the gyro sensor 131, a geomagnetic sensor, the acceleration sensor 132, and the like. The direction corresponding to the measured elevation angle and azimuth angle is the user's gaze direction.

The glass-type wearable device 10 determines the electronic device located in the gaze direction from the measured current indoor position (S320). That is, the glass-type wearable device 10 recognizes the electronic device located in the direction it is looking from the recognized current indoor position. The glass-type wearable device 10 may apply the measured azimuth and elevation angles to the recognized current position to determine the direction in which the electronic device the user wants to control is located, and then identify the electronic device located in that direction on the stored indoor map.
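As a rough geometric reading of steps S300 to S320, the sketch below casts a ray from the measured position along the measured azimuth and selects the mapped appliance whose bearing deviates least from that ray, within an angular tolerance. The device coordinate table and the tolerance are assumptions for illustration.

```python
import math

# Assumed indoor-map registry: appliance -> (x, y) coordinates in meters.
DEVICES = {"tv": (4.0, 6.0), "light_living": (1.0, 6.5), "aircon": (6.0, 2.0)}
ANGLE_TOL_DEG = 10.0  # assumed selection tolerance

def device_in_gaze(pos, azimuth_deg):
    """Return the device whose bearing from pos (0 deg = +y axis,
    clockwise) deviates least from the measured azimuth, or None."""
    best, best_dev = None, ANGLE_TOL_DEG
    for name, (dx, dy) in DEVICES.items():
        bearing = math.degrees(math.atan2(dx - pos[0], dy - pos[1])) % 360.0
        dev = abs((bearing - azimuth_deg + 180.0) % 360.0 - 180.0)
        if dev < best_dev:
            best, best_dev = name, dev
    return best

print(device_in_gaze((3.0, 1.0), 10.0))  # tv (bearing ~ 11.3 deg)
```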
The glass-type wearable device 10 receives a control command for the electronic device from the user (S330). The control command may be received in various ways: inputting the control command by recognizing eye blinks acquired by the second camera 122; inputting the control command by recognizing a head-movement pattern with the motion sensor; inputting the control command by a touch operation on the touch unit 112; inputting the control command by the user's voice input; or inputting the control command by the first camera 121 recognizing the user's hand gesture.

The glass-type wearable device 10 transmits the input control command to the electronic device through wireless communication (S340). The glass-type wearable device 10 may transmit the control command by a communication method that connects directly to the electronic device. For example, the glass-type wearable device 10 may transmit the control command to the identified electronic device by Wi-Fi Direct or Bluetooth.

In addition, the glass-type wearable device 10 may be connected to an in-home wireless network such as Wi-Fi (WLAN) and transmit the control command to the electronic device. That is, the glass-type wearable device 10 is connected to the wireless network to which one or more electronic devices are connected, transmits the control command over that network to the wireless access point 50, and the wireless access point 50 transmits the control command to the electronic device to be controlled.

In addition, it may be implemented so that when the user wears the glass-type wearable device 10 and enters the home, the device is automatically connected to the wireless network. By having the glass-type wearable device 10 actively sense the wireless network and connect automatically upon entering the home, the user can control the electronic devices with the glass-type wearable device 10 as soon as the user comes indoors.

The method may further include a step of the glass-type wearable device acquiring the eye gaze direction, and the control target determination step (S320) may determine the electronic device as the control target by applying the angle corresponding to the eye gaze direction to the head-gaze direction measured from the current indoor position. Since the user wearing the glass-type wearable device 10 does not always look straight ahead, the user's eye direction also needs to be considered. Accordingly, eye tracking is performed by the second camera 122 to recognize the user's eye gaze direction, and the recognized eye gaze direction is applied to the head-gaze direction given by the measured azimuth and elevation angles, so that the exact position of the electronic device can be recognized.

In addition, the method may further include: when more than one electronic device is determined as a control target candidate, the glass-type wearable device displaying the list of candidate electronic devices on the screen; and receiving the user's selection of a specific electronic device from the list. A plurality of electronic devices may be located in the direction the user wearing the glass-type wearable device 10 is looking, and even when the eye-tracking gaze direction is reflected, several electronic devices may be located close together, so the device cannot tell which of the recognized electronic devices the user wants to control. In such a case, the plurality of candidate electronic devices needs to be presented to the user for selection. Accordingly, the glass-type wearable device 10 may display the determined list of electronic devices on the screen. Thereafter, the glass-type wearable device 10 may receive the user's selection of a specific electronic device from the list and determine the electronic device to be controlled. For example, the glass-type wearable device may display the list of electronic devices on the screen with numbers, and may receive a touch input, voice input, eye-blink input, hand gesture input, or head-movement input corresponding to a specific number from the user to select the electronic device to be controlled.

The method may further include a step of receiving the result of controlling the electronic device according to the control command and notifying the user. That is, the glass-type wearable device 10 may receive the control result through wireless communication from the electronic device that received the control command, and may notify the user. The notification may be provided, for example, by displaying it on the screen through the display unit 310 or by announcing it through audio output.

The method may further include a step of storing an input method or input pattern corresponding to a specific control command of the electronic device. The glass-type wearable device 10 may receive an input pattern, such as a hand gesture pattern, an eye-blink pattern, or a head-movement pattern, from the user, select the control command corresponding to it, and set the correspondence between the input pattern and the control command.

In addition, the user may set and store a different input method for each control command. For example, when the user wants to control an audio system, the command to turn the audio off may be set as an eye-blink pattern of closing the left eye, and the command to turn it on as an eye-blink pattern of closing the right eye. Likewise, a command to skip to the next track during audio playback may be set as a head-movement direction. This allows commands to be configured in the form the user wants, so the command input method and pattern can be set to suit the user's characteristics.
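A minimal sketch of such a user-configurable binding table; the device names, pattern names, and command strings below are placeholders for whatever the device actually recognizes.

```python
# User-configured bindings: (device, input pattern) -> control command.
bindings = {}

def bind(device, pattern, command):
    """Store the user's chosen input pattern for a control command."""
    bindings[(device, pattern)] = command

def dispatch(device, pattern):
    """Look up the command bound to a recognized input pattern."""
    return bindings.get((device, pattern), "unbound")

bind("audio", "blink_left", "power_off")
bind("audio", "blink_right", "power_on")
bind("audio", "head_turn_right", "next_track")
print(dispatch("audio", "blink_left"))  # power_off
```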
FIG. 11 is a flowchart of a method of controlling indoor electronic devices using a glass-type wearable device by forward image analysis according to an embodiment of the present invention.

Referring to FIG. 11, a method of controlling indoor electronic devices using a glass-type wearable device according to another embodiment of the present invention includes: acquiring an image corresponding to the user's gaze direction (S400); determining the electronic device in the image as the control target through image analysis (S410); receiving a control command for the electronic device from the user (S420); and transmitting the input control command to the electronic device through wireless communication (S430). The method is described below in order; detailed descriptions of the steps already described are omitted.

The glass-type wearable device 10 acquires an image corresponding to the user's gaze direction (S400). That is, the first camera 121 of the glass-type wearable device 10 acquires an image in the direction the user's face is turned (that is, the gaze or forward direction).

The glass-type wearable device 10 determines the electronic device in the image as the control target through image analysis (S410). For example, the glass-type wearable device may recognize the electronic device located at the center of the image as the control target. In general, the electronic device the user wants to control will be located at the center of the forward video or image acquired by the first camera 121, so the electronic device located at the center of the acquired image can be recognized as the control target. The control unit 210 may analyze the image of the electronic device located at the center to determine which electronic device it is and select it as the control target.

In addition, when the glass-type wearable device 10 captures an image containing the user's hand gesture, the glass-type wearable device 10 may identify the user's hand gesture region in the image, extract the electronic device corresponding to the hand gesture region, and determine it as the control target. For example, the glass-type wearable device may determine the electronic device in the direction the user's finger points as the control target. As another example, the glass-type wearable device 10 may determine the electronic device enclosed within a specific gesture of the user (for example, a circle formed with the fingers) as the control target. That is, the control unit 210 may recognize the user's specific hand gesture and extract the portion of the image showing the electronic device contained within the recognized hand gesture. By analyzing the extracted image of the electronic device, the device can recognize the electronic device the user wants to control.

The glass-type wearable device 10 receives a control command for the electronic device from the user (S420).

The glass-type wearable device 10 transmits the input control command to the electronic device through wireless communication (S430).
The method may further include a step of the glass-type wearable device acquiring the eye gaze direction.

In addition, in the control target determination step (S410), the electronic device in the region of the image corresponding to the gaze direction may be extracted and determined as the control target. That is, the glass-type wearable device 10 may recognize the user's eye gaze direction and calculate the gaze point in the image corresponding to that direction. Thereafter, the glass-type wearable device 10 may extract the electronic device located at the gaze point and determine it as the control target. Since the user wearing the glass-type wearable device 10 does not always look straight ahead, the user's eye direction also needs to be considered. Accordingly, eye tracking is performed by the second camera 122 to recognize the user's eye direction, and the recognized eye direction is applied to the measured azimuth and elevation angles so that the exact position of the electronic device can be recognized.

In addition, the method may further include: when more than one electronic device is determined as a control target candidate, the glass-type wearable device displaying the list of candidate electronic devices on the screen; and receiving the user's selection of a specific electronic device from the list.

The method may further include a step of receiving the result of controlling the electronic device according to the control command and notifying the user.

The method may further include a step of storing an input method or input pattern corresponding to a specific control command of the electronic device.
FIG. 12 is a flowchart of a method of controlling indoor electronic devices using a glass-type wearable device by voice command recognition according to an embodiment of the present invention.

Referring to FIG. 12, a method of controlling indoor electronic devices using a glass-type wearable device according to yet another embodiment of the present invention may include: receiving from the user a voice command corresponding to a selection command for the electronic device the user wants to control and a control command (S500); analyzing the voice command to determine the electronic device to be controlled and the control command (S510); and transmitting the input control command to the selected electronic device through wireless communication (S520). The method is described below in order; detailed descriptions of the steps already described are omitted.

The glass-type wearable device 10 receives from the user a voice command corresponding to a selection command for the electronic device the user wants to control and a control command (S500). The voice input unit 140 of the glass-type wearable device 10 receives the user's voice command containing the name of the control target and the control command.

The glass-type wearable device 10 analyzes the voice command to determine the electronic device to be controlled and the control command (S510). That is, the voice recognition unit 220 of the glass-type wearable device 10 performs voice recognition on the input voice command, identifies from the voice command the electronic device corresponding to the control target, and identifies the control command to transmit to that electronic device. For example, the voice input unit 140 receives the user's voice command 'Turn off the living room light.' The voice recognition unit 220 interprets the voice command, recognizes that the electronic device to be controlled is the living room light, and recognizes that the control command the user wants is 'off'. However, the method by which the glass-type wearable device 10 receives the user's voice command and recognizes the control target and the control command is not limited thereto, and various methods may be applied.
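A minimal keyword-matching sketch of step S510, assuming the speech has already been transcribed to text; the vocabulary tables are placeholders standing in for whatever language understanding the voice recognition unit 220 actually performs.

```python
DEVICES = {"living room light": "light_living", "tv": "tv", "audio": "audio"}
ACTIONS = {"turn off": "off", "turn on": "on", "dim": "dim"}

def parse_command(transcript):
    """Pick the control target and command out of a transcribed utterance
    by keyword search, trying longer device names first."""
    text = transcript.lower()
    device = action = None
    for name in sorted(DEVICES, key=len, reverse=True):  # longest match first
        if name in text:
            device = DEVICES[name]
            break
    for name, act in ACTIONS.items():
        if name in text:
            action = act
            break
    return device, action

print(parse_command("Turn off the living room light."))  # ('light_living', 'off')
```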
The glass-type wearable device 10 transmits the input control command to the selected electronic device through wireless communication (S520).

The method may further include a step of receiving the result of controlling the electronic device according to the control command and notifying the user.

The method may further include a step of storing an input method or input pattern corresponding to a specific control command of the electronic device.

In addition, as shown in FIG. 13, the selected electronic device and control command may be displayed on the display unit 310, so that the user can confirm whether the selection command and control command for the electronic device were input correctly.

The method may further include a step of receiving the result of controlling the electronic device according to the control command and notifying the user. To inform the user whether the electronic device received the control command and performed the desired operation correctly, the electronic device transmits the result of executing the control command to the glass-type wearable device 10, and the glass-type wearable device 10 processes the received result information and notifies the user. The notification may be given, for example, by visually displaying the control result information on the display unit 310 or by announcing it to the user by voice through the sound output unit 320.
FIG. 14 is an internal configuration diagram of an indoor electronic device control system using the glass-type wearable device 10 according to an embodiment of the present invention. In FIG. 14, detailed descriptions of the components already described are omitted.

Referring to FIG. 14, an indoor electronic device control system using the glass-type wearable device 10 according to another embodiment of the present invention includes the glass-type wearable device 10 and a wireless access point 50.

The glass-type wearable device 10 receives a selection command for the electronic device the user wants to control and a control command, connects to the wireless access point, and transmits the input selection command and control command for the electronic device through wireless communication. To this end, the glass-type wearable device 10 includes a wireless communication unit 250, a control unit 210, and a user input unit 110.

The user input unit 110 performs the function of receiving the selection command for the electronic device the user wants to control and the control command.

The control unit 210 determines the electronic device the user wants to control based on the selection command and control command input through the user input unit 110, and identifies the desired control command. In addition, the control unit 210 performs information processing to transmit the control command to the selected electronic device through the wireless communication unit 250.

The wireless communication unit 250 connects the glass-type wearable device 10 to the wireless access point and transmits the input selection command and control command for the electronic device through wireless communication. It may also receive indoor wireless communication signals to recognize the real-time indoor position of the glass-type wearable device 10.

The system may further include the first camera 121. The first camera 121 is a camera provided on one side of the glass-type wearable device 10 to acquire forward video or images, and it acquires forward video or images in order to recognize the electronic device the user is looking at. The control unit 210 may recognize the electronic device located at the center of the video or image input by the first camera 121, or recognize the electronic device within the user's specific hand gesture. The first camera 121 also acquires forward video or images for real-time indoor positioning of the glass-type wearable device 10. The control unit 210 may extract characteristic elements from the forward video or image acquired by the first camera 121 and determine the indoor position based on the positions, sizes, and other properties of the characteristic elements.

The system may further include a motion sensor and the second camera 122. The motion sensor recognizes the user's head-movement pattern to input the selection command or control command for the electronic device. The motion sensor also performs the function of recognizing the direction the user is looking: the geomagnetic sensor and the gyro sensor 131 measure the azimuth angle, and the acceleration sensor 132 measures the elevation angle, so that the direction the user is looking from the current indoor position can be recognized.

The second camera 122 is a camera provided on one side of the glass-type wearable device 10 to acquire video or images in the eye direction. The second camera 122 recognizes the user's eye-blink pattern to input the selection command or control command for the electronic device. In addition, the second camera 122 acquires eye-direction video or images so that the glass-type wearable device 10 can perform eye tracking and accurately determine the user's gaze direction, taking the eye direction into account.

The wireless access point 50 receives the selection command and control command for the electronic device from the glass-type wearable device 10 through wireless communication, and transmits the control command to the selected electronic device.

Hereinafter, a shell direction recognition system, method, and program used together with a glass-type wearable device according to embodiments of the present invention will be described with reference to the drawings.
FIG. 15 is an internal configuration diagram of a shell direction recognition system 60 according to an embodiment of the present invention.

Referring to FIG. 15, the shell direction recognition system 60 according to an embodiment of the present invention includes a firing data measurement unit 610, a GPS module 620, a wireless communication unit 630, and a second control unit 640.

The firing data measurement unit 610 measures the firing data of the gun. The firing data measurement unit 610 may include an azimuth sensor 612 and an elevation sensor 611. The elevation sensor 611 recognizes the elevation angle of the gun. The elevation sensor 611 uses a gravity measurement method: either measuring the earth's gravity through a micro-electro-mechanical systems (MEMS) element or the like to determine tilt relative to gravity, or measuring tilt according to the conduction state of a conductive liquid. However, the technique for measuring the elevation angle is not limited thereto, and various elevation-measuring techniques may be applied. The azimuth sensor 612 recognizes the azimuth angle of the gun.

The azimuth sensor 612 may be a gyrocompass. A gyrocompass obtains direction by attaching a weight to the axis of a rapidly spinning gyroscope so that, under the influence of the earth's rotation, it aligns with the earth's rotational axis (true north).

The azimuth sensor 612 may also be an electronic compass. An electronic compass is a device that obtains the magnetic north direction by measuring the earth's magnetic field using a small element based on micro-electro-mechanical systems (MEMS) technology.

The azimuth sensor 612 may also consist of a plurality of GPS modules. This is a method of obtaining two GPS fixes and comparing their position information with each other to measure the grid-north (pointing) direction. However, the technique for measuring the azimuth angle is not limited thereto, and various azimuth-measuring techniques may be applied.
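As a sketch of the dual-GPS approach, the pointing direction can be obtained as the forward azimuth between two fixes taken along the gun barrel, using the standard great-circle bearing formula; the coordinates below are placeholders.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from fix 1 to fix 2 in degrees
    (0 = north, 90 = east), the standard forward-azimuth formula."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

# Two GPS antennas mounted at the breech and muzzle (placeholder fixes).
print(bearing_deg(37.5665, 126.9780, 37.5670, 126.9786))  # roughly 44 deg (NE)
```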
The GPS module 620 recognizes the position of the gun. The impact point can only be computed from the measured firing data and the gun's range once the launch position is known, so the GPS module 620 recognizes the gun's position to establish where the shell is fired from.
The second control unit 640 processes the firing data measured by the firing-data measuring unit 610 so that it can be transmitted through the wireless communication unit 630.
The wireless communication unit 630 exchanges information with the glass-type wearable device. That is, the wireless communication unit 630 transmits the firing data measured by the firing-data measuring unit 610 to an external device such as the glass-type wearable device 10 over wireless communication.
The system may further include a power supply unit. The power supply unit supplies power to drive the system; it may be charged wirelessly or by wire, or use replaceable batteries.
The system may further include a firing recognition unit. The firing recognition unit recognizes whether a shell has been fired, and may include an acceleration sensor, a vibration sensor, an acoustic sensor, a smoke sensor, and the like. The acceleration sensor, an element that converts a change of acceleration along one direction into an electrical signal, measures the recoil acceleration when the shell is fired and thereby recognizes the firing. The acoustic sensor recognizes firing from the blast produced at the moment the shell is fired. The smoke sensor recognizes firing from the smoke produced as the propellant charge explodes. The method of recognizing a firing is not limited to these and may be implemented with any of various sensors capable of capturing the signature of a shell being fired.
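As one possible reading of the acceleration-based recognition, the sketch below flags a firing when the recoil-axis acceleration stays above a threshold for a few consecutive samples. The 8 g threshold and 3-sample window are invented for illustration; a real system would calibrate both per gun.

    def detect_firing(samples, threshold_g=8.0, window=3):
        """Return True if `samples` (a sequence of acceleration magnitudes in g,
        measured along the recoil axis) contains `window` consecutive readings
        at or above `threshold_g` -- a crude recoil signature."""
        run = 0
        for a in samples:
            run = run + 1 if abs(a) >= threshold_g else 0
            if run >= window:
                return True
        return False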
In the shell direction recognition system 60, the firing-data measuring unit 610 measures the firing data (elevation and azimuth) while the GPS module 620 measures the gun's current position. The second control unit 640 processes the measured information for wireless transmission and passes it to the wireless communication unit 630, which transmits it to the glass-type wearable device. The second control unit 640 may also receive terrain information and range information through the wireless communication unit 630 and compute the expected impact point itself.
Hereinafter, a system and method for providing an expected shell impact point using a glass-type wearable device according to an embodiment of the present invention will be described.
An expected-bombardment-position display system using a glass-type wearable device according to an embodiment of the present invention includes a wireless communication unit 250, a memory 260, a first control unit 210, and a display unit 310.
The wireless communication unit 250 receives the firing data from the shell direction recognition system 60. The wireless communication unit 250 may also receive terrain information from an external server, and may receive the gun's current position from the shell direction recognition system 60. The reception of data by the glass-type wearable device 10 from the shell direction recognition system 60 is shown in FIG. 17.
The first control unit 210 calculates the expected impact point based on the firing data received by the wireless communication unit 250 and the range information stored in the memory 260. The range is the distance the shell travels when the gun is fired; accordingly, the range information comprises the various quantities, such as the shell's muzzle velocity, needed to compute how far the shell will travel at a given elevation. The first control unit 210 determines the current position from the GPS module 620 of the shell direction recognition system 60 or from the GPS module of the bombardment-position display system, computes from the received elevation the range the shell will reach at that elevation, and then applies the received azimuth to determine in which direction from the current position the shell will travel that range.
The first control unit 210 may also calculate the expected impact point by reflecting terrain information (for example, a contour map or a map containing building heights). Because the point a shell can reach varies with terrain features such as the height of a mountain, the first control unit 210 may take the terrain information into account when calculating the expected impact point.
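To make the calculation concrete, the sketch below chains a flat-ground range estimate to a great-circle projection along the azimuth. The vacuum-ballistics formula R = v^2 * sin(2θ) / g stands in for the per-gun range table described above, and the muzzle velocity, coordinates, and function names are assumptions of the example, not values from the patent.

    import math

    EARTH_RADIUS_M = 6_371_000.0
    G = 9.80665  # m/s^2

    def flat_ground_range(muzzle_velocity_mps: float, elevation_deg: float) -> float:
        """Vacuum-ballistics range R = v^2 * sin(2*theta) / g -- a first-order
        stand-in for the stored per-gun range table."""
        return muzzle_velocity_mps ** 2 * math.sin(math.radians(2 * elevation_deg)) / G

    def project_point(lat_deg, lon_deg, bearing_deg, distance_m):
        """Great-circle destination reached from (lat, lon) by travelling
        `distance_m` metres along `bearing_deg` (degrees from north)."""
        d = distance_m / EARTH_RADIUS_M
        phi1, lam1 = math.radians(lat_deg), math.radians(lon_deg)
        theta = math.radians(bearing_deg)
        phi2 = math.asin(math.sin(phi1) * math.cos(d) +
                         math.cos(phi1) * math.sin(d) * math.cos(theta))
        lam2 = lam1 + math.atan2(math.sin(theta) * math.sin(d) * math.cos(phi1),
                                 math.cos(d) - math.sin(phi1) * math.sin(phi2))
        return math.degrees(phi2), math.degrees(lam2)

    # 500 m/s muzzle velocity at 30 degrees elevation, fired due east.
    r = flat_ground_range(500.0, 30.0)            # roughly 22 km on flat ground
    print(project_point(37.5665, 126.9780, 90.0, r))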
The terrain information may be received from an external server through the wireless communication unit 250 or stored in the memory 260 of the glass-type wearable device 10. The terrain information may further include real-time information received through the wireless communication unit 250; for example, an image of the current operational area obtained by a reconnaissance aircraft. The first control unit 210 can thus identify buildings, forests, enemy positions, and the like from images transmitted in real time and factor them into the impact-point calculation. The real-time information is not limited to imagery and may include any information obtained in real time about the battlefield situation.
The memory 260 stores range information for each type of gun. The memory 260 may also store terrain information (for example, a contour map or a map containing building heights).
The display unit 310 marks the expected impact point on a map and displays it to the user, as shown in FIG. 18.
An embodiment of the present invention may also include a user input unit 110. The user input unit 110 receives the target position information desired by the user. The target position information is the position where the user wants the shell to strike; for example, the latitude and longitude of that position.
The system may also include a voice input unit 140 and a voice recognition unit 220. The voice input unit 140 receives the target position information as a voice command from the user, and the voice recognition unit 220 extracts the target position information from the entered voice command.
An embodiment of the present invention may also include a first camera 121. The first camera 121 is provided on one side of the front of the glass-type wearable device and captures video or still images of the scene in front of the user. The first camera 121 captures video or images from which the target position information can be read: it acquires an image containing the target position information and passes it to the first control unit 210, which extracts the characters corresponding to the target position information from the image and recognizes the position (latitude and longitude) they denote.
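A hedged sketch of the camera path: assuming the coordinates appear in the frame as printed decimal degrees, an off-the-shelf OCR engine (here pytesseract, one possible choice, not named by the patent) plus a regular expression can recover the latitude/longitude pair. The pattern and helper name are illustrative.

    import re
    import pytesseract  # assumes the Tesseract OCR engine is installed
    from PIL import Image

    COORD_PATTERN = re.compile(r"(-?\d{1,3}\.\d+)\s*[,/]\s*(-?\d{1,3}\.\d+)")

    def read_target_coordinates(image_path: str):
        """OCR a first-camera frame and pull out a 'latitude, longitude' pair
        written in decimal degrees; returns None if nothing matches."""
        text = pytesseract.image_to_string(Image.open(image_path))
        match = COORD_PATTERN.search(text)
        if match is None:
            return None
        return float(match.group(1)), float(match.group(2))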
The system may further include a GPS module, which determines the user's current position. The position of the user firing the gun lies within the error range of the gun's position, so instead of receiving the gun's current position from the shell direction recognition system 60 through the wireless communication unit 250, the GPS module can measure the user's current position and pass it to the first control unit 210 for the expected-impact-point calculation.
The system may further include an alarm unit 330. The alarm unit 330 notifies the user when the target position information entered through the user input unit 110 or the first camera 121 coincides with the expected impact point calculated by the first control unit 210. The alarm unit 330 may include a vibration alarm unit 330 that notifies the user by vibration, or a sound output unit 320 that notifies the user by sound output.
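Matching the expected impact point against the requested target reduces to a distance test. The sketch below uses the haversine distance with an illustrative 50 m tolerance; the patent does not specify a tolerance, so that value is an assumption of the example.

    import math

    def within_tolerance(pred, target, tolerance_m=50.0):
        """True when the predicted impact point and the requested target
        (each a (lat, lon) pair in degrees) lie within `tolerance_m` metres
        of each other, by the haversine distance."""
        (lat1, lon1), (lat2, lon2) = pred, target
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        distance = 2 * 6_371_000.0 * math.asin(math.sqrt(a))
        return distance <= tolerance_m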
FIG. 16 is a flowchart of a method for predicting a shell impact point using a glass-type wearable device according to a preferred embodiment of the present invention.
Referring to FIG. 16, a method for providing an expected shell impact point using a glass-type wearable device according to another embodiment of the present invention includes: receiving current position information (S600); receiving real-time firing data from the shell direction recognition system (S610); calculating the expected impact point based on the firing data and range information (S620); and marking the expected impact point on a map and providing it to the user (S630). The steps are described in order below.
The glass-type wearable device 10 receives the current position information (S600). The glass-type wearable device may receive the current position information from the shell direction recognition system, or may measure the current position itself. For example, the glass-type wearable device 10 may receive the position measured by the GPS module 620 of the shell direction recognition system, or the GPS module of the glass-type wearable device 10 may measure the current position.
Real-time firing data is received from the shell direction recognition system (S610). The glass-type wearable device 10 receives, through wireless communication, the firing data measured by the firing-data measuring unit 610 of the shell direction recognition system 60. The firing data may include the gun's elevation and azimuth.
The glass-type wearable device 10 calculates the expected impact point based on the firing data and the range information (S620). For example, taking the current position as reference, the glass-type wearable device 10 may compute from the received elevation the range the shell will reach at that elevation, and then apply the received azimuth to determine in which direction from the current position the shell will travel that computed range, thereby producing the expected impact point.
The expected-impact-point calculation may be performed not only by the first control unit 210 of the expected-bombardment-position display system 10 but also by the second control unit 640 of the shell direction recognition system 60, in which case the glass-type wearable device 10 receives the computed expected impact point from the shell direction recognition system 60.
The calculation may also be performed by an external server: the firing data and the current position information are transmitted to the server, and the glass-type wearable device receives the expected impact point computed by the external server through the wireless communication unit 250.
The calculation may also reflect terrain information received by the glass-type wearable device 10. Terrain information here means information about the topographic features of a particular area or the features located in it. The glass-type wearable device 10 may receive terrain data, such as a contour map or a map containing building heights, from an external server over wireless communication, and may also obtain, in real time over wireless communication, terrain information acquired by a reconnaissance aircraft (for example, a UAV).
The glass-type wearable device 10 marks the expected impact point on a map and provides it to the user (S630). That is, the expected impact point is drawn on a map (for example, a map showing contour lines) and presented visually to the user through the display unit 310.
The expected-impact-point calculation step (S620) may comprise: calculating a first expected impact point based on the firing data and range information; receiving terrain information covering the current position and the first expected impact point; and calculating a second expected impact point by reflecting that terrain information. Terrain data can change in real time and is very large, so it is efficient to receive terrain information only for the area that is actually needed; the glass-type wearable device therefore computes a rough expected impact point first and then requests and receives the terrain information for that area.
First, the glass-type wearable device 10 calculates the first expected impact point from the firing data and range information; this is the expected impact point on flat ground, with no terrain taken into account. The device then requests and receives terrain information covering the current position and the first expected impact point, and calculates the second expected impact point by reflecting that terrain. In the providing step (S630), the second expected impact point is the one marked on the map as the expected impact point.
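A minimal sketch of this two-pass flow, with the flat-ground solver, terrain request, and terrain-aware solver passed in as placeholders, since the patent leaves those components unspecified:

    def predict_impact(specs, first_pass, fetch_terrain, second_pass):
        """Two-pass impact prediction. `first_pass` is a flat-ground solver
        (e.g. the vacuum-ballistics sketch above), `fetch_terrain` requests
        only the map tile covering the launch point and the first estimate,
        and `second_pass` re-solves against that tile. All three callables
        are illustrative stand-ins, not components defined by the patent."""
        first_estimate = first_pass(specs)                   # pass 1: no terrain
        tile = fetch_terrain(specs["position"], first_estimate)  # only the area needed
        return second_pass(specs, tile)                      # pass 2: terrain-aware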
The method may further include receiving, from an external server, map data covering the current position or the expected impact point.
The method may further include acquiring, by the glass-type wearable device 10, the target position information. The glass-type wearable device 10 may acquire it by receiving it from an external server over wireless communication, by recognizing the characters corresponding to it in an image captured by the first camera, or by recognizing it in the user's voice data.
The method may further include marking the target position information on the map and displaying it; that is, the glass-type wearable device 10 may visually present the target position alongside the expected impact point on the map.
The method may further include notifying the user when the target position information and the expected impact point coincide. The glass-type wearable device can judge in real time whether the expected impact point, continuously recomputed from the real-time firing data, matches the target position the user wants, and the glass-type wearable device 10 can inform the user when they coincide. This lets the user fire the shell accurately at the desired impact point.
The information processing methods using a glass-type wearable device according to the embodiments of the present invention described above may be implemented as a program (or application) to be executed in combination with the glass-type wearable device 10 as hardware, and stored in a medium.
For the glass-type wearable device 10 to read the program and execute the methods implemented in it, the program may include code written in a computer language such as C, C++, JAVA, or machine language that the processor (CPU) of the glass-type wearable device 10 can read through the device interface. Such code may include functional code defining the functions needed to execute the methods, and control code for the execution procedure that the processor of the glass-type wearable device 10 must follow to execute those functions in a predetermined order. The code may further include memory-reference code indicating at which location (address) of the internal or external memory of the glass-type wearable device 10 the additional information or media needed for the processor to execute the functions should be referenced. When the processor of the glass-type wearable device 10 needs to communicate with a remote computer or server to execute the functions, the code may further include communication-related code specifying how to communicate with that remote computer or server using the communication module of the glass-type wearable device 10, and what information or media to send and receive during communication.
The storage medium is not a medium that holds data for a brief moment, such as a register, cache, or memory, but a medium that stores data semi-permanently and is readable by a device. Specific examples of the storage medium include, but are not limited to, ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage. That is, the program may be stored on various recording media on various servers the glass-type wearable device 10 can access, or on various recording media on the user's glass-type wearable device 10. The medium may also be distributed over network-coupled computer systems so that computer-readable code is stored in a distributed fashion.
While preferred embodiments and applications of the present invention have been shown and described above, the present invention is not limited to the specific embodiments and applications described; various modifications can be made by those of ordinary skill in the art without departing from the gist of the invention as claimed, and such modifications should not be understood separately from the technical idea or outlook of the present invention.

Claims (33)

  1. A method of providing an appropriate escape route to a user using a glass-type wearable device, the method comprising:
    receiving event occurrence position information within a building;
    calculating an escape route by determining safe emergency-exit position information based on the event occurrence position information;
    generating guidance information along the calculated escape route; and
    providing the guidance information to the user,
    wherein the event occurrence position information is position information of a specific event, occurring within a specific building, that requires the user's evacuation.
  2. The method of claim 1, wherein the providing of the guidance information includes at least one of displaying the guidance information on a screen to inform the user, informing the user of the guidance information by sound output, and informing the user of the guidance information through the strength and direction of generated vibration.
  3. The method of claim 1, further comprising measuring, by the glass-type wearable device, its current position through indoor positioning,
    wherein the calculating of the escape route calculates an appropriate emergency exit and escape route based on the indoor position information and the event occurrence position information.
  4. The method of claim 3, further comprising transmitting the user's real-time position information to an external device through wireless communication.
  5. The method of claim 3, further comprising:
    determining, by the glass-type wearable device, building position information; and
    requesting and receiving an indoor map of the building from an external server based on the building position information.
  6. The method of claim 1, further comprising receiving real-time event progress information through wireless communication,
    wherein the providing of the guidance information provides the event occurrence position information or the event progress information to the user.
  7. The method of claim 5, wherein, when the event is a fire, the event progress information is analyzed by an external server based on at least one of the positions or number of activated sprinklers and the positions or failure times of cameras or communication equipment in the building, and is received by the glass-type wearable device through wireless communication.
  8. The method of any one of claims 1 to 7, wherein the guidance information includes at least one of movement-path information, the direction of the next path change, the distance remaining until the path change, and precautions while moving.
  9. A method of recognizing a user's emergency and performing an emergency contact using a glass-type wearable device, the method comprising:
    obtaining a voice input or motion input of the user;
    recognizing the voice input or motion input and determining an emergency; and
    transmitting emergency notification information to an emergency contact counterpart through wireless communication.
  10. The method of claim 9, further comprising:
    determining an emergency category corresponding to the voice input or motion input; and
    selecting a specific designated agency or a specific acquaintance as the emergency contact counterpart according to the emergency category.
  11. The method of claim 9 or 10, wherein the voice input is a scream or a designated emergency-signal phrase.
  12. The method of claim 9, further comprising measuring the user's current position,
    wherein the performing of the emergency contact transmits the measured current position information as part of the emergency notification information.
  13. The method of claim 9, further comprising performing real-time video recording of the scene ahead or real-time voice recording,
    wherein the performing of the emergency contact transmits the recorded video or the recorded voice through wireless communication.
  14. The method of claim 13, wherein the performing of the emergency contact further comprises analyzing the real-time video to determine the category of the emergency that has befallen the user.
  15. The method of claim 9, wherein the transmitting of the emergency notification information emits an active-recognition signal so that unspecified persons within a certain distance recognize the user's emergency.
  16. The method of claim 9, further comprising outputting aloud a first-aid procedure corresponding to the user's emergency.
  17. The method of claim 9, further comprising receiving the user's biosignals from an external wearable device or acquiring the biosignals by the glass-type wearable device,
    wherein the determining of the emergency reflects the biosignals in judging the user's emergency.
  18. A method of controlling an indoor electronic device using a glass-type wearable device, the method comprising:
    measuring, by the glass-type wearable device, its current indoor position;
    measuring the user's facing direction by recognizing, by the glass-type wearable device, an azimuth or elevation angle;
    determining, as a control target, the electronic device located in the facing direction from the measured current indoor position;
    receiving a control command for the electronic device from the user; and
    transmitting the entered control command to the electronic device through wireless communication.
  19. The method of claim 18, further comprising obtaining, by the glass-type wearable device, the eye-gaze direction,
    wherein the determining of the control target reflects an angle corresponding to the eye-gaze direction in the facing direction, taking the measured current indoor position as reference.
  20. A method of controlling an indoor electronic device using a glass-type wearable device, the method comprising:
    obtaining an image corresponding to the user's facing direction;
    determining, as a control target, the electronic device in the image through analysis of the image;
    receiving a control command for the electronic device from the user; and
    transmitting the entered control command to the electronic device through wireless communication.
  21. The method of claim 20, further comprising obtaining, by the glass-type wearable device, the eye-gaze direction,
    wherein the determining of the control target extracts the electronic device in the region of the image corresponding to the eye-gaze direction and determines it as the control target.
  22. The method of claim 20, wherein in the determining of the control target, the glass-type wearable device identifies the region of the user's hand gesture included in the image, extracts the electronic device corresponding to the hand-gesture region, and determines it as the control target.
  23. A method of controlling an indoor electronic device using a glass-type wearable device, the method comprising:
    receiving from the user a voice command corresponding to a selection command for the electronic device the user wants to control and a control command;
    analyzing the voice command to determine the electronic device to be controlled and the control command; and
    transmitting the entered control command to the selected electronic device through wireless communication.
  24. The method of any one of claims 18 to 22, further comprising, when one or more electronic devices are determined to be control targets:
    displaying, by the glass-type wearable device, a list of the one or more electronic devices determined to be control targets on the screen; and
    receiving, from the user, a selection of a specific electronic device from the list.
  25. The method of any one of claims 18 to 22, wherein in the receiving of the control command, the glass-type wearable device receives the selection of the control command through at least one of eye-blink recognition, recognition of the user's head-movement pattern, touch input, recognition of the user's voice input, and recognition of the user's hand gesture.
  26. The method of any one of claims 18 to 23, further comprising receiving the result of controlling the electronic device according to the control command and notifying the user.
  27. The method of any one of claims 18 to 23, further comprising storing an input method or input pattern corresponding to a specific control command of the electronic device.
  28. A method of providing an expected impact point according to real-time firing data using a glass-type wearable device, the method comprising:
    receiving current position information;
    receiving real-time firing data from a shell direction recognition system;
    calculating an expected impact point based on the firing data and range information; and
    marking the expected impact point on a map and providing it to the user.
  29. The method of claim 28, wherein in the receiving of the current position information, the glass-type wearable device receives the current position information from the shell direction recognition system, or the glass-type wearable device measures the current position information.
  30. The method of claim 28, wherein the calculating of the expected impact point comprises:
    calculating a first expected impact point based on the firing data and the range information;
    receiving terrain information covering the current position and the first expected impact point; and
    calculating a second expected impact point by reflecting the terrain information,
    and wherein the providing of the expected impact point marks the second expected impact point on the map as the expected impact point.
  31. The method of claim 28, further comprising acquiring, by the glass-type wearable device, target position information.
  32. The method of claim 31, wherein the acquiring of the target position information is performed by at least one of receiving the target position information from an external server through wireless communication, recognizing characters corresponding to the target position information in an image captured by a first camera, and recognizing the target position information in the user's voice data.
  33. The method of claim 31 or 32, further comprising:
    marking the target position information on the map and displaying it; and
    notifying the user when the target position information and the expected impact point coincide.
PCT/KR2015/007914 2014-07-30 2015-07-29 Information-processing system and method using wearable device WO2016018063A2 (en)

Applications Claiming Priority (16)

Application Number Priority Date Filing Date Title
KR10-2014-0097132 2014-07-30
KR10-2014-0097084 2014-07-30
KR20140097132 2014-07-30
KR20140097084 2014-07-30
KR10-2014-0101025 2014-08-06
KR20140101025 2014-08-06
KR10-2014-0110608 2014-08-25
KR20140110608 2014-08-25
KR10-2015-0042550 2015-03-26
KR1020150042547A KR20160017593A (en) 2014-08-06 2015-03-26 Method and program for notifying emergency exit by beacon and wearable glass device
KR10-2015-0042547 2015-03-26
KR1020150042550A KR20160015142A (en) 2014-07-30 2015-03-26 Method and program for emergency reporting by wearable glass device
KR10-2015-0042941 2015-03-27
KR1020150042943A KR20160015143A (en) 2014-07-30 2015-03-27 System for the recognition of cannonball direction, method and program for the prediction of impacting point by wearable glass device
KR1020150042941A KR101728707B1 (en) 2014-08-25 2015-03-27 Method and program for controlling electronic device by wearable glass device
KR10-2015-0042943 2015-03-27

Publications (2)

Publication Number Publication Date
WO2016018063A2 true WO2016018063A2 (en) 2016-02-04
WO2016018063A3 WO2016018063A3 (en) 2016-03-24

Family

ID=55218426

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2015/007914 WO2016018063A2 (en) 2014-07-30 2015-07-29 Information-processing system and method using wearable device

Country Status (1)

Country Link
WO (1) WO2016018063A2 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107101633A (en) * 2017-04-13 2017-08-29 清华大学 A kind of Intelligent worn device that evacuation instruction is presented and evacuation instruction rendering method
CN108846992A (en) * 2018-05-22 2018-11-20 东北大学秦皇岛分校 A kind of method and device that safe early warning can be carried out to hearing-impaired people
CN113034843A (en) * 2021-02-21 2021-06-25 深圳市九象数字科技有限公司 High formwork wireless automatic monitoring system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100724967B1 (en) * 2005-09-28 2007-06-04 삼성전자주식회사 Accident broadcasting guide system for providing accident broadcasting guide service and method thereof
JP2009036621A (en) * 2007-08-01 2009-02-19 Denso Corp In-vehicle route guiding apparatus
KR101018583B1 (en) * 2010-07-14 2011-03-03 김현태 System for prevention of fires
KR101282669B1 (en) * 2012-11-12 2013-07-12 (주)티엘씨테크놀로지 Smart ware for preventing an accident on workshop
KR20140070940A (en) * 2012-11-30 2014-06-11 주식회사 하나아이엔씨 Smart Disaster Services Platform

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107101633A (en) * 2017-04-13 2017-08-29 清华大学 A kind of Intelligent worn device that evacuation instruction is presented and evacuation instruction rendering method
CN108846992A (en) * 2018-05-22 2018-11-20 东北大学秦皇岛分校 A kind of method and device that safe early warning can be carried out to hearing-impaired people
CN113034843A (en) * 2021-02-21 2021-06-25 深圳市九象数字科技有限公司 High formwork wireless automatic monitoring system

Also Published As

Publication number Publication date
WO2016018063A3 (en) 2016-03-24

Similar Documents

Publication Publication Date Title
WO2016133269A1 (en) Wearable device for generating image signal, and system for controlling same
WO2018182217A1 (en) Method for adaptive authentication and electronic device supporting the same
WO2018030799A1 (en) Method for providing parking location information of vehicle and electronic device thereof
WO2019013517A1 (en) Apparatus and method for voice command context
WO2019103212A1 (en) Monitoring system for iot smart terminal in ship using communication network
US10755222B2 (en) Work management apparatus, work defect prevention program, and work defect prevention method
CN107113354A (en) Communication system including headset equipment
KR20160017593A (en) Method and program for notifying emergency exit by beacon and wearable glass device
WO2018026142A1 (en) Method for controlling operation of iris sensor and electronic device therefor
WO2016018063A2 (en) Information-processing system and method using wearable device
KR101684264B1 (en) Method and program for the alarm of bus arriving by wearable glass device
WO2018143509A1 (en) Moving robot and control method therefor
WO2016021907A1 (en) Information processing system and method using wearable device
WO2018048130A1 (en) Content playback method and electronic device supporting same
WO2018021726A1 (en) Electronic device and method for controlling activation of camera module
WO2016006920A1 (en) System and method for processing information using wearable device
KR20180057839A (en) Intelligent robot capable of human recognition and operation method thereof
WO2020246639A1 (en) Method for controlling augmented reality electronic device
KR101728707B1 (en) Method and program for controlling electronic device by wearable glass device
KR101569880B1 (en) Apparatus for generating image signal and controlling system
WO2018097483A1 (en) Motion information generating method and electronic device supporting same
WO2016010328A1 (en) Information processing system and method using wearable device
KR20160053391A (en) System, method and application for confirmation of identity by wearable glass device
WO2018131903A1 (en) Method for detecting marker and electronic device thereof
KR20160015143A (en) System for the recognition of cannonball direction, method and program for the prediction of impacting point by wearable glass device

Legal Events

Date Code Title Description
NENP Non-entry into the national phase in:

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15828123

Country of ref document: EP

Kind code of ref document: A2

122 Ep: pct application non-entry in european phase

Ref document number: 15828123

Country of ref document: EP

Kind code of ref document: A2