KR20160015142A - Method and program for emergency reporting by wearable glass device - Google Patents


Info

Publication number
KR20160015142A
KR20160015142A KR1020150042550A KR20150042550A
Authority
KR
South Korea
Prior art keywords
emergency
user
wearable device
voice
unit
Prior art date
Application number
KR1020150042550A
Other languages
Korean (ko)
Inventor
한성철
엄정한
김진영
이경현
김대중
김석기
유철현
김주천
김주원
Original Assignee
넥시스 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 넥시스 주식회사 filed Critical 넥시스 주식회사
Priority to PCT/KR2015/007914 priority Critical patent/WO2016018063A2/en
Publication of KR20160015142A publication Critical patent/KR20160015142A/en


Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00: Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01: Alarm systems in which the location of the alarm condition is signalled to a central station, characterised by the transmission medium
    • G08B25/10: Alarm systems in which the location of the alarm condition is signalled to a central station, characterised by the transmission medium, using wireless transmission systems
    • G08B21/00: Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02: Alarms for ensuring the safety of persons
    • G08B21/04: Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438: Sensor means for detecting
    • G08B21/0453: Sensor means worn on the body to detect health condition by physiological monitoring, e.g. electrocardiogram, temperature, breathing
    • G08B21/0476: Cameras to detect unsafe condition, e.g. video cameras
    • G08B23/00: Alarms responsive to unspecified undesired or abnormal conditions
    • G08B3/00: Audible signalling systems; Audible personal calling systems
    • G08B3/10: Audible signalling systems using electric transmission; using electromagnetic transmission
    • G08B3/1008: Personal calling arrangements or devices, i.e. paging systems
    • G08B3/1016: Personal calling arrangements or devices using wireless transmission
    • G08B3/1025: Paging receivers with audible signalling details
    • G08B3/1033: Paging receivers with audible signalling details, with voice message alert

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Gerontology & Geriatric Medicine (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Cardiology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Physiology (AREA)
  • Pulmonology (AREA)
  • Telephone Function (AREA)

Abstract

The present invention relates to an emergency reporting method and program using a glass-type wearable device.
An emergency reporting method using a glass-type wearable device according to an embodiment of the present invention includes: acquiring a voice input or a motion input of the user (S100); recognizing the voice input or motion input and determining that an emergency has occurred (S200); and transmitting emergency notification information to an emergency contact via wireless communication (S300).
According to the present invention, a user in an emergency can make an emergency report without any special operation, so crimes and accidents can be reduced by responding to emergencies early. Further, because the glass-type wearable device is by nature worn at all times, the user is always prepared for an emergency. In addition, because voice and motion inputs are classified by type of emergency and a contact is designated for each type, the report can be sent to a party capable of responding appropriately to the user's particular emergency.

Description

METHOD AND PROGRAM FOR EMERGENCY REPORTING BY WEARABLE GLASS DEVICE

BACKGROUND OF THE INVENTION [0001]

The present invention relates to an emergency reporting method and program using a glass-type wearable device, and more particularly to a method and program for recognizing a motion or voice of a user in an emergency so that an emergency report can be made automatically to a police station or a fire department.

Recently, wearable devices have been emerging. Glass-type devices that link to a smartphone have appeared, and forms that can operate independently of a smartphone are also emerging.

Meanwhile, a typical smartphone-based security program sends an emergency message either when the user directly presses an emergency button on the smartphone, or automatically when the user fails to close the program by a scheduled arrival time.

The former has the problem that the user may be unable to operate the smartphone, while the latter cannot transmit the situation immediately after an emergency occurs and must wait until the program's timer expires. Neither is therefore suitable in an emergency.

In addition, conventional route-based apps require the user to mark the route on a map directly, which makes input inconvenient and detection inaccurate.

A smartphone-based crime prevention and emergency notification system is also known from patent document 10-2014-0007118. That invention uses a smartphone equipped with various sensors to automatically send a message or place an emergency call, including the current location, to a designated destination in a security or emergency situation, enabling an emergency request even when the user cannot operate the phone. However, a smartphone may not be carried on the body: it can fall out of a pocket, be left in a bag, or be snatched in a dangerous situation. In those cases its sensors cannot recognize the user's danger.

SUMMARY OF THE INVENTION It is an object of the present invention to provide an emergency reporting method and program using a glass-type wearable device that enable an emergency report to be made automatically to a police station or a fire department.

According to an aspect of the present invention, there is provided an emergency reporting method using a glass-type wearable device, the method comprising: obtaining a voice input or a motion input of the user; recognizing the voice input or motion input and determining that an emergency has occurred; and transmitting emergency notification information to an emergency contact via wireless communication.

The method may further include: determining an emergency category corresponding to the voice input or motion input; and selecting a specific designated institution or a specific acquaintance as the emergency contact according to that category.
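As an illustration only (the patent does not give an implementation), the classification-and-routing step described above might be sketched as follows. The category names, keywords, and contacts are assumptions, not values from the patent:

```python
from typing import Optional

# Hypothetical emergency categories mapped to contacts; the patent only
# says a designated institution or acquaintance is selected per category.
EMERGENCY_CONTACTS = {
    "fire": "fire department",
    "crime": "police station",
    "medical": "designated guardian",
}

def classify_input(recognized_text: str) -> Optional[str]:
    """Crude keyword-based classification of a recognized utterance."""
    keywords = {"fire": "fire", "thief": "crime", "ambulance": "medical"}
    for word, category in keywords.items():
        if word in recognized_text.lower():
            return category
    return None

def select_contact(category: str) -> str:
    """Pick the emergency contact designated for this category."""
    return EMERGENCY_CONTACTS[category]
```

For example, a recognized scream of "Fire!" would be routed to the fire department contact, while an unrecognized utterance yields no report.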

Further, the voice input may be a scream or a designated emergency signal phrase.

The method may further comprise measuring the current location of the user, and the reporting step may include the measured position in the emergency notification information.
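A minimal sketch of bundling the measured position into the notification payload; the field names and JSON encoding are assumptions for illustration:

```python
import json

def build_alert(category: str, latitude: float, longitude: float) -> str:
    """Serialize emergency notification info, including the current position."""
    payload = {
        "type": "emergency",
        "category": category,
        "location": {"lat": latitude, "lon": longitude},
    }
    return json.dumps(payload)
```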

The method may further include capturing real-time video or recording real-time audio of the scene in front of the user, and the reporting step may transmit the captured video or recorded audio via wireless communication.

The reporting step may further include analyzing the real-time video to determine the category of emergency the user is experiencing.

In addition, the notification step may transmit an active recognition signal so that unspecified parties within a predetermined distance can recognize the user's emergency.

The method may further include outputting, by voice, an emergency response method corresponding to the user's emergency to people nearby.

The method may further include receiving the user's biosignal from an external wearable device or acquiring it from the glass-type wearable device itself, and the emergency determination step may take the biosignal into account when determining the user's emergency.
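One way the biosignal could factor into the determination step is sketched below; the heart-rate thresholds are assumptions, not values from the patent:

```python
# Combine the biosignal reading with the voice/motion trigger when
# determining an emergency. Thresholds are illustrative assumptions.
def is_emergency(heart_rate_bpm: float, trigger_detected: bool,
                 hr_low: float = 40.0, hr_high: float = 140.0) -> bool:
    """Flag an emergency on an explicit trigger or an abnormal heart rate."""
    abnormal_hr = heart_rate_bpm < hr_low or heart_rate_bpm > hr_high
    return trigger_detected or abnormal_hr
```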

An emergency reporting program for a glass-type wearable device according to another embodiment of the present invention is combined with hardware to execute the above-mentioned method and is stored in a medium.

According to the present invention as described above, the following various effects are obtained.

First, a user in an emergency can make an emergency report without any special operation. As a result, crimes and accidents can be reduced by responding to emergencies early.

Second, since the glass-type wearable device is by nature worn at all times, the user is always prepared for an emergency.

Third, since voice and motion inputs are classified by type of emergency and a contact is designated for each type, the report can be sent to a party capable of responding appropriately to the user's particular emergency.

FIG. 1 is an internal configuration diagram of an emergency reporting system using a glass-type wearable device according to an embodiment of the present invention.
FIG. 2 is a flow chart of an emergency reporting method using a glass-type wearable device according to an embodiment of the present invention.
FIG. 3 is a perspective view of a glass-type wearable device according to an embodiment of the present invention.
FIG. 4 is a diagram showing the connection between the glass-type wearable device and the emergency contact.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The advantages and features of the present invention, and the manner of achieving them, will become apparent from the embodiments described below in conjunction with the accompanying drawings. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art, the invention being defined only by the scope of the claims. Like reference numerals refer to like elements throughout the specification.

Unless defined otherwise, all terms (including technical and scientific terms) used herein have the meanings commonly understood by one of ordinary skill in the art to which this invention belongs. Commonly used terms defined in dictionaries are not to be interpreted ideally or excessively unless explicitly defined otherwise.

The terminology used herein is for the purpose of describing embodiments and is not intended to limit the present invention. In this specification, singular forms include plural forms unless the context indicates otherwise. The terms "comprises" and/or "comprising" as used in the specification do not exclude the presence or addition of one or more elements other than those stated.

FIG. 1 is an internal configuration diagram of an emergency reporting system using a glass-type wearable device according to an embodiment of the present invention. FIG. 2 is a flow chart of an emergency reporting method using a glass-type wearable device according to an embodiment of the present invention. FIG. 3 is a perspective view of a glass-type wearable device according to an embodiment of the present invention. FIG. 4 is a diagram showing the connection between the glass-type wearable device and the emergency contact.

FIGS. 1 to 4 show a system 100, a user input unit 110, an application 111, a keyboard 112, a voice input unit 113, a touch pad 114, a GPS signal unit 115, a short-range communication unit 116, a camera unit 120, a first camera 121, a second camera 122, a third camera 123, a sensing unit 130, a gyro sensor 131, an acceleration sensor 132, a pressure sensor 133, an iris recognition sensor 134, a heartbeat detection sensor 135, an electromyogram sensor 136, a control unit 210, a voice recognition unit 220, a situation evaluation unit 230, a voice-to-text conversion module 240, a wireless communication unit 250, a memory 260, an interface unit 270, an output unit 300, a display unit 310, a sound output unit 320, an alarm unit 330, a haptic module 340, and the contact party's device 400.

FIG. 1 is an internal configuration diagram of a glass-type wearable device system according to an embodiment of the present invention.

The glass-type wearable device system 100 may include a user input unit 110, an application 111, a keyboard 112, a voice input unit 113, a touch pad 114, a GPS signal unit 115, a short-range communication unit 116, a camera unit 120, a first camera 121, a second camera 122, a third camera 123, a sensing unit 130, a gyro sensor 131, an acceleration sensor 132, a pressure sensor 133, an iris recognition sensor 134, a heartbeat detection sensor 135, an electromyogram sensor 136, a control unit 210, a voice recognition unit 220, a situation evaluation module 230, a voice-to-text conversion module 240, a wireless communication unit 250, a memory 260, an interface unit 270, an output unit 300, a display unit 310, a sound output unit 320, an alarm unit 330, and a haptic module 340, as shown in FIG. 1. The glass-type wearable device system 100 may further include other components.

The camera 120 inputs a video or image signal, and one or more cameras may be provided according to the configuration of the device. The camera 120 processes image frames, such as still images or moving images, obtained by the image sensor in a video call mode or a photographing mode. The processed image frames can be displayed on the display unit 310, stored in the memory 260, or transmitted to the outside through the wireless communication unit 250. When an image or video signal is used as an input for information processing, the signal is transmitted to the control unit 210.

The camera unit 120 may include one or more cameras according to the direction or purpose of the images to be captured. The first camera 121 is provided at one side of the glass-type wearable device to capture images of the area in front of the user. The second camera 122 may be provided at one side of the device to capture images in the direction of the user's eye. The third camera 123 is disposed at the rear or side of the device and can capture rearward or lateral images.

The voice input unit 113 inputs a voice signal and may include a microphone. The microphone receives an external acoustic signal in a call mode, a recording mode, or a voice recognition mode and converts it into electrical voice data. In the call mode, the processed voice data can be converted into a form transmittable to a mobile communication base station through the mobile communication unit and output. Various noise-canceling algorithms may be used to remove noise picked up while receiving the external acoustic signal.

The user input unit 110 generates key input data the user enters to control the operation of the device. The user input unit 110 may include a keypad, a keyboard, a dome switch, a (resistive/capacitive) touch pad, a jog wheel, a jog switch, and a finger mouse. In particular, when the touch pad forms a mutual layer structure with the display unit 310 described later, it may be called a touch screen.

The sensing unit 130 senses the current state of the device, such as its open/closed state, its position, and whether the user is wearing it, and generates a sensing signal for controlling the operation of the device. The sensing unit 130 may also serve as an input unit that receives input signals for information processing, and may perform various sensing functions, such as recognizing connection to an external device.

The sensing unit 130 may include a proximity sensor, a pressure sensor 133, a motion sensor, a fingerprint recognition sensor, an iris recognition sensor 134, a heartbeat detection sensor 135, a skin temperature sensor, a skin resistance sensor, and the like.

The proximity sensor detects the presence of an approaching or nearby object without mechanical contact, using a change in an alternating or static magnetic field or the rate of change of capacitance. Two or more proximity sensors may be provided depending on the configuration.

The pressure sensor 133 can detect whether pressure is applied to the device and the magnitude of that pressure. It may be installed in any part of the device where pressure needs to be detected, depending on the use environment. When the pressure sensor 133 is installed on the display unit 310, an ordinary touch input through the display unit 310 and a pressure touch input, which applies greater force, can be distinguished, and the magnitude of the pressure applied during a pressure touch can be determined from the signal output by the pressure sensor 133.
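The touch/pressure-touch distinction described above can be sketched as a simple threshold test; the threshold value is an assumed calibration constant, not from the patent:

```python
# Distinguish an ordinary touch from a pressure touch using the pressure
# sensor output. The threshold is an illustrative calibration value.
def classify_touch(pressure_reading: float, threshold: float = 0.5) -> str:
    """Return 'pressure_touch' when the applied force exceeds the threshold."""
    return "pressure_touch" if pressure_reading > threshold else "touch"
```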

The motion sensor includes at least one of an acceleration sensor 132, a gyro sensor 131, and a geomagnetic sensor, and detects the position and movement of the device. The acceleration sensor 132 converts an acceleration change along one direction into an electrical signal and is widely used thanks to advances in MEMS (micro-electromechanical systems) technology. The gyro sensor 131 measures angular velocity and can sense the direction of rotation relative to a reference direction.
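A minimal sketch of how gyro output yields rotation relative to a reference direction: integrating the angular-velocity samples over time gives the rotation angle, and its sign gives the direction. The sampling interval is an assumption:

```python
# Integrate gyro angular-velocity samples (deg/s) taken at a fixed
# interval to estimate rotation relative to the reference direction.
def rotation_angle_deg(angular_velocity_dps, dt_s: float) -> float:
    """Sum of w * dt over samples; sign indicates rotation direction."""
    return sum(w * dt_s for w in angular_velocity_dps)
```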

The heartbeat detection sensor 135 measures the change in optical blood flow that accompanies the change in blood vessel thickness caused by the heartbeat. The skin temperature sensor measures skin temperature via a resistance value that changes with temperature. The skin resistance sensor measures the electrical resistance of the skin.
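The optical blood-flow measurement above is essentially a photoplethysmography (PPG) signal; a toy estimate of heart rate counts signal peaks over a time window. Real devices band-pass filter the signal first, and the threshold here is an assumption:

```python
# Estimate heart rate from an optical blood-flow (PPG) signal by counting
# local peaks above a threshold. Purely illustrative; no filtering.
def estimate_bpm(samples, sample_rate_hz: float, threshold: float) -> float:
    peaks = 0
    for prev, cur, nxt in zip(samples, samples[1:], samples[2:]):
        if cur > threshold and cur > prev and cur >= nxt:
            peaks += 1
    duration_s = len(samples) / sample_rate_hz
    return peaks * 60.0 / duration_s
```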

The iris recognition sensor 134 identifies a person using the iris information of the eye, which is unique to each individual. The human iris is fully formed about 18 months after birth, and the circular iris pattern raised near its inner edge remains almost unchanged once formed and differs from person to person. Iris recognition thus applies information technology to security using these distinct iris characteristics: it is an authentication method that identifies people by analyzing the shape and color of the iris and the morphology of the retinal capillaries.

The iris recognition sensor 134 encodes the iris pattern and converts it into a video signal for comparison. The general operating principle is as follows. First, when the user's eye is aligned with the mirror at the center of the iris recognizer at a certain distance, an infrared camera adjusts focus through a zoom lens. After the iris camera captures the user's iris as an image, the iris recognition algorithm analyzes the pattern of the iris region and generates an iris code unique to the user. Finally, the iris code is compared against those registered in the database.
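The final comparison step is commonly done with a normalized Hamming distance between binary iris codes; this sketch assumes that approach, and the 0.32 match threshold is an assumption, not a value from the patent:

```python
# Compare two equal-length binary iris codes by normalized Hamming distance.
def hamming_distance(code_a: str, code_b: str) -> float:
    diff = sum(a != b for a, b in zip(code_a, code_b))
    return diff / len(code_a)

def is_match(code_a: str, code_b: str, threshold: float = 0.32) -> bool:
    """Declare a match when the fraction of differing bits is small enough."""
    return hamming_distance(code_a, code_b) < threshold
```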

Distance sensors include two-point distance measurement, triangulation (infrared or natural light), and ultrasonic types. In the conventional triangulation principle, light from the target arriving along two paths is reflected by a rectangular prism onto two image sensors, and the distance is read off when the relative positions of the two images match. Variants use natural light (passive type) or emit infrared. The ultrasonic method transmits a sharply directional ultrasonic wave toward the target and measures the time until the reflected wave returns to find the distance; a piezoelectric element is used as the receiving sensor.
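The ultrasonic time-of-flight calculation is simply distance = (speed of sound x round-trip time) / 2; the speed value below assumes air at roughly 20 degrees Celsius:

```python
# Ultrasonic ranging: the echo travels to the target and back, so the
# one-way distance is half the round-trip path.
SPEED_OF_SOUND_M_S = 343.0  # assumes air at about 20 degrees C

def ultrasonic_distance_m(round_trip_time_s: float) -> float:
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0
```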

Doppler radar uses the Doppler effect of a wave, that is, the phase change of the reflected wave. Doppler radars include continuous-wave radars, which transmit and receive an unmodulated sinusoidal wave, and pulse radars, which use a pulse-modulated square-wave signal as the electromagnetic waveform.

In a continuous-wave radar, the modulation frequency is relatively high to obtain good Doppler frequency filtering, so it is unsuitable for long ranges, but because the Doppler frequency falls in the audible band, the motion of a human body or vehicle can be reproduced as a stable sound. A pulse radar measures the distance to the target by the time from pulse transmission to reception of the reflected echo. There is also a technique called pulse compression, which applies frequency or phase modulation within the transmitted pulse width.
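The two radar measurements just described reduce to short formulas: a pulse radar ranges by echo delay (distance = c * t / 2), and a Doppler radar infers radial speed from the frequency shift of the reflected wave. This sketch assumes a monostatic radar geometry:

```python
C_M_S = 3.0e8  # approximate propagation speed of the electromagnetic wave

def pulse_range_m(echo_delay_s: float) -> float:
    """Round-trip echo delay to one-way range."""
    return C_M_S * echo_delay_s / 2.0

def doppler_speed_m_s(shift_hz: float, carrier_hz: float) -> float:
    # For a monostatic radar, shift = 2 * v * f / c, so v = shift * c / (2 * f).
    return shift_hz * C_M_S / (2.0 * carrier_hz)
```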

The output unit 300 outputs an audio signal, a video signal, or an alarm signal. It may include a display unit 310, a sound output module 320, an alarm unit 330, a haptic module 340, and the like.

The display unit 310 displays information processed by the device. For example, when the device is in the call mode, it displays a UI (User Interface) or GUI (Graphical User Interface) associated with the call. In the video call mode or photographing mode, it can display captured or received images, individually or simultaneously, together with the UI and GUI.

Meanwhile, as described above, when the display unit 310 and the touch pad form a mutual layer structure constituting a touch screen, the display unit 310 can be used as an input device as well as an output device. In that case it may include a touch screen panel, a touch screen panel controller, and the like.

In addition, the display unit 310 may be a liquid crystal display, a thin-film-transistor liquid crystal display, an organic light-emitting diode display, a flexible display, or a three-dimensional (3D) display. There may be two or more display units 310 depending on the implementation of the device; for example, the device may include an external display unit 310 and an internal display unit 310 at the same time.

The display unit 310 may be implemented as a head-up display (HUD), a head-mounted display (HMD), or the like. An HMD is an image display device worn on the head, like glasses, that lets the user view large images. A HUD is a video display device that projects a virtual image onto glass within the user's field of view.

The sound output unit 320 outputs audio data received from the wireless communication unit or stored in the memory 260 in a call signal reception mode, a call or recording mode, a voice recognition mode, a broadcast reception mode, and the like. The sound output module 320 also outputs sound signals related to functions performed by the device, such as call signal reception tones and message reception tones, and may include a speaker, a buzzer, and the like.

The alarm unit 330 outputs a signal notifying the occurrence of an event in the device. Examples of such events include receiving a call signal, receiving a message, and key signal input. The alarm unit 330 can output the notification in a form other than an audio or video signal, for example as vibration. It may output a signal when a call signal or a message is received, or as feedback when a key signal is input, and the user can recognize the event through this signal. A signal notifying the occurrence of an event may also be output through the display unit 310 or the sound output unit 320.

The haptic module 340 generates various tactile effects the user can feel, a typical example being vibration. The intensity and pattern of the vibration generated by the haptic module 340 can be varied, and different vibrations may be combined and output together or output sequentially.

The wireless communication unit 250 may include a broadcast receiving module, a mobile communication module, a wireless Internet module, a short distance communication module, and a GPS module.

The broadcast receiving module receives at least one of a broadcast signal and broadcast-related information from an external broadcast management server through a broadcast channel, which may be a satellite channel, a terrestrial channel, or the like. The broadcast management server may be a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to the terminal.

The broadcast-related information may be information about a broadcast channel, a broadcast program, or a broadcast service provider. It can also be provided through a mobile communication network, in which case it is received by the mobile communication module. Broadcast-related information can exist in various forms.

The broadcast receiving module receives broadcast signals using various broadcast systems and can receive digital broadcast signals using a digital broadcast system. It may also be configured to suit any broadcasting system that provides broadcast signals, not just digital systems. Broadcast signals and/or broadcast-related information received through the broadcast receiving module may be stored in the memory 260.

The mobile communication module transmits and receives radio signals to and from at least one of a base station, an external terminal, and a server on a mobile communication network. Here, the wireless signal may include various types of data according to a voice call signal, a video call signal, or a text / multimedia message transmission / reception.

The wireless Internet module is a module for wireless Internet access and can be built into the device or attached externally. Wireless Internet technologies such as WLAN (Wi-Fi), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and LTE (Long Term Evolution) can be used.

The short-range communication module 116 is a module for short-range communication. Beacon, Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra-Wideband (UWB), ZigBee, and the like can be used as short-range communication technologies.

The GPS (Global Positioning System) module 115 receives position information from a plurality of GPS satellites.

The memory 260 may store programs for the processing and control performed by the control unit 210, and may temporarily store input or output data (e.g., messages, still images, moving images).

The memory 260 may include at least one type of storage medium among flash memory, hard disk, multimedia card micro, card-type memory (e.g., SD or XD memory), RAM, and ROM. The device may also use web storage that performs the storage function of the memory 260 on the Internet.

The memory 260 may also be referred to as a storage unit 260 hereinafter.

The interface unit 270 serves as an interface with all external devices connected to the device. Examples include wired/wireless headsets, external chargers, wired/wireless data ports, card sockets for a memory card or a Subscriber Identification Module (SIM) or User Identity Module (UIM) card, audio I/O (input/output) jacks, video I/O (input/output) jacks, and earphones. The interface unit 270 may receive data or power from such an external device, deliver it to the components inside the device, and transmit data from inside the device to the external device.

The control unit 210 typically controls the operation of each unit and thereby the overall operation of the device, for example voice calls, data communication, and video calls. It also processes data for multimedia playback and data input from the input unit or the sensing unit 130.

In addition, the control unit 210 performs face detection and face recognition. That is, the control unit 210 may include a face detection module and a face recognition module. The face detection module can extract only the face region from the camera image acquired by the camera unit 120; for example, it can locate the face region by recognizing characteristic facial elements such as the eyes, nose, and mouth. The face recognition module extracts feature information from the extracted face region to generate a template, and recognizes the face by comparing the template with the face information data in a face database.
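
The extract-template-compare flow described above can be sketched as follows. This is an illustrative sketch, not the claimed device's actual algorithm: the block-average template, the lighting normalization, and the distance threshold are all assumptions made for the example.

```python
import numpy as np

def make_template(face_region, size=8):
    """Reduce an extracted face region to a small, lighting-normalized template."""
    h, w = face_region.shape
    face = face_region[:h - h % size, :w - w % size].astype(float)
    bh, bw = face.shape[0] // size, face.shape[1] // size
    t = face.reshape(size, bh, size, bw).mean(axis=(1, 3))  # block averages
    return (t - t.mean()) / (t.std() + 1e-9)                # normalize brightness

def recognize(face_region, database, threshold=0.5):
    """Compare the probe template against a face database; None if no match."""
    probe = make_template(face_region)
    best, best_dist = None, float("inf")
    for name, template in database.items():
        dist = float(np.sqrt(((probe - template) ** 2).mean()))
        if dist < best_dist:
            best, best_dist = name, dist
    return best if best_dist < threshold else None
```

A stored template per enrolled person plays the role of the face database; an unknown face is rejected by the threshold rather than forced onto the nearest entry.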

In addition, the control unit 210 may extract and recognize characters within an image acquired by the camera unit 120. That is, the control unit 210 may include a character recognition module, to which an optical character recognition (OCR) method can be applied. OCR converts an image of handwritten or printed type, obtained for example by image scanning, into character codes that a computer can edit, and can be implemented in software. For example, an OCR method may compare an input character against a set of standard pattern characters prepared in advance and select the standard pattern character most similar to the input. If the character recognition module includes standard pattern characters for several languages, printed characters in those languages can be read. This approach is known as the pattern matching method among OCR methods, and various other OCR methods can also be applied. Moreover, the character recognition method is not limited to OCR; any method capable of recognizing already-printed offline characters can be applied.
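
A toy version of the pattern matching step might look like the following. The 5×5 binary glyphs and the pixel-overlap similarity are illustrative assumptions; a real OCR engine stores many standard pattern characters per language at much higher resolution.

```python
import numpy as np

# Hypothetical standard-pattern characters (1 = ink, 0 = background).
PATTERNS = {
    "T": np.array([[1, 1, 1, 1, 1],
                   [0, 0, 1, 0, 0],
                   [0, 0, 1, 0, 0],
                   [0, 0, 1, 0, 0],
                   [0, 0, 1, 0, 0]]),
    "L": np.array([[1, 0, 0, 0, 0],
                   [1, 0, 0, 0, 0],
                   [1, 0, 0, 0, 0],
                   [1, 0, 0, 0, 0],
                   [1, 1, 1, 1, 1]]),
}

def match_character(glyph):
    """Select the standard pattern most similar to the input glyph."""
    similarity = lambda a, b: float((a == b).mean())  # fraction of equal pixels
    return max(PATTERNS, key=lambda ch: similarity(glyph, PATTERNS[ch]))
```

Because the most similar pattern wins, the method tolerates a few noisy pixels in the scanned glyph.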

In addition, the control unit 210 may recognize the gaze direction based on the eye-direction image acquired by the second camera 122. That is, the control unit 210 may include a gaze analysis module that performs gaze-direction recognition. The facing direction of the user and the eye direction are measured and then combined to determine the direction in which the user is looking. The facing direction refers to the direction of the user's face and can be measured by the gyro sensor 131 or the acceleration sensor 132 of the sensing unit 130. The eye direction, i.e., the direction in which the user's pupil points, is determined by the gaze analysis module. The gaze analysis module can detect the motion of the pupil through analysis of a real-time camera image and calculate the eye direction based on a fixed reflection point on the cornea. For example, the center of the pupil and the position of the corneal reflection of the illumination can be extracted through image processing, and the eye direction can be calculated from the positional relationship between them.
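
The pupil-to-glint geometry above can be sketched numerically. The linear pixels-to-degrees gain and the simple additive combination of head and eye angles are simplifying assumptions; a real tracker would calibrate a nonlinear mapping per user.

```python
def eye_direction(pupil_center, glint_center, gain=40.0):
    """Estimate eye rotation (degrees) from the pupil-to-corneal-glint offset.

    The glint from a fixed illuminator stays roughly put while the pupil moves
    with the eye, so their offset tracks eye rotation; `gain` (pixels to
    degrees) would come from a per-user calibration."""
    return (gain * (pupil_center[0] - glint_center[0]),
            gain * (pupil_center[1] - glint_center[1]))

def view_direction(head_yaw_pitch, eye_yaw_pitch):
    """Combine the facing direction (gyro/accelerometer) with the eye direction."""
    return (head_yaw_pitch[0] + eye_yaw_pitch[0],
            head_yaw_pitch[1] + eye_yaw_pitch[1])
```

For instance, a head turned 10° right with eyes rotated a further 4° right yields a combined viewing direction of 14°.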

The control unit 210 may hereinafter be referred to as the information processing unit 210.

The power supply unit receives external power and internal power under the control of the controller 210, and supplies power necessary for operation of the respective components.

The speech recognition unit 220 automatically identifies linguistically meaningful content in speech. Specifically, a speech waveform is input, words or word sequences are identified, and meaning is extracted. The process is broadly divided into speech analysis, phoneme recognition, word recognition, sentence analysis, and semantic extraction. The speech recognition unit 220 may further include a speech evaluation module that compares stored speech with input speech, and a voice-to-text conversion module 240 that converts input speech to text or text to speech.

The EEG signal generator generates a brainwave-synchronizing signal having a frequency and waveform suited to entraining human brain waves. That is, the EEG synchronizing signal generator transmits vibrations at brainwave frequencies to the skull to synchronize the user's EEG. Electroencephalogram (EEG) refers to the flow of electricity that occurs when cranial nerve signals are transmitted. Brain waves consist mainly of very slow delta waves during sleep, fast beta waves during activity, and intermediate alpha waves, which increase during meditation. The EEG signal generator can therefore induce alpha and theta waves, producing effects such as learning assistance and improved mental concentration.
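
The frequency bands mentioned above follow the conventional (approximate) EEG ranges; the mapping below is a reference sketch only, not part of the claimed device.

```python
def eeg_band(freq_hz):
    """Map an EEG frequency (Hz) to its conventional band (approximate bounds)."""
    if freq_hz < 4:
        return "delta"   # deep sleep
    if freq_hz < 8:
        return "theta"   # drowsiness, deep meditation
    if freq_hz < 13:
        return "alpha"   # relaxed, meditative
    return "beta"        # active, alert
```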

Hereinafter, an emergency contact system, method, and program using a glass-type wearable device according to embodiments of the present invention will be described with reference to the drawings.

FIG. 2 is a flowchart of an emergency contact method using a glass-type wearable device according to a preferred embodiment of the present invention.

Referring to FIG. 2, an emergency contact method using a glass-type wearable device according to an embodiment of the present invention includes: acquiring a user's voice input or motion input (S100); recognizing the voice input or the motion input and determining an emergency situation (S200); and transmitting emergency notification information to an emergency contact partner through wireless communication (S300). The steps of the method will be described in order.
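
The three steps S100–S300 can be sketched as a simple pipeline. The three callables are placeholders standing in for the voice input unit 113 or motion sensor, the situation evaluation unit 230, and the wireless communication unit 250 respectively.

```python
def emergency_pipeline(get_input, is_emergency, send_alert):
    """S100 acquire input -> S200 judge emergency -> S300 transmit notification."""
    event = get_input()           # voice input unit 113 or motion sensor (S100)
    if is_emergency(event):       # situation evaluation unit 230 (S200)
        return send_alert(event)  # wireless communication unit 250 (S300)
    return None                   # no emergency: nothing is transmitted
```

A non-emergency input falls through without any transmission, matching the flowchart's single decision point.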

First, the user's voice input or motion input is obtained (S100). The voice input is acquired by the voice input unit 113 of the glass-type wearable device, which receives the user's voice. The voice input may be a scream or a designated emergency signal phrase. For example, in a criminal situation it may be a scream or an urgent phrase calling for help, such as 'Help!'.

The device may also receive surrounding voices, depending on the user's settings. For example, if a user has a chronic illness and collapses frequently, the device can be set to accept a voice question that nearby people ask to check the user's condition.

The motion input is acquired by the motion sensor of the glass-type wearable device, which detects the user's movement. The motion input may correspond to an unusual sudden movement of the user, a stored head-movement pattern, or the like. For example, if the user is abducted, a sudden movement may correspond to an abnormal movement pattern caused by struggling.

Next, the voice input or the motion input is recognized and an emergency situation is determined (S200).

When the determination is based on voice input, the voice recognition unit 220 recognizes the linguistic meaning or the tone of the voice input, and the situation evaluation unit 230 determines on that basis whether it corresponds to an emergency situation.

When the determination is based on motion input, the situation evaluation unit 230 compares the user's movement, as recognized by the motion sensor, with stored movement pattern information, and determines an emergency when they match. Further, when a movement pattern is recognized that does not correspond to any normal movement pattern of the user (for example, the movement pattern when walking or when running), the situation evaluation unit 230 can likewise determine an emergency situation.
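
Both motion rules above (match a stored emergency pattern, or match no normal pattern) can be sketched with a simple distance measure. The mean-absolute-difference metric and the tolerance value are assumptions for the example; a real implementation would use a proper time-series comparison.

```python
import numpy as np

def pattern_distance(sample, pattern):
    """Mean absolute difference over the overlapping window of two sequences."""
    a, b = np.asarray(sample, float), np.asarray(pattern, float)
    n = min(len(a), len(b))
    return float(np.abs(a[:n] - b[:n]).mean())

def is_motion_emergency(sample, emergency_patterns, normal_patterns, tol=0.5):
    """Emergency if the motion matches a stored emergency pattern, or matches
    none of the user's normal patterns (walking, running)."""
    if any(pattern_distance(sample, p) < tol for p in emergency_patterns):
        return True
    return not any(pattern_distance(sample, p) < tol for p in normal_patterns)
```

An entirely unfamiliar motion therefore still raises an alarm even if no emergency template exists for it.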

As shown in FIG. 4, the glass-type wearable device then transmits emergency notification information to the emergency contact partner through wireless communication (S300). When the situation evaluation unit 230 recognizes an emergency, it instructs the wireless communication unit 250 to perform an emergency contact, and the wireless communication unit 250 performs the emergency contact to the outside.

The emergency contact is made to a pre-designated contact and may use text messaging, dialing, or a push notification service. The push notification service may be, for example, Apple's push service, which delivers messages to smartphone users running the same program. The pre-designated contact may be a friend, a guardian, or the police, but is not limited to these. The contact information can be read from the memory of the glass-type wearable device or received from an external server by wireless communication.

The emergency contact method is not limited to those described; any method that the wireless communication unit 250 of the glass-type wearable device can perform may be applied.

The method may further comprise measuring the current location of the user, in which case the emergency contact step transmits the measured current position as part of the emergency notification information. The current position of the user is detected by the GPS module 115 of the glass-type wearable device 100. The method may further include determining an emergency classification corresponding to the voice input or the motion input, and selecting a specific designated institution or a specific acquaintance as the emergency contact partner according to that classification.
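
A notification payload carrying the measured position could be assembled as below. The JSON field names are assumptions for illustration; the patent does not define a message format.

```python
import json

def build_alert(user_id, emergency_class, lat, lon):
    """Emergency notification carrying the GPS-measured current position."""
    return json.dumps({
        "user": user_id,
        "class": emergency_class,                 # e.g. "crime", "medical"
        "location": {"lat": lat, "lon": lon},     # from GPS module 115
    })
```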

The glass-type wearable device 100 may classify and store data in the storage unit 260 so that an emergency contact can be performed promptly. This data may include the designated agencies and their contact details, together with the voice inputs or motion inputs corresponding to each emergency situation. The storage unit 260 can thus classify the user's voice or motion inputs by emergency type and store the designated agency and its contact information for each type.

Accordingly, the situation evaluation unit 230 can identify the emergency classification to which the user's voice or motion input belongs, and select the designated agency or acquaintance to contact according to that classification. The situation evaluation unit 230 then sends a command signal to the wireless communication unit 250, which performs the emergency contact to the designated agency or acquaintance. The contact is not limited to a telephone number and may include any address to which data can be transmitted by various wireless communication methods.
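
The classification-to-contact lookup could be sketched as a table such as the following; the table contents are illustrative (112 and 119 are the Korean police and fire/EMS numbers), standing in for data the device would keep in storage unit 260.

```python
# Hypothetical classification table; the device would keep this in storage unit 260.
EMERGENCY_CONTACTS = {
    "crime":   {"agency": "police",          "contact": "112"},
    "fire":    {"agency": "fire department", "contact": "119"},
    "medical": {"agency": "ems",             "contact": "119"},
}

def select_contact(emergency_class, fallback="guardian"):
    """Pick the designated agency's contact for the classified emergency."""
    entry = EMERGENCY_CONTACTS.get(emergency_class)
    return entry["contact"] if entry else fallback
```

An unclassified emergency falls back to a general-purpose contact such as a guardian, so an alert is always sent somewhere.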

The method may further include capturing real-time video or recording real-time audio in the forward direction, and the emergency contact step (S300) may transmit the captured video or recorded audio through wireless communication. That is, as shown in FIG. 3, the glass-type wearable device 100 transmits the real-time video captured by the first camera 121 or the real-time audio acquired by the voice input unit along with the emergency contact. By transmitting real-time video or audio, the party receiving the emergency contact (for example, a friend, guardian, firefighter, or police officer) can recognize the user's emergency situation and respond appropriately.

The emergency contact step may include analyzing the real-time video to determine the classification of the emergency affecting the user. For example, when a real-time video sequence suddenly shows the view falling to the floor, the glass-type wearable device 100 recognizes that the user has collapsed and can classify the event as an emergency due to a physical abnormality.

In addition, the emergency notification information transmission step (S300) may transmit an actively recognizable signal so that unspecified parties within a certain distance become aware of the user's emergency. To get help quickly in an emergency, it is best to ask nearby people for help. Accordingly, the glass-type wearable device 100 can broadcast the emergency notification information to the surrounding area using a wireless communication signal capable of active recognition, such as a beacon signal. Unspecified nearby parties can then receive the user's emergency notification on their own wireless communication devices, so the user can obtain urgent assistance.
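
A beacon-style broadcast frame might be built as below. The "SOS|" text format is an invented convention for the example; the 31-byte cap reflects the small payload of legacy BLE advertising packets.

```python
def beacon_payload(user_id, emergency_class, lat, lon):
    """A short broadcast frame so any nearby device can pick up the alert.

    Legacy BLE advertising payloads are tiny (about 31 bytes), hence the
    truncation at the end."""
    frame = f"SOS|{user_id}|{emergency_class}|{lat:.4f},{lon:.4f}"
    return frame.encode("utf-8")[:31]
```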

The method may further include outputting, as audio, an emergency response procedure corresponding to the user's emergency. For example, if the user collapses, nearby people may need to perform first aid quickly. Accordingly, the glass-type wearable device 100 can generate voice guidance corresponding to the user's emergency and inform nearby people of the appropriate emergency measures.

The method may further include receiving the user's biosignal from an external wearable device, or acquiring it directly with the glass-type wearable device, and the emergency determination step (S200) may take the biosignal into account. Biometric information such as heart rate and electrocardiogram is needed to grasp the user's precise state. Accordingly, the glass-type wearable device 100 can receive a biosignal from another wearable device capable of measuring it, or measure the biosignal directly, and reflect it in determining the user's emergency situation. For example, when real-time video suddenly shows a fall to the floor, the device can use the biosignal to determine whether the user merely tripped or has collapsed, and grasp the emergency situation accordingly.
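
The camera-plus-biosignal fusion in this paragraph can be sketched as a two-input rule. The heart-rate bounds are illustrative assumptions, not medical thresholds from the patent.

```python
def classify_collapse(fell_in_video, heart_rate_bpm):
    """Combine the camera cue (an apparent fall) with a biosignal to separate
    a simple trip from a medical collapse; the bpm bounds are illustrative."""
    if not fell_in_video:
        return "normal"
    if heart_rate_bpm < 40 or heart_rate_bpm > 140:
        return "medical-emergency"
    return "fall"
```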

An emergency contact system 100 using a glass-type wearable device according to another embodiment of the present invention includes a voice input unit 113, a voice recognition unit 220, a situation evaluation unit 230, a wireless communication unit 250, and a storage unit 260.

In addition, an embodiment of the present invention may include a motion sensor in place of the voice input unit 113 and the voice recognition unit 220.

The voice input unit 113 performs a function of obtaining a voice input of the user.

The speech recognition unit 220 recognizes the linguistic meaning or the tone of the speech input.

The motion sensor acquires the user's motion input.

The situation evaluation unit 230 determines whether the input voice expresses an emergency situation of the user. That is, it compares the voice information from the voice recognition unit 220 with the voice information stored in the memory to determine whether it corresponds to an emergency, and sends an emergency contact control signal to the wireless communication unit 250 according to the result. Likewise, it compares the user movement pattern recognized and transmitted by the motion sensor with the movement pattern information stored in the memory to determine whether it corresponds to an emergency, and sends the emergency contact control signal to the wireless communication unit 250 according to the result.

For example, if the period or amplitude of the voice signal detected by the voice recognition unit 220 matches the period or amplitude information stored in the memory, the situation evaluation unit 230 outputs the emergency contact control signal to the wireless communication unit 250. Similarly, if the impact acceleration detected by the acceleration sensor 132 exceeds the acceleration level or acceleration ratio stored in the memory, the situation evaluation unit 230 outputs the emergency contact control signal to the wireless communication unit 250.
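
The two threshold checks just described can be sketched directly; the 10% tolerance band and the 1.5× impact ratio are assumed values for illustration.

```python
def voice_matches(period_ms, level_db, ref_period_ms, ref_level_db, tol=0.1):
    """True when the detected voice's period and level are within a tolerance
    band (default 10%) around the reference stored in memory."""
    return (abs(period_ms - ref_period_ms) <= tol * ref_period_ms
            and abs(level_db - ref_level_db) <= tol * ref_level_db)

def impact_detected(accel_g, ref_accel_g, ratio=1.5):
    """True when the sensed acceleration exceeds the stored level by the ratio."""
    return accel_g >= ratio * ref_accel_g
```

When either check fires, the situation evaluation unit would output the emergency contact control signal to the wireless communication unit 250.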

The wireless communication unit 250 performs the emergency contact and transmits and receives data to and from the outside. That is, the wireless communication unit 250 performs the emergency contact, through wireless communication, to the external receiving party (for example, a friend, guardian, firefighter, or police officer). The emergency contact may be a text message, a telephone call, or a push notification. The wireless communication unit 250 may also transmit the real-time video obtained by the first camera 121 or the user position information obtained by the GPS module 115.

The storage unit 260 stores the voice reference information or motion reference pattern information indicating the user's emergency state, and stores the contacts to which the emergency contact is to be made.

In addition, the storage unit 260 may classify emergency situations, store the user's voice or motion inputs for each category, and store the designated agency and its contact information corresponding to each category.

The storage unit 260 may be implemented as memory inside the glass-type wearable device or on an external server. That is, the information may be stored on an external server and received by wireless communication in an emergency.

The system may further include the first camera 121, which is provided at one side of the front portion of the glass-type wearable device and acquires real-time video of the user's view.

The system may further include the GPS module 115, which determines the user's location information.

As described above, the emergency contact method for a glass-type wearable device according to an embodiment of the present invention can be implemented as a program (or application) that is executed in combination with the glass-type wearable device 100 and stored in a medium.

For the processor (CPU) of the glass-type wearable device 100 to read the program and execute the methods it implements, the program may include code written in a computer language such as C, C++, JAVA, or machine language that the processor can read through the device interface of the device 100. Such code may include function code defining the functions needed to execute the methods, and procedure-related control code directing the processor of the glass-type wearable device 100 to execute those functions in a predetermined sequence. The code may further include memory-reference code indicating at which location (address) of the device's internal or external memory the additional information or media needed by the processor should be referenced. When the processor of the glass-type wearable device 100 needs to communicate with a remote computer or server to execute the functions, the code may further include communication-related code specifying how the communication module of the glass-type wearable device 100 should communicate with the remote computer or server, and what information or media should be transmitted or received during communication.

The storage medium is not a medium that stores data briefly, such as a register, a cache, or a memory, but one that stores data semi-permanently and is readable by a device. Specific examples include, but are not limited to, ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage. That is, the program may be stored on various recording media on servers accessible to the glass-type wearable device 100, or on various recording media of the user's glass-type wearable device 100. The medium may also be distributed over network-connected computer systems so that computer-readable code is stored in a distributed manner.

According to the present invention as described above, the following various effects are obtained.

First, a user in an emergency can perform an emergency contact without any special operation. As a result, crimes or accidents can be reduced by coping with emergency situations early.

Second, since the user always wears the wearable device by its very nature, the user is always prepared for an emergency situation.

Third, since voice and motion inputs are classified by emergency type and the contact partner for each type is designated, the emergency contact can be made to a party capable of responding appropriately to the user's particular emergency.

While the present invention has been described with reference to exemplary embodiments, it will be understood by those skilled in the art that various changes in form and detail may be made without departing from the spirit and scope of the invention. The above-described embodiments are therefore to be considered in all respects illustrative and not restrictive.

100: system 110: user input
111: Application 112: Keyboard
113: voice input unit 114: touch pad
115: GPS module 116: short-range communication module
120: camera unit 121: first camera
122: second camera 123: third camera
130: sensing unit 131: gyro sensor
132: acceleration sensor 133: pressure sensor
134: iris recognition sensor 135: heart rate detection sensor
136: EMG sensor
210: control unit 220: voice recognition unit
230: situation evaluation unit 240: voice-to-text conversion module
250: wireless communication unit 260: memory
270: interface unit
300: output unit 310: display unit
320: Acoustic output unit 330:
340: Haptic module
400: Device of the emergency communication partner

Claims (10)

1. A method of performing an emergency contact by recognizing a voice or a motion corresponding to an emergency situation of a user wearing a glass-type wearable device, the method comprising:
obtaining a voice input or a motion input of the user;
recognizing the voice input or the motion input and determining the emergency situation; and
transmitting emergency notification information to an emergency contact partner through wireless communication.
2. The method according to claim 1, further comprising:
determining an emergency classification corresponding to the voice input or the motion input; and
selecting a specific designated institution or a specific acquaintance as the emergency contact partner according to the emergency classification.
3. The method according to claim 1 or 2, wherein the voice input is a scream or a designated emergency signal phrase.
4. The method according to claim 1, further comprising measuring a current position of the user, wherein the emergency contact step transmits the measured current position as part of the emergency notification information.
5. The method according to claim 1, further comprising capturing real-time video or recording real-time audio in front of the user, wherein the emergency contact step transmits the captured video or recorded audio through wireless communication.
6. The method according to claim 5, wherein the emergency contact step includes analyzing the real-time video to determine the classification of the emergency situation of the user.
7. The method according to claim 1, wherein the emergency notification information transmission step transmits an actively recognizable signal so that unspecified parties within a certain distance recognize the emergency situation of the user.
8. The method according to claim 1, further comprising outputting, as audio to the outside, an emergency response method corresponding to the emergency situation of the user.
9. The method according to claim 1, further comprising receiving a biosignal of the user from an external wearable device or acquiring the biosignal with the glass-type wearable device, wherein the emergency situation determination step determines the emergency state of the user by reflecting the biosignal.
10. An emergency contact program for a glass-type wearable device, which is stored in a medium and combined with the glass-type wearable device, which is hardware, to execute the method of any one of claims 1 to 9.
KR1020150042550A 2014-07-30 2015-03-26 Method and program for emergency reporting by wearable glass device KR20160015142A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2015/007914 WO2016018063A2 (en) 2014-07-30 2015-07-29 Information-processing system and method using wearable device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020140097084 2014-07-30
KR20140097084 2014-07-30

Publications (1)

Publication Number Publication Date
KR20160015142A true KR20160015142A (en) 2016-02-12

Family

ID=55355081

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150042550A KR20160015142A (en) 2014-07-30 2015-03-26 Method and program for emergency reporting by wearable glass device

Country Status (1)

Country Link
KR (1) KR20160015142A (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017175997A1 (en) * 2016-04-04 2017-10-12 삼성전자 주식회사 Electronic apparatus and control method therefor
US11350823B2 (en) 2016-04-04 2022-06-07 Samsung Electronics Co., Ltd. Electronic apparatus and control method therefor
KR20190056023A (en) 2017-11-16 2019-05-24 (주)바인아이티 Emergency reporting mothod by Smart mobile and based on bluetooth for wearable
WO2023075029A1 (en) * 2021-11-01 2023-05-04 주식회사 케이레이즈 System and method for infant eyesight restoration and presbyopia relief through smart glasses, smartphones, and smartwatches
KR102518190B1 (en) * 2022-08-12 2023-04-06 주식회사 포딕스시스템 System for preventing unexpected situations of complainant
KR102573067B1 (en) * 2022-12-14 2023-09-01 주식회사 피앤씨솔루션 An augmented reality glasses device that can analyze the wearer’s condition using an eye-tracking camera and an illuminance sensor

Similar Documents

Publication Publication Date Title
US11024152B2 (en) Systems and methods for managing an emergency situation
KR101684264B1 (en) Method and program for the alarm of bus arriving by wearable glass device
KR101661555B1 (en) Method and program for restricting photography of built-in camera of wearable glass device
KR20160017593A (en) Method and program for notifying emergency exit by beacon and wearable glass device
KR20160015142A (en) Method and program for emergency reporting by wearable glass device
KR102047988B1 (en) Vision aids apparatus for the vulnerable group of sight, remote managing apparatus and method for vision aids
KR101728707B1 (en) Method and program for controlling electronic device by wearable glass device
KR20160026310A (en) System and method for curling coaching by wearable glass device
KR102251710B1 (en) System, method and computer readable medium for managing content of external device using wearable glass device
KR20160011302A (en) System and method for digital image processing by wearable glass device
KR20160024140A (en) System and method for identifying shop information by wearable glass device
KR101629758B1 (en) Method and program with the unlock system of wearable glass device
KR101853324B1 (en) Method for Providing Safety Zone Information, Apparutus and System Therefor
KR20160016216A (en) System and method for real-time forward-looking by wearable glass device
KR20160025150A (en) System and method for social dating service by wearable glass device
KR20160023226A (en) System and method for exploring external terminal linked with wearable glass device by wearable glass device
KR20160015704A (en) System and method for recognition acquaintance by wearable glass device
KR101832327B1 (en) Method for Providing Safety Zone Information, Apparutus and System Therefor
KR20160016149A (en) System and method for preventing drowsiness by wearable glass device
KR101661556B1 (en) Method and program for confirmation of identity by wearable glass device
KR20160025203A (en) System and method for billiard coaching by wearable glass device
KR20160053472A (en) System, method and application for confirmation of identity by wearable glass device
KR102308970B1 (en) System and method for inputting touch signal by wearable glass device
KR20160015143A (en) System for the recognition of cannonball direction, method and program for the prediction of impacting point by wearable glass device
KR101572807B1 (en) Method, apparatus and system for transmitting image signal by wearable device

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application