KR101906551B1 - A system for real time relaying and transmitting - Google Patents


Info

Publication number
KR101906551B1
KR101906551B1 (application KR1020150159913A)
Authority
KR
South Korea
Prior art keywords
unit
image
wearable terminal
voice
receiving
Prior art date
Application number
KR1020150159913A
Other languages
Korean (ko)
Other versions
KR20160104537A (en)
Inventor
이현상
Original Assignee
유퍼스트(주)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 유퍼스트(주)
Publication of KR20160104537A
Application granted
Publication of KR101906551B1

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00: ICT specially adapted for the handling or processing of medical images
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00: Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02: Alarms for ensuring the safety of persons
    • G08B21/0202: Child monitoring systems using a transmitter-receiver system carried by the parent and the child
    • G08B21/0205: Specific application combined with child monitoring using a transmitter-receiver system
    • G08B21/0208: Combination with audio or video communication, e.g. combination with "baby phone" function
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00: Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02: Alarms for ensuring the safety of persons
    • G08B21/0202: Child monitoring systems using a transmitter-receiver system carried by the parent and the child
    • G08B21/0205: Specific application combined with child monitoring using a transmitter-receiver system
    • G08B21/0211: Combination with medical sensor, e.g. for measuring heart rate, temperature
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00: ICT specially adapted for the handling or processing of patient-related medical or healthcare data

Landscapes

  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Child & Adolescent Psychology (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Emergency Management (AREA)
  • Medical Informatics (AREA)
  • Educational Administration (AREA)
  • User Interface Of Digital Computer (AREA)
  • Theoretical Computer Science (AREA)
  • Educational Technology (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Traffic Control Systems (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Cardiology (AREA)
  • Multimedia (AREA)
  • Studio Devices (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Rehabilitation Tools (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The present invention relates to a real-time relay and transmission system. The technical problem to be solved is to provide a system capable of relaying and transmitting the situation at a site precisely and in detail, and of sharing it in real time with outside experts.
Disclosed is a real-time relay and transmission system including: a first wearable terminal of a ring structure that captures a first video, directly receives a first audio input, and transmits the first video and the first audio to the outside in real time; a second wearable terminal of an eyeglasses structure that receives the first video in real time from the first wearable terminal, displays it through a lens of the eyeglasses, and receives and outputs a second audio from the outside in real time; and an external management unit that receives and displays the first video in real time from the first wearable terminal, receives and outputs the first audio in real time from the first wearable terminal, directly receives the second audio, and transmits it to the second wearable terminal in real time.

Description

A SYSTEM FOR REAL TIME RELAYING AND TRANSMITTING

Embodiments of the present invention relate to a real-time relay and transmission system.

In order to transport patients urgently by ambulance, or to make a more accurate diagnosis and perform treatment at a surgical site, equipment capable of accurately grasping the situation in the field is needed.

In addition, in cases such as the recent Pangyo ventilation-grate collapse, the Sewol ferry sinking, and forest fires, it is necessary to respond by sharing the situation with experts in real time.

However, in most cases there is no equipment for sharing an accident site or a surgical site with outside experts in real time. The situation at a scene may be grasped through photographing devices such as CCTV installed in the field, but such fixed equipment is insufficient where, as at a surgical site, a lesion must be observed closely and accurately.

Korean Patent Publication No. 10-2014-0066258 (May 2015), 'Modification of video display based on sensor input for see-through near-eye display'
Korean Registered Patent No. 10-0782103 (November 28, 2007)
Korean Patent Publication No. 10-2015-0055262 (May 2015), 'Visualization of sound using a mobile device'
Korean Patent Publication No. 10-2009-0105531 (2009.10.07), 'Method and apparatus for informing a visually impaired person of a document image taken with a camera sensor'
Korean Patent Publication No. 10-2014-0116517 (Apr. 20, 2014), 'Wearable device with input and output structures'

An embodiment of the present invention provides a system that enables collaboration by relaying and transmitting the situation in the field accurately and in detail and sharing it in real time with outside experts.

A real-time relay and transmission system according to an embodiment of the present invention includes: a first wearable terminal of a ring structure that captures a first video, directly receives a first audio input, and transmits the first video and the first audio to the outside in real time; a second wearable terminal of an eyeglasses structure that receives the first video in real time from the first wearable terminal, displays it through a lens of the eyeglasses, and receives and outputs a second audio from the outside in real time; and an external management unit that receives and displays the first video in real time from the first wearable terminal, receives and outputs the first audio in real time from the first wearable terminal, directly receives the second audio, and transmits it to the second wearable terminal in real time.

The first wearable terminal may include: a first photographing unit for capturing the first video; a first microphone unit for directly receiving the first audio; a first image data transmission unit for transmitting, in real time, the first video captured through the first photographing unit to the second wearable terminal and the external management unit; and a first voice data transmission unit for transmitting, in real time, the first audio input through the first microphone unit to the external management unit.

The first wearable terminal may further include: a first memory unit for storing the first image; A first battery unit for supplying power to the first wearable terminal; And a port unit for transmitting the first image stored in the first memory unit to the outside and for transmitting external power to the first battery unit.

Also, the first wearable terminal may be in the form of a ring, a watch, or a bracelet.

The second wearable terminal may include: a first image data receiving unit for receiving the first image from the first wearable terminal; a projector unit for projecting a beam so that the first image received through the first image data receiving unit is displayed through the lens of the eyeglasses; a first voice data receiving unit for receiving the second voice from the external management unit; and a first speaker unit for outputting the second voice received through the first voice data receiving unit.

The second wearable terminal may further include: a second photographing unit for capturing a second image while the eyeglasses are worn; and a second image data transmission unit for transmitting the second image captured through the second photographing unit to the external management unit in real time.

The second wearable terminal may further include: a second memory unit for storing the second image; A second battery unit for supplying power to the second wearable terminal; And a power connection unit for transmitting the first image and the second image stored in the second memory unit to the outside and transmitting external power to the second battery unit.

The external management unit may include a second image data receiving unit for receiving the second image from the second wearable terminal.

The external management unit may further include: a second microphone unit for directly receiving the second voice; a second voice data transmission unit for transmitting the second voice input through the second microphone unit to the second wearable terminal in real time; a third image data receiving unit for receiving the first image from the first wearable terminal; a display unit for outputting the first image received through the third image data receiving unit; a second voice data receiving unit for receiving the first voice from the first wearable terminal; and a second speaker unit for outputting the first voice received through the second voice data receiving unit.

According to an embodiment of the present invention, during surgery or a patient consultation, all relevant conditions that a medical professional can confirm can be shared with a third, related specialist through video and voice in real time, so that a more accurate diagnosis can be made or better surgical results obtained.

Also, at a fire-suppression or large accident scene, even if an expert is not in the field, the situation can be shared in real time through video and voice, enabling collaboration with the related expert so that more accurate and appropriate action can be taken.

Also, in the case of the execution of public duties, the situation at the site can be recorded as video and audio and used as legal evidence afterward.

FIG. 1 is a view schematically showing the configuration of a real-time relay and transmission system according to an embodiment of the present invention.
FIG. 2 is a block diagram illustrating the detailed configuration of a real-time relay and transmission system according to an embodiment of the present invention.
FIG. 3A is a perspective view of a first wearable terminal according to an embodiment of the present invention, viewed from the upper right.
FIG. 3B is a perspective view of the first wearable terminal according to an embodiment of the present invention, viewed from the lower right.
FIG. 3C is a perspective view of the first wearable terminal according to an embodiment of the present invention, viewed from the upper left.
FIG. 4A is a perspective view of a second wearable terminal according to an embodiment of the present invention.
FIG. 4B is an exploded perspective view of a second wearable terminal according to an embodiment of the present invention.

The terms used in this specification will be briefly described and the present invention will be described in detail.

The terms used in the present invention are general terms that are currently widely used, selected in consideration of their functions in the present invention; however, they may vary depending on the intention of those skilled in the art, precedents, or the emergence of new technology. Also, in certain cases, there may be a term selected arbitrarily by the applicant, in which case its meaning will be described in detail in the corresponding description of the invention. Therefore, the terms used in the present invention should be defined based not simply on their names but on their meanings and the overall content of the present invention.

When an element is referred to as "including" a component throughout the specification, this means that it may further include other components, unless stated otherwise. Also, terms such as "unit" and "module" described in the specification mean units for processing at least one function or operation, which may be implemented in hardware, in software, or in a combination of hardware and software.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings so that those skilled in the art can easily carry out the present invention. The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. In order to clearly illustrate the present invention, parts not related to the description are omitted, and similar parts are denoted by like reference characters throughout the specification.

FIG. 1 is a view schematically showing the configuration of a real-time relay and transmission system according to an embodiment of the present invention, and FIG. 2 is a block diagram illustrating its detailed configuration. FIGS. 3A, 3B, and 3C are perspective views of the first wearable terminal according to an embodiment of the present invention, viewed from the upper right, the lower right, and the upper left, respectively. FIG. 4A is a perspective view of the second wearable terminal according to an embodiment of the present invention, and FIG. 4B is an exploded perspective view thereof.

Referring to FIGS. 1 to 4B, a real-time relay and transmission system 100 according to an embodiment of the present invention includes a first wearable terminal 110, a second wearable terminal 120, and an external management unit 130.

The first wearable terminal 110 is a wearable portable apparatus having a ring structure. The first wearable terminal 110 captures a first video, directly receives a first voice, and transmits the first video and the first voice to the second wearable terminal 120 and the external management unit 130, respectively, in real time.

To this end, the first wearable terminal 110 includes a first photographing unit 111, a first microphone unit 112, a first image data transmission unit 113, a first voice data transmission unit 114, a first memory unit 115, a first battery unit 116, a port unit 117, and a power switch unit 118.

The first photographing unit 111 is a means for capturing the first video and may be, for example, a portable endoscope or a portable ultrasonic imaging device. The first photographing unit 111 can generate the first image data by photographing a location that is difficult for a person to approach at an accident scene, a patient's affected area, or the inside of the body.

The first microphone unit 112 is a means for directly receiving the first voice and converting it into an electrical signal, that is, voice data. The first microphone unit 112 can receive the voice of the wearer of the first wearable terminal 110 and convert it into first voice data.

The first image data transmission unit 113 may transmit the first image data generated through the first photographing unit 111 to the second wearable terminal 120 and the external management unit 130 in real time. At this time, the external management unit 130 may include a separate management server to store the received first image data.

The first image data transmission unit 113 may transmit the first image data generated through the first photographing unit 111 using a wireless or wired communication method. In the case of wireless communication, short-range wireless communication such as Wi-Fi, Bluetooth, Zigbee, or Beacon can be used. In the case of wired communication, a data cable may be connected to the port unit 117 to transmit the first image data generated through the first photographing unit 111 to the second wearable terminal 120. However, a wired connection may hinder the activity of the wearers of the first and second wearable terminals 110 and 120, so it is advantageous to transmit data wirelessly whenever possible.
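The transmission-path preference described above, wireless first and the wired data-cable port only as a fallback, can be sketched as follows. This is an illustrative sketch only; the patent specifies no API, and all names (`select_transport`, `chunk_frame`, the link identifiers) are hypothetical.

```python
# Hypothetical sketch of the first image data transmission unit's transport
# selection: prefer a wireless link (Wi-Fi, Bluetooth, Zigbee, beacon) and
# fall back to the wired data cable only when no wireless link is available.

WIRELESS_LINKS = ("wifi", "bluetooth", "zigbee", "beacon")

def select_transport(available_links):
    """Return the preferred transport for streaming first-image data."""
    for link in WIRELESS_LINKS:      # wireless preferred: no cable to hinder the wearer
        if link in available_links:
            return link
    if "cable" in available_links:   # wired fallback via the port unit
        return "cable"
    return None                      # no link at all: buffer locally instead

def chunk_frame(frame_bytes, mtu=1400):
    """Split one captured frame into MTU-sized chunks for real-time sending."""
    return [frame_bytes[i:i + mtu] for i in range(0, len(frame_bytes), mtu)]
```

A caller would stream each captured frame as `for c in chunk_frame(frame): send(select_transport(links), c)`; the `None` case is where the memory unit's black-box role (described below in the specification) would take over.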

The first voice data transmission unit 114 may transmit the first voice data converted through the first microphone unit 112 to the external management unit 130 in real time, using a wireless Internet communication method.

The first memory unit 115 may store the first image data generated through the first photographing unit 111. The first memory unit 115 may be connected to the port unit 117 to provide, or delete, previously stored first image data according to a control signal input from the outside. The first memory unit 115 can also function as a black box: when an environment or situation arises in which communication between the first wearable terminal 110 and the external management unit 130 cannot be established, it stores the image data and can provide the stored data later, when needed.
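The black-box behavior described above, always recording locally while relaying live whenever the link is up, can be sketched as below. The class name, capacity, and callback are illustrative assumptions, not taken from the patent.

```python
from collections import deque

class BlackBoxBuffer:
    """Illustrative sketch of the first memory unit acting as a black box:
    frames are relayed live while the link to the external management unit
    is up, and retained locally for later retrieval (e.g. over the port
    unit's data cable) regardless of link state."""

    def __init__(self, capacity=1000):
        self.stored = deque(maxlen=capacity)  # oldest frames evicted when full

    def handle_frame(self, frame, link_up, send):
        self.stored.append(frame)             # always record, like a black box
        if link_up:
            send(frame)                       # live relay when possible

    def export(self):
        """Provide the stored footage to the outside after the fact."""
        return list(self.stored)
```

Bounding the buffer with `deque(maxlen=...)` is one plausible design choice for a small wearable; the patent itself does not say how much footage the memory unit retains.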

The first battery unit 116 is a means for supplying power to the first wearable terminal 110 and may be built into, or detachable from, the first wearable terminal 110. The first battery unit 116 preferably includes a battery module containing not only an element for storing electrical energy but also elements for protection during charge, discharge, overcharge, and the like.

The port unit 117 is formed so that a data cable can be attached and detached, and can provide the first video data stored in the first memory unit 115 to the outside through the data cable. The port unit 117 may also be connected to the first battery unit 116, electrically connecting it to an external power source for charging, or may serve as connection means for directly connecting an external power source to the first wearable terminal 110.

The power switch unit 118 may turn the first wearable terminal 110 on and off. When the power switch unit 118 turns the first wearable terminal 110 on, a lamp is lit, and when the terminal is turned off, the lamp goes out, so that the wearer can recognize the on/off state.

In the above description, the first wearable terminal 110 has a ring structure that can be worn by the user, but it may be implemented in various forms suited to the situation. For example, it can be implemented in the form of a portable ultrasound machine or an endoscope in a surgical or medical environment, and in various forms such as a bracelet, a watch, or a button at an accident scene or during the execution of public duties.

The second wearable terminal 120 has an eyeglasses structure; it receives the first image data from the first wearable terminal 110 in real time and displays it through a lens of the eyeglasses, and receives the second voice from the outside in real time and outputs it. The eyeglasses can have a general structure including a rim 120a, lenses 120b, a bridge 120c, nose pads 120d, temples 120e, and tips 120f.

The second wearable terminal 120 includes a second photographing unit 121, a first image data receiving unit 122, a projector unit 123, a first voice data receiving unit 124, a first speaker unit 125, a second image data transmission unit 126, a second memory unit 127, a second battery unit 128, and a power connection unit 129.

In the second wearable terminal 120, the second photographing unit 121 and the second image data transmission unit 126 may be omitted; in the present embodiment, however, both are included.

The second photographing unit 121 may be installed over the endpiece between the upper rim 120a and the temple 120e, with its camera lens facing in the wearer's line of sight. The second photographing unit 121 may generate second image data by photographing a second image, that is, the first-person view of the wearer of the eyeglasses.

The first image data receiving unit 122 may receive the first image data from the first wearable terminal 110, using a wireless or wired communication method, and transmit it to the projector unit 123. In the case of wireless communication, short-range wireless communication such as Wi-Fi, Bluetooth, Zigbee, or Beacon can be used. In the case of wired communication, the first video data can be received by connecting a data cable to the port unit 117 of the first wearable terminal 110. However, a wired connection may hinder the activity of the wearers of the first and second wearable terminals 110 and 120, so it is advantageous to transmit data wirelessly whenever possible.

The projector unit 123 performs a predetermined conversion process on the first image data received through the first image data receiving unit 122 and may project the resulting light onto a prism so that the image is displayed on the lens 120b. Here, the lens 120b may perform the display function for showing the first image.

The projector unit 123 may project light so that the first image is displayed on at least one of the two lenses 120b on either side of the nose pads 120d. At this time, the projector unit 123 may render the image on the lens 120b semi-transparently, or project it onto only a portion of the lens 120b, so that the projected first image does not obscure the wearer's field of view. It is also possible to select which of the two lenses 120b the first image is displayed on.
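The semi-transparent presentation described above amounts to alpha-blending the projected image with the wearer's view of the scene. A minimal per-pixel sketch, assuming 8-bit RGB tuples (the patent does not specify a pixel format or blending method):

```python
def blend_overlay(view_px, overlay_px, alpha=0.4):
    """Alpha-blend one projected-image pixel onto the wearer's view so the
    overlay appears semi-transparent rather than obscuring the scene.
    Pixels are (r, g, b) tuples of 0-255 ints; alpha is overlay opacity."""
    return tuple(round(alpha * o + (1 - alpha) * v)
                 for v, o in zip(view_px, overlay_px))
```

With `alpha=0.4`, a white overlay pixel on a dark scene yields a muted gray, leaving the scene behind it visible; `alpha=1.0` would reproduce an opaque projection.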

The first audio data receiving unit 124 may receive the second audio data from the external management unit 130 and transmit the second audio data to the first speaker unit 125. The first voice data receiving unit 124 may receive the second voice data from the external management unit 130 using the wireless Internet communication method.

The first speaker unit 125 may output the second audio data received through the first audio data receiver 124 after being subjected to a predetermined conversion process so that the user can listen to the second audio data. The first speaker unit 125 may be installed on the tip 120f of the eyeglasses so that the wearer can listen to the second voice transmitted from the external management unit 130 more easily.

The second image data transmission unit 126 may transmit the second image photographed through the second image pickup unit 121 to the external management unit 130 in real time. The second image data transmission unit 126 may transmit the second image data to the external management unit 130 using the wireless Internet communication method. At this time, the external management unit 130 may include a separate management server to store the received second image data. The management server may be a management server for storing first image data received from the first wearable terminal 110.

The second memory unit 127 may store the second image photographed through the second photographing unit 121. The second memory unit 127 can also function as a black box: when an environment or situation arises in which communication between the second wearable terminal 120 and the external management unit 130 cannot be established, it stores the image data and can provide the stored data later, when needed.

The second battery unit 128 is a means for supplying power to the second wearable terminal 120 and may be built in or detached from the second wearable terminal 120. For example, the second battery part 128 may be mounted or embedded in the temple 120e of the eyeglasses, and may be configured to be detachable to the eyeglass structure for replacement. The second battery unit 128 preferably includes a battery module including not only an element for storing electrical energy but also elements for protecting charge, discharge, overcharge, and the like.

The power connection unit 129 may be connected to the second battery unit 128, electrically connecting it to an external power source to charge the second battery unit 128, or may serve as connection means for directly connecting an external power source to the second wearable terminal 120.

The first image data receiving unit 122, the first voice data receiving unit 124, and the second image data transmission unit 126 may be implemented as a single communication module or IC mounted on a temple 120e.

The external management unit 130 receives and displays the first image in real time from the first wearable terminal 110, receives and outputs the first voice in real time from the first wearable terminal 110, and directly receives the second voice and transmits it to the second wearable terminal 120 in real time.

If the first and second wearable terminals 110 and 120 are devices worn by physicians, firefighters, police officers, and others deployed in the actual field, the external management unit 130 may be equipment managed by another outside expert who can provide technical advice or instructions regarding the surgery, accident, or event.

The external management unit 130 includes a second microphone unit 131, a second voice data transmission unit 132, a third image data receiving unit 134, a display unit 135, a second voice data receiving unit 136, and a second speaker unit 137. In addition, as described above, since the second photographing unit 121 and the second image data transmission unit 126 are included in the second wearable terminal 120 in the present embodiment, the external management unit 130 may further include a second image data receiving unit 133 for receiving the second image data from the second wearable terminal 120. However, like the second photographing unit 121 and the second image data transmission unit 126, the second image data receiving unit 133 is optional and may be omitted.

The second microphone unit 131 is a means for directly receiving the second voice and converting it into an electrical signal, that is, voice data. The second microphone unit 131 can directly receive the voice of the outside expert and convert it into second voice data.

The second voice data transmission unit 132 may transmit the second voice data converted through the second microphone unit 131 to the second wearable terminal 120 in real time, using a wireless Internet communication method.

The second image data receiving unit 133 may receive the second image data from the second wearable terminal 120 using wireless Internet communication, and a separate management server may be provided to store the received second image data.

The third image data receiving unit 134 may receive the first image data from the first wearable terminal 110 using wireless Internet communication, and a separate management server may be provided to store the received first image data.

The display unit 135 may play back, in real time, the first and second video data received through the third and second image data receiving units 134 and 133, or may load them from the management server and play them back by streaming. Accordingly, the outside manager can share the situation at the scene in real time through the corresponding images.
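The display unit's two playback paths, the live feed and streaming the stored copy from the management server, can be sketched as a generator. The function name and the fallback rule (server streaming only when the live feed delivers nothing) are illustrative assumptions, not taken from the patent.

```python
def frames_for_display(live_feed, server_recording):
    """Yield frames for the display unit: play the live feed while frames
    arrive; if no live frame was available at all, stream the copy held
    on the management server instead."""
    got_live = False
    for frame in live_feed:
        got_live = True
        yield frame
    if not got_live:
        # the live link never delivered a frame: fall back to server streaming
        yield from server_recording
```

A real display loop would render each yielded frame as it arrives; a generator keeps the sketch independent of any particular video library.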

The second voice data receiving unit 136 may receive the first voice data from the first wearable terminal 110 using a wireless Internet communication method and transmit it to the second speaker unit 137.

The second speaker unit 137 performs a predetermined conversion process on the first voice data received through the second voice data receiving unit 136 and outputs it, so that the outside expert can hear the voice of the wearers of the first and second wearable terminals 110 and 120.

As described above, if the first and second wearable terminals 110 and 120 are devices worn by physicians, firefighters, police officers, and others deployed in the field, the external management unit 130 is equipment managed by another outside expert who can give technical advice or instructions, so that field specialists and outside specialists share the situation in the field and can solve problems more efficiently through collaboration.

Accordingly, through the first microphone unit 112, the first voice data transmission unit 114, the first voice data receiving unit 124, and the first speaker unit 125 of the first and second wearable terminals 110 and 120, and the second microphone unit 131, the second voice data transmission unit 132, the second voice data receiving unit 136, and the second speaker unit 137 of the external management unit 130, the field expert and the outside expert can communicate not only by real-time video sharing but also by voice conversation, enabling the situation to be handled more efficiently.
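The two-way voice path described above, the first voice carried from the field to the external management unit and the second voice carried back, can be modelled as a pair of queues. This is a minimal sketch with hypothetical names; the patent describes the units, not a software interface.

```python
import queue

class DuplexVoiceLink:
    """Minimal sketch of the full-duplex voice path: the first microphone /
    voice transmission units carry field audio out to the external
    management unit, while the second microphone / voice transmission
    units carry the expert's instructions back to the terminal."""

    def __init__(self):
        self.to_expert = queue.Queue()  # first voice: field -> external management
        self.to_field = queue.Queue()   # second voice: external management -> terminal

    def field_says(self, pcm):
        self.to_expert.put(pcm)         # captured by the first microphone unit

    def expert_says(self, pcm):
        self.to_field.put(pcm)          # captured by the second microphone unit

    def expert_hears(self):
        return self.to_expert.get_nowait()  # played on the second speaker unit

    def field_hears(self):
        return self.to_field.get_nowait()   # played on the first speaker unit
```

The two directions are deliberately independent queues, mirroring the patent's separate transmission/receiving units for each voice.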

The image data stored in the first memory unit 115 of the first wearable terminal 110, the second memory unit 127 of the second wearable terminal 120, and the management server of the external management unit 130 may be used as evidence regarding an operation, accident, or event.

Embodiments of the present invention can be useful in a variety of situations. For example, in the case of surgery or a patient consultation, medical practitioners can share all relevant conditions they can identify with third-party experts via video and voice in real time, enabling collaboration so that a more accurate examination can be performed or better surgical results obtained.

Also, at a fire-suppression or large-accident scene, the situation can be shared through video and voice in real time even when a relevant third-party expert is not on the spot, making cooperation with that expert possible and allowing the situation to be handled more effectively.

Also, in the case of execution of public duties such as police enforcement, the situation at the scene can be recorded as video and audio and later used as legal evidence.

As described above, the present invention is not limited to the above-described embodiments of the system for real-time relaying and transmitting. It will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention.

100: relay and transmission system 110: first wearable terminal
111: first photographing unit 112: first microphone unit
113: first image data transmission unit 114: first voice data transmission unit
115: first memory unit 116: first battery unit
117: port unit 118: power switch unit
120: second wearable terminal 120a: rim
120b: lens 120c: bridge
120d: nose pad 120e: temple
120f: tip 121: second photographing unit
122: first image data receiving unit 123: projector unit (prism)
124: first voice data receiving unit 125: first speaker unit
126: second image data transmission unit 127: second memory unit
128: second battery unit 129: power connection unit
130: external management unit 131: second microphone unit
132: second voice data transmission unit 133: second image data receiving unit
134: third image data receiving unit 135: display unit
136: second voice data receiving unit 137: second speaker unit

Claims (9)

A first wearable terminal comprising a first photographing unit including a portable endoscope or a portable ultrasonic camera for photographing a first image, the first wearable terminal directly receiving a first voice and transmitting the first image and the first voice to the outside in real time;
A second wearable terminal having an eyeglass structure, which receives the first image from the first wearable terminal in real time and displays the first image through a lens of the eyeglasses, and which receives a second voice from an external management unit in real time and outputs the second voice; And
An external management unit which receives the first image from the first wearable terminal and displays it in real time, receives the first voice from the first wearable terminal and outputs it in real time, directly receives the second voice, and transmits the second voice to the second wearable terminal in real time,
The first wearable terminal comprises:
A first microphone unit for directly receiving the first voice; a first image data transmission unit for transmitting the first image photographed through the first photographing unit to the second wearable terminal and the external management unit in real time; a first voice data transmission unit for transmitting the first voice input through the first microphone unit to the external management unit in real time; a first memory unit for storing the first image; a first battery unit for supplying power to the first wearable terminal; and a port unit for transmitting the first image stored in the first memory unit to the outside and for transmitting external power to the first battery unit, wherein the first wearable terminal has the form of a ring, a watch, or a bracelet,
The second wearable terminal comprises:
A first image data receiving unit for receiving the first image from the first wearable terminal; a projector unit for projecting a beam so that the first image received through the first image data receiving unit is displayed through the lens of the eyeglasses; a first voice data receiving unit for receiving the second voice from the external management unit; a first speaker unit for outputting the second voice received through the first voice data receiving unit; a second photographing unit for photographing a second image while the eyeglasses are worn; and a second image data transmission unit for transmitting the second image photographed through the second photographing unit to the external management unit in real time,
The external management unit,
A second microphone unit for directly receiving the second voice; a second voice data transmission unit for transmitting the second voice input through the second microphone unit to the second wearable terminal in real time; a third image data receiving unit for receiving the first image from the first wearable terminal; a display unit for outputting the first image received through the third image data receiving unit; a second voice data receiving unit for receiving the first voice from the first wearable terminal; a second speaker unit for outputting the first voice received through the second voice data receiving unit; and a second image data receiving unit for receiving the second image from the second wearable terminal.
delete delete delete delete delete The system according to claim 1,
The second wearable terminal comprises:
A second memory unit for storing the second image;
A second battery unit for supplying power to the second wearable terminal; And
Further comprising a power connection unit for transmitting the first image and the second image stored in the second memory unit to the outside and transmitting external power to the second battery unit.
delete delete
KR1020150159913A 2015-01-29 2015-11-13 A system for real time relaying and transmitting KR101906551B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020150014313 2015-01-29
KR20150014313 2015-01-29

Publications (2)

Publication Number Publication Date
KR20160104537A KR20160104537A (en) 2016-09-05
KR101906551B1 true KR101906551B1 (en) 2018-10-12

Family

ID=56711960

Family Applications (3)

Application Number Title Priority Date Filing Date
KR1020150159913A KR101906551B1 (en) 2015-01-29 2015-11-13 A system for real time relaying and transmitting
KR1020150159921A KR20160093529A (en) 2015-01-29 2015-11-13 A wearable device for hearing impairment person
KR1020150159933A KR101765838B1 (en) 2015-01-29 2015-11-13 Wearable device for visual handicap person

Family Applications After (2)

Application Number Title Priority Date Filing Date
KR1020150159921A KR20160093529A (en) 2015-01-29 2015-11-13 A wearable device for hearing impairment person
KR1020150159933A KR101765838B1 (en) 2015-01-29 2015-11-13 Wearable device for visual handicap person

Country Status (1)

Country Link
KR (3) KR101906551B1 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102009448B1 (en) * 2018-06-15 2019-08-09 신한대학교 산학협력단 Apparatus for Providing Information of Things and Driving Method Thereof
KR102173634B1 (en) * 2019-08-21 2020-11-04 가톨릭대학교 산학협력단 System and method for navigation for blind
KR102284744B1 (en) * 2019-08-28 2021-07-30 구본준 Wearable device using stereo camera and infrared sensor for the visually impaired
KR102351584B1 (en) * 2019-12-02 2022-01-17 이우준 System for providing navigation service for visually impaired person
KR102294152B1 (en) * 2019-12-26 2021-08-25 인제대학교 산학협력단 A Eyeglasses for Rehabilitation of Patients with Visual Field Defects
KR102457910B1 (en) * 2020-11-19 2022-10-24 김동건 sunglasses for blind
US20240008112A1 (en) * 2020-11-19 2024-01-04 Shoshi Liliya Kaganovsky Wearable electronic device with switch-enabled communications
JP2022081342A (en) * 2020-11-19 2022-05-31 キヤノン株式会社 Glasses-type information appliance, method for the same, and program
KR102459095B1 (en) * 2021-04-02 2022-10-27 (주)케이아이오티 Gudie appratus and method for blind person having enhanced safety
KR102471586B1 (en) * 2021-04-02 2022-11-28 (주)케이아이오티 Gudie appratus and method for blind person using ultraviolet camera
KR102633725B1 (en) 2022-01-12 2024-02-05 동의대학교 산학협력단 Smart Glass System for the Hearing Impaired and Method for controlling the same

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003109160A (en) * 2001-09-29 2003-04-11 Toshiba Corp Emergency rescue supporting system, portable terminal with emergency rescue function, wireless terminal for receiving emergency rescue information and emergency rescue supporting method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100782103B1 (en) 2006-04-27 2007-12-04 (주)디오컴 Glass type monitor
KR20090105531A (en) 2008-04-03 2009-10-07 슬림디스크 주식회사 The method and divice which tell the recognized document image by camera sensor
WO2013049248A2 (en) 2011-09-26 2013-04-04 Osterhout Group, Inc. Video display modification based on sensor input for a see-through near-to-eye display
US8976085B2 (en) 2012-01-19 2015-03-10 Google Inc. Wearable device with input and output structures
KR20150055262A (en) 2013-11-13 2015-05-21 서원영 Sound visualization display method using mobile device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003109160A (en) * 2001-09-29 2003-04-11 Toshiba Corp Emergency rescue supporting system, portable terminal with emergency rescue function, wireless terminal for receiving emergency rescue information and emergency rescue supporting method

Also Published As

Publication number Publication date
KR20160093530A (en) 2016-08-08
KR101765838B1 (en) 2017-08-10
KR20160093529A (en) 2016-08-08
KR20160104537A (en) 2016-09-05

Similar Documents

Publication Publication Date Title
KR101906551B1 (en) A system for real time relaying and transmitting
US10342428B2 (en) Monitoring pulse transmissions using radar
US9655517B2 (en) Portable eye imaging apparatus
US10838203B2 (en) Adjustable electronic device system with facial mapping
JP2016505889A (en) Communication glasses with safety function zone
US20050275714A1 (en) Eyeglass interface device and security system
KR101321423B1 (en) Medical monitoring system by the smart phone
KR101580559B1 (en) Medical image and information real time interaction transfer and remote assist system
TW581668B (en) Endoscopic device
KR20090105531A (en) The method and divice which tell the recognized document image by camera sensor
CN103619232A (en) Apparatus for capturing image of anterior part of iris and medical monitoring system using smart phone
WO2022007720A1 (en) Wearing detection method for wearable device, apparatus, and electronic device
KR20090036183A (en) The method and divice which tell the recognized document image by camera sensor
KR200480257Y1 (en) Camera apparatus for confirming teeth image based on smartphone
KR20150086477A (en) Magnification loupe with display system
KR20190077639A (en) Vision aids apparatus for the vulnerable group of sight, remote managing apparatus and method for vision aids
US20180345501A1 (en) Systems and methods for establishing telepresence of a remote user
KR20150018973A (en) Medical monitoring system by the smart phone
CN104706422A (en) Head-worn type medical device, medical system and operation method of medical system
US11037519B2 (en) Display device having display based on detection value, program, and method of controlling device
KR102236358B1 (en) Systems and methods for protecting social vulnerable groups
KR20130131511A (en) Guide apparatus for blind person
CN107848125A (en) Robot and robot system
US20110124974A1 (en) Mobile medical communications system
WO2019119022A1 (en) Augmented visual assistance system for assisting a person working at a remote workplace, method and headwear for use therewith

Legal Events

Date Code Title Description
A201 Request for examination
A302 Request for accelerated examination
E902 Notification of reason for refusal
AMND Amendment
E601 Decision to refuse application
AMND Amendment
J201 Request for trial against refusal decision
J301 Trial decision

Free format text: TRIAL NUMBER: 2016101005139; TRIAL DECISION FOR APPEAL AGAINST DECISION TO DECLINE REFUSAL REQUESTED 20160831

Effective date: 20180809

S901 Examination by remand of revocation
GRNO Decision to grant (after opposition)