WO2017188665A1 - Field worker wearable device, virtual reality device, and remote control system using the 360-degree camera of the field worker wearable device and the virtual reality device - Google Patents

Field worker wearable device, virtual reality device, and remote control system using the 360-degree camera of the field worker wearable device and the virtual reality device

Info

Publication number
WO2017188665A1
Authority
WO
WIPO (PCT)
Prior art keywords
wearer
virtual reality
field
site
reality device
Prior art date
Application number
PCT/KR2017/004279
Other languages
English (en)
Korean (ko)
Inventor
엄정한
Original Assignee
주식회사 지에스아이엘
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020160050755A external-priority patent/KR101885355B1/ko
Priority claimed from KR1020160050750A external-priority patent/KR101898893B1/ko
Application filed by 주식회사 지에스아이엘
Publication of WO2017188665A1 publication Critical patent/WO2017188665A1/fr

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 - Image signal generators
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 - Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 - Services
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 - Image reproducers

Definitions

  • Various embodiments of the present invention relate to a remote control system for managing a work site, and to the field wearer device and virtual reality device that constitute it. More specifically, they relate to a system and apparatus in which information related to the site where the field wearer is located is transmitted to a server or a virtual reality device, so that the site can be analyzed and directed remotely.
  • In general, a plurality of people at a work site collaborate on or share a given task, and a manager assigned to them manages the work site through work instructions.
  • Site personnel wear equipment or attire such as hard hats, firefighting gear, military uniforms, or police uniforms, and this equipment or attire serves to protect the personnel from the hazards of the site or to identify them.
  • For example, a manager away from a fire site must accurately grasp the situation at the fire site in order to direct the workers (e.g., firefighters) there.
  • Likewise, the manager must accurately determine the current condition of personnel at the fire site in order to decide the right time for rescue.
  • a remote manager can generally use a mobile phone or a radio to receive information about the surrounding conditions of the site, the physical condition of the patient, and the like.
  • However, these methods are difficult to use in emergencies such as the fire or distress situations described above, and there is the problem that the manager cannot accurately convey work-related instruction information to remote personnel.
  • Therefore, a worker in such an environment needs a means of accurately transmitting his or her situation or current status to a remote manager as an image, and the remote manager needs a means of accurately identifying the conditions of the site where the worker is located.
  • The present invention has been made to solve the above problems. Its purpose is to provide a remote control system in which the field wearer captures a panoramic image of the site, such as a 360-degree image, in real time and provides it to a remote virtual reality device so that the user of the virtual reality device (e.g., a manager) can analyze the site where the field wearer is located more accurately, together with the field wearer device and the virtual reality device constituting the remote control system.
  • The objects of the present invention are not limited to the above and may be variously derived from the following description.
  • A field wearer device according to various embodiments of the present invention includes: a housing wearable on the head of the field wearer;
  • an LTE communication module disposed in the housing to relay communication with a virtual reality device capable of remotely observing the situation of the field wearer;
  • a 360-degree camera module disposed in the housing and configured to capture a 360-degree image of the site;
  • a PTT module disposed in the housing to collect the voice of the field wearer or sounds of the site and to deliver the voice of the user of the virtual reality device;
  • and a controller configured to transmit the captured 360-degree image and the collected voice to a server through the LTE communication module.
  • According to various embodiments of the present invention, the field wearer device captures a 360-degree image of the site, collects the sound associated with the field wearer or the site through a PTT module, and transmits the 360-degree image and the voice to the virtual reality device using LTE communication. As a result, the user of the virtual reality device can accurately recognize the conditions of the site and can provide accurate and specific site indication information to the field wearer.
  • In addition, the field wearer device is configured as a wearable device in the form of a helmet or band worn on the wearer's head, so the field wearer can capture a 360-degree image of the site without using his or her hands.
  • Furthermore, the server or the virtual reality device automatically sets a safety status grade by assigning variables or weights to the site information and the biometric information, so that an emergency situation of the field wearer can be detected in real time.
  • In addition, a translation module in the field wearer device, the server, or the virtual reality device according to various embodiments of the present disclosure translates the voice collected by the PTT module and provides the translated voice to the server or the virtual reality device, so that the field wearer and the user of the virtual reality device 300 can communicate effectively even when they use different languages.
  • Since the user of the virtual reality device receives a 360-degree image of the site from the field wearer device, the user can accurately recognize the conditions of the site, and the PTT module and the LTE communication module make it possible to provide detailed site indication information to the worker in the field.
  • the emergency situation of the on-site wearer can be detected in real time.
  • In addition, the voice signal collected through the PTT module is translated by the translation module and the translated voice signal is provided to the field wearer, so that the field wearer and the user of the virtual reality device can communicate effectively even if they use different languages.
  • FIG. 1 is a block diagram of a remote control system according to various embodiments of the present invention.
  • FIG. 2 is an exemplary view of a field wearer device according to various embodiments of the present disclosure.
  • FIG. 3 is a configuration diagram of a field wearer device according to various embodiments of the present disclosure.
  • FIG. 4 is a flowchart illustrating an operation of capturing a 360-degree image in the field wearer device of the remote control system according to various embodiments of the present disclosure and receiving field indication information generated based on the same.
  • FIG. 5 is a flowchart illustrating an operation of setting a safety status level of a user in a field wearer device and generating an emergency signal based on the same according to various embodiments of the present disclosure.
  • FIG. 6 is a configuration diagram of a server according to various embodiments of the present disclosure.
  • FIG. 7 is a flowchart illustrating an operation of mediating communication between a field wearer device and a virtual reality device by processing information related to a site in a server according to various embodiments of the present disclosure.
  • FIG. 8 is a flowchart illustrating an operation of setting a safety level of a user in a server and generating an emergency signal based on the same according to various embodiments of the present disclosure.
  • FIG. 9 is an exemplary view illustrating a virtual reality device according to various embodiments of the present disclosure.
  • FIG. 10 is a configuration diagram of a virtual reality device according to various embodiments of the present disclosure.
  • FIG. 11 is a flowchart illustrating an operation of generating field indication information in a virtual reality device according to various embodiments of the present disclosure.
  • FIG. 12 is a flowchart illustrating an operation of setting a safety level of a user in a virtual reality device and generating an emergency signal based on the same according to various embodiments of the present disclosure.
  • A field wearer device according to various embodiments of the present invention includes: a housing wearable on the head of the field wearer;
  • an LTE communication module disposed in the housing to relay communication with a virtual reality device capable of remotely observing the situation of the field wearer;
  • a 360-degree camera module disposed in the housing and configured to capture a 360-degree image of the site;
  • a PTT module disposed in the housing to collect the voice of the field wearer or sounds of the site and to deliver the voice of the user of the virtual reality device;
  • and a controller configured to transmit the captured 360-degree image and the collected voice to a server through the LTE communication module.
  • According to various embodiments, the controller receives, from the server through the LTE communication module, site indication information generated by the virtual reality device, and outputs the received site indication information to the field wearer through the PTT module.
  • the on-site wearer device may further include a sensor module for collecting at least one sensor information related to the site, and the controller may transmit the sensor information to the server through the LTE communication module.
  • the scene indication information may be generated in the virtual reality device based on at least one of the 360 degree image, the voice, and the sensor information.
  • the field wearer device may be configured as a wearable device in a helmet or band form that may be worn on the head of the field wearer.
  • According to various embodiments, the field wearer device includes a push-to-talk (PTT) button on at least a portion of the exterior of the housing, and the controller activates the PTT module when the PTT button is pressed.
  • The collected voice may then be transmitted to the server or the virtual reality device through the LTE communication module.
  • According to various embodiments, the sensor information includes site information collected at the site where the field wearer is located and biometric information indicating the state of the field wearer. The controller calculates a first variable from the site information and a second variable from the biometric information, sets a safety status grade of the field wearer based on at least one of the first variable and the second variable, and, when the safety status grade exceeds a preset first threshold value, may transmit an emergency signal to the server through the LTE communication module.
  • According to various embodiments, the field wearer device may further include a notification unit, and the controller may output the emergency signal to the field wearer through the notification unit when the safety status grade exceeds a second threshold value higher than the first threshold value.
  • According to various embodiments, the controller calculates the safety status grade by assigning a weight to the first variable and a weight to the second variable, and the weight of the second variable may have a higher value than the weight of the first variable.
  • According to various embodiments, when the safety status grade exceeds the first threshold value or the second threshold value, the controller may assign an identifier or tag information that distinguishes the portion of the 360-degree image captured during the time the grade exceeds the threshold from the rest of the captured 360-degree image, and may transmit the 360-degree image to which the identifier or tag information is assigned to the server.
  • According to various embodiments, the field wearer device may further include a translation module, and the translation module may translate the voice collected by the PTT module into the language preset for the virtual reality device and transmit it to the server.
  • A remote control system according to various embodiments of the present invention is a remote control system using a 360-degree camera and a virtual reality device, and includes a field wearer device, a virtual reality device for remotely instructing the field wearer, and a server for relaying communication between the field wearer device and the virtual reality device.
  • The field wearer device comprises: a housing wearable on the head of the field wearer; an LTE communication module disposed in the housing to relay communication with the virtual reality device capable of remotely observing the situation of the field wearer;
  • a 360-degree camera module disposed in the housing and configured to capture a 360-degree image of the site;
  • a PTT module disposed in the housing to collect the voice of the field wearer or sounds of the site and to deliver the voice of the user of the virtual reality device;
  • and a controller configured to transmit the captured 360-degree image and the collected voice to the server through the LTE communication module. The server receives the 360-degree image and the voice from the field wearer device and relays the received information to the virtual reality device.
  • A virtual reality device according to various embodiments of the present invention is a virtual reality device for remotely controlling a field wearer device used at a site, and includes: a housing wearable by the user of the virtual reality device;
  • an LTE communication module disposed in the housing to relay communication with the field wearer device at the site;
  • a display module disposed in the housing and outputting a virtual space supported by the virtual reality device;
  • a push-to-talk (PTT) module disposed in the housing to collect a voice signal of the user of the virtual reality device;
  • and a controller which generates site indication information by collecting the voice signal of the user of the virtual reality device on the basis of the 360-degree image, and transmits the generated site indication information to the field wearer device or the server through the LTE communication module.
  • According to various embodiments, the controller may further receive, through the LTE communication module, the voice signal collected by the field wearer device and the sensor information detected by the field wearer device in relation to the site, and may output the voice signal or the sensor information through the display module or the PTT module.
  • According to various embodiments, the virtual reality device includes a PTT button on at least a portion of the exterior of the housing, and the controller activates the PTT module when the PTT button is pressed, so that voice signals may be exchanged between the field wearer device and the virtual reality device.
  • According to various embodiments, the sensor information includes site information collected at the site where the field wearer device is located and biometric information indicating the state of the field wearer. The controller calculates a first variable from the site information and a second variable from the biometric information, sets a safety status grade of the field wearer based on at least one of the first variable and the second variable, and, when the safety status grade exceeds a preset first threshold value, may transmit an emergency signal to the server through the LTE communication module.
  • According to various embodiments, when the safety status grade exceeds a second threshold value higher than the first threshold value, the controller may transmit site indication information stored in advance for that safety status grade to the field wearer device or the server.
  • According to various embodiments, the controller sets the safety status grade by assigning a weight to the first variable and a weight to the second variable, and the weight of the second variable may have a higher value than the weight of the first variable.
  • According to various embodiments, when the safety status grade exceeds the first threshold value or the second threshold value, the controller may assign an identifier or tag information that distinguishes the portion of the 360-degree image corresponding to the time the grade exceeded the threshold from the rest of the captured 360-degree image, and may output the 360-degree image to which the identifier or tag information is assigned through the display module.
  • According to various embodiments, the controller may translate a voice signal received from the field wearer device into the language preset for the virtual reality device and output it, and may translate the generated site indication information into the language preset for the field wearer device and transmit it to the server.
  • A remote control system according to various embodiments of the present invention is a remote control system using a 360-degree camera and a virtual reality device, and includes a field wearer device, a virtual reality device capable of remotely observing the situation of the field wearer, and a server for mediating between the field wearer device and the virtual reality device. The field wearer device includes the 360-degree camera and an LTE communication module, and the virtual reality device includes: a housing wearable by the user of the virtual reality device;
  • an LTE communication module disposed in the housing to relay communication with the field wearer device at the site;
  • a display module disposed in the housing and outputting a virtual space supported by the virtual reality device;
  • a push-to-talk (PTT) module disposed in the housing to collect a voice signal of the user of the virtual reality device;
  • and a controller which generates site indication information by collecting the voice signal of the user of the virtual reality device on the basis of the 360-degree image, and transmits the generated site indication information to the field wearer device or the server through the LTE communication module.
  • The "site" referred to in this document may include various sites such as fire sites, construction sites, nuclear processing sites, chemical processing sites, tunnel construction sites, military training sites, and sites where police, firefighters, or military personnel serve.
  • “user” referred to in this document may mean a person (eg, an operator) of the site or a person (eg, an administrator) who remotely monitors or analyzes the site.
  • the worker may be referred to hereinafter as “site wearer” and the administrator may hereinafter be referred to as “user of a virtual reality device”.
  • the "site-related information" referred to in this document is at least one of the 360-degree image, audio signal and sensor information collected by the field wearer device 100 in the field, and the field instruction information generated by the virtual reality device 300. It may mean.
  • FIG. 1 is a block diagram of a remote control system 10 according to various embodiments of the present invention.
  • the remote control system 10 may include a field wearer device 100, a server 200, a virtual reality device 300, and a network 400.
  • In the remote control system 10, the field wearer captures a 360-degree image through the field wearer device 100 and transmits it to the server 200 or the virtual reality device 300, and the virtual reality device 300 provides the field wearer device 100 with site indication information generated based on that 360-degree image.
  • the field wearer device 100, the server 200, and the virtual reality device 300 will be described in detail with reference to the following drawings.
  • the network 400 may be a telecommunications network.
  • The communication network may include at least one of a computer network, the Internet, the Internet of Things, a mobile network, and a telephone network, but is not limited thereto.
  • FIG. 2 is an exemplary view of a field wearer device 100 according to various embodiments of the present disclosure.
  • the wearer device 100 may be configured as a wearable device that may be worn on the wearer's head.
  • the field wearer device 100 may be connected to various types of helmets 100_S and configured in a form suitable for site characteristics.
  • the field wearer device 100 of FIG. 2 is an exemplary form, and may be modified and used in various forms according to a user or purpose in the field.
  • the field wearer device 100 may be implemented integrally with the helmet 100_S or the hair band, but is not limited thereto.
  • the field wearer device 100 may include a housing 100_1.
  • the housing 100_1 is formed to be worn on the head of the field wearer, and may include respective components of the field wearer device 100.
  • According to various embodiments, the field wearer device 100 may include a voice button 101, a power button 102, a camera module 103, a light emitting unit 104, and a microphone 105 in at least a partial region. In this case, the field wearer device 100 may further include at least some components of FIG. 3 in an area not shown in FIG. 2, such as the inside or the back.
  • the field wearer may wear the field wearer device 100 as shown in FIG. 2 on the head and perform a field work. For example, in response to the field wearer pressing the power button 102 of the field wearer device 100, the field wearer device 100 may activate each configuration of the field wearer device 100.
  • The microphone 105 of the field wearer device 100 may collect a voice signal from the field wearer (e.g., the voice of the field wearer) or a sound signal of the site (e.g., ambient sounds).
  • the camera module 103 may be a 360 degree camera module capable of photographing a 360 degree scene.
  • The camera module 103 may include at least one camera, and each camera may be configured to photograph one of at least one direction spaced apart by a predetermined angle.
  • The camera module 103 may combine (e.g., stitch) the images photographed by its cameras and post-process the combined image to generate one 360-degree image.
  • To this end, the camera module 103 may include at least one processor capable of performing the combining and post-processing operations, but is not limited thereto.
  • Alternatively, the camera module 103 may transmit the collected images to a controller (e.g., the processor 110), and the 360-degree image may be generated by the controller.
  • For example, the camera module 103 may be located at the center of the upper end of the field wearer device 100, but is not limited to the form shown in FIG. 2.
  • the camera module 103 may be configured to include a first camera located in the front portion of the field wearer device 100 and a second camera located in the rear portion of the field wearer device 100 to capture a 360 degree image.
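  • As a minimal sketch of the combining (stitching) step described above, the following assumes the first and second camera frames are merged with OpenCV's high-level stitcher; the patent does not name a particular library or file layout, so those details are illustrative only.

```python
# Illustrative sketch only: the patent does not specify a stitching library.
# OpenCV's high-level Stitcher is used here as one possible implementation.
import cv2

def stitch_360_frame(image_paths):
    """Combine frames captured by the first and second cameras into one panorama."""
    images = [cv2.imread(p) for p in image_paths]
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(images)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return panorama

# Hypothetical usage with one frame from each camera:
# pano = stitch_360_frame(["front_cam.jpg", "rear_cam.jpg"])
# cv2.imwrite("site_360.jpg", pano)
```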
  • the field wearer device 100 may distinguish and display at least one of a communication state, a power state, and an emergency state through the light emitting unit 104.
  • The light emitting unit 104 may include at least one LED device, and LEDs of a predetermined color or number may be lit according to each state.
  • FIG. 3 is a detailed configuration diagram of a field wearer device 100 according to various embodiments of the present disclosure.
  • The field wearer device 100 may include one or more processors 110 (e.g., a CPU, AP, MCU, or DSP), a communication module 120, a memory 130, a sensor module 140, an input device 150, a display 160, an interface 170, an audio module 180, a camera module 191, a power management module 195, a battery 196, an indicator 197, and a motor 198.
  • the processor 110 may control, for example, a plurality of hardware or software components connected to the processor 110 by driving an operating system or an application program, and may perform various data processing and operations.
  • the processor 110 may be implemented with, for example, a system on chip (SoC).
  • The communication module 120 may include, for example, a cellular module 121, a WiFi module 123, a BT module 125, a GPS module 127, an NFC module 128, and an RF (radio frequency) module 129.
  • the communication module 120 may be an LTE communication module.
  • the communication module 120 or the cellular module 121 may support at least one communication scheme of LTE, LTE-A, CDMA, WCDMA, UMTS, WiBro, and GSM.
  • the server 200 or the remote control system 10 may include a base station server that can support LTE communication.
  • the LTE communication module may provide a voice signal collected by the audio module 180 to the server 200 or the virtual reality device 300.
  • For example, the field wearer may be connected to the manager in a real-time call through the cellular module 121 to receive site indication information.
  • In addition, the field wearer device may communicate with the field wearer device of an adjacent field wearer (e.g., another worker) in real time or periodically through at least one of the WiFi module 123, the BT module 125, the GPS module 127, or the NFC module 128.
  • the memory 130 may include an internal memory 132 or an external memory 134.
  • the internal memory 132 may include, for example, at least one of a volatile memory, a nonvolatile memory, a hard drive, or a solid state drive (SSD).
  • The external memory 134 may include, for example, compact flash (CF), secure digital (SD), micro secure digital (Micro-SD), mini secure digital (Mini-SD), extreme digital (xD), a memory stick, or the like.
  • the external memory 134 may be functionally and / or physically connected to the field wearer device 100 through various interfaces.
  • the memory 130 may store a 360 degree image, an audio signal, and sensor information collected through each configuration of the field wearer device 100.
  • the memory 130 may store the site-related information and the field wearer's information under the control of the processor 110 in relation to the voice signal and the sensor information.
  • For example, the processor 110 may distinguish between the field wearer's voice signal (e.g., the wearer's voice) and the site's sound signal (e.g., ambient sound of the site) collected through the audio module 180, and the distinguished voice signals may be stored separately in the memory 130. This operation may also be performed in the server 200 or the virtual reality device 300 described later.
  • Similarly, the processor 110 may distinguish between the site information (e.g., site temperature information or wind speed information) and the field wearer's biometric information (e.g., heart rate information of the field wearer) collected through the sensor module 140, and the distinguished site information and biometric information may be stored separately in the memory 130.
  • the sensor module 140 may measure a physical quantity or detect an operation state of the field wearer device 100 to convert the measured or detected information into an electrical signal.
  • The sensor module 140 may include, for example, at least one of a gesture sensor 140A, a gyro sensor 140B, an air pressure sensor 140C, a magnetic sensor 140D, an acceleration sensor 140E, a grip sensor 140F, a proximity sensor 140G, a color sensor 140H (e.g., an RGB (red, green, blue) sensor), a biometric sensor 140I, a temperature/humidity sensor 140J, an illuminance sensor 140K, a 9-axis sensor, or a UV (ultraviolet) sensor 140M.
  • Additionally or alternatively, the sensor module 140 may include, for example, an olfactory sensor, an electromyography sensor, an electroencephalogram sensor, an electrocardiogram sensor, an infrared (IR) sensor, an iris sensor and/or fingerprint sensor, a motion sensor, a nitrogen measurement sensor, a carbon monoxide measurement sensor, and a pressure sensor.
  • The biometric sensor 140I may include a heart rate measuring sensor for measuring the heart rate of the field wearer, a body temperature measuring sensor for measuring the body temperature of the field wearer, a skin condition measuring sensor for measuring the skin condition of the field wearer, and a body oxygen measuring sensor for measuring the amount of oxygen in the field wearer's body.
  • The motion sensor includes one or more sensors such as the gyro sensor 140B, the acceleration sensor 140E, or a geomagnetic sensor, and detects the position or movement of the field wearer device 100 using these sensors.
  • To collect the heart rate information of the field wearer, the heart rate measuring sensor measures the change in optical blood flow caused by the change in blood vessel thickness due to the heartbeat. In this way it can measure the number of heartbeats per unit time and changes in that number, and when measuring the number of heartbeats per unit time it may take the state of the surrounding environment and the state of the measurement target into consideration.
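  • As a rough illustration of the unit-time heartbeat counting described above, the sketch below estimates beats per minute by counting peaks in an optical blood-flow (PPG) waveform; the sampling rate, peak spacing, and prominence used here are assumptions, not values given in the patent.

```python
# Illustrative sketch: estimate heart rate by counting peaks per unit time in an
# optical blood-flow (PPG) signal. All parameters are assumed, not taken from the patent.
import numpy as np
from scipy.signal import find_peaks

def estimate_bpm(ppg_samples, sample_rate_hz=50):
    """Count heartbeats per unit time from a PPG waveform and convert to BPM."""
    samples = np.asarray(ppg_samples, dtype=float)
    # Require peaks to be at least 0.4 s apart (i.e. reject rates above ~150 bpm).
    min_distance = int(0.4 * sample_rate_hz)
    peaks, _ = find_peaks(samples, distance=min_distance,
                          prominence=samples.std() * 0.5)
    duration_min = len(samples) / sample_rate_hz / 60.0
    return len(peaks) / duration_min if duration_min > 0 else 0.0
```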
  • The skin condition measuring sensor may include a skin temperature sensor and a skin resistance sensor. The skin temperature sensor measures not only body temperature but also skin temperature from a resistance value that changes in response to changes in ambient temperature, and, based on a predetermined reference (for example, 36.5 to 37.5 degrees Celsius, the normal skin temperature range), abnormal rises and falls in skin temperature can also be detected. The skin resistance sensor measures the electrical resistance of the skin.
  • The UV sensor 140M is a sensor that detects ultraviolet rays and may, for example, serve as an ultraviolet detection sensor capable of detecting a fire occurring at a fire scene. That is, when smoke generated by a fire blocks the light reaching the UV sensor 140M, the amount of ultraviolet light detectable by the UV sensor 140M changes, and this amount can be measured to detect the fire.
  • In other words, the UV sensor 140M may determine whether a fire has occurred according to the degree to which the light is obscured.
  • In addition, the size of the fire can be detected or estimated, and the smoke saturation per unit area can be calculated based on the size of the fire determined by analyzing or processing the ultraviolet measurements as described above.
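  • One simple way to read the passage above is as a threshold test on the measured ultraviolet level: the more the incoming light is obscured by smoke, the lower the reading. The sketch below illustrates that idea; the baseline value, the detection ratio, and the saturation proxy are hypothetical, since the patent gives no concrete formula.

```python
# Illustrative sketch: infer fire/smoke conditions from a drop in the UV reading.
# The baseline level, detection ratio, and "smoke saturation" proxy are hypothetical.
def assess_uv_reading(uv_level, baseline_level, fire_ratio=0.6):
    """Return (fire_suspected, smoke_saturation_estimate) from one UV measurement."""
    if baseline_level <= 0:
        raise ValueError("baseline_level must be positive")
    obscured_fraction = max(0.0, 1.0 - uv_level / baseline_level)
    fire_suspected = obscured_fraction >= fire_ratio
    # Hypothetical proxy: treat the obscured fraction as the smoke saturation per unit area.
    smoke_saturation = obscured_fraction
    return fire_suspected, smoke_saturation
```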
  • The body oxygen measuring sensor is a sensor that measures the amount of oxygen present in the human body in order to collect the body oxygen level of the field wearer. It determines reference conditions (e.g., body temperature, ambient temperature, heart rate, etc.) in advance and then determines the amount of oxygen in the body in consideration of changes in these various conditions.
  • The carbon monoxide measuring sensor is a sensor for measuring the concentration of carbon monoxide gas present in the adjacent air.
  • the nitrogen measuring sensor is a sensor for measuring nitrogen oxides present in adjacent air such as nitrogen monoxide (NO), nitrogen dioxide (NO2), dinitrogen trioxide (N2O3) and nitrous oxide (N2O).
  • the input device 150 may include, for example, a touch panel 152 or a key 156.
  • the touch panel 152 may use, for example, at least one of capacitive, resistive, infrared, or ultrasonic methods.
  • the key 156 may include, for example, a physical button, an optical key, or a keypad.
  • The key 156 may be used to implement the voice button 101 or the power button 102 of FIG. 2 described above.
  • Display 160 may include panel 162.
  • the panel 162 may be implemented to be, for example, flexible, transparent, or wearable.
  • the panel 162 may be configured as one module together with the above-described touch panel 152.
  • According to various embodiments, the field wearer device 100 may further include the display 160 and may visually output the image information (e.g., the 360-degree image), the voice signal, the sensor information, or an emergency signal collected or generated by the field wearer device 100.
  • The interface 170 may include, for example, a high-definition multimedia interface (HDMI) 172, a universal serial bus (USB) 174, an optical interface 176, or a D-subminiature (D-sub) 178. According to various embodiments of the present disclosure, the field wearer device 100 may export collected site-related information or received site indication information to an external device through the interface 170.
  • the audio module 180 may bidirectionally convert, for example, a sound and an electrical signal. According to various embodiments, the audio module 180 may process voice information input or output through the speaker 182, the receiver 184, the earphone 186, the microphone 188, or the like.
  • According to various embodiments, the audio module 180 may be physically or electrically connected to the voice button 101 (e.g., a PTT button) of FIG. 2 described above, and may collect the voice signal through the microphone 188 in response to an input of the voice button.
  • This microphone 188 may be of the same or similar configuration as the microphone 105 of FIG. 2 described above.
  • the camera module 191 is, for example, a device capable of capturing still images and moving images.
  • According to an embodiment, the camera module 191 may include one or more image sensors (e.g., a front sensor or a rear sensor), a lens, an image signal processor (ISP), or a flash (e.g., an LED or xenon lamp).
  • the camera module 191 may include the same function and configuration as the camera module 103 of FIG. 2 described above.
  • the camera module 191 may be the 360 degree camera module described above.
  • the power management module 195 may manage power of the field wearer device 100.
  • The power management module 195 may include a power management integrated circuit (PMIC), a charger integrated circuit (IC), and a battery 196 or a battery or fuel gauge.
  • According to various embodiments, the power management module 195 may be physically or electrically connected to the power button 102 of FIG. 2 described above, and may turn the power supplied from the battery on or off according to the input of the power button 102.
  • The indicator 197 may display a specific state of the field wearer device 100 or a part thereof (e.g., the processor 110), for example a boot (power) state, a communication state, a charge state, or an emergency state. According to various embodiments, the indicator 197 may be the same as or similar to the light emitting unit 104 of FIG. 2, and may include at least some components of the light emitting unit 104.
  • the motor 198 may convert an electrical signal into mechanical vibration, and may generate a vibration or haptic effect. According to various embodiments, the motor 198 may be functionally classified into some components of the alarm unit. In this case, the motor 198 may notify the field wearer of the emergency signal described later by vibration.
  • the field wearer device 100 may include a PTT module.
  • the PTT module may be disposed in the housing 100_1 to collect a voice signal related to the field wearer or the field.
  • the PTT module may include the aforementioned PTT button and the audio module 180.
  • the PTT module may be configured as one module including at least one of the PTT button, the audio module 180, the electric circuit, and a processor for controlling each component of the PTT module.
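  • The PTT behavior described above (a button press activates the module, the microphone collects the voice, and the audio is handed to the LTE link) can be sketched as follows. Everything here other than the control flow is a hypothetical stub, since the patent does not define a software interface for these components.

```python
# Illustrative sketch of the PTT flow: button press -> record -> hand off to the LTE link.
# `lte_link` and `record_audio` are hypothetical stand-ins for the device's actual
# communication and audio components.
class PttModule:
    def __init__(self, lte_link, record_audio):
        self.lte_link = lte_link          # e.g. a wrapper around the LTE communication module
        self.record_audio = record_audio  # e.g. a callable returning raw audio bytes
        self.active = False

    def on_button_down(self):
        """PTT button pressed: activate the module and start collecting voice."""
        self.active = True

    def on_button_up(self):
        """PTT button released: stop collecting and send the voice toward the server."""
        if not self.active:
            return
        self.active = False
        voice_bytes = self.record_audio()
        self.lte_link.send("voice", voice_bytes)
```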
  • the wearer device 100 may further include a translation module (not shown).
  • the translation module may translate, change, or convert a voice signal collected from the PTT module of the field wearer device 100 into a preset language format according to the user of the virtual reality device 300 or the virtual reality device 300.
  • the translation module may translate, change, or convert the field instruction information received from the virtual reality device 300 into a preset language according to the field wearer or the field wearer device 100.
  • the translation module may be configured in the server 200 or the virtual reality device 300 in addition to the field wearer device 100.
  • FIG. 4 is a flowchart illustrating an operation of capturing a 360-degree image in the field wearer device 100 of the remote control system 10 according to various embodiments of the present disclosure and receiving field indication information generated based on the same.
  • the field wearer device 100 can take a 360-degree image associated with the site.
  • the field wearer may press the power button 102 of the field wearer device 100 to activate the camera modules 103 and 191, and photograph the field image in a 360 degree panorama.
  • the field wearer device 100 may transmit the captured 360 degree image to the server 200.
  • the field wearer device 100 may transmit a 360 degree image taken 360 degrees of the surrounding site in real time or periodically to the server 200.
  • the field wearer device 100 may collect at least one voice signal. Specifically, the field wearer device 100 may collect a voice signal collected from the field wearer or a voice signal of the field through the PTT module. For example, when the PTT button provided in at least a portion of the field wearer device 100 is pressed by the wearer, the wearer device 100 may transmit the received voice signal to the server 200.
  • the field wearer device 100 may collect at least one sensor information. Specifically, the field wearer device 100 may collect sensor information such as site information collected at the site where the site wearer is located and biometric information indicating the state of the site wearer.
  • the site information may include at least one of site temperature information, site wind speed information, site carbon monoxide measurement information, and site nitrogen measurement information.
  • site temperature information, site carbon monoxide measurement information, and site nitrogen amount measurement information may be information measured within a predetermined range preset from the field wearer device 100.
  • the biometric information may include at least one of heart rate information of the on-site wearer, temperature information of the on-site wearer, and oxygen content information of the on-site wearer.
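  • For clarity, the two categories of sensor information enumerated above can be pictured as a simple data structure; the field names and units below are illustrative only and are not defined by the patent.

```python
# Illustrative data model for the sensor information described above.
# Field names and units are not defined by the patent.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SiteInfo:
    temperature_c: Optional[float] = None        # site temperature
    wind_speed_ms: Optional[float] = None        # site wind speed
    carbon_monoxide_ppm: Optional[float] = None  # site carbon monoxide measurement
    nitrogen_oxides_ppm: Optional[float] = None  # site nitrogen measurement

@dataclass
class BiometricInfo:
    heart_rate_bpm: Optional[float] = None       # field wearer heart rate
    body_temperature_c: Optional[float] = None   # field wearer body temperature
    blood_oxygen_pct: Optional[float] = None     # field wearer oxygen level

@dataclass
class SensorInfo:
    site: SiteInfo
    biometric: BiometricInfo
```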
  • the field wearer device 100 may transmit the collected voice signal and sensor information to the server 200.
  • the field wearer device 100 may transmit a 360 degree image, an audio signal, and sensor information to the server 200 simultaneously or sequentially.
  • the field wearer device 100 may receive field instruction information generated based on a 360 degree image from the server 200 or the virtual reality device 300.
  • the field indication information may be information based on the 360 degree image.
  • For example, the manager may analyze the 360-degree image received from the field wearer device 100 or the server 200 and input appropriate site indication information into the virtual reality device 300, and the virtual reality device 300 may provide the input site indication information to the field wearer device 100 through the server 200.
  • the site indication information may be a voice signal or a text signal of an administrator, but is not limited thereto.
  • the field wearer device 100 may output field instruction information received from the server 200 or the virtual reality device 300.
  • the field wearer device 100 may output field indication information through the PTT module, the notification unit, the audio module 180, and the display 160. Then, the field wearer can perform the task given to him through the output of the field instruction information.
  • In this way, because the field wearer shares the 360-degree image, voice signal, and sensor information of the site in real time with the manager who is away from the site, the manager can analyze the conditions of the site more accurately, and because the field wearer is instructed remotely through the LTE module on that basis, the site work can be performed efficiently and quickly.
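  • Put together, the device-side flow of FIG. 4 (capture, send, receive instructions, output) could look roughly like the loop below; the objects and their methods are hypothetical placeholders for the modules described above, not an interface defined by the patent.

```python
# Illustrative sketch of the FIG. 4 device-side loop. `camera`, `ptt`, `sensors`,
# `server_link`, and `output` are hypothetical placeholders for the modules above.
import time

def run_field_device(camera, ptt, sensors, server_link, output):
    while True:
        frame_360 = camera.capture_360()             # 360-degree image of the site
        server_link.send("image", frame_360)

        voice = ptt.poll_voice()                     # voice collected via the PTT module
        if voice is not None:
            server_link.send("voice", voice)

        server_link.send("sensors", sensors.read())  # site + biometric information

        instruction = server_link.poll("instruction")
        if instruction is not None:
            output.play(instruction)                 # e.g. via the PTT module or display

        time.sleep(0.1)                              # pacing; real timing is not specified
```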
  • the field wearer device 100 may set a safety state grade of the field wearer based on information collected from the field wearer device 100 to detect a stable or emergency state of the field wearer. Details thereof will be described later with reference to FIG. 5.
  • FIG. 5 is a flowchart illustrating an operation of setting a safety level according to site information and biometric information in the field wearer device 100 according to various embodiments of the present disclosure, and generating an emergency signal based on the same.
  • the contents of FIG. 5 may be performed at any point in the performance of each step of FIG. 4 described above.
  • various embodiments of the safe state class setting and emergency signal generation operation disclosed in FIG. 5 may be performed in the server 200 or the virtual reality device 300 to be described later.
  • the field wearer device 100 may calculate a first variable according to the site information and a second variable according to the biometric information among sensor information.
  • For example, the field wearer device 100 may set the first variable value by counting how many times a site reference value is exceeded or by how much it is exceeded (e.g., the excess temperature).
  • Similarly, the field wearer device 100 may set the second variable value by counting exceedances of the biometric references, such as an excess body temperature or an insufficient oxygen amount.
  • the wearer device 100 may assign a weight of the first variable and a weight of the second variable.
  • Since the biometric information of the field wearer is an indicator more directly related to the wearer's safety status than the site information, the field wearer device 100 may set the weight of the second variable higher than the weight of the first variable.
  • the field wearer device 100 may set a safety state level of the field wearer based on at least one of the calculated first and second variables.
  • The field wearer device 100 may determine whether the safety status grade exceeds a preset first threshold value. If the safety status grade does not exceed the preset threshold (e.g., the first threshold value), the operation of FIG. 5 may end.
  • Otherwise, in step S525, the field wearer device 100 may determine whether the safety status grade exceeds the second threshold value, which is higher than the preset first threshold value.
  • the wearer device 100 may transmit an emergency signal to the server 200.
  • the server 200 may transmit the received emergency signal to the virtual reality device 300 or return the preset site indication information to the field wearer device 100 according to the emergency signal.
  • the site wearer device 100 may output an emergency signal to the site wearer in operation S535.
  • The field wearer device 100 may output the emergency signal to the field wearer using at least one of the light emitting unit 104, the motor 198, the display 160, the audio module 180, or the PTT module.
  • the field wearer device 100 may assign an identifier for a corresponding section in a 360 degree image.
  • Specifically, the field wearer device 100 may assign an identifier or tag information to the 360-degree image for the section (e.g., the time period) during which the safety status grade exceeds the first threshold value or the second threshold value.
  • For example, the field wearer device 100 may process the 360-degree image so that a warning item is displayed on at least a portion of the 360-degree image captured from the time the threshold was exceeded.
  • The field wearer device 100 may transmit the processed 360-degree image to the server 200.
  • However, the present disclosure is not limited thereto, and the field wearer device 100 may distinguish the section exceeding the threshold value within the captured image by assigning metadata such as identifier or tag information to the 360-degree image.
  • The first variable, the second variable, the first weight, the second weight, the first threshold value, the second threshold value, and the identifier or tag information described above are set according to a policy set in advance by the field wearer or the field wearer device 100, may be stored in the memory 130, and may be changed or updated at any time.
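  • A minimal sketch of the grading logic of FIG. 5 follows, assuming a simple weighted sum of the two variables; the weights and thresholds here are placeholders for the policy values that would be stored in the memory 130.

```python
# Illustrative sketch of the FIG. 5 grading logic. Weights and thresholds are
# placeholders for the policy values stored in the memory 130.
W_SITE = 0.3          # weight of the first variable (site information)
W_BIO = 0.7           # weight of the second variable (biometric information); set higher
FIRST_THRESHOLD = 3.0
SECOND_THRESHOLD = 6.0

def safety_grade(site_exceedances, bio_exceedances):
    """Weighted safety status grade computed from the two counted variables."""
    return W_SITE * site_exceedances + W_BIO * bio_exceedances

def evaluate(site_exceedances, bio_exceedances):
    grade = safety_grade(site_exceedances, bio_exceedances)
    actions = []
    if grade > FIRST_THRESHOLD:
        actions.append("send_emergency_to_server")  # and tag the 360-degree image section
    if grade > SECOND_THRESHOLD:
        actions.append("notify_wearer_locally")     # e.g. LED, vibration, audio output
    return grade, actions
```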
  • In this way, the manager who receives the site image can check the safety status of the field wearer intuitively and quickly when analyzing the image, and more appropriate site indication information can be delivered to the field wearer.
  • the server 200 may perform communication with the field wearer device 100 or the virtual reality device 300 to mediate the communication of the site-related information and guide the appropriate site instruction information in case of an emergency.
  • The server 200 may include a communication module 210, a controller 220, an image information processor 221, a voice information processor 223, a sensor information processor 225, an authentication processor 227, a translation processor 229, and a storage unit 230. According to various embodiments of the present disclosure, the server 200 may omit at least some of the components of FIG. 6 or may further include other components, and may include at least some components of the field wearer device 100 of FIG. 3.
  • the communication module 210 may connect communication with the field wearer device 100 or the virtual reality device 300.
  • the communication module 210 may include at least some components of the communication module 120 of the field wearer device 100.
  • the communication module 210 may include at least one of a wired communication module and a wireless communication module (for example, an RF transmission / reception module, a transceiver, etc.).
  • the server 200 may include an LTE communication module.
  • the controller 220 may perform a data processing function of controlling a signal flow and processing data between an overall operation such as power supply control of the server 200 and an internal configuration of the server 200.
  • the controller 220 may include at least one processor.
  • The image information processor 221 may receive the image information (e.g., the 360-degree image) transmitted from the field wearer device 100 and may perform processing operations on that image information. According to various embodiments of the present disclosure, when the safety status grade of the field wearer exceeds a preset threshold (e.g., the first threshold value or the second threshold value), the image information processor 221 may assign metadata such as an identifier or tag information so that the 360-degree image of the exceeded section is distinguished from the rest of the 360-degree image received from the field wearer device 100.
  • The voice information processor 223 may receive the voice information transmitted from the field wearer device 100 and provide it to the virtual reality device 300. According to various embodiments of the present disclosure, the voice information processor 223 may distinguish, within the voice information received from the field wearer device, between the site's sound signal (for example, ambient sound of the site) and the wearer's voice signal (for example, the voice of the wearer), and may store them in the storage unit 230 of the server 200.
  • The sensor information processor 225 may receive sensor information from the field wearer device 100 and provide the received sensor information to the virtual reality device 300. According to various embodiments, the sensor information processor 225 may set the safety status grade of the field wearer based on the site information and the biometric information included in the sensor information, and may generate an emergency signal according to the set safety status grade.
  • the authentication processor 227 may perform an authentication process for a field wearer or an administrator to use the remote management service of the present invention.
  • For example, the authentication processor 227 may receive information related to the field wearer or the manager (for example, ID, password, address, access IP, log information, nationality, language information, etc.) from the field wearer device 100 or the virtual reality device 300 in advance and store it in the personnel information DB 231. The authentication process may then be performed by comparing the authentication information received from the field wearer device 100 or the virtual reality device 300 with the personnel information stored in the DB 231.
  • the authentication processing unit 227 may connect the on-site wearer device 100 or the virtual reality device 300 and the server 200 from when the authentication is completed. If the authentication is not completed, the authentication processing unit 227 may transmit a message indicating that the authentication is not completed to the field wearer device 100 or the virtual reality device 300, or may request reauthentication. In this case, the authentication processing unit 227 may suspend communication between the field wearer device 100 or the virtual reality device 300 and the server until the authentication is completed.
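  • As a sketch of the check just described (compare the received credentials with the pre-registered personnel information, then connect or request re-authentication), one could imagine something like the following; the DB layout and field names are assumptions, not part of the patent.

```python
# Illustrative sketch of the check performed by the authentication processor 227.
# The personnel DB layout and field names are assumptions.
def authenticate(personnel_db, device_id, credentials):
    """Return 'connected' if the credentials match the stored record, else 'reauth_required'."""
    record = personnel_db.get(device_id)
    if record is None:
        return "reauth_required"
    if (credentials.get("id") == record.get("id")
            and credentials.get("password") == record.get("password")):
        return "connected"
    # Communication with the server stays suspended until authentication completes.
    return "reauth_required"

# Hypothetical usage:
# db = {"wearer-01": {"id": "kim", "password": "secret", "language": "ko"}}
# authenticate(db, "wearer-01", {"id": "kim", "password": "secret"})  # -> "connected"
```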
  • the translation processor 229 may translate the received voice signal into a preset language and provide the translated voice signal to the virtual reality device 300. According to various embodiments, the translation processor 229 may check the preset language information according to an administrator, and may translate the voice signal received from the field wearer device 100 based on the confirmed language information. This translation process may be performed using known speech recognition and speech conversion algorithms. In addition, the translation processing unit 229 may refer to the personnel information DB 231 of the storage unit 230 to confirm language information used by an administrator or a field wearer.
  • In addition, the translation processor 229 may translate the site indication information received from the virtual reality device 300 into the language used by the field wearer and then transmit the translated site indication information to the field wearer device 100. Thereby, even if the field wearer and the manager who manages the field wearer use different languages, the site-related information can be communicated effectively.
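  • The translation step can be sketched as: look up each party's preset language in the personnel information DB and translate only when the two differ. The recognition and translation calls below are hypothetical stubs standing in for the known speech recognition and speech conversion algorithms mentioned above.

```python
# Illustrative sketch of the translation processor 229. `speech_to_text`,
# `translate_text`, and `text_to_speech` are hypothetical stubs for the
# speech recognition and conversion algorithms referred to in the text.
def relay_voice(personnel_db, sender_id, receiver_id, voice_bytes,
                speech_to_text, translate_text, text_to_speech):
    src_lang = personnel_db[sender_id]["language"]
    dst_lang = personnel_db[receiver_id]["language"]
    if src_lang == dst_lang:
        return voice_bytes                        # no translation needed
    text = speech_to_text(voice_bytes, language=src_lang)
    translated = translate_text(text, source=src_lang, target=dst_lang)
    return text_to_speech(translated, language=dst_lang)
```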
  • the storage unit 230 may store data received or generated from the controller 220, the server 200, or other components of the remote control system 10.
  • the storage unit 230 may include, for example, a memory, a cash, a buffer, or the like, and may be configured of software, firmware, hardware, or a combination of two or more thereof.
  • the storage unit 230 may include a personnel information DB 231 and an information DB 233.
  • The personnel information DB 231 may include personal information or log information related to the field wearer or the manager, and the information DB 233 may include information generated during the on-site remote management service or information required for the server 200 to perform the service.
  • For example, the information DB 233 may store the first variable, the second variable, the first weight, the second weight, the first threshold value, the second threshold value, the identifier, the tag information, or translated voice signals.
  • Although the image information processor 221, the voice information processor 223, the sensor information processor 225, the authentication processor 227, and the translation processor 229 are illustrated as components separate from the controller 220, they may be configured as a single module together with the controller 220.
  • The functions of the image information processor 221, the voice information processor 223, the sensor information processor 225, the authentication processor 227, the translation processor 229, and the controller 220 may be implemented in the form of routines, instructions, or programs stored in the storage unit 230 (for example, a memory). In addition, routines, instructions, or programs configured to perform the above-described operations may be stored in a computer-readable storage medium.
  • the server 200 may further include a PTT processor and an LTE communication processor.
  • the PTT processing unit may perform a function of providing a voice signal received from the field wearer device 100 to the virtual reality device 300.
  • the LTE communication processor may support LTE communication between the field wearer device 100 and the virtual reality device 300.
  • FIG. 7 is a flowchart illustrating an operation of communicating information associated with a site with the field wearer device 100 or the virtual reality device 300 in the server 200 according to various embodiments of the present disclosure.
  • the server 200 may receive site-related information from the field wearer device (100).
  • Specifically, the server 200 may receive, through the communication module 210, the 360-degree image photographed by the field wearer device in relation to the site and the voice signal collected by the field wearer device in relation to the site or the field wearer.
  • According to various embodiments, the server 200 may translate the received voice signal. For example, the server 200 checks the language information corresponding to the virtual reality device 300, and if the language of the voice signal received from the field wearer device 100 differs from the checked language information, translates the received voice signal based on that language information.
  • The server 200 may transmit the site-related information, such as the 360-degree image and the voice signal (or the translated voice signal), to the virtual reality device 300. In operation S750, the virtual reality device 300 may generate site indication information based on the site-related information (e.g., the 360-degree image and the voice signal) and transmit it to the server 200.
  • the server 200 may transmit the received field indication information to the field wearer device 100.
  • The server 200 may receive sensor information detected by the field wearer device 100 in relation to the site, and transmit the received sensor information to the virtual reality device 300.
  • the server 200 may also set the safety state grade based on the sensor information. Details thereof will be described later with reference to FIG. 8.
  • the server 200 may perform an authentication process for the field wearer device 100 or the virtual reality device 300 to provide a service of the remote field management system 10.
  • FIG. 8 is a flowchart illustrating an operation in which the server 200 sets a safety status level of the field wearer and generates an emergency signal based on the set level, according to various embodiments of the present disclosure. The operations of FIG. 8 may be performed at any point while the steps of FIG. 7 are performed.
  • The server 200 may calculate, from among the sensor information, a first variable based on the site information and a second variable based on the biometric information.
  • The process of calculating the variables from the site information and the biometric information, and of setting the weights for the first variable and the second variable, may be the same as that performed by the field wearer device 100 in FIG. 5.
  • the server 200 may set a safety status level of the field wearer based on at least one of the first variable and the second variable.
  • The server 200 may determine whether the safety status level exceeds the first threshold value. If the safety status level does not exceed the preset threshold value (e.g., the first threshold value), the operation of FIG. 8 ends; otherwise, the operation proceeds to step S825, where the server 200 checks whether the safety status level exceeds the second threshold value.
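  • The following is a minimal sketch of how a safety status level could be derived from the two weighted variables and compared against the two thresholds; the weight and threshold values shown are illustrative assumptions, not values from the disclosure.

```python
# Sketch of combining the two variables into a safety status level and checking
# it against two thresholds. All numeric values here are illustrative assumptions.
FIRST_WEIGHT = 0.4      # weight of the first variable (site information)
SECOND_WEIGHT = 0.6     # weight of the second variable (biometric information), set higher
FIRST_THRESHOLD = 0.5
SECOND_THRESHOLD = 0.8

def safety_status_level(first_variable: float, second_variable: float) -> float:
    """Weighted combination of the site-information and biometric variables."""
    return FIRST_WEIGHT * first_variable + SECOND_WEIGHT * second_variable

def check(level: float) -> str:
    if level <= FIRST_THRESHOLD:
        return "normal"      # operation of FIG. 8 ends
    if level <= SECOND_THRESHOLD:
        return "exceeds first threshold"
    return "exceeds second threshold"

print(check(safety_status_level(0.3, 0.9)))   # -> "exceeds first threshold"
```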
  • In this case, the server 200 may transmit an emergency signal to the virtual reality device 300.
  • the server 200 may transmit an emergency signal stored in advance according to the safety state grade to the field wearer device 100.
  • In this case, the emergency signal may be automatically transmitted from the server 200 itself to the field wearer device 100, without the virtual reality device 300 separately providing site instruction information.
  • To this end, the server 200 may store, in the storage unit 230, a table of emergency signals for each safety status level to be automatically transmitted to the field wearer device 100 when a predetermined condition is met.
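  • A per-level emergency-signal table such as the one described above could look like the following sketch; the level names and signal contents are illustrative assumptions.

```python
# Sketch of a per-level table of pre-stored emergency signals. The level names
# and signal contents are illustrative assumptions.
EMERGENCY_SIGNAL_TABLE = {
    "exceeds first threshold":  {"tone": "short_beep", "message": "Report your status to the control room."},
    "exceeds second threshold": {"tone": "siren",      "message": "Stop work and evacuate the area immediately."},
}

def auto_emergency_signal(safety_status: str):
    """Look up the signal to push automatically to the field wearer device 100."""
    return EMERGENCY_SIGNAL_TABLE.get(safety_status)  # None if no signal is defined

print(auto_emergency_signal("exceeds second threshold"))
```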
  • The server 200 may assign an identifier to the corresponding section in the 360-degree image.
  • For example, the server 200 may assign an identifier or tag information to the section (e.g., period of time) of the 360-degree image received from the field wearer device 100 during which the safety status level exceeds the first threshold value or the second threshold value, so that the section can be distinguished from the rest of the 360-degree image.
  • In other words, by applying metadata such as an identifier or tag information to the 360-degree image, the server 200 can distinguish, within the image received from the field wearer device 100, the image corresponding to the section that exceeds a threshold value.
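  • As a simple illustration of this tagging step, the sketch below marks the time samples whose safety status level exceeds a threshold; the tag format and the per-second identifiers are assumptions.

```python
# Sketch of tagging over-threshold sections of the 360-degree stream so they can
# be distinguished later. Timestamps are in seconds; the tag format is an assumption.
from typing import Dict, List, Tuple

def tag_sections(samples: List[Tuple[float, float]], threshold: float) -> List[Dict]:
    """samples: (timestamp, safety_status_level) pairs along the recording."""
    tags = []
    for timestamp, level in samples:
        if level > threshold:
            tags.append({"t": timestamp, "tag": "over_threshold", "id": f"sec-{int(timestamp)}"})
    return tags

print(tag_sections([(0.0, 0.2), (1.0, 0.9), (2.0, 0.95)], threshold=0.8))
```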
  • the virtual reality device 300 may be configured in the form of a virtual reality (VR) device.
  • the virtual reality device 300 may be implemented as a wearable device such as a head-mounted device that can be worn on the head of the administrator.
  • The virtual reality device 300 may be configured either as a VR device used in combination with an electronic device 1 such as a smartphone (e.g., a cradle-type VR device) or as a standalone device that operates without the electronic device 1, and is not limited to a specific form.
  • The virtual reality device 300 may include a housing wearable by a user of the virtual reality device 300.
  • The housing may further include a component such as a band or a frame that guides the user of the virtual reality device 300 to wear it on a portion of the head or face.
  • The housing may be formed to contain the respective components of the virtual reality device 300.
  • The virtual reality device 300 may include, in at least a partial region, a voice button 301 (e.g., a PTT button) that is the same as or similar to the voice button 104 (e.g., a PTT button) of the field wearer device 100.
  • the virtual reality device 300 may include a PTT module.
  • the PTT module may be formed the same as or similar to the PTT module of the field wearer device 100 described above.
  • The PTT module may be disposed in the housing to collect voice signals (e.g., the user's voice, etc.) related to the user of the virtual reality device 300.
  • The manager may wear the virtual reality device 300 shown in FIG. 9 on the head or face and receive, from the server 200, the 360-degree image captured by the field wearer device 100, so that the site to which the field wearer belongs can be viewed in real time or periodically.
  • the administrator may input the field instruction information by voice by touching or pressing the voice button 301, and the virtual reality device 300 may generate the field instruction information corresponding to the input voice.
  • the virtual reality device 300 may transmit the generated field indication information to the server 200 or the field wearer device 100 through the communication module 310 supporting the LTE communication scheme.
  • FIG. 10 is a configuration diagram of a virtual reality device 300 according to various embodiments of the present disclosure.
  • the virtual reality device 300 may include an input module 310, a display module 320, a storage 330, an audio module 340, a communication module 350, and a controller 360.
  • The virtual reality device 300 may include a power module (e.g., a circuit and a battery) that supplies power to the virtual reality device 300, and may further include a frame or band that allows the manager to wear the virtual reality device 300. Additionally or alternatively, the virtual reality device 300 may include at least some components of the field wearer device 100 of FIG. 3 described above.
  • the input module 310 may receive numeric or text information and may include a plurality of input keys and function keys for setting various functions. According to various embodiments, the input module 310 may include a voice button 301 (eg, a PTT button). In addition, the input module 310 may be implemented as a touch screen.
  • the display module 320 may output various screens (eg, a 360 degree image screen output at 360 degrees in a virtual space) generated according to the function operation of the virtual reality device 300.
  • The display module 320 may be implemented as a display panel or in a touch screen manner.
  • the display module 320 may provide a virtual output screen supported by the VR device.
  • The virtual space and the image in the virtual space may be output in the form of a panorama on the display module 320 of the virtual reality device 300, which is exposed to the manager's eyes.
  • the 360-degree image may be continuously output in the virtual space according to the rotated position.
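  • A minimal sketch of how the visible portion of an equirectangular 360-degree frame could follow the rotated position is shown below; the frame width, field of view, and column-based model are illustrative assumptions rather than the actual rendering pipeline.

```python
# Sketch of a viewport that follows the manager's head rotation over an
# equirectangular 360-degree frame, modelled simply as indexed pixel columns.
def viewport_columns(frame_width: int, yaw_degrees: float, fov_degrees: float = 90.0):
    """Return the column indices of the frame that fall inside the current view."""
    center = int((yaw_degrees % 360.0) / 360.0 * frame_width)
    half = int(fov_degrees / 360.0 * frame_width) // 2
    return [(center + offset) % frame_width for offset in range(-half, half)]

# Rotating the head by 90 degrees shifts the visible columns by a quarter of the frame.
print(viewport_columns(frame_width=3840, yaw_degrees=0.0)[:3])    # [3360, 3361, 3362]
print(viewport_columns(frame_width=3840, yaw_degrees=90.0)[:3])   # [480, 481, 482]
```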
  • the storage unit 330 may store data received or generated from the controller 360, the virtual reality device 300, or other components of the remote control system 10.
  • The storage unit 330 may include, for example, a memory, a cache, a buffer, and the like.
  • the storage unit 330 may include a remote management application 331.
  • the remote management application 331 may be an application that can be downloaded through an open market such as an app store, but is not limited thereto.
  • The remote management application 331 may be implemented in the form of a built-in application embedded in the virtual reality device 300, or may be loaded into the virtual reality device 300 from an electronic device such as a smartphone and used.
  • The remote management application 331 may be loaded in response to a call from the controller 360 or an application processor (AP, not shown) to perform functions related to on-site remote management through the respective components of the virtual reality device 300.
  • The remote management application 331 may be implemented as a routine, an instruction, or a program, and such a program may be stored in a computer-readable storage medium.
  • the storage unit 330 may include an information DB 335.
  • The information DB 335 may store, for example, a 360-degree image, a voice signal, biometric information, variables, weights, threshold values, or field indication information according to the safety status level, but is not limited thereto.
  • the audio module 340 may be configured to process various audio signals generated in a field remote management operation process of the virtual reality device 300.
  • the PTT module may include the aforementioned PTT button and the audio module 340.
  • the PTT module may be configured as one module including at least one of the PTT button, the audio module 340, an electric circuit, and a processor for controlling each component of the PTT module.
  • The communication module 350 may connect communication between the virtual reality device 300 and an external device (e.g., the server 200 or the field wearer device 100).
  • The communication module 350 may be connected to a network through wireless or wired communication to communicate with the external device.
  • the communication module 350 of the virtual reality device 300 may include at least some components of the communication module 120 of the field wearer device 100.
  • The communication module 350 of the virtual reality device 300 may be the same as the LTE communication module of the field wearer device 100 or the server 200, and the LTE communication module may support at least one communication scheme among LTE, LTE-A, CDMA, WCDMA, UMTS, WiBro, and GSM.
  • The controller 360 may control overall operations of the virtual reality device 300, such as power supply control, and may perform a data processing function of controlling the signal flow between the internal components of the virtual reality device 300 and processing data.
  • the controller 360 may include at least one processor.
  • the virtual reality device 300 may further include a translation module (not shown).
  • The translation module may translate, change, or convert a voice signal collected by the PTT module of the virtual reality device 300 into a language format preset according to the virtual reality device 300 or its user.
  • The translation module may also translate, change, or convert a voice signal received from the field wearer device 100 or the server 200 into the language preset according to the virtual reality device 300 or its user.
  • FIG. 11 is a flowchart illustrating an operation of generating site indication information in the virtual reality device 300 according to various embodiments of the present disclosure.
  • the virtual reality device 300 may receive a 360 degree image from the server 200.
  • The virtual reality device 300 may receive, from the server 200 or the field wearer device 100, a 360-degree image captured by the field wearer device in relation to the site.
  • the virtual reality device 300 may output a 360 degree image on the virtual space through the display module 320.
  • The virtual reality device 300 may generate site indication information based on the manager's input with respect to the output 360-degree image.
  • the manager may wear the virtual reality device 300 and check a 360 degree image of the scene photographed by the field wearer device 100 in a virtual space.
  • the manager may analyze the site and input voice, text, or video to the virtual reality device 300, and the virtual reality device 300 may generate site indication information based on the manager's input.
  • The field instruction information may be the manager's input information itself, or information obtained by the virtual reality device 300 processing the manager's input (e.g., information processed by voice noise removal, image preprocessing, or the like).
  • the virtual reality device 300 may transmit the generated field instruction information to the field wearer device 100 or the server 200.
  • For example, when the manager touches or presses the voice button (e.g., a PTT button), the virtual reality device 300 may receive the voice signal input by the manager and transmit it to the server 200 or the field wearer device 100 through the communication module 350 (e.g., an LTE communication module).
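  • The following sketch illustrates how the manager's voice input could be wrapped into site indication information and handed to the communication module; the payload fields and the send_over_lte() placeholder are assumptions, not part of the disclosure.

```python
# Sketch of wrapping the manager's voice input into site indication information
# and handing it to the communication module. Payload fields and the sender are assumptions.
import json
import time

def build_site_indication(voice_text: str, target_device_id: str) -> str:
    """Package the manager's input as a site-indication payload."""
    return json.dumps({
        "type": "site_indication",
        "target": target_device_id,     # e.g. the field wearer device
        "created_at": time.time(),
        "voice_text": voice_text,
    })

def send_over_lte(payload: str) -> None:
    # Placeholder for the LTE communication module of the virtual reality device.
    print("sending:", payload)

send_over_lte(build_site_indication("Move away from the crane boom", "field-wearer-100"))
```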
  • the virtual reality device 300 may receive a voice signal or sensor information collected from the field wearer device 100 from the server 200 in relation to the site.
  • the virtual reality device 300 may further output at least one of the received voice signal and the sensor information, and the manager may input information for generating the field indication information based on the output information.
  • The virtual reality device 300 may also set the safety status level of the field wearer based on the information collected by the virtual reality device 300, in order to detect whether the field wearer is in a stable or emergency state. Details thereof will be described later with reference to FIG. 12.
  • FIG. 12 is a flowchart illustrating an operation of setting a safety status level of a field wearer in the virtual reality device 300 according to various embodiments of the present disclosure, and generating an emergency signal based on the safety status level.
  • The operations of FIG. 12 may be performed at any point while the above-described steps of FIG. 11 are performed, and descriptions overlapping those of FIGS. 5 and 8 are omitted.
  • the virtual reality device 300 may calculate a first variable according to the site information and a second variable according to the biometric information among sensor information.
  • the virtual reality device 300 may set a safety state level of the field wearer based on at least one of the first variable and the second variable.
  • The virtual reality device 300 may assign a weight to the first variable and a weight to the second variable. In addition, the virtual reality device 300 may set the weight of the second variable to a higher value than the weight of the first variable.
  • The virtual reality device 300 may check whether the safety status level exceeds the first threshold value. If the safety status level does not exceed the preset threshold value (e.g., the first threshold value), the operation of FIG. 12 may end. Otherwise, in step S1225, the virtual reality device 300 may check whether the safety status level exceeds the second threshold value.
  • the virtual reality device 300 may transmit an emergency signal to the server 200 in step S1230.
  • the server 200 may transmit the received emergency signal to the on-site wearer device 100 or transmit preset site indication information to the on-site wearer device 100 according to the emergency signal.
  • In operation S1235, the virtual reality device 300 may transmit an emergency signal stored in advance according to the safety status level to the server 200 or the field wearer device 100.
  • the virtual reality device 300 may assign an identifier for the corresponding section in the 360 degree image.
  • For example, an identifier or tag information may be assigned to the section (e.g., period of time) of the 360-degree image received from the field wearer device 100 during which the safety status level exceeds the first threshold value or the second threshold value.
  • the virtual reality device 300 may output a 360 degree image to which the above identifier or tag information is assigned.
  • For example, the virtual reality device 300 may process the 360-degree image so that a warning indication is exposed on at least a portion of the 360-degree image captured from the point in time when the safety status level exceeds the first threshold value.
  • the virtual reality device 300 may output the processed 360 degree image in a virtual space.
  • However, the present disclosure is not limited thereto, and the field wearer device 100 may itself assign metadata such as an identifier or tag information to the 360-degree image so that the section exceeding a threshold value can be distinguished within the captured image.
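  • As an illustration of exposing a warning item from the point where the threshold is first exceeded, the sketch below latches a warning flag onto all subsequent frames; the frame representation is a simplified assumption.

```python
# Sketch of latching a warning item onto every frame captured after the safety
# status level first exceeds the threshold. Frames are plain dictionaries here.
def add_warning_overlay(frames, levels, threshold):
    """frames and levels are aligned per-frame lists; returns annotated frames."""
    exceeded = False
    annotated = []
    for frame, level in zip(frames, levels):
        exceeded = exceeded or level > threshold   # stays set from the first exceedance
        annotated.append({**frame, "warning": exceeded})
    return annotated

frames = [{"t": t} for t in range(5)]
print(add_warning_overlay(frames, [0.2, 0.4, 0.9, 0.7, 0.6], threshold=0.8))
```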
  • The first variable, the second variable, the first weight, the second weight, the first threshold value, the second threshold value, the identifier, and the tag information described above may be set according to a policy defined in advance by the manager or the virtual reality device 300, may be stored in the storage unit 330 of the virtual reality device 300, and may be changed or updated at any time.
  • The term “module” or “unit” may refer to a unit including one of, or a combination of two or more of, hardware, software, and firmware. “Module” or “unit” may be used interchangeably with terms such as unit, logic, logical block, component, or circuit. A “module” or “unit” may be a minimum unit of an integrally formed component or a part thereof, or a minimum unit that performs one or more functions or a part thereof. A “module” or “unit” may be implemented mechanically or electronically.
  • Modules or programming modules may include at least one or more of the aforementioned components, omit some of them, or further include additional components.
  • Operations performed by modules, programming modules, or other components in accordance with various embodiments of the present invention may be executed in a sequential, parallel, repetitive, or heuristic manner. In addition, some operations may be executed in a different order, may be omitted, or other operations may be added.


Abstract

The invention relates to a remote control system using a 360-degree camera and a virtual reality device, the remote control system comprising: a wearable device for a field worker; a virtual reality device for remotely providing instructions to a field worker wearing the wearable device; and a server for mediating between the worker's wearable device and the virtual reality device, wherein both the field worker's wearable device and the virtual reality device comprise an LTE communication module, a microphone (PTT) button module, and a control unit, and the field worker's wearable device further comprises a 360-degree camera module.
PCT/KR2017/004279 2016-04-26 2017-04-21 Wearable device for a field worker, virtual reality device, and remote control system using the 360-degree camera of the field worker's wearable device and the virtual reality device WO2017188665A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2016-0050755 2016-04-26
KR1020160050755A KR101885355B1 (ko) Virtual reality device and remote control system using a 360-degree camera and the virtual reality device
KR1020160050750A KR101898893B1 (ko) Field wearer device and remote control system using the 360-degree camera of the field wearer device and a virtual reality device
KR10-2016-0050750 2016-04-26

Publications (1)

Publication Number Publication Date
WO2017188665A1 true WO2017188665A1 (fr) 2017-11-02

Family

ID=60160900

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/004279 WO2017188665A1 (fr) Wearable device for a field worker, virtual reality device, and remote control system using the 360-degree camera of the field worker's wearable device and the virtual reality device

Country Status (1)

Country Link
WO (1) WO2017188665A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20020081377A (ko) * 2000-12-28 2002-10-26 Atsushi Takahashi Remote internet technical guidance and education delivery system using the practitioner's vision, and guidance system using a communication network
KR20080075571A (ko) * 2007-02-13 2008-08-19 World ENG Co., Ltd. Worker safety management device using a wireless sensor network
KR20110081540A (ko) * 2010-01-08 2011-07-14 Nexchal Co., Ltd. System and method for monitoring the site conditions of power facility work
KR101422352B1 (ko) * 2012-11-05 2014-07-30 Chung-Ang University Industry-Academic Cooperation Foundation Defect management system and method for construction sites
KR20150124241A (ko) * 2014-04-28 2015-11-05 Daewoo Shipbuilding & Marine Engineering Co., Ltd. Integrated offshore plant operation system and method using virtual reality

Similar Documents

Publication Publication Date Title
KR101898893B1 (ko) Field wearer device and remote control system using the 360-degree camera of the field wearer device and a virtual reality device
WO2019103212A1 (fr) Monitoring system for an IoT smart terminal in a ship using a communication network
WO2018194243A1 (fr) Video communication device, video communication method, and video communication mediation method
EP3520434A1 (fr) Method for detecting incorrect positioning of an earphone, and electronic device and recording medium therefor
WO2017131384A1 (fr) Electronic device and control method therefor
WO2017095145A1 (fr) Method and apparatus for providing search information
WO2015167236A1 (fr) Electronic device and method for providing an emergency video call service
WO2018143509A1 (fr) Mobile robot and control method therefor
WO2018072567A1 (fr) Fingerprint identification-based emergency call method and system for a mobile terminal, and mobile terminal
WO2018236058A1 (fr) Electronic device for providing property information of an external light source for an object of interest
EP3482341A1 (fr) Electronic device and operating method thereof
WO2016006734A1 (fr) Method and device for recognizing biometric information
WO2015137673A1 (fr) Method for determining a data source
WO2017135522A1 (fr) Mirror-type display device and control method therefor
WO2015133788A1 (fr) Method and electronic device for displaying content
WO2015199505A1 (fr) Apparatus and method for preventing malfunction in an electronic device
WO2018026142A1 (fr) Method for controlling the operation of an iris sensor, and electronic device therefor
WO2018066859A1 (fr) Method for providing an emergency service, electronic device therefor, and computer-readable recording medium
WO2018097683A1 (fr) Electronic device, external electronic device, and method for connecting an electronic device and an external electronic device
WO2016182090A1 (fr) Glasses-type terminal and control method therefor
WO2015133868A1 (fr) Waterproof electronic device
KR101885355B1 (ko) Virtual reality device and remote control system using a 360-degree camera and the virtual reality device
WO2020141727A1 (fr) Healthcare robot and control method therefor
WO2017188665A1 (fr) Wearable device for a field worker, virtual reality device, and remote control system using the 360-degree camera of the field worker's wearable device and the virtual reality device
WO2019208923A1 (fr) Electronic device for performing communication with a wearable device to receive biometric information

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17789850

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17789850

Country of ref document: EP

Kind code of ref document: A1