WO2023234456A1 - VR image and content providing system and method for underwater monitoring - Google Patents

Info

Publication number
WO2023234456A1
WO2023234456A1 · PCT/KR2022/008243
Authority
WO
WIPO (PCT)
Prior art keywords
underwater
content
user
virtual
experience
Prior art date
Application number
PCT/KR2022/008243
Other languages
French (fr)
Korean (ko)
Inventor
김기주
Original Assignee
동명대학교산학협력단
Priority date
Filing date
Publication date
Application filed by 동명대학교산학협력단 (Tongmyong University Industry-Academic Cooperation Foundation)
Publication of WO2023234456A1


Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00-G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/20 Education
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]

Definitions

  • the present invention is a result of support from the Busan University Innovation Research Complex Development Project (IURP2201) of the Busan Institute for Industrial Science and Innovation (BISTEP), and relates to a system and method for providing VR images and content for underwater monitoring.
  • the present invention relates to a method using a VR image and content provision system for underwater monitoring that allows individuals to realistically observe the underwater environment using VR (Virtual Reality).
  • the conventional monitoring method for aquatic life, etc. has the problem that underwater environment analysis experts have to analyze the underwater environment based on two-dimensional images that lack realism, which inevitably reduces the accuracy of the analysis results.
  • the present invention was created to solve this problem by providing users or parties who wish to perform underwater monitoring with VR (virtual reality) images based on underwater image information captured by an image capture device such as a camera; its purpose is to provide a VR video and content provision system and method for underwater monitoring that further increases the sense of reality by introducing virtual reality.
  • the present invention provides a VR video and content provision system and method for underwater monitoring that can maximize educational effects by ensuring the user's safety and inducing a realistic sense of presence and immersion, in order to give the user a realistic experience of the underwater environment.
  • the purpose is also to provide a VR video and content provision system and method for underwater monitoring that allows underwater experience educators to have an intuitive experience by simulating virtual underwater images based on information selected through the underwater environment experience.
  • the present invention is a VR video and content provision system for underwater monitoring comprising: a water mat in which a detection sensor is installed, allowing the user to feel the physical sensation of water while lying down; a motion detection device, consisting of an image capture device and an image recognition module, for detecting the user's motion; a virtual experience output device that is worn on the user's head and provides virtual reality images; a terminal that receives data from the detection sensor of the water mat and from the motion detection device, controls the virtual reality content output to the virtual experience output device, and transmits content information about the underwater virtual experience to the virtual experience output device; and a management server that transmits content information about the underwater virtual experience to the virtual experience output device and the terminal, runs them according to the content information, and monitors the progress of the marine virtual experience.
  • the detection sensor is characterized in that it allows the underwater environment to be perceived through the user's vision, hearing, and touch.
  • the virtual reality content is characterized by including a user avatar that experiences the underwater environment.
  • the bottom of the water mat further includes a driving means for driving the water mat, a wind volume controller that provides wind for a realistic experience, and a spray controller that provides water droplets.
  • the present invention generates underwater images as VR images and provides them directly to the user while also monitoring the underwater environment, thereby increasing the sense of reality of the underwater images; accordingly, for anyone who wants to experience the underwater environment through VR images, it has the excellent effect of increasing the accuracy and reliability of the underwater experience.
  • FIG. 1 is a schematic configuration diagram of a VR image providing system for underwater monitoring according to an embodiment of the present invention.
  • Figure 2 is a diagram showing the water mat.
  • Figure 3 is a detailed block diagram of an output device for virtual experience.
  • Figure 4 is an example screen of virtual reality (VR) content.
  • Figure 5 is a relationship diagram between a terminal and a content device.
  • FIG. 6 is a detailed block diagram of the management server.
  • Figure 7 is a flowchart of a method for providing VR images for underwater monitoring according to an embodiment of the present invention.
  • Figure 8 is a block diagram for explaining the configuration of the VR education device.
  • Figure 9 is a block diagram for explaining the configuration of a display device.
  • Figure 10 is a block diagram for explaining the configuration of a server.
  • Figure 11 is a flowchart illustrating a method of providing VR-based marine education using a VR education device.
  • the user can experience a virtual underwater environment by lying on the water mat 100, which floats on the water in any space containing water.
  • a detection sensor (S) is installed on the water mat 100, allowing the user to lie down and directly feel the physical sensation of the water, thereby promoting more realism in the virtual (VR) underwater experience.
  • the bottom of the water mat 100 includes a driving means for driving the water mat 100, an air volume controller (not shown) that provides wind for a more realistic experience of the virtual underwater environment, and a mist regulator (not shown) that provides water droplets.
  • it is desirable that the water mat 100 have a predetermined driving means formed at the bottom, so that the physical properties of water (the speed, amount, frequency, wavelength, etc. of the water's movement) can be actively simulated.
  • the detection sensor (S) includes a tilt sensor, an acceleration sensor, an inertial sensor, etc., so that it can output data expressing the physical properties of water, such as the speed, amount, frequency, and wavelength of the water's movement.
  • the water mat 100 is formed with a sensory sensor (S) that enables the user lying on it to perceive various underwater environments through the visual, auditory, and tactile senses.
  • the sensory sensor (S) is mounted so that the user can feel the physical sensation of water while lying down; furthermore, it desirably includes a tilt sensor, an acceleration sensor, an inertial sensor, etc., so that it can output data expressing the physical properties of water, such as the speed, amount, frequency, and wavelength of the water's movement.
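  • as a minimal sketch of how data from such tilt/acceleration sensors could be turned into wave properties, the following Python function estimates the dominant wave frequency and amplitude from vertical-acceleration samples; the function name, the zero-crossing method, and the sinusoidal-motion assumption are illustrative, not taken from the patent.

```python
import math

def wave_properties(accel_z, sample_rate_hz):
    """Estimate wave frequency (Hz) and amplitude (m) from vertical
    acceleration samples of a mat-mounted sensor (illustrative sketch)."""
    # Count sign changes (zero crossings) to find the dominant frequency:
    # a sine wave crosses zero twice per period.
    crossings = sum(1 for a, b in zip(accel_z, accel_z[1:]) if a * b < 0)
    duration_s = len(accel_z) / sample_rate_hz
    freq_hz = crossings / (2.0 * duration_s) if duration_s else 0.0
    # For sinusoidal motion, peak acceleration = (2*pi*f)^2 * amplitude.
    peak = max(abs(a) for a in accel_z) if accel_z else 0.0
    omega = 2.0 * math.pi * freq_hz
    amp_m = peak / (omega ** 2) if omega else 0.0
    return freq_hz, amp_m
```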
  • the motion detection device 200 is composed of an image capture device 210 and an image recognition module (not shown) and detects the body motion of the user lying on the water mat 100.
  • it is desirable that a predetermined motion recognition marker (not shown) be provided on the user's body so that the motion detection device 200 can recognize the user's motion more effectively.
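  • a sketch of what the image recognition module might do with marker positions between two video frames; the marker names, coordinate convention, and threshold below are assumptions for illustration, not details from the patent.

```python
def classify_motion(prev, curr, threshold=0.05):
    """Label each tracked marker as moving or still between two frames.

    `prev` and `curr` map marker names (hypothetical, e.g. 'left_wrist')
    to (x, y) image coordinates normalised to [0, 1].
    """
    moving = {}
    for name, (x0, y0) in prev.items():
        # A marker missing from the new frame is treated as unmoved.
        x1, y1 = curr.get(name, (x0, y0))
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        moving[name] = dist > threshold  # True when the marker moved
    return moving
```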
  • the output device 300 for virtual experience is worn on the user's head and provides a virtual reality (VR) image.
  • the virtual reality content output through the virtual experience output device 300 includes virtual reality content from the user's first-person perspective (not shown) or virtual reality content from the user's third-person perspective (not shown).
  • the first-person and third-person virtual reality content preferably includes a user avatar that experiences survival swimming, and it is desirable that the user avatar move in conjunction with the user by matching the body motion data detected by the motion detection device 200 described above.
  • the output device 300 for a virtual experience includes a header unit 310, a speaker 320, a breathing sensor 330, and a control unit 340.
  • the header unit 310 visually provides underwater virtual reality content according to the user's gaze.
  • video signals are input through the video signal input unit 350. This image corresponds to an image when the user is located at a certain location underwater.
  • a voice signal is input through the voice signal input unit 360, and this signal is output through the speaker 320.
  • the sound output through the speaker 320 corresponds to the sound when the user is located at a predetermined position underwater.
  • Underwater virtual reality content is provided through the video signal input unit 350 and the audio signal input unit 360.
  • the breathing sensor 330 detects whether and to what extent the user is inhaling or exhaling.
  • the control unit 340 controls each component of the virtual experience output device 300 and reflects the user's breathing state measured by the breathing sensor 330 in the content.
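  • one way the control unit 340 could reflect the sensed breathing state in the content is to map the airflow reading to a buoyancy offset for the avatar; the flow units and gain values below are purely illustrative assumptions, since the patent does not specify the mapping.

```python
def breathing_to_buoyancy(flow_lpm, inhale_gain=0.02, exhale_gain=0.03):
    """Map a breathing-sensor flow reading (litres/min, signed) to an
    avatar buoyancy offset: inhaling rises, exhaling sinks (sketch)."""
    if flow_lpm > 0:               # positive flow treated as inhalation
        return flow_lpm * inhale_gain
    return flow_lpm * exhale_gain  # negative flow: exhalation, sinks
```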
  • the virtual experience output device 300 is provided as an interface, with the header unit 310, breathing sensor 330, and speaker 320 manufactured to be worn on the user's face.
  • the virtual experience output device 300 is configured in the form of a mask and can be worn on the user's face. In this case, it can be attached to the face using a fixing band (not shown).
  • control unit 340 obtains user information in real time from the virtual experience output device 300 and controls the virtual experience output device 300 using the obtained data.
  • Figure 4 is an example screen of the virtual reality (VR) content.
  • the actual underwater state obtained through video and audio is shown to the user through virtual reality content of the virtual experience output device 300.
  • the virtual content screen changes naturally depending on the water depth shown on the left side of the screen, and the displayed amount of oxygen, salinity, density, water temperature, and pressure change with the water depth.
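  • the depth-dependent readouts could be computed as below; the hydrostatic pressure formula is standard physics, while the temperature and oxygen curves are placeholder assumptions, since the patent gives no formulas.

```python
def depth_readouts(depth_m, surface_temp_c=20.0):
    """Compute side-panel readouts for a given depth (sketch)."""
    # Hydrostatic pressure P = P0 + rho*g*h, seawater rho ~ 1025 kg/m^3.
    pressure_kpa = 101.325 + 1025 * 9.81 * depth_m / 1000.0
    water_temp_c = max(4.0, surface_temp_c - 0.1 * depth_m)  # toy profile
    oxygen_pct = max(0.0, 100.0 - 0.5 * depth_m)             # toy decay
    return {"pressure_kpa": round(pressure_kpa, 1),
            "water_temp_c": water_temp_c,
            "oxygen_pct": oxygen_pct}
```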
  • the terminal 400 receives data from the detection sensor (S) of the water mat 100 and from the motion detection device 200, and can turn on/off and control the virtual reality content output to the virtual experience output device 300.
  • when a point of interest is created from the content device 500, which will be described later, the terminal 400 generates or receives marine content reflecting information on the surrounding marine environment based on Points of Interest (POI).
  • the content device 500 will be described with reference to FIG. 5 .
  • the content device 500 consists of an underwater environment information providing server 510 that provides underwater environment information, and an underwater content providing server 520 that provides content reflecting the underwater environment information.
  • the terminal 400 creates marine content that reflects the underwater environment information of the surrounding area; to this end, an application is downloaded and executed on the terminal 400.
  • location information and date information are transmitted to the underwater environment information providing server 510, which, based on the received information, generates underwater environment information covering the current area's tide, water depth, weather, water temperature, wind direction, wind speed, wave height, and wave direction and transmits it to the terminal 400; the terminal 400 generates underwater content based on the received underwater environment information and transmits it to the underwater content providing server 520 with a request to store it for each content item, and the underwater content providing server 520 stores each content item reflecting the underwater environment information.
  • the terminal 400 connects to the underwater content providing server 520, receives content reflecting marine environment information based on the content ID, and displays it on the screen; by organizing the content registered in the terminal 400 together with the current underwater environment information, each user can obtain safe and excellent underwater activity results by viewing content and obtaining underwater environment information that tracks the dynamically changing underwater environment.
  • the terminal 400 can be configured to compare past underwater environment information and current underwater environment information based on location information and date information, and confirm the difference or receive guidance.
  • when the terminal 400 takes pictures of underwater activities or their results and registers them, it can be configured to automatically secure and store underwater environment information based on the current location and date to create underwater content.
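  • the exchange between the terminal 400 and the servers 510/520 could be sketched as follows; the JSON field names and the required-field check are assumptions for illustration, not part of the patent.

```python
import json

def build_content_request(location, date):
    """Assemble the request the terminal 400 would send to the underwater
    environment information providing server 510 (field names assumed)."""
    return json.dumps({"location": location, "date": date})

def make_underwater_content(env_info, content_id):
    """Attach received environment info (tide, depth, weather, ...) to a
    content record keyed by content ID, as stored on server 520."""
    required = ("tide", "water_depth", "weather", "water_temp")
    missing = [k for k in required if k not in env_info]
    if missing:
        raise ValueError(f"incomplete environment info: {missing}")
    return {"content_id": content_id, "environment": dict(env_info)}
```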
  • the terminal 400 executes a related application to generate content and to display content reflecting underwater environment information; for example, when the application of the terminal 400 is activated, a tiled image appears on the screen, and the user presses a point-of-interest registration button (not shown) to set and register the current location, shown through a map (not shown) on which the current location can be checked, as an area of interest; the location information and date information of the terminal 400's current position are then transmitted to the underwater environment information providing server 510, and the corresponding underwater environment information is returned to generate content.
  • the management server 600 transmits content information about the underwater virtual experience to the virtual experience output device 300 described above, can operate the virtual experience output device 300 and the terminal 400 according to the content information, and can monitor the progress of the user's underwater virtual experience in real time.
  • management server 600 will be described with reference to the drawings.
  • FIG. 6 is a detailed block diagram of the management server 600, which includes a communication unit 610, a server control unit 620, and a storage unit 630.
  • the communication unit 610 performs communication with the virtual experience output device 300.
  • the communication unit 610 transmits content information about the marine virtual experience to the virtual experience output device 300 and receives information about the progress of the underwater virtual experience and the user's biological condition from the virtual experience output device 300.
  • the server control unit 620 controls the delivery of content information so that the virtual experience output device 300 is operated according to the content information about the marine virtual experience.
  • content information may be selected by user input received from the external terminal 400, or may be directly input by a server input unit (not shown).
  • the server control unit 620 monitors the progress of the user's marine virtual experience.
  • the server control unit 620 may analyze the progress of each user's marine virtual experience and perform evaluation. That is, the server control unit 620 can evaluate the user's experience difficulty level, progress speed, experience completion status, etc.
  • the server control unit 620 can use the evaluation results to identify the areas in which the user is lacking, or supplementary material, and provide them to the user.
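  • the evaluation the server control unit 620 performs could look like the following; the weighting scheme is an assumption, since the text only names difficulty level, progress speed, and completion status as inputs.

```python
def evaluate_experience(difficulty, progress_pct, completed):
    """Score a virtual-experience session and flag areas to supplement
    (illustrative weights, not specified by the patent)."""
    score = difficulty * 10 + progress_pct * 0.5 + (30 if completed else 0)
    weak_points = []
    if progress_pct < 50:
        weak_points.append("progress speed")
    if not completed:
        weak_points.append("experience completion")
    return score, weak_points
```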
  • the server control unit 620 monitors the user's biological condition.
  • the server control unit 620 controls the virtual experience output device 300 so that the marine virtual experience ends when the user's biological condition remains unstable for a certain period of time compared to a preset standard.
  • the preset standard refers to a normal biological state; through this, underwater safety accidents can be prevented in advance.
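  • the rule "end the experience when the biological condition stays unstable for a certain period" could be implemented as a sliding-window check; using heart rate as the signal, the normal range, and the window length are all assumptions for illustration.

```python
def should_end_experience(samples, normal_range=(60, 100), window=5):
    """Return True when the last `window` heart-rate samples all fall
    outside the preset normal range (values here are assumptions)."""
    lo, hi = normal_range
    recent = samples[-window:]
    # End only after a full window of consecutive out-of-range readings.
    return len(recent) == window and all(s < lo or s > hi for s in recent)
```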
  • the storage unit 630 stores content information about the marine virtual experience.
  • the storage unit 630 stores information on evaluation results and biological conditions for each user.
  • the storage unit 630 may include at least one storage medium among flash memory type, hard disk type, media card micro type, card type memory (e.g., SD or XD memory), RAM, SRAM, ROM, EEPROM, PROM, magnetic memory, magnetic disk, and optical disk.
  • the method uses a virtual reality (VR) imaging system that includes: a water mat 100 equipped with a detection sensor (S) and manufactured so that the user can feel the physical sensation of water while lying down; a motion detection device 200 that detects the user's body movements; an experiential output device 300 that is worn on the user's head and outputs virtual reality content; a terminal 400 capable of receiving data values from the detection sensor (S) of the water mat 100 and from the motion detection device 200 and controlling the virtual reality content output to the experiential output device 300; and a content device 500 that creates an underwater environment perceptible through the user's sight, hearing, and touch.
  • first, there is a step (S10) of operating the motion detection device 200 to detect the user's body motion.
  • step S10 collects the user's body motion data by detecting the user's body movements.
  • next is a step (S20) in which the user wears the experiential output device 300, which outputs virtual reality content, on his or her head; it is fixed to the head through the header unit 310, a detailed description of which is omitted as it is well known, and the fixing method is not limited to this.
  • in step S20, the virtual reality content is linked to the user's first-person or third-person perspective through the user avatar, which is matched to the body motion data detected by the motion detection device 200.
  • the user then lies down on the water mat 100 (S30); the user wearing the experience output device 300 lies on the water mat 100, ready to execute the virtual underwater experience.
  • Data values output from the detection sensor (S) and the motion detection device 200 of the water mat 100 are output to the experiential output device 300 and are reflected in virtual reality content (S40).
  • through the detection sensor (S), the underwater environment can be perceived through the user's visual, auditory, and tactile senses, and the motion detection device 200 senses the motion of the body, giving the user the feeling of moving freely underwater.
  • the management server 600 goes through a step of controlling the operation of the experiential output device 300 and the content device 500 (S50).
  • the experiential output device 300 is waterproofed to enable image acquisition and provides augmented reality or virtual reality images, and the content device 500 creates the underwater environment so that the marine environment is perceived through the user's visual, auditory, and tactile senses.
  • the management server 600 transmits content information about the marine virtual experience to the experiential output device 300 and the content device 500, respectively, so that the experiential output device 300 and the content device 500 transmit the content information. Operate accordingly.
  • the experiential output device 300 and the content device 500 may be driven in conjunction with each other.
  • step (S60) in which the user experiences a virtual underwater environment through the reflected virtual reality content.
  • the motion detection device 200, which detects the user's body motion, includes a camera 210 and an image recognition module (not shown); it is preferable that the data values output from the detection sensor (S) of the water mat 100 and from the motion detection device 200 be reflected in the virtual reality content output to the experiential output device 300 so that they can be projected directly from the experiential output device 300 or the terminal 400.
  • the management server 600 monitors the progress of the maritime virtual experience performed through step S60 (S70).
  • the management server 600 can perform evaluation by analyzing the progress of each user's underwater virtual experience; that is, the management server 600 can evaluate the user's experience difficulty level, progress speed, experience completion status, etc., and more preferably can use the evaluation results to identify the areas in which the user is lacking, or supplementary material, and provide them to the user.
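  • the steps S10-S70 described above can be summarised as an ordered checklist that a management server might walk; only the step labels and descriptions come from the text, the function itself is illustrative.

```python
def experience_steps():
    """Ordered steps S10-S70 of the provision method, as named above."""
    return [
        ("S10", "operate the motion detection device"),
        ("S20", "user wears the experiential output device"),
        ("S30", "user lies on the water mat"),
        ("S40", "sensor and motion data reflected in VR content"),
        ("S50", "management server controls output and content devices"),
        ("S60", "user experiences the virtual underwater environment"),
        ("S70", "management server monitors and evaluates progress"),
    ]
```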
  • the present invention can be used as an experiential education platform for the underwater environment based on virtual reality or augmented reality (AR) using the terminal 400.
  • the VR education device 700 may include an HMD 710, a motion recognition unit 720, and a display device 730.
  • the HMD 710 is worn on the user's head and has a glasses-shaped display unit, so that the 3D image received from the display device 730 can be output to the glasses-shaped display unit. Additionally, the HMD 710 can transmit the movement of the user's head (head tracking information) to the display device 730 and receive a 3D image according to the movement.
  • the motion recognition unit 720 is mounted on both hands of the user and can recognize the user's motion. At this time, the motion recognition unit 720 is mounted on the user's hand, has a motion sensor mounted on the joint area to recognize the user's joint movement, and is equipped with a vibration sensor to receive notifications according to the movement.
  • the motion recognition unit 720 is equipped with a laser and transmits an input signal by directing the laser at the display unit 733 of the display device 730, and it can also serve as an interface for operations such as touch, grab, and UI manipulation according to preset motions.
  • the motion recognition unit 720 generates reference position information when driven; this reference position information is displayed on the display screen of the display unit 733 by representing the motion recognition unit 720 as a virtual hand, such as a cursor, and the virtual hand is displayed on the display screen of the display unit 733 in correspondence with the reference position information of the motion recognition unit 720.
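  • mapping the motion recognition unit's position to the on-screen virtual hand relative to its reference position could be done as below; the screen size, scale factor, and centring behaviour are assumptions for illustration.

```python
def to_screen(hand_pos, ref_pos, screen_w=1920, screen_h=1080, scale=800):
    """Map a motion-recognition-unit position to a cursor ('virtual hand')
    position on the display, relative to the reference position captured
    when the unit was started (sketch; scale is an assumption)."""
    dx = (hand_pos[0] - ref_pos[0]) * scale
    dy = (hand_pos[1] - ref_pos[1]) * scale
    # Centre the cursor at the reference position and clamp to the screen.
    x = min(max(screen_w / 2 + dx, 0), screen_w - 1)
    y = min(max(screen_h / 2 + dy, 0), screen_h - 1)
    return int(x), int(y)
```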
  • the display device 730 is a terminal installed with an app for providing educational services such as an underwater environment, and can provide educational content through communication with the server 800 and provide a UI (User Interface) for controlling the app.
  • the display device 730 performs user authentication and login when running the app, receives input from the user through the UI or a selection of the customized content provided by the terminal 400, transmits it to the server 800, and can receive simulation VR images that match the user's request.
  • the display device 730 displays, through the display unit 733, the VR image received from the server 800 with the virtual hand applied to the simulation according to the user motion recognized by the motion recognition unit 720, and can also display the same image on the HMD 710.
  • the VR image is a 360° image provided according to the movement of the HMD 710, and the display unit 733 can mirror and output the display image provided on the HMD 710.
  • FIG. 9 is a block diagram for explaining the configuration of the display device 730 of FIG. 8.
  • the display device 730 may include a communication unit 731, a memory 732, a display unit 733, a control unit 734, and a remote provision unit 735.
  • the display device 730 serves to output marine images suitable for educational content information.
  • the display device 730 can transmit and receive wireless data with the server 800 through the communication unit 731, and can also transmit and receive data with the HMD 710 and the motion recognition unit 720 by being wired or wirelessly connected.
  • Data for driving, etc. may be stored in the memory 732, along with the VR marine image content to be simulated and played back on the display unit 733.
  • the display unit 733 mirrors and displays the VR image output on the HMD 710 under the control of the control unit 734, and displays a cursor corresponding to the movement of the motion recognition unit 720 on the simulation content received from the server 800. It can be displayed by applying (not shown).
  • the control unit 734 controls the overall operation of the display device 730 and controls content received from the server 800 according to control commands (movement information, etc.) received from the HMD 710 and the motion recognition unit 720. This can be output to the HMD 710 and the display unit 733.
  • the remote provider 735 supports remote control of the terminal 400 through the server 800; if support for underwater video education is received remotely through the remote provider 735, or help with training is needed, guidance can be received through the terminal 400, and user-customized content executed by the terminal 400 can also be played through the display unit 733.
  • FIG. 10 is a configuration block diagram for explaining the server 800 of FIG. 8.
  • the server 800 may include a DB 710, a content provider 820, a calculation unit 730, and a remote support unit 830.
  • the DB 710 may include an underwater environment DB, an educational content DB, a management DB, etc.
  • the underwater environment DB can store information on various facilities and devices for building underwater environment facilities. For example, this may include fish/coral reef experience education, deep sea experience education, high pressure experience education, low water temperature experience education, low light experience education, and strong current experience education.
  • the underwater environment DB stores 3D images for each object (various images) of the education, the location of underwater creatures, etc., and contains content information about marine virtual education.
  • the educational content DB can store educational content such as preliminary training for virtual underwater experience, and step-by-step courses for learning induction and indirect experience can be matched and stored for each content.
  • the course may be a preset action guided according to the learning purpose of each content.
  • educational content can be a variety of content about marine virtual experience, underwater tidal current experience, and the shape of structures (corals, caves, etc.) that model structures that actually exist in the ocean.
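By way of a non-limiting illustration, the matching of step-by-step courses to stored educational content described above can be sketched as follows. The content identifiers, levels, and course actions are hypothetical examples, not part of the disclosure.

```python
# Hypothetical sketch of the educational content DB described above: each
# content item is matched with a step-by-step course (a preset sequence of
# guided actions). Identifiers, levels, and actions are illustrative only.
EDU_CONTENT_DB = {
    "fish_coral_reef": {"level": 1, "course": ["orient", "observe_fish", "follow_guide"]},
    "deep_sea":        {"level": 3, "course": ["orient", "descend", "low_light_scan"]},
    "strong_current":  {"level": 2, "course": ["orient", "hold_position", "drift_control"]},
}

def course_for(content_id: str) -> list:
    """Return the preset action course matched to a stored content item."""
    entry = EDU_CONTENT_DB.get(content_id)
    if entry is None:
        raise KeyError(f"no educational content registered for {content_id!r}")
    return entry["course"]
```

The lookup mirrors the description above: the course is stored per content item, so retrieving a content identifier also yields the guided sequence of actions for it.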
  • The management DB can store and manage the management information of individual trainees, such as trainee information, customized content management, each trainee's current training evaluation level, and progress.
  • The content provider 820 provides content requested by the VR device 700, as well as content remotely played by the terminal 400, to the corresponding VR device 700, and serves the underwater environment DB for marine experience or underwater life observation.
  • An object matching the information selected by the user, among the objects stored in the underwater environment DB, may be provided to the VR device 700. In this case, the selected object may be simulated before being provided.
  • Alternatively, the object may be transmitted as-is and simulated on the VR device 700.
  • Whenever an object is selected through the UI, it is placed at its pre-stored location in the VR image displayed on the display unit 733 and the HMD 710, and can be displayed as a simulation.
  • Objects in the constructed virtual underwater environment may be moved to locations set by the user via the motion recognition unit 720.
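The placement of a selected object at its pre-stored location, and its subsequent movement by motion-recognition input, can be sketched as follows. The coordinates and object names are illustrative assumptions, not values from the disclosure.

```python
# Illustrative sketch: an object selected through the UI spawns at its
# pre-stored location, and displacements derived from the motion
# recognition unit 720 then move it. Coordinates and names are assumed.
PRESET_LOCATIONS = {
    "coral": (2.0, -5.0, 1.0),    # (x, y, z) in metres; negative y = depth
    "cave": (-4.0, -12.0, 0.0),
}

class VirtualObject:
    def __init__(self, name: str):
        self.name = name
        self.position = PRESET_LOCATIONS[name]  # spawn at the stored location

    def move_by(self, dx: float, dy: float, dz: float) -> tuple:
        """Apply a displacement derived from the motion recognition unit."""
        x, y, z = self.position
        self.position = (x + dx, y + dy, z + dz)
        return self.position
```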
  • The content provider 820 can extract the necessary video content items and missions based on the corresponding information and provide a content list so that step-by-step learning can be performed.
  • The remote support unit 830 provides an environment for remote control of the terminal 400. Remote control of the terminal 400 may be performed at the user's request or when the terminal 400 provides user-customized content.
  • The remote support unit 830 transmits the environment information of the underwater image constructed through the terminal 400 and requests remote consultation, enabling remote consultation between the VR device 700 and the terminal 400.
  • Consultation can take place in various forms, such as real-time video chat, voice chat, and text chat.
  • A UI (User Interface) is provided for such selections.
  • Underwater education can be performed by playing customized content according to the trainee's learning stage.
  • The display device 730 transmits information about the selected education to the server 800 (S110).
  • The server 800 simulates and provides objects corresponding to the selected educational information, and the VR education device 700 can output the simulated VR image to the HMD 710 (S120). Additionally, the VR image of the HMD 710 may be mirrored on the display unit 733.
  • If the trainee wishes to change the course, he or she may change the previously selected course to the desired one through the motion recognition unit 720.
  • The server 800 can provide the corresponding underwater education based on the selected education (S130).
  • The corresponding information may be transmitted through the server 800 to the terminal (not shown) of the administrator in charge of managing education. Through the administrator terminal, the administrator can check the content of the trainee's consultation request and respond to it through the server 800.
  • The VR education device 700 can notify the server 800 that education preparation is complete, and at the same time the server 800 can provide the underwater-related video content selected by the trainee (S140).
  • Trainees can undergo education and learning in the virtual underwater environment through the VR underwater images output to the HMD 710 and perform each movement using the motion recognition unit 720 (S150).
  • The trainee's performance of underwater education and learning is transmitted to the administrator terminal through the server 800, and the administrator monitors it in real time through the administrator terminal. If the trainee needs help, remote control can also be performed from outside (S160).
  • The administrator terminal may monitor the educational activities of at least one trainee simultaneously.
  • The administrator can also provide customized training by selecting, as user-customized content, content relevant to the trainee's underwater education performance and content judged, from the administrator's perspective, to cover learning that needs reinforcement (S170).
  • The administrator can easily select content tailored to the trainee through a collection of content related to the trainee and a search service.
  • User-customized content selected by the administrator can be played on the VR education device 700 through remote control, and the server 800 can provide customized training through the education matched to the content or the education selected on the terminal 400.
  • While the trainee is performing underwater education and learning, the administrator monitors and remote-controls in real time, and when the underwater education (learning) is completed, the VR education device 700 transmits a signal to the server 800 that the underwater education has been completed (S180).
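The S110 to S180 training flow above can be sketched as a simple ordered state machine. The step labels beyond the figure's step numbers, and the class itself, are illustrative assumptions.

```python
# A minimal sketch of the S110-S180 flow as an ordered state machine.
# The step names combine the figure labels with assumed mnemonics.
STEPS = [
    "S110_select", "S120_simulate", "S130_provide", "S140_content",
    "S150_learn", "S160_monitor", "S170_customize", "S180_complete",
]

class TrainingSession:
    def __init__(self):
        self.index = 0  # start at S110: selection of the education

    @property
    def current(self) -> str:
        return STEPS[self.index]

    def advance(self) -> str:
        """Move to the next step; S180 (completion) is terminal."""
        if self.index < len(STEPS) - 1:
            self.index += 1
        return self.current
```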
  • 340: Control unit, 350: Video signal input unit
  • 500: Content device, 510: Underwater environment information provision server
  • 520: Underwater content provision server, 600: Management server
  • Storage unit, S: Detection sensor
  • 700: VR education device, 710: HMD


Abstract

The present invention provides a VR image providing system for underwater monitoring, the system comprising a water mat, a motion detection device, an output device for virtual experience, a terminal, and a management server. More specifically, provided is a VR image and content providing system that enables individuals to observe underwater environments realistically by using VR.

Description

System and method for providing VR images and content for underwater monitoring
The present invention is an outcome supported by the Busan Metropolitan City University Innovation Research Complex Development Project (IURP2201) of the Busan Institute for Industrial Science and Innovation (BISTEP), and relates to a system and method for providing VR images and content for underwater monitoring. More specifically, the present invention relates to a VR image and content providing system for underwater monitoring, and a method using the same, that enables individuals to observe the underwater environment with a sense of presence by means of virtual reality (VR).
Recently, as development of the ocean has accelerated to address the depletion of land resources and growing population density, underwater structures such as offshore plants, marine bridges, and undersea tunnels are being built. Accordingly, demand for technology to monitor the environment of aquatic organisms living underwater is also increasing.
For such underwater monitoring, the conventional approach has been for divers to dive and directly film and inspect images of underwater life.
However, this conventional method of monitoring with diver-captured underwater video not only carries the risk of fatal accidents, but is also affected by the condition of the diver, the weather, and the sea, so the time and range of image acquisition are limited. In addition, because underwater environment analysts must analyze the underwater environment from two-dimensional images that lack a sense of presence, the accuracy of the analysis results inevitably suffers.
Furthermore, conventional methods of monitoring aquatic life and the like share this problem: experts must analyze the underwater environment from two-dimensional images with little sense of presence, so the accuracy of the analysis results is inevitably low.
Accordingly, to obtain information needed for tasks such as position recognition in the underwater environment, various studies have been conducted on searching for underwater objects using underwater image information and on monitoring aquatic life using virtual reality (VR).
In general, virtual reality (VR) refers to an interface between a human and an information processing device, such as a computer, that creates a specific environment or situation and makes the user feel as if he or she were interacting with the actual surroundings. Such VR technology can provide users with environments that are difficult or heavily restricted to experience in the real world.
Furthermore, for a user to feel a sense of reality through virtual reality, information acquired through the user's five senses, such as sight, hearing, and touch, must be provided. Systems and methods applying virtual reality (VR) have therefore attracted considerable attention in recent years.
Using virtual reality, which has these features and functions, there has been demand for the development of a system and method that can identify the state of the underwater environment for monitoring aquatic life more accurately than conventional techniques, minimize the direct and indirect effects of the underwater environment, and make the analysis data easy to understand and its results efficient to use.
Accordingly, the present invention was devised to solve these problems, and an object of the present invention is to provide a VR image and content providing system and method for underwater monitoring that increases realism by introducing virtual reality, providing VR (virtual reality) images based on underwater image information captured with an imaging device such as a camera to users or parties who wish to perform underwater monitoring.
Another object of the present invention is to provide a VR image and content providing system and method for underwater monitoring that can maximize the educational effect by guaranteeing the user's safety and inducing genuine presence and immersion, so that the user experiences the underwater environment realistically.
A further object is to provide a VR image and content providing system and method for underwater monitoring that simulates virtual underwater images based on information selected for the underwater environment experience, so that underwater-experience trainees gain an intuitive, bodily sense of the environment.
To achieve these objects, the present invention provides a VR image and content providing system for underwater monitoring, comprising: a water mat on which detection sensors are installed and on which a user lies to feel the physical sensation of water; a motion detection device consisting of an image capture device and an image recognition module for detecting the user's movements; an output device for virtual experience worn on the user's head to provide virtual reality images; a terminal that receives data from the detection sensors of the water mat and from the motion detection device and can control the virtual reality content output to the output device for virtual experience; and a management server that transmits content information on the underwater virtual experience to the output device for virtual experience, drives the output device for virtual experience and the terminal in accordance with the content information, and monitors the progress of the marine virtual experience.
The detection sensors are characterized in that they allow the user to perceive the underwater environment through sight, hearing, and touch.
The virtual reality content is characterized by including a user avatar that experiences the underwater environment.
The bottom of the water mat further includes driving means for driving the water mat, an airflow controller that provides wind for a realistic experience, and a spray controller that provides water droplets.
Therefore, the present invention generates underwater images as VR images and provides them directly to the user while enabling underwater monitoring, which increases the sense of presence of the underwater imagery; accordingly, when the underwater environment is experienced on the basis of VR images, the accuracy and reliability of the underwater experience can be improved.
In addition, the invention has the effect of providing a realistic, practical underwater training environment while guaranteeing the safety of trainees learning about the underwater environment and the like.
FIG. 1 is a schematic configuration diagram of a VR image providing system for underwater monitoring according to an embodiment of the present invention.
FIG. 2 is a view showing the water mat.
FIG. 3 is a detailed block diagram of the output device for virtual experience.
FIG. 4 is an example screen of virtual reality (VR) content.
FIG. 5 is a diagram of the relationship between the terminal and the content device.
FIG. 6 is a detailed block diagram of the management server.
FIG. 7 is a flowchart of a method for providing VR images for underwater monitoring according to an embodiment of the present invention.
FIG. 8 is a block diagram illustrating the configuration of the VR education device.
FIG. 9 is a block diagram illustrating the configuration of the display device.
FIG. 10 is a block diagram illustrating the configuration of the server.
FIG. 11 is a flowchart illustrating a method of providing VR-based marine education using the VR education device.
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. Before describing the present invention, note that in assigning reference numerals to the components of each drawing, the same components are given the same numerals wherever possible, even when they appear in different drawings.
In the following description, detailed explanations of well-known functions or configurations are omitted where they would unnecessarily obscure the gist of the present invention.
The terms used in this application are intended only to describe specific embodiments and are not intended to limit the invention; singular expressions also cover plural expressions unless the context clearly indicates otherwise.
FIG. 1 is a schematic configuration diagram of a VR image providing system for underwater monitoring according to an embodiment of the present invention, FIG. 2 is a view showing the water mat, FIG. 3 is a detailed block diagram of the output device for virtual experience, FIG. 4 is an example screen of virtual reality (VR) content, FIG. 5 is a diagram of the relationship between the terminal and the content device, FIG. 6 is a detailed block diagram of the management server, FIG. 7 is a flowchart of a method for providing VR images for underwater monitoring according to an embodiment of the present invention, FIG. 8 is a block diagram illustrating the configuration of the VR education device, FIG. 9 is a block diagram illustrating the configuration of the display device, FIG. 10 is a block diagram illustrating the configuration of the server, and FIG. 11 is a flowchart illustrating a method of providing VR-based marine education using the VR education device.
Hereinafter, the overall configuration of the VR image providing system for underwater monitoring according to the present invention will be described with reference to FIGS. 1 and 2.
First, referring to FIG. 2, the user can experience a virtual underwater environment while lying on a water mat 100 floating on the surface of water held in any suitable space.
A detection sensor S is installed on the water mat 100 so that the user, lying down, can directly feel the physical sensation of the water, lending greater realism to the virtual (VR) underwater experience.
Furthermore, at the bottom of the water mat 100 are included a driving means (not shown) for driving the water mat 100, an airflow controller (not shown) that provides wind for a more realistic virtual underwater experience, and a spray controller (not shown) that provides water droplets.
Since the water mat 100 has a driving means formed at its bottom, it preferably simulates the physical properties of water (the speed, amount, frequency, wavelength, etc. of water movement) actively.
A description of driving the water mat 100 by means of the driving means, and of the detailed configuration and operation of the airflow controller and the spray controller, is omitted, as these are well-known techniques.
Meanwhile, the detection sensor S preferably includes a tilt sensor, an acceleration sensor, an inertial sensor, and the like, so that it can output data expressing the physical properties of water, such as the speed, amount, frequency, and wavelength of water movement.
As shown, the water mat 100 is also provided with sensory sensors S that allow the user lying on it to perceive various underwater environments through sight, hearing, and touch.
The sensory sensor S is thus mounted so that the user can feel the physical sensation of water while lying down; furthermore, the sensory sensor S preferably includes a tilt sensor, an acceleration sensor, an inertial sensor, and the like, and outputs data expressing the physical properties of water, such as the speed, amount, frequency, and wavelength of water movement.
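As a rough sketch of the kind of water-motion data such sensors might output, the standard deep-water dispersion relation can relate a measured oscillation period to frequency, wavelength, and phase speed. The relation is textbook wave physics, not taken from the present disclosure.

```python
# Sketch: deriving the water-motion quantities named above (speed,
# frequency, wavelength) from a measured oscillation period, using the
# deep-water dispersion relation L = g * T^2 / (2 * pi).
import math

G = 9.81  # gravitational acceleration, m/s^2

def wave_parameters(period_s: float) -> dict:
    """Deep-water wave estimates from a measured oscillation period T."""
    frequency = 1.0 / period_s                      # Hz
    wavelength = G * period_s ** 2 / (2 * math.pi)  # metres
    speed = wavelength / period_s                   # phase speed, m/s
    return {"frequency": frequency, "wavelength": wavelength, "speed": speed}
```

Driving the mat's actuators with parameters derived this way would let the mat reproduce the measured water motion rather than a fixed canned pattern.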
The motion detection device 200 consists of an image capture device 210 and an image recognition module (not shown) for detecting the body movements of the user lying on the upper surface of the water mat 100.
Preferably, predetermined motion recognition markers (not shown) are attached to the user's body so that the motion detection device 200 can recognize the user's movements more effectively.
The output device 300 for virtual experience is worn on the user's head and provides virtual reality (VR) images.
The virtual reality content output through the output device 300 for virtual experience includes virtual reality content from the user's first-person perspective (not shown) or from the user's third-person perspective (not shown).
The first-person and third-person virtual reality content preferably includes a user avatar experiencing survival swimming; the user avatar is preferably matched to, and moves in concert with, the body-motion data detected by the motion detection device 200 described above.
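Linking the detected body-motion data to the user avatar can be sketched as a per-frame joint update. The joint names and the smoothing factor used to damp recognition jitter are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch: joint positions from the motion detection device 200
# drive the avatar each frame; simple exponential smoothing damps jitter
# from the image-recognition step. Joint names and alpha are assumed.
def update_avatar(avatar: dict, detected: dict, alpha: float = 0.8) -> dict:
    """Blend newly detected joint positions into the avatar pose."""
    for joint, (x, y) in detected.items():
        ax, ay = avatar.get(joint, (x, y))  # first sighting: snap to input
        avatar[joint] = (alpha * x + (1 - alpha) * ax,
                         alpha * y + (1 - alpha) * ay)
    return avatar
```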
Hereinafter, the functions of the virtual reality content output through the output device 300 for virtual experience will be described with reference to the drawings.
Referring to FIG. 3, the output device 300 for virtual experience comprises a header unit 310, a speaker 320, a breathing sensor 330, and a control unit 340.
The header unit 310 visually provides underwater virtual reality content that follows the user's gaze. The video signal of the underwater virtual reality content is input through a video signal input unit 350; this video corresponds to what the user would see at a given position underwater.
The audio signal of the underwater virtual reality content is input through an audio signal input unit 360 and output through the speaker 320. The sound output through the speaker 320 corresponds to what the user would hear at a given position underwater.
Underwater virtual reality content is thus provided through the video signal input unit 350 and the audio signal input unit 360.
The breathing sensor 330 detects whether the user is inhaling or exhaling, and to what degree.
The control unit 340 controls each component of the output device 300 for virtual experience and reflects the user's breathing state, as measured by the breathing sensor 330, in the content.
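One plausible way the control unit 340 might reflect the breathing state in the content is a buoyancy model in which, as in real diving, inhaling makes the avatar rise and exhaling makes it sink. The constants here are illustrative assumptions, not values from the disclosure.

```python
# Hedged sketch: mapping breathing-sensor output onto the avatar's depth.
# inhale_level is assumed to lie in [-1, 1]: +1 full inhale (rise),
# -1 full exhale (sink). The drift rate is an assumed constant.
def apply_breath(depth_m: float, inhale_level: float, dt: float = 0.1) -> float:
    """Return the avatar depth after one time step dt (seconds)."""
    rate = 0.5  # m/s of vertical drift at a full inhale/exhale (assumed)
    new_depth = depth_m - inhale_level * rate * dt
    return max(new_depth, 0.0)  # the avatar cannot rise above the surface
```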
The output device 300 for virtual experience is provided as an interface in which the header unit 310, the breathing sensor 330, and the speaker 320 are manufactured to be worn on the user's face.
Alternatively, the output device 300 for virtual experience may be configured in the form of a mask worn on the user's face; in this case, it can be held in close contact with the face using a fixing band (not shown).
The control unit 340 also acquires user information from the output device 300 for virtual experience in real time and uses the acquired data to control the output device 300 for virtual experience.
FIG. 4 is an example screen of the virtual reality (VR) content.
As shown in the drawing, video and audio captured of actual underwater conditions are presented to the user through the virtual reality content of the output device 300 for virtual experience. In particular, the virtual content screen changes naturally with the water depth indicated on the left side of the screen, and the oxygen level, salinity, density, water temperature, pressure, and the like change with depth.
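Of the depth-dependent quantities mentioned, pressure in particular can be driven by the standard hydrostatic relation P = P0 + ρgh. The seawater density used below is a common approximate value; it and the surface pressure are standard physical constants, not values taken from the disclosure.

```python
# Sketch: the pressure shown alongside the depth indicator could follow
# the hydrostatic relation. Constants are standard approximate values.
RHO_SEAWATER = 1025.0  # typical seawater density, kg/m^3
G = 9.81               # gravitational acceleration, m/s^2
P0 = 101_325.0         # surface atmospheric pressure, Pa

def pressure_at(depth_m: float) -> float:
    """Absolute pressure at the given depth, in pascals."""
    return P0 + RHO_SEAWATER * G * depth_m
```

At roughly 10 m of seawater the hydrostatic term adds about one atmosphere, which matches the familiar diving rule of thumb.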
The terminal 400 receives data from the detection sensor S of the water mat 100 and from the motion detection device 200, and can control the virtual reality content output to the output device 300 for virtual experience, including switching it on and off.
When a point of interest is created, the terminal 400 generates, or is served with, marine content reflecting the surrounding marine environment information based on POIs (Points of Interest) from the content device 500 described below.
Hereinafter, the content device 500 will be described with reference to FIG. 5.
The content device 500 consists of an underwater environment information providing server 510 that serves underwater environment information, and an underwater content providing server 520 that provides content reflecting the underwater environment information; the terminal 400 generates marine content reflecting the surrounding underwater environment information. To this end, the corresponding application is downloaded to and executed on the terminal 400.
That is, when the application on the terminal 400 is run and a point of interest is created, location information and date/time information are transmitted to the underwater environment information providing server 510. Based on the received location and date/time information, the underwater environment information providing server 510 generates underwater environment information from which the local tide, water depth, weather, water temperature, wind direction, wind speed, wave height, wave direction, and the like can be checked, and transmits it to the terminal 400. The terminal 400 generates underwater content based on the received underwater environment information and transmits it to the underwater content providing server 520 with a request to store each content item; the underwater content providing server 520 then completes the content registration process by databasing each content item reflecting the underwater environment information.
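The registration exchange described above, in which the terminal sends location and date/time information and receives underwater environment information in return, can be sketched as follows. The in-memory stand-in for the underwater environment information providing server 510, and all field names and values, are illustrative assumptions.

```python
# Minimal sketch of the POI registration exchange: the terminal 400 sends
# location and date/time, the environment "server" answers with the fields
# named above (tide, depth, weather, temperature, wind, waves), and the
# terminal bundles the answer into a content record. All values assumed.
def environment_server(lat: float, lon: float, when: str) -> dict:
    # Stand-in for the underwater environment information providing server 510.
    return {"tide": "ebb", "depth_m": 12.0, "weather": "clear",
            "water_temp_c": 18.5, "wind": "NW 4 m/s", "wave_height_m": 0.7}

def register_poi(lat: float, lon: float, when: str) -> dict:
    """Request environment info for a point of interest and build content."""
    env = environment_server(lat, lon, when)  # request/response step
    return {"location": (lat, lon), "datetime": when, "environment": env}
```

A real deployment would replace `environment_server` with a network request to the server 510 and forward the resulting record to the underwater content providing server 520 for storage.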
Then, the terminal 400 connects to the underwater content providing server 520, receives content reflecting the marine environment information based on a content ID, and displays it on the screen. By allowing both the underwater environment information registered with the content and the current underwater environment information to be checked on the terminal 400, the service lets each user view content and obtain underwater environment information matched to the constantly changing underwater environment, so that safe and successful underwater activity can be achieved.
At this time, the terminal 400 can be configured to compare past underwater environment information with current underwater environment information based on the location and date/time information, so that the user can check the difference or be guided accordingly.
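One way the terminal might compute such a past-versus-current comparison is sketched below. The field names and the warning rule are assumptions for illustration; the patent does not specify them.

```python
def diff_environment(past, current):
    """Return, per field, the (old, new) pair for every value that
    changed between two underwater-environment records."""
    return {k: (past[k], current[k])
            for k in past
            if k in current and past[k] != current[k]}

# Sample records: the conditions stored at registration vs. today.
past = {"water_temp_c": 18.2, "wave_height_m": 0.4, "tide": "neap"}
current = {"water_temp_c": 16.0, "wave_height_m": 1.1, "tide": "neap"}

changes = diff_environment(past, current)

# Example guidance rule: flag fields where conditions worsened,
# here simply "waves are now higher than at registration".
warnings = [k for k, (old, new) in changes.items()
            if k == "wave_height_m" and new > old]
```

The `changes` mapping is what a guidance screen could render; `warnings` is one possible trigger for an alert.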
In addition, the terminal 400 can be configured so that, when it photographs an underwater activity or its results and registers the photo, it obtains and automatically stores underwater environment information based on the current location and date/time, thereby generating underwater content.
To describe how the terminal 400 runs the relevant application to generate content and how content reflecting underwater environment information is displayed: when the application on the terminal 400 is activated, the screen shows a map (not shown), built from received tiled images, on which the current location can be confirmed. When the user operates a point-of-interest registration button (not shown) to register the displayed current location as an area of interest, the terminal 400 transmits the location and date/time information of its current position to the underwater environment information providing server 510, receives the corresponding underwater environment information, and generates the content.
Furthermore, when the terminal 400 accesses the underwater content providing server 520 and receives a sharing URL for the content it created, the URL can be sent to others via SNS or messaging services such as KakaoTalk, Twitter, or Instagram, so that each recipient can access the underwater content providing server 520 through the shared URL and view the content reflecting the underwater environment information.
The management server 600 transmits content information about the underwater virtual experience to the virtual experience output device 300 described above, drives the virtual experience output device 300 and the terminal 400 according to that content information, and can monitor the progress of the user's underwater virtual experience in real time.
Hereinafter, the management server 600 will be described with reference to the drawings.
Fig. 6 is a detailed block diagram of the management server 600. The management server 600 includes a communication unit 610, a server control unit 620, and a storage unit 630.
The communication unit 610 communicates with the virtual experience output device 300. The communication unit 610 transmits content information about the marine virtual experience to the virtual experience output device 300 and receives from it the progress of the underwater virtual experience and information on the user's biometric state.
The server control unit 620 controls the delivery of content information so that the virtual experience output device 300 operates according to the content information for the marine virtual experience. At this time, the content information may be selected by user input received from the external terminal 400, or entered directly through a server input unit (not shown).
The server control unit 620 monitors the progress of the user's marine virtual experience. The server control unit 620 can analyze and evaluate the progress of the marine virtual experience for each user; that is, it can evaluate the user's experience difficulty, progress speed, experience completion status, and so on.
Preferably, the server control unit 620 can use the evaluation results to identify areas where the user is weak or needs supplementary work and provide them to the user.
The server control unit 620 monitors the user's biometric state. When the user's biometric state remains more unstable than a preset standard for a certain period, the server control unit 620 controls the virtual experience output device 300 so that the marine virtual experience is ended.
Here, the preset standard refers to a normal biometric state. In this way, underwater safety accidents can be prevented in advance.
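The safety rule above — terminate when the biometric signal stays outside the normal range for a sustained period — can be sketched as follows. The numeric thresholds and the choice of heart rate as the signal are illustrative assumptions, not values from the patent.

```python
NORMAL_RANGE = (60, 100)   # e.g. heart rate in bpm (assumed signal)
MAX_UNSTABLE_TICKS = 3     # the "certain period", as consecutive samples

def should_terminate(samples):
    """Return True once the signal has been outside NORMAL_RANGE for
    MAX_UNSTABLE_TICKS consecutive samples; a normal sample resets
    the counter."""
    unstable = 0
    for s in samples:
        if NORMAL_RANGE[0] <= s <= NORMAL_RANGE[1]:
            unstable = 0                 # back to normal resets the timer
        else:
            unstable += 1
            if unstable >= MAX_UNSTABLE_TICKS:
                return True              # server 600 stops device 300
    return False
```

With this shape of rule, brief spikes do not end the session; only sustained instability does.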
The storage unit 630 stores the content information for the marine virtual experience, as well as the evaluation results and biometric-state information for each user. The storage unit 630 may include at least one storage medium among flash memory, hard disk, media card micro, card-type memory (e.g., SD or XD memory), RAM, SRAM, ROM, EEPROM, PROM, magnetic memory, magnetic disk, and optical disk.
Hereinafter, a method of providing VR images for underwater monitoring according to the present invention will be described with reference to Fig. 7. Descriptions that overlap with the embodiments described above are partly omitted.
As shown, the method according to an embodiment of the present invention uses a virtual reality (VR) imaging system comprising: a water mat 100 equipped with a detection sensor S and built so that the user can lie on it and feel the physical sensation of water; a motion detection device 200 that detects the user's body movements; an experience output device 300 that is worn on the user's head and outputs virtual reality content; a terminal 400 that receives the data values from the detection sensor S of the water mat 100 and the motion detection device 200 and can control the virtual reality content output to the experience output device 300; and a content device 500 that creates an underwater environment state so that the marine environment is perceived through the user's sight, hearing, and touch.
First, there is a step S10 of operating the motion detection device 200 to detect the user's body motion.
Step S10 collects the user's body-motion data by detecting the user's body movements.
There is a step S20 in which the user wears the experience output device 300, which outputs the virtual reality content, on the head. The device is fixed to the head through the header unit 310; a detailed description of this is omitted as it is well known. Note also that the invention is not limited to a head-mounted fixing method.
The virtual reality content, presented from the user's first-person or third-person viewpoint, is matched through a user avatar to the body-motion data detected by the motion detection device 200 and operates in conjunction with it.
Next is a step S30 in which the user lies down on the water mat 100: the user equipped with the experience output device 300 lies on the water mat 100, completing preparation for the virtual underwater experience.
Then comes a step S40 in which the data values output from the detection sensor S of the water mat 100 and from the motion detection device 200 are fed to the experience output device 300 and reflected in the virtual reality content.
In other words, the detection sensor S lets the user perceive the underwater environment through sight, hearing, and touch, while the motion detection device 200 senses the body's movements, giving the sensation of moving about freely underwater.
In the next step, the management server 600 controls the operation of the experience output device 300 and the content device 500 (S50).
Here, the experience output device 300 is waterproofed so that it can enter the water and provides augmented reality or virtual reality images, while the content device 500 creates an underwater environment state so that the marine environment is perceived through the user's sight, hearing, and touch.
The management server 600 transmits content information about the marine virtual experience to the experience output device 300 and the content device 500 respectively, so that the experience output device 300 and the content device 500 are driven according to the content information.
At this time, the experience output device 300 and the content device 500 may operate in conjunction with each other.
The method is characterized by including a step S60 in which the user experiences the virtual underwater environment through the virtual reality content in which the data have been reflected.
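The sequence S10 through S60 above can be summarised as a simple data pipeline. The objects below are stand-ins for the devices named in the description, not interfaces defined by the patent.

```python
def run_session(motion_device, mat_sensor, output_device):
    """One cycle of the S10-S60 flow with pluggable device stand-ins."""
    motion = motion_device()          # S10: capture body-motion data
    # S20/S30: user wears the output device (300) and lies on the mat (100)
    pressure = mat_sensor()           # reading from the mat's sensor S
    frame = {"motion": motion, "pressure": pressure}
    output_device(frame)              # S40: reflect the data in the VR content
    return frame                      # S60: the frame the user experiences

rendered = []                         # collects frames "shown" to the user
frame = run_session(lambda: {"arm": "stroke"},   # fake motion device (200)
                    lambda: 0.8,                 # fake mat sensor value
                    rendered.append)             # fake output device (300)
```

Swapping the three lambdas for real device drivers would leave the control flow unchanged.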
The motion detection device 200, which detects the user's body motion, comprises a camera 210 and an image recognition module (not shown). When the data values output from the detection sensor S of the water mat 100 and the motion detection device 200 are reflected in the virtual reality content output to the experience output device 300, it is preferable that they be projected as-is on the experience output device 300 or the terminal 400.
In the final step, the management server 600 monitors the progress of the maritime virtual experience performed in step S60 (S70).
The management server 600 can analyze and evaluate each user's progress in the virtual experience of the underwater environment and the like; that is, it can evaluate the user's experience difficulty, progress speed, experience completion status, and so on. More preferably, the management server 600 can use the evaluation results to identify areas where the user is weak or needs supplementary work and provide them to the user.
Additionally, the present invention can use the terminal 400 as an experiential education platform for the underwater environment based on virtual reality or augmented reality.
Hereinafter, a virtual reality (VR)-based underwater environment experiential education platform and a method of providing VR-based underwater environment experiential education are described with reference to the drawings.
Referring to Fig. 8, the VR education device 700 may include an HMD 710, a motion recognition unit 720, and a display device 730.
The HMD 710 is worn on the user's head and has a glasses-type display unit, through which it can output the 3D image received from the display device 730. The HMD 710 can also transmit the user's head movements (head-tracking information) to the display device 730 and receive 3D images that follow the movement.
The motion recognition units 720 are mounted, for example, on each of the user's hands and can recognize the user's motion. The motion recognition unit 720, worn on the user's hand, has motion sensors mounted at the joints so that it can recognize the user's joint movements, and includes a vibration sensor so that it can receive notifications according to the movement.
The motion recognition unit 720 may also include a laser, transmitting the laser to the display unit 733 of the display device 730 to send an input signal, or serving as an interface for touch, grab, UI manipulation, and other preset operations.
To this end, the motion recognition unit 720 generates reference position information when started; this reference position information is what allows the motion recognition unit 720 to be displayed on the screen of the display unit 733 as a virtual hand, such as a cursor.
At this time, when movement occurs as the motion recognition unit 720 is moved or operated, the virtual hand is displayed correspondingly on the screen of the display unit 733, relative to the reference position information of the motion recognition unit 720.
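The reference-position behaviour above amounts to expressing the hand's current position relative to the position captured at start-up and drawing the cursor at that offset. A minimal 2-D sketch, with hypothetical names:

```python
class VirtualHand:
    """Maps motion-unit positions to cursor offsets on the display,
    relative to the reference position captured at start-up."""

    def __init__(self, reference):
        self.reference = reference     # position recorded when driven

    def cursor(self, current):
        """Screen offset of the virtual hand for the current position."""
        rx, ry = self.reference
        cx, cy = current
        return (cx - rx, cy - ry)

# Reference captured at start-up (arbitrary sample coordinates).
hand = VirtualHand(reference=(100, 50))
```

With this mapping, the cursor sits at the origin while the unit has not moved, and follows every displacement from the reference point.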
The display device 730 is a terminal on which an app for providing education services, such as the underwater environment, is installed; it provides educational content through communication with the server 800 and can provide a UI (User Interface) for controlling the app.
When the app is run, the display device 730 performs user authentication and logs in, receives selection input through the UI — information from the user, or user-customized content provided by the terminal 400 — transmits it to the server 800, and can receive the simulation VR image matching the user's request.
The display device 730 also displays, through the display unit 733, a VR image in which the virtual hand following the user motion recognized by the motion recognition unit 720 is applied to the simulation received from the server 800, and can display the same VR image on the HMD 710. Here, the VR image is a 360° image provided according to the movement of the HMD 710, and the display unit 733 can mirror the display image provided on the HMD 710.
Fig. 9 is a block diagram explaining the configuration of the display device 730 of Fig. 8. The display device 730 may include a communication unit 731, a memory 732, a display unit 733, a control unit 734, and a remote provision unit 735.
The display device 730 serves to output marine images that match the educational content information.
The display device 730 can exchange wireless data with the server 800 through the communication unit 731, and can also exchange data with the HMD 710 and the motion recognition unit 720 over wired or wireless connections.
The memory 732 stores data for operation and the like, and can store the VR marine-image content to be simulated and played on the display unit 733.
The display unit 733, under the control of the control unit 734, mirrors and displays the VR image output on the HMD 710, and can display the simulation content received from the server 800 with a cursor (not shown) applied that corresponds to the movement of the motion recognition unit 720.
The control unit 734 controls the overall operation of the display device 730 and, according to control commands (movement information and the like) received from the HMD 710 and the motion recognition unit 720, controls the content received from the server 800 and outputs it to the HMD 710 and the display unit 733.
The remote provision unit 735 supports remote control of the terminal 400 through the server 800. Through the remote provision unit 735, underwater video education can be received remotely; when help is needed during the education, guidance can be received through the terminal 400, and user-customized content executed by the terminal 400 can also be played through the display unit 733.
Fig. 10 is a block diagram explaining the server 800 of Fig. 8. As shown, the server 800 may include a DB 810, a content providing unit 820, a calculation unit, and a remote support unit 830.
The DB 810 may include an underwater environment DB, an educational content DB, a management DB, and the like.
The underwater environment DB can store information on the various facilities and devices for building underwater-environment installations. Examples include fish/coral-reef experience education, deep-sea experience education, high water-pressure experience education, low water-temperature experience education, low-light experience education, and strong-current experience education.
The underwater environment DB also stores the 3D image of each object (various image assets) used in these education programs, the positions of underwater creatures, and the like, and holds the content information for marine virtual education.
The educational content DB can store educational content, such as preliminary training for the virtual underwater experience, and a step-by-step course for guided learning and indirect experience can be matched and stored for each content item.
Here, a course can be a preset sequence of actions guided according to the learning purpose of each content item. For example, the educational content can cover a variety of subjects, such as a marine virtual experience, an underwater current experience, and the shapes of structures (corals, caves, and so on) modeled on structures that exist in the real ocean.
The management DB can store and manage the management information of each individual education participant, such as participant information, customized content management, the participant's current education evaluation level, and progress.
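The three databases just described can be pictured as a small keyed store. The record fields below are assumptions chosen for illustration; the patent names the databases but not their schemas.

```python
server_db = {
    "underwater_env": {   # facilities/objects per experience type
        "coral_reef": {"objects": ["coral", "fish"], "depth_m": 8},
        "deep_sea":   {"objects": ["anglerfish"],    "depth_m": 900},
    },
    "edu_content": {      # per-content step-by-step course
        "intro_dive": {"course": ["briefing", "breathing", "descent"]},
    },
    "management": {       # per-participant progress record
        "user01": {"level": 2, "progress": 0.4},
    },
}

def next_course_step(db, content_id, completed):
    """Return the next step of a content item's course, or None when
    the course is finished. `completed` counts finished steps."""
    steps = db["edu_content"][content_id]["course"]
    return steps[completed] if completed < len(steps) else None
```

A course lookup of this shape is one way the "step-by-step learning" mentioned below could be driven.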
The content providing unit 820 provides the content requested by the VR education device 700, and the content played remotely by the terminal 400, to the corresponding VR education device 700. When content is requested through a selection from the underwater environment DB for a marine experience or underwater-life observation, an object matching the information selected by the user, among the objects stored in the underwater environment DB, can be provided to the VR education device 700. At this time, the selected object may be simulated before being provided.
Alternatively, the object may be transmitted as-is and simulated on the VR education device 700.
At this time, whenever an object is selected through the UI, the VR image displayed on the display unit 733 and the HMD 710 can place the object at its pre-stored position and present it in the simulation.
Objects in the constructed virtual underwater environment can also be moved by the motion recognition unit 720 to positions set by the user.
In this way, when an arbitrary underwater area is selected, a simulation is provided in which each object selected by the user (devices suited to the underwater environment, and so on) is automatically placed at a suitable position, so that the user can perceive the underwater environment intuitively.
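The automatic placement described above can be sketched as a lookup from object name to its pre-stored position. The object names and coordinates are sample values, not data from the patent.

```python
# Pre-stored scene positions, as (x, y, z) in scene units (illustrative).
STORED_POSITIONS = {"coral": (2, 0, -1), "cave": (-5, 0, 3)}

def place_objects(selected):
    """Place each selected object at its pre-stored position; objects
    with no stored position are skipped rather than guessed."""
    return {name: STORED_POSITIONS[name]
            for name in selected if name in STORED_POSITIONS}

scene = place_objects(["coral", "cave", "unknown_prop"])
```

A user-driven move (via the motion recognition unit) would then simply overwrite an entry of the returned scene mapping.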
Once the underwater environment has been constructed, the content providing unit 820 can extract the necessary video content items and missions based on that information and provide a content list so that step-by-step learning can be carried out.
The remote support unit 830 provides an environment for remote control of the terminal 400. At this time, remote control of the terminal 400 can be executed at the user's request or when the terminal 400 provides user-customized content.
When the construction of the underwater-image environment is complete, the remote support unit 830 transmits the environment information of the constructed underwater image through the terminal 400 and requests a remote consultation, so that a remote consultation can take place between the VR education device 700 and the terminal 400.
The consultation can take various forms, such as real-time video chat, voice chat, and text chat.
Hereinafter, a method of providing VR-based marine education using the VR education device 700 is described with reference to the drawings. Descriptions overlapping the platform described above are partly omitted. It goes without saying that the 'educator' mentioned below refers to the person who receives the underwater-environment or marine education.
Referring to Fig. 11, when the educator logs in by running the app on the VR education device 700, a UI (User Interface) for building an underwater environment can be provided according to the educator's selection. Once built, underwater education can be carried out by playing customized content matched to the educator's learning stage.
When the educator selects a type of content education in the UI using the motion recognition unit 720, the display device 730 transmits the information on the selected education to the server 800 (S110).
Next, the server 800 simulates and provides the objects corresponding to the selected education information, and the VR education device 700 can output the simulated VR image to the HMD 710 (S120). The VR image of the HMD 710 can also be mirrored on the display unit 733.
If the educator wishes to change courses, the previously selected education can be changed to the desired education through the motion recognition unit 720.
Then, when the corresponding content has been constructed through the selection of the education, the server 800 can provide the corresponding underwater education based on the selected education (S130).
If the educator requests a consultation, the corresponding information can be transmitted through the server 800 to the terminal (not shown) of the administrator in charge of managing the education. Through the administrator terminal, the administrator can check the content of the educator's consultation and respond to it through the server 800.
Next, the VR education device 700 can transmit the completion of education preparation to the server 800, and at the same time the server 800 can provide the underwater-related video content selected by the educator (S140).
The educator carries out education and learning, such as in the virtual underwater environment, through the VR underwater images output on the HMD 710, and can perform each action using the motion recognition unit 720 (S150).
At this time, the educator's execution of underwater education and learning is transmitted to the administrator terminal through the server 800; the administrator monitors it in real time through the administrator terminal and, if the educator needs help, can perform remote control from outside (S160).
The administrator terminal can also monitor the educational activities of one or more educators simultaneously.
Next, by monitoring the learning stage and performance of the educator's underwater education, the administrator can select, as user-customized content, content relevant to the educator's underwater education and content judged from the administrator's side to need reinforcement, and carry out customized training (S170).
At this time, the administrator can easily select content tailored to the educator through a content collection and search service relating to the educator.
The user-customized content selected by the administrator in this way can be played on the VR education device 700 through remote control, and the server 800 can provide the education matched to that content, or the education selected on the terminal 400, as customized training education.
While the educator is carrying out the underwater education and learning, the administrator performs monitoring and remote control in real time; when the underwater education (learning) is complete, the VR education device 700 transmits a signal to the server 800 indicating that the underwater education has been completed (S180).
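The flow S110 through S180 is an ordered sequence of stages, which can be expressed as a tiny state machine. The step labels paraphrase the description above; the transition code is a hypothetical sketch.

```python
# Ordered stages of the Fig. 11 education flow (labels paraphrased).
STEPS = ["S110_select", "S120_simulate", "S130_provide",
         "S140_content", "S150_learn", "S160_monitor",
         "S170_custom_training", "S180_complete"]

def advance(step):
    """Return the stage that follows `step`, or None after S180."""
    i = STEPS.index(step)
    return STEPS[i + 1] if i + 1 < len(STEPS) else None
```

In a real implementation each transition would be driven by a device or server event (selection, preparation-complete signal, and so on) rather than a simple call.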
Those skilled in the art to which the present invention pertains will understand that the invention may be embodied in other specific forms without departing from its technical idea or essential features. The embodiments described above should therefore be understood as illustrative and not restrictive.
The scope of the present invention is indicated by the appended claims rather than by the detailed description above, and all changes or modifications derived from the meaning and scope of the claims and their equivalent concepts should be construed as falling within the scope of the present invention.
100: water mat 200: motion detection device
210: video recording device
300: output device 310: head part
320: speaker 330: respiration sensor
340: control unit 350: video signal input unit
360: audio signal input unit 400: terminal
500: content device 510: underwater environment information providing server
520: underwater content providing server 600: management server
610: communication unit 620: server control unit
630: storage unit S: detection sensor
700: VR education device 710: HMD
720: motion recognition unit 730: display device
800: server 810: DB
820: content providing unit 830: remote support unit

Claims (4)

  1. A VR image and content providing system for underwater monitoring, comprising:
    a water mat on which a detection sensor is installed, allowing a user to lie down and feel the physical sensation of water;
    a motion detection device comprising a video recording device and an image recognition module for detecting the user's motion;
    a virtual experience output device worn on the user's head to provide a virtual reality image;
    a terminal that receives data from the detection sensor of the water mat and from the motion detection device and controls the virtual reality content output to the virtual experience output device; and a management server that transmits content information about the underwater virtual experience to the virtual experience output device, drives the virtual experience output device and the terminal in accordance with the content information, and monitors the progress of the marine virtual experience.
  2. The VR image and content providing system for underwater monitoring of claim 1, wherein
    the detection sensor enables the underwater environment to be perceived through the user's sight, hearing, and touch.
  3. The VR image and content providing system for underwater monitoring of claim 2, wherein
    the detection sensor comprises a tilt sensor, an acceleration sensor, and an inertial sensor, and outputs data expressing the physical properties of water speed, water volume, and wavelength.
  4. The VR image and content providing system for underwater monitoring of claim 1, wherein
    the lower part of the water mat further comprises a driving means for driving the water mat, an air volume controller that provides wind for a realistic experience, and a spray controller that provides water droplets.
PCT/KR2022/008243 2022-05-31 2022-06-10 Vr image and content providing system and method for underwater monitoring WO2023234456A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020220066810A KR20230166615A (en) 2022-05-31 2022-05-31 Virtual Reality image and contents providing method for underwater monitoring
KR10-2022-0066810 2022-05-31

Publications (1)

Publication Number Publication Date
WO2023234456A1 true WO2023234456A1 (en) 2023-12-07

Family

ID=89025098

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/008243 WO2023234456A1 (en) 2022-05-31 2022-06-10 Vr image and content providing system and method for underwater monitoring

Country Status (2)

Country Link
KR (1) KR20230166615A (en)
WO (1) WO2023234456A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160005232A1 (en) * 2014-07-04 2016-01-07 The University Of Texas At San Antonio Underwater virtual reality system
KR20190070616A (en) * 2017-12-13 2019-06-21 전자부품연구원 Enter water type marine contents experience system and method
KR20200020337A (en) * 2018-08-17 2020-02-26 주식회사 브릿지 A simulation system for Survival Swimming
KR20200047218A (en) * 2018-10-27 2020-05-07 주식회사 브릿지 A simulation system for Survival Swimming
KR20200065274A (en) * 2018-11-30 2020-06-09 주식회사 에스시전시문화 Moving image display system for virtual reality underwater experience

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20120075899A (en) 2010-12-29 2012-07-09 전남대학교산학협력단 Method of stitching underwater camera images for underwater monitoring
KR20180137916A (en) 2017-06-20 2018-12-28 (주)레드몽키픽처스 Virtual reality image contents providing system and thereof providing method
KR102095009B1 (en) 2018-07-20 2020-04-24 포항공과대학교 산학협력단 Underwater monitoring system and VR image providing method for underwater monitoring

Also Published As

Publication number Publication date
KR20230166615A (en) 2023-12-07

Similar Documents

Publication Publication Date Title
US11231897B2 (en) Display system, display device, information display method, and program
JP7190434B2 (en) Automatic control of wearable display devices based on external conditions
KR100721713B1 (en) Immersive training system for live-line workers
WO2018131914A1 (en) Method and apparatus for providing guidance in a virtual environment
CN109923462A (en) Sensing spectacles
JP6263252B1 (en) Information processing method, apparatus, and program for causing computer to execute information processing method
CN107924584A (en) Augmented reality
JPWO2008066093A1 (en) Position-dependent information expression system, position-dependent information expression control device, and position-dependent information expression method
KR20150126938A (en) System and method for augmented and virtual reality
EP3355576A1 (en) Information processing apparatus, information processing method, and program
JP6290467B1 (en) Information processing method, apparatus, and program causing computer to execute information processing method
CN114502921A (en) Spatial instructions and guidance in mixed reality
Kawasaki et al. Collaboration and skill transmission by first-person perspective view sharing system
WO2018117753A1 (en) Electronic device and method of controlling the same
JPWO2018216355A1 (en) Information processing apparatus, information processing method, and program
US20200197783A1 (en) Information processing apparatus, information processing method, and program
CN108139804A (en) Information processing unit and information processing method
JP2018125003A (en) Information processing method, apparatus, and program for implementing that information processing method in computer
WO2023234456A1 (en) Vr image and content providing system and method for underwater monitoring
US20190355281A1 (en) Learning support system and recording medium
JP2004038470A (en) Augmented reality system and information processing method
WO2023080296A1 (en) Ar device and method for controlling ar device
JP2020126187A (en) Pseudo outgoing experience presentation device by immersion-type high-definition video presentation space
JP2016110542A (en) Head-mounted display device, method of controlling head-mounted display device, and computer program
Kurosaki et al. Skill transmission for hand positioning task through view-sharing system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22944994

Country of ref document: EP

Kind code of ref document: A1