WO2018016928A1 - Virtual reality implementation system and virtual reality implementation method therefor - Google Patents

Virtual reality implementation system and virtual reality implementation method therefor

Info

Publication number
WO2018016928A1
WO2018016928A1 (PCT/KR2017/007944)
Authority
WO
WIPO (PCT)
Prior art keywords
image data
wearable device
virtual
real object
location information
Prior art date
Application number
PCT/KR2017/007944
Other languages
English (en)
Korean (ko)
Inventor
김계현
Original Assignee
김계현
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 김계현
Publication of WO2018016928A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera

Definitions

  • The present invention relates to a virtual reality implementation system, and more particularly to a virtual reality implementation system and a virtual reality implementation method that realize a high-resolution virtual space for works of art through optical imaging, implement a plurality of galleries and works using location-based information, provide additional services, and enable virtual advertisements, so that a plurality of galleries and works can be viewed in a specific real space.
  • An object of the present invention is to provide a virtual reality implementation system and a virtual reality implementation method in which the overlapping portions between real objects and virtually implemented virtual objects in a specific space displayed on a user's wearable device are displayed according to the user's depth of view.
  • Another object of the present invention is to provide a virtual reality implementation system and a virtual reality implementation method that realize a virtual space for an object such as a work of art through optical imaging while at the same time using location-based information.
  • To this end, the virtual reality implementation system of the present invention displays the overlapping portion between a real object and a virtually implemented virtual object in a specific space on the user's wearable device according to the user's depth of view.
  • The virtual reality implementation system of the present invention can thus realistically display the overlapping portions between real objects and virtual objects in a specific space on the display panel of the wearable device.
  • According to one aspect, a virtual reality implementation system comprises: a wearable device, wearable by a user, having a display panel that displays a real object existing in a specific space defined by a plurality of location information transceiving sensors together with a virtual object virtually implemented within the specific space; a user computer that collects the location information of the real object and the location information of the wearable device from the location information transceiving sensors and from a plurality of optical motion sensors detecting the movement of the user and the real object in the specific space, and that transmits the image data of the real object and the image data of the virtual object to the wearable device; and a server that stores the image data of the real object and the image data of the virtual object, determines the location information of the real object and the location information of the wearable device and, when the real object and the virtual object partially overlap, causes the display panel of the wearable device to display the overlapping portion of whichever of the real object and the virtual object is located nearer the wearable device according to the depth of view between the image data of the real object and the image data of the virtual object, the server transmitting the image data of the real object and the image data of the virtual object to the user computer.
  • The virtual reality implementation system may further comprise at least one three-dimensional optical scanning device that scans the real object to generate and output three-dimensional image data, and a rendering computer that receives the three-dimensional image data from the three-dimensional optical scanning device and performs a rendering operation to transmit the image data of the real object to the server.
  • The virtual reality implementation system may further comprise a wide-angle IP camera driven to change its shooting direction in response to the movement of the wearable device under the control of the server, in which case the wearable device displays the image captured by the wide-angle IP camera on its display panel.
  • The system may further comprise a plurality of pressure sensors provided on the bottom surface of the specific space to additionally detect the location information of the real object and the wearable device.
  • In another embodiment, the wearable device further displays additional information about the real object and the virtual object and, when an item of additional information is selected, connects to the additional service corresponding to it.
  • The server provides the image data of the real object and the image data of the virtual object to the display panel of the wearable device so that the real object and the virtual object are further displayed in correspondence with the three-dimensional angle of the user's gaze.
  • The location information transceiving sensor is provided as any one of a GPS device, a communication network base station, a wireless communication repeater, and a drone, and recognizes the location information of the real object and the wearable device.
  • According to another aspect, a virtual reality implementation method of a virtual reality implementation system comprises: defining a specific space using a plurality of location information transceiving sensors; sensing, by the location information transceiving sensors, the location information of a real object existing in the specific space and the location information of a wearable device, worn by a user, whose display panel displays a virtual object virtually implemented in the specific space; obtaining, through a user computer provided in the specific space, the location information of the real object and the location information of the wearable device from the location information transceiving sensors and from a plurality of optical motion sensors detecting the movement of the user and the real object; determining, by a server, the location information of the real object and the location information of the wearable device in the specific space and, when the real object and the virtual object partially overlap, displaying on the display panel of the wearable device the overlapping portion of whichever of the real object and the virtual object is located nearer the wearable device according to the depth of view between the image data of the real object and the image data of the virtual object; and transmitting, by the server, the image data of the real object and the image data of the virtual object to the user computer.
  • The method may further comprise: scanning the real object with at least one three-dimensional optical scanning device to generate and output three-dimensional image data; receiving, by a rendering computer, the three-dimensional image data from the three-dimensional optical scanning device and performing a rendering operation to transmit the image data of the real object to the server; and storing, by the server, the image data of the real object.
  • The virtual reality implementation system of the present invention determines the location information of the real object and the location information of the wearable device in a specific space and, when the real object and the virtual object partially overlap, realistically represents the image of the real object and the image of the virtual object on the display panel of the wearable device, so that a convergent implementation of virtual reality and augmented reality is possible using a wearable device in various spaces.
  • In a virtual gallery application, a virtual curator can narrate the artist's history, auction information, the history of a work, and the creative intention behind it; by selecting a menu displayed together with the work on the display panel of the wearable device, a user can purchase the work directly, or select a virtual advertisement or an actual product advertisement displayed on the display panel to purchase the corresponding product.
  • When applied to a guard post and linked to the wearable device, the wide-angle IP camera is activated, with the characteristic effect that a soldier on duty, without actually going on an outside patrol and while staying in the guard post, can perform sentry duty as intended through the images captured by each wide-angle IP camera.
  • FIG. 1 is an overall schematic diagram showing the configuration of a virtual reality implementation system according to the present invention
  • FIG. 2 is a view showing a user viewing a work by using a wearable device for a real object and a virtual object in a real gallery according to an embodiment of the present invention
  • FIG. 3 is a schematic diagram illustrating that a user utilizes a wearable device with respect to the virtual object illustrated in FIG. 2;
  • FIG. 4 is a schematic diagram illustrating that depths of view are differently set for real objects and virtual objects in the real gallery shown in FIG. 2;
  • FIG. 5 is a schematic view showing the lens direction of the wide-angle IP camera being driven up, down, left, and right in conjunction with the wearable device worn by the user according to another embodiment of the present invention
  • FIG. 6 is a schematic diagram showing the configuration of a virtual reality implementation system using a military drone according to another embodiment of the present invention
  • FIG. 7 is a schematic diagram showing that a near clip plane and a far clip plane from the camera are set based on the viewpoint according to the present invention
  • FIG. 8 is a view showing a wearable device in the form of Google Glass according to an embodiment of the present invention.
  • FIG. 9 is a view showing a schematic configuration of a prior-art gallery system of a virtual space using 3D.
  • FIG. 1 is an overall schematic diagram showing the configuration of a virtual reality implementation system according to the present invention.
  • To implement realistic virtual reality using the wearable device 100, a subject is optically scanned and a three-dimensional image of it is generated so that a virtual object can be represented together with a real object.
  • The virtual reality implementation system of the present invention includes a wearable device 100, a user computer 200, a server 300, a rendering computer 400, a plurality of 3D optical scanning devices 500 (500a and 500b), and a wide-angle IP camera 600.
  • The components of the virtual reality implementation system are connected to the Internet using, for example, a wired or wireless communication network, so as to enable mutual data communication.
  • the wearable device 100 and the user computer 200 are located in a specific space S1.
  • The wearable device 100 is worn by the user M and includes a display panel for outputting images, a microphone for recording voice (for example, a holographic microphone), a speaker for outputting voice in stereo, headphones, and the like.
  • the wearable device 100 includes various sensors, for example, a wide angle camera, an optical motion sensor, a gyroscope sensor, a position information transceiver, a pupil position tracking device, a pupil contraction / relaxation tracking device, and an illuminance sensor.
  • The wearable device 100 displays an image including a real object and a virtual object on the display panel according to the implemented virtual reality.
  • the user computer 200 is provided in a specific space S1 (eg, an actual gallery) and is connected to the wearable device 100 through a wireless communication network.
  • The user computer 200 includes a reader that reads optically readable optical recognition codes (or tags), and collects the location information of the wearable device 100, including its position, direction, and three-dimensional angle, by using optical motion sensors (e.g., laser motion tracking sensors) within the specific space defined by the plurality of location information transceiving sensors.
  • The reader reads an optical recognition code (or tag) that includes three-dimensional information about every subject (e.g., real objects including objects, creatures, etc.) existing in the defined space.
  • A plurality of location information transceiving sensors are installed in the specific space to define the space, and sense information about the position, direction, and three-dimensional angle of the wearable device 100 existing in the defined space.
  • Each location information transceiving sensor detects the straight-line distance to the wearable device 100.
  • The optical motion sensors detect information including the position, shape, size, volume, etc. of all subjects existing within the defined space in real time. The information detected by the reader and the optical motion sensors is also used to correct errors in the subject data.
  • The specific space is provided with a plurality of pressure sensors on its bottom surface, through which the position of the wearable device and the positions of the subjects present are identified and corrected.
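  • For illustration, the straight-line distances reported by several fixed location information transceiving sensors are sufficient to localize the wearable device by trilateration. The following minimal Python sketch is an assumption of ours (the patent does not give an algorithm); the pressure-sensor reading could then cross-check the estimate:

```python
import numpy as np

def trilaterate(anchors, distances):
    """Estimate a 2D position from straight-line distances to fixed
    location information transceiving sensors at known positions.
    Subtracting the last circle equation from the others linearizes
    the system, which is then solved by least squares."""
    anchors = np.asarray(anchors, dtype=float)  # (n, 2) sensor positions
    d = np.asarray(distances, dtype=float)      # (n,) measured distances
    A = 2.0 * (anchors[:-1] - anchors[-1])
    b = (d[-1] ** 2 - d[:-1] ** 2
         + np.sum(anchors[:-1] ** 2, axis=1)
         - np.sum(anchors[-1] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Four sensors at the corners of a 10 m x 10 m space
anchors = [(0, 0), (10, 0), (10, 10), (0, 10)]
true_pos = np.array([3.0, 4.0])
dists = [np.linalg.norm(true_pos - np.array(a)) for a in anchors]
print(trilaterate(anchors, dists))  # ~[3. 4.]
```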
  • The specific space further includes a voice output device capable of independent left and right recording and output, through which all sounds generated by the subjects (for example, real objects or virtual objects) existing in the space are output; position information of each sound source can thus be measured to convey realistic, three-dimensional voice information.
  • the user computer 200 transmits image data and voice data to the wearable device 100 through a wireless communication network.
  • The user computer 200 receives the image data of the real object and the image data of the virtual object from the server 300 through the Internet and provides them to the wearable device 100.
  • The server 300 stores, for the various specific spaces S1 and S2, the image data of all subjects, that is, of the real objects and virtual objects, and voice data including the sound effects of the real objects and virtual objects.
  • The server 300 receives and stores image data of the optically scanned real objects from the plurality of 3D optical scanning devices 500 (500a and 500b) through the Internet.
  • the server 300 transmits image data and voice data to the user computer 200 through the Internet.
  • The server 300 computes control signals to control the user computer 200, the rendering computer 400, and the wide-angle IP camera 600.
  • The server 300 remotely changes the shooting direction of the wide-angle IP camera 600 in response to the movement of the wearable device, and the image data obtained through this is provided to the wearable device 100.
  • The server 300 is also connected to the wearable device 100, a mobile terminal, a computer, and the like located in a personal direct viewing space S2, viewed directly by an individual, to provide image data and voice data.
  • The server 300 receives various information from the user computer 200 through the Internet, stores and analyzes in real time the information about the position, direction, three-dimensional angle, shape, and size of the wearable device 100 and of all subjects in the specific space, and provides through the display panel of the wearable device 100 an image in which reality and virtual reality are fused and reproduced in real time. In addition, the server 300 processes a realistic representation of the real objects and the virtually implemented virtual objects according to the location information, viewing range, and depth of each of the wearable device 100 and the subjects.
  • The server 300 processes masking according to the distance of each subject within the implemented field of view, so that the part of a relatively distant subject that overlaps a subject at a relatively short distance is not displayed; equivalently, the subject located at the shorter distance is displayed first.
  • The server 300 also processes the position and the movement path of a virtually implemented subject three-dimensionally within the implemented field of view, so that its position, three-dimensional angle, and perspective effects can be realistically implemented.
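  • As a concrete illustration of this masking step, the sketch below composites a virtual object over the camera image using per-pixel depth: wherever the real object lies nearer the wearable device than the virtual object, the virtual pixels are suppressed. This is an assumed z-buffer-style rendering of the described behavior, not the patent's actual pipeline:

```python
import numpy as np

def composite_by_depth(real_rgb, real_depth, virt_rgb, virt_depth):
    """Overlay a virtual object on the real scene, masking whichever
    subject lies farther from the viewer at each pixel. The depth
    arrays hold distance from the wearable device per pixel
    (np.inf where the virtual object is absent)."""
    out = real_rgb.copy()
    virt_in_front = virt_depth < real_depth  # per-pixel visibility test
    out[virt_in_front] = virt_rgb[virt_in_front]
    return out

# Toy 1x2-pixel scene: the virtual fish is behind the real object in
# pixel 0 (masked) and in open space in pixel 1 (drawn).
real_rgb = np.array([[[200, 200, 200], [0, 0, 50]]], dtype=np.uint8)
real_depth = np.array([[1.0, np.inf]])
virt_rgb = np.array([[[255, 120, 0], [255, 120, 0]]], dtype=np.uint8)
virt_depth = np.array([[2.0, 2.0]])
print(composite_by_depth(real_rgb, real_depth, virt_rgb, virt_depth))
# pixel 0 keeps the real object, pixel 1 shows the virtual object
```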
  • The rendering computer 400 receives the three-dimensional image data of the real objects scanned by the 3D optical scanning devices 500 through the Internet, performs a rendering operation that converts the three-dimensional image data into the two-dimensional image data displayed on the display panel of the wearable device 100, and transmits the resulting image data to the server 300 through the Internet.
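  • At its core, the 3D-to-2D conversion performed by the rendering computer is a perspective projection of scanned points onto the display panel. A minimal sketch under assumed camera parameters (the focal length and resolution below are illustrative, not taken from the patent):

```python
import numpy as np

def project_point(p_cam, f=1.2, width=1920, height=1080):
    """Project a point in camera coordinates (x right, y up, z forward)
    to pixel coordinates by perspective division."""
    x, y, z = p_cam
    if z <= 0:
        return None                   # behind the camera: not drawable
    u = (f * x / z + 0.5) * width     # normalized [-0.5, 0.5] -> pixels
    v = (0.5 - f * y / z) * height    # flip y: screen origin is top-left
    return u, v

print(project_point(np.array([0.5, 0.2, 2.0])))  # ~(1536.0, 410.4)
```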
  • A plurality of 3D optical scanning devices 500 are provided; they scan the real objects to obtain three-dimensional image data and transmit it to the server 300 through the Internet.
  • the wide-angle IP camera 600 operates according to a driving signal received from the wearable device 100 or the like.
  • The wide-angle IP cameras 600 are provided in various spaces and are remotely controlled through the server 300 to change their shooting direction according to the gaze direction of the wearable device 100, obtaining image data of the corresponding space and transmitting it to the server 300.
  • In this way, the virtual reality implementation system of the present invention can implement virtual reality and augmented reality by using the wearable device 100 in various spaces to realistically express real objects and virtual objects.
  • FIG. 2 is a view showing a user viewing a work by using a wearable device with real objects and virtual objects in a real gallery according to an embodiment of the present invention; FIG. 3 is a schematic diagram illustrating a user utilizing a wearable device with respect to the virtual object shown in FIG. 2; FIG. 4 is a schematic diagram illustrating that depths of view are set differently for the real objects and virtual objects in the real gallery shown in FIG. 2; and FIG. 7 is a schematic diagram showing that a near clip plane and a far clip plane from the camera are set based on the viewpoint according to the invention.
  • The specific space S1 is a real gallery, and the space is defined by the plurality of location information transceiving sensors L1 to L8.
  • The user M wearing the wearable device 100 is in the specific space S1, in which a plurality of optical motion sensors I1 to I15 are disposed at regular intervals along the edge of the defined space, and pressure sensors P1 are provided on the bottom surface of the specific space S1 to detect location information of the real object R as well as the user's position, movement, and the like.
  • The location information transceiving sensors L1 to L8 are provided with, for example, Bluetooth beacons.
  • The real gallery S1 is defined as a predetermined space; the location information transceiving sensors L1 to L8 are installed to cover the entire area, and the optical motion sensors I1 to I15 are installed between at least two adjacent location information transceiving sensors L1 to L8, so that a user M, a real object R, and a virtual object V moving in the real gallery S1 are sensed and their position data is detected as valid data.
  • A plurality of pressure sensors P are mounted on the floor of the real gallery S1 to sense the actual position data of a user M or a real object R moving while applying pressure to the floor.
  • The data values for the real object R can be secured by scanning it with the high-resolution 3D optical scanning devices 500 (500a, 500b); the secured data values are transmitted to the server 300 through the Internet, stored, and then displayed on the display screen of the wearable device 100 of the user M.
  • The application principle of the virtual reality implementation system in this embodiment is to cause a virtual object V depicted as a subject, for example in the shape of a fish, to be realistically reproduced in reality based on the positional information in the specific space S1.
  • All coordinates, sizes, etc. of all objects in the specific space S1 are recognized, and three-dimensional coordinates are assigned and displayed within the specific space S1. That is, the virtual reality implementation system analyzes the location information of each object and the information of the field of view, enabling a realistic fusion of the virtual object V and the real object R.
  • A real object R and a fish-shaped virtual object V appear on the display panel of the wearable device 100 in the specific space S1; the fish-shaped virtual object V swims through the specific space S1 and, as shown in the user's line of sight A of FIG. 3, passes behind the real object R.
  • When the virtual object V passes behind the real object R, an overlapping region H1 arises in which the real object R and the virtual object V overlap, so that the user's line of sight has a depth of view: the image of the virtual object V in the overlapping portion is displayed as partly covered by the real object R, realizing a virtual 3D image as if the virtual object actually existed.
  • As shown in the user's line of sight A of FIG. 3, the portion of the virtual object V that overlaps the real object R is not visible, whereas from the user's line of sight B of FIG. 3 it is shown.
  • Since the present invention fuses virtual reality and augmented reality, the fish-shaped virtual object V that was outside the field of view of the user M can be virtually implemented as if a real thing were coming closer to the front of the user M.
  • These data values are synthesized and displayed, various items of additional information about the virtual object V and the real object R are displayed, and various additional services can be received by selecting this additional information.
  • The data values for the real object R may be obtained by scanning it in real time with the 3D optical scanning devices 500 (500a, 500b), by sensing with the optical motion sensors I1 to I15 installed in the real gallery S1 and the location information transceiving sensors L1 to L8, or through the pressure sensors P located on the floor.
  • The error can be reduced by comparing and combining the data values obtained by the above-described methods, as sketched below.
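  • One simple way to combine the scanner, motion-sensor, transceiving-sensor, and pressure-sensor readings into a lower-error estimate is inverse-variance weighting; the variances below are illustrative assumptions, since the patent only states that comparing the readings reduces error:

```python
import numpy as np

def fuse_positions(estimates, variances):
    """Fuse independent position estimates by inverse-variance
    weighting: more trustworthy sensors pull the result harder."""
    est = np.asarray(estimates, dtype=float)      # (n, dims)
    w = 1.0 / np.asarray(variances, dtype=float)  # (n,)
    return (w[:, None] * est).sum(axis=0) / w.sum()

# Optical motion sensor, Bluetooth beacon, and floor pressure sensor
# each report a slightly different position for the same user.
print(fuse_positions([(3.10, 4.00), (2.90, 4.20), (3.02, 3.95)],
                     [0.04, 0.25, 0.01]))
```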
  • The wearable device 100 measures its own coordinates, direction, and angle information in three-dimensional space using its various sensors, and through this the relative positions, that is, the three-dimensional spatial coordinates, between the users M, the real objects R or virtual objects V, and the wearable devices 100 can be sensed.
  • The sensors for detecting the 3D location information may include, for example, an IR sensor, an optical sensor, and the like. The user's field of view is therefore determined by the position of the wearable device 100 and its three-dimensional angle information.
  • A viewing frustum, the region represented on the display for the user M, is defined through spatial position coordinates and angles, and the subjects entering this viewing frustum are realized realistically through depth-of-view processing. Since the user sees the real objects and the background through the camera of the wearable device 100, the viewing frustum is, from the camera's viewpoint, the visible region that determines which part of the whole scene appears on the final screen, that is, the determined display area. The screen displayed on the display panel of the wearable device 100 is thus the single view between a near clip plane and a far clip plane of the viewing frustum, according to the depth of view, as sketched below.
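  • The screen-selection rule just described can be stated compactly: a subject is drawn only if it lies between the near and far clip planes and inside the field of view. A sketch with assumed frustum parameters (the patent names only the near and far clip planes; the field-of-view values are ours):

```python
import numpy as np

def in_viewing_frustum(p_cam, near=0.1, far=100.0,
                       v_fov_deg=60.0, aspect=16 / 9):
    """Test whether a point in camera coordinates falls inside the
    viewing frustum bounded by the near and far clip planes."""
    x, y, z = p_cam
    if not (near <= z <= far):                    # depth-of-view bounds
        return False
    half_h = z * np.tan(np.radians(v_fov_deg) / 2)
    half_w = half_h * aspect
    return abs(x) <= half_w and abs(y) <= half_h  # lateral bounds

print(in_viewing_frustum((0.5, 0.2, 2.0)))    # True: between clip planes
print(in_viewing_frustum((0.0, 0.0, 150.0)))  # False: beyond the far clip plane
```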
  • Voice information generated by an object or person that does not enter the field of view is implemented so that it can still be output, and the real space implemented by the virtual reality implementation system of the present invention is implemented to match the virtual space in shape, width, and height.
  • When the present invention is applied to a virtual gallery and a plurality of users M wearing the wearable devices 100 tour the real gallery S1 with a virtual curator, it is ensured that they do not collide with the various other users M, virtual curators, and real objects R existing in the space of the real gallery S1.
  • The virtual curator may narrate the artist's history, the auction information, the history of the work, and the creative intention behind the work.
  • The virtual curator, like the curator of a real gallery, follows the audience and gives a detailed explanation of each work.
  • By selecting a menu displayed together with the work on the display panel of the wearable device 100, the work may be purchased directly or its auction may be joined, and a corresponding product may be purchased by selecting a virtual advertisement or an actual product advertisement displayed on the display panel, for example a banner advertisement or a product advertisement such as one for a vehicle.
  • The wearable device 100 outputs images and sounds from its display panel, has a device capable of reporting its position to the various spatial coordinate recognition sensors together with a data communication device, photographs the scene in front of the device and transmits it in real time, and has laser scanning equipment inside for measuring and recording the positions of the pupils.
  • The sound coming from the right and the sound coming from the left are set differently with respect to the listener. Therefore, according to the relative positions of the user M wearing the wearable device 100 and the virtual object V in the space of the real gallery S1, the levels of the sounds heard from the left and right sides of the wearable device 100 are set differently.
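  • The left/right level difference can be modeled with constant-power panning driven by the angle of the virtual object relative to the wearer's heading. A hypothetical sketch of this behavior (the patent does not specify the panning law):

```python
import numpy as np

def stereo_gains(listener_xy, source_xy, heading_deg):
    """Return (left, right) gains for a virtual sound source using
    constant-power panning on the source's azimuth relative to the
    listener's heading (0 deg = facing +x, angles counterclockwise)."""
    dx, dy = np.subtract(source_xy, listener_xy)
    azimuth = np.arctan2(dy, dx) - np.radians(heading_deg)
    pan = np.clip(np.sin(azimuth), -1.0, 1.0)  # +1 on the left, -1 on the right
    theta = (1.0 - pan) * np.pi / 4            # 0 (hard left) .. pi/2 (hard right)
    return np.cos(theta), np.sin(theta)

# A fish-shaped virtual object ahead-left of the user: louder on the left.
print(stereo_gains((0, 0), (1, 1), heading_deg=0))  # ~(0.97, 0.23)
```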
  • In this way, a user may selectively view all sides of a work: front, back, sides, top, and bottom.
  • The positions of the user's pupils, tracked inside the wearable device 100, may also be checked to determine the user's preference for the virtual object V as an object of interest.
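  • A crude way to turn tracked pupil positions into a preference signal is to accumulate gaze dwell time per object. This is entirely an illustrative assumption; the patent only says that the pupils may be checked to determine preference:

```python
from collections import defaultdict

def preference_scores(gaze_hits, frame_dt=1 / 60):
    """Sum dwell time per object, given which object (if any) the gaze
    ray from the pupil tracker hits on each display frame."""
    dwell = defaultdict(float)
    for obj_id in gaze_hits:
        if obj_id is not None:
            dwell[obj_id] += frame_dt
    return dict(dwell)

# 3 s on a painting, 0.5 s on nothing, 1 s on a sculpture
hits = ["painting_3"] * 180 + [None] * 30 + ["sculpture_1"] * 60
print(preference_scores(hits))  # ~{'painting_3': 3.0, 'sculpture_1': 1.0}
```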
  • Perspective processing may also be performed according to the depth of view to improve realism.
  • The depth of view is processed three-dimensionally according to the positions at which the plurality of real objects R and the virtual object V are placed in the specific space, so that the user's gaze perceives a different distance along each arrow direction, and the image data of the virtual object V presented to the user M is obtained differently accordingly.
  • The actual real objects R are displayed according to the depth of view, and when the virtual object V is positioned behind a real object R such that a part of the virtual object V is covered by the real object R, that part is processed to be invisible in the field of view of the user M.
  • Such a virtual reality implementation system may be implemented in various forms; for example, as shown in FIG. 2, it may be implemented as a virtual gallery system in a real gallery S1 in which a specific space is defined.
  • FIG. 5 is a schematic diagram illustrating the lens direction of the wide-angle IP camera being driven through a step motor in the up, down, left, and right directions in conjunction with the wearable device worn by the user according to another exemplary embodiment of the present invention.
  • The wide-angle IP camera 600 of this embodiment is linked with the wearable device 100 worn by a user so that its lens direction, that is, its shooting direction, is remotely controlled through the step motor. That is, as the line of sight of the user wearing the wearable device 100 moves up, down, left, or right, the shooting direction of the wide-angle IP camera 600 is driven up, down, left, or right accordingly.
  • The server 300 recognizes the movement of the wearable device 100 through the Internet and rotates the wide-angle IP camera 600 up, down, left, and right in accordance with that movement to change its shooting direction.
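  • A minimal sketch of the mapping the server might apply: the change in the wearer's head yaw and pitch since the last update is converted into step counts for the camera's pan and tilt step motors (the step angle and microstepping values are assumptions for illustration):

```python
def head_motion_to_steps(d_yaw_deg, d_pitch_deg,
                         deg_per_step=1.8, microsteps=16):
    """Convert a head-orientation delta from the wearable device into
    pan/tilt step counts for the wide-angle IP camera's step motors."""
    steps_per_deg = microsteps / deg_per_step
    pan_steps = round(d_yaw_deg * steps_per_deg)     # left/right
    tilt_steps = round(d_pitch_deg * steps_per_deg)  # up/down
    return pan_steps, tilt_steps

# Head turned 5 degrees right and 2 degrees up since the last update
print(head_motion_to_steps(5.0, 2.0))  # (44, 18)
```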
  • The wearable device 100 worn by the user M and the wide-angle IP camera 600 are each connected to the server 300 through the Internet.
  • The configuration in which the wide-angle IP camera 600 is linked to the wearable device 100 worn by the user M may be applied in various places, such as a stadium and a guard post.
  • In a stadium, for example, the wearable device may be connected to the server 300 so as to drive the step motor of the wide-angle IP camera 600 and change the shooting direction according to the game situation. That is, as the user M wearing the wearable device 100 moves his or her head up, down, left, or right, the lens direction of the wide-angle IP camera 600 is likewise driven up, down, left, and right through the step motor.
  • a plurality of wide-angle IP cameras 600 may be installed in the stadium.
  • Among the plurality of wide-angle IP cameras 600, the user M wearing the wearable device 100 may select the specific wide-angle IP camera 600 that outputs a closer or more vivid video signal. The field of view of the user M wearing the wearable device 100 then shows the closer or more thrilling video signal transmitted from that specific wide-angle IP camera 600.
  • When applied to a guard post, the wide-angle IP camera 600 is driven through the step motor in conjunction with the wearable device 100, and through the wide-angle IP cameras 600 installed at various places on the front line, soldiers of the guard post can perform sentry duty as intended through the images captured by each wide-angle IP camera 600, even while staying in the guard post and without actually going on an outside patrol.
  • FIG. 6 is a schematic diagram showing the configuration of a virtual reality implementation system using a military drone according to another embodiment of the present invention.
  • The virtual reality implementation system of this embodiment detects the location information of soldiers using military drones D1 to D4 and thereby implements the soldiers' situation in virtual reality.
  • The military drones D1 to D4 are equipped with a position sensor and a heat sensor to detect the presence and location of each soldier.
  • The military drones D1 to D4 can collect personal information, such as the affiliation, rank, and class of the allies m1 to m4 wearing the wearable devices 100, together with the biometric information of each of the allies m1 to m4; this biometric information may be monitored to provide unit status on a per-unit basis.
  • A plurality of military drones are used to detect the locations of the enemy e1 to e3 and of weapons t or military equipment, and to fight against the enemy e1 to e3.
  • The allied soldiers m1 to m4 collect information on the various war situations from the plurality of military drones D1 to D4 flying over the battlefield, and can use the terrain information and the location information of the enemy e1 to e3 to assess the war situation and to recommend assembly points, targets to strike, effective defensive routes, attack routes, and movement routes.
  • The military drones D1 to D4 fly over the battlefield and obtain their current location information using a GPS receiver that receives location information from GPS satellites.
  • The military drones D1 to D4 also exchange location information with adjacent military drones D1 to D4, or exchange calculated distance information, and by using triangulation and similar techniques can determine the exact location of the enemy e1 to e3.
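  • For instance, two drones that each know their own GPS position and the compass bearing at which they observe a target can fix the target's position by intersecting the two bearing rays. A hypothetical illustration of the triangulation the patent mentions:

```python
import numpy as np

def intersect_bearings(p1, bearing1_deg, p2, bearing2_deg):
    """Locate a target from two observer positions (east, north) and the
    compass bearings (clockwise from north) at which each observer sees
    it, by solving the two-ray intersection."""
    p1, p2 = np.asarray(p1, dtype=float), np.asarray(p2, dtype=float)
    d1 = np.array([np.sin(np.radians(bearing1_deg)),
                   np.cos(np.radians(bearing1_deg))])
    d2 = np.array([np.sin(np.radians(bearing2_deg)),
                   np.cos(np.radians(bearing2_deg))])
    # Solve p1 + t1*d1 = p2 + t2*d2 for (t1, t2)
    t = np.linalg.solve(np.column_stack((d1, -d2)), p2 - p1)
    return p1 + t[0] * d1

# Drone D1 at the origin sees the target at bearing 45 deg; drone D2,
# 500 m to the east, sees it at bearing 315 deg.
print(intersect_bearings((0, 0), 45, (500, 0), 315))  # ~[250. 250.]
```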
  • The military drones D1 to D4 are equipped with various sensors, for example a location information transceiving device, a thermal image sensor, and an optical sensor, so as to transmit the location information about the enemy and the various war situations to the friendly forces m1 to m4 wearing the wearable devices 100.
  • Using the military drones D1 to D4, the friendly forces m1 to m4 can determine the current positions of the enemy e1 to e3 and the enemy tank t, as well as the direction of attack on the enemy e1 to e3, and can easily identify the future paths or retreat routes of the allies m1 to m4, enabling rapid and safe unit movement.
  • Each of the allies m1 to m4 can perform rapid and safe unit movements or operations through the wearable device 100 they wear, without exposing their position to the many enemy soldiers e1 to e3 or the enemy tank t.
  • The location information transceiving sensors may also be provided as a Wi-Fi repeater, a Bluetooth beacon, or the like.
  • In an example using a communication network base station, the specific space may be a certain large area, not necessarily a room or office, for example Seoul, and the specific objects may be buildings.
  • The locations of the buildings are three-dimensionally scanned using a laser scanner, an IR scanner, or a thermal scanner, and at the same time the user is located through a communication base station, a Wi-Fi repeater, a Bluetooth beacon, or the like, so that content such as a virtual object V swimming between the buildings can be implemented.
  • FIG. 8 is a view showing a wearable device in the form of Google Glass according to an embodiment of the present invention.
  • The wearable device 100 of the present invention may be provided in various shapes, and may be configured in the form of transparent, translucent, or opaque glass, or the like.
  • the wearable device 100 may be provided in the form of a Google Glass product.
  • When the virtual reality implementation system of the present invention is applied to the virtual gallery system, a virtual curator is displayed who narrates the artist's history, the auction information, the history of the work, and the creative intention behind the work; the work may be purchased directly, and a corresponding product may be purchased by selecting a virtual advertisement or an actual product advertisement displayed together with the artwork on the display panel.
  • When the virtual reality implementation system of the present invention is applied to a guard post, the wide-angle IP camera 600 is driven through the step motor in conjunction with the wearable device 100, so that soldiers of the guard post, without actually going on an outside patrol and while actually staying in the guard post, can perform sentry duty as intended through the images captured by each wide-angle IP camera 600.
  • When the virtual reality implementation system of the present invention is applied to military drones, the current positions of the enemy and enemy tanks can be determined, as well as the direction of attack on the enemy and the future movement routes, retreat routes, and attack routes of the friendly forces, making it possible to move troops and carry out operations quickly and safely.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to a virtual reality implementation system and a virtual reality implementation method therefor. In a virtual reality implementation system according to the present invention, a real object existing in a specific space and a virtual object virtually implemented and expressed in the specific space are displayed on a display panel of a wearable device. The specific space is defined by a plurality of sensors transmitting and receiving location information. A server stores, in the form of three-dimensional images, image data of the real object and image data of the virtual object and provides the image data to a user computer in the specific space. The server determines location information of the wearable device and location information of the real object within the specific space. Then, when the real object and the virtual object partially overlap, the server processes the images such that the overlapping portion appears masked or displayed on the display panel of the wearable device according to the depth of view between the image data of the real object and the image data of the virtual object. According to the present invention, a wearable device can be used in various spaces to express a real object and a virtual object with a sense of reality, so as to implement virtual reality and augmented reality in a combined manner.
PCT/KR2017/007944 2016-07-22 2017-07-24 Virtual reality implementation system and virtual reality implementation method therefor WO2018016928A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2016-0093043 2016-07-22
KR20160093043 2016-07-22
KR1020160138326A KR101696243B1 (ko) Virtual reality implementation system and virtual reality implementation method thereof
KR10-2016-0138326 2016-10-24

Publications (1)

Publication Number Publication Date
WO2018016928A1 (fr) 2018-01-25

Family

ID=57835444

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/007944 WO2018016928A1 (fr) Virtual reality implementation system and virtual reality implementation method therefor

Country Status (2)

Country Link
KR (1) KR101696243B1 (fr)
WO (1) WO2018016928A1 (fr)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6919222B2 (ja) * 2017-02-27 2021-08-18 セイコーエプソン株式会社 Display device and display device control method


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101106857B1 (ko) * 2011-09-23 2012-01-20 여미옥 Gallery system of a virtual space using 3D
KR20150126938A (ko) * 2013-03-11 2015-11-13 매직 립, 인코포레이티드 Systems and methods for augmented and virtual reality
KR101360999B1 (ko) * 2013-08-09 2014-02-10 코리아디지탈 주식회사 Augmented reality-based real-time data providing method and portable terminal using the same
KR101454445B1 (ko) * 2013-10-16 2014-10-27 백명섭 Three-dimensional gallery exhibition and sales system
KR101638550B1 (ko) * 2014-06-25 2016-07-12 경북대학교 산학협력단 Virtual reality system using mixed reality and implementation method thereof

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109087237A (zh) * 2018-06-29 2018-12-25 邓文婕 A virtual helmet
CN111766959A (zh) * 2019-04-02 2020-10-13 海信视像科技股份有限公司 Virtual reality interaction method and virtual reality interaction apparatus
CN111766959B (zh) * 2019-04-02 2023-05-05 海信视像科技股份有限公司 Virtual reality interaction method and virtual reality interaction apparatus
CN112099638A (zh) * 2020-10-19 2020-12-18 深圳市瑞立视多媒体科技有限公司 Information processing method, apparatus, and computer device in a virtual reality scene
CN112099638B (zh) * 2020-10-19 2024-02-06 瑞立视多媒体科技(北京)有限公司 Information processing method, apparatus, and computer device in a virtual reality scene

Also Published As

Publication number Publication date
KR101696243B1 (ko) 2017-01-13

Similar Documents

Publication Publication Date Title
US11789523B2 (en) Electronic device displays an image of an obstructed target
CN107782314B (zh) 一种基于扫码的增强现实技术室内定位导航方法
EP3165939B1 (fr) Création et mise à jour dynamiques de cartes de positionnement intérieur
KR102060453B1 (ko) 화상 표시 시스템, 화상 표시 시스템의 제어방법, 화상 전송 시스템 및 헤드 마운트 디스플레이
WO2018016928A1 (fr) Système de mise en œuvre de réalité virtuelle et procédé associé de mise en œuvre de réalité virtuelle
JP6615732B2 (ja) 情報処理装置および画像生成方法
US11734898B2 (en) Program, information processing method, and information processing terminal
CN107844196A (zh) 视频处理设备、视频处理方法和视频处理系统
CN105393284A (zh) 基于人类身体数据的空间雕刻
KR20200060361A (ko) 정보 처리 장치, 정보 처리 방법, 및 프로그램
CN106920260A (zh) 立体惯性导盲方法及装置和系统
EP3729235B1 (fr) Traitement de données
WO2017142130A1 (fr) Procédé de traitement d'image destiné à fournir une réalité virtuelle, et un dispositif de réalité virtuelle
CN112558008B (zh) 基于光通信装置的导航方法、系统、设备及介质
US20150269777A1 (en) Optically Composited Augmented Reality Pedestal Viewer
EP4365887A1 (fr) Système d'affichage d'image et procédé d'affichage d'image
KR20230132001A (ko) 적외선 기반의 스마트형 객체 정보 처리 장치 및 방법
CN116107527A (zh) 信息显示方法及其处理装置与信息显示系统
JP2020154009A (ja) 情報表示システムおよびウェアラブルデバイス

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17831400

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17831400

Country of ref document: EP

Kind code of ref document: A1