KR101770188B1 - Method for providing mixed reality experience space and system thereof - Google Patents
- Publication number
- KR101770188B1 (application KR1020150055376A)
- Authority
- KR
- South Korea
- Prior art keywords
- scenario
- user
- virtual
- virtual object
- interaction
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiments of the present invention are directed to a method and system for providing a virtual experience space based on mixed reality. A method for providing a mixed reality experience space according to an embodiment of the present invention includes: displaying a virtual object existing in a virtual scenario on at least one of a scenario main output device and a user device located in a real experience space; generating an interaction result based on the sensed interaction when an interaction between the virtual object and a user is sensed; modifying the virtual scenario based on the interaction result; and displaying a virtual object present in the modified virtual scenario on at least one of the scenario main output device and the user device. According to the embodiments of the present invention, a realistic virtual experience space can be provided.
Description
Embodiments of the present invention are directed to a method and system for providing a mixed reality experience space.
The existing game-console-type experience technology is limited in that it cannot provide direct interaction between an object existing in three-dimensional space and a user.
For example, suppose the user wants to pop a virtual soap bubble. In this case, the user must move his or her body to break the bubble while viewing both the virtual bubble and his or her own image output on a remote display. The required motion is unnatural, because the user has to control his or her movement while watching the display image.
In addition, the conventional game-console-type experience technique has difficulty providing interaction between each user and an object in an experience space shared by a plurality of users.
Embodiments of the present invention provide a way for a user to interact with a virtual object.
A method for providing a mixed reality experience space according to an embodiment of the present invention includes: displaying a virtual object existing in a virtual scenario on at least one of a scenario main output device and a user device located in the mixed reality experience space; generating an interaction result based on the sensed interaction when an interaction between the virtual object and a user is sensed; modifying the virtual scenario based on the interaction result; and displaying a virtual object present in the modified virtual scenario on at least one of the scenario main output device and the user device.
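The display, sense, modify, and redisplay steps above form a simple loop. The following is a minimal illustrative Python sketch, not code from the patent; the class names, the delete-on-hit behavior, and the callback signatures are all assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class VirtualObject:
    name: str
    position: tuple  # (x, y, z) in experience-space coordinates

@dataclass
class Scenario:
    objects: list = field(default_factory=list)

def run_step(scenario, sense_interaction, display):
    """One pass of the display -> sense -> modify -> redisplay loop."""
    display(scenario.objects)                  # show objects on the output device(s)
    hit = sense_interaction(scenario.objects)  # e.g. the user touches/shoots an object
    if hit is not None:
        scenario.objects.remove(hit)           # modify scenario: delete the hit object
    display(scenario.objects)                  # redisplay the modified scenario
    return hit
```

In a real system the `display` callback would fan out to the scenario main output device and the user devices, and `sense_interaction` would be driven by the sensors described below.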
In one embodiment, displaying the virtual object comprises displaying the virtual object on the device, from among the scenario main output device and the user device, that is capable of displaying the virtual object at the closer focal distance from the user.
In one embodiment, the method may further comprise displaying a virtual user interface corresponding to a real user interface at the location of the real user interface when the user interacts with the virtual object using the real user interface.
In one embodiment, the method may further include outputting sound according to the virtual scenario by varying the volumes of speakers based on the positions of the speakers present in the mixed reality experience space and the position of the user.
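The per-speaker volume control described above can be illustrated with a simple inverse-distance rule. This is a hedged Python sketch; the attenuation model and the normalization to the nearest speaker are assumptions, not taken from the patent:

```python
import math

def speaker_volumes(user_pos, speaker_positions, base_volume=1.0):
    """Scale each speaker's volume by inverse distance to the user,
    normalized so the nearest speaker plays at base_volume."""
    dists = [math.dist(user_pos, s) for s in speaker_positions]
    nearest = min(dists)
    return [base_volume * nearest / max(d, 1e-6) for d in dists]
```

For example, with the user at the origin and speakers at 1 m and 2 m, the farther speaker plays at half the nearest speaker's volume.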
In one embodiment, displaying the virtual object may further include displaying the virtual object on the user device in consideration of the six-degree-of-freedom (6-DOF) information of the user and the position of the scenario main output device.
A mixed reality experience space providing system according to an embodiment of the present invention includes: a scenario main output device and a user device that receive information about a virtual object existing in a virtual scenario and display the virtual object corresponding to the received information; and a scenario operating server that senses an interaction between the virtual object and a user, generates an interaction result based on the interaction, modifies the virtual scenario based on the interaction result, and transmits information about a virtual object existing in the modified virtual scenario to at least one of the scenario main output device and the user device.
In one embodiment, the scenario operating server may transmit the information about the virtual object to the device, among the scenario main output device and the user device, that is capable of displaying the virtual object at the closer focal distance from the user.
In one embodiment, the system further comprises a real user interface used by the user for interacting with the virtual object, and the scenario operating server displays a virtual user interface corresponding to the real user interface at the location of the real user interface.
In one embodiment, the system further comprises speakers for outputting sound according to the virtual scenario, and the scenario operating server controls the volume of each speaker based on the positions of the speakers and the position of the user.
In one embodiment, the scenario operating server may transmit information about a virtual object to be displayed on the user device to the user device, taking into consideration the 6-DOF information of the user and the location of the scenario main output device.
According to the embodiments of the present invention, a realistic virtual experience space can be provided, and a plurality of users can simultaneously experience virtual reality.
FIG. 1 is an exemplary view for explaining a mixed reality experience space according to an embodiment of the present invention;
FIG. 2 is a block diagram illustrating a scenario management server according to an embodiment of the present invention;
FIG. 3 is a block diagram illustrating a scenario main output device according to an embodiment of the present invention;
FIG. 4 is a block diagram for explaining a user interface according to an embodiment of the present invention;
FIG. 5 is a block diagram illustrating a user device according to an embodiment of the present invention; and
FIGS. 6 and 7 are views for explaining a mixed reality experience space for virtual shooting training according to an embodiment of the present invention.
In the following description of the embodiments of the present invention, a detailed description of known functions and configurations incorporated herein will be omitted when it may make the subject matter of the present invention rather unclear.
Embodiments of the present invention can utilize technologies being introduced into the consumer electronics market as "N-screen technology". N-screen technology can provide a service that enables a user to continuously view content even while moving, by interworking a plurality of display devices in a cloud-service-based environment.
Embodiments of the present invention provide a virtual object in a space where a user is actually located, without using a composite image based on blue screen technology, so that a user can directly interact with a virtual object.
Embodiments of the present invention allow multiple participants to simultaneously share one experience space and implement various interactive scenarios in which the mutual interaction spaces are shared or separated.
Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings.
FIG. 1 is an exemplary view for explaining a mixed reality experience space according to an embodiment of the present invention.
At least one scenario main output device and at least one user device may be located in the mixed reality experience space.
FIG. 2 is a block diagram illustrating a scenario management server according to an embodiment of the present invention.
FIG. 3 is a block diagram illustrating a scenario main output device according to an embodiment of the present invention.
FIG. 4 is a block diagram illustrating a user interface according to an embodiment of the present invention.
FIG. 5 is a block diagram illustrating a user device according to an embodiment of the present invention.
Generally, in the case of an optical system that forms a virtual image in space, the virtual image is formed at a certain distance from the user's viewpoint. Therefore, it can be assumed that the virtual screen on which the virtual object is displayed is formed at a fixed focal distance from the user.
The stereoscopic optical system can embed a display filter (e.g., a stereoscopic filter) corresponding to each of the virtual screen formed by the scenario main output device and the virtual screen formed by the user device.
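The fixed-focal-distance assumption above is what enables the device-selection rule stated in the claims: a virtual object can be routed to whichever display's virtual screen sits at the focal distance closest to the object. The following is a minimal Python sketch under that assumption; the device names and distances are illustrative, not from the patent:

```python
def choose_display_device(object_distance, device_focal_distances):
    """Pick the device whose (fixed) focal distance is closest to the
    virtual object's distance from the user.

    device_focal_distances: dict mapping device name -> focal distance (meters).
    """
    return min(device_focal_distances,
               key=lambda name: abs(device_focal_distances[name] - object_distance))
```

For example, with a near-focus user device (HMD) and a far wall screen, a nearby virtual bubble is routed to the HMD while a distant target goes to the main screen.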
FIGS. 6 and 7 are illustrations for explaining a mixed reality experience space for virtual shooting training according to an embodiment of the present invention.
Conventional shooting training (or a shooting game) visualizes a virtual target on a small number of image screens disposed in front of the user and lets a user located at a distance from the image screen shoot the virtual target with a weapon.
In the embodiments of the present invention, the space capable of expressing a virtual target can be extended beyond the screen. For example, N-screen technology can be used to implement multiple screens within the mixed reality experience space. In addition, each of the users U1, U2, and U3 may wear a user device, for example an HMD or an EGD. As described above, the 6-DOF information of the user measured at the user device can be transmitted to the scenario operating server.
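The routing decision implied here — draw a target on the shared screen when it falls within the screen area, otherwise on the user's HMD — can be sketched as follows. This is illustrative Python only; the coordinate convention and the rectangular bounds test are assumptions:

```python
def assign_render_target(target_pos, screen_x_range, screen_y_range):
    """Route a target to the shared screen when it falls inside the screen's
    bounds (in experience-space coordinates); otherwise route it to the
    user's HMD, which can render outside the screen area."""
    x, y, _ = target_pos
    on_screen = (screen_x_range[0] <= x <= screen_x_range[1]
                 and screen_y_range[0] <= y <= screen_y_range[1])
    return "main_screen" if on_screen else "hmd"
```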
Thus, the users U1 and U2 can see, through the user device, a virtual target that cannot be output to the screen formed by the scenario main output device.
It is assumed that the target T3 is visible on the scenario main output device.
On the other hand, it is assumed that the users U2 and U3 interact with virtual targets using a real user interface (for example, a real model rifle). At this time, an image corresponding to the user's hand can be acquired through an image acquisition unit (or a sensor for extracting depth information) built into the user device worn by the users U2 and U3, and can be transmitted to the scenario operating server. Accordingly, the scenario operating server can overlap and visualize a virtual user interface (e.g., a space laser gun) at the location of the real user interface used by the users U2 and U3. Similarly, for the user U1, who uses a bare hand or a real user interface with a built-in switch or the like, a virtual user interface (e.g., a space laser gun) can be superimposed and visualized.
Since the virtual user interface is also controlled by computer simulation, the result of the interaction between the virtual weapon and the virtual target (e.g., whether the target is hit) can be expressed. In addition, natural audio output can be performed based on the position of the user device existing in the mixed reality experience space and the position of the main speaker. The location of the user device may be obtained from a sensor mounted on the user device, and the location of the main speaker may be provided by the system operator. The scenario operating server can control the channel and volume of each speaker in consideration of the positions of the speakers located in the mixed reality experience space, thereby realizing natural sound output.
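A hit test like the one mentioned — deciding whether the virtual weapon's shot hits the virtual target — is commonly implemented as a ray-sphere intersection. The following is a hedged Python sketch; the spherical-target model and unit-direction assumption are illustrative, not the patent's method:

```python
def ray_hits_sphere(origin, direction, center, radius):
    """Whether a shot fired from `origin` along the unit vector `direction`
    hits a spherical target at `center` (hit = closest approach <= radius)."""
    # Vector from the shooter to the target center.
    oc = [c - o for c, o in zip(center, origin)]
    t = sum(a * b for a, b in zip(oc, direction))  # projection onto the ray
    if t < 0:
        return False                               # target is behind the shooter
    closest2 = sum(a * a for a in oc) - t * t      # squared perpendicular distance
    return closest2 <= radius * radius
```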
The embodiments of the invention described above may be implemented in any of a variety of ways. For example, embodiments of the present invention may be implemented using hardware, software, or a combination thereof. When implemented in software, the embodiments may be realized as software running on one or more processors using any of various operating systems or platforms. Additionally, such software may be written in any of a number of suitable programming languages and may be compiled into machine code or intermediate code executable in a framework or virtual machine.
Also, when embodiments of the present invention are implemented on one or more processors, one or more programs for carrying out the methods of the various embodiments discussed above may be stored on a processor-readable medium (e.g., a memory, a floppy disk, a hard disk, a compact disc, an optical disc, a magnetic tape, or the like).
Claims (10)
A method for providing a mixed reality experience space, the method comprising:
Displaying a virtual object existing in a virtual scenario on at least one of a scenario main output device and a user device located in the mixed reality experience space;
Generating an interaction result based on the sensed interaction when the interaction between the virtual object and the user is sensed;
Modifying the virtual scenario by performing at least one of operations of moving, deleting, and creating a new virtual object in the virtual scenario based on the result of the interaction; And
Displaying a virtual object present in the modified virtual scenario on at least one of the scenario main output device and the user device
Wherein the displaying the virtual object comprises:
Displaying the virtual object on a device corresponding to a focal distance closer to the position of the virtual object from among a first focal distance corresponding to the user device and a second focal distance corresponding to the scenario main output device,
Wherein when displaying the virtual object on the user device, the virtual object is displayed in consideration of the 6-degree-of-freedom information of the user and the location of the scenario main output device.
The method of claim 1, further comprising:
Displaying a virtual user interface corresponding to a real user interface at the position of the real user interface when the user performs an interaction with the virtual object using the real user interface.
The method of claim 1, further comprising:
Outputting sound according to the virtual scenario by varying the volumes of speakers based on the positions of the speakers present in the mixed reality experience space and the position of the user.
A mixed reality experience space providing system comprising: a scenario main output device and a user device located in a mixed reality experience space; and
A scenario management server for detecting an interaction between a virtual object and a user to generate an interaction result according to the interaction, modifying the virtual scenario by performing, based on the interaction result, at least one of moving an object included in the virtual scenario, deleting the object, and creating a new virtual object, and transmitting information about a virtual object existing in the modified virtual scenario to at least one of the scenario main output device and the user device,
The scenario operating server transmits the information about the virtual object to the device corresponding to the focal distance closer to the position of the virtual object, from among a first focal distance corresponding to the user device and a second focal distance corresponding to the scenario main output device, and transmits the information about the virtual object to be displayed on the user device to the user device in consideration of the 6-DOF information of the user and the position of the scenario main output device.
Further comprising a real user interface used by the user to interact with the virtual object,
The scenario operating server displays a virtual user interface corresponding to the real user interface at a location of the real user interface
Mixed reality experience space providing system.
Further comprising speakers for outputting sound according to the virtual scenario,
The scenario management server controls the speakers to output sound by varying the volumes of the speakers based on the positions of the speakers and the position of the user
Mixed reality experience space providing system.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150055376A KR101770188B1 (en) | 2015-04-20 | 2015-04-20 | Method for providing mixed reality experience space and system thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20160124985A | 2016-10-31
KR101770188B1 | 2017-08-24
Family
ID=57445819
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150055376A KR101770188B1 (en) | 2015-04-20 | 2015-04-20 | Method for providing mixed reality experience space and system thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101770188B1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020081877A1 (en) * | 2018-10-18 | 2020-04-23 | Ha Nguyen | Ultrasonic messaging in mixed reality |
IL270429B2 * | 2019-11-05 | 2023-08-01 | Simbionix Ltd | System and method for immersive mixed reality space(s) |
CN114972818B (en) * | 2022-05-07 | 2024-05-14 | 浙江理工大学 | Target locking system based on deep learning and mixed reality technology |
- 2015-04-20: KR application KR1020150055376A — patent KR101770188B1, active (IP Right Grant)
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20230165015A (en) * | 2022-05-26 | 2023-12-05 | 주식회사 네비웍스 | Operation apparatus for virtual training, and operation method for virtual training |
KR102698043B1 (en) * | 2022-05-26 | 2024-08-23 | 주식회사 네비웍스 | Operation apparatus for virtual training, and operation method for virtual training |
Also Published As
Publication number | Publication date |
---|---|
KR20160124985A (en) | 2016-10-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114236837A (en) | Systems, methods, and media for displaying an interactive augmented reality presentation | |
CN113168007A (en) | System and method for augmented reality | |
JP2016522463A5 (en) | ||
US10755486B2 (en) | Occlusion using pre-generated 3D models for augmented reality | |
US11094107B2 (en) | Information processing device and image generation method | |
JP6615732B2 (en) | Information processing apparatus and image generation method | |
JPWO2015098807A1 (en) | An imaging system that synthesizes a subject and a three-dimensional virtual space in real time | |
WO2021106803A1 (en) | Class system, viewing terminal, information processing method, and program | |
KR101770188B1 (en) | Method for providing mixed reality experience space and system thereof | |
US10437055B2 (en) | Master device, slave device, and control method therefor | |
KR20180120456A (en) | Apparatus for providing virtual reality contents based on panoramic image and method for the same | |
EP3264228A1 (en) | Mediated reality | |
CN116210021A (en) | Determining angular acceleration | |
KR101638550B1 (en) | Virtual Reality System using of Mixed reality, and thereof implementation method | |
EP3418860A1 (en) | Provision of virtual reality content | |
JP2021512402A (en) | Multi-viewing virtual reality user interface | |
JP6534972B2 (en) | Image display apparatus, image display method and image display program | |
JP2021086606A (en) | Class system, viewing terminal, information processing method, and program | |
CN105893452A (en) | Method and device for presenting multimedia information | |
US20190089899A1 (en) | Image processing device | |
KR101893038B1 (en) | Apparatus and method for providing mapping pseudo hologram using individual video signal output | |
KR20200031255A (en) | System for sharing of image data or video data for interaction contents and the method thereof | |
Sammartino | Integrated Virtual Reality Game Interaction: The Archery Game | |
Wong | HandsOn: a portable system for collaboration on virtual 3D objects using binocular optical head-mounted display | |
JP7544071B2 (en) | Information processing device, information processing system, and information processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |