WO2020115824A1 - Système d'affichage d'espace virtuel - Google Patents

Système d'affichage d'espace virtuel (Virtual space display system)

Info

Publication number
WO2020115824A1
WO2020115824A1 (PCT/JP2018/044631)
Authority
WO
WIPO (PCT)
Prior art keywords
user
virtual space
head
motion
detection unit
Prior art date
Application number
PCT/JP2018/044631
Other languages
English (en)
Japanese (ja)
Inventor
西川隼矢
Original Assignee
株式会社Rockin’ Pool
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Rockin’ Pool filed Critical 株式会社Rockin’ Pool
Priority to PCT/JP2018/044631 priority Critical patent/WO2020115824A1/fr
Publication of WO2020115824A1 publication Critical patent/WO2020115824A1/fr

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics

Definitions

  • The present invention relates to a virtual space display system, and more particularly to a virtual space display system for displaying, as an image, a virtual space and an object arranged in the virtual space on a display device mounted on the head of a user in a water area.
  • A technology for displaying a virtual space created by VR (Virtual Reality) technology on a head mounted display (HMD) mounted on the user's head has become widespread.
  • For example, Patent Document 1 discloses a virtual space display system that displays a virtual space on the HMD of a user who floats in a water area while wearing the HMD.
  • In that system, the motion of the head of the user suspended in the water in a prone position is detected, and an image of the virtual space is generated according to the detected motion of the head.
  • This allows the user to enjoy the image of the virtual space while feeling as if floating in the water.
  • If not only the movement of the head but also the sensation that the user receives from the water area can be reflected, the interaction between the image of the virtual space and the user can be realized.
  • In that case, the interest of the user is improved.
  • the present invention has been made in view of the above circumstances, and an object of the present invention is to provide a virtual space display system capable of improving the interest of the user.
  • A virtual space display system for achieving the above object is a virtual space display system for displaying, as an image, a virtual space and an object arranged in the virtual space on a display device mounted on the head of a user in a water area, and includes:
  • a head motion detection unit that detects a motion of the user's head
  • a resistance detection unit that is attached to an arbitrary body part of the user and detects resistance that the user receives from the water area in the body part
  • and a virtual space processing program that transforms the virtual space and the object based on the motion of the user's head detected by the head motion detection unit and the resistance that the user receives from the water area detected by the resistance detection unit.
  • This virtual space display system is characterized by including a gyro sensor and an acceleration sensor that are attached to an arbitrary body part of the user and detect movements of the body part in the water area.
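  • By way of illustration only, the following Python sketch shows one way the quantities named above could be represented and combined; the class and function names (HeadMotionSample, ResistanceSample, Scene, transform) are our own and do not appear in the specification.
        # Illustrative sketch only: names are our own, not identifiers from the specification.
        from dataclasses import dataclass, field

        @dataclass
        class HeadMotionSample:            # reading from the head motion detection unit
            yaw: float = 0.0               # head orientation, in radians
            pitch: float = 0.0
            roll: float = 0.0

        @dataclass
        class ResistanceSample:            # reading from the resistance detection unit
            body_part: str = "right_hand"  # body part the unit is attached to
            force: float = 0.0             # resistance received from the water area [N]

        @dataclass
        class Scene:                       # the virtual space and the objects placed in it
            view: tuple = (0.0, 0.0, 0.0)
            water_forces: dict = field(default_factory=dict)

        def transform(scene: Scene, head: HeadMotionSample, res: ResistanceSample) -> Scene:
            """Deform the virtual space and object from the two detected quantities."""
            scene.view = (head.yaw, head.pitch, head.roll)      # follow the head motion
            scene.water_forces[res.body_part] = res.force       # reflect the water's action
            return scene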
  • Based on the motion of the user's head detected by the head motion detection unit and the resistance received by the user from the water area detected by the resistance detection unit, the virtual space and the object displayed as images on the display device mounted on the user's head are transformed.
  • Since the virtual space and the object are transformed based on the user's action, the interaction between the image of the virtual space and the image of the object and the user is realized, and the interest of the user is improved.
  • Moreover, since the resistance detection unit that detects the resistance received from the water area is provided, a specific acting force that the user in the water area receives from the water area, such as resistance or buoyancy, can be reflected in the virtual space and the object, and the interest of the user is expected to improve further.
  • The virtual space processing program of this virtual space display system may also transform the virtual space and the object displayed as an image on the display device attached to the head of each of a plurality of users, based on the motion of the head of each user detected by the head motion detection unit and the resistance that each user receives from the water area detected by the resistance detection unit.
  • Since the virtual space and the object are transformed based on the action of each user, the interaction between the images of the virtual space and the object and each user is realized, and the interaction among the users themselves is also realized. Therefore, the interest of each of the plurality of users is further improved.
  • According to the present invention, since the virtual space and the object are transformed based on the user's action, the interaction between the image of the virtual space, the image of the object, and the user is realized, and the interest of the user is improved.
  • Next, a virtual space display system according to an embodiment of the present invention will be described with reference to FIGS. 1 to 6.
  • FIG. 1 is a diagram for explaining the outline of the configuration of the virtual space display system according to this embodiment.
  • The virtual space display system 10 includes a server 20 that creates the virtual space 11 and the like, a head mounted display (HMD) 30 that is attached to the head of a user 1 located in a water area W and serves as a display device for displaying the virtual space 11 created by the server 20, and an operation unit 40 that is connected to the server 20 and attached to a hand of the user 1.
  • The user 1 in the present embodiment may be floating in the water area W, or may use the system with his or her feet resting on the bottom of the water area (or on a member such as a platform placed on the bottom).
  • In the virtual space 11, a virtual sphere 12, which is a spherical object expressed in three dimensions, and a virtual hand 13, which is an object displaying in three dimensions a hand of the user 1, are placed.
  • the virtual hand 13 is operated by the operation unit 40 attached to the hand of the user 1 in the present embodiment.
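  • As a rough illustration of how an object such as the virtual hand 13 can be made to follow the operation unit 40, the sketch below shows one possible mapping; VirtualHand and update_virtual_hand are hypothetical names and interfaces, not part of the specification.
        # Hypothetical sketch: a virtual hand whose pose mirrors the glove-type operation unit.
        from dataclasses import dataclass

        @dataclass
        class VirtualHand:
            position: tuple = (0.0, 0.0, 0.0)    # position within the virtual space 11
            finger_flexion: tuple = (0.0,) * 5   # 0.0 = open, 1.0 = fully bent

        def update_virtual_hand(hand: VirtualHand, glove_position, glove_fingers) -> VirtualHand:
            """Copy the pose reported by the operation unit onto the displayed object."""
            hand.position = tuple(glove_position)
            hand.finger_flexion = tuple(glove_fingers)
            return hand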
  • FIG. 2 is a block diagram for explaining the outline of the configuration of the server 20 according to this embodiment.
  • the server 20 includes a processor 21, a memory 22, a storage 23, a transmission/reception unit 24, and an input/output unit 25 as main components, which are electrically connected to each other via a bus 26.
  • the processor 21 is an arithmetic device that controls the operation of the server 20, controls the transmission and reception of data between each element, and performs the processing necessary for executing an application program.
  • the processor 21 is, for example, a CPU (Central Processing Unit) in the present embodiment, and executes each application by executing an application program stored in a storage 23 described later and expanded in a memory 22.
  • The memory 22 includes a main storage device configured by a volatile storage device such as a DRAM (Dynamic Random Access Memory) and an auxiliary storage device configured by a non-volatile storage device such as a flash memory or an HDD (Hard Disk Drive).
  • the memory 22 is used as a work area of the processor 21, and also stores a BIOS (Basic Input/Output System) executed when the server 20 is started, and various setting information.
  • the storage 23 stores application programs and data used for various processes.
  • In the present embodiment, the virtual space processing program 23A is stored in the storage 23. The outline of the configuration of the virtual space processing program 23A will be described later.
  • the transmission/reception unit 24 connects the server 20 to the network, and in the present embodiment, the HMD 30 and the operation unit 40 are connected to the transmission/reception unit 24 via the network.
  • the transmission/reception unit 24 may have a short-range communication interface such as Bluetooth (registered trademark) or BLE (Bluetooth Low Energy).
  • the input/output unit 25 may be connected with an information input device such as a keyboard and a mouse or an output device such as a display, if necessary.
  • the bus 26 transmits, for example, an address signal, a data signal, and various control signals among the connected processor 21, memory 22, storage 23, transmission/reception unit 24, and input/output unit 25.
  • the HMD 30 is formed in a helmet type to be worn on the head of the user 1, and includes a smartphone storage unit 31 and a respirator 32.
  • The smartphone storage unit 31 is formed at a portion corresponding to the position of the eyes of the user 1 and stores a smartphone 33 that is connected to the server 20 via the network.
  • The respirator 32 is formed in a tubular or pipe shape that extends upward from a portion corresponding to the position of the mouth of the user 1 and opens, and air is inhaled and exhaled through it according to the breathing of the user 1.
  • FIG. 3 is a block diagram illustrating an outline of the configuration of the smartphone 33 according to the present embodiment.
  • the smartphone 33 includes a display 33A and a head movement detection unit 33B.
  • the virtual space 11, the virtual sphere 12, and the virtual hand 13 generated by the server 20 are displayed on the display 33A as images.
  • This display 33A is mounted as a display of the HMD 30 in the present embodiment.
  • The head motion detection unit 33B includes a geomagnetic sensor 33Ba, a gyro sensor 33Bb, and an acceleration sensor 33Bc, and detects the motion of the head of the user 1 wearing the HMD 30 in which the smartphone 33 is stored to generate a head motion signal.
  • The gyro sensor 33Bb detects, over time, the angular velocities of the HMD 30 around three axes according to the movement of the head of the user 1 and the corresponding movement of the HMD 30, so that the time change of the angle (tilt) around each axis can be determined.
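  • In other words, the angle around each axis can be tracked by accumulating the sampled angular velocities over time; the following minimal sketch illustrates this (the 10 ms sample interval and function name are assumptions, not values from the specification).
        # Hypothetical sketch: integrate gyro angular velocities (rad/s) into angles (rad).
        def integrate_gyro(angles, angular_velocity, dt):
            """angles and angular_velocity are (x, y, z) tuples; dt is the sample interval in seconds."""
            return tuple(a + w * dt for a, w in zip(angles, angular_velocity))

        # Example: head turning at 0.5 rad/s around the vertical axis, sampled every 10 ms.
        angles = (0.0, 0.0, 0.0)
        angles = integrate_gyro(angles, (0.0, 0.5, 0.0), 0.01)   # -> (0.0, 0.005, 0.0)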
  • In the present embodiment, when the user 1 wears the HMD 30 with the smartphone 33 stored in the smartphone storage unit 31, the user 1 observes only the display 33A of the smartphone 33.
  • the user 1 loses all the external field of view, and thus can completely immerse himself in the virtual space 11 displayed on the display 33A of the smartphone 33.
  • the user 1 who is immersed in the virtual space 11 can execute the interaction with the virtual sphere 12 by using the virtual hand 13 that operates in conjunction with the operation of his or her hand.
  • the operation unit 40 has a glove shape that can be worn on the hand of the user 1, and is attached to both hands of the user 1 in the present embodiment.
  • the first motion detection unit 41 includes a resistance sensor 41a, which is a resistance detection unit, a gyro sensor 41b, and an acceleration sensor 41c.
  • As the characteristics of water, there are four elements: buoyancy, resistance, water pressure, and water temperature.
  • The present invention identifies the movement or displacement of the user 1 by detecting, among these, the resistance acting on the user 1. Not only the resistance but also the water pressure due to the difference in water depth may be detected if necessary.
  • the “detection unit” may be considered as a single unit, or may be built in a controller (operation unit) or the like.
  • The resistance sensor 41a is a sensor that detects the resistance that the user 1 receives from the water area W, and detects the movement of the hand of the user 1 in cooperation with the gyro sensor 41b and the acceleration sensor 41c to generate a first motion signal.
  • The sensor according to the present embodiment is not limited to this; any sensor, such as a strain sensor, a torque sensor, or a semiconductor resistance sensor, may be used as long as it can obtain information on the movement, displacement, or change in speed of an object and can be used underwater (or can be processed so that it can be used in water).
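  • A minimal sketch of how the three readings of the first motion detection unit 41 might be packaged into one record follows; the read() interface and the FirstMotionSignal name are assumptions for illustration, not part of the specification.
        # Hypothetical sketch: bundle resistance, gyro, and acceleration readings into one record
        # corresponding loosely to the first motion signal S1.
        from dataclasses import dataclass

        @dataclass
        class FirstMotionSignal:
            resistance: float          # from the resistance sensor 41a [N]
            angular_velocity: tuple    # from the gyro sensor 41b [rad/s]
            acceleration: tuple        # from the acceleration sensor 41c [m/s^2]

        def sample_first_motion(resistance_sensor, gyro, accelerometer) -> FirstMotionSignal:
            """Read each sensor once; the sensor objects are assumed to expose a read() method."""
            return FirstMotionSignal(
                resistance=resistance_sensor.read(),
                angular_velocity=gyro.read(),
                acceleration=accelerometer.read(),
            )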
  • the second motion detection unit 42 is composed of a motion capture mechanism including a sensor that detects the motion of the finger of the user 1, and detects the motion of the finger of the user 1 to generate a second motion signal.
  • FIG. 5 is a block diagram for explaining the outline of the configuration of the virtual space processing program 23A according to this embodiment.
  • The virtual space processing program 23A includes an image generation unit 23Aa, an image transformation signal generation unit 23Ab, and a feedback signal generation unit 23Ac.
  • The image generation unit 23Aa generates the virtual space 11, the virtual sphere 12, and the virtual hand 13 based on preset virtual space generation data D, and places the virtual sphere 12 and the virtual hand 13 in the virtual space 11.
  • The image generation unit 23Aa also changes the virtual space 11, the virtual sphere 12, and the virtual hand 13 based on the image transformation signal generated by the image transformation signal generation unit 23Ab described below.
  • The image transformation signal generation unit 23Ab generates the image transformation signal S3 based on the head motion signal S0 generated by the head motion detection unit 33B of the smartphone 33, the first motion signal S1 generated by the first motion detection unit 41 of the operation unit 40, and the second motion signal S2 generated by the second motion detection unit 42.
  • The feedback signal generation unit 23Ac generates the feedback signal S4 based on the head motion signal S0 generated by the head motion detection unit 33B of the smartphone 33, the first motion signal S1 generated by the first motion detection unit 41 of the operation unit 40, and the second motion signal S2 generated by the second motion detection unit 42.
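  • The specification does not describe the internal format of these signals, so the following sketch is only one plausible reading of the data flow: the same three inputs S0, S1, and S2 feed both the image transformation signal S3 and the feedback signal S4. The function names and dictionary contents are our own, and s1 is assumed to be the FirstMotionSignal record sketched earlier.
        # Hypothetical sketch of the flow inside the virtual space processing program 23A.
        def make_image_transformation_signal(s0, s1, s2):
            """S3: what the image generation unit 23Aa uses to deform the space and objects."""
            return {"head": s0, "hand": s1, "fingers": s2}

        def make_feedback_signal(s0, s1, s2):
            """S4: derived from the same inputs; its use is not detailed in the text, so the
            content here (the magnitude of the sensed resistance) is purely illustrative."""
            return {"intensity": abs(s1.resistance)}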
  • In a state in which the user 1 wearing the HMD 30 and the operation unit 40 floats in the water area W, the image generation unit 23Aa of the virtual space processing program 23A generates the virtual space 11, the virtual sphere 12, and the virtual hand 13 based on the preset virtual space generation data D, places the virtual sphere 12 and the virtual hand 13 in the virtual space 11, and displays these as an image on the HMD 30 (the display 33A of the smartphone 33).
  • When the user 1 moves, the head motion detection unit 33B detects the motion of the head of the user 1 and generates the head motion signal S0 according to that motion.
  • Similarly, the resistance sensor 41a of the first motion detection unit 41 detects the resistance that the user 1 receives from the water area W at his or her hand in response to the motion, and, in cooperation with the gyro sensor 41b and the acceleration sensor 41c, the motion of the hand of the user 1 is detected to generate the first motion signal S1.
  • the second motion detection unit 42 detects the motion of the finger of the user 1 according to the motion, and generates the second motion signal S2.
  • When the head motion signal S0, the first motion signal S1, and the second motion signal S2 are input to the image transformation signal generation unit 23Ab, the image transformation signal S3 is generated by the image transformation signal generation unit 23Ab.
  • Based on this image transformation signal S3, the virtual space 11, the virtual sphere 12, and the virtual hand 13 generated by the image generation unit 23Aa are transformed by the image generation unit 23Aa, and the images of the transformed virtual space 11, virtual sphere 12, and virtual hand 13 are displayed on the HMD 30.
  • When the head motion signal S0, the first motion signal S1, and the second motion signal S2 are input to the feedback signal generation unit 23Ac, the feedback signal generation unit 23Ac generates the feedback signal S4.
  • In this way, the virtual space 11, the virtual sphere 12, and the virtual hand 13 generated by the image generation unit 23Aa are transformed based on the head motion signal S0 generated by the head motion detection unit 33B, the first motion signal S1 generated by the resistance sensor 41a and the other sensors of the first motion detection unit 41, and the second motion signal S2 generated by the second motion detection unit 42.
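  • Put together, one update cycle on the server 20 could look like the sketch below; latest(), apply(), render(), and display() are assumed interfaces rather than names from the specification, and make_image_transformation_signal and make_feedback_signal are the illustrative functions from the earlier sketch.
        # Hypothetical per-frame loop on the server side.
        def run_frame(scene, head_unit, first_unit, second_unit, hmd):
            s0 = head_unit.latest()        # head motion signal S0
            s1 = first_unit.latest()       # first motion signal S1 (resistance + hand motion)
            s2 = second_unit.latest()      # second motion signal S2 (finger motion)
            s3 = make_image_transformation_signal(s0, s1, s2)
            s4 = make_feedback_signal(s0, s1, s2)
            scene.apply(s3)                # image generation unit 23Aa deforms space and objects
            hmd.display(scene.render())    # transformed image shown on the HMD 30 (display 33A)
            return s4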
  • Since the virtual space 11, the virtual sphere 12, and the virtual hand 13 are transformed based on the motion of the user 1, the interaction between the user 1 and the images of the virtual space 11, the virtual sphere 12, and the virtual hand 13 is realized, and the interest of the user 1 is improved.
  • the resistance sensor 41a that detects the resistance received from the water area W cooperates with the gyro sensor 41b and the acceleration sensor 41c to detect the motion of the hand of the user 1 and generate the first motion signal S1. Therefore, the specific acting force (buoyancy, resistance, etc.) that the user 1 floating in the water area W receives from the water area W can be reflected in the virtual space 11, the virtual sphere 12, and the virtual hand 13.
  • FIG. 6 is a diagram illustrating an operation outline when the virtual space display system 10 is used by a plurality of users. As illustrated, one user 1 and another user 2 floating in different water areas W1 and W2 respectively wear an HMD 30 and an operation unit 40 that can communicate with the server 20, and on each HMD 30, the virtual space 11, the virtual sphere 12, the virtual hand 13 of the user 1, and the virtual hand 14 of the user 2 are displayed.
  • The head motion signal S0, the first motion signal S1, and the second motion signal S2 are generated for each user 1 (2) according to the motions of the head, hands, and fingers of the users 1 and 2 floating in the water areas W1 and W2, and the image transformation signal generation unit 23Ab generates the image transformation signal S3 for each user 1 (2).
  • Based on this signal, the virtual sphere 12 and the virtual hand 13 (14) are transformed in the virtual space 11, and the images of the transformed virtual space 11, virtual sphere 12, and virtual hand 13 (14) are displayed on the HMD 30 of each user 1 (2).
  • Similarly, the feedback signal S4 for each user 1 (2) is generated based on the head motion signal S0, the first motion signal S1, and the second motion signal S2 generated according to the motions of the head, hands, and fingers of each user 1 (2) floating in the water areas W1 and W2.
  • In this way, the virtual space 11, the virtual sphere 12, and the virtual hand 13 (14) generated by the image generation unit 23Aa are transformed based on the head motion signal S0, the first motion signal S1, and the second motion signal S2 generated for each of the plurality of users 1 (2) according to their respective motions.
  • Since the virtual space 11, the virtual sphere 12, and the virtual hand 13 (14) are transformed based on the respective movements of the users 1 (2), the interaction between each user and the images of the virtual space 11, the virtual sphere 12, and the virtual hand 13 (14) is realized, and the interaction between the user 1 and the user 2 is also realized. Therefore, the interest of each user 1 (2) is further improved.
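  • A sketch of this multi-user case under the same assumptions as the earlier sketches: the server keeps one shared scene, folds in every user's signals, and then sends each HMD its own view. The user objects, latest_signals(), and the viewpoint argument are hypothetical, not taken from the specification.
        # Hypothetical sketch: shared scene updated from all users, then rendered per user.
        def run_shared_frame(scene, users):
            for user in users:
                s0, s1, s2 = user.latest_signals()
                # fold this user's signals into the shared scene
                scene.apply(make_image_transformation_signal(s0, s1, s2))
            for user in users:
                # each user also sees the other users' virtual hands (13, 14, ...)
                user.hmd.display(scene.render(viewpoint=user.id))
            return scene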
  • the present invention is not limited to the above-mentioned embodiments, and various modifications can be made without departing from the spirit of the invention.
  • the operation unit 40 may be attached to another body part such as a leg.
  • The case where the virtual space display system 10 is used by one user 1 or by a plurality of users 1 (2) has been described, but the number of users is not particularly limited.
  • The case where the display of the HMD 30 is implemented by the display 33A of the smartphone 33 stored in the HMD 30 has been described, but the display may instead be incorporated directly in the HMD 30.
  • the water areas W, W1 and W2 may be pools and various kinds of water storage tanks, seas, lakes and the like, or may be pools and bathtubs at home or other facilities where the body below the neck can be immersed. Any body of water may be used.
  • The above-mentioned HMD 30 is mainly a device related to VR (Virtual Reality) technology, but the present invention can also be applied in the fields of, for example, AR (Augmented Reality) technology and MR (Mixed Reality) technology.
  • In that case, the position of the user, including the displacement of a part of the body, may be detected, and processing according to that position may be performed.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An object of the present invention is to provide a virtual space display system capable of increasing the interest of a user. To this end, the invention relates to a virtual space display system that displays, as images on a display device mounted on the head of a user floating in a water area, a virtual space and an object arranged in the virtual space. The virtual space display system comprises: a head motion detection unit that detects a motion of the user's head; a resistance detection unit that is mounted on an arbitrary body part of the user and detects resistance that the user receives from the water area at that body part; and a virtual space processing program that transforms the virtual space and the object on the basis of the motion of the user's head detected by the head motion detection unit and of the resistance that the user receives from the water area as detected by the resistance detection unit.
PCT/JP2018/044631 2018-12-04 2018-12-04 Système d'affichage d'espace virtuel WO2020115824A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/044631 WO2020115824A1 (fr) 2018-12-04 2018-12-04 Système d'affichage d'espace virtuel

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/044631 WO2020115824A1 (fr) 2018-12-04 2018-12-04 Système d'affichage d'espace virtuel

Publications (1)

Publication Number Publication Date
WO2020115824A1 true WO2020115824A1 (fr) 2020-06-11

Family

ID=70973853

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/044631 WO2020115824A1 (fr) 2018-12-04 2018-12-04 Système d'affichage d'espace virtuel

Country Status (1)

Country Link
WO (1) WO2020115824A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08266685A (ja) * 1995-03-31 1996-10-15 Sony Corp Diver support information centralized display device
JP2010541306A (ja) * 2007-08-27 2010-12-24 シャオ,チュエン Method and apparatus for simulating a somatosensory experience in outer space
JP2018041180A (ja) * 2016-09-06 2018-03-15 株式会社マクスマラシステムズ Virtual space display system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6928814B1 (ja) * 2021-06-13 2021-09-01 株式会社Rockin′Pool In-pool exercise system
JP2022190178A (ja) * 2021-06-13 2022-12-23 株式会社Rockin′Pool In-pool exercise system

Similar Documents

Publication Publication Date Title
CN103765352B (zh) Spherical three-dimensional controller
Burdea et al. Virtual reality technology
Frati et al. Using Kinect for hand tracking and rendering in wearable haptics
JP2022535315A (ja) Artificial reality system having a self-haptic virtual keyboard
JP2016115122A (ja) Head mounted display system, method for displaying on a head mounted display, and program
JP2008146619A (ja) Inertial input device with six-axis detection capability and operation method thereof
WO2007053116A1 (fr) Virtual interface system
WO2004010370A2 (fr) Hand-held device interactive with a computer
KR20220016984A (ko) Artificial reality system having a digit-mapped self-haptic input method
KR20180005024A (ko) Walkable virtual reality apparatus
CN109448050A (zh) Method and terminal for determining the position of a target point
Yau et al. How subtle can it get? a trimodal study of ring-sized interfaces for one-handed drone control
CN111831104A (zh) Head-mounted display system, related method, and related computer-readable recording medium
JP2022184958A (ja) Animation production system
US20230142242A1 (en) Device for Intuitive Dexterous Touch and Feel Interaction in Virtual Worlds
WO2020115824A1 (fr) Virtual space display system
RU2662399C1 System and method for capturing the movements and position of a human body and parts of the human body
JP6518931B1 (ja) Virtual space display system
JP2007079673A (ja) Drawing apparatus
JP4779123B2 (ja) Electronic game controller capable of sensing human body motion
JP2024066390A (ja) VR device for adult content that provides tactile sensation
Wang et al. Silver Surfer: A system to compare isometric and elastic board interfaces for locomotion in VR
JP2019193705A (ja) Game program and game device
JP7115695B2 (ja) Animation production system
JP6964302B2 (ja) Animation production method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18942386

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 30.08.2021)

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 18942386

Country of ref document: EP

Kind code of ref document: A1