WO2016163183A1 - Head-mounted display system and computer program for presenting a user's real-space surrounding environment in an immersive virtual space - Google Patents

Head-mounted display system and computer program for presenting a user's real-space surrounding environment in an immersive virtual space

Info

Publication number
WO2016163183A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual space
hmd
surrounding environment
image
real
Prior art date
Application number
PCT/JP2016/056686
Other languages
English (en)
Japanese (ja)
Inventor
栗原秀行
Original Assignee
株式会社コロプラ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社コロプラ
Publication of WO2016163183A1

Links

Images

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20: Input arrangements for video game devices
    • A63F 13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/213: Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F 13/25: Output arrangements for video game devices
    • A63F 13/26: Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/53: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F 13/65: Generating or modifying game content before or while executing the game program automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics

Definitions

  • The present invention relates to displaying, on a head-mounted display (HMD), information about the surrounding environment of the real space in association with a three-dimensional virtual space while the user wears the HMD on the head and is immersed in that three-dimensional virtual space.
  • More specifically, the present invention relates to a head-mounted display system including a computer that performs such control, and to a computer program for causing the computer to function accordingly.
  • An HMD is known that is mounted on the user's head and can present an immersive three-dimensional virtual-space image to the user via a display or the like arranged in front of the eyes.
  • In particular, a 360-degree panoramic image can be displayed in the three-dimensional virtual space.
  • Such an HMD typically includes various sensors (for example, an acceleration sensor or an angular velocity (gyro) sensor) and measures the posture data of the HMD main body.
  • Such an HMD increases the sense of immersion in the video world and improves the entertainment value for the user.
  • On the other hand, the more immersed the user becomes in the three-dimensional virtual space while wearing a non-transmissive HMD that completely covers the field of view, the less able the user is to perceive the real-space environment around him or her. For example, even if another person comes close to the user wearing the HMD, the user often does not notice that person's presence.
  • Each screen example shown in FIG. 1 is an example of a three-dimensional virtual space in which the user is immersed.
  • FIG. 1A shows a screen used in a virtual multi-display application in which a plurality of virtual televisions are arranged in a predetermined view area in the three-dimensional virtual space.
  • FIG. 1B shows a screen used in an action RPG game application in which a user character moves around on a plane in the three-dimensional virtual space and battles an enemy character.
  • An object of the present invention is to present information about the surrounding environment within the three-dimensional virtual space so that a user who wears the HMD and is immersed in the three-dimensional virtual space can perceive the surrounding environment of the real space.
  • In order to solve the above problem, a head-mounted display (HMD) system according to the present invention includes: an HMD worn by the user that displays a virtual-space image generated based on virtual-space information and immerses the user in the virtual space; a real-time camera that generates a peripheral image of the user's real space; and a computer connected to the HMD and the real-time camera.
  • The computer is configured to acquire the peripheral image from the real-time camera, detect the real-space surrounding environment using the acquired peripheral image, and output information about the surrounding environment to the HMD in association with the virtual-space information.
  • In this HMD system, detection of the surrounding environment using the peripheral image includes detection by human face recognition on the peripheral image, and output of the information about the surrounding environment includes notifying the number of people present in the surrounding environment. Further, the output of information about the surrounding environment may include displaying the real-space video from the real-time camera on a virtual display provided in the virtual space, so that the video appears on the HMD as part of the virtual-space image. In addition, the output may include superimposing a character image on the virtual-space image so that a character corresponding to each detected person is placed in the virtual space.
  • Further, a computer program according to the present invention causes a computer, to which a real-time camera and a head-mounted display are connected, to function as:
  • a peripheral image acquisition unit that acquires a peripheral image of the real space from the real-time camera;
  • an environment detection unit that detects the real-space surrounding environment using the acquired peripheral image; and
  • an output unit that outputs information about the surrounding environment to the HMD in association with the virtual-space information.
  • According to the present invention, by presenting information about the real-space surrounding environment within the three-dimensional virtual space, the real-time camera that photographs the surroundings can serve as a surrounding-environment detection sensor, in particular as a human detection sensor. This makes it possible for a user immersed in the three-dimensional virtual space to notice a change in his or her surroundings without removing the HMD.
  • FIG. 1 is an example of a display screen implemented according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram illustrating an HMD system according to an embodiment of the present invention.
  • FIG. 3 shows an exemplary orthogonal coordinate system in the real space defined around the head of the user wearing the HMD shown in FIG. 2.
  • FIG. 4 is a schematic diagram showing a plurality of detection points virtually provided on the HMD, which are detected by the position tracking camera.
  • FIG. 5 is a schematic diagram for presenting a user's surrounding environment in real space in an immersive virtual space according to an embodiment of the present invention.
  • FIG. 6 is a functional block diagram of the control circuit unit in the HMD system shown in FIG. 2.
  • FIG. 7 is an exemplary process flow diagram for changing the user's field of view in the immersive virtual space in accordance with the movement of the HMD, according to an embodiment of the present invention.
  • FIG. 8 is an exemplary process flow diagram for detecting the surrounding environment of a user according to an embodiment of the present invention.
  • FIG. 9 is a screen example of the first embodiment relating to the presentation of the user's surrounding environment in the real space with respect to the immersive virtual space.
  • FIG. 10 is a screen example of the second embodiment regarding the presentation of the user's surrounding environment in the real space with respect to the immersive virtual space.
  • FIG. 11 is a screen example of the second embodiment relating to presentation of the user's surrounding environment in the real space with respect to the immersive virtual space.
  • FIG. 12 is a screen example of the second embodiment regarding the presentation of the user's surrounding environment in the real space with respect to the immersive virtual space.
  • FIG. 2 is an overall schematic diagram of an HMD system 100 that uses a head-mounted display (hereinafter referred to as "HMD") to execute a computer program according to an embodiment of the present invention.
  • the HMD system 100 includes an HMD main body 110, a computer (control circuit unit) 120, a position tracking camera 130, and a real-time camera 140.
  • the HMD 110 includes a display 112 and a sensor 114.
  • The display 112 is a non-transmissive display device configured to completely cover the user's field of view, so that the user observes only the screen displayed on the display 112. Because the user wearing the non-transmissive HMD 110 loses sight of the outside world entirely, the user becomes completely immersed in the three-dimensional virtual space displayed on the display 112 by an application executed in the control circuit unit 120.
  • the sensor 114 included in the HMD 110 is fixed near the display 112.
  • The sensor 114 includes a geomagnetic sensor, an acceleration sensor, and/or a tilt (angular velocity, gyro) sensor, and through one or more of these can detect various movements of the HMD 110 (display 112) mounted on the user's head.
  • In the case of an angular velocity sensor in particular, as shown in FIG. 3, the angles of the HMD 110 about three axes are detected over time according to the movement of the HMD 110, and the change over time of the angle (tilt) about each axis can be determined.
  • As shown in FIG. 3, XYZ coordinates are defined around the head of the user wearing the HMD.
  • The vertical direction in which the user stands upright is the Y axis.
  • The direction that is orthogonal to the Y axis and connects the center of the display 112 and the user is the Z axis.
  • The axis in the direction orthogonal to both the Y axis and the Z axis is the X axis.
  • The angle about each axis is detected, specifically the yaw angle indicating rotation about the Y axis, the pitch angle indicating rotation about the X axis, and the roll angle indicating rotation about the Z axis, and based on their change over time the motion detection unit 210 determines angle (tilt) information as field-of-view information.
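  • As an illustration only (the patent does not specify an implementation), the following minimal Python sketch converts yaw, pitch, and roll angles in the coordinate system above into a view-direction vector for the virtual camera; the choice of -Z as the forward direction and the composition order of the rotations are assumptions of this sketch.

    import numpy as np

    def view_direction(yaw: float, pitch: float, roll: float) -> np.ndarray:
        """Convert HMD tilt angles (radians) into a unit view-direction vector.

        Coordinate system as described above: Y is vertical, Z connects the
        display center and the user, X is orthogonal to both. Yaw rotates
        about Y, pitch about X, roll about Z. Forward is assumed to be -Z.
        """
        cy, sy = np.cos(yaw), np.sin(yaw)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cr, sr = np.cos(roll), np.sin(roll)
        Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])   # yaw about Y
        Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])   # pitch about X
        Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])   # roll about Z
        return Ry @ Rx @ Rz @ np.array([0.0, 0.0, -1.0])

  • For example, view_direction(0.0, 0.0, 0.0) returns the vector (0, 0, -1), i.e. the user looking straight ahead, while a positive yaw turns the view about the vertical Y axis.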
  • The computer (control circuit unit) 120 included in the HMD system 100 is connected to the position tracking camera 130 and the real-time camera 140, and functions as a control circuit device for immersing the user wearing the HMD in the three-dimensional virtual space and for carrying out operations based on that three-dimensional virtual space.
  • the control circuit unit 120 may be configured as hardware different from the HMD 110.
  • The hardware can be a computer such as a personal computer, or a server computer connected via a network. That is, although not shown, it can be any computer that includes a CPU, a main memory, an auxiliary memory, a transmission/reception unit, a display unit, and an input unit connected to one another by a bus.
  • The control circuit unit 120 may instead be mounted inside the HMD 110 as a visual-field adjustment device. In this case, the control circuit unit 120 can implement all or part of the functions of the visual-field adjustment device; when only a part is implemented, the remaining functions may be implemented on the HMD 110 side or on a server computer (not shown) accessed via a network.
  • the position tracking camera 130 provided in the HMD system 100 is communicably connected to the control circuit unit 120 and has a position tracking function of the HMD 110.
  • the position tracking camera 130 is realized using an infrared sensor and / or a plurality of optical cameras.
  • The HMD system 100 includes the position tracking camera 130 and detects the position of the HMD on the user's head, which makes it possible to accurately identify and associate the real-space position of the HMD with the virtual-space position of the virtual camera (the immersed user) in the three-dimensional virtual space.
  • As shown in FIG. 4, the position tracking camera 130 detects over time, in correspondence with the user's movement, the real-space positions of a plurality of infrared detection points virtually provided on the HMD 110. Based on the change over time of the real-space positions detected by the position tracking camera 130, the position of the HMD in the real space and the virtual-space position of the virtual camera (immersed user) in the three-dimensional virtual space can be accurately associated and specified according to the movement of the HMD 110.
  • The position tracking camera 130 is an optional component; when it is not used, the user is always placed at the center (origin) of the three-dimensional virtual space, as in the sketch below.
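  • As a minimal sketch of this optional position tracking (the names and the scale factor are assumptions for illustration, not taken from the patent), the tracked real-space HMD position could be mapped to the virtual-camera position as follows, falling back to the origin when the position tracking camera 130 is absent:

    from typing import Optional, Tuple

    Vec3 = Tuple[float, float, float]

    def virtual_camera_position(tracked: Optional[Vec3],
                                scale: float = 1.0) -> Vec3:
        """Map a tracked real-space HMD position to a virtual-space position.

        When no position tracking camera is used, `tracked` is None and the
        user is pinned to the center (origin) of the virtual space, as in
        the description above.
        """
        if tracked is None:
            return (0.0, 0.0, 0.0)  # no tracker: always at the origin
        x, y, z = tracked
        return (x * scale, y * scale, z * scale)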
  • the real-time camera 140 provided in the HMD system 100 captures and generates a real-time peripheral image of the user's real space, and stores the generated image in real time or at regular intervals.
  • the real-time camera may be a stereo camera that can record information in the depth direction by simultaneously photographing the surrounding environment from a plurality of different directions.
  • The real-time camera 140 has an interface such as USB or IEEE 1394 and can transfer real-time images to the computer 120 to which it is connected.
  • A network camera, in particular a web camera, that has a network interface and is locally/globally accessible via wired/wireless communication is preferable.
  • FIG. 5 is a schematic diagram of displaying information about the real-space surrounding environment on the HMD in association with the three-dimensional virtual space, in order to present the user's real-space surroundings within the immersive virtual space.
  • Here, the real-time camera 140 is installed on top of the display of the computer 120, and the user 1 sits facing it while wearing the HMD 110.
  • the three-dimensional image displayed on the user's HMD is simultaneously displayed on the display of the computer 120 as a two-dimensional image.
  • Two other people (2, 3) are close to the user 1, looking at the display of the computer 120.
  • Since the HMD user 1 is immersed in the three-dimensional virtual space, he or she cannot observe the state of the real space and has not yet noticed the presence of the people 2 and 3.
  • Therefore, the HMD system notifies the HMD user 1 of the presence of the people 2 and 3 by displaying information about them on the HMD in association with the three-dimensional virtual space. More specifically, (i) the computer acquires images from the real-time camera 140 in which the people 2 and 3 appear, and detects them in those images using a face recognition program; then (ii) according to the number of detected people, the HMD user 1 is notified that two people are present, for example by displaying characters in the three-dimensional virtual space.
  • For the face recognition/detection function of the face recognition program in (i) above, any function known to those skilled in the art may be used, so a detailed description is omitted here. Note that even when face recognition is applied, the HMD user 1 is not detected as a human face, because the upper half of his or her face is hidden by the HMD.
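  • As one commonly available stand-in for such a face recognition program (a sketch under assumptions: the library choice, cascade file, and detection parameters are illustrative, not prescribed by the patent), the following Python snippet counts frontal faces in a frame from the real-time camera using OpenCV's bundled Haar cascade. Consistent with the note above, the HMD wearer would normally not be counted, because the detector needs the eye region to be visible.

    import cv2

    # OpenCV ships Haar cascade files with the package.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def count_faces(frame) -> int:
        """Return the number of frontal faces detected in a BGR frame."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(
            gray, scaleFactor=1.1, minNeighbors=5, minSize=(40, 40))
        return len(faces)

    # Example: one frame from a webcam standing in for the real-time camera 140.
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if ok:
        print(f"{count_faces(frame)} viewer(s) detected")
    cap.release()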
  • FIG. 6 shows the main functional configuration of the components related to the control circuit unit 120 of FIG. 2 that implement the information processing for presenting the user's real-space surroundings in the immersive virtual space according to an embodiment of the present invention.
  • The control circuit unit 120 mainly receives input from the sensors 114/130, processes the input together with the peripheral image acquired from the real-time camera 140, and outputs the result to the HMD (display) 112.
  • The control circuit unit 120 mainly includes a motion detection unit 210, a field-of-view determination unit 220, a field-of-view image generation unit 230, a peripheral image acquisition unit 250, an environment detection unit 260, and a spatial image superimposition unit 270, and is configured to process various information by interacting with various tables, such as the spatial information storage unit 280 that stores the virtual-space information.
  • the motion detection unit 210 determines various types of motion data of the HMD 110 worn on the user's head based on the input of motion information measured by the sensor 114/130.
  • The motion data includes tilt (angle) information detected over time by the tilt sensor (gyro sensor) 114 included in the HMD, and position information detected over time by the position tracking camera 130.
  • As described above, use of the position tracking camera 130 is optional; when it is not used, the user is always placed at the center (origin) of the three-dimensional virtual space.
  • The field-of-view determination unit 220 determines the position and direction of the virtual camera arranged in the three-dimensional virtual space, and the field-of-view region seen from the virtual camera, based on the three-dimensional virtual-space information stored in the spatial information storage unit 280 and the tilt information determined by the motion detection unit 210. Further, in order to display the three-dimensional virtual-space image on the HMD, the field-of-view image generation unit 230 can use the virtual-space information to generate a partial field-of-view image of the 360-degree panorama for the field-of-view region determined by the field-of-view determination unit 220. The field-of-view image can be displayed on the HMD as a stereoscopic image by generating two two-dimensional images, one for the left eye and one for the right eye, and combining them in the HMD.
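  • To make the left-eye/right-eye pair concrete, here is a hedged sketch (the interpupillary distance and the derivation of the right axis are assumptions of this illustration, not taken from the patent) that computes the two per-eye camera positions by offsetting the virtual camera along its right axis; rendering the determined field-of-view region once from each position yields the two two-dimensional images combined in the HMD.

    import numpy as np

    def eye_positions(camera_pos: np.ndarray, view_dir: np.ndarray,
                      ipd: float = 0.064) -> tuple:
        """Return (left_eye, right_eye) positions for stereoscopic rendering.

        The right axis is the cross product of the view direction and the
        vertical Y axis (assumes the view direction is not vertical); each
        eye is offset by half the assumed interpupillary distance (64 mm).
        """
        up = np.array([0.0, 1.0, 0.0])
        right = np.cross(view_dir, up)
        right = right / np.linalg.norm(right)
        half = (ipd / 2.0) * right
        return camera_pos - half, camera_pos + half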
  • The peripheral image acquisition unit 250 captures the peripheral images continuously taken by the real-time camera 140 into the storage unit. The environment detection unit 260 then detects the real-space surrounding environment, in particular "changes" in the surrounding environment, using the respective peripheral images acquired by the peripheral image acquisition unit 250. More specifically, the presence of human faces, and especially an increase or decrease in the number of people, can be detected from a peripheral image using a face recognition program. Note that the face recognition program may be executed not as a function implemented on the control circuit unit 120 side but as a function implemented on the real-time camera 140 side.
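  • The following is a minimal sketch of this change detection, taking the per-frame face count (for example, from a face-counting routine such as the count_faces sketch shown earlier) as input; the event names and class structure are assumptions of this illustration:

    class EnvironmentDetector:
        """Track the face count across peripheral images and report changes."""

        def __init__(self) -> None:
            self.last_count = 0

        def update(self, count: int):
            """Return a change event when the number of people changes, else None."""
            event = None
            if count > self.last_count:
                event = ("people_increased", count)
            elif count < self.last_count:
                event = ("people_decreased", count)
            self.last_count = count
            return event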
  • The spatial image superimposition unit (output unit) 270 outputs the information about the surrounding environment detected by the environment detection unit 260 to the HMD (display) 112 in association with the virtual-space information stored in the spatial information storage unit 280.
  • For example, the number of detected people, that is, the number of people in the surrounding environment, is announced in the three-dimensional virtual space, and/or each detected person is displayed as a three-dimensional character arranged in the three-dimensional virtual space.
  • the output of information related to the surrounding environment to the HMD is not limited to these.
  • When the real-time camera 140 can acquire depth information, for example when it consists of a plurality of cameras forming a stereo camera, the position information of a detected person can also be acquired.
  • In that case, the positional relationship between the HMD user and the person, mapped into the three-dimensional virtual space, can be included in the information about the surrounding environment.
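  • For the stereo-camera case, depth can be recovered from disparity with the standard pinhole-stereo relation depth = focal_length * baseline / disparity. The sketch below (the calibration values are placeholders, not taken from the patent) estimates a detected person's distance from the disparity measured at the face location; the result could then be used to place the corresponding character at a matching distance in the virtual space.

    def depth_from_disparity(disparity_px: float,
                             focal_px: float = 700.0,
                             baseline_m: float = 0.06) -> float:
        """Estimate depth in meters from stereo disparity in pixels.

        Pinhole-stereo relation: depth = f * B / d. The focal length
        (pixels) and baseline (meters) are placeholder calibration values.
        """
        if disparity_px <= 0:
            raise ValueError("disparity must be positive")
        return focal_px * baseline_m / disparity_px

  • For example, with these placeholder values a disparity of 35 pixels gives depth_from_disparity(35.0) = 1.2 meters.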
  • FIG. 7 shows the processing flow by which the user's field of view in the immersive three-dimensional virtual space, such as the views of FIGS. 1(a) and 1(b), changes along with the user's operation of tilting the HMD.
  • FIG. 8 shows a processing flow in which information related to the detected surrounding environment of the real space is displayed on the HMD in association with the three-dimensional virtual space.
  • As shown in step S10-1, the HMD 110 constantly detects the movement of the user's head using the sensors 114/130.
  • In step S20-1, the control circuit unit 120 determines the tilt information and the position information of the HMD 110 by means of the motion detection unit 210.
  • Based on the position information, the motion detection unit 210 determines the position of the virtual camera arranged in the three-dimensional virtual space.
  • Likewise, the motion detection unit 210 determines the direction of the virtual camera based on the tilt information.
  • As described above, the position tracking camera 130 is not an essential component; when it is not provided, the position information in step S20-1 need not be determined.
  • In step S20-3, the field-of-view determination unit 220 determines the field-of-view region seen from the virtual camera in the three-dimensional virtual space, based on the position and direction of the virtual camera and the predetermined viewing angle of the virtual camera.
  • In step S20-4, the field-of-view image generation unit 230 generates a field-of-view image of the determined field-of-view region for display on the HMD 112.
  • In step S10-2, the field-of-view image generated in step S20-4 is displayed on the display 112 of the HMD.
  • Steps S10-1, S20-1 to S20-4, and then S10-2 are a series of basic processing routines, and these steps are basically repeatedly processed during execution of the application.
  • In this way, a user immersed in the three-dimensional virtual space can view the three-dimensional virtual space from various positions and directions through the operation of tilting and moving his or her head.
  • the peripheral image acquisition unit 250 of the control circuit unit 120 regularly acquires a peripheral image from the real-time camera 140 in order to detect a person in the peripheral environment.
  • FIG. 8 shows a processing flow for displaying the information related to the detected surrounding environment of the real space on the HMD in association with the three-dimensional virtual space.
  • In step S30-1, peripheral images are continuously generated through continuous shooting by the real-time camera 140.
  • Next, a peripheral image is acquired by the peripheral image acquisition unit 250 of the control circuit unit 120, and the environment detection unit 260 detects the surrounding environment, in particular changes in the surrounding environment, using the acquired peripheral image.
  • As described above, the face recognition function applied to the image is used to detect the people present in the vicinity and their number. In particular, when the number of people increases or decreases, it can be determined that the surrounding environment has changed, and it is advantageous to notify the user of the change.
  • In step S20-8, the spatial image superimposition unit 270 generates information based on the surrounding environment and/or a corresponding image, and associates the information and/or image with the virtual-space information stored in the spatial information storage unit 280.
  • In step S10-3, the result is displayed on the HMD 110. The information based on the surrounding environment includes, but is not limited to, the number of people and character information.
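  • Combining the flows of FIG. 7 and FIG. 8, one per-frame iteration might look like the following hedged sketch, which reuses the count_faces sketch from above; the objects are stand-ins for the functional blocks described earlier, and the control structure is an assumption of this illustration, not taken from the patent.

    def run_frame(hmd, tracker, camera, detector, renderer):
        """One iteration combining the FIG. 7 and FIG. 8 flows (sketch only).

        hmd, tracker, camera, detector, and renderer stand in for the
        sensor 114 / display 112, position tracking camera 130, real-time
        camera 140, environment detection unit 260, and the field-of-view
        units 220/230 together with the superimposition unit 270.
        """
        # FIG. 7 flow: sense head movement, determine the view, render it.
        tilt = hmd.read_tilt()                          # steps S10-1 / S20-1
        position = tracker.read_position()              # optional tracking
        view = renderer.determine_view(position, tilt)  # step S20-3
        image = renderer.render(view)                   # step S20-4

        # FIG. 8 flow: poll the peripheral image, detect environment changes.
        frame = camera.read()                                  # step S30-1
        event = detector.update(count_faces(frame))            # face counting
        if event is not None:
            image = renderer.superimpose_notice(image, event)  # step S20-8

        hmd.display(image)                              # steps S10-2 / S10-3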
  • FIG. 9 shows the first embodiment
  • FIGS. 10 to 12 show the second embodiment.
  • FIG. 9 shows a screen displayed by the virtual multi-display application shown in FIG. 1(a), in an example where one person has been identified by face recognition.
  • a message “There is one viewer.” Is displayed in the three-dimensional virtual space here (upper center of the screen).
  • In addition, the real-space video from the real-time camera is shown as-is on the central virtual main display and thereby appears on the HMD as part of the virtual-space image. That is, the situation in which one person is present behind the HMD user and has his or her face photographed and detected by the real-time camera is displayed as-is in the three-dimensional virtual space on the HMD.
  • In this manner, a user immersed in the three-dimensional virtual space can notice a change in the surrounding environment, such as the presence of a person nearby, without removing the HMD.
  • In other words, a real-time camera can be applied as a human detection sensor.
  • In the second embodiment, shown in FIG. 10, a character image is superimposed on the virtual-space image and displayed on the HMD so that a character corresponding to the detected person appears in the virtual space.
  • Here, it is preferable to display the "bear" (Kuma) character corresponding to the person on the HMD so that it appears while moving from left to right (arrow).
  • FIG. 11 and FIG. 12 are examples in which two "Kuma" characters appearing and moving in this way are displayed on the HMD; from them, the HMD user can notice the change in the surrounding environment, namely that two people are nearby, while keeping the HMD on.
  • FIG. 11 shows a front view of the "Kuma" characters.
  • FIG. 12 shows a rear view.
  • 100 HMD system; 110 HMD; 112 display; 114 tilt sensor (gyro sensor); 120 computer (control circuit unit); 130 position tracking camera; 140 real-time camera; 210 motion detection unit; 220 field-of-view determination unit; 230 field-of-view image generation unit; 250 peripheral image acquisition unit; 260 environment detection unit; 270 spatial image superimposition (output) unit; 280 spatial information storage unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Optics & Photonics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

An object of the present invention is, when a user wears a head-mounted display (HMD) and is immersed in a three-dimensional virtual space, to present information about the surrounding environment in the real space within the three-dimensional virtual space, so that the user can perceive the surrounding real-space environment. This HMD system comprises: an HMD worn by a user, which displays a virtual-space image generated based on virtual-space information and immerses the user in a virtual space; a real-time camera that generates an image of the user's real-space environment; and a computer connected to the HMD and the real-time camera, the computer being configured to acquire the image of the environment from the real-time camera, use the acquired image of the environment to detect the surrounding real-space environment, and output information about the surrounding environment to the HMD in association with the virtual-space information.
PCT/JP2016/056686 2015-04-08 2016-03-03 Head-mounted display system and computer program for presenting a user's real-space surrounding environment in an immersive virtual space WO2016163183A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015078921A JP5869712B1 (ja) 2015-04-08 2015-04-08 Head-mounted display system and computer program for presenting a user's real-space surrounding environment in an immersive virtual space
JP2015-078921 2015-04-08

Publications (1)

Publication Number Publication Date
WO2016163183A1 (fr) 2016-10-13

Family

ID=55360933

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/056686 WO2016163183A1 (fr) Head-mounted display system and computer program for presenting a user's real-space surrounding environment in an immersive virtual space

Country Status (2)

Country Link
JP (1) JP5869712B1 (fr)
WO (1) WO2016163183A1 (fr)


Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102436464B1 (ko) * 2015-06-05 2022-08-26 삼성전자주식회사 Method for providing notification information and electronic device therefor
JP5996814B1 (ja) 2016-02-08 2016-09-21 株式会社コロプラ Method and program for providing an image of a virtual space to a head-mounted display
JP6392832B2 (ja) * 2016-12-06 2018-09-19 株式会社コロプラ Information processing method, apparatus, and program for causing a computer to execute the information processing method
JP6812803B2 (ja) * 2017-01-12 2021-01-13 ソニー株式会社 Information processing apparatus, information processing method, and program
JP6244593B1 (ja) * 2017-01-30 2017-12-13 株式会社コロプラ Information processing method, apparatus, and program for causing a computer to execute the information processing method
JP6917340B2 (ja) * 2018-05-17 2021-08-11 グリー株式会社 Data processing program, data processing method, and data processing device
JP6979539B2 (ja) * 2018-11-06 2021-12-15 株式会社ソニー・インタラクティブエンタテインメント Information processing system, display method, and computer program
US20220129068A1 (en) * 2019-07-11 2022-04-28 Hewlett-Packard Development Company, L.P. Eye tracking for displays


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009145883A (ja) * 2007-11-20 2009-07-02 Rissho Univ Learning system, storage medium, and learning method
US20160054565A1 (en) * 2013-03-29 2016-02-25 Sony Corporation Information processing device, presentation state control method, and program

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010145436A (ja) * 2008-12-16 2010-07-01 Brother Ind Ltd Head-mounted display
JP2012114755A (ja) * 2010-11-25 2012-06-14 Brother Ind Ltd Head-mounted display and computer program
JP2013257716A (ja) * 2012-06-12 2013-12-26 Sony Computer Entertainment Inc Obstacle avoidance device and obstacle avoidance method
JP2015191124A (ja) * 2014-03-28 2015-11-02 ソフトバンクBb株式会社 Non-transmissive head-mounted display and program

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10685211B2 (en) 2015-08-04 2020-06-16 Sony Interactive Entertainment Inc. Head-mounted display, display control method, and program
US11417126B2 (en) 2015-08-04 2022-08-16 Sony Interactive Entertainment Inc. Head-mounted display, display control method, and program
US11763578B2 (en) 2015-08-04 2023-09-19 Sony Interactive Entertainment Inc. Head-mounted display, display control method, and program
US12175776B2 (en) 2015-08-04 2024-12-24 Sony Interactive Entertainment Inc. Head-mounted display, display control method, and program

Also Published As

Publication number Publication date
JP2016198180A (ja) 2016-12-01
JP5869712B1 (ja) 2016-02-24

Similar Documents

Publication Publication Date Title
JP5869712B1 (ja) Head-mounted display system and computer program for presenting a user's real-space surrounding environment in an immersive virtual space
CN110413105B (zh) Tangible visualization of virtual objects within a virtual environment
KR102502404B1 (ko) Information processing apparatus and method, and program
JP5996814B1 (ja) Method and program for providing an image of a virtual space to a head-mounted display
WO2017047367A1 (fr) Computer program for line-of-sight guidance
JP2013258614A (ja) Image generation device and image generation method
US11184597B2 (en) Information processing device, image generation method, and head-mounted display
JP6899875B2 (ja) Information processing apparatus, video display system, control method of information processing apparatus, and program
US11195320B2 (en) Feed-forward collision avoidance for artificial reality environments
US20180246331A1 (en) Helmet-mounted display, visual field calibration method thereof, and mixed reality display system
WO2015068656A1 (fr) Image generation device and image generation method
JP6399692B2 (ja) Head-mounted display, image display method, and program
JP2017093946A (ja) Image display method and program
JP7625102B2 (ja) Information processing device, user guide presentation method, and head-mounted display
JP2017138973A (ja) Method and program for providing a virtual space
US11212501B2 (en) Portable device and operation method for tracking user's viewpoint and adjusting viewport
JP2021135776A (ja) Information processing device, information processing method, and program
JP5952931B1 (ja) Computer program
US20230290081A1 (en) Virtual reality sharing method and system
JP2017046233A (ja) Display device, information processing device, and control method therefor
US11474595B2 (en) Display device and display device control method
US20150379775A1 (en) Method for operating a display device and system with a display device
JP2016181267A (ja) Computer program
US20210314557A1 (en) Information processing apparatus, information processing method, and program
JP6738308B2 (ja) Information processing method, program, virtual space distribution system and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16776352

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16776352

Country of ref document: EP

Kind code of ref document: A1