WO2024150615A1 - Image generation device, image generation method, and image generation program


Info

Publication number
WO2024150615A1
Authority
WO
WIPO (PCT)
Prior art keywords
viewpoint
hmd
dimensional image
image
virtual space
Prior art date
Application number
PCT/JP2023/045397
Other languages
French (fr)
Japanese (ja)
Inventor
貴一 池田
武 中川
崇 天田
和城 中村
Original Assignee
Sony Interactive Entertainment Inc.
Priority date
Filing date
Publication date
Application filed by Sony Interactive Entertainment Inc.
Publication of WO2024150615A1 publication Critical patent/WO2024150615A1/en

Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/25 — Output arrangements for video game devices
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 — Manipulating 3D models or images for computer graphics

Definitions

  • the present invention relates to image generation technology.
  • Players wear a head-mounted display connected to a game console and operate a controller while viewing the screen displayed on the head-mounted display.
  • the user can see nothing but the image displayed on the head-mounted display, which increases the sense of immersion in the world of the images and has the effect of further enhancing the entertainment value of the game.
  • When VR (virtual reality) images are displayed on the head-mounted display and the user wearing it rotates their head, a 360-degree virtual space that can be seen all around is displayed. This further increases the sense of immersion in the images and improves the operability of applications such as games.
  • When a head-mounted display is equipped with a head tracking function in this way and VR images are generated by changing the viewpoint position and direction in conjunction with the position and direction of the user's head, there is a delay between the generation and display of the VR images. This can result in a discrepancy between the position and direction of the user's head assumed when the image is generated and the position and direction of the user's head at the time the VR image is displayed on the head-mounted display. As a result, the user may experience a feeling of sickness (known as "VR sickness"). For this reason, a reprojection process is commonly used to correct the rendered image to match the position and direction of the head-mounted display at the time the image is displayed.
  • the user's viewpoint position and viewpoint direction in the virtual space change not only in response to the position and direction of the head mounted display, but also in response to changes in the position and direction in which the user is positioned in the virtual space.
  • This position and direction in which the user is positioned may change between the time an image is generated and the time the next image is generated. In this case, the change in the user's position and direction during this time may not be reflected, and the three-dimensional image may differ from what the user expected, causing the user to feel uncomfortable.
  • the object of the present invention is to provide a technology that prevents a user from feeling uncomfortable about a three-dimensional image due to a misalignment of the user's position in a virtual space.
  • An image generating device includes a reference determination unit that determines at least one of a reference position and a reference direction, which are a position and a direction in which a user is to be placed in a virtual space, based on an instruction input different from at least one of an instruction input of a position and a direction of a head-mounted display; a viewpoint determination unit that determines at least one of a viewpoint position and a viewpoint direction of the user in the virtual space, based on at least one of the reference position and the reference direction; a rendering unit that renders an object in the virtual space, based on at least one of the viewpoint position and the viewpoint direction, to generate a three-dimensional image; and a reprojection unit that executes a reprojection process that converts the three-dimensional image to match at least one of a new reference position and a new reference direction.
  • An image generating method includes the steps of: determining at least one of a user's reference position and reference direction in a virtual space based on an instruction input different from at least one of an instruction input of a position and a direction of a head-mounted display; determining at least one of a viewpoint position and a viewpoint direction of the user in the virtual space based on the at least one of the reference position and the reference direction; rendering an object in the virtual space based on the at least one of the viewpoint position and the viewpoint direction to generate a three-dimensional image; and executing a reprojection process that converts the three-dimensional image to match at least one of a new reference position and a new reference direction.
  • Another aspect of the present invention is an image generation program for causing a computer to execute the steps of: determining at least one of a user's reference position and reference direction in a virtual space based on an instruction input different from at least one of an instruction input of a position and direction of a head-mounted display; determining at least one of a viewpoint position and viewpoint direction of the user in the virtual space based on at least one of the reference position and reference direction; rendering an object in the virtual space based on at least one of the viewpoint position and viewpoint direction to generate a three-dimensional image; and executing a reprojection process that converts the three-dimensional image to match at least one of a new reference position and a new reference direction.
  • the present invention makes it possible to provide a technology that prevents users from feeling uncomfortable about three-dimensional images due to misalignment of the user in a virtual space.
  • FIG. 1 is an external view of a head mounted display.
  • FIG. 2 is a configuration diagram of an image generating system.
  • FIG. 3 is a configuration diagram of the image generating device of FIG. 2.
  • FIGS. 4A to 4C are diagrams for explaining the relationship between a reference position and a reference direction, an HMD position and an HMD direction, and a viewpoint position and a viewpoint direction.
  • FIG. 5 is a flowchart showing a flow of processing relating to image generation in the image generating device of the first embodiment.
  • FIGS. 6A to 6C are diagrams for explaining the relationship between the amount of change in the reference position and reference direction, the amount of change in the HMD position and HMD direction, and a new viewpoint position and viewpoint direction.
  • FIG. 7 is a diagram for explaining the flow of a reprojection process according to the first embodiment.
  • FIG. 8 is a flowchart showing a flow of processing relating to image generation in an image generating device according to a second embodiment.
  • FIG. 9 is a diagram for explaining the flow of a reprojection process according to the second embodiment.
  • The HMD 100 is a display device that is worn on a user's head to view still images and videos displayed on a display and to listen to audio and music output from headphones.
  • the position of the head of a user wearing the HMD 100 and the direction of the head's rotation angle and tilt can be measured using a gyro sensor or acceleration sensor built into or attached externally to the HMD 100.
  • the position and direction of the head are detected based on the position and direction of the head at the time the HMD 100 is turned on.
  • the HMD 100 may further be provided with a camera that photographs the user's eyes. The camera mounted on the HMD 100 can detect the user's gaze direction, pupil movement, blinking, etc.
  • FIG. 2 is a configuration diagram of an image generation system according to this embodiment.
  • the image generation system 1 includes an HMD 100, an image generation device 200, and an input device 300.
  • the image generating device 200 of this embodiment is a game machine.
  • the image generating device 200 may be connected to a server via a network.
  • the server may provide the image generating device 200 with an online application such as a game in which multiple users can participate via the network.
  • the image generating device 200 basically processes the content program, generates three-dimensional images, and transmits them to the HMD 100.
  • the content program and data are read by a media drive (not shown) from a ROM medium (not shown) on which the content's application software, such as a game, and license information are recorded.
  • This ROM medium is a read-only recording medium such as an optical disk, a magneto-optical disk, or a Blu-ray (registered trademark) disk.
  • the image generating device 200 determines the viewpoint position and viewpoint direction based on the position and direction of the head of the user wearing the HMD 100 (hereinafter referred to as the HMD position and HMD direction) and the reference position and reference direction described below, generates three-dimensional images of the content at a predetermined rate so that the field of view is accordingly obtained, and transmits them to the HMD 100.
  • the HMD 100 receives and displays the three-dimensional image generated by the image generating device 200.
  • the three-dimensional image displayed on the HMD 100 may be an image captured in advance by a camera, an image generated by computer graphics such as a game image, or a live image from a remote location delivered via a network.
  • the three-dimensional image displayed on the HMD 100 may also be a VR image, an AR (augmented reality) image, an MR (mixed reality) image, etc.
  • the input device 300 supplies various inputs received from the user to the image generating device 200.
  • the input device 300 supplies the image generating device 200 with instruction inputs for changing the user's position and direction in the virtual space.
  • the input device 300 is realized by any of the typical input devices, such as a game controller, a keyboard, a mouse, a joystick, a video camera that captures the user's gestures, a touchpad provided on the display screen of a flat panel display, or a combination of these.
  • the input device 300 of this embodiment is an example of a user input device.
  • FIG. 3 is a functional configuration diagram of the image generating device 200 of FIG. 2.
  • the diagram is a block diagram focusing on functions, and these functional blocks can be realized in various forms by hardware only, software only, or a combination of both.
  • At least a part of the functions of the image generating device 200 may be implemented by the HMD 100.
  • at least a part of the functions of the image generating device 200 may be implemented by a server connected to the image generating device 200 via a network.
  • the image generating device 200 comprises a game control unit 201, a reference determination unit 202, a viewpoint determination unit 203, a rendering unit 204, an image processing unit 205, a transmission unit 206, and a storage unit 207.
  • the game control unit 201 executes the game program read from the storage unit 207 to progress the game.
  • the game control unit 201 transmits instruction inputs to the reference determination unit 202 for progressing through a predetermined scene in the virtual space, and changes the reference position and reference direction of the user in the virtual space, which will be described later.
  • The "predetermined scene in the virtual space" refers to, for example, a scene that represents an action of an object on the user in the virtual space, or an event scene that occurs in the virtual space.
  • The "action of an object on the user in the virtual space" refers to, for example, an action that moves the user in a direction predetermined according to the positional relationship between the user and the object, such as when the user collides with a physical object or is pushed or pulled by an enemy character object, or an action that moves the user along a predetermined path, such as when the user rides on a vehicle object.
  • The "event scene that occurs in the virtual space" refers to a scene in which the user is forced to move in a predetermined pattern, such as a movie scene.
  • the reference determination unit 202 determines a reference position and a reference direction, which are the position and direction in which the user is placed in the virtual space, based on an instruction input different from the instruction input of the position and direction of the HMD 100 (hereinafter, HMD position and HMD direction).
  • the reference position and the reference direction change regardless of the change in the HMD position and the HMD direction.
  • the reference position and the reference direction change based on at least one of the instruction input via the input device 300 and the instruction input from the game control unit 201 for progressing a predetermined scene in the virtual space.
  • the reference position is set to the center of the user's body in the virtual space (e.g., the center of the torso, etc.), and the reference direction is set to the direction of the user's body in the virtual space (e.g., the direction of the torso, etc.).
  • setting the reference position to the center of the user's body is merely an example, and the reference position can be set to any position based at least on the user's position.
  • the reference determination unit 202 reads out a reference position and a reference direction in the virtual space from the storage unit 207, and determines a reference position and a reference direction based on an instruction input received from at least one of the input device 300 and the game control unit 201.
  • the reference determination unit 202 updates the reference position and reference direction in the storage unit 207 with the determined reference position and reference direction.
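The read-update-write cycle performed by the reference determination unit can be sketched as follows. This is a minimal illustrative sketch; the class and function names are assumptions, not taken from the patent.

```python
# Illustrative sketch of the reference-determination step: the stored
# reference pose is read, updated from an instruction input that is
# independent of HMD tracking (e.g. a controller move or a scripted
# scene event), and written back. All names here are assumptions.
class ReferenceStore:
    """Stands in for the storage unit holding the reference pose."""
    def __init__(self):
        self.position = [0.0, 0.0, 0.0]   # x, y, z in the virtual space
        self.direction = [0.0, 0.0, 0.0]  # rotations about the x, y, z axes

def apply_instruction(store, move=(0.0, 0.0, 0.0), turn=(0.0, 0.0, 0.0)):
    """Update the reference pose from a non-HMD instruction input."""
    store.position = [p + d for p, d in zip(store.position, move)]
    store.direction = [a + d for a, d in zip(store.direction, turn)]
    return store.position, store.direction
```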
  • the viewpoint determination unit 203 acquires the HMD position and HMD direction as well as the reference position and reference direction, and determines the user's viewpoint position and viewpoint direction in the virtual space based on these. For example, the viewpoint determination unit 203 determines the viewpoint position and viewpoint direction based on the reference position and reference direction so as to reflect the relative position and relative direction of the HMD 100 with respect to the reference position and reference direction.
  • FIG. 4 is a diagram of a three-dimensional virtual space consisting of mutually orthogonal x-, y-, and z-axes, viewed from the z direction.
  • The rotation angle α is about the x-axis.
  • The rotation angle β is about the y-axis.
  • The rotation angle γ is about the z-axis.
  • The viewpoint position is determined to be (x1+x2, y1+y2, z1+z2).
  • The viewpoint direction is determined to be (α1+α2, β1+β2, γ1+γ2).
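The component-wise composition described above (viewpoint pose = reference pose plus HMD pose offset) can be sketched as below. The `Pose` type and field names are illustrative assumptions.

```python
# Sketch of the viewpoint composition: the viewpoint pose is the
# component-wise sum of the reference pose (set by controller/game
# input) and the HMD pose offset (from head tracking).
from dataclasses import dataclass

@dataclass
class Pose:
    x: float; y: float; z: float              # position along the x, y, z axes
    alpha: float; beta: float; gamma: float   # rotations about the x, y, z axes

def combine(reference: Pose, hmd_offset: Pose) -> Pose:
    """Viewpoint pose = reference pose + HMD pose offset, component-wise."""
    return Pose(reference.x + hmd_offset.x,
                reference.y + hmd_offset.y,
                reference.z + hmd_offset.z,
                reference.alpha + hmd_offset.alpha,
                reference.beta + hmd_offset.beta,
                reference.gamma + hmd_offset.gamma)
```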
  • the rendering unit 204 reads image data required for generating an image from the storage unit 207, and generates a three-dimensional image by rendering objects in a virtual space. For example, the rendering unit 204 generates a three-dimensional image by rendering objects in a virtual space that are visible from the viewpoint position and in the viewpoint direction of a user wearing the HMD 100, based on the user's viewpoint position and viewpoint direction determined by the viewpoint determination unit 203.
  • the image processing unit 205 processes the rendered image as necessary to generate a three-dimensional image (hereinafter referred to as an HMD image) to be displayed on the HMD 100, and provides it to the transmission unit 206.
  • the image processing unit 205 includes a post-processing unit 205a, a reprojection unit 205b, and a distortion processing unit 205c.
  • the reprojection unit 205b executes the reprojection process.
  • The reprojection process includes a first reprojection process and a second reprojection process. The first reprojection process and the second reprojection process will be described later.
  • the reprojection unit 205b supplies the three-dimensional image that has been subjected to the reprojection process to the distortion processing unit 205c.
  • the distortion processing unit 205c performs processing to distort the three-dimensional image by modifying the image to match the distortion caused by the optical system of the HMD 100.
  • the transmission unit 206 transmits the HMD image generated in the image processing unit 205 to the HMD 100.
  • the transmission unit 206 transmits the HMD image that has been subjected to the first reprojection process, and then transmits the HMD image that has been subjected to the second reprojection process.
  • the transmission unit 206 in this embodiment causes the HMD 100 to display the HMD image that has been subjected to the first reprojection process, and then causes the HMD 100 to display the HMD image that has been subjected to the second reprojection process.
  • the transmission unit in this embodiment is an example of a display control unit.
  • The storage unit 207 stores the reference position and reference direction, data required for generating images, and various programs for executing various processes.
  • step S102 the viewpoint determination unit 203 sets the viewpoint position and viewpoint direction based on the HMD position and HMD direction and the reference position and reference direction.
  • the viewpoint determination unit 203 supplies the determined viewpoint position and viewpoint direction to the rendering unit 204.
  • step S103 the rendering unit 204 renders the object in the virtual space based on the determined viewpoint position and viewpoint direction to generate a three-dimensional image.
  • the rendering unit 204 supplies the generated three-dimensional image to the post-processing unit 205a of the image processing unit 205.
  • step S104 the post-processing unit 205a performs post-processing on the rendered three-dimensional image.
  • the post-processing unit 205a supplies the three-dimensional image that has been subjected to the post-processing to the reprojection unit 205b.
  • step S105 the reprojection unit 205b acquires a new first HMD position and first HMD direction.
  • the new first HMD position and first HMD direction here are the HMD position and HMD direction obtained at the time of drawing the three-dimensional image.
  • The reprojection unit 205b executes a first reprojection process on the post-processed three-dimensional image based on the amount of change from the HMD position and HMD direction to the new first HMD position and first HMD direction.
  • The first reprojection process is a process of converting the post-processed three-dimensional image to match the new HMD position and HMD direction, based on the HMD position and HMD direction and the new first HMD position and first HMD direction.
  • The first reprojection process can reduce delays due to deviations between the HMD position and HMD direction in step S102 and the new first HMD position and first HMD direction obtained at the drawing timing.
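A rotation-only image-space reprojection of this kind is commonly implemented as a homography warp H = K·ΔR·K⁻¹ applied to the rendered frame. The sketch below (NumPy, with an illustrative intrinsic matrix) shows that standard formulation as one possible realization; it is not necessarily the patent's implementation, and translation changes would additionally require depth information.

```python
import numpy as np

def rotation_reprojection_homography(K, R_render, R_display):
    """Homography that re-aims an already-rendered frame from the head
    orientation used at render time (R_render, a 3x3 rotation matrix)
    to the orientation measured just before display (R_display).
    Exact only for pure rotation changes of the viewpoint."""
    dR = R_display @ R_render.T          # incremental head rotation
    return K @ dR @ np.linalg.inv(K)     # warp in pixel coordinates
```

With identical render and display orientations the homography reduces to the identity, so the image passes through unchanged.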
  • the reprojection unit 205b supplies the post-processed and first reprojection processed three-dimensional image to the distortion processing unit 205c.
  • step S107 the distortion processing unit 205c performs distortion processing on the three-dimensional image that has been subjected to the post-processing and the first reprojection processing.
  • the distortion processing unit 205c supplies the three-dimensional image that has been subjected to the post-processing, the first reprojection processing, and the distortion processing to the transmission unit 206.
  • step S108 the transmission unit 206 transmits the three-dimensional image that has been subjected to the post-processing, the first reprojection processing, and the distortion processing to the HMD 100 as an HMD image.
  • the HMD image that has been subjected to the first reprojection processing is displayed on the HMD 100.
  • step S109 the reprojection unit 205b acquires a new reference position and reference direction as well as a new second HMD position and second HMD direction.
  • the new reference position and reference direction as well as the new second HMD position and second HMD direction are, for example, the reference position and reference direction as well as the HMD position and HMD direction after a predetermined time has elapsed from the timing when the HMD image is displayed on the HMD 100.
  • the reference position and reference direction as well as the HMD position and HMD direction after a predetermined time of 8.3 milliseconds has elapsed from the timing when the HMD image is displayed on the HMD 100 are set as the new reference position and reference direction as well as the new second HMD position and second HMD direction.
  • the timing when the HMD image is displayed on the HMD 100 is determined taking into consideration the transmission time, etc., expected when the image rendered by the image generating device 200 is transmitted to the HMD 100.
  • the reprojection unit 205b executes a second reprojection process based on the amount of change from the reference position and reference direction to the new reference position and reference direction (hereinafter referred to as the amount of change in the reference position and reference direction) and the amount of change from the HMD position and HMD direction to the new second HMD position and second HMD direction (hereinafter referred to as the amount of change in the HMD position and HMD direction).
  • the second reprojection process is a process that converts the post-processed three-dimensional image to match the new viewpoint position and viewpoint direction based on the reference position and reference direction, the new reference position and reference direction, the HMD position and HMD direction, and the new HMD position and HMD direction.
  • the second reprojection process converts the post-processed three-dimensional image into a three-dimensional image that can be seen from the new viewpoint position and viewpoint direction.
  • the relationship between the amount of change in the reference position and reference direction, the amount of change in the HMD position and HMD direction, and the new viewpoint position and viewpoint direction will be described with reference to FIG. 6.
  • The viewpoint position determined in step S102 is (x, y, z), and the viewpoint direction is (α, β, γ).
  • The amount of change in the reference position is (Δxr, Δyr, Δzr), and the amount of change in the reference direction is (Δαr, Δβr, Δγr).
  • The amount of change in the HMD position is (Δxh, Δyh, Δzh), and the amount of change in the HMD direction is (Δαh, Δβh, Δγh).
  • The new viewpoint position is expressed as (x+Δxr+Δxh, y+Δyr+Δyh, z+Δzr+Δzh).
  • The new viewpoint direction is expressed as (α+Δαr+Δαh, β+Δβr+Δβh, γ+Δγr+Δγh).
  • The reprojection unit 205b executes the reprojection process, for example, by linearly interpolating the viewpoint position toward the new viewpoint position expressed as above, and by applying spherical linear interpolation to the viewpoint direction toward the new viewpoint direction expressed as above. This converts the three-dimensional image to match the new viewpoint position and viewpoint direction.
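The interpolation step above can be sketched with the standard formulas: linear interpolation (lerp) for positions and spherical linear interpolation (slerp) for orientations represented as unit quaternions. This is a generic sketch of those two operations, not the patent's code.

```python
import numpy as np

def lerp(p0, p1, t):
    """Linear interpolation between two positions, t in [0, 1]."""
    return (1 - t) * np.asarray(p0, float) + t * np.asarray(p1, float)

def slerp(q0, q1, t):
    """Spherical linear interpolation between two unit quaternions."""
    q0, q1 = np.asarray(q0, float), np.asarray(q1, float)
    dot = np.dot(q0, q1)
    if dot < 0.0:            # take the shorter arc on the quaternion sphere
        q1, dot = -q1, -dot
    if dot > 0.9995:         # nearly parallel: fall back to normalized lerp
        q = (1 - t) * q0 + t * q1
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)   # angle between the two orientations
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)
```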
  • the reprojection unit 205b supplies the three-dimensional image that has been subjected to the second reprojection process to the distortion processing unit 205c.
  • step S111 the distortion processing unit 205c performs distortion processing on the three-dimensional image that has been subjected to the second reprojection processing.
  • the distortion processing unit 205c supplies the three-dimensional image that has been subjected to the post-processing, the second reprojection processing, and the distortion processing to the transmission unit 206.
  • step S112 the transmission unit 206 transmits the three-dimensional image that has been subjected to the post-processing, second reprojection processing, and distortion processing to the HMD 100 as an HMD image.
  • the HMD image that has been subjected to the second reprojection processing is displayed on the HMD 100.
  • After step S112, the process S100 ends.
  • the synchronization timing Vsync indicates the vertical synchronization timing of the display panel of the HMD 100.
  • a rendering process 1 is executed based on the reference position and reference direction and the HMD position and HMD direction at that drawing timing.
  • the first reprojection process is executed on the drawn three-dimensional image based on the new first HMD position and first HMD direction obtained at the drawing timing of the rendering process 1, and the three-dimensional image after the first reprojection process is transmitted to the HMD 100.
  • the three-dimensional image 1-1 to which the first reprojection process is applied at timestamp t0 is displayed on the HMD 100 (corresponding to steps S101 to S108 above).
  • The reason the reference position and reference direction are not taken into account in the first reprojection process is that, as described above, the reference position and reference direction at the drawing timing of the three-dimensional image 1-1 are used, and these can be considered to roughly reflect the reference position and reference direction at timestamp t0.
  • a second reprojection process is performed on the three-dimensional image rendered in the rendering process 1 based on a new reference position and reference direction after a predetermined time has elapsed since the timestamp t0 (for example, immediately before the timestamp t1) and a new second HMD position and second HMD direction, and the three-dimensional image that has undergone the second reprojection process is transmitted to the HMD 100.
  • the predetermined time here is, for example, a time that is less than the frame interval time from the time when the three-dimensional image is displayed on the HMD 100 to the time when the three-dimensional image of the next frame is displayed on the HMD 100, and can be set to, for example, a time equivalent to half the frame interval time.
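The "predetermined time" above is a fraction of the frame interval; for example, at 60 fps the frame interval is 1000/60 ≈ 16.67 ms, and half of it gives the 8.3 ms figure mentioned earlier. A sketch of that arithmetic (the function name is an illustrative assumption):

```python
def second_reprojection_offset_ms(frame_rate_hz: float, fraction: float = 0.5) -> float:
    """Offset in milliseconds after display at which the pose for the
    second reprojection is sampled: a fraction (default: half) of the
    frame interval, e.g. 0.5 * (1000 / 60) ~ 8.3 ms at 60 fps."""
    frame_interval_ms = 1000.0 / frame_rate_hz
    return fraction * frame_interval_ms
```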
  • rendering process 2 is executed based on the reference position and reference direction as well as the HMD position and HMD direction at that drawing timing.
  • a first reprojection process is executed on the drawn three-dimensional image based on the new first HMD position and first HMD direction obtained at the drawing timing of rendering process 2, and a three-dimensional image 2-1 to which the first reprojection process has been applied at timestamp t2 is displayed on the HMD 100.
  • a second reprojection process is executed on the three-dimensional image drawn in rendering process 2 based on the new reference position and reference direction as well as the new second HMD position and second HMD direction after a predetermined time has elapsed since timestamp t2 (for example, immediately before timestamp t3), and a three-dimensional image 2-2 to which the second reprojection process has been applied at timestamp t3 is displayed on the HMD 100.
  • Between the time one image is generated and the next, the user's reference position and reference direction in the virtual space may change. If drawing is performed at a frame rate of, for example, 60 fps (frames per second), then even with sufficient processing power, changes in the reference position and reference direction that occur within the roughly 16.67-millisecond frame interval will not be reflected in the HMD image. As a result, the HMD image may differ from what the user expected, causing the user to feel uncomfortable.
  • a reprojection process is performed to convert the three-dimensional image to match a new reference position and reference direction.
  • the second reprojection process converts the three-dimensional image to match the new viewpoint position and viewpoint direction based on the amount of change from the HMD position and HMD direction to the new second HMD position and second HMD direction, and the amount of change from the reference position and reference direction to the new reference position and reference direction.
  • the reprojection process is executed taking into account changes in the HMD position and HMD direction in addition to changes in the reference position and reference direction, so that it is possible to more effectively prevent the user from feeling uncomfortable about the HMD image.
  • the transmission unit 206 displays a three-dimensional image on the HMD 100, and then displays a three-dimensional image that has been subjected to the second reprojection process on the HMD 100.
  • the second reprojection process converts the three-dimensional image to match a new reference position and reference direction after a predetermined time has elapsed since the time the three-dimensional image was displayed.
  • the viewpoint position and viewpoint direction are determined based on the HMD position and HMD direction and the reference position and reference direction, but this is not limited to the above, and the viewpoint position and viewpoint direction may be determined based on the reference position and reference direction. In this case, the reprojection process may be performed based on the new reference position and reference direction.
  • the reprojection process is performed on the position and the direction, but the reprojection process may be performed on at least one of the position and the direction.
  • In the embodiment above, one three-dimensional image to which the second reprojection process has been applied is generated and displayed between the drawing timing of one three-dimensional image and that of the next. However, this is not limiting; two or more three-dimensional images to which the second reprojection process has been applied may be generated and displayed at different timestamps.
  • In the embodiment above, an HMD image that has been subjected to the first reprojection process is transmitted, and then an HMD image that has been subjected to the second reprojection process is transmitted.
  • Alternatively, an HMD image that has not been subjected to the reprojection process (i.e., the three-dimensional image generated by the rendering process in step S103) may be transmitted first, and then an HMD image that has been subjected to the second reprojection process may be transmitted.
  • Steps S201 to S208 are basically the same as steps S101 to S104 and S109 to S112 in FIG. 5. That is, the process S200 of this embodiment does not execute a first reprojection process based only on a new HMD position and HMD direction as in step S108 of FIG. 5, but executes a second reprojection process based on a new reference position and reference direction at the timing when the three-dimensional image is displayed on the HMD 100, as well as a new HMD position and HMD direction.
  • the three-dimensional image is displayed on the HMD 100 after reprojection processing has been applied to it based on the new reference position and reference direction, and the new HMD position and HMD direction, obtained at the time the three-dimensional image is displayed on the HMD 100.
  • one or more three-dimensional images to which a new reprojection process has been applied may be displayed so as to match a new reference position and reference direction after a predetermined time has elapsed since the timing when the three-dimensional image in that rendering process was first displayed. That is, although the embodiment in FIG. 9 shows that the second reprojection process is performed only once for one rendering process, it is also possible to use an embodiment in which the second reprojection process is performed multiple times for one rendering process.
  • the present invention relates to image generation technology.
  • 100 Head-mounted display, 200 Image generation device, 201 Game control unit, 202 Reference determination unit, 203 Viewpoint determination unit, 204 Rendering unit, 205 Image processing unit, 205a Post-processing unit, 205b Reprojection unit, 205c Distortion processing unit, 206 Transmission unit, 207 Memory unit, 300 Input device.


Abstract

This image generation device 200 according to the present invention is equipped with: a reference determination unit 202 for determining a reference position and/or a reference direction, which are the position and direction in which a user is positioned in a virtual space, on the basis of an instruction input which differs from the instruction input of the position and/or direction of a head-mounted display 100; a viewpoint determination unit 203 for determining the viewpoint position and/or viewpoint direction of the user in the virtual space on the basis of the reference position and/or reference direction; a rendering unit 204 for generating a three-dimensional image by rendering an object in virtual space on the basis of the viewpoint position and/or viewpoint direction; and a re-projection unit 205b for executing re-projection processing for transforming the three-dimensional image so as to match a new reference position and/or reference direction.

Description

Image generating device, image generating method, and image generating program
The present invention relates to image generation technology.
Players wear a head-mounted display connected to a game console on their head and operate a controller while looking at the screen displayed on the head-mounted display. When wearing a head-mounted display, the user sees nothing but the image displayed on the head-mounted display, which increases the sense of immersion in the world of the images and further enhances the entertainment value of the game. In addition, when virtual reality (VR) images are displayed on the head-mounted display and the user wearing it rotates their head, a 360-degree virtual space that can be seen all around is displayed. This further increases the sense of immersion in the images and improves the operability of applications such as games.
When a head-mounted display is given a head tracking function in this way and VR images are generated by changing the viewpoint position and viewpoint direction in conjunction with the position and direction of the user's head, there is a delay between the generation and the display of the VR images. This can result in a discrepancy between the position and direction of the user's head assumed when the image was generated and the position and direction of the user's head at the time the VR image is displayed on the head-mounted display. As a result, the user may experience a feeling of sickness (known as "VR sickness" (virtual reality sickness)). For this reason, a reprojection process that corrects the rendered image to match the position and direction of the head-mounted display at display time is commonly used.
The user's viewpoint position and viewpoint direction in the virtual space change not only in response to the position and direction of the head-mounted display, but also in response to changes in the position and direction in which the user is placed in the virtual space. This position and direction in which the user is placed may change between the time one image is generated and the time the next image is generated. In this case, the change in the user's position and direction during that interval is not reflected, the three-dimensional image differs from what the user expected, and the user may feel a sense of discomfort.
In view of the above, an object of the present invention is to provide a technology that prevents a user from feeling a sense of discomfort with a three-dimensional image due to a misalignment of the user's placement in a virtual space.
In order to solve the above problem, an image generating device according to one aspect of the present invention includes: a reference determination unit that determines at least one of a reference position and a reference direction, which are the position and direction in which a user is placed in a virtual space, based on an instruction input different from an instruction input of at least one of a position and a direction of a head-mounted display; a viewpoint determination unit that determines at least one of a viewpoint position and a viewpoint direction of the user in the virtual space based on at least one of the reference position and the reference direction; a rendering unit that renders an object in the virtual space based on at least one of the viewpoint position and the viewpoint direction to generate a three-dimensional image; and a reprojection unit that executes a reprojection process that converts the three-dimensional image to match at least one of a new reference position and reference direction.
An image generating method according to another aspect of the present invention includes the steps of: determining at least one of a user's reference position and reference direction in a virtual space based on an instruction input different from an instruction input of at least one of a position and a direction of a head-mounted display; determining at least one of a viewpoint position and a viewpoint direction of the user in the virtual space based on at least one of the reference position and the reference direction; rendering an object in the virtual space based on at least one of the viewpoint position and the viewpoint direction to generate a three-dimensional image; and executing a reprojection process that converts the three-dimensional image to match at least one of a new reference position and reference direction.
An image generation program according to another aspect of the present invention causes a computer to execute the steps of: determining at least one of a user's reference position and reference direction in a virtual space based on an instruction input different from an instruction input of at least one of a position and a direction of a head-mounted display; determining at least one of a viewpoint position and a viewpoint direction of the user in the virtual space based on at least one of the reference position and the reference direction; rendering an object in the virtual space based on at least one of the viewpoint position and the viewpoint direction to generate a three-dimensional image; and executing a reprojection process that converts the three-dimensional image to match at least one of a new reference position and reference direction.
Any combination of the above, or any mutual substitution of the components or expressions of the present invention among methods, devices, programs, transitory or non-transitory storage media on which a program is recorded, systems, and the like, is also valid as an aspect of the present invention.
The present invention makes it possible to provide a technology that prevents users from feeling a sense of discomfort with a three-dimensional image due to misalignment of the user's placement in a virtual space.
FIG. 1 is an external view of a head-mounted display.
FIG. 2 is a configuration diagram of an image generation system.
FIG. 3 is a configuration diagram of the image generating device of FIG. 2.
FIG. 4 is a diagram for explaining the relationship between the reference position and reference direction, the HMD position and HMD direction, and the viewpoint position and viewpoint direction.
FIG. 5 is a flowchart showing the flow of processing relating to image generation in the image generating device of the first embodiment.
FIG. 6 is a diagram for explaining the relationship between the amount of change in the reference position and reference direction, the amount of change in the HMD position and HMD direction, and the new viewpoint position and viewpoint direction.
FIG. 7 is a diagram for explaining the flow of the reprojection process of the first embodiment.
FIG. 8 is a flowchart showing the flow of processing relating to image generation in the image generating device of the second embodiment.
FIG. 9 is a diagram for explaining the flow of the reprojection process of the second embodiment.
First Embodiment
FIG. 1 is an external view of a head-mounted display (HMD) 100. The HMD 100 is a display device that is worn on the user's head to view still images and videos displayed on its display and to listen to audio and music output from its headphones.
The position of the head of a user wearing the HMD 100, and its direction such as the rotation angle and tilt of the head, can be measured by a gyro sensor, an acceleration sensor, or the like built into or attached externally to the HMD 100. In the HMD 100, for example, the position and direction of the head are detected relative to the position and direction of the head at the time the HMD 100 is powered on. The HMD 100 may further be provided with a camera that photographs the user's eyes. The camera mounted on the HMD 100 can detect the user's gaze direction, pupil movement, blinking, and so on.
FIG. 2 is a configuration diagram of the image generation system according to this embodiment. The image generation system 1 includes the HMD 100, an image generating device 200, and an input device 300.
The image generating device 200 of this embodiment is a game machine. The image generating device 200 may be connected to a server via a network. In that case, the server may provide the image generating device 200 with an online application, such as a game in which multiple users can participate via the network.
The image generating device 200 basically processes a content program, generates a three-dimensional image, and transmits it to the HMD 100. The content program and data are read by a media drive (not shown) from, for example, a ROM medium (not shown) on which application software for content such as a game and license information are recorded. This ROM medium is a read-only recording medium such as an optical disc, a magneto-optical disc, or a Blu-ray (registered trademark) disc. The image generating device 200 in one aspect determines the viewpoint position and viewpoint direction based on the position and direction of the head of the user wearing the HMD 100 (hereinafter, the HMD position and HMD direction) and on the reference position and reference direction described below, generates three-dimensional images of the content at a predetermined rate so as to produce the corresponding field of view, and transmits them to the HMD 100.
The HMD 100 receives and displays the three-dimensional image generated by the image generating device 200. The three-dimensional image displayed on the HMD 100 may be an image captured in advance by a camera, a computer-graphics image such as a game image, or a live image from a remote location delivered via a network. The three-dimensional image displayed on the HMD 100 may also be a VR image, an AR (augmented reality) image, an MR (mixed reality) image, or the like.
The input device 300 supplies various inputs received from the user to the image generating device 200. For example, the input device 300 supplies the image generating device 200 with instruction inputs for changing the user's position and direction in the virtual space. The input device 300 is realized by any one of, or a combination of, typical input devices such as a game controller, a keyboard, a mouse, a joystick, a video camera that captures the user's gestures, or a touchpad provided on the display screen of a flat-panel display. The input device 300 of this embodiment is an example of a user input device.
FIG. 3 is a functional configuration diagram of the image generating device 200 of FIG. 2. The figure is a block diagram focusing on functions, and these functional blocks can be realized in various forms by hardware only, by software only, or by a combination of the two. At least some of the functions of the image generating device 200 may be implemented by the HMD 100. Alternatively, at least some of the functions of the image generating device 200 may be implemented by a server connected to the image generating device 200 via a network. The image generating device 200 includes a game control unit 201, a reference determination unit 202, a viewpoint determination unit 203, a rendering unit 204, an image processing unit 205, a transmission unit 206, and a storage unit 207.
The game control unit 201 executes a game program read from the storage unit 207 to advance the game. The game control unit 201 transmits instruction inputs for advancing a predetermined scene in the virtual space to the reference determination unit 202, causing it to change the user's reference position and reference direction in the virtual space, which are described later. A "predetermined scene in the virtual space" refers to, for example, a scene representing an action that an object exerts on the user in the virtual space, or an event scene that occurs in the virtual space. An "action that an object exerts on the user in the virtual space" refers to, for example, an action that moves the user in a direction predetermined according to the positional relationship between the user and an object, such as the user colliding with a physical object or being pushed or pulled by an enemy-character object, or an action that moves the user along a vehicle's predetermined travel route, such as when the user rides a vehicle object. An "event scene that occurs in the virtual space" refers to a scene, such as a movie scene, in which the user is forcibly moved in a predetermined pattern.
The reference determination unit 202 determines the reference position and reference direction, which are the position and direction in which the user is placed in the virtual space, based on an instruction input different from the instruction input of the position and direction of the HMD 100 (hereinafter, the HMD position and HMD direction). The reference position and reference direction change independently of changes in the HMD position and HMD direction. For example, the reference position and reference direction change based on at least one of an instruction input via the input device 300 and an instruction input from the game control unit 201 for advancing a predetermined scene in the virtual space. For example, the reference position is set to the physical center of the user's body in the virtual space (for example, the center of the torso), and the reference direction is set to the direction of the user's body in the virtual space (for example, the direction the torso faces). Note that setting the reference position to the physical center of the user's body is merely an example; the reference position can be set to any position based at least on the user's position.
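The distinction drawn here — that the reference pose moves only through inputs other than HMD tracking — can be sketched as follows. This is a hypothetical, position-only illustration (the function and parameter names are not from the patent); the two inputs correspond to the controller input and the scene-advancing game-event input named above, and HMD tracking never touches the reference position.

```python
def update_reference(ref_pos, stick_delta=None, event_delta=None):
    """Advance the reference position from controller input and/or a
    game-event (scene) input; HMD tracking never modifies it.
    Positions are (x, y, z) tuples; direction handling is omitted."""
    for delta in (stick_delta, event_delta):
        if delta is not None:
            ref_pos = tuple(p + d for p, d in zip(ref_pos, delta))
    return ref_pos

# The controller moves the user forward; an enemy object pushes them back.
pos = update_reference((0.0, 0.0, 0.0),
                       stick_delta=(0.0, 0.0, 1.0),
                       event_delta=(0.0, 0.0, -0.25))
```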
For example, the reference determination unit 202 reads the reference position and reference direction in the virtual space from the storage unit 207 and determines the reference position and reference direction based on an instruction input received from at least one of the input device 300 and the game control unit 201. The reference determination unit 202 then updates the reference position and reference direction in the storage unit 207 with the determined reference position and reference direction.
The viewpoint determination unit 203 acquires the HMD position and HMD direction as well as the reference position and reference direction, and determines the user's viewpoint position and viewpoint direction in the virtual space based on these. For example, the viewpoint determination unit 203 determines the viewpoint position and viewpoint direction, using the reference position and reference direction as the base, so as to reflect the relative position and relative direction of the HMD 100 with respect to the reference position and reference direction.
The relationship between the reference position and reference direction, the HMD position and HMD direction, and the viewpoint position and viewpoint direction will be explained using FIG. 4. FIG. 4 is a view of a three-dimensional virtual space consisting of mutually orthogonal x-, y-, and z-axes, seen from the z direction. Let α be the rotation angle about the x-axis, β the rotation angle about the y-axis, and γ the rotation angle about the z-axis. As shown in FIG. 4, when the reference position is determined to be (x1, y1, z1) and the HMD position is detected as (x2, y2, z2), the viewpoint position is determined to be (x1+x2, y1+y2, z1+z2). Likewise, when the reference direction is determined to be (α1, β1, γ1) and the HMD direction is detected as (α2, β2, γ2), the viewpoint direction is determined to be (α1+α2, β1+β2, γ1+γ2).
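The additive relationship in FIG. 4 can be checked with a minimal sketch. The function name is illustrative (not from the patent); positions are (x, y, z) tuples and directions are (α, β, γ) rotation-angle tuples.

```python
def determine_viewpoint(ref_pos, ref_dir, hmd_pos, hmd_dir):
    """Compose the user's viewpoint from the reference pose and the HMD
    pose: per FIG. 4, corresponding components are simply added."""
    viewpoint_pos = tuple(r + h for r, h in zip(ref_pos, hmd_pos))
    viewpoint_dir = tuple(r + h for r, h in zip(ref_dir, hmd_dir))
    return viewpoint_pos, viewpoint_dir

# Reference pose (x1, y1, z1) / (a1, b1, g1) plus HMD pose
# (x2, y2, z2) / (a2, b2, g2) yields (x1+x2, ...) / (a1+a2, ...).
pos, direction = determine_viewpoint((1.0, 2.0, 0.0), (0.0, 30.0, 0.0),
                                     (0.5, -0.5, 0.25), (5.0, 15.0, 0.0))
```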
The rendering unit 204 reads image data required for generating an image from the storage unit 207 and renders objects in the virtual space to generate a three-dimensional image. For example, based on the user's viewpoint position and viewpoint direction determined by the viewpoint determination unit 203, the rendering unit 204 renders the objects in the virtual space that are visible in the viewpoint direction from the viewpoint position of the user wearing the HMD 100, thereby generating a three-dimensional image.
The image processing unit 205 processes the rendered image as necessary to generate the three-dimensional image to be displayed on the HMD 100 (hereinafter, the HMD image) and provides it to the transmission unit 206. The image processing unit 205 includes a post-processing unit 205a, a reprojection unit 205b, and a distortion processing unit 205c.
The post-processing unit 205a applies post-processing such as depth-of-field adjustment, tone mapping, and anti-aliasing to the image supplied from the rendering unit 204 so that the three-dimensional image appears natural and smooth.
The reprojection unit 205b executes reprojection processing. The reprojection processing includes a first reprojection process and a second reprojection process, both of which are described later. The reprojection unit 205b supplies the three-dimensional image that has undergone the reprojection processing to the distortion processing unit 205c.
The distortion processing unit 205c applies distortion processing to the three-dimensional image, deforming it to match the distortion produced by the optical system of the HMD 100.
The transmission unit 206 transmits the HMD image generated by the image processing unit 205 to the HMD 100. The transmission unit 206 of this embodiment transmits the HMD image that has undergone the first reprojection process, and then transmits the HMD image that has undergone the second reprojection process. In this way, the transmission unit 206 of this embodiment causes the HMD 100 to display the HMD image that has undergone the first reprojection process, and then causes the HMD 100 to display the HMD image that has undergone the second reprojection process. The transmission unit of this embodiment is an example of a display control unit.
The storage unit 207 stores the reference position and reference direction, the data required for generating images, and various programs for executing the various processes.
The process S100 relating to image generation in the image generating device of this embodiment will be explained using FIG. 5.
In step S101, the reference determination unit 202 determines the reference position and reference direction. The reference determination unit 202 of this embodiment determines the reference position and reference direction, based on an instruction input received from at least one of the input device 300 and the game control unit 201, so that they are the position and direction at the timing at which the three-dimensional image is rendered (hereinafter, the drawing timing). The reference determination unit 202 supplies the determined reference position and reference direction to the viewpoint determination unit 203. In addition, when the reference determination unit 202 has newly determined a reference position and reference direction, it supplies the newly determined reference position and reference direction to the reprojection unit 205b.
In step S102, the viewpoint determination unit 203 determines the viewpoint position and viewpoint direction based on the HMD position and HMD direction and on the reference position and reference direction. The viewpoint determination unit 203 supplies the determined viewpoint position and viewpoint direction to the rendering unit 204.
In step S103, the rendering unit 204 renders the objects in the virtual space based on the determined viewpoint position and viewpoint direction to generate a three-dimensional image. The rendering unit 204 supplies the generated three-dimensional image to the post-processing unit 205a of the image processing unit 205.
In step S104, the post-processing unit 205a performs post-processing on the rendered three-dimensional image. The post-processing unit 205a supplies the post-processed three-dimensional image to the reprojection unit 205b.
In step S105, the reprojection unit 205b acquires a new first HMD position and first HMD direction. The new first HMD position and first HMD direction here are the HMD position and HMD direction obtained at the drawing timing of the three-dimensional image.
In step S106, the reprojection unit 205b executes the first reprojection process on the post-processed three-dimensional image based on the amount of change from the HMD position and HMD direction to the new first HMD position and first HMD direction. The first reprojection process converts the post-processed three-dimensional image so that it matches the new HMD position and HMD direction, based on the original HMD position and HMD direction and on the new HMD position and HMD direction. The first reprojection process can reduce the latency caused by the deviation between the HMD position and HMD direction used in step S102 and the new first HMD position and first HMD direction obtained at the drawing timing. The reprojection unit 205b supplies the three-dimensional image that has undergone the post-processing and the first reprojection process to the distortion processing unit 205c.
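A rough way to picture the first reprojection is as an image-space correction driven by the pose delta between render time and display time. The sketch below is a deliberately simplified assumption, not the patent's actual transform: a real implementation warps the whole image (typically with a homography), while here only the horizontal pixel shift induced by a yaw change is computed, using a linear pixels-per-degree model that is valid only for small angles.

```python
def first_reprojection_yaw_shift(render_yaw_deg, display_yaw_deg,
                                 image_width_px, horizontal_fov_deg):
    """Approximate horizontal pixel shift that re-aims a frame rendered
    at render_yaw_deg so it matches display_yaw_deg. Small-angle model:
    pixels per degree = image width / horizontal field of view."""
    delta_deg = display_yaw_deg - render_yaw_deg
    return delta_deg * image_width_px / horizontal_fov_deg

# A 1-degree head turn on a 1000 px wide, 100-degree-FOV frame
# corresponds to a 10 px corrective shift.
shift = first_reprojection_yaw_shift(0.0, 1.0, 1000, 100.0)
```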
 In step S107, the distortion processing unit 205c performs distortion processing on the three-dimensional image that has undergone the post-processing and the first reprojection process, and supplies the three-dimensional image subjected to the post-processing, the first reprojection process, and the distortion processing to the transmission unit 206.
 In step S108, the transmission unit 206 transmits the three-dimensional image subjected to the post-processing, the first reprojection process, and the distortion processing to the HMD 100 as an HMD image. As a result, the HMD image to which the first reprojection process has been applied is displayed on the HMD 100.
 In step S109, the reprojection unit 205b acquires a new reference position and reference direction and a new second HMD position and second HMD direction. These are, for example, the reference position and reference direction and the HMD position and HMD direction at a point a predetermined time after the timing at which the HMD image is displayed on the HMD 100. For example, when rendering is performed at a frame rate of 60 fps (frames per second) and the frame interval is therefore 16.6 milliseconds, the reference position and reference direction and the HMD position and HMD direction 8.3 milliseconds (the predetermined time) after the timing at which the HMD image is displayed on the HMD 100 are taken as the new reference position and reference direction and the new second HMD position and second HMD direction. The timing at which the HMD image is displayed on the HMD 100 is determined in consideration of, for example, the transmission time expected when an image rendered by the image generation device 200 is transmitted to the HMD 100.
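The predetermined time of step S109 follows directly from the frame rate, as the 60 fps example above shows. A minimal sketch (function and parameter names are illustrative, not from the specification):

```python
def reprojection_sample_times(frame_rate_hz: float, display_time_ms: float):
    """Return (frame_interval_ms, sample_time_ms).

    The second reprojection samples a new reference/HMD pose a predetermined
    time after the display timing; here that time is half the frame interval,
    matching the 60 fps / 8.3 ms example in the description.
    """
    frame_interval_ms = 1000.0 / frame_rate_hz   # ~16.67 ms at 60 fps
    predetermined_ms = frame_interval_ms / 2.0   # ~8.33 ms at 60 fps
    return frame_interval_ms, display_time_ms + predetermined_ms
```

For a frame displayed at t = 0 ms at 60 fps, the new pose would be sampled roughly 8.3 ms later, just before the next vertical sync.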
 In step S110, the reprojection unit 205b executes a second reprojection process based on the amount of change from the reference position and reference direction to the new reference position and reference direction (hereinafter, the amount of change in the reference position and reference direction) and the amount of change from the HMD position and HMD direction to the new second HMD position and second HMD direction (hereinafter, the amount of change in the HMD position and HMD direction). The second reprojection process converts the post-processed three-dimensional image to match a new viewpoint position and viewpoint direction derived from these two amounts of change, yielding a three-dimensional image as seen from the new viewpoint position and viewpoint direction.
 The relationship among the amount of change in the reference position and reference direction, the amount of change in the HMD position and HMD direction, and the new viewpoint position and viewpoint direction will now be described with reference to FIG. 6. As shown in FIG. 6, when the viewpoint position determined in step S102 is (x, y, z), the amount of change in the reference position is (Δxr, Δyr, Δzr), and the amount of change in the HMD position is (Δxh, Δyh, Δzh), the new viewpoint position is expressed as (x+Δxr+Δxh, y+Δyr+Δyh, z+Δzr+Δzh). Likewise, when the viewpoint direction determined in step S102 is (α, β, γ), the amount of change in the reference direction is (Δαr, Δβr, Δγr), and the amount of change in the HMD direction is (Δαh, Δβh, Δγh), the new viewpoint direction is expressed as (α+Δαr+Δαh, β+Δβr+Δβh, γ+Δγr+Δγh).
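The FIG. 6 relation, new = old + Δreference + ΔHMD applied componentwise, can be sketched directly; the function name below is illustrative and not part of the specification.

```python
def updated_viewpoint(viewpoint, ref_delta, hmd_delta):
    """Componentwise update new = old + Δreference + ΔHMD.

    Applies equally to the (x, y, z) viewpoint position and to the
    (alpha, beta, gamma) viewpoint direction of FIG. 6.
    """
    return tuple(v + dr + dh for v, dr, dh in zip(viewpoint, ref_delta, hmd_delta))

# position example: (x+Δxr+Δxh, y+Δyr+Δyh, z+Δzr+Δzh)
new_position = updated_viewpoint((1, 2, 3), (10, 20, 30), (100, 200, 300))
```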
 The reprojection unit 205b executes the reprojection process by, for example, linearly interpolating the viewpoint position toward the new viewpoint position expressed above, and spherically linearly interpolating the viewpoint direction toward the new viewpoint direction expressed above. The three-dimensional image is thereby converted to match the new viewpoint position and viewpoint direction. The reprojection unit 205b supplies the three-dimensional image subjected to the second reprojection process to the distortion processing unit 205c.
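Spherical linear interpolation of a viewpoint direction is commonly implemented over unit quaternions. The following is a standard slerp sketch; representing the direction as a quaternion (rather than the Euler angles of FIG. 6) is an assumption of this sketch, not something stated in the specification.

```python
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0 and q1, t in [0, 1]."""
    q0 = np.asarray(q0, dtype=float)
    q1 = np.asarray(q1, dtype=float)
    d = float(np.dot(q0, q1))
    if d < 0.0:                      # take the shorter arc
        q1, d = -q1, -d
    if d > 0.9995:                   # nearly parallel: fall back to lerp + renormalize
        out = q0 + t * (q1 - q0)
        return out / np.linalg.norm(out)
    theta = np.arccos(d)             # angle between the two orientations
    return (np.sin((1.0 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)
```

Slerp keeps the interpolated orientation on the unit sphere with constant angular velocity, which is why it is preferred over componentwise interpolation of angles for view directions.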
 In step S111, the distortion processing unit 205c performs distortion processing on the three-dimensional image subjected to the second reprojection process, and supplies the three-dimensional image subjected to the post-processing, the second reprojection process, and the distortion processing to the transmission unit 206.
 In step S112, the transmission unit 206 transmits the three-dimensional image subjected to the post-processing, the second reprojection process, and the distortion processing to the HMD 100 as an HMD image. As a result, the HMD image to which the second reprojection process has been applied is displayed on the HMD 100.
 After step S112, the process S100 ends.
 The flow of the reprojection processing of the first embodiment will be described with reference to FIG. 7. In FIG. 7, the synchronization timing Vsync indicates the vertical synchronization timing of the display panel of the HMD 100. When the drawing timing of a three-dimensional image is reached, rendering process 1 is executed based on the reference position and reference direction and the HMD position and HMD direction at that drawing timing. Next, the first reprojection process is executed on the drawn three-dimensional image based on the new first HMD position and first HMD direction obtained at the drawing timing of rendering process 1, and the resulting three-dimensional image is transmitted to the HMD 100. As a result, three-dimensional image 1-1, to which the first reprojection process has been applied, is displayed on the HMD 100 at timestamp t0 (corresponding to steps S101 to S108 above).
 The reference position and reference direction are not taken into account in the first reprojection process because, as described above, three-dimensional image 1-1 is rendered using the reference position and reference direction at the drawing timing, which can be regarded as largely reflecting the reference position and reference direction at timestamp t0. Next, the second reprojection process is executed on the three-dimensional image drawn in rendering process 1 based on a new reference position and reference direction and a new second HMD position and second HMD direction obtained a predetermined time after timestamp t0 (for example, immediately before timestamp t1), and the resulting three-dimensional image is transmitted to the HMD 100. The predetermined time here is shorter than the frame interval time from the timing at which a three-dimensional image is displayed on the HMD 100 to the timing at which the next frame's three-dimensional image is displayed, and can be set to, for example, half the frame interval time; when the frame interval time is 16.6 ms, the predetermined time is set to 8.3 ms. Thereafter, three-dimensional image 1-2, to which the second reprojection process has been applied, is displayed on the HMD 100 at timestamp t1 (corresponding to steps S109 to S112 above).
 When the next drawing timing is reached, rendering process 2 is executed based on the reference position and reference direction and the HMD position and HMD direction at that drawing timing. Next, the first reprojection process is executed on the drawn three-dimensional image based on the new first HMD position and first HMD direction obtained at the drawing timing of rendering process 2, and three-dimensional image 2-1, to which the first reprojection process has been applied, is displayed on the HMD 100 at timestamp t2. The second reprojection process is then executed on the three-dimensional image drawn in rendering process 2 based on a new reference position and reference direction and a new second HMD position and second HMD direction obtained a predetermined time after timestamp t2 (for example, immediately before timestamp t3), and three-dimensional image 2-2, to which the second reprojection process has been applied, is displayed on the HMD 100 at timestamp t3.
 Between the generation of one image and the generation of the next, an instruction may be input from, for example, the input device 300 or the game control unit 201, so that the user's reference position and reference direction in the virtual space change during this interval. If drawing is performed at a frame rate of, for example, 60 fps (frames per second), then even with a sufficiently fast CPU, a change in the reference position and reference direction occurring within the roughly 16.67-millisecond interval is not reflected in the HMD image. As a result, the HMD image may differ from what the user expected, and the user may feel a sense of incongruity.
 In the present embodiment, a reprojection process that converts the three-dimensional image to match the new reference position and reference direction is executed. With this configuration, a change in the reference position and reference direction between the generation of one image and the generation of the next is reflected in the HMD image, which suppresses the sense of incongruity the user might otherwise feel toward the HMD image due to that change.
 In the present embodiment, the second reprojection process converts the three-dimensional image to match the new viewpoint position and viewpoint direction based on the amount of change from the HMD position and HMD direction to the new second HMD position and second HMD direction and the amount of change from the reference position and reference direction to the new reference position and reference direction. With this configuration, the reprojection process takes into account the change in the HMD position and HMD direction in addition to the change in the reference position and reference direction, so the user's sense of incongruity toward the HMD image can be suppressed more effectively.
 In the present embodiment, the transmission unit 206 causes the HMD 100 to display the three-dimensional image and then causes the HMD 100 to display the three-dimensional image subjected to the second reprojection process, which converts the image to match the new reference position and reference direction obtained a predetermined time after the timing at which the three-dimensional image was displayed. With this configuration, display at a higher frame rate can be achieved by interpolating, after the three-dimensional image is displayed, an image converted to match the new reference position and reference direction.
 Modified examples of the embodiment will now be described.
 In the present embodiment, the viewpoint position and viewpoint direction are determined based on the HMD position and HMD direction as well as the reference position and reference direction, but this is not limiting; the viewpoint position and viewpoint direction may be determined based on the reference position and reference direction alone. In that case, the reprojection process is executed based on the new reference position and reference direction.
 In the present embodiment, the reprojection process is executed for both position and direction, but it may be executed for at least one of position and direction.
 In the present embodiment, as shown in FIG. 7, one three-dimensional image to which the second reprojection process has been applied is generated and displayed between the drawing timing of one three-dimensional image and the next drawing timing, but this is not limiting; two or more three-dimensional images to which the second reprojection process has been applied may be generated and displayed at respectively different timestamps.
 In the present embodiment, an HMD image subjected to the first reprojection process is transmitted and then an HMD image subjected to the second reprojection process is transmitted, but this is not limiting; an HMD image not subjected to any reprojection process (that is, the three-dimensional image generated by the rendering process in step S103) may be transmitted first, followed by an HMD image subjected to the second reprojection process.
 Second Embodiment
 A second embodiment of the present invention will now be described. In the drawings and description of the second embodiment, components and members identical or equivalent to those of the first embodiment are denoted by the same reference numerals. Descriptions overlapping with the first embodiment are omitted as appropriate, and the description focuses on configurations that differ from the first embodiment.
 The process S200 for image generation in the image generation device of the present embodiment will be described with reference to FIG. 8. Steps S201 to S208 are basically the same as steps S101 to S104 and S109 to S112 in FIG. 5. That is, the process S200 of the present embodiment does not execute a first reprojection process based only on a new HMD position and HMD direction, as in step S108 of FIG. 5; rather, it executes a second reprojection process based on a new reference position and reference direction and a new HMD position and HMD direction at the timing at which the three-dimensional image is displayed on the HMD 100.
 The flow of the reprojection processing of the second embodiment will be described with reference to FIG. 9. When the drawing timing of a three-dimensional image is reached, rendering process 1 is executed based on the reference position and reference direction and the HMD position and HMD direction at that drawing timing. Next, the second reprojection process is executed on the three-dimensional image drawn in rendering process 1 based on a new reference position and reference direction and a new HMD position and HMD direction at the timing at which that three-dimensional image is displayed on the HMD 100, and the resulting three-dimensional image is transmitted to the HMD 100. Here, for example, the reference position and the like obtained at the timing at which the three-dimensional image is transmitted to the HMD 100 (timestamp t0) may be regarded as the "new reference position and the like at the timing at which the three-dimensional image is displayed on the HMD 100", or a "new reference position and the like at the timing at which the three-dimensional image is displayed on the HMD 100" predicted by a known method from reference positions and the like newly obtained after the drawing timing may be used. Thereafter, the three-dimensional image to which the reprojection process based on the new reference position and reference direction and the new HMD position and HMD direction at the display timing has been applied is displayed on the HMD 100.
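One common "known method" for predicting a pose at display time is constant-velocity extrapolation from recent samples. The sketch below is illustrative only; the function and parameter names are assumptions, not from the specification.

```python
def predict_position(p_prev, p_curr, dt_sample_ms, dt_ahead_ms):
    """Constant-velocity extrapolation of a 3-D position to the display time.

    p_prev, p_curr: two consecutive position samples, dt_sample_ms apart.
    dt_ahead_ms: how far past p_curr the display timing lies.
    """
    velocity = tuple((c - p) / dt_sample_ms for p, c in zip(p_prev, p_curr))
    return tuple(c + v * dt_ahead_ms for c, v in zip(p_curr, velocity))
```

The same extrapolation could be applied to the reference direction (with angles or, better, quaternions), trading a small prediction error for a pose that better matches the actual display timing.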
 Because there is a slight delay between the generation of a three-dimensional image and its display, the reference position and reference direction may shift during this interval. In the present embodiment, the reprojection process converts the three-dimensional image to match the new reference position and reference direction at the timing at which the three-dimensional image is displayed on the HMD 100. With this configuration, the user is less likely to perceive the shift in the reference position and reference direction, so the user's sense of incongruity toward the HMD image can be suppressed more effectively.
 In the second embodiment as well, as in the modified example of the first embodiment, one or more three-dimensional images to which a new reprojection process has been applied may be displayed between the drawing timing of one three-dimensional image and the next drawing timing, each converted to match a new reference position and reference direction obtained a predetermined time after the timing at which the three-dimensional image of that rendering process was first displayed. That is, although FIG. 9 shows an embodiment in which the second reprojection process is performed only once per rendering process, the second reprojection process may be performed multiple times per rendering process.
 The present invention has been described above based on embodiments. The embodiments are illustrative; those skilled in the art will understand that various modifications of the combinations of their components and processing processes are possible, and that such modifications also fall within the scope of the present invention.
The present invention relates to image generation technology.
 100 Head-mounted display, 200 Image generation device, 201 Game control unit, 202 Reference determination unit, 203 Viewpoint determination unit, 204 Rendering unit, 205 Image processing unit, 205a Post-processing unit, 205b Reprojection unit, 205c Distortion processing unit, 206 Transmission unit, 207 Storage unit, 300 Input device.

Claims (7)

  1.  An image generation device comprising:
     a reference determination unit that determines at least one of a reference position and a reference direction, which are a position and a direction at which a user is placed in a virtual space, based on an instruction input different from an instruction input of at least one of a position and a direction of a head mounted display;
     a viewpoint determination unit that determines at least one of a viewpoint position and a viewpoint direction of the user in the virtual space based on the at least one of the reference position and the reference direction;
     a rendering unit that renders an object in the virtual space based on the at least one of the viewpoint position and the viewpoint direction to generate a three-dimensional image; and
     a reprojection unit that executes a reprojection process of converting the three-dimensional image to match at least one of a new reference position and a new reference direction.
  2.  The image generation device according to claim 1, wherein
     the viewpoint determination unit determines the at least one of the viewpoint position and the viewpoint direction further based on the instruction input of the at least one of the position and the direction of the head mounted display, and
     the reprojection process converts the three-dimensional image to match at least one of a new viewpoint position and a new viewpoint direction, based on an amount of change from the at least one of the position and the direction of the head mounted display to at least one of a new position and a new direction of the head mounted display and an amount of change from the at least one of the reference position and the reference direction to the at least one of the new reference position and the new reference direction.
  3.  The image generation device according to claim 1, further comprising a display control unit that causes the head mounted display to display the three-dimensional image and then causes the head mounted display to display the three-dimensional image subjected to the reprojection process, wherein
     the reprojection process converts the three-dimensional image to match the at least one of the new reference position and the new reference direction obtained after a predetermined time has elapsed from the timing at which the three-dimensional image was displayed.
  4.  The image generation device according to claim 1, wherein the reprojection process converts the three-dimensional image to match the at least one of the new reference position and the new reference direction at the time the three-dimensional image is displayed on the head mounted display.
  5.  The image generation device according to claim 1, wherein the different instruction input includes at least one of an instruction input via a user input device different from the head mounted display and an instruction input for advancing a predetermined scene in the virtual space.
  6.  An image generation method comprising:
     determining at least one of a reference position and a reference direction of a user in a virtual space based on an instruction input different from an instruction input of at least one of a position and a direction of a head mounted display;
     determining at least one of a viewpoint position and a viewpoint direction of the user in the virtual space based on the at least one of the reference position and the reference direction;
     rendering an object in the virtual space based on the at least one of the viewpoint position and the viewpoint direction to generate a three-dimensional image; and
     executing a reprojection process of converting the three-dimensional image to match at least one of a new reference position and a new reference direction.
  7.  An image generation program for causing a computer to execute:
     determining at least one of a reference position and a reference direction of a user in a virtual space based on an instruction input different from an instruction input of at least one of a position and a direction of a head mounted display;
     determining at least one of a viewpoint position and a viewpoint direction of the user in the virtual space based on the at least one of the reference position and the reference direction;
     rendering an object in the virtual space based on the at least one of the viewpoint position and the viewpoint direction to generate a three-dimensional image; and
     executing a reprojection process of converting the three-dimensional image to match at least one of a new reference position and a new reference direction.
PCT/JP2023/045397 2023-01-10 2023-12-19 Image generation device, image generation method, and image generation program WO2024150615A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2023001646A JP2024098257A (en) 2023-01-10 2023-01-10 Image generating device, image generating method, and image generating program
JP2023-001646 2023-01-10

Publications (1)

Publication Number Publication Date
WO2024150615A1 true WO2024150615A1 (en) 2024-07-18

Family

ID=91896687

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/045397 WO2024150615A1 (en) 2023-01-10 2023-12-19 Image generation device, image generation method, and image generation program

Country Status (2)

Country Link
JP (1) JP2024098257A (en)
WO (1) WO2024150615A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07334703A (en) * 1994-06-10 1995-12-22 Susumu Tate Three-dimensional video processor and its method
JP2012033038A (en) * 2010-07-30 2012-02-16 Fujitsu Ltd Simulation video generation apparatus, method and program
JP2015095045A (en) * 2013-11-11 2015-05-18 株式会社ソニー・コンピュータエンタテインメント Image generation apparatus and image generation method
JP2017097122A (en) * 2015-11-20 2017-06-01 株式会社ソニー・インタラクティブエンタテインメント Information processing device and image generation method


Also Published As

Publication number Publication date
JP2024098257A (en) 2024-07-23

Similar Documents

Publication Publication Date Title
KR102384232B1 (en) Technology for recording augmented reality data
EP3760287B1 (en) Method and device for generating video frames
US10496158B2 (en) Image generation device, image generation method and non-transitory recording medium storing image generation program
US11003408B2 (en) Image generating apparatus and image generating method
US20220226731A1 (en) Positional Haptics Via Head-Mounted Peripheral
JP6978289B2 (en) Image generator, head-mounted display, image generation system, image generation method, and program
US11187895B2 (en) Content generation apparatus and method
WO2024150615A1 (en) Image generation device, image generation method, and image generation program
US11100716B2 (en) Image generating apparatus and image generation method for augmented reality
JP7047085B2 (en) Image generator, image generator, and program
JP2023095862A (en) Program and information processing method
JP2023099494A (en) Data processing apparatus for virtual reality, data processing method, and computer software
TWI715474B (en) Method for dynamically adjusting camera configuration, head-mounted display and computer device
WO2022107688A1 (en) Image generating device, image generating method, and program
WO2022190572A1 (en) Image generating device, program, image generating method, and image display system
US20220317765A1 (en) Image generation apparatus, image generation method, and image displaying program
JP2024015868A (en) Head-mounted display and image display method
Choi et al. VR Game Interfaces for Interaction onto 3D Display.
JPH1031756A (en) Picture generating device, its method and information storage medium

Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23916229

Country of ref document: EP

Kind code of ref document: A1