WO2024150615A1 - Image generation device, image generation method, and image generation program - Google Patents
Image generation device, image generation method, and image generation program
- Publication number
- WO2024150615A1 PCT/JP2023/045397 JP2023045397W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- viewpoint
- hmd
- dimensional image
- image
- virtual space
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims description 87
- 238000012545 processing Methods 0.000 claims abstract description 37
- 238000009877 rendering Methods 0.000 claims abstract description 33
- 230000008569 process Effects 0.000 claims description 82
- 230000008859 change Effects 0.000 claims description 27
- 230000002250 progressing effect Effects 0.000 claims description 3
- 230000001131 transforming effect Effects 0.000 abstract 1
- 238000012805 post-processing Methods 0.000 description 14
- 230000005540 biological transmission Effects 0.000 description 13
- 210000003128 head Anatomy 0.000 description 12
- 238000010586 diagram Methods 0.000 description 11
- 230000009471 action Effects 0.000 description 4
- 230000006870 function Effects 0.000 description 4
- 206010025482 malaise Diseases 0.000 description 3
- 238000012986 modification Methods 0.000 description 3
- 230000004048 modification Effects 0.000 description 3
- 238000007654 immersion Methods 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 230000004044 response Effects 0.000 description 2
- 230000001133 acceleration Effects 0.000 description 1
- 230000003190 augmentative effect Effects 0.000 description 1
- 230000004397 blinking Effects 0.000 description 1
- 230000001934 delay Effects 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 230000002708 enhancing effect Effects 0.000 description 1
- 230000014509 gene expression Effects 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 210000001747 pupil Anatomy 0.000 description 1
- 238000006467 substitution reaction Methods 0.000 description 1
Images
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
Definitions
- the present invention relates to image generation technology.
- Players wear, on their head, a head-mounted display connected to a game console, and operate a controller while looking at the screen displayed on the head-mounted display.
- the user can see nothing but the image displayed on the head-mounted display, which increases the sense of immersion in the world of the images and has the effect of further enhancing the entertainment value of the game.
- When VR (virtual reality) images are displayed on the head-mounted display and the user wearing the head-mounted display rotates their head, a 360-degree virtual space that can be seen all around is displayed. This further increases the sense of immersion in the images and improves the operability of applications such as games.
- When a head-mounted display is equipped with a head tracking function in this way and VR images are generated by changing the viewpoint position and direction in conjunction with the position and direction of the user's head, there is a delay between the generation and display of the VR images. This can result in a discrepancy between the position and direction of the user's head assumed when the image is generated and the position and direction of the user's head at the time the VR image is displayed on the head-mounted display. As a result, the user may experience a feeling of sickness (known as "VR sickness (Virtual Reality Sickness)"). For this reason, a reprojection process is commonly used to correct the rendered image to match the position and direction of the head-mounted display at the time the image is displayed.
- the user's viewpoint position and viewpoint direction in the virtual space change not only in response to the position and direction of the head mounted display, but also in response to changes in the position and direction in which the user is positioned in the virtual space.
- This position and direction in which the user is positioned may change between the time an image is generated and the time the next image is generated. In this case, the change in the user's position and direction during this time may not be reflected, and the three-dimensional image may differ from what the user expected, causing the user to feel uncomfortable.
- the object of the present invention is to provide a technology that prevents a user from feeling uncomfortable about a three-dimensional image due to a misalignment of the user's position in a virtual space.
- an image generating device includes a reference determination unit that determines at least one of a reference position and a reference direction, which are a position and a direction in which a user is to be placed in a virtual space, based on an instruction input different from at least one of an instruction input of a position and a direction of a head-mounted display; a viewpoint determination unit that determines at least one of a viewpoint position and a viewpoint direction of the user in the virtual space, based on at least one of the reference position and the reference direction; a rendering unit that renders an object in the virtual space, based on at least one of the viewpoint position and the viewpoint direction, to generate a three-dimensional image; and a reprojection unit that executes a reprojection process that converts the three-dimensional image to match at least one of the new reference position and the reference direction.
- An image generating method includes the steps of: determining at least one of a user's reference position and reference direction in a virtual space based on an instruction input different from at least one of an instruction input of a position and an orientation of a head-mounted display; determining at least one of a viewpoint position and a viewpoint direction of the user in the virtual space based on the at least one of the reference position and the reference direction; rendering an object in the virtual space based on the at least one of the viewpoint position and the viewpoint direction to generate a three-dimensional image; and executing a reprojection process to convert the three-dimensional image to match the new at least one of the reference position and the reference direction.
- Another aspect of the image generation program of the present invention is an image generation program for causing a computer to execute the steps of: determining at least one of a user's reference position and reference direction in a virtual space based on an instruction input different from at least one of an instruction input of a position and direction of a head-mounted display; determining at least one of a viewpoint position and viewpoint direction of the user in the virtual space based on at least one of the reference position and reference direction; rendering an object in the virtual space based on at least one of the viewpoint position and viewpoint direction to generate a three-dimensional image; and executing a reprojection process to convert the three-dimensional image to match at least one of the new reference position and reference direction.
- the present invention makes it possible to provide a technology that prevents users from feeling uncomfortable about three-dimensional images due to misalignment of the user in a virtual space.
- FIG. 1 is an external view of a head mounted display.
- FIG. 2 is a configuration diagram of an image generating system.
- FIG. 3 is a configuration diagram of the image generating device of FIG. 2.
- FIGS. 4A to 4C are diagrams for explaining the relationship between a reference position and a reference direction, an HMD position and an HMD direction, and a viewpoint position and a viewpoint direction.
- FIG. 5 is a flowchart showing a flow of processing relating to image generation in the image generating device of the first embodiment.
- FIGS. 6A to 6C are diagrams for explaining the relationship between the amount of change in the reference position and reference direction, the amount of change in the HMD position and HMD direction, and a new viewpoint position and viewpoint direction.
- FIG. 7 is a diagram for explaining the flow of a reprojection process according to the first embodiment.
- FIG. 8 is a flowchart showing a flow of processing relating to image generation in an image generating device according to a second embodiment.
- FIG. 9 is a diagram for explaining the flow of a reprojection process according to the second embodiment.
- the HMD 100 is a display device that is worn on a user's head to view still images and videos displayed on a display and to listen to audio and music outputted from headphones.
- the position of the head of a user wearing the HMD 100 and the direction of the head, such as its rotation angle and tilt, can be measured using a gyro sensor or acceleration sensor built into or attached externally to the HMD 100.
- the position and direction of the head are detected based on the position and direction of the head at the time the HMD 100 is turned on.
- the HMD 100 may further be provided with a camera that photographs the user's eyes. The camera mounted on the HMD 100 can detect the user's gaze direction, pupil movement, blinking, etc.
- FIG. 2 is a configuration diagram of an image generation system according to this embodiment.
- the image generation system 1 includes an HMD 100, an image generation device 200, and an input device 300.
- the image generating device 200 of this embodiment is a game machine.
- the image generating device 200 may be connected to a server via a network.
- the server may provide the image generating device 200 with an online application such as a game in which multiple users can participate via the network.
- the image generating device 200 basically processes the content program, generates three-dimensional images, and transmits them to the HMD 100.
- the content program and data are read by a media drive (not shown) from a ROM medium (not shown) on which the content's application software, such as a game, and license information are recorded.
- This ROM medium is a read-only recording medium such as an optical disk, a magneto-optical disk, or a Blu-ray (registered trademark) disk.
- the image generating device 200 determines the viewpoint position and viewpoint direction based on the position and direction of the head of the user wearing the HMD 100 (hereinafter referred to as the HMD position and HMD direction) and on the reference position and reference direction described below, generates three-dimensional images of the content at a predetermined rate so that a corresponding field of view is obtained, and transmits them to the HMD 100.
- the HMD 100 receives and displays the three-dimensional image generated by the image generating device 200.
- the three-dimensional image displayed on the HMD 100 may be an image captured in advance by a camera, an image generated by computer graphics such as a game image, or a live image from a remote location delivered via a network.
- the three-dimensional image displayed on the HMD 100 may also be a VR image, an AR (augmented reality) image, an MR (mixed reality) image, etc.
- the input device 300 supplies various inputs received from the user to the image generating device 200.
- the input device 300 supplies the image generating device 200 with instruction inputs for changing the user's position and direction in the virtual space.
- the input device 300 is realized by any of the typical input devices, such as a game controller, a keyboard, a mouse, a joystick, a video camera that captures the user's gestures, a touchpad provided on the display screen of a flat panel display, or a combination of these.
- the input device 300 of this embodiment is an example of a user input device.
- FIG. 3 is a functional configuration diagram of the image generating device 200 of FIG. 2.
- the diagram is a block diagram focusing on functions, and these functional blocks can be realized in various forms by hardware only, software only, or a combination of both.
- At least a part of the functions of the image generating device 200 may be implemented by the HMD 100.
- at least a part of the functions of the image generating device 200 may be implemented by a server connected to the image generating device 200 via a network.
- the image generating device 200 comprises a game control unit 201, a reference determination unit 202, a viewpoint determination unit 203, a rendering unit 204, an image processing unit 205, a transmission unit 206, and a storage unit 207.
- the game control unit 201 executes the game program read from the storage unit 207 to progress the game.
- the game control unit 201 transmits, to the reference determination unit 202, instruction inputs for progressing through a predetermined scene in the virtual space, thereby changing the reference position and reference direction of the user in the virtual space, which will be described later.
- the "predetermined scene in the virtual space” refers to, for example, a scene that represents an action of an object on the user in the virtual space, or an event scene that occurs in the virtual space.
- the "action of an object on the user in the virtual space” refers to, for example, an action that moves the user in a direction that is predetermined according to the positional relationship between the user and the object, such as when the user collides with an object of a physical object, or when the user is pushed or pulled by an enemy character object, or an action that moves the user according to a predetermined progression path of a vehicle, such as when the user rides on a vehicle object.
- the "event scene that occurs in the virtual space” refers to a scene in which the user is forced to move in a predetermined pattern, such as a movie scene.
- the reference determination unit 202 determines a reference position and a reference direction, which are the position and direction in which the user is placed in the virtual space, based on an instruction input different from the instruction input of the position and direction of the HMD 100 (hereinafter, HMD position and HMD direction).
- the reference position and the reference direction change regardless of the change in the HMD position and the HMD direction.
- the reference position and the reference direction change based on at least one of the instruction input via the input device 300 and the instruction input from the game control unit 201 for progressing a predetermined scene in the virtual space.
- the reference position is set to the center of the user's body in the virtual space (e.g., the center of the torso, etc.), and the reference direction is set to the direction of the user's body in the virtual space (e.g., the direction of the torso, etc.).
- setting the reference position to the center of the user's body is merely an example, and the reference position can be set to any position based at least on the user's position.
- the reference determination unit 202 reads out a reference position and a reference direction in the virtual space from the storage unit 207, and determines a reference position and a reference direction based on an instruction input received from at least one of the input device 300 and the game control unit 201.
- the reference determination unit 202 updates the reference position and reference direction in the storage unit 207 with the determined reference position and reference direction.
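- As a purely illustrative sketch of this read-update-write cycle (the `ReferencePose` type, the tuple representation, and the additive update below are assumptions for the sketch, not details given in the patent):
```python
from dataclasses import dataclass

@dataclass
class ReferencePose:
    position: tuple[float, float, float]   # where the user is placed in the virtual space
    direction: tuple[float, float, float]  # rotation angles about the x-, y-, and z-axes

def update_reference(stored: ReferencePose,
                     delta_position: tuple[float, float, float],
                     delta_direction: tuple[float, float, float]) -> ReferencePose:
    """Apply an instruction input (from the input device 300 or the game
    control unit 201) to the stored reference pose; the result would then be
    written back to the storage unit 207."""
    new_position = tuple(s + d for s, d in zip(stored.position, delta_position))
    new_direction = tuple(s + d for s, d in zip(stored.direction, delta_direction))
    return ReferencePose(new_position, new_direction)
```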
- the viewpoint determination unit 203 acquires the HMD position and HMD direction as well as the reference position and reference direction, and determines the user's viewpoint position and viewpoint direction in the virtual space based on these. For example, the viewpoint determination unit 203 determines the viewpoint position and viewpoint direction based on the reference position and reference direction so as to reflect the relative position and relative direction of the HMD 100 with respect to the reference position and reference direction.
- FIG. 4 is a diagram of a three-dimensional virtual space consisting of mutually orthogonal x-, y-, and z-axes, viewed from the z direction.
- the rotation angle about the x-axis is denoted α, the rotation angle about the y-axis is denoted β, and the rotation angle about the z-axis is denoted γ.
- when the reference position is (x1, y1, z1) and the relative position of the HMD 100 with respect to it is (x2, y2, z2), the viewpoint position is determined to be (x1+x2, y1+y2, z1+z2).
- when the reference direction is (α1, β1, γ1) and the relative direction of the HMD 100 with respect to it is (α2, β2, γ2), the viewpoint direction is determined to be (α1+α2, β1+β2, γ1+γ2).
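- As a hedged illustration of the calculation above, a minimal Python sketch that adds the reference pose and the relative HMD pose component-wise (the `Pose` type and its field names are hypothetical, not taken from the patent):
```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float      # position along the x-axis of the virtual space
    y: float
    z: float
    alpha: float  # rotation angle about the x-axis
    beta: float   # rotation angle about the y-axis
    gamma: float  # rotation angle about the z-axis

def determine_viewpoint(reference: Pose, hmd_relative: Pose) -> Pose:
    """Viewpoint = reference pose plus the pose of the HMD 100 relative to it,
    component-wise, as in the example above."""
    return Pose(
        reference.x + hmd_relative.x,
        reference.y + hmd_relative.y,
        reference.z + hmd_relative.z,
        reference.alpha + hmd_relative.alpha,
        reference.beta + hmd_relative.beta,
        reference.gamma + hmd_relative.gamma,
    )
```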
- the rendering unit 204 reads image data required for generating an image from the storage unit 207, and generates a three-dimensional image by rendering objects in a virtual space. For example, the rendering unit 204 generates a three-dimensional image by rendering objects in a virtual space that are visible from the viewpoint position and in the viewpoint direction of a user wearing the HMD 100, based on the user's viewpoint position and viewpoint direction determined by the viewpoint determination unit 203.
- the image processing unit 205 processes the rendered image as necessary to generate a three-dimensional image (hereinafter referred to as an HMD image) to be displayed on the HMD 100, and provides it to the transmission unit 206.
- the image processing unit 205 includes a post-processing unit 205a, a reprojection unit 205b, and a distortion processing unit 205c.
- the reprojection unit 205b executes the reprojection process.
- the reprojection process includes a first reprojection process and a second reprojection process. The first reprojection process and the second reprojection process will be described later.
- the reprojection unit 205b supplies the three-dimensional image that has been subjected to the reprojection process to the distortion processing unit 205c.
- the distortion processing unit 205c performs a distortion process that deforms the three-dimensional image to match the distortion produced by the optical system of the HMD 100.
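- The patent does not describe the distortion model; as an illustration only, one common approach is a radial (barrel) pre-distortion of normalized image coordinates so that the HMD lens cancels it. The coefficients below are placeholders, not values from the patent:
```python
def predistort(u: float, v: float, k1: float = 0.22, k2: float = 0.24) -> tuple[float, float]:
    """Radially pre-distort a normalized coordinate (origin at the lens center);
    k1 and k2 are example lens coefficients, not values from the patent."""
    r2 = u * u + v * v
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return u * scale, v * scale
```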
- the transmission unit 206 transmits the HMD image generated in the image processing unit 205 to the HMD 100.
- the transmission unit 206 transmits the HMD image that has been subjected to the first reprojection process, and then transmits the HMD image that has been subjected to the second reprojection process.
- the transmission unit 206 in this embodiment causes the HMD 100 to display the HMD image that has been subjected to the first reprojection process, and then causes the HMD 100 to display the HMD image that has been subjected to the second reprojection process.
- the transmission unit in this embodiment is an example of a display control unit.
- the storage unit 207 stores the reference position and reference direction, data required for generating images, and various programs for executing various processes.
- In step S102, the viewpoint determination unit 203 determines the viewpoint position and viewpoint direction based on the HMD position and HMD direction as well as the reference position and reference direction.
- the viewpoint determination unit 203 supplies the determined viewpoint position and viewpoint direction to the rendering unit 204.
- In step S103, the rendering unit 204 renders the object in the virtual space based on the determined viewpoint position and viewpoint direction to generate a three-dimensional image.
- the rendering unit 204 supplies the generated three-dimensional image to the post-processing unit 205a of the image processing unit 205.
- In step S104, the post-processing unit 205a performs post-processing on the rendered three-dimensional image.
- the post-processing unit 205a supplies the three-dimensional image that has been subjected to the post-processing to the reprojection unit 205b.
- In step S105, the reprojection unit 205b acquires a new first HMD position and first HMD direction.
- the new first HMD position and first HMD direction here are the HMD position and HMD direction obtained at the time of drawing the three-dimensional image.
- the reprojection unit 205b executes a first reprojection process on the post-processed three-dimensional image based on the amount of change from the HMD position and HMD direction to the new first HMD position and first HMD direction.
- the first reprojection process is a process of converting the post-processed three-dimensional image to match the new first HMD position and first HMD direction, based on the HMD position and HMD direction and the new first HMD position and first HMD direction.
- the first reprojection process can compensate for the deviation, caused by the processing delay, between the HMD position and HMD direction of step S102 and the new first HMD position and first HMD direction obtained at the drawing timing.
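- The patent does not give the warp math for this step; as a sketch only, a common rotation-only "timewarp" formulation (which handles the direction change but not the position change described above) computes a corrective rotation between the HMD direction used for rendering and the new first HMD direction obtained at the drawing timing, and the rendered frame would then be re-sampled through it:
```python
import numpy as np

def euler_to_matrix(alpha: float, beta: float, gamma: float) -> np.ndarray:
    """Rotation matrix for angles about the x-, y-, and z-axes (x applied first)."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])
    ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
    rz = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
    return rz @ ry @ rx

def first_reprojection_rotation(direction_at_render, direction_at_draw) -> np.ndarray:
    """Corrective rotation taking the orientation used for rendering to the
    orientation measured at the drawing timing; the image re-sampling itself
    is omitted in this sketch."""
    return euler_to_matrix(*direction_at_draw).T @ euler_to_matrix(*direction_at_render)
```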
- the reprojection unit 205b supplies the post-processed and first reprojection processed three-dimensional image to the distortion processing unit 205c.
- In step S107, the distortion processing unit 205c performs distortion processing on the three-dimensional image that has been subjected to the post-processing and the first reprojection processing.
- the distortion processing unit 205c supplies the three-dimensional image that has been subjected to the post-processing, the first reprojection processing, and the distortion processing to the transmission unit 206.
- In step S108, the transmission unit 206 transmits the three-dimensional image that has been subjected to the post-processing, the first reprojection processing, and the distortion processing to the HMD 100 as an HMD image.
- the HMD image that has been subjected to the first reprojection processing is displayed on the HMD 100.
- In step S109, the reprojection unit 205b acquires a new reference position and reference direction as well as a new second HMD position and second HMD direction.
- the new reference position and reference direction as well as the new second HMD position and second HMD direction are, for example, the reference position and reference direction as well as the HMD position and HMD direction after a predetermined time has elapsed from the timing when the HMD image is displayed on the HMD 100.
- the reference position and reference direction as well as the HMD position and HMD direction after a predetermined time of 8.3 milliseconds has elapsed from the timing when the HMD image is displayed on the HMD 100 are set as the new reference position and reference direction as well as the new second HMD position and second HMD direction.
- the timing when the HMD image is displayed on the HMD 100 is determined taking into consideration the transmission time, etc., expected when the image rendered by the image generating device 200 is transmitted to the HMD 100.
- the reprojection unit 205b executes a second reprojection process based on the amount of change from the reference position and reference direction to the new reference position and reference direction (hereinafter referred to as the amount of change in the reference position and reference direction) and the amount of change from the HMD position and HMD direction to the new second HMD position and second HMD direction (hereinafter referred to as the amount of change in the HMD position and HMD direction).
- the second reprojection process is a process that converts the post-processed three-dimensional image to match the new viewpoint position and viewpoint direction based on the reference position and reference direction, the new reference position and reference direction, the HMD position and HMD direction, and the new HMD position and HMD direction.
- the second reprojection process converts the post-processed three-dimensional image into a three-dimensional image that can be seen from the new viewpoint position and viewpoint direction.
- the relationship between the amount of change in the reference position and reference direction, the amount of change in the HMD position and HMD direction, and the new viewpoint position and viewpoint direction will be described with reference to FIG. 6.
- the viewpoint position determined in step S102 is (x, y, z), and the viewpoint direction determined in step S102 is (α, β, γ).
- the amount of change in the reference position is (Δxr, Δyr, Δzr), and the amount of change in the reference direction is (Δαr, Δβr, Δγr).
- the amount of change in the HMD position is (Δxh, Δyh, Δzh), and the amount of change in the HMD direction is (Δαh, Δβh, Δγh).
- the new viewpoint position is expressed as (x+Δxr+Δxh, y+Δyr+Δyh, z+Δzr+Δzh).
- the new viewpoint direction is expressed as (α+Δαr+Δαh, β+Δβr+Δβh, γ+Δγr+Δγh).
- the reprojection unit 205b executes the reprojection process, for example, by linearly interpolating the viewpoint position toward the new viewpoint position expressed as above, and spherically linearly interpolating the viewpoint direction toward the new viewpoint direction expressed as above. This converts the three-dimensional image to match the new viewpoint position and viewpoint direction.
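- A minimal sketch of that interpolation, assuming the viewpoint directions have already been converted to unit quaternions (the patent expresses them as rotation angles about the x-, y-, and z-axes) and that `t` is an interpolation factor between the old and new viewpoints; none of the names below come from the patent:
```python
import numpy as np

def slerp(q0: np.ndarray, q1: np.ndarray, t: float) -> np.ndarray:
    """Spherical linear interpolation between unit quaternions (w, x, y, z)."""
    dot = float(np.dot(q0, q1))
    if dot < 0.0:            # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:         # nearly identical: plain lerp is numerically stable
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)
    return (np.sin((1.0 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

def interpolate_viewpoint(old_pos, new_pos, old_quat, new_quat, t):
    """Viewpoint used by the second reprojection: the position is linearly
    interpolated and the orientation is spherically linearly interpolated."""
    pos = (1.0 - t) * np.asarray(old_pos, float) + t * np.asarray(new_pos, float)
    return pos, slerp(np.asarray(old_quat, float), np.asarray(new_quat, float), t)
```
- Under the same assumptions, the new viewpoint fed to this routine would be the old viewpoint plus the amount of change in the reference position and direction and the amount of change in the HMD position and direction, as in the expressions above.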
- the reprojection unit 205b supplies the three-dimensional image that has been subjected to the second reprojection process to the distortion processing unit 205c.
- In step S111, the distortion processing unit 205c performs distortion processing on the three-dimensional image that has been subjected to the second reprojection processing.
- the distortion processing unit 205c supplies the three-dimensional image that has been subjected to the post-processing, the second reprojection processing, and the distortion processing to the transmission unit 206.
- In step S112, the transmission unit 206 transmits the three-dimensional image that has been subjected to the post-processing, second reprojection processing, and distortion processing to the HMD 100 as an HMD image.
- the HMD image that has been subjected to the second reprojection processing is displayed on the HMD 100.
- After step S112, the process S100 ends.
- the synchronization timing Vsync indicates the vertical synchronization timing of the display panel of the HMD 100.
- a rendering process 1 is executed based on the reference position and reference direction and the HMD position and HMD direction at that drawing timing.
- the first reprojection process is executed on the drawn three-dimensional image based on the new first HMD position and first HMD direction obtained at the drawing timing of the rendering process 1, and the three-dimensional image after the first reprojection process is transmitted to the HMD 100.
- the three-dimensional image 1-1 to which the first reprojection process is applied at timestamp t0 is displayed on the HMD 100 (corresponding to steps S101 to S108 above).
- the reason why the reference position and reference direction are not taken into consideration in the first reprojection process is that, as described above, the reference position and reference direction at the drawing timing of the three-dimensional image 1-1 have already been used for rendering, and they can be considered to roughly reflect the reference position and reference direction at the timestamp t0.
- a second reprojection process is performed on the three-dimensional image rendered in the rendering process 1 based on a new reference position and reference direction after a predetermined time has elapsed since the timestamp t0 (for example, immediately before the timestamp t1) and a new second HMD position and second HMD direction, and the three-dimensional image that has undergone the second reprojection process is transmitted to the HMD 100.
- the predetermined time here is, for example, a time that is less than the frame interval time from the time when the three-dimensional image is displayed on the HMD 100 to the time when the three-dimensional image of the next frame is displayed on the HMD 100, and can be set to, for example, a time equivalent to half the frame interval time.
- rendering process 2 is executed based on the reference position and reference direction as well as the HMD position and HMD direction at that drawing timing.
- a first reprojection process is executed on the drawn three-dimensional image based on the new first HMD position and first HMD direction obtained at the drawing timing of rendering process 2, and a three-dimensional image 2-1 to which the first reprojection process has been applied at timestamp t2 is displayed on the HMD 100.
- a second reprojection process is executed on the three-dimensional image drawn in rendering process 2 based on the new reference position and reference direction as well as the new second HMD position and second HMD direction after a predetermined time has elapsed since timestamp t2 (for example, immediately before timestamp t3), and a three-dimensional image 2-2 to which the second reprojection process has been applied at timestamp t3 is displayed on the HMD 100.
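- As a small numerical sketch of this schedule, assuming a 60 fps rendering rate (the roughly 16.67 ms frame interval and 8.3 ms half-frame offset match the figures mentioned above; everything else is illustrative):
```python
RENDER_FPS = 60.0                                  # assumed rendering rate
FRAME_INTERVAL_MS = 1000.0 / RENDER_FPS            # ~16.67 ms between rendered frames
POSE_RESAMPLE_OFFSET_MS = FRAME_INTERVAL_MS / 2.0  # ~8.3 ms, half the frame interval

def second_reprojection_pose_time(first_display_ms: float) -> float:
    """Time at which the new reference position/direction and the new second
    HMD position/direction are sampled: a predetermined offset (here half the
    frame interval) after the first-reprojected image is displayed."""
    return first_display_ms + POSE_RESAMPLE_OFFSET_MS
```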
- between one drawing timing and the next, the user's reference position and reference direction in the virtual space may change. If drawing is performed at a frame rate of, for example, 60 fps (frames per second), changes in the reference position and reference direction that occur within the roughly 16.67-millisecond frame interval will not be reflected in the HMD image, no matter how fast the processing is. As a result, the HMD image may differ from what the user expected, causing the user to feel uncomfortable.
- a reprojection process is performed to convert the three-dimensional image to match a new reference position and reference direction.
- the second reprojection process converts the three-dimensional image to match the new viewpoint position and viewpoint direction based on the amount of change from the HMD position and HMD direction to the new second HMD position and second HMD direction, and the amount of change from the reference position and reference direction to the new reference position and reference direction.
- the reprojection process is executed taking into account changes in the HMD position and HMD direction in addition to changes in the reference position and reference direction, so that it is possible to more effectively prevent the user from feeling uncomfortable about the HMD image.
- the transmission unit 206 displays a three-dimensional image on the HMD 100, and then displays a three-dimensional image that has been subjected to the second reprojection process on the HMD 100.
- the second reprojection process converts the three-dimensional image to match a new reference position and reference direction after a predetermined time has elapsed since the time the three-dimensional image was displayed.
- the viewpoint position and viewpoint direction are determined based on the HMD position and HMD direction and the reference position and reference direction, but this is not limited to the above, and the viewpoint position and viewpoint direction may be determined based on the reference position and reference direction. In this case, the reprojection process may be performed based on the new reference position and reference direction.
- the reprojection process is performed on the position and the direction, but the reprojection process may be performed on at least one of the position and the direction.
- in the above, one three-dimensional image to which the second reprojection process has been applied is generated and displayed between the drawing timing of one three-dimensional image and the drawing timing of the next; however, this is not a limitation, and two or more three-dimensional images to which the second reprojection process has been applied may be generated and displayed at different timings.
- in the above embodiment, an HMD image that has been subjected to the first reprojection process is transmitted, and then an HMD image that has been subjected to the second reprojection process is transmitted.
- alternatively, an HMD image that has not been subjected to the reprojection process (i.e., the three-dimensional image generated by the rendering process in step S103) may be transmitted first, and then an HMD image that has been subjected to the second reprojection process may be transmitted.
- Steps S201 to S208 are basically the same as steps S101 to S104 and S109 to S112 in FIG. 5. That is, the process S200 of this embodiment does not execute a first reprojection process based only on a new HMD position and HMD direction as in step S108 of FIG. 5, but executes a second reprojection process based on a new reference position and reference direction at the timing when the three-dimensional image is displayed on the HMD 100, as well as a new HMD position and HMD direction.
- the three-dimensional image to which reprojection processing has been applied, based on the new reference position and reference direction and the new HMD position and HMD direction at the time when the three-dimensional image is displayed on the HMD 100, is displayed on the HMD 100.
- one or more three-dimensional images to which a new reprojection process has been applied may additionally be displayed so as to match a new reference position and reference direction after a predetermined time has elapsed since the timing when the three-dimensional image in that rendering process was first displayed. That is, although the embodiment in FIG. 9 performs the second reprojection process only once per rendering process, an embodiment in which the second reprojection process is performed multiple times per rendering process is also possible.
- the present invention relates to image generation technology.
- 100 Head-mounted display, 200 Image generation device, 201 Game control unit, 202 Reference determination unit, 203 Viewpoint determination unit, 204 Rendering unit, 205 Image processing unit, 205a Post-processing unit, 205b Reprojection unit, 205c Distortion processing unit, 206 Transmission unit, 207 Storage unit, 300 Input device.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
- Image Generation (AREA)
Abstract
The image generation device (200) of the present invention comprises: a reference determination unit (202) that determines at least one of a reference position and a reference direction, i.e., the position and direction in which a user is placed in a virtual space, on the basis of an instruction input that differs from an instruction input of at least one of the position and direction of a head-mounted display (100); a viewpoint determination unit (203) that determines at least one of a viewpoint position and a viewpoint direction of the user in the virtual space on the basis of at least one of the reference position and the reference direction; a rendering unit (204) that generates a three-dimensional image by rendering an object in the virtual space on the basis of at least one of the viewpoint position and the viewpoint direction; and a reprojection unit (205b) that executes a reprojection process for transforming the three-dimensional image so as to match at least one of a new reference position and a new reference direction.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023-001646 | 2023-01-10 | ||
JP2023001646A JP2024098257A (ja) | 2023-01-10 | 2023-01-10 | Image generation device, image generation method, and image generation program
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024150615A1 true WO2024150615A1 (fr) | 2024-07-18 |
Family
ID=91896687
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/045397 WO2024150615A1 (fr) | 2023-12-19 | Image generation device, image generation method, and image generation program |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2024098257A (fr) |
WO (1) | WO2024150615A1 (fr) |
-
2023
- 2023-01-10 JP JP2023001646A patent/JP2024098257A/ja active Pending
- 2023-12-19 WO PCT/JP2023/045397 patent/WO2024150615A1/fr unknown
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07334703A (ja) * | 1994-06-10 | 1995-12-22 | Susumu Tate | Three-dimensional image processing device and method |
JP2012033038A (ja) * | 2010-07-30 | 2012-02-16 | Fujitsu Ltd | Simulated video generation device, method, and program |
JP2015095045A (ja) * | 2013-11-11 | 2015-05-18 | Sony Computer Entertainment Inc. | Image generation device and image generation method |
JP2017097122A (ja) * | 2015-11-20 | 2017-06-01 | Sony Interactive Entertainment Inc. | Information processing device and image generation method |
Also Published As
Publication number | Publication date |
---|---|
JP2024098257A (ja) | 2024-07-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102384232B1 (ko) | Technique for recording augmented reality data | |
EP3760287B1 (fr) | Method and device for generating video frames | |
US10496158B2 (en) | Image generation device, image generation method and non-transitory recording medium storing image generation program | |
JP2021002288A (ja) | Image processing device, content processing system, and image processing method | |
US11003408B2 (en) | Image generating apparatus and image generating method | |
JP6978289B2 (ja) | Image generation device, head-mounted display, image generation system, image generation method, and program | |
US11187895B2 (en) | Content generation apparatus and method | |
WO2024150615A1 (fr) | Image generation device, image generation method, and image generation program | |
JP7560835B2 (ja) | Program and information processing method | |
US11100716B2 (en) | Image generating apparatus and image generation method for augmented reality | |
JP7047085B2 (ja) | Image generation device, image generation method, and program | |
JP2023099494A (ja) | Data processing device, data processing method, and computer software for virtual reality | |
TWI715474B (zh) | Method for dynamically adjusting lens configuration, head-mounted display, and computer device | |
WO2022107688A1 (fr) | Image generation device, image generation method, and program | |
WO2022190572A1 (fr) | Image generation device, program, image generation method, and image display system | |
US20220317765A1 (en) | Image generation apparatus, image generation method, and image displaying program | |
JP2024015868A (ja) | Head-mounted display and image display method | |
Choi et al. | VR Game Interfaces for Interaction onto 3D Display. | |
JPH1031756A (ja) | Image generation device, image generation method, and information storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23916229 Country of ref document: EP Kind code of ref document: A1 |