WO2018214431A1 - Method and apparatus for presenting a scene using a virtual reality device, and virtual reality device - Google Patents


Info

Publication number
WO2018214431A1
WO2018214431A1 (PCT application No. PCT/CN2017/112383)
Authority
WO
WIPO (PCT)
Prior art keywords
lens
eye
center
image
imaging device
Prior art date
Application number
PCT/CN2017/112383
Other languages
English (en)
Chinese (zh)
Inventor
王鹏
Original Assignee
歌尔科技有限公司
Priority date
Filing date
Publication date
Application filed by 歌尔科技有限公司
Publication of WO2018214431A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/20: Image signal generators
    • H04N 13/204: Image signal generators using stereoscopic image cameras
    • H04N 13/239: Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • H04N 13/30: Image reproducers
    • H04N 13/332: Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N 13/344: Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

Definitions

  • the present invention relates to the field of virtual reality technologies, and in particular, to a method, a device, and a virtual reality device for presenting a scene through a virtual reality device.
  • virtual reality technology has developed rapidly, not only for presenting virtual scenes to provide users with near-real immersion, but also for presenting real-life scenes to provide users with a sense of realism in the real world. Therefore, virtual reality devices such as virtual reality helmets, virtual reality glasses, and the like that present scenes based on virtual reality technology are also attracting more and more users' attention.
  • current virtual reality devices typically present scenes using a dual-camera arrangement consisting of a left eye camera and a right eye camera.
  • the dual camera of a virtual reality device differs from a general single camera in that it can increase the depth of field of the corresponding scene image when the scene is presented, so that the presented scene image has a stereoscopic effect, enhancing the realism when the user views the real world.
  • when a virtual reality device uses a dual camera to present a scene, it has conventionally been necessary to introduce a third-party open source library, such as the FFmpeg library, to perform a merge algorithm on each frame image previewed by the left-eye camera and the right-eye camera, in order to achieve a real-world simulation effect.
  • a method of presenting a scene by a virtual reality device is provided, the virtual reality device comprising a left eye imaging device and a right eye imaging device, where the left eye imaging device is sequentially provided with a left eye lens, a left eye camera, a left eye screen, and a left lens head, and the right eye imaging device is sequentially provided with a right eye lens, a right eye camera, a right eye screen, and a right lens head;
  • the method includes:
  • respectively acquiring an imaging parameter of the left-eye imaging device and an imaging parameter of the right-eye imaging device, where the imaging parameter includes at least a center vertical distance of the lens to the camera in the corresponding imaging device and a refractive angle of the lens;
  • acquiring a first offset distance of the left-eye imaging device according to the imaging parameter of the left-eye imaging device, and a second offset distance of the right-eye imaging device according to the imaging parameter of the right-eye imaging device;
  • adjusting the left eye lens, the left eye camera, the right eye lens, and the right eye camera so that the center horizontal distance of the left eye lens and the right eye lens is not greater than the human eye pupil distance, the left eye lens is horizontally offset from the left eye camera center by the first offset distance, and the right eye lens is horizontally offset from the right eye camera center by the second offset distance;
  • rendering the first image previewed by the left eye camera on the left eye screen, and rendering the second image simultaneously previewed by the right eye camera on the right eye screen, to present the corresponding scene to the user through the corresponding lenses.
  • a scene presenting apparatus is provided, which is disposed on a virtual reality device side.
  • the virtual reality device includes a left eye imaging device and a right eye imaging device; the left eye imaging device is sequentially provided with a left eye lens, a left eye camera, a left eye screen, and a left lens head, and the right eye imaging device is sequentially provided with a right eye lens, a right eye camera, a right eye screen, and a right lens head;
  • the scene rendering device includes:
  • a parameter obtaining unit configured to respectively acquire an imaging parameter of the left-eye imaging device and an imaging parameter of the right-eye imaging device, where the imaging parameter includes at least a central vertical distance of the lens to the camera in the corresponding imaging device and a refractive angle of the lens;
  • An offset distance acquisition unit configured to acquire a first offset distance of the left-eye imaging device according to an imaging parameter of the left-eye imaging device, and a second offset distance of the right-eye imaging device according to an imaging parameter of the right-eye imaging device;
  • a component adjustment unit configured to adjust the left eye lens, the left eye camera, the right eye lens, and the right eye camera according to the first offset distance, the second offset distance, and a preset human eye pupil distance, so that the center horizontal distance of the left eye lens and the right eye lens is not greater than the human eye pupil distance, the left eye lens is horizontally offset from the left eye camera center by the first offset distance, and the right eye lens is horizontally offset from the right eye camera center by the second offset distance;
  • an image rendering unit configured to render the first image previewed by the left eye camera on the left eye screen, and render the second image simultaneously previewed by the right eye camera on the right eye screen, to present the corresponding scene to the user through the corresponding lenses.
  • a virtual reality device comprising:
  • a left eye imaging device which is provided with a left eye lens, a left eye camera, a left eye screen, and a left lens head in sequence;
  • a right eye imaging device which is sequentially provided with a right eye lens, a right eye camera, a right eye screen, and a right lens head;
  • the method, apparatus, and virtual reality device for presenting a scene through a virtual reality device provided by the embodiments can present a scene without introducing a third-party library to perform a merge algorithm on each frame image previewed by the left and right eye cameras, which reduces the implementation difficulty of the virtual reality device, reduces the program size, and avoids low operating efficiency and high power consumption.
  • the image refresh speed is improved when the scene is presented, and the realism of the scene presentation is enhanced.
  • the comfort of users of virtual reality devices is also improved; the method is especially suitable for presenting real scenes of the real world.
  • FIG. 1 is a block diagram showing an example of a hardware configuration of an electronic device that can be used to implement an embodiment of the present invention.
  • FIG. 2 shows a flow diagram of a method of presenting a scene by a virtual reality device in accordance with an embodiment of the present invention.
  • FIG. 3 illustrates an example schematic diagram of a method of presenting a scene by a virtual reality device according to an embodiment of the present invention.
  • FIG. 4 shows a schematic block diagram of a scene presenting apparatus of an embodiment of the present invention.
  • FIG. 5 shows a schematic block diagram of a virtual reality device of an embodiment of the present invention.
  • FIG. 1 is a block diagram showing a hardware configuration of an electronic device 1000 in which an embodiment of the present invention can be implemented.
  • the electronic device 1000 can be a virtual reality helmet or virtual reality glasses or the like.
  • the electronic device 1000 may include a processor 1100, a memory 1200, an interface device 1300, a communication device 1400, a display device 1500, an input device 1600, a speaker 1700, a microphone 1800, and the like.
  • the processor 1100 may be a central processing unit (CPU), a microcontroller (MCU), or the like.
  • the memory 1200 includes, for example, a ROM (read only memory), a RAM (random access memory), and a nonvolatile memory such as a hard disk.
  • the interface device 1300 includes, for example, a USB interface, a headphone jack, and the like.
  • the communication device 1400 can, for example, perform wired or wireless communication, and specifically can include Wi-Fi communication, Bluetooth communication, 2G/3G/4G/5G communication, and the like.
  • the display device 1500 includes, for example, an LCD screen, a touch screen, and the like.
  • the input device 1600 can include, for example, a touch screen, a keyboard, and a somatosensory input. The user can input/output voice information through the speaker 1700 and the microphone 1800.
  • the memory 1200 of the electronic device 1000 is configured to store instructions for controlling the processor 1100 to operate so as to perform the method of presenting a scene through a virtual reality device provided by any of the embodiments of the present invention.
  • although FIG. 1 shows a plurality of devices of the electronic device 1000, the present invention may relate to only some of them; for example, the electronic device 1000 may relate only to the processor 1100 and the memory 1200.
  • a technician can design instructions in accordance with the disclosed aspects of the present invention. How the instructions control the processor for operation is well known in the art and will not be described in detail herein.
  • a method for presenting a scene by using a virtual reality device is implemented on a virtual reality device, where the virtual reality device includes a left eye imaging device and a right eye imaging device; the left eye imaging device is sequentially provided with a left eye lens, a left eye camera, a left eye screen, and a left lens head, and the right eye imaging device is sequentially provided with a right eye lens, a right eye camera, a right eye screen, and a right lens head.
  • the left-eye imaging device and the right-eye imaging device in the virtual reality device may be physically separated physical devices, or may be wholly or partially integrated physically while remaining logically separated.
  • the virtual reality device can be a virtual reality glasses or a virtual reality helmet.
  • the method for presenting a scene by using a virtual reality device includes steps S2100 to S2400.
  • in step S2100, an imaging parameter of the left-eye imaging device and an imaging parameter of the right-eye imaging device are respectively acquired, the imaging parameter including at least a center vertical distance of the lens to the camera in the corresponding imaging device and a refractive angle of the lens.
  • the imaging parameters may be provided by the manufacturer of the corresponding virtual reality device and pre-stored in a storage area of the virtual reality device with an interface provided for their acquisition, or they may be obtained as product parameters of the virtual reality device from the manual or the product's official website and provided for download; the possibilities are not enumerated here.
  • in step S2200, a first offset distance of the left-eye imaging device is acquired according to the imaging parameter of the left-eye imaging device, and a second offset distance of the right-eye imaging device is acquired according to the imaging parameter of the right-eye imaging device.
  • the imaging parameters of the left-eye imaging device acquired in step S2100 include the center vertical distance h1 of the lens to the camera of the left-eye imaging device and the refractive angle θ1 of the lens, and the imaging parameters of the right-eye imaging device include the center vertical distance h2 of the lens to the camera of the right-eye imaging device and the refractive angle θ2 of the lens.
  • the first offset distance and the second offset distance are obtained from these imaging parameters, namely as w1 = h1 × tan θ1 and, correspondingly, w2 = h2 × tan θ2.
  • in step S2300, the left eye lens, the left eye camera, the right eye lens, and the right eye camera are adjusted according to the first offset distance, the second offset distance, and a preset human eye pupil distance, so that the center horizontal distance of the left eye lens and the right eye lens is not greater than the human eye pupil distance, the left eye lens is horizontally offset from the left eye camera center by the first offset distance, and the right eye lens is horizontally offset from the right eye camera center by the second offset distance.
  • offsetting the left eye lens horizontally from the left eye camera center by the first offset distance, and the right eye lens horizontally from the right eye camera center by the second offset distance, helps avoid deviation in the scene image presented by the virtual reality device.
  • keeping the center horizontal distance of the left-eye lens and the right-eye lens not greater than the human eye pupil distance avoids the visual fatigue that would be caused if the distance between the left and right eye lenses were greater than the pupil distance, thereby further improving the comfort of use.
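As an illustrative sketch (not taken from the patent itself), the positioning constraint of the adjustment step can be written out along a single horizontal axis. The sign conventions and the outward direction of the camera offsets are assumptions made for illustration:

```python
def component_positions(pupil_distance: float, w1: float, w2: float):
    """Horizontal center positions, with the midpoint between the eyes as origin.

    The lenses are placed so their center horizontal distance equals the preset
    human eye pupil distance (satisfying the "not greater than" constraint); each
    camera center is then horizontally offset from its lens by the corresponding
    offset distance. The outward offset direction is an assumed convention.
    """
    left_lens_x = -pupil_distance / 2
    right_lens_x = pupil_distance / 2
    left_camera_x = left_lens_x - w1    # left eye camera, offset by w1
    right_camera_x = right_lens_x + w2  # right eye camera, offset by w2
    return left_lens_x, left_camera_x, right_lens_x, right_camera_x
```

For example, with a 63 mm pupil distance and 2 mm offsets, the lens centers sit at ±31.5 mm and the camera centers at ±33.5 mm.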
  • for example, the center vertical distance of the left eye lens to the left eye camera is h1 and the refractive angle of the left eye lens is θ1, while the center vertical distance of the right eye lens to the right eye camera is h2 and the refractive angle of the right eye lens is θ2;
  • the first offset distance obtained in step S2200 is then w1, and the second offset distance is w2.
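Using the relation w1 = h1 × tan θ1 stated for the offset distance calculation, both offset distances can be computed in a few lines; the numeric parameter values below are hypothetical:

```python
import math

def offset_distance(center_vertical_distance: float, refractive_angle_deg: float) -> float:
    """w = h * tan(theta): offset distance from the lens-to-camera center
    vertical distance h and the refractive angle theta of the lens."""
    return center_vertical_distance * math.tan(math.radians(refractive_angle_deg))

# hypothetical imaging parameters: distances in millimetres, angles in degrees
w1 = offset_distance(40.0, 5.0)  # left-eye imaging device (h1, theta1)
w2 = offset_distance(40.0, 5.0)  # right-eye imaging device (h2, theta2)
```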
  • the corresponding components are set as shown in FIG. 3. In FIG. 3, the center horizontal distance of the left eye lens and the right eye lens equals the human eye pupil distance, which is merely illustrative; in other examples, the center horizontal distance of the left-eye lens and the right-eye lens may be a value close to the human eye pupil distance.
  • the human eye pupil distance may be an average value for typical users selected according to engineering experience or experimental simulation, or may be obtained through an interface provided by the virtual reality device for user operation or input, so that the user actually using the virtual reality device can set the pupil distance according to his or her own needs or application scenario, realizing a personalized distance setting that further improves the comfort of using the virtual reality device.
  • in step S2400, the first image previewed by the left eye camera is rendered on the left eye screen, and the second image simultaneously previewed by the right eye camera is rendered on the right eye screen, to present the corresponding scene to the user through the corresponding lenses.
  • the scene may be a virtual scene to provide a near-real immersion of the user, or a real scene in the real world, to provide the user with a sense of realism in the real world.
  • in step S2400, the images obtained by simultaneously previewing the left and right eye cameras are respectively rendered on the corresponding left and right eye screens, and the corresponding scene is directly presented to the user, so that there is no need to introduce a third-party library to perform a merge algorithm on each frame previewed by the left and right eye cameras, thereby reducing the implementation difficulty of the virtual reality device, reducing the program size, and avoiding low operating efficiency and high power consumption.
  • the image refresh speed is improved when the scene is presented, and the realism of the scene presentation is enhanced; this is particularly suitable for presenting real scenes of the real world, providing a solid sense of reality.
  • the rendering operation can be implemented by the rendering function provided by Unity3D.
  • Unity3D is a fully integrated professional game engine developed by Unity Technologies, which can be generally applied to the development of virtual reality device rendering scenarios, and will not be described here.
  • each frame image obtained by simultaneously previewing the left and right eye cameras respectively can be respectively rendered on the corresponding left and right eye screens by the rendering function provided by Unity3D to present corresponding scenes.
  • the image center of each frame image previewed by the left and right eye cameras may be offset before the rendering operation is performed, to enhance the depth of field effect of the corresponding image and make the corresponding scene (especially a real scene of the real world) more realistic.
  • the method for presenting a scenario by using a virtual reality device provided in this example further includes:
  • the image center of the first image is horizontally offset by a first center offset d1 to obtain a new image center, and then the rendering operation is performed;
  • the image center of the second image is horizontally offset by a second center offset d2 to obtain a new image center, and then the rendering operation is performed.
  • here, h1 is the center vertical distance from the lens of the left eye imaging device to the camera, w1 is the first offset distance, and H1 is the image height of the first image; h2 is the center vertical distance from the lens of the right eye imaging device to the camera, w2 is the second offset distance, and H2 is the image height of the second image;
  • d1 = (w1/h1) × (H1/2) and d2 = (w2/h2) × (H2/2).
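The center offsets d1 = (w1/h1) × (H1/2) and d2 = (w2/h2) × (H2/2) translate directly into code; the function name and sample values below are illustrative only:

```python
def center_offset_pixels(w: float, h: float, image_height_px: int) -> float:
    """d = (w / h) * (H / 2): horizontal offset applied to the image center
    before rendering, where w is the offset distance, h is the lens-to-camera
    center vertical distance, and H is the image height in pixels."""
    return (w / h) * (image_height_px / 2)

d1 = center_offset_pixels(2.0, 40.0, 1080)  # first (left-eye) image
d2 = center_offset_pixels(2.0, 40.0, 1080)  # second (right-eye) image
```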
  • alternatively, the objects outside the contour of the image center object of each frame image previewed by the left and right eye cameras may be blurred before the rendering operation is performed, to enhance the depth of field effect of the corresponding image and make the corresponding scene (especially a real scene of the real world) more realistic.
  • the method for presenting a scenario by using a virtual reality device provided in this example further includes:
  • after determining a center object of the first image based on the image center point of the first image, performing contour detection on the center object to obtain a center object contour, and blurring the objects included in the first image outside the center object contour, the rendering operation is performed;
  • after determining a center object of the second image based on the image center point of the second image, performing contour detection on the center object to obtain a center object contour, and blurring the objects included in the second image outside the center object contour, the rendering operation is performed.
  • the contour detection can be implemented by the function findContours() provided by OpenCV, an open source cross-platform computer vision library, and will not be described here.
  • the image center of each frame image previewed by the left and right eye cameras may also be offset and the objects outside the contour of the image center object blurred before the rendering operation is performed, to better enhance the depth of field effect of the corresponding image and make the corresponding scene more realistic.
  • a scene presentation device 3000 is further provided, as shown in FIG. 4, disposed on the virtual reality device 4000 side. The scene presentation device 3000 includes a parameter acquisition unit 3100, an offset distance acquisition unit 3200, a component adjustment unit 3300, and an image rendering unit 3400, and optionally further includes an image center offset unit 3500 and a contour processing unit 3600, for implementing the method for presenting a scene by using a virtual reality device provided in this embodiment; details are not described herein again.
  • the virtual reality device 4000 includes a left eye imaging device 4100 and a right eye imaging device 4200. The left eye imaging device 4100 is sequentially provided with a left eye lens 4101, a left eye camera 4102, a left eye screen 4103, and a left lens head 4104, and the right eye imaging device 4200 is sequentially provided with a right eye lens 4201, a right eye camera 4202, a right eye screen 4203, and a right lens head 4204.
  • the scene rendering device 3000 includes:
  • the parameter acquisition unit 3100 is configured to respectively acquire the imaging parameters of the left-eye imaging device and the imaging parameters of the right-eye imaging device, where the imaging parameters include at least the center vertical distance of the lens to the camera in the corresponding imaging device and the refractive angle of the lens;
  • the offset distance acquiring unit 3200 is configured to acquire a first offset distance of the left eye imaging device according to the imaging parameter of the left eye imaging device, and acquire a second offset distance of the right eye imaging device according to the imaging parameter of the right eye imaging device;
  • the component adjustment unit 3300 is configured to adjust the left eye lens, the left eye camera, the right eye lens, and the right eye camera according to the first offset distance, the second offset distance, and a preset human eye pupil distance, so that the center horizontal distance of the left eye lens and the right eye lens is not greater than the human eye pupil distance, the left eye lens is horizontally offset from the left eye camera center by the first offset distance, and the right eye lens is horizontally offset from the right eye camera center by the second offset distance;
  • the image rendering unit 3400 is configured to render the first image previewed by the left eye camera on the left eye screen, and render the second image simultaneously previewed by the right eye camera on the right eye screen, to present the corresponding scene to the user through the corresponding lenses.
  • the offset distance obtaining unit 3200 includes:
  • means for calculating a first offset distance w1 = h1 × tan θ1 according to the center vertical distance h1 of the lens of the left-eye imaging device to the camera and the refractive angle θ1 of the lens, and, correspondingly, a second offset distance w2 = h2 × tan θ2 for the right-eye imaging device;
  • the scene presentation device 3000 further includes an image center offset unit 3500, configured to:
  • horizontally offset the image center of the first image by a first center offset d1 to obtain a new image center before the rendering operation is performed; and
  • horizontally offset the image center of the second image by a second center offset d2 to obtain a new image center before the rendering operation is performed.
  • the scene presentation device 3000 further includes a contour processing unit 3600, configured to:
  • after determining a center object of the first image based on the image center point of the first image, perform contour detection on the center object to obtain a center object contour, blur the objects included in the first image outside the center object contour, and then perform the rendering operation; and
  • after determining a center object of the second image based on the image center point of the second image, perform contour detection on the center object to obtain a center object contour, blur the objects included in the second image outside the center object contour, and then perform the rendering operation.
  • the connection relationship is merely illustrative and not a specific limitation.
  • the scene presentation device 3000 may be disposed in or integrated with the virtual reality device 4000, or may be connected to the virtual reality device 4000 through a wireless or wired connection to execute the method; the possible arrangements are not enumerated one by one here.
  • the left-eye imaging device 4100 and the right-eye imaging device 4200 included in the virtual reality device 4000 illustrated in FIG. 4 are merely illustrative, and do not necessarily mean that the left-eye imaging device 4100 and the right-eye imaging device 4200 are two physically separated devices; they may instead be only logically separated. For example, in a specific implementation, the left-eye lens 4101, left-eye camera 4102, left-eye screen 4103, and left lens head 4104 corresponding to the left-eye imaging device 4100, and the right-eye lens 4201, right-eye camera 4202, right-eye screen 4203, and right lens head 4204 corresponding to the right-eye imaging device 4200, may be integrated or built into the same physical device, without physically separating these elements to set the corresponding left-eye imaging device 4100 and right-eye imaging device 4200.
  • the hardware configuration of the scene rendering device 3000 may be the electronic device 1000 as shown in FIG. 1.
  • the scene rendering device 3000 can be implemented in a variety of ways.
  • the scene rendering device 3000 can be implemented by an instruction configuration processor.
  • the instructions can be stored in the ROM, and when the device is booted, the instructions are read from the ROM into the programmable device to implement the scene rendering device 3000.
  • the scene presentation device 3000 can also be solidified into a dedicated device (e.g., an ASIC).
  • the scene rendering device 3000 can be divided into mutually independent units, or they can be implemented together.
  • the scene rendering device 3000 may be implemented by one of the various implementations described above, or may be implemented by a combination of two or more of the various implementations described above.
  • a virtual reality device 5000 is further provided, as shown in FIG. 5, including:
  • a left eye imaging device 5100, which is sequentially provided with a left eye lens 5101, a left eye camera 5102, a left eye screen 5103, and a left lens head 5104;
  • a right eye imaging device 5200, which is sequentially provided with a right eye lens 5201, a right eye camera 5202, a right eye screen 5203, and a right lens head 5204; and
  • the scene presentation device 3000 provided in this embodiment.
  • the virtual reality device 5000 may be a virtual reality helmet or virtual reality glasses or the like.
  • the virtual reality device 5000 can be the electronic device 1000 as shown in FIG. 1.
  • the left-eye imaging device 5100 and the right-eye imaging device 5200 included in the virtual reality device 5000 illustrated in FIG. 5 are merely illustrative, and do not necessarily mean that the left-eye imaging device 5100 and the right-eye imaging device 5200 are two physically separated devices; they may be only logically separated. For example, the virtual reality device 5000 may include the sequentially disposed left-eye lens 5101, left-eye camera 5102, left-eye screen 5103, and left lens head 5104, and the sequentially disposed right-eye lens 5201, right-eye camera 5202, right-eye screen 5203, and right lens head 5204, without physically separating these elements to set the corresponding left-eye imaging device 5100 and right-eye imaging device 5200.
  • in summary, a method, an apparatus, and a virtual reality device for presenting a scene through a virtual reality device are provided. After the adjustment operation is performed on the left and right eye imaging components in the virtual reality device, the images previewed by the left and right eye cameras are respectively rendered on the corresponding left and right eye screens, so that there is no need to introduce a third-party library to perform a merge algorithm on each frame previewed by the left and right eye cameras, thereby reducing the implementation difficulty of the virtual reality device, reducing the program size, and avoiding low operating efficiency and high power consumption.
  • the image refresh speed is improved when the scene is presented, and the realism of the scene presentation is enhanced.
  • the invention can be a system, method and/or computer program product.
  • the computer program product can comprise a computer readable storage medium having computer readable program instructions embodied thereon for causing a processor to implement various aspects of the present invention.
  • the computer readable storage medium can be a tangible device that can hold and store the instructions used by the instruction execution device.
  • the computer readable storage medium can be, for example, but not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • a non-exhaustive list of computer readable storage media includes: a portable computer disk, a hard disk, a random access memory (RAM), a read only memory (ROM), an erasable programmable read only memory (EPROM or flash memory), a static random access memory (SRAM), a portable compact disc read only memory (CD-ROM), a digital versatile disc (DVD), a memory stick, a floppy disk, a mechanical encoding device such as a punch card or a groove structure in which instructions are stored, and any suitable combination of the foregoing.
  • a computer readable storage medium as used herein is not to be interpreted as a transient signal itself, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (e.g., a light pulse through a fiber optic cable), or an electrical signal transmitted through a wire.
  • the computer readable program instructions described herein can be downloaded from a computer readable storage medium to various computing/processing devices or downloaded to an external computer or external storage device over a network, such as the Internet, a local area network, a wide area network, and/or a wireless network.
  • the network may include copper transmission cables, optical fiber transmission, wireless transmission, routers, firewalls, switches, gateway computers, and/or edge servers.
  • a network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium in the respective computing/processing device.
  • Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages.
  • The computer readable program instructions can execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
  • In the latter scenario, the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or the connection can be made to an external computer (for example, through the Internet using an Internet service provider).
  • A customized electronic circuit, such as a programmable logic circuit, a field programmable gate array (FPGA), or a programmable logic array (PLA), can be personalized with state information of the computer readable program instructions, and the electronic circuit can execute the computer readable program instructions to implement various aspects of the present invention.
  • The computer readable program instructions can be provided to a processor of a general purpose computer, a special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processor, create means for implementing the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • The computer readable program instructions can also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to operate in a particular manner, such that the computer readable medium having the instructions stored therein comprises an article of manufacture including instructions that implement various aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • The computer readable program instructions can also be loaded onto a computer, other programmable data processing apparatus, or other devices, causing a series of operational steps to be performed on the computer, other programmable apparatus, or other devices to produce a computer-implemented process.
  • The instructions executed on the computer, other programmable data processing apparatus, or other devices thereby implement the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • Each block in the flowcharts or block diagrams can represent a module, a program segment, or a portion of instructions, which comprises one or more executable instructions for implementing the specified logical functions.
  • The functions noted in the blocks can also occur in an order different from that illustrated in the drawings. For example, two consecutive blocks may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending upon the functionality involved.
  • Each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or acts, or by a combination of dedicated hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, implementation by software, and implementation by a combination of software and hardware are equivalent.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to a method and apparatus for presenting a scene using a virtual reality device, and to a virtual reality apparatus. The method comprises the steps of: obtaining imaging parameters from a left-eye imaging device and a right-eye imaging device, respectively; obtaining a first offset distance of the left-eye imaging device and a second offset distance of the right-eye imaging device from the imaging parameters of the left-eye imaging device and the right-eye imaging device, respectively; adjusting a left-eye lens, a left-eye camera, a right-eye lens and a right-eye camera according to the first offset distance, the second offset distance and a predetermined human interpupillary distance; and simultaneously rendering a first preview image at the left-eye camera and a second preview image at the right-eye camera to a left-eye screen and a right-eye screen, respectively.
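The adjustment steps summarized in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the claimed implementation: the class and function names, the choice of "lens center minus sub-screen center" as the offset distance, and the symmetric placement of the two lenses about the midpoint between the eyes are all hypothetical.

```python
from dataclasses import dataclass


@dataclass
class ImagingParams:
    """Hypothetical per-eye imaging parameters (1-D horizontal positions, mm)."""
    lens_center: float    # horizontal position of the lens optical center
    screen_center: float  # horizontal position of the eye's sub-screen center


def offset_distance(params: ImagingParams) -> float:
    """Offset between the lens optical center and its sub-screen center."""
    return params.lens_center - params.screen_center


def adjust_positions(left: ImagingParams, right: ImagingParams, ipd: float):
    """Place each lens/virtual-camera pair so the two lens centers are one
    interpupillary distance (ipd) apart, compensating each side by its
    device's offset distance so the preview image stays centered behind
    the lens. Returns (left_lens_x, left_cam_x, right_lens_x, right_cam_x)."""
    d_left = offset_distance(left)
    d_right = offset_distance(right)
    # Symmetric placement about the midpoint between the user's eyes (x = 0).
    left_lens_x = -ipd / 2
    right_lens_x = ipd / 2
    # Shift each virtual camera by its device's offset distance.
    left_cam_x = left_lens_x - d_left
    right_cam_x = right_lens_x - d_right
    return left_lens_x, left_cam_x, right_lens_x, right_cam_x
```

With a 63 mm interpupillary distance and a +0.5 mm offset on the left device, for instance, the left camera is nudged 0.5 mm outward relative to the left lens, while the lenses themselves stay 63 mm apart.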
PCT/CN2017/112383 2017-05-22 2017-11-22 Method and apparatus for presenting a scene using a virtual reality device, and virtual reality apparatus WO2018214431A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710364277.6A CN107302694B (zh) 2017-05-22 2017-05-22 Method, device and virtual reality device for presenting a scene through a virtual reality device
CN201710364277.6 2017-05-22

Publications (1)

Publication Number Publication Date
WO2018214431A1 true WO2018214431A1 (fr) 2018-11-29

Family

ID=60137596

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/112383 WO2018214431A1 (fr) 2017-05-22 2017-11-22 Method and apparatus for presenting a scene using a virtual reality device, and virtual reality apparatus

Country Status (2)

Country Link
CN (1) CN107302694B (fr)
WO (1) WO2018214431A1 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107302694B (zh) * 2017-05-22 2019-01-18 歌尔科技有限公司 Method, device and virtual reality device for presenting a scene through a virtual reality device
CN108986225B (zh) * 2018-05-29 2022-10-18 歌尔光学科技有限公司 Processing method, apparatus and device for displaying a scene on a virtual reality device
CN108830943B (zh) * 2018-06-29 2022-05-31 歌尔光学科技有限公司 Image processing method and virtual reality device
CN109002164B (zh) * 2018-07-10 2021-08-24 歌尔光学科技有限公司 Display method and apparatus for a head-mounted display device, and head-mounted display device
CN111193919B (zh) * 2018-11-15 2023-01-13 中兴通讯股份有限公司 3D display method, apparatus, device and computer readable medium
CN112235562B (zh) * 2020-10-12 2023-09-15 聚好看科技股份有限公司 3D display terminal, controller and image processing method
CN115079826A (zh) * 2022-06-24 2022-09-20 平安银行股份有限公司 Virtual reality implementation method, electronic device and storage medium
CN115880999B (zh) * 2022-12-30 2024-06-21 视涯科技股份有限公司 Display device and near-eye display device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080062537A1 (en) * 2006-09-08 2008-03-13 Asia Optical Co., Inc. Compact imaging lens system
CN102782562A (zh) * 2010-04-30 2012-11-14 北京理工大学 Wide-field-of-view, high-resolution tiled head-mounted display device
CN102928979A (zh) * 2011-08-30 2013-02-13 微软公司 Adjusting a mixed reality display for interpupillary distance alignment
CN105103032A (zh) * 2013-03-26 2015-11-25 鲁索空间工程项目有限公司 Display device
CN106461943A (zh) * 2014-05-23 2017-02-22 高通股份有限公司 Method and device for a see-through near-eye display
CN106445167A (zh) * 2016-10-20 2017-02-22 网易(杭州)网络有限公司 Monocular field-of-view self-adaptive adjustment method and apparatus, and head-mounted viewing device
CN106646892A (zh) * 2017-03-21 2017-05-10 上海乐蜗信息科技有限公司 Optical system and head-mounted virtual reality device
CN107302694A (zh) * 2017-05-22 2017-10-27 歌尔科技有限公司 Method, device and virtual reality device for presenting a scene through a virtual reality device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107438788A (zh) * 2015-04-30 2017-12-05 谷歌公司 Virtual eyeglass set for viewing an actual scene, correcting for a lens position different from that of the eyes
CN105892053A (zh) * 2015-12-30 2016-08-24 乐视致新电子科技(天津)有限公司 Lens spacing adjustment method for a virtual reality helmet and apparatus therefor
CN105954875A (zh) * 2016-05-19 2016-09-21 华为技术有限公司 Virtual reality glasses and adjustment method therefor
CN106019588A (zh) * 2016-06-23 2016-10-12 深圳市虚拟现实科技有限公司 Near-eye display device and method capable of automatically measuring interpupillary distance
CN106598250B (zh) * 2016-12-19 2019-10-25 北京星辰美豆文化传播有限公司 VR display method, apparatus and electronic device


Also Published As

Publication number Publication date
CN107302694B (zh) 2019-01-18
CN107302694A (zh) 2017-10-27

Similar Documents

Publication Publication Date Title
WO2018214431A1 (fr) Method and apparatus for presenting a scene using a virtual reality device, and virtual reality apparatus
US10013796B2 (en) Rendering glasses shadows
JP2016018560A (ja) 視覚効果を有するオブジェクトを表示する装置及び方法
WO2018000629A1 (fr) Procédé et appareil de réglage de luminosité
US9965898B2 (en) Overlay display
CN112105983B (zh) 增强的视觉能力
KR20210082242A (ko) 증강-현실 또는 가상-현실 씬에서 오브젝트들의 표현들의 생성 및 수정
KR102204212B1 (ko) 실감형 콘텐츠 제공 장치 및 방법
KR20200043432A (ko) 이미지 데이터에 가상 조명 조정들을 제공하기 위한 기술
CN112074800A (zh) 用于在沉浸水平之间切换的技术
CN112272296B (zh) 使用深度和虚拟光的视频照亮
US11543655B1 (en) Rendering for multi-focus display systems
JP2023505235A (ja) 仮想、拡張、および複合現実システムおよび方法
CN111866492A (zh) 基于头戴显示设备的图像处理方法、装置及设备
US11237413B1 (en) Multi-focal display based on polarization switches and geometric phase lenses
KR20220106076A (ko) 조밀한 깊이 맵의 재구성을 위한 시스템 및 방법
CN111612915A (zh) 渲染对象以匹配相机噪声
US10757401B2 (en) Display system and method for display control of a video based on different view positions
KR102480916B1 (ko) 게임 엔진 선정 장치 및 이를 포함하는 콘텐츠 제작 시스템
US11983819B2 (en) Methods and systems for deforming a 3D body model based on a 2D image of an adorned subject
US11823343B1 (en) Method and device for modifying content according to various simulation characteristics
KR102396060B1 (ko) 전자 게임에서 카메라 뷰 변경
US12002132B1 (en) Rendering using analytic signed distance fields
US11282171B1 (en) Generating a computer graphic for a video frame
US11989404B1 (en) Time-based visualization of content anchored in time

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17910794

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17910794

Country of ref document: EP

Kind code of ref document: A1