WO2018176927A1 - Binocular rendering method and system for virtual active parallax calculation compensation - Google Patents

Binocular rendering method and system for virtual active parallax calculation compensation

Info

Publication number
WO2018176927A1
WO2018176927A1 PCT/CN2017/117130 CN2017117130W
Authority
WO
WIPO (PCT)
Prior art keywords
compensation
parallax
image
rendering
active
Prior art date
Application number
PCT/CN2017/117130
Other languages
English (en)
Chinese (zh)
Inventor
赵凤萍
邓金富
Original Assignee
上海讯陌通讯技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海讯陌通讯技术有限公司
Publication of WO2018176927A1


Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 - Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 - Processing image signals
    • H04N13/128 - Adjusting depth or disparity

Definitions

  • the present invention relates to the field of virtual reality image processing technologies, and in particular, to a binocular rendering method and system for virtual active disparity calculation compensation.
  • A person's two eyes are separated by a fixed horizontal distance, so when a three-dimensional object is observed, each eye, viewing from a different position, sees a slightly different image of the object; the difference between the two images is the binocular parallax (disparity).
  • It is this parallax which, once fused by the human brain, produces the perception of depth. How to compensate the rendering with a virtual active parallax method so as to enhance the look and feel of virtual reality is the focus of the present invention.
  • Imaging in virtual reality rests on simulating the two different, parallax-bearing images that the eyes receive in a natural environment. Monocular camera acquisition devices remain the mainstream (smartphones, tablets, notebooks, headsets, cameras, etc.), while binocular cameras are hard to popularize because of their specialized usage scenarios and physical hardware limitations. How to use a monocular camera to simulate the binocular framing effect and compensate the rendering through virtual active parallax is therefore a difficult and distinct problem.
  • The synchronization between the two images directly determines whether the user experiences vertigo, and differences in color, saturation and exposure between the two channels also degrade the virtual reality perception.
  • In addition, traditional binocular view content uses a fixed, immutable parallax.
  • To address this, the present invention proposes a binocular rendering method with virtual active disparity calculation compensation for monocular imaging devices.
  • The method strives to bring the parallax fitted for the viewer as close as possible to the parallax the human eyes experience when observing nature directly, and images the virtually generated disparity portions for the left and right eyes separately.
  • The two views are rendered in a single pass, which guarantees synchronization and yields an enhanced virtual reality experience.
  • A prior application, No. 201610666409.6, entitled "A virtual reality mobile dynamic time frame compensation rendering system and method", works as follows: application frame rendering generates an application frame buffer sequence, and the latest application frame in that sequence is rendered a second time to obtain a time frame.
  • The time frame is written to a shared buffer and, under the timing control of a vertical synchronization management module, is read out at screen refresh to display the time-frame rendering result. The GPU renders the result directly into the screen refresh buffer, reducing the latency of multi-level cache exchanges.
  • Vertical synchronization time management controls the GPU's rendering time and avoids conflicts between GPU rendering and screen-refresh reads, so that the picture is displayed at low latency without tearing.
  • The method of that document solves the problem of split-screen images becoming unsynchronized during split-screen rendering when the number of cached frames is relatively large, which is related to the parallax compensation and near-real-presence binocular rendering problem addressed by the present invention; the present invention instead performs pre-processing before the final render to the screen to achieve a comparable effect.
  • The method of another prior document mainly comprises: acquiring a parallax range preset by a virtual reality model, the parallax range being bounded by an upper parallax limit and a lower parallax limit; obtaining a stereoscopic content source and the disparity value corresponding to it, the stereoscopic content source being the content displayed in the virtual reality scene; and adjusting the disparity value of the stereoscopic content source according to the parallax range so that it does not exceed that range.
  • That invention can improve the display effect when a new stereoscopic view is shown within a stereoscopic scene.
  • The method of that document mainly focuses on improving the display effect of a new stereoscopic view shown within a stereoscopic scene, which is different from the parallax compensation and near-real-presence binocular rendering problem to be solved by the present invention.
  • In a further prior document, the projection transformation step includes adding a middle plane between the near plane and the far plane to serve as the projection plane, and projecting the primitives between the near plane and the far plane onto that middle plane.
  • That invention projects the primitives between the near and far planes onto the added middle plane: primitives between the near plane and the middle plane appear to pop out of the screen, while primitives between the middle plane and the far plane appear to recede into the screen. The existing "out-of-screen" and "into-screen" effects can thus be rendered with the existing rendering pipeline of a 3D display device, without special hardware.
  • The method of that document relies on the hardware characteristics of a 3D display device to render three-dimensional graphics, whereas the parallax-compensated binocular rendering of the present invention runs entirely in software inside the virtual reality device without relying on a 3D display device; it therefore addresses a different problem.
  • an object of the present invention is to provide a binocular rendering method and system for virtual active disparity calculation compensation.
  • the binocular rendering method for virtual active disparity calculation compensation provided by the present invention includes the following steps:
  • Active parallax calculation step: perform binocular rendering adaptation for a first-time user and obtain that user's dedicated adaptive parallax synchronization matrix;
  • Image disparity compensation and cropping step: use the user's dedicated adaptive parallax synchronization matrix as the cropping factor for virtual active disparity compensation, and crop a complete image into a left-eye/right-eye combination displayed in the left and right viewfinder windows, respectively;
  • Binocular rendering step: pre-render the two views of the same monocular camera image onto the waiting overlay canvas of the next field, and, at a fixed timing period, render and project the prepared overlay-canvas composite image in a single pass onto the display hardware layer of the target virtual reality head-mounted device.
  • the active disparity calculation step comprises:
  • Step A1: select several original images of different resolutions from the image library and, using these images, perform binocular rendering for the user wearing the virtual reality head-mounted device for the first time; the rendering is adapted one by one at M points at fixed positions in the original image, where M is a positive integer and the positions of the M points are chosen according to the form of the adaptive parallax synchronization matrix;
  • Step A2: obtain, from the user's adaptive feedback, the optimal parallax correction parameters and the visual range of each region of the original image, and generate the user's adaptive parallax synchronization matrix; the visual range of the original image is always larger than the individual visual range of either eye;
  • Step A3: automatically generate an active parallax compensation standard suited to the user's own eyes from the wearer's comfort feedback, and use this standard as the best-fit adjustment standard for every subsequent view.
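The patent itself contains no source code; the following is a minimal sketch, in Python with NumPy, of how steps A1 to A3 could be realized. The 3x3 grid (M = 9, matching the nine calibration points of FIG. 2), the feedback format (a per-point offset chosen by the wearer) and all function names are illustrative assumptions, not the patent's actual implementation.

```python
import numpy as np

def build_parallax_sync_matrix(feedback, grid=(3, 3)):
    """Build an adaptive parallax synchronization matrix from per-point feedback.

    feedback : dict mapping point index (1..9, row-major, point 1 top-left)
               to the (dx, dy) pixel offset the wearer judged most comfortable.
    grid     : layout of the calibration points (3 x 3 gives M = 9).
    Returns a grid[0] x grid[1] x 2 array of correction offsets.
    """
    rows, cols = grid
    matrix = np.zeros((rows, cols, 2), dtype=np.float32)
    for idx, (dx, dy) in feedback.items():
        r, c = divmod(idx - 1, cols)
        matrix[r, c] = (dx, dy)
    return matrix

def compensation_standard(per_image_matrices):
    """Step A3: merge the matrices gathered over several original images
    (different resolutions) into one active parallax compensation standard."""
    return np.mean(np.stack(per_image_matrices), axis=0)

# Example: feedback for points 1/3/5/7/9 (the minimum set); the remaining
# points keep a zero correction.
fb = {1: (4, 0), 3: (3, 1), 5: (2, 0), 7: (3, -1), 9: (4, 0)}
standard = compensation_standard([build_parallax_sync_matrix(fb)])
```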
  • The image disparity compensation and cropping step comprises:
  • Step B1: for the left and right eyes, the per-region parallax cropping factors are used as the cropping basis for the current wearer's virtual reality view, and the complete image is cropped into the optimal left-eye/right-eye combination. The optimal left-eye/right-eye combination is obtained as follows: the wearer determines the best-fit result according to his or her own wearing habits; that best-fit result is taken as the optimal adjustment target, the compensation matrix generated from the optimal adjustment target is saved, and this compensation matrix serves as the reference for dynamic adjustment. The compensation matrix is the calibration matrix, also called the optimal left-and-right-eye combination;
  • Step B2: the two split views obtained in step B1 are two subsets of the full set of original image pixels, and the two subsets have a non-empty intersection; cropping with the virtual active parallax compensation cropping factor makes the images seen by the left and right eyes conform to the principle of maximizing the optimal parallax compensation, namely: data from one or more active corrections are recorded as a set of candidate compensations, and over a period of wearing, the preferred compensation matrices for different lighting, scenes and wearing angles are recorded and screened, with the maximum binocular composite field of view as the auxiliary objective;
  • Step B3: pre-process the images subsequently obtained from the monocular camera using the per-view compensation parameters to form separate view images in the two coordinate systems; the per-view compensation parameters cover the image transformation and image cropping applied for different observation angles, observation distances, lighting and color.
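As an illustration of steps B1 and B2, the sketch below crops one full frame into overlapping left-eye and right-eye views by shifting a common crop window with per-eye compensation offsets. The window size, the sign convention of the offsets and the helper names are assumptions made only for this example; the patent specifies per-region cropping factors but not a concrete formula.

```python
import numpy as np

def crop_view(frame, offset, view_size):
    """Crop one eye's view from the full frame (the full set of pixels).

    offset    : (dx, dy) compensation offset for this eye, in pixels.
    view_size : (height, width) of the per-eye viewfinder window.
    """
    h, w = frame.shape[:2]
    vh, vw = view_size
    # Centre the crop window, then shift it by the compensation offset.
    y0 = int(np.clip((h - vh) // 2 + offset[1], 0, h - vh))
    x0 = int(np.clip((w - vw) // 2 + offset[0], 0, w - vw))
    return frame[y0:y0 + vh, x0:x0 + vw]

def split_into_views(frame, left_offset, right_offset, view_size):
    """Step B1/B2: the two crops are overlapping subsets of the original pixels."""
    return (crop_view(frame, left_offset, view_size),
            crop_view(frame, right_offset, view_size))

# Example: a 1080p monocular frame cropped into two 960x1000 views whose
# horizontal offsets differ, so the views overlap but are not identical.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
left, right = split_into_views(frame, (-20, 0), (20, 0), (1000, 960))
```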
  • The fixed timing period in the binocular rendering step means that each image is rendered at a specified frequency.
  • a binocular rendering system for virtual active disparity calculation compensation includes the following modules:
  • Active disparity calculation module: performs binocular rendering adaptation for a first-time user and obtains that user's dedicated adaptive parallax synchronization matrix;
  • Image disparity compensation and cropping module: uses the user's dedicated adaptive parallax synchronization matrix as the cropping factor for virtual active disparity compensation, and crops a complete image into a left-eye/right-eye combination displayed in the left and right viewfinder windows, respectively;
  • Binocular rendering module: pre-renders the two views of the same monocular camera image onto the waiting overlay canvas of the next field, and renders and projects the prepared overlay-canvas composite image in a single pass onto the hardware layer of the target virtual reality display device at the fixed timing period.
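A minimal structural sketch of how the three modules could be chained is given below. The class name, method names and the callable interfaces are illustrative assumptions; only the division of responsibilities follows the description above.

```python
from typing import Callable, Dict, Tuple
import numpy as np

Frame = np.ndarray

class BinocularRenderingSystem:
    """Chains the three modules: calibration, compensated cropping, one-pass rendering."""

    def __init__(self,
                 calculate_matrix: Callable[[Dict[int, Tuple[int, int]]], np.ndarray],
                 crop_views: Callable[[Frame, np.ndarray], Tuple[Frame, Frame]],
                 render_once: Callable[[Frame, Frame], Frame]):
        self.calculate_matrix = calculate_matrix  # active disparity calculation module
        self.crop_views = crop_views              # image disparity compensation and cropping module
        self.render_once = render_once            # binocular rendering module
        self.sync_matrix = None

    def calibrate(self, feedback: Dict[int, Tuple[int, int]]) -> None:
        """Run once for a first-time user; keep the dedicated sync matrix."""
        self.sync_matrix = self.calculate_matrix(feedback)

    def process_frame(self, frame: Frame) -> Frame:
        """Crop one monocular frame into both views and composite them for a single present."""
        left, right = self.crop_views(frame, self.sync_matrix)
        return self.render_once(left, right)
```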
  • the active disparity calculation module includes:
  • Parallax synchronization matrix selection sub-module: selects several original images of different resolutions from the image library and, using these images, performs binocular rendering for the user wearing the virtual reality head-mounted device for the first time; the rendering is adapted one by one at M points at fixed positions in the original image, where M is a positive integer and the positions of the M points are chosen according to the form of the adaptive parallax synchronization matrix;
  • Adaptive parallax synchronization matrix generation sub-module: obtains, from the user's adaptive feedback, the optimal parallax correction parameters and the visual range of each region of the original image, and generates the user's adaptive parallax synchronization matrix; the visual range of the original image is always larger than the individual visual range of either eye;
  • Best-fit calibration standard generation sub-module: automatically generates an active parallax compensation standard suited to the user's own eyes from the wearer's comfort feedback, and uses this standard as the best-fit adjustment standard for every subsequent view.
  • the image disparity compensation and cropping module comprises:
  • the wearer determines the best-fit result according to his or her own wearing habits; that best-fit result is taken as the optimal adjustment target, the compensation matrix generated from the optimal adjustment target is saved, and this compensation matrix serves as the reference for dynamic adjustment; the compensation matrix is the calibration matrix, also called the optimal left-and-right-eye combination;
  • Cropping sub-module: the two split views obtained by cropping are two subsets of the full set of original image pixels, and the two subsets have a non-empty intersection; the cropping is compensated with the virtual active parallax compensation cropping factor so that the images seen by the left and right eyes conform to the principle of maximizing the optimal parallax compensation, namely: data from one or more active corrections are recorded as a set of candidate compensations, and over a period of wearing, the preferred compensation matrices for different lighting, scenes and wearing angles are recorded and screened, with the maximum binocular composite field of view as the auxiliary objective;
  • Per-view image generation sub-module: pre-processes the images subsequently obtained from the monocular camera using the per-view compensation parameters to form separate view images in the two coordinate systems; the per-view compensation parameters describe the observation environment, namely the observation angle, observation distance, brightness and color used when the image is transformed and cropped.
  • the present invention has the following beneficial effects:
  • The binocular rendering method with virtual active parallax calculation compensation uses a single camera as the hardware basis of the image acquisition device; through software parallax calculation and compensation, it approximates the effect of the wearer directly observing nature with his or her own eyes and reduces the vertigo-inducing ghosting. The optimal parallax adjustment is obtained actively according to the wearer's individual differences, and the cropped single view source is compensated in reverse, achieving a more realistic immersive impression.
  • The method ensures timing synchronization during binocular rendering and keeps the color and luminosity of the two imaging paths consistent, which helps the system dynamically adjust the rendered content for the wearer so as to achieve a binocular virtual effect close to reality.
  • The method provides targeted and precise adjustment of the parallax of the view content, satisfying both the synchronization requirement and the individual differences in parallax, and thereby reducing dizziness.
  • FIG. 1 is a flowchart of the binocular rendering method for virtual active disparity calculation compensation provided by the present invention;
  • FIG. 2 is a schematic diagram of the left-eye and right-eye pre-calibration for virtual active disparity calculation compensation provided by the present invention;
  • FIG. 3 is a diagram of the best-fit binocular standard generated for a user;
  • FIG. 4 is a schematic diagram of the result of the cropping compensation;
  • FIG. 5 is a schematic diagram of the newly composed rendered view.
  • the binocular rendering method for virtual active disparity calculation compensation provided by the present invention includes the following steps:
  • Active parallax calculation step: perform binocular rendering adaptation for a first-time user to obtain that user's dedicated adaptive parallax synchronization matrix.
  • The active parallax calculation in the present invention is a virtual active parallax calculation: when the user first uses the device, several binocular images are rendered to the wearer.
  • The nine point positions shown in FIG. 2 are adapted one by one (a minimum of three of the points 1/3/5/7/9), the optimal parallax correction parameters and the visual range of each region are obtained from the user's adaptive feedback, and a wearer-specific adaptive parallax synchronization matrix is generated in turn.
  • The visual range of the original image is always larger than the individual visual range of either eye.
  • In this way an active parallax compensation standard suited to the viewer's own eyes is obtained, and compensation based on it becomes the best-fit adjustment for every subsequent view, as shown in FIG. 3.
  • In the image disparity compensation and cropping step, the user's dedicated adaptive parallax synchronization matrix is used as the cropping factor for virtual active parallax compensation, and a complete image is cropped into a left-eye/right-eye combination displayed in the left and right viewfinder windows, respectively.
  • Specifically, the adaptive parallax synchronization matrix that best fits the wearer is used as the cropping factor for virtual active disparity compensation. The original image is treated as the full set of pixels; for the left and right eyes, the per-region parallax cropping factors serve as the cropping basis for the current wearer's virtual reality view, the complete image is cropped into the optimal left-eye/right-eye combination, and each view is displayed on its own path in the left and right viewfinder windows.
  • Each of the two split views thus cropped is a subset of the full set, and the two subsets overlap while also having non-overlapping portions.
  • The result of the split compensation cropping is shown in FIG. 4.
  • Binocular rendering step: pre-render the two views of the same monocular camera image onto the waiting overlay canvas of the next field, and, at the fixed timing period, project the prepared overlay composite image onto the target display hardware layer in a single pass.
  • The rendering composes the two views derived from the same monocular camera image, and the pre-rendering is composited onto the waiting overlay canvas of the next field.
  • The prepared overlay canvas is then rendered and projected in a single pass onto the target display hardware layer. Because the pre-processing binocular rendering has already ensured the optimal combination of color, chromatic aberration and parallax, and the image has been composited within the fixed timing cycle, the optimal binocular timing combination is guaranteed, as shown in FIG. 5.
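The sketch below illustrates the fixed-timing behaviour described above: both views are pre-composited onto a single overlay canvas for the next field, and the finished canvas is presented once per fixed period. The side-by-side canvas layout, the 1/60 s default period and the present callback are assumptions for illustration; an actual head-mounted device would hand the canvas to its display driver under vertical-synchronization control.

```python
import time
import numpy as np

def compose_overlay_canvas(left, right):
    """Pre-render both eye views onto one canvas so that a single draw call
    keeps the two views synchronized in time, color and luminosity."""
    return np.concatenate([left, right], axis=1)   # side-by-side layout

def render_loop(next_views, present, period=1.0 / 60.0):
    """next_views(): yields (left, right) pairs prepared for the next field.
    present(canvas): pushes the composite canvas to the display in one pass."""
    deadline = time.monotonic()
    for left, right in next_views():
        canvas = compose_overlay_canvas(left, right)   # prepared while waiting
        deadline += period
        delay = deadline - time.monotonic()
        if delay > 0:                                  # hold to the fixed timing period
            time.sleep(delay)
        present(canvas)                                # one projection per period
```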
  • the binocular rendering method for virtual active disparity calculation compensation includes the following steps:
  • Step S1: randomly select several original images of different resolutions from the template library;
  • Step S2: pre-mark 5 to 9 standard parallax calibration crosses on the selected original images;
  • Step S3: tentatively project the composited regions separately in the binocular display;
  • Step S4: the wearer adjusts the calibration crosses according to his or her own perception; if a vertigo-free binocular fit that reaches the maximum viewing angle is achieved, proceed to step S5, otherwise return to step S3;
  • Step S5: store the final compensation adjustment parameters of the calibration crosses determined by the wearer's own perception, and adjust multiple sets of compensation parameters for the different resolutions;
  • Step S6: start frame-by-frame processing of the actual single-view video data stream, perform reverse cropping based on the prepared parallax compensation adjustment values, and crop the left-eye frame and the right-eye frame from the original image respectively;
  • Step S7: pre-composite the two differently cropped views into a new composite image whose overall size is larger than the original view, waiting to be rendered in the binocular viewfinder;
  • Step S8: render the new picture in a single pass.
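A compact sketch of this embodiment's flow is given below: an interactive calibration loop (steps S1 to S5) stores one compensation parameter set per resolution, after which the single-view video stream is processed frame by frame (steps S6 to S8). Every helper passed in (mark_crosses, project_for_trial, wearer_accepts, read_adjustment, reverse_crop, compose, render_once) is a hypothetical placeholder; the patent defines the steps but not their code.

```python
import random

def calibrate(template_library, resolutions, mark_crosses, project_for_trial,
              wearer_accepts, read_adjustment):
    """Steps S1-S5: interactive calibration, one parameter set per resolution."""
    params = {}
    for res in resolutions:
        image = random.choice(template_library[res])        # S1: random original image
        crosses = mark_crosses(image, count=9)               # S2: 5-9 standard crosses
        adjustment = None
        while True:
            project_for_trial(image, crosses, adjustment)    # S3: trial projection
            if wearer_accepts():                             # S4: vertigo-free, max viewing angle
                break
            adjustment = read_adjustment()                   # wearer re-adjusts the crosses
        params[res] = adjustment                             # S5: store final parameters
    return params

def run_stream(frames, params, reverse_crop, compose, render_once):
    """Steps S6-S8: reverse-crop each frame, composite, render in one pass."""
    for frame in frames:                                     # S6: frame-by-frame processing
        left, right = reverse_crop(frame, params)            # cut L/R from the original
        canvas = compose(left, right)                        # S7: larger composite image
        render_once(canvas)                                  # S8: one-pass render
```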
  • In this way, virtual active parallax compensation rendering can be performed for the user on the virtual reality device side, providing an immersive entertainment experience with low vertigo-inducing ghosting and high image quality.
  • Embodiment 2 (360 degree panoramic broadcast)
  • The system provided by the present invention and its various devices may be implemented not only in purely computer-readable program code; by logically programming the method steps, the same functions can also be achieved with logic gates, switches, ASICs, programmable logic controllers and embedded microcontrollers. Therefore, the system and its various devices provided by the present invention can be regarded as a hardware component, and the devices included in it for implementing various functions can be regarded as structures within the hardware component; a device for implementing various functions can likewise be regarded either as a software module implementing a method or as a structure within the hardware component.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to a binocular rendering method and system for virtual active parallax calculation compensation. The method comprises the following steps: active parallax calculation: performing binocular rendering adaptation on an initial user in order to obtain a dedicated adaptive parallax synchronization matrix corresponding to the user; image parallax compensation and cropping: using the user's dedicated adaptive parallax synchronization matrix as the cropping factor for virtual active parallax compensation, and cropping a complete image into left-eye and right-eye groups; and binocular rendering: pre-rendering dichoptic images of the same monocular camera onto an overlay canvas awaiting the next scene, and rendering and projecting, at a fixed timing period, a composite image of the prepared overlay canvas onto a target display hardware layer in a single pass. According to the present invention, a single camera is used as the basic image acquisition hardware, and the vertigo-inducing ghosting effect is reduced since, after software-based parallax calculation compensation, the effect approaches that of a wearer directly observing natural objects with his or her own eyes. An optimal parallax adjustment is obtained actively, and a cropped single view source is compensated in reverse, thereby achieving a more immersive visual effect.
PCT/CN2017/117130 2017-04-01 2017-12-19 Binocular rendering method and system for virtual active parallax calculation compensation WO2018176927A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710214653.3A CN107071384B (zh) 2017-04-01 2017-04-01 Binocular rendering method and system for virtual active parallax calculation compensation
CN201710214653.3 2017-04-01

Publications (1)

Publication Number Publication Date
WO2018176927A1 true WO2018176927A1 (fr) 2018-10-04

Family

ID=59602891

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/117130 WO2018176927A1 (fr) 2017-12-19 Binocular rendering method and system for virtual active parallax calculation compensation

Country Status (2)

Country Link
CN (1) CN107071384B (fr)
WO (1) WO2018176927A1 (fr)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107071384B (zh) * 2017-04-01 2018-07-06 上海讯陌通讯技术有限公司 虚拟主动视差计算补偿的双目渲染方法及系统
CN110072049B (zh) * 2019-03-26 2021-11-09 Oppo广东移动通信有限公司 图像处理方法和装置、电子设备、计算机可读存储介质
CN110177216B (zh) * 2019-06-28 2021-06-15 Oppo广东移动通信有限公司 图像处理方法、装置、移动终端以及存储介质
CN111202663B (zh) * 2019-12-31 2022-12-27 浙江工业大学 一种基于vr技术的视觉训练学习系统
CN113010020A (zh) * 2021-05-25 2021-06-22 北京芯海视界三维科技有限公司 一种时序控制器和显示设备
CN115955554B (zh) * 2022-06-27 2024-07-12 浙江传媒学院 一种图层化的虚实融合视频制作方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005060271A1 (fr) * 2003-12-18 2005-06-30 University Of Durham Procede et appareil destines a generer une image stereoscopique
CN102811359A (zh) * 2011-06-01 2012-12-05 三星电子株式会社 3d图像转换设备和调整3d图像转换设备的深度信息的方法
CN103039080A (zh) * 2010-06-28 2013-04-10 汤姆森特许公司 定制立体内容的3维效果的方法和装置
CN106507093A (zh) * 2016-09-26 2017-03-15 北京小鸟看看科技有限公司 一种虚拟现实设备的显示模式切换方法和装置
CN107071384A (zh) * 2017-04-01 2017-08-18 上海讯陌通讯技术有限公司 虚拟主动视差计算补偿的双目渲染方法及系统

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102300103B (zh) * 2010-06-25 2013-08-14 深圳Tcl新技术有限公司 一种将2d内容转换成3d内容的方法
JP2012174237A (ja) * 2011-02-24 2012-09-10 Nintendo Co Ltd 表示制御プログラム、表示制御装置、表示制御システム、及び表示制御方法
CN103905806B (zh) * 2012-12-26 2018-05-01 三星电子(中国)研发中心 利用单摄像头实现3d拍摄的系统和方法
CN105611278B (zh) * 2016-02-01 2018-10-02 欧洲电子有限公司 防裸眼3d观看眩晕感的图像处理方法及系统和显示设备
CN106251403B (zh) * 2016-06-12 2018-02-16 深圳超多维光电子有限公司 一种虚拟立体场景实现的方法、装置和系统
CN106101689B (zh) * 2016-06-13 2018-03-06 西安电子科技大学 利用手机单目摄像头对虚拟现实眼镜进行增强现实的方法
CN106385576B (zh) * 2016-09-07 2017-12-08 深圳超多维科技有限公司 立体虚拟现实直播方法、装置及电子设备
CN106231292B (zh) * 2016-09-07 2017-08-25 深圳超多维科技有限公司 一种立体虚拟现实直播方法、装置及设备
CN106375749B (zh) * 2016-09-12 2018-06-29 北京邮电大学 一种视差调整方法及装置
CN106454313A (zh) * 2016-10-18 2017-02-22 深圳市云宙多媒体技术有限公司 一种3d视频图像渲染及视差调节方法和系统

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005060271A1 (fr) * 2003-12-18 2005-06-30 University Of Durham Procede et appareil destines a generer une image stereoscopique
CN103039080A (zh) * 2010-06-28 2013-04-10 汤姆森特许公司 定制立体内容的3维效果的方法和装置
CN102811359A (zh) * 2011-06-01 2012-12-05 三星电子株式会社 3d图像转换设备和调整3d图像转换设备的深度信息的方法
CN106507093A (zh) * 2016-09-26 2017-03-15 北京小鸟看看科技有限公司 一种虚拟现实设备的显示模式切换方法和装置
CN107071384A (zh) * 2017-04-01 2017-08-18 上海讯陌通讯技术有限公司 虚拟主动视差计算补偿的双目渲染方法及系统

Also Published As

Publication number Publication date
CN107071384B (zh) 2018-07-06
CN107071384A (zh) 2017-08-18

Similar Documents

Publication Publication Date Title
WO2018176927A1 (fr) Procédé et système de rendu binoculaire destinés à une compensation de calcul de parallaxe active virtuelle
US11076142B2 (en) Real-time aliasing rendering method for 3D VR video and virtual three-dimensional scene
JP2010033367A (ja) 情報処理装置及び情報処理方法
US11962746B2 (en) Wide-angle stereoscopic vision with cameras having different parameters
US10553014B2 (en) Image generating method, device and computer executable non-volatile storage medium
CN103329165B (zh) 放缩三维场景中的用户控制的虚拟对象的像素深度值
JP6384940B2 (ja) 3d画像の表示方法及びヘッドマウント機器
JP2010154422A (ja) 画像処理装置
CN101562754A (zh) 一种改善平面图像转3d图像视觉效果的方法
JP2018500690A (ja) 拡大3d画像を生成するための方法およびシステム
TWI589150B (zh) 3d自動對焦顯示方法及其系統
US20130208097A1 (en) Three-dimensional imaging system and image reproducing method thereof
JP2011529285A (ja) 再現描写メディア中への両眼ステレオ情報の包含のための合成構造、メカニズムおよびプロセス
WO2013133057A1 (fr) Appareil, procédé et programme de traitement d'image
WO2023056803A1 (fr) Procédé et appareil de présentation holographique
WO2017085803A1 (fr) Dispositif d'affichage vidéo, et procédé d'affichage vidéo
Wu et al. P‐94: Free‐form micro‐optical design for enhancing image quality (MTF) at large FOV in light field near eye display
Mikšícek Causes of visual fatigue and its improvements in stereoscopy
Brooker et al. Operator performance evaluation of controlled depth of field in a stereographically displayed virtual environment
JP5539486B2 (ja) 情報処理装置及び情報処理方法
CN115334296B (zh) 一种立体图像显示方法和显示装置
KR20170096567A (ko) 가상현실용 몰입형 디스플레이 시스템 및 이를 이용한 가상현실 디스플레이 방법
CN202713529U (zh) 一种3d视频色彩校正器
JP2003101690A (ja) 画像処理方法及びデジタルカメラ並びに記録媒体
Sasaki et al. 8‐2: Invited Paper: Hyper‐Realistic Head‐up Display System for Medical Application

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17904098

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17904098

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 26.11.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17904098

Country of ref document: EP

Kind code of ref document: A1