WO2015067071A1 - Method and apparatus for achieving transformation of a virtual view into a three-dimensional view - Google Patents
Method and apparatus for achieving transformation of a virtual view into a three-dimensional view
- Publication number
- WO2015067071A1 WO2015067071A1 PCT/CN2014/082831 CN2014082831W WO2015067071A1 WO 2015067071 A1 WO2015067071 A1 WO 2015067071A1 CN 2014082831 W CN2014082831 W CN 2014082831W WO 2015067071 A1 WO2015067071 A1 WO 2015067071A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- axis
- view
- angle
- miscut
- viewpoint
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/363—Image reproducers using image projection screens
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03H—HOLOGRAPHIC PROCESSES OR APPARATUS
- G03H1/00—Holographic processes or apparatus using light, infrared or ultraviolet waves for obtaining holograms or for obtaining an image from them; Details peculiar thereto
- G03H1/04—Processes or apparatus for producing holograms
- G03H1/08—Synthesising holograms, i.e. holograms synthesized from objects or objects from holograms
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformation in the plane of the image
- G06T3/60—Rotation of a whole image or part thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/275—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
- H04N13/279—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
- H04N13/383—Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
Definitions
- the present invention relates to the field of 3D display technologies, and in particular, to a method and apparatus for implementing a virtual view to a stereoscopic view.
- the existing 2D-to-3D technology converts existing 2D video into 3D video through a view conversion method, but because the technology is not mature enough, the conversion is time-consuming and costly, and the 3D effect obtained by the conversion is not ideal, which has hindered the development of the 3D industry.
- the present invention provides a method and apparatus for realizing a virtual view to a stereoscopic view, so that a user can adjust a 3D effect to obtain a better holographic 3D visual experience.
- the present invention provides a method and apparatus for implementing a virtual view to a stereoscopic view.
- the position coordinates of the human eye are tracked to determine the rotation angle of the virtual scene, and the virtual scene is then rotated to obtain a virtual holographic stereoscopic view matrix.
- the miscut matrix is right-multiplied with the matrix of each viewpoint model, the image of each viewpoint is obtained after projection, and, according to the user's experienced 3D effect, the position of the observer in the scene and the miscut angle are adjusted to finally obtain a better 3D effect.
- the first technical solution provided by the present invention provides a method for implementing a virtual view to a stereoscopic view, and the method includes the following steps:
- the method further comprises the steps of:
- the user adjusts the miscut angle of the second image processing module and the position coordinates of the observer in the scene according to the experienced 3D effect, thereby improving the 3D effect of the projected 3D image.
- the virtual scene view matrix before rotation is represented by A, and the virtual holographic stereoscopic view matrix after rotation is represented by A′.
- A is right-multiplied by the rotation matrices M1 and M2 to obtain the rotated view A′, where, in the space rectangular coordinate system O-XYZ before the rotation, the center of the screen is located at the origin O of the coordinate system O-XYZ, the angle between the projection in the XOZ plane of the line connecting the human eye to the center of the screen and the positive half-axis of the Z axis is α, and the angle between the projection of that line in the YOZ plane and the positive half-axis of the Z axis is β.
- the X-axis direction points from the midpoint of the left side of the screen to the midpoint of the right side of the screen.
- the Y-axis direction points from the midpoint of the upper side of the screen to the midpoint of the lower side of the screen.
- the user adjusts the miscut angle of the second image processing module and the position coordinates of the observer in the scene according to the experienced 3D effect, thereby improving the 3D effect of the projected 3D image.
- the new coordinate system after rotation is represented by O′-X′Y′Z′, where the origin O′ coincides with the center position of the viewpoints in the original coordinate system, and the positive direction of the Z′ axis points from the observer's coordinates in the original coordinate system to the center coordinates of the viewpoints.
- the miscut transformation means that the y′ and z′ of a viewpoint are unchanged, while its x′ value is linearly transformed with the z′ axis as the dependent axis.
- the miscut angle θ refers to the angle between the viewpoint coordinate and the positive direction of the z′ axis.
- the coordinates of any viewpoint after the miscut are represented by (x″, y″, z″); for a viewpoint located on the negative half-axis of the X′ axis, the miscut expressions are:
- the corresponding miscut matrices are:
- for a viewpoint located on the positive half-axis of the X′ axis, the corresponding miscut matrices are:
- the method further comprises the steps of:
- the user adjusts the miscut angle of the second image processing module and the position coordinates of the observer in the scene according to the experienced 3D effect, thereby improving the 3D effect of the projected 3D image.
- the second technical solution provided by the present invention provides a device for implementing a virtual view to a stereoscopic view, and the device includes:
- a human eye tracking module for capturing human eye position coordinates
- a first image processing module is electrically connected to the human eye tracking module, configured to determine a rotation angle of the virtual scene according to the position coordinates of the human eye and the center coordinates of the screen of the projection display module, and rotate the virtual scene according to the rotation angle to obtain a virtual holographic stereoscopic view matrix.
- the second image processing module is electrically connected to the first image processing module, and is configured to determine a miscut angle of each viewpoint according to the center of the virtual scene, the position of the observer in the scene, and the coordinates of each viewpoint, thereby generating miscut matrices corresponding to the viewpoints; each miscut matrix is right-multiplied with the corresponding viewpoint model matrix to generate left and right views;
- the projection display module is electrically connected to the second image processing module for projecting left and right views of the respective viewpoints.
- the second image processing module is further configured to change the miscut angle and the position coordinates of the observer in the scene according to the input of the user, thereby improving the stereoscopic effect of the stereoscopic image obtained by the projection.
- the virtual scene view matrix before the rotation is represented by A, and the virtual holographic stereoscopic view matrix is represented by A′.
- A is right-multiplied by the rotation matrices M1 and M2 to obtain the rotated view A′, wherein the space rectangular coordinate system before the rotation is represented by O-XYZ, the center of the screen is located at the origin O of the coordinate system O-XYZ, the angle between the projection in the XOZ plane of the line connecting the human eye to the center of the screen and the positive half-axis of the Z axis is α, and the angle between the projection of that line in the YOZ plane and the positive half-axis of the Z axis is β.
- the X axis points from the midpoint of the left side of the screen to the midpoint of the right side of the screen, and the Y axis points from the midpoint of the upper side of the screen to the midpoint of the lower side of the screen.
- the second image processing module is further configured to change the miscut angle and the position coordinates of the observer in the scene according to the input of the user, thereby improving the stereoscopic effect of the stereoscopic image obtained by the projection.
- the new coordinate system after rotation is represented by O′-X′Y′Z′, wherein the origin O′ coincides with the center position of the viewpoints in the original coordinate system, and the positive direction of the Z′ axis points from the observer's coordinates in the original coordinate system to the center coordinates of the viewpoints.
- the miscut transformation means that the y′ and z′ of a viewpoint are unchanged, while its x′ value is linearly transformed with the z′ axis as the dependent axis; the miscut angle θ refers to the angle between the viewpoint coordinate and the positive direction of the z′ axis; the coordinates of a viewpoint after the miscut are represented by (x″, y″, z″); then the miscut expressions for a viewpoint on the negative half-axis of the X′ axis are:
- the corresponding miscut matrices are:
- for a viewpoint located on the positive half-axis of the X′ axis, the corresponding miscut matrices are:
- the second image processing module is further configured to change the miscut angle and the position coordinates of the observer in the scene according to the input of the user, thereby improving the stereoscopic effect of the stereoscopic image obtained by the projection.
- the present invention provides a method and apparatus for implementing a virtual view to a stereoscopic view: the rotation angle of the virtual scene is determined by tracking the dynamic coordinates of the human eye; the virtual holographic stereoscopic view matrix is then obtained by rotating the virtual scene; the miscut matrix is then right-multiplied with each viewpoint model matrix, and the image of each viewpoint is obtained after projection; and, according to the user's experienced 3D effect, the position of the observer in the scene and the miscut angle are adjusted to finally obtain a better 3D effect.
- FIG. 1 is a schematic flow chart of an embodiment of a method for implementing a virtual view to a stereoscopic view of the present invention
- FIG. 2 is a schematic view showing the angles between the human eye position coordinates and the screen in the embodiment shown in FIG. 1;
- FIG. 3 is a schematic view showing the relationship between the angle of rotation of the virtual scene around the Y-axis and the position of the human eye, the center of the scene, and the position of the screen in the embodiment shown in FIG. 1;
- FIG. 4 is a schematic diagram showing the relationship between the miscut angle and the position of the observer, the center of the viewpoint, and the position of the viewpoint in the embodiment shown in FIG. 1;
- FIG. 5 is a schematic structural diagram of an embodiment of an apparatus for implementing a virtual view to a stereoscopic view of the present invention.
- FIG. 1 is a schematic flow chart of an embodiment of a method for implementing a virtual view to a stereoscopic view according to the present invention. As shown in FIG. 1 , the process of implementing a virtual view to a stereo view in this embodiment includes the following steps:
- FIG. 2 is a schematic diagram of the angles between the human eye position coordinates and the screen in the embodiment shown in FIG. 1. As shown in FIG. 2, before the rotation of the virtual scene, in the space rectangular coordinate system O-XYZ the center of the screen is located at the origin O of the coordinate system O-XYZ, the angle between the projection in the XOZ plane of the line from the human eye to the screen center O and the positive half-axis of the Z axis is α, and the angle between the projection of that line in the YOZ plane and the positive half-axis of the Z axis is β.
- the X-axis direction points from the midpoint of the left side of the screen to the midpoint of the right side of the screen, and the Y-axis direction points from the midpoint of the upper side of the screen to the midpoint of the lower side of the screen.
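Given the screen-centered coordinate system above, the two angles can be computed from a tracked eye position with atan2. This is an illustrative sketch only; the function name and the assumption that the tracker reports eye coordinates (x_e, y_e, z_e) in the O-XYZ system are not taken from the patent.

```python
import math

def eye_angles(x_e, y_e, z_e):
    """Angles between the eye-to-screen-center line and the Z axis.

    alpha: angle of the line's projection in the XOZ plane with the
           positive half-axis of the Z axis.
    beta:  angle of the line's projection in the YOZ plane with the
           positive half-axis of the Z axis.
    """
    alpha = math.atan2(x_e, z_e)  # projection onto the XOZ plane
    beta = math.atan2(y_e, z_e)   # projection onto the YOZ plane
    return alpha, beta
```

An eye located on the Z axis gives α = β = 0, i.e. the viewer looks straight at the screen center and no scene rotation is needed.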
- the use of the human eye tracking module makes the projected image change as the human eye moves to different positions, so that the user can still experience a good 3D effect while moving;
- the first image processing module determines a rotation angle of the virtual scene according to the position coordinates of the human eye and the coordinates of the center of the screen of the projection display module, and rotates the virtual scene according to the rotation angle to obtain a virtual holographic stereoscopic view matrix.
- FIG. 3 is a schematic view showing the relationship between the rotation angle of the virtual scene around the Y axis and the position of the human eye, the center of the scene, and the position of the screen in the embodiment shown in FIG. 1. As shown in FIG. 3, the distance from the human eye to the screen on the plane XOZ is L, and the distance from the center of the virtual scene to the screen is Z_center.
- A is right-multiplied by the rotation matrices M1 and M2 to obtain the rotated view A′;
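The patent text does not reproduce M1 and M2 themselves. As a sketch under stated assumptions only — that M1 rotates the scene about the Y axis by α, M2 about the X axis by β, and that the view matrix is right-multiplied in 4×4 homogeneous coordinates — the step can be written as:

```python
import math

def mat_mul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rot_y(angle):
    """Rotation about the Y axis (assumed form of M1)."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, 0.0, s, 0.0],
            [0.0, 1.0, 0.0, 0.0],
            [-s, 0.0, c, 0.0],
            [0.0, 0.0, 0.0, 1.0]]

def rot_x(angle):
    """Rotation about the X axis (assumed form of M2)."""
    c, s = math.cos(angle), math.sin(angle)
    return [[1.0, 0.0, 0.0, 0.0],
            [0.0, c, -s, 0.0],
            [0.0, s, c, 0.0],
            [0.0, 0.0, 0.0, 1.0]]

def rotated_view(A, alpha, beta):
    # A' = A * M1 * M2: right-multiply the virtual scene view matrix
    # by the two rotations derived from the tracked eye position.
    return mat_mul(mat_mul(A, rot_y(alpha)), rot_x(beta))

IDENTITY = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
```

With α = β = 0 both rotations are the identity and A′ = A, matching the case of a viewer centered in front of the screen.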
- the second image processing module determines a miscut angle of each viewpoint according to the center of the virtual scene, the position of the observer in the scene, and the coordinates of each viewpoint, thereby generating a miscut matrix corresponding to each viewpoint; each miscut matrix is right-multiplied with the corresponding viewpoint model matrix to generate the left and right views;
- FIG. 4 is a schematic diagram of the relationship between the miscut angle and the position of the observer, the center of the viewpoints, and the positions of the viewpoints in the embodiment shown in FIG. 1. The positive direction of the Z′ axis points from the observer's coordinate Z_G to the center coordinates of the viewpoints.
- the miscut transformation leaves the y′ and z′ of a viewpoint unchanged, while its x′ value is linearly transformed with the Z′ axis as the dependent axis.
- the miscut angle θ refers to the angle between the viewpoint coordinate and the positive direction of the z′ axis.
- FIG. 4 shows four viewpoints: viewpoint 1, viewpoint 2, viewpoint 3, and viewpoint 4, wherein viewpoint 1 and viewpoint 4 are a pair of viewpoints corresponding to the left view and the right view, respectively, and viewpoint 2 and viewpoint 3 are a pair of viewpoints corresponding to the left view and the right view, respectively. The angle between viewpoint 3 and the positive direction of the Z′ axis in FIG. 4 is θ, and the coordinates of any viewpoint after the miscut are represented by (x″, y″, z″); the miscut expressions for viewpoint 2 on the negative half-axis of the X′ axis are:
- the corresponding miscut matrices are:
- for a viewpoint located on the positive half-axis of the X′ axis, the corresponding miscut matrices are:
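The source omits the images that carried these formulas. A reconstruction consistent with the surrounding definitions — y′ and z′ unchanged, x′ sheared with z′ as the dependent axis, the shear anchored at the observer depth Z_G, and a point with z′ > Z_G displaced in the negative x′ direction — would be, for the negative half-axis viewpoint (this is an inferred reconstruction, not the patent's literal formula):

```latex
x'' = x' - (z' - Z_G)\tan\theta, \qquad y'' = y', \qquad z'' = z'
\]
\[
\begin{pmatrix} x'' \\ y'' \\ z'' \\ 1 \end{pmatrix}
=
\begin{pmatrix}
1 & 0 & -\tan\theta & Z_G \tan\theta \\
0 & 1 & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{pmatrix}
\begin{pmatrix} x' \\ y' \\ z' \\ 1 \end{pmatrix}
```

For the paired viewpoint on the positive half-axis of X′, the signs of the two shear terms are reversed, giving the opposite miscut direction with the same angle θ.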
- the method flow shown in FIG. 1 further includes the steps of:
- the user adjusts the miscut angle of the second image processing module and the position coordinates of the observer in the scene according to the experienced 3D effect, thereby improving the 3D effect of the projected 3D image.
- when the z′ coordinate of a point is larger than the observer's coordinate Z_G, the point moves in the negative direction of the x′ axis during the miscut; when the z′ coordinate is smaller than Z_G, the point moves in the positive direction of the x′ axis. The miscut directions of viewpoint 2 and viewpoint 3 are opposite, while the miscut angle is the same.
- the projection display module allows the user to experience the holographic stereo view by projecting the miscut view.
- the user can adjust the miscut angle of the second image processing module and the position of the observer in the scene according to his own experience, thereby improving the 3D effect of the projected view.
- the user improves the stereoscopic effect of the projected image by changing the sizes of Z_G and θ: when Z_G is increased, z − Z_G decreases and the stereoscopic effect is weakened, and vice versa the stereoscopic effect is enhanced; when θ is increased (0 < θ < π/2), tan θ increases and the stereoscopic effect of the projected image is enhanced, and conversely it is weakened.
- the method for realizing a virtual view to a stereoscopic view of the present invention can achieve a better 3D stereoscopic effect by appropriately modifying Z_G and θ, and the dynamic tracking of the human eye position coordinates used in the embodiment of the present invention allows the user to view a better holographic stereoscopic view while moving, avoiding the inconvenience that the user could otherwise experience a good holographic stereoscopic view only at a certain fixed point.
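The tuning behavior above follows directly if the miscut displaces a point at depth z′ by −(z′ − Z_G)·tan θ, which is an assumed relation consistent with the description rather than a formula quoted from the patent. A minimal sketch:

```python
import math

def miscut_displacement(z_prime, z_g, theta):
    """Horizontal displacement of a point at depth z' under the miscut,
    assuming x'' - x' = -(z' - Z_G) * tan(theta)."""
    return -(z_prime - z_g) * math.tan(theta)

# Increasing theta (0 < theta < pi/2) increases tan(theta) and hence
# the magnitude of the displacement: a stronger stereoscopic effect.
# Increasing Z_G shrinks z' - Z_G for points in front of the observer:
# a weaker stereoscopic effect.
```

For example, at a fixed depth z′ the displacement magnitude grows with θ and shrinks as Z_G approaches z′, matching the adjustment rules stated above.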
- FIG. 5 is a schematic structural diagram of an embodiment of an apparatus for implementing a virtual view to a stereoscopic view according to the present invention. As shown in FIG. 5, the apparatus 20 of this embodiment includes: a human eye tracking module 21 for capturing human eye position coordinates; and a first image processing module 22, electrically connected to the human eye tracking module 21, configured to determine the rotation angle of the virtual scene according to the human eye position coordinates and the coordinates of the center of the screen of the projection display module 24, and to rotate the virtual scene according to the rotation angle to obtain a virtual holographic stereoscopic view matrix.
- the second image processing module 23 is electrically connected to the first image processing module 22, and is configured to determine the miscut angle of each viewpoint according to the center of the virtual scene, the position of the observer in the scene, and the coordinates of each viewpoint, and then generate a miscut matrix corresponding to each viewpoint; each miscut matrix is right-multiplied with the corresponding viewpoint model matrix to generate left and right views.
- the projection display module 24 is electrically connected to the second image processing module 23 to project the left and right views of each viewpoint.
- the human eye tracking module 21 can track the position of the human eye in real time. Referring to FIG. 2, which shows a schematic diagram of the angles between the human eye position coordinates and the screen: before the virtual scene is rotated, in the space rectangular coordinate system O-XYZ the center of the screen is located at the origin O of the coordinate system, the angle between the projection in the XOZ plane of the line connecting the human eye to the screen center O and the positive half-axis of the Z axis is α, and the angle between the projection of that line in the YOZ plane and the positive half-axis of the Z axis is β. The X-axis direction points from the midpoint of the left side of the screen to the midpoint of the right side of the screen, and the Y-axis direction points from the midpoint of the upper side of the screen to the midpoint of the lower side of the screen.
- the user can view a holographic stereoscopic view that changes with the human eye position during movement, thereby avoiding the inconvenience that the user could only view the holographic stereoscopic view at a certain fixed point.
- FIG. 3 is a schematic diagram showing the relationship between the rotation angle of the virtual scene around the Y-axis and the position of the human eye, the center of the scene, and the position of the screen in the embodiment shown in FIG. 1.
- the distance from the projection of the human eye on the plane XOZ to the screen is L, and the distance from the center of the virtual scene to the screen is Z_center.
- A is right-multiplied by the rotation matrices M1 and M2 to obtain the rotated view A′.
- FIG. 4 is a schematic diagram of the relationship between the miscut angle and the position of the observer, the center of the viewpoints, and the positions of the viewpoints in the embodiment shown in FIG. 1. As shown in FIG. 4, the new coordinate system after rotation is represented by O′-X′Y′Z′; the origin O′ coincides with the center position of the viewpoints in the original coordinate system, and the positive direction of the Z′ axis points from the observer's coordinate Z_G in the original coordinate system to the coordinates of the center of the viewpoints.
- the miscut transformation means that the y′ and z′ of a viewpoint are unchanged, while its x′ value is linearly transformed with the z′ axis as the dependent axis; the miscut angle θ refers to the angle between the viewpoint coordinates and the positive direction of the z′ axis.
- the four viewpoints include viewpoint 1, viewpoint 2, viewpoint 3, and viewpoint 4, where viewpoint 1 and viewpoint 4 are a pair of viewpoints corresponding to the left view and the right view, respectively, and viewpoint 2 and viewpoint 3 are a pair of viewpoints corresponding to the left view and the right view, respectively. As shown in FIG. 4, the angle between viewpoint 3 and the positive direction of the Z′ axis is θ; after the miscut, the coordinates of any viewpoint are represented by (x″, y″, z″), and the miscut expressions for viewpoint 2 on the negative half-axis of the X′ axis are:
- the second image processing module 23 is further configured to change the miscut angle and the position coordinates of the observer in the scene according to the input of the user, thereby improving the stereoscopic effect of the stereoscopic image obtained by the projection.
- a point moves in the negative direction of the x′ axis during the miscut when its z′ coordinate is larger than the observer's coordinate Z_G, and moves in the positive direction of the x′ axis when z′ is smaller than Z_G.
- the miscut directions of viewpoint 2 and viewpoint 3 are opposite, while the miscut angle is the same.
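A paired-viewpoint sketch of this step, under the same assumed shear form as above (the matrix layout and the right-multiplication convention are inferred, not quoted from the patent): the two viewpoints of a pair get miscut matrices with the same angle θ and opposite sign.

```python
import math

def miscut_matrix(theta, z_g, sign):
    """Hypothetical 4x4 miscut (shear) matrix: sign = -1 for the
    viewpoint on the negative half-axis of X', +1 for its pair.
    Same miscut angle theta, opposite miscut direction."""
    t = sign * math.tan(theta)
    return [[1.0, 0.0, -t, z_g * t],
            [0.0, 1.0, 0.0, 0.0],
            [0.0, 0.0, 1.0, 0.0],
            [0.0, 0.0, 0.0, 1.0]]

def right_multiply(model, shear):
    """Right-multiply a viewpoint model matrix by its miscut matrix."""
    return [[sum(model[i][k] * shear[k][j] for k in range(4))
             for j in range(4)] for i in range(4)]

theta, z_g = math.radians(10), 2.0
left_shear = miscut_matrix(theta, z_g, -1)   # e.g. viewpoint 2
right_shear = miscut_matrix(theta, z_g, +1)  # e.g. viewpoint 3
```

Applying `right_multiply` to each viewpoint's model matrix yields the left and right views described above; the shear entries of the two matrices differ only in sign.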
- any point A (x, y, z) of the view is rotated to obtain A′ (x′, y′, z′) in the Cartesian coordinate system O′-X′Y′Z′.
- the projection display module 24 allows the user to experience the holographic stereoscopic view by projecting the miscut view.
- the user can adjust the miscut angle of the second image processing module and the position of the observer in the scene according to his own experience, thereby improving the 3D effect of the projected view.
- the user improves the stereoscopic effect of the projected image by changing the sizes of Z_G and θ: when Z_G is increased, the stereoscopic effect is weakened, and vice versa the stereoscopic effect is enhanced; when θ is increased (0 < θ < π/2), tan θ increases and the stereoscopic effect of the projected image is enhanced, and conversely it is weakened.
- the present invention provides a method and apparatus for implementing a virtual view to a stereoscopic view: the rotation angle of the virtual scene is determined by tracking the dynamic coordinates of the human eye; the virtual scene is then rotated to obtain a virtual holographic stereoscopic view matrix; the miscut matrix is then right-multiplied with each viewpoint model matrix, and the image of each viewpoint is obtained after projection; and, according to the user's experienced 3D effect, the position of the observer in the scene and the miscut angle are adjusted to finally obtain a better 3D effect.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015545662A JP2015536010A (ja) | 2013-11-05 | 2014-07-23 | 仮想ビューから立体ビューへの変換を実現する方法及び装置 |
US14/417,557 US9704287B2 (en) | 2013-11-05 | 2014-07-23 | Method and apparatus for achieving transformation of a virtual view into a three-dimensional view |
EP14814675.6A EP3067866A4 (en) | 2013-11-05 | 2014-07-23 | Method and device for converting virtual view into stereoscopic view |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201310542642.XA CN103996215A (zh) | 2013-11-05 | 2013-11-05 | 一种实现虚拟视图转立体视图的方法及装置 |
CN201310542642.X | 2013-11-05 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015067071A1 true WO2015067071A1 (zh) | 2015-05-14 |
Family
ID=51310368
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2014/082831 WO2015067071A1 (zh) | 2013-11-05 | 2014-07-23 | 一种实现虚拟视图转立体视图的方法及装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9704287B2 (zh) |
EP (1) | EP3067866A4 (zh) |
JP (1) | JP2015536010A (zh) |
CN (1) | CN103996215A (zh) |
WO (1) | WO2015067071A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113379897A (zh) * | 2021-06-15 | 2021-09-10 | 广东未来科技有限公司 | 应用于3d游戏渲染引擎的自适应虚拟视图转立体视图的方法及装置 |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103996215A (zh) * | 2013-11-05 | 2014-08-20 | 深圳市云立方信息科技有限公司 | 一种实现虚拟视图转立体视图的方法及装置 |
KR101666959B1 (ko) * | 2015-03-25 | 2016-10-18 | ㈜베이다스 | 카메라로부터 획득한 영상에 대한 자동보정기능을 구비한 영상처리장치 및 그 방법 |
CN106454315A (zh) * | 2016-10-26 | 2017-02-22 | 深圳市魔眼科技有限公司 | 一种自适应虚拟视图转立体视图的方法、装置及显示设备 |
CN106961592B (zh) * | 2017-03-01 | 2020-02-14 | 深圳市魔眼科技有限公司 | 3d视频的vr显示方法及系统 |
CN107027015A (zh) * | 2017-04-28 | 2017-08-08 | 广景视睿科技(深圳)有限公司 | 基于增强现实的3d动向投影系统以及用于该系统的投影方法 |
US10672311B2 (en) * | 2017-05-04 | 2020-06-02 | Pure Depth, Inc. | Head tracking based depth fusion |
CN107193372B (zh) * | 2017-05-15 | 2020-06-19 | 杭州一隅千象科技有限公司 | 从多个任意位置矩形平面到可变投影中心的投影方法 |
CN109960401B (zh) * | 2017-12-26 | 2020-10-23 | 广景视睿科技(深圳)有限公司 | 一种基于人脸追踪的动向投影方法、装置及其系统 |
CN115842907A (zh) * | 2018-03-27 | 2023-03-24 | 京东方科技集团股份有限公司 | 渲染方法、计算机产品及显示装置 |
CN109189302B (zh) * | 2018-08-29 | 2021-04-06 | 百度在线网络技术(北京)有限公司 | Ar虚拟模型的控制方法及装置 |
CN111050145B (zh) * | 2018-10-11 | 2022-07-01 | 上海云绅智能科技有限公司 | 一种多屏融合成像的方法、智能设备及系统 |
CN111131801B (zh) * | 2018-11-01 | 2023-04-28 | 华勤技术股份有限公司 | 投影仪校正系统、方法及投影仪 |
CN111182278B (zh) * | 2018-11-09 | 2022-06-14 | 上海云绅智能科技有限公司 | 一种投影展示管理方法及系统 |
AT522012A1 (de) * | 2018-12-19 | 2020-07-15 | Viewpointsystem Gmbh | Verfahren zur Anpassung eines optischen Systems an einen individuellen Benutzer |
CN110913200B (zh) * | 2019-10-29 | 2021-09-28 | 北京邮电大学 | 一种多屏拼接同步的多视点图像生成系统及方法 |
CN111031298B (zh) * | 2019-11-12 | 2021-12-10 | 广景视睿科技(深圳)有限公司 | 控制投影模块投影的方法、装置和投影系统 |
CN112235562B (zh) * | 2020-10-12 | 2023-09-15 | 聚好看科技股份有限公司 | 一种3d显示终端、控制器及图像处理方法 |
CN112672139A (zh) * | 2021-03-16 | 2021-04-16 | 深圳市火乐科技发展有限公司 | 投影显示方法、装置及计算机可读存储介质 |
CN116819925B (zh) * | 2023-08-29 | 2023-11-14 | 廊坊市珍圭谷科技有限公司 | 一种基于全息投影的互动娱乐系统及方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7574045B2 (en) * | 2001-07-27 | 2009-08-11 | Matrox Electronic Systems Ltd. | Model-based recognition of objects using a calibrated image system |
CN101853518A (zh) * | 2010-05-28 | 2010-10-06 | 电子科技大学 | 基于各向异性体数据的错切变形体绘制方法 |
CN101866497A (zh) * | 2010-06-18 | 2010-10-20 | 北京交通大学 | 基于双目立体视觉的智能三维人脸重建方法及系统 |
CN102509334A (zh) * | 2011-09-21 | 2012-06-20 | 北京捷成世纪科技股份有限公司 | 一种将虚拟3d场景转换为立体视图的方法 |
CN103996215A (zh) * | 2013-11-05 | 2014-08-20 | 深圳市云立方信息科技有限公司 | 一种实现虚拟视图转立体视图的方法及装置 |
Family Cites Families (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB8706348D0 (en) * | 1987-03-17 | 1987-04-23 | Quantel Ltd | Electronic image processing systems |
JP2812254B2 (ja) * | 1995-05-31 | 1998-10-22 | 日本電気株式会社 | 視点追従型立体映像表示装置 |
US5850225A (en) * | 1996-01-24 | 1998-12-15 | Evans & Sutherland Computer Corp. | Image mapping system and process using panel shear transforms |
US6108440A (en) * | 1996-06-28 | 2000-08-22 | Sony Corporation | Image data converting method |
US6522312B2 (en) * | 1997-09-01 | 2003-02-18 | Canon Kabushiki Kaisha | Apparatus for presenting mixed reality shared among operators |
US6640018B1 (en) * | 1999-06-08 | 2003-10-28 | Siemens Aktiengesellschaft | Method for rotating image records with non-isotropic topical resolution |
AU2001239926A1 (en) * | 2000-02-25 | 2001-09-03 | The Research Foundation Of State University Of New York | Apparatus and method for volume processing and rendering |
US7657083B2 (en) * | 2000-03-08 | 2010-02-02 | Cyberextruder.Com, Inc. | System, method, and apparatus for generating a three-dimensional representation from one or more two-dimensional images |
GB2372659A (en) * | 2001-02-23 | 2002-08-28 | Sharp Kk | A method of rectifying a stereoscopic image |
US7003175B2 (en) * | 2001-03-28 | 2006-02-21 | Siemens Corporate Research, Inc. | Object-order multi-planar reformatting |
US7043073B1 (en) * | 2001-10-19 | 2006-05-09 | Zebra Imaging, Inc. | Distortion correcting rendering techniques for autostereoscopic displays |
JP3805231B2 (ja) * | 2001-10-26 | 2006-08-02 | キヤノン株式会社 | 画像表示装置及びその方法並びに記憶媒体 |
US7565004B2 (en) * | 2003-06-23 | 2009-07-21 | Shoestring Research, Llc | Fiducial designs and pose estimation for augmented reality |
US7643025B2 (en) * | 2003-09-30 | 2010-01-05 | Eric Belk Lange | Method and apparatus for applying stereoscopic imagery to three-dimensionally defined substrates |
DE602004016185D1 (de) * | 2003-10-03 | 2008-10-09 | Automotive Systems Lab | Insassenerfassungssystem |
GB0410551D0 (en) * | 2004-05-12 | 2004-06-16 | Ller Christian M | 3d autostereoscopic display |
JP4434890B2 (ja) * | 2004-09-06 | 2010-03-17 | キヤノン株式会社 | 画像合成方法及び装置 |
AU2004240229B2 (en) * | 2004-12-20 | 2011-04-07 | Canon Kabushiki Kaisha | A radial, three-dimensional, hierarchical file system view |
US20070127787A1 (en) * | 2005-10-24 | 2007-06-07 | Castleman Kenneth R | Face recognition system and method |
JP4375325B2 (ja) * | 2005-11-18 | 2009-12-02 | カシオ計算機株式会社 | 画像処理装置、画像処理方法及びプログラム |
US8548265B2 (en) * | 2006-01-05 | 2013-10-01 | Fastvdo, Llc | Fast multiplierless integer invertible transforms |
GB0716776D0 (en) * | 2007-08-29 | 2007-10-10 | Setred As | Rendering improvement for 3D display |
DE102007056528B3 (de) * | 2007-11-16 | 2009-04-02 | Seereal Technologies S.A. | Method and device for finding and tracking pairs of eyes |
US20100110069A1 (en) * | 2008-10-31 | 2010-05-06 | Sharp Laboratories Of America, Inc. | System for rendering virtual see-through scenes |
TWI398796B (zh) * | 2009-03-27 | 2013-06-11 | Utechzone Co Ltd | Pupil tracking methods and systems, and correction methods and correction modules for pupil tracking |
JP2011165068A (ja) * | 2010-02-12 | 2011-08-25 | Nec System Technologies Ltd | Image generation device, image display system, image generation method, and program |
US8890934B2 (en) * | 2010-03-19 | 2014-11-18 | Panasonic Corporation | Stereoscopic image aligning apparatus, stereoscopic image aligning method, and program of the same |
JP5872185B2 (ja) * | 2010-05-27 | 2016-03-01 | Nintendo Co., Ltd. | Portable electronic device |
JP4869430B1 (ja) * | 2010-09-24 | 2012-02-08 | Nintendo Co., Ltd. | Image processing program, image processing apparatus, image processing system, and image processing method |
US8896631B2 (en) * | 2010-10-25 | 2014-11-25 | Hewlett-Packard Development Company, L.P. | Hyper parallax transformation matrix based on user eye positions |
TWI433530B (zh) * | 2010-11-01 | 2014-04-01 | Ind Tech Res Inst | Photographing system and method with stereoscopic image capture guidance, and automatic adjustment method |
US9020241B2 (en) * | 2011-03-03 | 2015-04-28 | Panasonic Intellectual Property Management Co., Ltd. | Image providing device, image providing method, and image providing program for providing past-experience images |
US20130113701A1 (en) * | 2011-04-28 | 2013-05-09 | Taiji Sasaki | Image generation device |
JP5849811B2 (ja) * | 2011-05-27 | 2016-02-03 | JVC Kenwood Corporation | Method for generating video data for autostereoscopic viewing |
KR101779423B1 (ko) * | 2011-06-10 | 2017-10-10 | LG Electronics Inc. | Image processing method and image processing apparatus |
WO2012172719A1 (ja) * | 2011-06-16 | 2012-12-20 | Panasonic Corporation | Head-mounted display and positional misalignment adjustment method therefor |
KR101265667B1 (ko) * | 2011-06-21 | 2013-05-22 | Vadas Co., Ltd. | 3D image synthesis apparatus and method for visualizing vehicle surroundings |
KR101315303B1 (ko) * | 2011-07-11 | 2013-10-14 | Korea Institute of Science and Technology | Wearable display device and content display method |
JP2013128181A (ja) * | 2011-12-16 | 2013-06-27 | Fujitsu Ltd | Display device, display method, and display program |
CN102520970A (zh) * | 2011-12-28 | 2012-06-27 | TCL Corporation | Method and device for generating a stereoscopic user interface |
JP2013150249A (ja) * | 2012-01-23 | 2013-08-01 | Sony Corp | Image processing apparatus, image processing method, and program |
US20140002443A1 (en) * | 2012-06-29 | 2014-01-02 | Blackboard Inc. | Augmented reality interface |
US9092897B2 (en) * | 2012-08-10 | 2015-07-28 | Here Global B.V. | Method and apparatus for displaying interface elements |
US9639924B2 (en) * | 2012-09-24 | 2017-05-02 | Seemsome Everyone Ltd | Adding objects to digital photographs |
KR101416378B1 (ko) * | 2012-11-27 | 2014-07-09 | Hyundai Motor Company | Display apparatus and method capable of shifting an image |
US9225969B2 (en) * | 2013-02-11 | 2015-12-29 | EchoPixel, Inc. | Graphical system with enhanced stereopsis |
KR102040653B1 (ko) * | 2013-04-08 | 2019-11-06 | LG Display Co., Ltd. | Holographic stereoscopic image display device |
US9264702B2 (en) * | 2013-08-19 | 2016-02-16 | Qualcomm Incorporated | Automatic calibration of scene camera for optical see-through head mounted display |
2013
- 2013-11-05 CN CN201310542642.XA patent/CN103996215A/zh active Pending

2014
- 2014-07-23 EP EP14814675.6A patent/EP3067866A4/en not_active Withdrawn
- 2014-07-23 WO PCT/CN2014/082831 patent/WO2015067071A1/zh active Application Filing
- 2014-07-23 US US14/417,557 patent/US9704287B2/en not_active Expired - Fee Related
- 2014-07-23 JP JP2015545662A patent/JP2015536010A/ja active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7574045B2 (en) * | 2001-07-27 | 2009-08-11 | Matrox Electronic Systems Ltd. | Model-based recognition of objects using a calibrated image system |
CN101853518A (zh) * | 2010-05-28 | 2010-10-06 | University of Electronic Science and Technology of China | Shear-warp volume rendering method based on anisotropic volume data |
CN101866497A (zh) * | 2010-06-18 | 2010-10-20 | Beijing Jiaotong University | Intelligent 3D face reconstruction method and system based on binocular stereo vision |
CN102509334A (zh) * | 2011-09-21 | 2012-06-20 | Beijing Jetsen Century Technology Co., Ltd. | Method for converting a virtual 3D scene into stereoscopic views |
CN103996215A (zh) * | 2013-11-05 | 2014-08-20 | Shenzhen Cloud Cube Information Technology Co., Ltd. | Method and device for converting a virtual view into a stereoscopic view |
Non-Patent Citations (1)
Title |
---|
See also references of EP3067866A4 * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113379897A (zh) * | 2021-06-15 | 2021-09-10 | Guangdong Future Technology Co., Ltd. | Adaptive method and device for converting virtual views into stereoscopic views for a 3D game rendering engine |
Also Published As
Publication number | Publication date |
---|---|
CN103996215A (zh) | 2014-08-20 |
JP2015536010A (ja) | 2015-12-17 |
US20150339844A1 (en) | 2015-11-26 |
US9704287B2 (en) | 2017-07-11 |
EP3067866A4 (en) | 2017-03-29 |
EP3067866A1 (en) | 2016-09-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015067071A1 (zh) | Method and device for converting a virtual view into a stereoscopic view | |
EP3460746B1 (en) | Generating stereoscopic light field panoramas using concentric viewing circles | |
US8994780B2 (en) | Video conferencing enhanced with 3-D perspective control | |
CN107637060B (zh) | Camera rig and stereoscopic image capture | |
CN102665087B (zh) | Automatic shooting-parameter adjustment system for a 3D stereoscopic camera device | |
US20150358539A1 (en) | Mobile Virtual Reality Camera, Method, And System | |
WO2018005235A1 (en) | System and method for spatial interaction using automatically positioned cameras | |
WO2012050040A1 (ja) | Stereoscopic video conversion device and stereoscopic video display device | |
JP2017536565A (ja) | Wide-field-of-view camera apparatus for stereo vision | |
JP2016500954A5 (zh) | ||
JP6057570B2 (ja) | Apparatus and method for generating stereoscopic panoramic video | |
Tang et al. | A system for real-time panorama generation and display in tele-immersive applications | |
US11812009B2 (en) | Generating virtual reality content via light fields | |
CN103247020A (zh) | Fisheye image unwrapping method based on radial features | |
CN103269430A (zh) | BIM-based 3D scene generation method | |
WO2013178188A1 (zh) | Video conference display method and device | |
WO2019206827A1 (en) | Apparatus and method for rendering an audio signal for a playback to a user | |
WO2013185429A1 (zh) | Projection display system, projection device, and projection display method | |
WO2022267694A1 (zh) | Display adjustment method, apparatus, device, and medium | |
WO2012100495A1 (zh) | Processing method and device for dual-camera stereoscopic shooting | |
WO2023056803A1 (zh) | Holographic display method and device | |
CN109961395B (zh) | Depth image generation and display method, apparatus, system, and readable medium | |
JP2012216883A (ja) | Display control device, display control method, and program | |
Zhang et al. | A new 360 camera design for multi format VR experiences | |
CN203896436U (zh) | Virtual reality projector |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| ENP | Entry into the national phase | Ref document number: 2015545662; Country of ref document: JP; Kind code of ref document: A |
| REEP | Request for entry into the european phase | Ref document number: 2014814675; Country of ref document: EP |
| WWE | Wipo information: entry into national phase | Ref document number: 2014814675; Country of ref document: EP |
| WWE | Wipo information: entry into national phase | Ref document number: 14417557; Country of ref document: US |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 14814675; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |