WO2014000159A1 - Method and device for correcting projectors of a multi-channel visual projection system - Google Patents

Method and device for correcting projectors of a multi-channel visual projection system Download PDF

Info

Publication number
WO2014000159A1
WO2014000159A1 PCT/CN2012/077538 CN2012077538W WO2014000159A1 WO 2014000159 A1 WO2014000159 A1 WO 2014000159A1 CN 2012077538 W CN2012077538 W CN 2012077538W WO 2014000159 A1 WO2014000159 A1 WO 2014000159A1
Authority
WO
WIPO (PCT)
Prior art keywords
projector
screen
projection
coordinate system
reference point
Prior art date
Application number
PCT/CN2012/077538
Other languages
English (en)
French (fr)
Inventor
严涛
Original Assignee
Yan Tao
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yan Tao filed Critical Yan Tao
Priority to PCT/CN2012/077538 priority Critical patent/WO2014000159A1/zh
Publication of WO2014000159A1 publication Critical patent/WO2014000159A1/zh

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141Constructional details thereof
    • H04N9/3147Multi-projection systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3185Geometric adjustment, e.g. keystone or convergence

Definitions

  • The present invention relates to the field of virtual reality, and more particularly to a method and apparatus for correcting a projector of a multi-channel visual projection system. Background Art
  • A multi-channel visual projection system is a display system in which multiple projectors seamlessly project a three-dimensional scene in real time onto a large screen (usually a curved one).
  • This type of system includes the various mainstream complex projection systems of the industry, such as Virtual Reality Systems, Scientific Visualization Systems, Visual Simulation Systems and Collaborative Visualization Systems, and is widely used in science, education, defense, entertainment, energy, manufacturing and government.
  • Key technologies for multi-channel vision simulation systems include: geometric correction techniques, edge blending techniques, and color consistency matching techniques.
  • The present invention primarily discusses geometric correction techniques, while the method of the invention can also provide shape factors for edge fusion techniques.
  • The multi-channel visual simulation system needs to perform geometric correction of the projected image because the internal and external parameters of the projector differ from the parameters of the corresponding channel's view frustum as defined from the design eye point.
  • Early geometric corrections were performed manually, using special hardware to manually match the test grid points (called "control points") in the projected image with the reference points calibrated on the screen; the pixel positions between the control points were generated by interpolation.
  • In the prior art, the reference points on the screen are spots of ultraviolet-sensitive paint applied to the screen; during debugging these spots glow under ultraviolet illumination and can serve as reference points, while after debugging they are invisible in the normal projected image and therefore do not affect normal use of the system.
  • Later, laser spots from a laser dot matrix were used as reference points, which eliminated the painting of the screen, as shown in Figure 1.
  • Embodiments of the present invention provide a method for correcting a projector of a multi-channel visual projection system, including: determining a plurality of reference points in the image projected by the projector onto a screen and obtaining the coordinates of each reference point in the world coordinate system; obtaining the pixel coordinates, in the projection template, of control points corresponding respectively to the reference points; computing a projection matrix of the projector from these coordinates; and correcting the projector using the projection matrix.
  • Embodiments of the present invention also provide an apparatus for correcting a projector of a multi-channel visual projection system, including means for carrying out each of the above steps.
  • The projector calibration method and apparatus avoid the use of a camera and can cooperate with a laser theodolite to calibrate the projector directly using the depth information of the screen.
  • For each channel, a dozen or so simple operations such as mouse clicks suffice to obtain a high-precision, hardware-independent geometric correction grid for projector calibration.
  • Engineering application results show that the method is convenient to use and gives very good results.
  • DRAWINGS In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without any creative work. In the drawings:
  • Figure 1 shows a schematic diagram of manual correction according to the prior art
  • Figure 2 shows a schematic diagram of the screen, projector and design eye points of the visual simulation system
  • Figure 3a shows a projection image of the projection channel No. 5 in Figure 2;
  • Figure 3b shows the view frustum corresponding to projection channel No. 5 in Figure 2;
  • Figure 4a shows the test grid of the projection channel No. 5 in Figure 2;
  • Figure 4b shows the geometric correction target image of the projection channel No. 5 in Figure 2;
  • FIG. 5 is a schematic view showing a projection template coordinate system and a template plane coordinate system according to an exemplary embodiment of the present invention
  • FIG. 6 is a schematic diagram showing a projection relationship according to an exemplary embodiment of the present invention.
  • Figure 7 shows a schematic diagram of the coordinate system of the projector
  • FIG. 8 is a flow chart showing a projector correction method according to an exemplary embodiment of the present invention.
  • FIG. 9 is a block diagram showing a projector calibration apparatus according to an exemplary embodiment of the present invention.
  • Figure 10 shows the labeling points of projection channels 2, 3, and 5 in the simulation
  • Figures 11a and 11b show the test grid and the geometric correction target grid of projection channel No. 2, respectively;
  • Figures 11c and 11d show the test grid and the geometric correction target grid of projection channel No. 3, respectively;
  • Figures 12a-12c illustrate geometric correction grids for projection channels 2, 3, and 5, respectively;
  • Figures 13a and 13b show the comparison before and after the geometric correction of the projection channels 2, 3, and 5, respectively;
  • Figure 14 shows the images of the projection channels 2, 3, and 5, in which no edge fusion is performed;
  • Figure 15 shows an image of projection channels Nos. 2, 3, and 5 in which edge blending is performed
  • Figure 16 shows the correction and fusion results over the whole dome
  • Figure 17 is a diagram showing the optical edge fusion transparency distribution of the third channel
  • Figure 2 shows a schematic diagram of the relationship between the screen, the projector, the design eye point, and the viewing cone.
  • the system of Figure 2 includes five projectors P1-P5.
  • the "+" in the figure indicates the design eye point.
  • the design eye point is an important part of the visual simulation system. In theory, for the projection scene, only the image observed from that point is geometrically correct.
  • view cone refers to a viewing frustum cone used to observe a virtual scene from a design eye point corresponding to a projection channel, and may also be referred to as a "light cone".
  • the viewing cone of a projection channel is closely related to the position and size of the projected image of the channel.
  • If the optical center of the projector is at the design eye point and the projector light cone coincides exactly with the view frustum of the corresponding projection channel, the projected image does not have to be geometrically corrected, that is, the projector does not have to be corrected. Obviously, this is difficult to achieve in practice. Taking projector No. 5 of the system shown in Figure 2 as an example,
  • the position of the projector, the light cone of the projector and the projected image are shown in Figure 3a, and the corresponding view frustum of the channel is shown in Figure 3b.
  • The view frustum and projection screen of a projection channel do not have to correspond exactly to the coverage area of the projected image; rather, pixel utilization is considered comprehensively on the basis of the geometric correction result.
  • The test grid for geometric image correction refers to the view of the gridded screen observed from the design eye point through the corresponding view frustum of the channel (in the way shown in Figure 3b).
  • The test grid for channel 5 of Figure 3 is shown in Figure 4a (taking a spherical screen with a horizontal field of view of ±90° as an example).
  • The so-called geometric correction refers to moving the control points (for example, the intersections of the meridians and parallels) in the test grid projected on the screen to the correct positions where these points should lie on the screen.
  • the correct position of the control point on the screen is given by the reference point on the screen (as shown in Figure 1).
  • the image coordinates of the "correct screen position" of each control point in the test image need to be calculated by some means. Once these coordinates are determined, the geometric correction can be completed by simple texture mapping.
  • Once the projector is calibrated (corrected), the viewpoint can be moved to the projector's optical center according to its external parameters, and the line-of-sight direction and up-vector can be determined.
  • The light cone of the projector is then determined from its internal parameters.
  • The gridded screen is observed in this way (in the manner shown in Figure 3a), and the "map of correct screen positions" of the control points is obtained; it is called the "geometric correction target image", as shown in Figure 4b.
  • Geometric correction refers to the process of deforming Figure 4 (a) into Figure 4 (b).
  • The operation of the projector can be seen as the inverse of that of a camera. That is, the screen is "photographed" with the calibrated projector, and the resulting image is the geometric correction target image, as in the sketch below.
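  • As an illustration of this "photographing" step, the following minimal sketch (not the patent's implementation; the dome radius, the grid resolution, the angular extent of the channel and the file name are assumptions for illustration) projects the gridded screen points through a calibrated pinhole projector model, i.e. it applies the relation Z_c m = P X, to obtain the target-image coordinates of each control point.

```python
import numpy as np

def project_points(P, X_world):
    """Apply the pinhole projection Z_c m = P X to an Nx3 array of world points.

    P is the 3x4 projector projection matrix; returns Nx2 template (pixel) coordinates."""
    X_h = np.hstack([X_world, np.ones((X_world.shape[0], 1))])  # homogeneous coordinates
    x_h = (P @ X_h.T).T                                          # Nx3
    return x_h[:, :2] / x_h[:, 2:3]                              # divide by the depth Z_c

# Gridded dome screen: control points at the meridian/parallel intersections.
# The 2 m radius comes from the worked example in the text; the angular ranges
# of the channel are assumed purely for illustration.
R = 2.0
lon, lat = np.meshgrid(np.radians(np.linspace(-45, 45, 25)),
                       np.radians(np.linspace(10, 60, 25)))
screen_pts = np.stack([R * np.cos(lat) * np.sin(lon),
                       R * np.sin(lat),
                       -R * np.cos(lat) * np.cos(lon)], axis=-1).reshape(-1, 3)

P = np.loadtxt("projector5_P.txt").reshape(3, 4)   # hypothetical calibrated matrix
target_grid = project_points(P, screen_pts)        # "correct screen position" coordinates
```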
  • the key to automatic geometric correction is the correction of the projector. Since a projector can be regarded as a camera, the present invention uses a pinhole model to describe the projector.
  • For the relevant technical background of the pinhole model, see "Computer Vision - Computational Theory and Algorithmic Foundations", Ma Songde et al., Science Press, October 2003, the contents of which are incorporated herein by reference and are not repeated here for brevity.
  • each element in the template projected by the projector is called a pixel.
  • A Cartesian coordinate system O_0-uv is defined on the template; the coordinates (u, v) of each pixel are, respectively, the column number and the row number of that pixel in the template. Since the projection-template coordinate system only indicates the column and row of a pixel in the digital template and does not express the physical position of the pixel in the template in physical units, a template plane coordinate system expressed in physical units (for example, meters) must also be established.
  • This is the coordinate system O_1-xy shown in Figure 5.
  • Its origin O_1 is defined at the intersection of the projector's optical axis with the template plane and is called the principal point.
  • Ideally this point lies at the center of the template, but because of the projector's lens-shift function there is usually some deviation. If the coordinates of O_1 in the u-v coordinate system are (u_0, v_0) and the physical size of each pixel along the x axis and the y axis is d_x, d_y, then the two coordinate systems are related by equation (1) of the description (a pixel-to-metric conversion that also includes a skew factor s).
  • The geometry of projection by the projector can be represented by Figure 6, where the point O_c is called the optical center of the projector, the x_c axis and the y_c axis are parallel to the x axis and the y axis of the template plane coordinate system, and the z_c axis is perpendicular to the template plane and is called the optical axis of the projector.
  • The intersection of the optical axis with the template plane is the principal point O_1, and the Cartesian coordinate system formed by the point O_c and the axes x_c, y_c, z_c is called the projector coordinate system.
  • O_cO_1 is the focal length f of the projector.
  • the relationship between the projector coordinate system and the world coordinate system can be described by a rotation matrix and a translation vector.
  • Thus, the homogeneous coordinates (x_w, y_w, z_w, 1)^T of a projection point in the world coordinate system and its homogeneous coordinates in the projector coordinate system are related by equation (2) of the description.
  • A point m in the image template plane, joined to the projector's optical center, intersects a spatial surface (i.e. the projection surface) at a projection point X, as shown in Figure 7.
  • [R t] is called the external parameter matrix of the projector, comprising the rotation matrix and the translation vector; K is called the internal parameter matrix of the projector, (u_0, v_0) are the principal point coordinates, f_u and f_v are the scale factors along the image u axis and v axis respectively, and γ describes the skew of the two coordinate axes of the template.
  • P = K [R t] is a 3 x 4 matrix, commonly referred to as the projection matrix. Equation (5) expresses the projection principle under the pinhole projector model.
  • Z_c m = P X   (6), where m is a pixel point on the template plane and X is the projected spatial point. Therefore, if the pixel points on the template and their corresponding spatial points are known, constraints on the elements of the projection matrix can be formed; with enough constraints, the projection matrix P can be solved and the internal and external parameters of the projector obtained. In theory, as long as the projected spatial points do not lie on the same twisted cubic surface, 6 pairs of pixel points and spatial points give sufficient constraints to solve for the projection matrix, as sketched below.
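  • The following sketch shows one standard way to form and solve these constraints, a direct linear transformation (DLT) applied to at least 6 general-position pairs; it illustrates the constraint Z_c m = P X and is not taken from the patent itself.

```python
import numpy as np

def estimate_projection_matrix(uv, xyz):
    """Solve Z_c m = P X for the 3x4 projection matrix P from >= 6 point pairs.

    uv : Nx2 pixel coordinates of the control points m
    xyz: Nx3 world coordinates of the reference points X
    Each pair contributes two linear equations in the 12 entries of P."""
    A = []
    for (u, v), (x, y, z) in zip(np.asarray(uv), np.asarray(xyz)):
        Xh = [x, y, z, 1.0]
        A.append(Xh + [0.0] * 4 + [-u * c for c in Xh])
        A.append([0.0] * 4 + Xh + [-v * c for c in Xh])
    # Homogeneous least squares: right singular vector of the smallest singular value
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)
```

  • With the 16 marked points of Table 1 this yields the projection matrix of projector No. 5; in practice the point coordinates are usually normalised first to improve numerical conditioning.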
  • the embodiment of the present invention proposes a method of correcting a projector.
  • The basic idea is: a number of points are marked, for example with a laser theodolite located at the center of the screen, on the screen area covered by the projected image, and the coordinates of these points in the world coordinate system are obtained, giving the point set X in (6).
  • For each such point, the corresponding image-space point in (6) is recorded, for example with the corresponding image generator (e.g. by a mouse click in the image generator), giving the image-space point set m.
  • The internal and external parameters of the projector can then be computed accurately from X and m.
  • The external parameters of the projector may include: the projector optical center O (X_w, Y_w, Z_w), the optical-axis vector v (i.e. the representation of the z_c axis in the world coordinate system) and the up-vector u (i.e. the representation of the y_c axis in the world coordinate system).
  • The internal parameters of the projector may include: the focal length f and the image-space coordinates (u_0, v_0) of the principal point; the light cone of the projector can be computed from f and (u_0, v_0), as sketched below.
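  • For example, the opening angles of the light cone follow from the internal parameters as below (a sketch only; the 1400x1050 template resolution is an assumption, not something fixed by the patent).

```python
import numpy as np

def light_cone_opening_angles(fu, fv, u0, v0, width=1400, height=1050):
    """Horizontal and vertical opening angles (degrees) of the projector light cone,
    computed from the internal parameters; fu, fv and (u0, v0) are in pixels."""
    horizontal = np.degrees(np.arctan2(u0, fu) + np.arctan2(width - u0, fu))
    vertical = np.degrees(np.arctan2(v0, fv) + np.arctan2(height - v0, fv))
    return horizontal, vertical
```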
  • Once the projection matrix has been obtained, the geometric correction mapping for every point on the screen can be obtained, i.e. the mapping between the image coordinates of the point as observed from the design eye point through the design view frustum (called the test point) and its image coordinates as observed from the projector (called the target point).
  • Geometric correction is to "move" the test point to the target point, regardless of the means of implementation.
  • Once the projector has been corrected, the geometric correction can be done through the mapping from the test grid to the target grid. This mapping can be realized with existing texture mapping.
  • In applications, the projector correction can be implemented in software (texture mapping) or through the data interface of dedicated hardware (for example the Norwegian blending unit CompactUTM, whose data interface is a csv file).
  • When performing texture mapping, the interpolation can be reduced to linear interpolation by increasing the density of control points in the two grids, as in the sketch below.
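  • The sketch below illustrates that remark: a plain bilinear up-sampling of the sparse test-to-target control-point grid into a dense per-pixel warp map, which can then be handed to texture-mapping software or exported (e.g. as csv) to a blending unit. The grid shapes are assumptions; this is not the patent's own code.

```python
import numpy as np

def densify_grid(target_uv, out_h, out_w):
    """Bilinearly up-sample a (Gh x Gw x 2) grid of target coordinates into a
    dense (out_h x out_w x 2) warp map.

    With dense control points the residual interpolation error becomes
    negligible, which is the point made in the text above."""
    gh, gw, _ = target_uv.shape
    ys = np.linspace(0.0, gh - 1.0, out_h)
    xs = np.linspace(0.0, gw - 1.0, out_w)
    y0 = np.clip(np.floor(ys).astype(int), 0, gh - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, gw - 2)
    wy = (ys - y0)[:, None, None]
    wx = (xs - x0)[None, :, None]
    g = target_uv
    top = (1 - wx) * g[y0][:, x0] + wx * g[y0][:, x0 + 1]
    bottom = (1 - wx) * g[y0 + 1][:, x0] + wx * g[y0 + 1][:, x0 + 1]
    return (1 - wy) * top + wy * bottom

# Example: densify a 25x25 control-point grid to a 1400x1050 projector template
# dense_map = densify_grid(target_grid.reshape(25, 25, 2), 1050, 1400)
```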
  • the system 100 in FIG. 9 includes a projector calibration device 110 and a projector 120.
  • the projector calibration apparatus 110 may include reference point coordinate acquisition means 112, control point coordinate acquisition means 114, and projector parameter calculation means 116.
  • step S11 for a channel of the simulation system, a plurality of reference points are marked on the screen area covered by the projected image. For example, it can be marked with a laser theodolite located at the center of the screen.
  • point marking can also be performed in other ways, such as earlier laser marking in engineering or direct labeling of fluorescent materials on the surface of the screen during screen production.
  • In theory, the number of points need only be greater than or equal to 6; more reference points increase the marking effort but simplify the subsequent interpolation.
  • the number of appropriate reference points can be selected according to the actual situation.
  • In step S13, the coordinates of these reference points in the world coordinate system are acquired by the reference-point coordinate acquiring means 112, giving X in equation (6); the method of obtaining the coordinates of a reference point in the world coordinate system is well known in the art and is not repeated here.
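  • Purely as an illustration of this well-known step (the axis convention, the angle convention and the 2 m radius are assumptions taken from the worked example, not requirements of the method): if the theodolite sits at the center of a dome of known radius, the world coordinates of each laser spot follow from the measured horizontal and vertical angles by a spherical-to-Cartesian conversion.

```python
import numpy as np

def theodolite_to_world(azimuth_deg, elevation_deg, radius=2.0):
    """Convert theodolite angles measured from the dome center into world
    coordinates of the laser spot on a spherical screen of the given radius.

    Assumed convention: azimuth 0 along -Z with positive values towards +X,
    elevation positive upwards (+Y); the theodolite is at the world origin."""
    az = np.radians(np.asarray(azimuth_deg, dtype=float))
    el = np.radians(np.asarray(elevation_deg, dtype=float))
    return np.stack([radius * np.cos(el) * np.sin(az),
                     radius * np.sin(el),
                     -radius * np.cos(el) * np.cos(az)], axis=-1)
```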
  • In step S15, for each point in X, the image coordinates of the corresponding point are recorded, giving the image-space point set m (template plane coordinates) in (6), i.e. the set of control points.
  • This step can be implemented, for example, by clicking the reference point in the projected image with a mouse, using an image generator connected to the projector. Next, in step S17, the projector parameter computing means 116 computes the internal and external parameters of the projector according to (6); the external parameters of the projector may include the projector optical center (X_w, Y_w, Z_w), the optical-axis vector v (i.e. the representation of the z_c axis in the world coordinate system) and the up-vector u (i.e. the representation of the y_c axis in the world coordinate system), and the internal parameters of the projector may include the focal length f and the image-space coordinates (u_0, v_0) of the principal point.
  • the projection matrix of the projector can be obtained first, and the internal parameters and the external parameters are obtained by QR decomposition of the matrix.
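  • A sketch of this decomposition step is given below. Numerically it is usually carried out as an RQ decomposition of the left 3x3 block of P (the closely related factorization that the text refers to as a QR decomposition); the sign-fixing and normalisation details are one common convention, not mandated by the patent.

```python
import numpy as np

def rq(M):
    """RQ decomposition M = U Q, with U upper triangular and Q orthogonal,
    built from numpy's QR decomposition of the row-reversed, transposed matrix."""
    q, r = np.linalg.qr(np.flipud(M).T)
    U = np.flipud(r.T)[:, ::-1]
    Q = np.flipud(q.T)
    return U, Q

def decompose_projection_matrix(P):
    """Split P = K [R t] into the internal matrix K, the rotation R and the
    translation t, and derive the pose parameters named in the text."""
    K, R = rq(P[:, :3])
    S = np.diag(np.sign(np.diag(K)))   # force positive diagonal of K
    K, R = K @ S, S @ R
    if np.linalg.det(R) < 0:           # P is only defined up to an overall sign
        R, P = -R, -P
    t = np.linalg.solve(K, P[:, 3])
    K = K / K[2, 2]                    # normalise so that K[2, 2] = 1
    centre = -R.T @ t                  # projector optical center in world coordinates
    optical_axis = R[2, :]             # z_c axis expressed in world coordinates
    up_vector = R[1, :]                # y_c axis expressed in world coordinates
    return K, R, t, centre, optical_axis, up_vector
```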
  • In step S19, the light cone of the projector is determined from the internal parameters, and the position and attitude of the projector are determined from the external parameters, thereby correcting the projector. Using this result, the target image is then obtained by "observing" the screen with the corrected projector in the virtual scene.
  • In step S21, geometric correction of the image is completed on the basis of the test grid and the target image.
  • the above processing can be sequentially performed for each channel of the projection system.
  • In one example, using 25 x 25 control points, the projectors of the five projection channels of the 2-meter-radius dome flight-simulation visual system shown in Figure 2 were calibrated, and the geometric correction results were written into, for example, 3D-perception's professional geometric correction hardware CompactUTM, so that the method of the invention can be applied directly on dedicated hardware.
  • Figure 10 shows the reference points in the field. For the sake of clarity, only the 2, 3, and 5 projection channels are shown, each of which is labeled with 16 points.
  • Table 1 shows the image-space point set m and the 3D point set X of projector No. 5 of Figure 3.
  • Figure 11 shows the 2 and 3 channel test grids and the geometry corrected target grid.
  • the corresponding grid for channel 5 is shown in Figure 4.
  • Figure 12 shows the geometry correction grid for the 2, 3, and 5 channels.
  • Figures 13a and 13b show the test grid before and after the 2, 3, and 5 channel geometry corrections, respectively.
  • Figure 14 shows images of 2, 3, and 5 channels (without edge blending).
  • Figure 15 shows the effect after edge blending.
  • Figure 16 shows the global screen display of all five channels.
  • the projector correction method according to the present invention can also be applied to edge fusion.
  • Edge fusion is another key technology for multi-channel vision simulation systems, including electronic edge fusion and optical edge fusion. The electronic edge fusion achieves the effect of image fusion by controlling the pixel gray level of the image overlap area, which is suitable for bright scenes, but can not solve the problem of overlapping area bright band caused by projector light leakage in dark scenes.
  • Optical edge fusion places an optical filter in front of the projector lens and achieves edge fusion by controlling the transmittance of the filter in the image overlap region, which can solve the fusion failure in dark scenes caused by light leakage.
  • On a curved screen, and especially on a spherical screen, the image overlap area (including the edge fusion area) maps onto the filter as an irregular curve. Since the method of the embodiments of the invention calibrates the projector, the extent of the edge fusion region can be determined very accurately and, combined with the typical edge-fusion calculation, i.e. equation (7), the shape and transmittance variation of the electronic or optical edge fusion can be computed, where:
  • x is the position within the fusion zone,
  • t is the transparency at that position, and
  • α and γ are the control factors.
  • Equation (7) gives the transparency variation across the edge and yields the edge-fusion transparency map.
  • From this map, electronic edge fusion of the projection channel can be realized, or the optical edge-fusion shape factor of the channel can be obtained; a sketch of such a transparency ramp follows.
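  • Since equation (7) itself appears only as an image in the source publication, the sketch below uses a simple two-parameter ramp as a stand-in to show how an edge-fusion transparency map of the kind in Figure 17 could be generated once the extent of the fusion band is known from the calibrated projection matrix; the exact functional form, and the normalisation of the position x across the band, are assumptions.

```python
import numpy as np

def blend_transparency(x, alpha=1.5, gamma=1.25):
    """Stand-in for equation (7): transparency t as a function of the normalised
    position x in the fusion zone (x = 0 at the inner edge, x = 1 at the outer
    edge), shaped by the control factors alpha and gamma. The parameter values
    are those quoted for channel 3; the ramp itself is an assumed form."""
    x = np.clip(np.asarray(x, dtype=float), 0.0, 1.0)
    return (1.0 - x ** alpha) ** gamma

# Example: a 1-D transparency profile sampled across a 256-pixel-wide fusion band
profile = blend_transparency(np.linspace(0.0, 1.0, 256))
```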
  • The invention provides a projector calibration method and apparatus for a multi-channel visual projection system which avoids the use of cameras, greatly reduces the complexity of system installation and adjustment, and achieves on-site projector calibration by a very convenient method.
  • By using dense control points, interpolation is avoided and the geometric correction accuracy is improved; the transparency map of the electronic or optical edge fusion can also be computed, achieving precise electronic edge fusion while providing a very accurate shape factor for optical edge fusion.
  • The method and device of the present invention can also be applied to a non-camera self-positioning device, which can position itself on site in the visual simulation system from simple prior information that is input and can automatically generate the marked points, making the method fully automatic.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Projection Apparatus (AREA)
  • Transforming Electric Information Into Light Information (AREA)

Abstract

A method for correcting a projector of a multi-channel visual simulation projection system, comprising: determining a plurality of reference points in the image projected by the projector onto a screen, and obtaining the coordinates of each reference point in the world coordinate system; obtaining the pixel coordinates of control points in the projection template, the control points corresponding respectively to the reference points; computing a projection matrix of the projector from the coordinates of each reference point in the world coordinate system and the pixel coordinates of the control points; and correcting the projector using the projection matrix. Also disclosed is an apparatus for correcting a projector of a multi-channel visual simulation projection system. With the method and apparatus, a high-precision, hardware-independent geometric correction grid can be obtained for each channel with only simple operations.

Description

Method and device for correcting projectors of a multi-channel visual projection system
TECHNICAL FIELD  The present invention relates to the field of virtual reality, and in particular to a method and a device for correcting the projectors of a multi-channel visual projection system. BACKGROUND ART
A multi-channel visual projection system is a display system in which multiple projectors seamlessly project a three-dimensional scene in real time onto a large screen (usually a curved one). Such systems include the mainstream complex projection systems of the industry, such as Virtual Reality Systems, Scientific Visualization Systems, Visual Simulation Systems and Collaborative Visualization Systems, and are widely used in science, education, defense, entertainment, energy, manufacturing and government. The key technologies of multi-channel visual simulation systems include geometric correction, edge blending and color-consistency matching. The present invention mainly addresses geometric correction; the method of the invention can also provide shape factors for edge blending.
Geometric correction of the projected image is needed in a multi-channel visual simulation system because the internal and external parameters of a projector differ from the parameters of the view frustum of the corresponding channel as defined from the design eye point. Early geometric correction was carried out manually: special hardware was used to manually match test grid points in the projected image (called "control points") with reference points calibrated on the screen, and the pixel positions between control points were generated by interpolation. In the prior art, the reference points on the screen are spots of ultraviolet-sensitive paint applied to the screen; during debugging these spots glow under on-site ultraviolet illumination and serve as reference points, while after debugging they are invisible in the normal projected image and therefore do not affect normal use of the system. Later, laser spots from a laser dot matrix were used as reference points, which eliminated the need to paint the screen, as shown in Figure 1.
Because only a small number of reference points are available on the screen, the accuracy of manual geometric correction is severely limited, and the correction accuracy between reference points is hard to guarantee. Manual geometric correction is also time-consuming; with many channels, weeks or even months of adjustment are often needed to obtain a satisfactory result.
As for automatic geometric correction, current theoretical research and engineering applications are all based on one or more cameras to achieve geometric correction of the images. Owing to the mounting position and resolution of the cameras, camera-based geometric correction is often not very accurate. Moreover, in practical applications the limited field of view, high cost and inconvenient installation and adjustment of the cameras considerably increase the system cost and reduce the efficiency of automatic geometric correction. SUMMARY OF THE INVENTION  An embodiment of the present invention provides a method for correcting a projector of a multi-channel visual projection system, comprising:
determining a plurality of reference points in the image projected by the projector onto a screen, and obtaining the coordinates of each reference point in the world coordinate system;
obtaining the pixel coordinates of control points in the projection template, the control points corresponding respectively to the reference points;
computing a projection matrix of the projector from the coordinates of each reference point in the world coordinate system and the pixel coordinates of the control points; and
correcting the projector using the projection matrix. An embodiment of the present invention also provides an apparatus for correcting a projector of a multi-channel visual projection system, comprising:
means for determining a plurality of reference points in the image projected by the projector onto a screen and obtaining the coordinates of each reference point in the world coordinate system;
means for obtaining the pixel coordinates of control points in the projection template, the control points corresponding respectively to the reference points;
means for computing a projection matrix of the projector from the coordinates of each reference point in the world coordinate system and the pixel coordinates of the control points; and
means for correcting the projector using the projection matrix. The projector correction method and apparatus according to embodiments of the invention avoid the use of cameras and, in cooperation with a laser theodolite, calibrate the projector directly using the depth information of the screen. For each channel, a dozen or so simple operations such as mouse clicks suffice to obtain a high-precision, hardware-independent geometric correction grid and thereby correct the projector. Engineering application results show that the method is convenient to use and gives very good results. BRIEF DESCRIPTION OF THE DRAWINGS  In order to explain the technical solutions of the embodiments of the invention more clearly, the drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the invention, and a person of ordinary skill in the art can obtain other drawings from them without creative effort. In the drawings:
Figure 1 is a schematic diagram of manual correction according to the prior art;
Figure 2 is a schematic diagram of the screen, projectors and design eye point of a visual simulation system;
Figure 3a shows the projected image of projection channel No. 5 in Figure 2;
Figure 3b shows the view frustum corresponding to projection channel No. 5 in Figure 2;
Figure 4a shows the test grid of projection channel No. 5 in Figure 2;
Figure 4b shows the geometric-correction target image of projection channel No. 5 in Figure 2;
Figure 5 is a schematic diagram of the projection-template coordinate system and the template-plane coordinate system according to an exemplary embodiment of the invention;
Figure 6 is a schematic diagram of the projection relationship according to an exemplary embodiment of the invention;
Figure 7 is a schematic diagram of the projector coordinate system;
Figure 8 is a flow chart of a projector correction method according to an exemplary embodiment of the invention;
Figure 9 is a block diagram of a projector correction apparatus according to an exemplary embodiment of the invention;
Figure 10 shows the marked points of projection channels Nos. 2, 3 and 5 in the experiment;
Figures 11a and 11b show the test grid and the geometric-correction target grid of projection channel No. 2, respectively, and Figures 11c and 11d show the test grid and the geometric-correction target grid of projection channel No. 3, respectively;
Figures 12a-12c show the geometric correction grids of projection channels Nos. 2, 3 and 5, respectively;
Figures 13a and 13b show comparisons before and after the geometric correction of projection channels Nos. 2, 3 and 5; Figure 14 shows the images of projection channels Nos. 2, 3 and 5 without edge blending;
Figure 15 shows the images of projection channels Nos. 2, 3 and 5 with edge blending;
Figure 16 shows the correction and blending result over the whole dome; and
Figure 17 shows the optical edge-blending transparency distribution of channel No. 3. DETAILED DESCRIPTION
The technical solutions in the embodiments of the present invention will now be described clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by a person of ordinary skill in the art from these embodiments without creative effort fall within the scope of protection of the invention. The structure and operating principle of a visual projection system according to an embodiment of the invention are described first. A visual projection system according to an embodiment of the invention may include a screen, projectors and a projector correction apparatus, each projector having a corresponding design eye point and view frustum. Taking a five-channel dome visual simulation system as an example, Figure 2 is a schematic diagram of the relationship between the screen, the projectors, the design eye point and the view frustums. The system of Figure 2 comprises five projectors P1-P5. The "+" in the figure indicates the design eye point. The design eye point is an important part of a visual simulation system: in theory, for the projection site, only the image observed from that point is geometrically correct.
The "view frustum" of a projection channel is the viewing frustum used to observe the virtual scene from the design eye point for that channel; it may also be called the "light cone". Obviously, the view frustum of a projection channel is closely related to the position and size of the projected image of that channel. In theory, if the optical center of a projector is located at the design eye point and the light cone of the projector coincides exactly with the view frustum of the corresponding channel, the projected image needs no geometric correction, i.e. the projector does not have to be corrected; clearly this is difficult to achieve in practice. Taking projector No. 5 of the system in Figure 2 as an example, the projector position, the projector light cone and the projected image are shown in Figure 3a, and the view frustum of the channel is shown in Figure 3b. It should be noted that the view frustum of a projection channel and the projection screen do not have to correspond exactly to the area covered by the projected image; rather, pixel utilization should be considered comprehensively on the basis of the geometric correction result. For a projection channel, the test grid used for geometric image correction is the view of the gridded screen observed from the design eye point through the view frustum of that channel (in the manner shown in Figure 3b). The test grid of channel 5 of Figure 3 is shown in Figure 4a (taking a dome with a horizontal field of view of ±90° as an example). Geometric correction means moving the control points of the test grid projected on the screen (for example, the intersections of the meridians and parallels) to the correct positions that these points should occupy on the screen. In manual geometric correction, the correct positions of the control points on the screen are given by the reference points on the screen (as shown in Figure 1). For automatic geometric correction, the image coordinates of the "correct screen position" of each control point of the test image must be computed by some means; once these coordinates are determined, the geometric correction can be completed by simple texture mapping. From computer-vision theory, once the projector has been calibrated (corrected), the viewpoint can be moved to the optical center of the projector according to its external parameters, the line-of-sight direction and the up-vector can be determined, and the light cone of the projector can then be determined from its internal parameters. Observing the gridded screen in this way (in the manner of Figure 3a) yields the "map of correct screen positions" of the control points, called the "geometric-correction target image", as shown in Figure 4b. Geometric correction is the process of deforming Figure 4a into Figure 4b.
The operation of a projector can be regarded as the inverse of that of a camera: the screen is "photographed" with the calibrated projector, and the resulting image is the geometric-correction target image.
The above analysis shows that the key to automatic geometric correction is the correction of the projector. Since a projector can be regarded as a camera, the present invention uses the pinhole model to describe the projector. For the relevant technical background of the pinhole model, see "Computer Vision - Computational Theory and Algorithmic Foundations", Ma Songde et al., Science Press, October 2003, the contents of which are incorporated herein by reference and are not repeated here for brevity.
According to the pinhole model, each element of the template projected by the projector is called a pixel. A Cartesian coordinate system O_0-uv is defined on the template; the coordinates (u, v) of each pixel are, respectively, the column number and the row number of that pixel in the template. Since the projection-template coordinate system only indicates the column and row of a pixel in the digital template and does not express the physical position of the pixel in the template in physical units, a template-plane coordinate system O_1-xy expressed in physical units (for example, meters) must also be established, as shown in Figure 5.
In the x-y coordinate system the origin O_1 is defined at the intersection of the projector's optical axis with the template plane and is called the principal point. Ideally this point lies at the center of the template, but because of the projector's lens-shift function there is usually some deviation. If the coordinates of O_1 in the u-v coordinate system are (u_0, v_0) and the physical size of each pixel in the x-axis and y-axis directions is d_x, d_y, then the two coordinate systems are related as follows:
$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} 1/d_x & s & u_0 \\ 0 & 1/d_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} \qquad (1)$$
where s denotes the skew factor introduced because the coordinate axes of the projector template-plane coordinate system are not exactly orthogonal, and (x, y, 1) are the homogeneous coordinates of the point (x, y).
The geometry of projection by the projector can be represented by Figure 6, where the point O_c is called the optical center of the projector, the x_c axis and the y_c axis are parallel to the x axis and the y axis of the template-plane coordinate system, and the z_c axis is perpendicular to the template plane and is called the optical axis of the projector. The intersection of the optical axis with the template plane is the principal point O_1, and the Cartesian coordinate system formed by the point O_c and the axes x_c, y_c, z_c is called the projector coordinate system. O_cO_1 is the focal length f of the projector. The relationship between the projector coordinate system and the world coordinate system can be described by a rotation matrix R and a translation vector t. Thus the homogeneous coordinates (x_w, y_w, z_w, 1)^T of a projection point in the world coordinate system and its homogeneous coordinates in the projector coordinate system are related as follows:
$$\begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & t \\ 0^T & 1 \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} = M \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} \qquad (2)$$
where R is a 3x3 orthogonal rotation matrix, t is a 3-dimensional translation vector, 0 = (0, 0, 0)^T, and M denotes the transformation matrix between the two coordinate systems.
In the pinhole model, the line joining a point m in the image-template plane to the optical center of the projector intersects a spatial surface (i.e. the projection surface) at the projection point X, as shown in Figure 7.
The relationship between the projector coordinate system and the template-plane coordinate system is:
$$x = \frac{f\,x_c}{z_c}, \qquad y = \frac{f\,y_c}{z_c} \qquad (3)$$
where (x, y) are the coordinates of the point m in the template-plane coordinate system and (x_c, y_c, z_c) are the coordinates of the projected spatial point in the projector coordinate system. Using homogeneous coordinates and matrix form, (3) can be written as:
$$z_c \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix} \qquad (4)$$
Substituting (1) and (2) into (4) gives the relationship between the template-plane coordinate system and the world coordinate system:
$$z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f_u & \gamma & u_0 \\ 0 & f_v & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} R & t \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} = K \begin{bmatrix} R & t \end{bmatrix} X = P X \qquad (5)$$
where f_u = f/d_x, f_v = f/d_y and γ = f·s; [R t] is called the external parameter matrix of the projector, consisting of the rotation matrix and the translation vector; K is called the internal parameter matrix of the projector, (u_0, v_0) are the coordinates of the principal point, f_u and f_v are the scale factors along the image u axis and v axis respectively, and γ is a parameter describing the skew of the two coordinate axes of the template; P = K [R t] is a 3x4 matrix, commonly called the projection matrix. Equation (5) expresses the projection principle under the pinhole projector model.
From equation (5) it can be seen that, if the projection matrix P is known, the internal parameter matrix K and the external parameter R can be obtained by QR decomposition of the matrix, and the translation vector t can then be obtained, so that all parameters of the projector are available. Equation (5) also shows that:
Z_c m = P X   (6)
where m is a pixel point on the template plane and X is the corresponding projected spatial point. Therefore, if the pixel points on the template and their corresponding spatial points are known, constraints on the elements of the projection matrix P can be formed; if there are enough constraints, the projection matrix P can be solved and the internal and external parameters of the projector obtained. In theory, as long as the projected spatial points do not lie on the same twisted cubic surface, 6 pairs of pixel points and spatial points provide enough constraints to solve for the projection matrix.
Based on the above theoretical derivation, an embodiment of the present invention therefore proposes a method of correcting a projector. The basic idea is as follows: a number of points are marked, for example with a laser theodolite located at the center of the screen, on the screen area covered by the projected image, and the coordinates of these points in the world coordinate system are obtained, which gives the point set X in (6); for each point of X, for example by a click with the mouse of the corresponding image generator, the corresponding image-space point set m in (6) can be obtained. From the above discussion, the internal and external parameters of the projector can be computed accurately from X and m.
The external parameters of the projector may include: the projector optical center O (X_w, Y_w, Z_w), the optical-axis vector v (i.e. the representation of the z_c axis in the world coordinate system) and the up-vector u (i.e. the representation of the y_c axis in the world coordinate system); the internal parameters of the projector may include: the focal length f and the image-space coordinates (u_0, v_0) of the principal point. The light cone of the projector can be computed from f and (u_0, v_0).
From the above discussion, once the projection matrix has been obtained, the geometric-correction mapping of every point of the screen can be obtained, i.e. the mapping between the image coordinates of that point as observed from the design eye point through the design view frustum (called the test point) and its image coordinates as observed from the projector (called the target point). Geometric correction means "moving" the test points to the target points, regardless of how this is implemented. Once the projector has been corrected, the geometric correction can be completed through the mapping from the test grid to the target grid; this mapping can be realized with existing texture mapping. In applications, the projector correction can be implemented in software (texture mapping) or through the data interface of dedicated hardware (for example the Norwegian blending unit CompactUTM, whose data interface is a csv file).
When performing texture mapping, the interpolation can be reduced to linear interpolation by increasing the density of control points in the two grids.
A projector correction method for a multi-channel visual system according to an embodiment of the invention is described with reference to Figures 8 and 9. The system 100 of Figure 9 comprises a projector correction apparatus 110 and a projector 120. The projector correction apparatus 110 may include reference-point coordinate acquiring means 112, control-point coordinate acquiring means 114 and projector parameter computing means 116. First, in step S11, for one channel of the simulation system, a plurality of reference points are marked on the screen area covered by the projected image, for example with a laser theodolite located at the center of the screen. As is known to those skilled in the art, the points can also be marked in other ways, for example by the laser dot matrix used earlier in engineering practice, or by directly marking the screen surface with fluorescent material during screen manufacture. In theory, the number of points need only be greater than or equal to 6. The more reference points, the higher the complexity, but the simpler the subsequent interpolation; a suitable number of reference points can be chosen according to the actual situation. Then, in step S13, the coordinates of these reference points in the world coordinate system are acquired by the reference-point coordinate acquiring means 112, giving X in (6); the method of obtaining the coordinates of a reference point in the world coordinate system is well known in the art and is not repeated here. Next, in step S15, for each point of X, the image coordinates of the corresponding point are recorded, giving the image-space point set m (template-plane coordinates) in (6), i.e. the set of control points. For example, this step can be carried out by clicking the reference point in the projected image with a mouse, using an image generator connected to the projector. Next, in step S17, the projector parameter computing means 116 accurately computes the internal and external parameters of the projector according to (6); the external parameters of the projector may include the projector optical center (X_w, Y_w, Z_w), the optical-axis vector v (i.e. the representation of the z_c axis in the world coordinate system) and the up-vector u (i.e. the representation of the y_c axis in the world coordinate system), and the internal parameters of the projector may include the focal length f and the image-space coordinates (u_0, v_0) of the principal point.
As described above, the projection matrix of the projector can be obtained first, and the internal and external parameters are then obtained by QR decomposition of the matrix. In step S19, the light cone of the projector is determined from the internal parameters and the position and attitude of the projector are determined from the external parameters, thereby correcting the projector. Using this result, the screen is then "observed" with the corrected projector in the virtual scene to obtain the target image.
Then, in step S21, geometric correction of the image is completed on the basis of the test grid and the target image.
A person skilled in the art will appreciate that the above processing can be carried out in turn for each channel of the projection system. In one example, using 25x25 control points, the projectors of the five projection channels of the 2-meter-radius dome flight-simulation visual system shown in Figure 2 were calibrated, and the geometric correction results were written into, for example, 3D-perception's professional geometric correction hardware CompactUTM, so that the method of the invention is applied directly on dedicated hardware. It should be noted that, although the embodiments take a dome screen as an example, the geometric correction method of the invention is not limited thereto and is applicable to any curved screen of relatively large curvature. Figure 10 shows the reference points on site; for clarity only projection channels Nos. 2, 3 and 5 are shown, with 16 points marked for each channel. Table 1 shows the image-space point set m and the three-dimensional point set X of projector No. 5 of Figure 3.
Table 1: Marked points of projector No. 5 (image coordinates u, v and world coordinates X_w, Y_w, Z_w)
[Points 1-9 of Table 1 are contained in an image in the original publication and are not reproduced here.]
Point   u      v      X_w         Y_w         Z_w
10      522    659    -1.16419    1.419141    -0.79416
11      877    655    -0.54536    1.419141    -1.29947
12      1207   673     0.159645   1.3108      -1.5021
13      154    930    -1.79491    0.837319    -0.27787
14      517    934    -1.46847    0.837319    -1.06887
15      865    930    -0.76616    0.837319    -1.64678
16      1236   919    -0.00322    0.768054    -1.84664
Table 2 shows the calibration results for all projectors.
Table 2: Projector calibration results
[Table 2 is contained in an image in the original publication and is not reproduced here.]
Figure 11 shows the test grids and the geometric-correction target grids of channels 2 and 3; the corresponding grids of channel 5 are shown in Figure 4. Figure 12 shows the geometric correction grids of channels 2, 3 and 5. Figures 13a and 13b show the test grids before and after the geometric correction of channels 2, 3 and 5, respectively. Figure 14 shows the images of channels 2, 3 and 5 (without edge blending). Figure 15 shows the result after edge blending. Figure 16 shows the full-dome display of all five channels. The projector correction method according to the invention can also be applied to edge blending. Edge blending is another key technology of multi-channel visual simulation systems and includes electronic edge blending and optical edge blending. Electronic edge blending achieves the blending effect by controlling the pixel grey levels in the image overlap region; it is suitable for bright scenes, but cannot solve the problem of bright bands in the overlap region caused by projector light leakage in dark scenes.
In the prior art, optical edge blending places an optical filter in front of the projector lens and achieves edge blending by controlling the transmittance of the filter in the image overlap region; this method can solve the blending failure in dark scenes caused by light leakage. On a curved screen, and especially on a spherical screen, the image overlap region (which contains the edge-blending region) maps onto the filter as an irregular curve. Since the method of the embodiments of the invention calibrates the projector, the extent of the edge-blending region can be determined very accurately and, combined with the typical edge-blending calculation, i.e. equation (7), the shape and transmittance variation of the electronic or optical edge blend can be computed:
t = f(x; α, γ)   (7)
[The explicit form of equation (7) is given as an image in the original publication.]
where x is the position within the blend zone, t is the transparency at that position, and α and γ are control factors. Taking channel 3 as an example, the computed edge-blend transparency distribution is shown in Figure 17, in which white represents full transparency and darker shades represent lower transparency; here α = 1.5 and γ = 1.25.
Since the projection matrix has been obtained, the image of the screen as observed from the projector can be obtained, and the extent of the edge blend can be determined in that image; on a dome, for example, the range can be specified by the longitude and latitude of the edge-blending band, so that the transparency variation along the edge can be computed according to the general formula (equation (7)) and the edge-blend transparency distribution map obtained.
From Figure 17, electronic edge blending of the projection channel can be realized, or the optical edge-blending shape factor of the channel can be obtained. The present invention provides a projector calibration method and apparatus for a multi-channel visual projection system. The method avoids the use of cameras, greatly reduces the complexity of system installation and adjustment, achieves on-site projector calibration in a very convenient way, avoids interpolation by using dense control points, improves the accuracy of geometric correction, and can compute the transparency distribution map of the electronic or optical edge blend, thereby achieving precise electronic edge blending while providing a very accurate shape factor for optical edge blending.
The method and apparatus of the invention can also be applied to a non-camera self-positioning device, which can position itself on site in the visual simulation system from simple prior information that is input and can automatically generate the marked points, making the method fully automatic.
Many modifications and other embodiments of the invention described herein will come to mind to one skilled in the art having the benefit of the teachings presented in the foregoing description and the associated drawings. Therefore, it is to be understood that the invention is not limited to the specific embodiments disclosed, and that modifications and other embodiments are intended to be included within the scope defined by the appended claims. Although specific terms are employed herein, they are used in a descriptive sense only and not for purposes of limitation.

Claims

CLAIMS
1. A method for correcting a projector of a multi-channel visual simulation projection system, comprising:
determining a plurality of reference points in the image projected by the projector onto a screen, and obtaining the coordinates of each reference point in the world coordinate system;
obtaining the pixel coordinates of control points in the projection template, the control points corresponding respectively to the reference points;
computing a projection matrix of the projector from the coordinates of each reference point in the world coordinate system and the pixel coordinates of the control points; and
correcting the projector using the projection matrix.
2. The method according to claim 1, wherein the screen is a curved screen of relatively large curvature.
3. The method according to claim 2, wherein the screen is a spherical screen.
4. The method according to claim 1, wherein the projection matrix is computed according to:
Z_c m = P X
where Z_c is the homogeneous coordinate of the reference point in the projector coordinate system, m is the pixel coordinates of the control point, P is the projection matrix, and X is the coordinates of the reference point in the world coordinate system.
5. The method according to claim 4, wherein the elements of the projection matrix include internal parameters for determining the light cone of the projector and external parameters for determining the position and attitude of the projector, and the internal parameters and the external parameters are obtained by QR decomposition of the projection matrix.
6. The method according to claim 1, wherein the coordinates of each reference point in the world coordinate system are obtained by means of a laser theodolite located at the center of the screen.
7. The method according to claim 1, wherein edge-blending processing is performed on the projected images according to the projection matrix of each projector.
8. An apparatus for correcting a projector of a multi-channel visual simulation projection system, comprising:
means for determining a plurality of reference points in the image projected by the projector onto a screen and obtaining the coordinates of each reference point in the world coordinate system;
means for obtaining the pixel coordinates of control points in the projection template, the control points corresponding respectively to the reference points;
means for computing a projection matrix of the projector from the coordinates of each reference point in the world coordinate system and the pixel coordinates of the control points; and
means for correcting the projector using the projection matrix.
9. The apparatus according to claim 8, wherein the screen is a curved screen of relatively large curvature.
10. The apparatus according to claim 8, wherein the screen is a spherical screen.
11. The apparatus according to claim 8, wherein the projection matrix is computed according to:
Z_c m = P X
where Z_c is the homogeneous coordinate of the reference point in the projector coordinate system, m is the pixel coordinates of the control point, P is the projection matrix, and X is the coordinates of the reference point in the world coordinate system.
12. The apparatus according to claim 11, wherein the elements of the projection matrix include internal parameters for determining the light cone of the projector and external parameters for determining the position and attitude of the projector, and the internal parameters and the external parameters are obtained by QR decomposition of the projection matrix.
13. The apparatus according to claim 8, wherein the coordinates of each reference point in the world coordinate system are obtained by means of a laser theodolite located at the center of the screen.
14. The apparatus according to claim 8, further comprising means for performing edge-blending processing on the projected images according to the projection matrix of each projector.
PCT/CN2012/077538 2012-06-26 2012-06-26 Method and device for correcting projectors of a multi-channel visual projection system WO2014000159A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2012/077538 WO2014000159A1 (zh) 2012-06-26 2012-06-26 Method and device for correcting projectors of a multi-channel visual projection system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2012/077538 WO2014000159A1 (zh) 2012-06-26 2012-06-26 Method and device for correcting projectors of a multi-channel visual projection system

Publications (1)

Publication Number Publication Date
WO2014000159A1 true WO2014000159A1 (zh) 2014-01-03

Family

ID=49782030

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2012/077538 WO2014000159A1 (zh) 2012-06-26 2012-06-26 Method and device for correcting projectors of a multi-channel visual projection system

Country Status (1)

Country Link
WO (1) WO2014000159A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106846477A (zh) * 2017-02-10 2017-06-13 中国电建集团成都勘测设计研究院有限公司 一种编录野外地质影像的地质标记解译建模方法
US10009586B2 (en) 2016-11-11 2018-06-26 Christie Digital Systems Usa, Inc. System and method for projecting images on a marked surface
EP3539289A4 (en) * 2016-12-16 2020-11-11 CJ CGV Co., Ltd. PROCESS FOR IMAGE PROJECTION ON A CURVED PROJECTION AREA, AND ASSOCIATED PROJECTION SYSTEM

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008250487A (ja) * 2007-03-29 2008-10-16 Kyushu Institute Of Technology エッジ検出によるモデルマッチングを用いたカメラ校正方法
CN101344707A (zh) * 2008-01-09 2009-01-14 上海海事大学 自动多投影仪非线性几何校正与边缘融合方法
CN101572787A (zh) * 2009-01-04 2009-11-04 四川川大智胜软件股份有限公司 基于计算机视觉精密测量多投影视景自动几何校正和拼接方法
JP2010039778A (ja) * 2008-08-05 2010-02-18 Hitachi Computer Peripherals Co Ltd 次元削減方法、パターン認識用辞書生成装置、及びパターン認識装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008250487A (ja) * 2007-03-29 2008-10-16 Kyushu Institute Of Technology エッジ検出によるモデルマッチングを用いたカメラ校正方法
CN101344707A (zh) * 2008-01-09 2009-01-14 上海海事大学 自动多投影仪非线性几何校正与边缘融合方法
JP2010039778A (ja) * 2008-08-05 2010-02-18 Hitachi Computer Peripherals Co Ltd 次元削減方法、パターン認識用辞書生成装置、及びパターン認識装置
CN101572787A (zh) * 2009-01-04 2009-11-04 四川川大智胜软件股份有限公司 基于计算机视觉精密测量多投影视景自动几何校正和拼接方法

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10009586B2 (en) 2016-11-11 2018-06-26 Christie Digital Systems Usa, Inc. System and method for projecting images on a marked surface
EP3539289A4 (en) * 2016-12-16 2020-11-11 CJ CGV Co., Ltd. PROCESS FOR IMAGE PROJECTION ON A CURVED PROJECTION AREA, AND ASSOCIATED PROJECTION SYSTEM
CN106846477A (zh) * 2017-02-10 2017-06-13 中国电建集团成都勘测设计研究院有限公司 一种编录野外地质影像的地质标记解译建模方法
CN106846477B (zh) * 2017-02-10 2020-03-31 中国电建集团成都勘测设计研究院有限公司 一种编录野外地质影像的地质标记解译建模方法

Similar Documents

Publication Publication Date Title
US11269244B2 (en) System and method for calibrating a display system using manual and semi-manual techniques
CN109272478B (zh) 一种荧幕投影方法和装置及相关设备
CN110211043B (zh) 一种用于全景图像拼接的基于网格优化的配准方法
WO2018076154A1 (zh) 一种基于鱼眼摄像机空间位姿标定的全景视频生成方法
JP5999615B2 (ja) カメラ較正情報生成装置、カメラ較正情報生成方法およびカメラ較正情報生成プログラム
US9195121B2 (en) Markerless geometric registration of multiple projectors on extruded surfaces using an uncalibrated camera
US9892488B1 (en) Multi-camera frame stitching
Zhang et al. A universal and flexible theodolite-camera system for making accurate measurements over large volumes
CN110312111B (zh) 用于图像装置的自动校准的装置、系统和方法
CN105308503A (zh) 利用短程相机校准显示系统的系统和方法
CN105067011A (zh) 一种基于视觉标定及坐标转换的测量系统整体校准方法
CN110191326A (zh) 一种投影系统分辨率扩展方法、装置和投影系统
CN111062869B (zh) 一种面向曲面幕的多通道校正拼接的方法
CN110505468B (zh) 一种增强现实显示设备的测试标定及偏差修正方法
Yang et al. A calibration method for binocular stereo vision sensor with short-baseline based on 3D flexible control field
JP2017058843A (ja) 画像処理装置、画像処理方法および画像処理用プログラム
WO2013069555A1 (ja) 画像処理装置および方法、並びにプログラム
CN113298886B (zh) 一种投影仪的标定方法
CN110675484A (zh) 一种基于复眼相机的具有时空一致性的动态三维数字场景构建方法
CN114283203A (zh) 一种多相机系统的标定方法及系统
CN102628693A (zh) 一种用于摄像机主轴与激光束进行平行配准的方法
WO2014000159A1 (zh) 校正多通道视景投影系统的投影仪的方法及设备
Jones et al. Correction of geometric distortions and the impact of eye position in virtual reality displays
CN113052974A (zh) 物体三维表面的重建方法和装置
CN109323691B (zh) 一种定位系统以及定位方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12880046

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12880046

Country of ref document: EP

Kind code of ref document: A1