WO2020135746A1 - Naked-eye 3D helmet display and trapezoidal calibration method for its projected image - Google Patents

Naked-eye 3D helmet display and trapezoidal calibration method for its projected image

Info

Publication number
WO2020135746A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
coordinates
optical medium
distortion
camera
Prior art date
Application number
PCT/CN2019/129288
Other languages
English (en)
French (fr)
Inventor
张关平
Original Assignee
未来科技(襄阳)有限公司
Priority date
Filing date
Publication date
Priority claimed from CN201822272122.2U
Priority claimed from CN201811643000.8A
Application filed by 未来科技(襄阳)有限公司
Publication of WO2020135746A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details

Definitions

  • the present application relates to the technical field of image processing, in particular to a naked-eye 3D helmet display and a trapezoidal calibration method for projected images.
  • 3D helmets are popular with users because of their strong sense of immersion in applications such as movies and games, and are gradually entering people's daily lives.
  • the present application provides a naked-eye 3D helmet display and a trapezoidal calibration method for its projected image.
  • a naked-eye 3D helmet display including:
  • a projector that projects a monochrome original image onto the optical medium
  • a camera that photographs the original image projected on the optical medium to generate a sampled image
  • a processor that uses image-processing techniques to calculate the base angle parameter θ of the vertical trapezoid of the sampled image
  • a coordinate converter that samples the coordinates of each pixel of the original image, substitutes the coordinates and the base angle parameter θ into a preset formula, and generates an anti-distortion image by changing the coordinates;
  • the optical medium, the projector, and the camera are all embedded in the housing; the projector and the camera are both arranged opposite the optical medium; the processor is connected to the projector and the camera, respectively; and the coordinate converter is connected to the projector and the processor, respectively.
  • the orthographic projection of the camera onto the optical medium falls on the geometric center of the optical medium.
  • the coordinate converter uses bilinear interpolation to sample the coordinates and substitute the coordinates and the base angle parameter ⁇ into a preset formula to generate the anti-distortion image.
  • the preset formula is:
  • Y = y;
  • X = (x - w/2)*w*h/[(w - 2*h*tanθ)*h + 2*h*tanθ*y] + w/2;
  • where X and Y are the horizontal and vertical coordinates of a single pixel in the anti-distortion image, x and y are the horizontal and vertical coordinates of the corresponding pixel in the original image, h is the height of the anti-distortion image, and w is the number of pixels in the bottom row of the anti-distortion image, which is the same as the number of pixels in the bottom row of the original image (a code transcription of this formula is given at the end of this section).
  • the present application also provides a trapezoidal calibration method for projected images based on the naked-eye 3D helmet display described above, which includes the following steps:
  • Step S1: the projector projects a monochrome original image onto the optical medium;
  • Step S2: the camera photographs the original image projected on the optical medium to generate a sampled image;
  • Step S3: the processor calculates the base angle parameter θ of the vertical trapezoid of the sampled image using image-processing techniques;
  • Step S4: the coordinate converter samples the coordinates of each pixel of the original image, substitutes the coordinates and the base angle parameter θ into the preset formula, and generates an anti-distortion image by changing the coordinates.
  • the naked-eye 3D helmet display further includes a picture comparator; after the anti-distortion image is generated, the projector projects the anti-distortion image onto the optical medium, the camera takes a photograph to generate a calibration image, and the picture comparator compares the calibration image with the original image.
  • in Step S4, generating an anti-distortion image by changing the coordinates specifically includes:
  • the pixel coordinates (x, y) in the original image are changed by the preset formula to obtain the calibrated pixel coordinates (X, Y);
  • the bilinear interpolation method is used to sample the pixel values at the pixel coordinates (x, y) of the original image and fill the calibrated pixel coordinates (X, Y) to generate an anti-distortion image.
  • the optical medium is a half mirror.
  • the camera is a distortion-free camera.
  • the naked-eye 3D helmet display and the trapezoidal calibration method for its projected image provided by the present application calculate the base angle parameter θ of the sampled image, substitute the base angle parameter θ and the coordinates of each pixel of the original image into the preset formula, and finally generate the anti-distortion image by changing the coordinates, which effectively solves the problem of trapezoidal distortion in the projected image of the naked-eye 3D helmet display and improves the user experience.
  • FIG. 1 is a schematic diagram of connection of an optical medium, a projector, a camera, a processor and a coordinate converter in a naked-eye 3D helmet display provided by this application;
  • FIG. 2 is a schematic flowchart of a trapezoid calibration method for a projection image based on a naked-eye 3D helmet display provided by this application;
  • FIG. 3 is an image shape presented on the optical medium before calibration in the method for trapezoidal calibration of the projected image shown in FIG. 2;
  • FIG. 4 is an image shape presented on the optical medium after calibration in the trapezoidal calibration method of the projected image shown in FIG. 2.
  • the present application provides a naked-eye 3D helmet display, which includes a housing, an optical medium 20, a projector 30, a camera 40, a processor 50, a coordinate converter 60, and an image comparator.
  • the optical medium 20, the projector 30, and the camera 40 are all embedded in the housing, and the projector 30 and the camera 40 are all disposed opposite to the optical medium 20.
  • the projector 30 is used to obtain original image information and project a monochromatic original image on the optical medium 20, and the camera 40 is used to photograph the original image projected on the optical medium to generate a sample image.
  • the processor 50 is used to calculate the base angle parameter θ of the vertical trapezoid of the sampled image using image-processing techniques.
  • the coordinate converter 60 is used to sample the coordinates of each pixel of the original image, substitute the coordinates and the base angle parameter θ into a preset formula, and generate an anti-distortion image by changing the coordinates. After the anti-distortion image is generated, the projector 30 projects the anti-distortion image onto the optical medium 20, the camera 40 takes a picture to generate a calibration image, and the picture comparator compares the calibration image with the original image.
  • the processor 50 is connected to the projector 30 and the camera 40 respectively, and the coordinate converter 60 is connected to the projector 30 and the processor 50 respectively.
  • the picture comparator is connected to the camera 40 and the processor 50 respectively.
  • the connection method between the processor 50 and the projector 30 and the camera 40 may be an electrical connection or a communication connection.
  • similarly, the connections between the coordinate converter 60 and the projector 30 and the processor 50, and between the picture comparator and the camera 40 and the processor 50, may also be electrical connections or communication connections.
  • the method for trapezoidal calibration of the projected image of the naked-eye 3D helmet display includes the following steps:
  • Step S1: the projector 30 projects a monochrome original image onto the optical medium 20. In this embodiment, the original image is a solid-color image. Due to, for example, differences in the processing technique and parameters of the optical medium 20, the original image projected by the projector 30 undergoes vertical trapezoidal distortion on the optical medium 20, as shown in FIG. 3.
  • the optical medium 20 is a transparent or translucent display medium.
  • the optical medium 20 is a half mirror.
  • Step S2: the camera 40 photographs the image projected on the optical medium 20 to generate a sampled image. Preferably, the orthographic projection of the camera 40 onto the optical medium 20 falls on the geometric center of the optical medium 20. More preferably, the camera 40 is a distortion-free camera, so that the image captured by the camera 40 is exactly the image presented on the optical medium 20, without any distortion. That is to say, during the calibration process, the camera 40 is placed close to the human eye, so that the captured image is the image the user actually sees during use.
  • Step S3: the processor 50 uses image-processing techniques to calculate the base angle parameter θ of the vertical trapezoid of the sampled image;
  • Step S4: the coordinate converter 60 samples the coordinates of each pixel of the original image, substitutes the coordinates and the base angle parameter θ into the preset formula, and generates an anti-distortion image by changing the coordinates.
  • specifically, the pixel coordinates (x, y) in the original image are first changed through the preset formula to obtain the calibrated pixel coordinates (X, Y), where the preset formula is:
  • Y = y; (0 <= y <= h)
  • X = (x - w/2)*w*h/[(w - 2*h*tanθ)*h + 2*h*tanθ*y] + w/2; (0 <= x <= w)
  • here the number of pixels in the bottom row of the anti-distortion image equals the number of pixels in the bottom row of the original image, both being w; the height of the anti-distortion image is h; the horizontal and vertical coordinates of a single pixel in the anti-distortion image are X and Y, respectively; and the horizontal and vertical coordinates of a single pixel in the original image are x and y, respectively.
  • bilinear interpolation is then used to sample the pixel value at the pixel coordinates (x, y) of the original image and fill it in at the calibrated pixel coordinates (X, Y) to generate an anti-distortion image.
  • Step S5: after the anti-distortion image is generated, the projector 30 projects the anti-distortion image onto the optical medium 20, as shown in FIG. 4; the camera 40 takes a photograph to generate a calibration image, and the picture comparator compares the calibration image with the original image.
  • the picture comparator is a Texas Instruments DSP chip model TMSC2000.
  • the naked-eye 3D helmet display and the trapezoidal calibration method for its projected image provided by the present application calculate the base angle parameter θ of the sampled image, substitute the base angle parameter θ and the coordinates of each pixel of the original image into the preset formula, and finally generate the anti-distortion image by changing the coordinates, which effectively solves the problem of trapezoidal distortion in the projected image of the naked-eye 3D helmet display and improves the user experience.
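For reference, the preset formula above translates directly into code. The following sketch is illustrative only (the function name, the use of radians for θ, and the floating-point return values are assumptions, not part of the application); it simply maps an original-image pixel coordinate (x, y) to the calibrated coordinate (X, Y):

```python
import math

def keystone_map(x, y, w, h, theta):
    """Literal transcription of the preset formula.

    x, y   -- pixel coordinates in the original image
    w      -- number of pixels in the bottom row (same for both images)
    h      -- height of the anti-distortion image
    theta  -- base angle parameter of the vertical trapezoid, in radians
    Returns the calibrated coordinates (X, Y) in the anti-distortion image.
    """
    t = math.tan(theta)
    Y = y
    X = (x - w / 2) * w * h / ((w - 2 * h * t) * h + 2 * h * t * y) + w / 2
    return X, Y
```

One property worth noting: at the bottom row (y = h) the denominator reduces to w*h, so X = x and the bottom row is left unchanged, which is consistent with the requirement that the anti-distortion image and the original image share the same bottom-row pixel count.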

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Projection Apparatus (AREA)

Abstract

A naked-eye 3D helmet display, comprising: a housing; an optical medium (20); a projector (30) that projects a monochrome original image onto the optical medium (20); a camera (40) that photographs the image projected on the optical medium (20) to generate a sampled image; a processor (50) that uses image-processing techniques to calculate the base angle parameter θ of the vertical trapezoid of the sampled image; and a coordinate converter (60) that samples the coordinates of each pixel of the original image, substitutes the coordinates and the base angle parameter θ into a preset formula, and generates an anti-distortion image by changing the coordinates, so as to resolve the trapezoidal distortion of the projected image. A trapezoidal calibration method for projected images based on the naked-eye 3D helmet display is also provided.

Description

Naked-eye 3D helmet display and trapezoidal calibration method for its projected image
Technical Field
The present application relates to the technical field of image processing, and in particular to a naked-eye 3D helmet display and a trapezoidal calibration method for its projected image.
Background Art
With the development of technology, the range of applications of projection equipment keeps expanding, gradually extending from commercial and office use to education and home life. As an emerging product of recent years, the 3D helmet is popular with users because of its strong sense of immersion in applications such as movies and games, and is gradually entering people's daily lives.
Inevitably, however, a 3D helmet exhibits more or less trapezoidal distortion of the projected image during imaging, which seriously affects the user's movie-watching and gaming experience. Research has found that the main cause of trapezoidal image distortion is that the projection device is not perpendicular to the projection background, so that the formed image differs considerably from the target image.
To address the problem of image distortion, the vast majority of mainstream 3D helmets currently on the market are binocular helmets, that is, helmets with two displays: the left eye sees the image given by the left display and the right eye sees the image given by the right display, as in VR helmets. Naked-eye 3D helmets in the related art often require complex instruments and equipment for calibration; as a result, on the one hand, overly complex instruments cannot be fitted into the cramped space of a helmet, and on the other hand, the calibration cost increases and the calibration process is rather complicated. If the user's subjective judgment is used to decide whether the calibrated result matches the target image, the calibration results will also vary enormously, because each person's observation ability and judgment differ.
It is therefore necessary to provide a naked-eye 3D helmet display and a trapezoidal calibration method for its projected image to solve the above problems.
Technical Problem
In view of the above defects of, or needs for improvement in, the prior art, the present application provides a naked-eye 3D helmet display and a trapezoidal calibration method for its projected image.
Technical Solution
A naked-eye 3D helmet display, comprising:
a housing;
an optical medium;
a projector, which projects a monochrome original image onto the optical medium;
a camera, which photographs the original image projected on the optical medium to generate a sampled image;
a processor, which uses image-processing techniques to calculate the base angle parameter θ of the vertical trapezoid of the sampled image; and
a coordinate converter, which samples the coordinates of each pixel of the original image, substitutes the coordinates and the base angle parameter θ into a preset formula, and generates an anti-distortion image by changing the coordinates;
wherein the optical medium, the projector, and the camera are all embedded in the housing; the projector and the camera are both arranged opposite the optical medium; the processor is connected to the projector and the camera, respectively; and the coordinate converter is connected to the projector and the processor, respectively.
Preferably, the orthographic projection of the camera onto the optical medium falls on the geometric center of the optical medium.
Preferably, the coordinate converter uses bilinear interpolation to sample the coordinates, and substitutes the coordinates and the base angle parameter θ into the preset formula to generate the anti-distortion image.
Preferably, the preset formula is:
Y=y;
X=(x-w/2)*w*h/[(w-2*h*tanθ)*h+2*h*tanθ*y]+w/2;
where X and Y are the horizontal and vertical coordinates of a single pixel in the anti-distortion image, x and y are the horizontal and vertical coordinates of a single pixel in the original image, h is the height of the anti-distortion image, and w is the number of pixels in the bottom row of the anti-distortion image, which is the same as the number of pixels in the bottom row of the original image.
The present application also provides a trapezoidal calibration method for projected images based on the above naked-eye 3D helmet display, comprising the following steps:
Step S1: the projector projects a monochrome original image onto the optical medium;
Step S2: the camera photographs the original image projected on the optical medium to generate a sampled image;
Step S3: the processor uses image-processing techniques to calculate the base angle parameter θ of the vertical trapezoid of the sampled image;
Step S4: the coordinate converter samples the coordinates of each pixel of the original image, substitutes the coordinates and the base angle parameter θ into the preset formula, and generates an anti-distortion image by changing the coordinates.
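Steps S1 to S4 amount to a single measure-then-prewarp flow. The outline below is only a sketch: the callables project, capture, estimate_theta, and remap are hypothetical stand-ins for the projector, camera, processor, and coordinate-converter interfaces, which the application does not define.

```python
def run_calibration(project, capture, estimate_theta, remap, original):
    """Outline of Steps S1-S4; every callable is a hypothetical stand-in.

    project(image)        -- S1: the projector projects the monochrome original image
    capture()             -- S2: the camera photographs the projection (sampled image)
    estimate_theta(image) -- S3: the processor computes the base angle parameter theta
    remap(image, theta)   -- S4: the coordinate converter builds the anti-distortion image
    """
    project(original)                # S1: project the monochrome original image
    sampled = capture()              # S2: photograph it on the optical medium
    theta = estimate_theta(sampled)  # S3: base angle of the vertical trapezoid
    return remap(original, theta)    # S4: generate the anti-distortion image
```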
Preferably, the naked-eye 3D helmet display further includes a picture comparator; after the anti-distortion image is generated, the projector projects the anti-distortion image onto the optical medium, and the image photographed by the camera is compared with the original image.
Preferably, in Step S4, generating the anti-distortion image by changing the coordinates specifically includes:
changing the pixel coordinates (x, y) in the original image through the preset formula to obtain calibrated pixel coordinates (X, Y); and
using bilinear interpolation to sample the pixel value at the pixel coordinates (x, y) of the original image and fill it in at the calibrated pixel coordinates (X, Y) to generate the anti-distortion image.
Preferably, the optical medium is a half mirror.
Preferably, the camera is a distortion-free camera.
Beneficial Effects
Compared with the related art, the naked-eye 3D helmet display and the trapezoidal calibration method for its projected image provided by the present application calculate the base angle parameter θ of the sampled image, substitute the base angle parameter θ and the coordinates of each pixel of the original image into the preset formula, and finally generate the anti-distortion image by changing the coordinates, which effectively solves the problem of trapezoidal distortion in the projected image of the naked-eye 3D helmet display and improves the user experience.
Brief Description of the Drawings
In order to explain the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and a person of ordinary skill in the art can obtain other drawings from them without creative effort, in which:
FIG. 1 is a schematic diagram of the connections among the optical medium, the projector, the camera, the processor, and the coordinate converter in the naked-eye 3D helmet display provided by the present application;
FIG. 2 is a schematic flowchart of the trapezoidal calibration method for projected images based on the naked-eye 3D helmet display provided by the present application;
FIG. 3 shows the image shape presented on the optical medium before calibration in the trapezoidal calibration method shown in FIG. 2; and
FIG. 4 shows the image shape presented on the optical medium after calibration in the trapezoidal calibration method shown in FIG. 2.
Best Mode for Carrying Out the Invention
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. Based on the embodiments of the present application, all other embodiments obtained by a person of ordinary skill in the art without creative effort fall within the scope of protection of the present application.
Referring to FIG. 1, the present application provides a naked-eye 3D helmet display, which includes a housing, an optical medium 20, a projector 30, a camera 40, a processor 50, a coordinate converter 60, and a picture comparator.
The optical medium 20, the projector 30, and the camera 40 are all embedded in the housing, and the projector 30 and the camera 40 are both arranged opposite the optical medium 20. The projector 30 is used to obtain original image information and project a monochrome original image onto the optical medium 20, and the camera 40 is used to photograph the original image projected on the optical medium to generate a sampled image. The processor 50 is used to calculate, using image-processing techniques, the base angle parameter θ of the vertical trapezoid of the sampled image. The coordinate converter 60 is used to sample the coordinates of each pixel of the original image, substitute the coordinates and the base angle parameter θ into the preset formula, and generate an anti-distortion image by changing the coordinates. After the anti-distortion image is generated, the projector 30 projects the anti-distortion image onto the optical medium 20, the camera 40 takes a photograph to generate a calibration image, and the picture comparator compares the calibration image with the original image.
The processor 50 is connected to the projector 30 and the camera 40, respectively, and the coordinate converter 60 is connected to the projector 30 and the processor 50, respectively. The picture comparator is connected to the camera 40 and the processor 50, respectively. The connections between the processor 50 and the projector 30 and the camera 40 may be electrical connections or communication connections; similarly, the connections between the coordinate converter 60 and the projector 30 and the processor 50, and between the picture comparator and the camera 40 and the processor 50, may also be electrical connections or communication connections.
Specifically, referring to FIG. 2, the trapezoidal calibration method for the projected image of the naked-eye 3D helmet display includes the following steps:
Step S1: the projector 30 projects a monochrome original image onto the optical medium 20. In this embodiment, the original image is a solid-color image. Due to, for example, differences in the processing technique and parameters of the optical medium 20, the original image projected by the projector 30 undergoes vertical trapezoidal distortion on the optical medium 20, as shown in FIG. 3.
The optical medium 20 is a transparent or translucent display medium. Preferably, in this embodiment, the optical medium 20 is a half mirror.
Step S2: the camera 40 photographs the image projected on the optical medium 20 to generate a sampled image. Preferably, the orthographic projection of the camera 40 onto the optical medium 20 falls on the geometric center of the optical medium 20. More preferably, the camera 40 is a distortion-free camera, so that the image captured by the camera 40 is exactly the image presented on the optical medium 20, without any distortion; that is to say, during the calibration process, the camera 40 is placed close to the human eye, so that the captured image is the image the user actually sees during use.
Step S3: the processor 50 uses image-processing techniques to calculate the base angle parameter θ of the vertical trapezoid of the sampled image.
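The application does not spell out the image-processing steps used in Step S3 to obtain θ. The sketch below shows one possible minimal approach, assuming the sampled image is a grayscale array in which the projected region appears as the bright area; the threshold value, the function name, and the convention that the tangent of θ equals the per-side horizontal inset divided by the trapezoid height (chosen so that the result can be plugged into the preset formula) are all assumptions rather than part of this application.

```python
import numpy as np

def estimate_theta(sample, thresh=32):
    """Estimate the base angle parameter theta from a sampled image.

    `sample` is a 2-D grayscale array; pixels brighter than `thresh` are
    treated as the projected region.  The angle is returned in radians,
    with tan(theta) = (per-side horizontal inset) / (trapezoid height).
    """
    mask = sample > thresh                          # bright projected region
    rows = np.where(mask.any(axis=1))[0]            # rows containing the region
    top, bottom = rows[0], rows[-1]
    height = max(bottom - top, 1)                   # trapezoid height in pixels
    width_top = np.count_nonzero(mask[top])         # width of the top edge
    width_bottom = np.count_nonzero(mask[bottom])   # width of the bottom edge
    inset = abs(width_bottom - width_top) / 2.0     # per-side horizontal shrink
    return float(np.arctan2(inset, height))         # theta in radians
```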
Step S4: the coordinate converter 60 samples the coordinates of each pixel of the original image, substitutes the coordinates and the base angle parameter θ into the preset formula, and generates an anti-distortion image by changing the coordinates.
Specifically, the pixel coordinates (x, y) in the original image are first changed through the preset formula to obtain the calibrated pixel coordinates (X, Y), where the preset formula is:
Let the number of pixels in the bottom row of the anti-distortion image equal the number of pixels in the bottom row of the original image, both being w; let the height of the anti-distortion image be h; let the horizontal and vertical coordinates of a single pixel in the anti-distortion image be X and Y, respectively; and let the horizontal and vertical coordinates of a single pixel in the original image be x and y, respectively. Then:
Y=y; (0<=y<=h)
X=(x-w/2)*w*h/[(w-2*h*tanθ)*h+2*h*tanθ*y]+w/2    (0<=x<=w)
Bilinear interpolation is then used to sample the pixel value at the pixel coordinates (x, y) of the original image and fill it in at the calibrated pixel coordinates (X, Y) to generate the anti-distortion image.
Suppose the anti-distortion image is mapped by the coordinate transformation to floating-point coordinates (i+u, j+v) in the original image, with X = i + u and Y = j + v, where u and v are floating-point numbers in the interval [0, 1]. Then the pixel value f(X, Y) of the anti-distortion image is determined by the original-image pixels at coordinates (i, j), (i, j+1), (i+1, j), and (i+1, j+1): f(X, Y) = (1-u)*(1-v)*f(i, j) + (1-u)*v*f(i, j+1) + u*(1-v)*f(i+1, j) + u*v*f(i+1, j+1), where f(i, j) is the pixel value at coordinates (i, j).
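The interpolation rule above can be transcribed directly. In the sketch below, the array-indexing convention (img[row, column], so that f(i, j) corresponds to img[j, i] for horizontal coordinate i and vertical coordinate j) and the border clamping are assumptions about how the image is stored; the weighting itself follows the formula as written.

```python
import numpy as np

def bilinear_sample(img, X, Y):
    """Bilinear interpolation as given in the description:
    X = i + u, Y = j + v with u, v in [0, 1), and
    f(X, Y) = (1-u)(1-v) f(i, j) + (1-u) v f(i, j+1)
            + u (1-v) f(i+1, j) + u v f(i+1, j+1).

    `img` is a 2-D grayscale array indexed as img[row, column];
    X is the horizontal coordinate and Y the vertical coordinate.
    """
    i, j = int(np.floor(X)), int(np.floor(Y))
    u, v = X - i, Y - j
    i1 = min(i + 1, img.shape[1] - 1)    # clamp the right neighbour at the border
    j1 = min(j + 1, img.shape[0] - 1)    # clamp the lower neighbour at the border
    return ((1 - u) * (1 - v) * img[j, i] + (1 - u) * v * img[j1, i]
            + u * (1 - v) * img[j, i1] + u * v * img[j1, i1])
```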
By calculating the base angle parameter θ of the sampled image, substituting the base angle parameter θ and the coordinates of each pixel of the original image into the preset formula, and finally generating the anti-distortion image by changing the coordinates, the problem of trapezoidal distortion in the projected image of the naked-eye 3D helmet display is effectively solved and the user experience is improved.
Step S5: after the anti-distortion image is generated, the projector 30 projects the anti-distortion image onto the optical medium 20, as shown in FIG. 4; the camera 40 takes a photograph to generate a calibration image, and the picture comparator compares the calibration image with the original image.
By providing the picture comparator, if the calibration image still differs obviously from the original image after the comparison by the picture comparator, the calibration is judged to have failed and the comparison is restarted; if there is no obvious difference between the calibration image and the original image, the calibration is judged to be successful and the calibration is complete.
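The application only requires the picture comparator to decide whether the calibration image still shows an obvious difference from the original image; it does not specify a metric. A minimal sketch using the mean absolute pixel difference against a tolerance is shown below; both the metric and the tolerance value are assumptions, not part of the application.

```python
import numpy as np

def calibration_ok(original, calibration, tol=8.0):
    """Return True when the calibration image shows no obvious difference
    from the original image (mean absolute pixel difference <= tol).
    Both images are expected to be 2-D grayscale arrays of the same shape.
    """
    diff = np.abs(original.astype(np.float64) - calibration.astype(np.float64))
    return float(diff.mean()) <= tol
```

In the flow described above, a False result corresponds to a failed calibration that is re-run, and a True result ends the calibration.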
In this embodiment, the picture comparator is a Texas Instruments DSP chip, model TMSC2000.
Compared with the related art, the naked-eye 3D helmet display and the trapezoidal calibration method for its projected image provided by the present application calculate the base angle parameter θ of the sampled image, substitute the base angle parameter θ and the coordinates of each pixel of the original image into the preset formula, and finally generate the anti-distortion image by changing the coordinates, which effectively solves the problem of trapezoidal distortion in the projected image of the naked-eye 3D helmet display and improves the user experience.
The above is only an embodiment of the present application. It should be pointed out that a person of ordinary skill in the art can make further improvements without departing from the inventive concept of the present application, and all such improvements fall within the scope of protection of the present application.
 

Claims (9)

  1. A naked-eye 3D helmet display, comprising:
    a housing;
    an optical medium;
    a projector, which projects a monochrome original image onto the optical medium;
    a camera, which photographs the original image projected on the optical medium to generate a sampled image;
    a processor, which uses image-processing techniques to calculate the base angle parameter θ of the vertical trapezoid of the sampled image; characterized by further comprising:
    a coordinate converter, which samples the coordinates of each pixel of the original image, substitutes the coordinates and the base angle parameter θ into a preset formula, and generates an anti-distortion image by changing the coordinates;
    wherein the optical medium, the projector, and the camera are all embedded in the housing; the projector and the camera are both arranged opposite the optical medium; the processor is connected to the projector and the camera, respectively; and the coordinate converter is connected to the projector and the processor, respectively.
  2. The naked-eye 3D helmet display according to claim 1, characterized in that the orthographic projection of the camera onto the optical medium falls on the geometric center of the optical medium.
  3. The naked-eye 3D helmet display according to claim 1, characterized in that the coordinate converter uses bilinear interpolation to sample the coordinates, and substitutes the coordinates and the base angle parameter θ into the preset formula to generate the anti-distortion image.
  4. The naked-eye 3D helmet display according to claim 3, characterized in that the preset formula is:
    Y=y;
    X=(x-w/2)*w*h/[(w-2*h*tanθ)*h+2*h*tanθ*y]+w/2;
    wherein X and Y are the horizontal and vertical coordinates of a single pixel in the anti-distortion image, x and y are the horizontal and vertical coordinates of a single pixel in the original image, h is the height of the anti-distortion image, and w is the number of pixels in the bottom row of the anti-distortion image, which is the same as the number of pixels in the bottom row of the original image.
  5. A trapezoidal calibration method for projected images based on the naked-eye 3D helmet display according to any one of claims 1 to 4, characterized by comprising the following steps:
    Step S1: the projector projects a monochrome original image onto the optical medium;
    Step S2: the camera photographs the original image projected on the optical medium to generate a sampled image;
    Step S3: the processor uses image-processing techniques to calculate the base angle parameter θ of the vertical trapezoid of the sampled image;
    Step S4: the coordinate converter samples the coordinates of each pixel of the original image, substitutes the coordinates and the base angle parameter θ into the preset formula, and generates an anti-distortion image by changing the coordinates.
  6. The trapezoidal calibration method for projected images according to claim 5, characterized in that the naked-eye 3D helmet display further comprises a picture comparator; after the anti-distortion image is generated, the projector projects the anti-distortion image onto the optical medium, the camera takes a photograph to generate a calibration image, and the picture comparator compares the calibration image with the original image.
  7. The trapezoidal calibration method for projected images according to claim 5, characterized in that, in Step S4, generating the anti-distortion image by changing the coordinates specifically comprises:
    changing the pixel coordinates (x, y) in the original image through the preset formula to obtain calibrated pixel coordinates (X, Y); and
    using bilinear interpolation to sample the pixel value at the pixel coordinates (x, y) of the original image and fill it in at the calibrated pixel coordinates (X, Y) to generate the anti-distortion image.
  8. The trapezoidal calibration method for projected images according to claim 5, characterized in that the optical medium is a half mirror.
  9. The trapezoidal calibration method for projected images according to claim 5, characterized in that the camera is a distortion-free camera.
     
PCT/CN2019/129288 2018-12-29 2019-12-27 Naked-eye 3D helmet display and trapezoidal calibration method for its projected image WO2020135746A1 (zh)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201822272122.2U CN209433126U (zh) 2018-12-29 2018-12-29 Naked-eye 3D helmet display
CN201811643000.8 2018-12-29
CN201811643000.8A CN109541808B (zh) 2018-12-29 2018-12-29 Trapezoidal calibration method for projected images based on a naked-eye 3D helmet display, and 3D helmet
CN201822272122.2 2018-12-29

Publications (1)

Publication Number Publication Date
WO2020135746A1 (zh)

Family

ID=71127735

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/129288 WO2020135746A1 (zh) Naked-eye 3D helmet display and trapezoidal calibration method for its projected image

Country Status (1)

Country Link
WO (1) WO2020135746A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1391778A1 (en) * 2002-08-08 2004-02-25 Seiko Precision Inc. Apparatus for detecting the inclination angle of a projection screen and projector comprising the same
US20080204670A1 (en) * 2007-02-23 2008-08-28 Seiko Epson Corporation Projector, projected image adjustment method, and program
CN102158673A (zh) * 2010-02-11 2011-08-17 财团法人工业技术研究院 Projection correction system and method
CN107493463A (zh) * 2017-09-19 2017-12-19 歌尔股份有限公司 Projector keystone correction method and system
CN109541808A (zh) * 2018-12-29 2019-03-29 未来科技(襄阳)有限公司 Trapezoidal calibration method for projected images based on a naked-eye 3D helmet display, and 3D helmet
CN209433126U (zh) * 2018-12-29 2019-09-24 未来科技(襄阳)有限公司 Naked-eye 3D helmet display


Similar Documents

Publication Publication Date Title
US11076142B2 (en) Real-time aliasing rendering method for 3D VR video and virtual three-dimensional scene
CN106162203B (zh) Panoramic video playback method, player, and head-mounted virtual reality device
TWI479452B (zh) Device and method for correcting digital images
US20150358539A1 (en) Mobile Virtual Reality Camera, Method, And System
US20120105611A1 (en) Stereoscopic image processing method and apparatus
JP3749227B2 (ja) Stereoscopic image processing method and apparatus
JP3857988B2 (ja) Stereoscopic image processing method and apparatus
JP2004221700A (ja) Stereoscopic image processing method and apparatus
JP7074056B2 (ja) Image processing device, image processing system, image processing method, and program
CN107368192A (zh) Real-scene observation method for VR glasses, and VR glasses
WO2022262839A1 (zh) Stereoscopic display method, apparatus, medium, and system for a live performance
JP2016212351A (ja) Head-mounted display, information processing device, information processing system, and content data output method
WO2012136002A1 (zh) Method, apparatus, system, television, and stereoscopic glasses for adjusting a stereoscopic image
CN107197135A (зh) Video generation method, playback method, video generation device, and playback device
CN109541808B (zh) Trapezoidal calibration method for projected images based on a naked-eye 3D helmet display, and 3D helmet
CN113112407B (zh) Television-based method, system, device, and medium for generating a mirror-viewing field of view
JP2004221699A (ja) Stereoscopic image processing method and apparatus
EP3047882A1 (en) Method and device for displaying image
WO2020135746A1 (zh) Naked-eye 3D helmet display and trapezoidal calibration method for its projected image
WO2023036218A1 (zh) Method and apparatus for determining viewpoint width
GB2546273A (en) Detection system
JP2001245322A (ja) Stereoscopic image input/output method and apparatus
TW201916682A Real-time 2D-to-3D image processing method
CN111630848B (zh) Image processing device, image processing method, program, and projection system
JP2013134599A (ja) Position coordinate detection device, position coordinate detection method, and electronic apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19906516

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19906516

Country of ref document: EP

Kind code of ref document: A1
