WO2023165223A1 - Measurement method and device for a near-eye display - Google Patents

Measurement method and device for a near-eye display

Info

Publication number
WO2023165223A1
Authority
WO
WIPO (PCT)
Prior art keywords
near-eye display
to be tested
area array photoelectric sensor
Prior art date
Application number
PCT/CN2022/140316
Other languages
English (en)
French (fr)
Inventor
潘建根
宋立
Original Assignee
杭州远方光电信息股份有限公司
Priority date
Filing date
Publication date
Application filed by 杭州远方光电信息股份有限公司
Publication of WO2023165223A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00 Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M11/02 Testing optical properties
    • G01M11/0207 Details of measuring devices

Definitions

  • the invention relates to the field of testing optical characteristics of near-eye displays, in particular to a measuring method and device for near-eye displays.
  • Near-eye displays, also known as head-mounted displays or virtual displays, are mainly divided into virtual reality (VR) and augmented reality (AR) types.
  • the display principle of the near-eye display is shown in FIG. 2 .
  • 1 is the near-eye display to be tested
  • 2 is the object plane of the near-eye display to be tested
  • 3 is the optical axis of the near-eye display to be tested
  • 4 is the exit pupil
  • 9 is the eye box
  • 10 is the eye point
  • the eye point is located at the center of the exit pupil; the axis passing through the eye point and perpendicular to the plane of the exit pupil is the optical axis of the near-eye display to be tested.
  • the basic measurement geometry for the display optical characteristics of a near-eye display requires that the optical axis of the measuring device coincide with the optical axis of the near-eye display to be tested, and that the entrance pupil of the measuring device's lens be placed at the eye point of the near-eye display to be tested.
  • the determination of the exit pupil, eye point and optical axis of the near-eye display to be tested is crucial to the measurement of the optical properties of the near-eye display.
  • the spatial region in the image space of the near-eye display within which a complete image can be observed is called the eye box of the near-eye display (area 9 in Figure 2). The exit pupil, eye point and optical axis of the near-eye display are determined mainly by measuring the eye box.
  • the largest cross-section of the eye box is the exit pupil, its center point is taken directly as the eye point, and the optical axis is obtained from the eye point and the exit pupil. The key to this method is therefore measuring the size of the eye box.
  • the existing eye box measurement method uses an aiming-point luminance meter with a precision optical lens: the meter measures the image displayed by the near-eye display from different positions, and the eye box is determined from the luminance change at the image boundary.
  • specifically, the entrance pupil of the aiming-point luminance meter is first placed on the optical axis of the display; the meter is then aligned with the boundary of the displayed image and the boundary luminance is measured; the meter is then moved and the measurement repeated.
  • the existing optical axis alignment method mainly aligns the central cross of the measuring imaging system with the central cross of the output image of the near-eye display to be tested, and then treats the two optical axes as aligned. In fact, the crosses can still coincide when there is an angular misalignment, so this method has a large error.
  • the present invention provides a method and device for measuring a near-eye display, aiming to solve the prior-art problems of large measurement errors in the exit pupil, eye point and optical axis of a near-eye display, complex and expensive equipment, complicated operation, and low efficiency.
  • the invention provides a method for measuring a near-eye display, which uses an area array photoelectric sensor without an imaging lens and a transmission device to obtain the exit pupil parameters of the near-eye display to be tested. Specifically: the near-eye display to be tested outputs a picture, and the area array photoelectric sensor is placed in the human-eye viewing area of the near-eye display to be tested (this area is also called the eye box of the near-eye display).
  • the area array photoelectric sensor directly receives the light output signal from the near-eye display to be tested and obtains an illumination distribution image.
  • the exit pupil is the common exit of the imaging beams from every point of the near-eye display: all outgoing light of the output image passes through it, so every point on the exit pupil contains the illumination information of the entire output image.
  • the illumination distribution at the exit pupil is therefore relatively uniform with obvious boundaries. If the area array photoelectric sensor is located at the exit pupil of the near-eye display and its sensitive area completely covers the exit pupil, the sensor receives an exit pupil image with relatively uniform illumination and clear boundaries; at that position, the area enclosed by the clearest image boundary is also the smallest compared with the spot images received when the sensor is at any other position.
  • the transmission device moves the area array photoelectric sensor or the near-eye display to be tested so that the two move relative to each other; the sensor obtains illumination distribution images at different spatial positions, which are analyzed with an image boundary recognition algorithm. When the obtained image has the clearest boundary, its area is also the smallest relative to other positions, and the plane of the area array photoelectric sensor is then the plane of the exit pupil of the near-eye display to be tested.
  • the illuminated area obtained by the area array photoelectric sensor at this position is the exit pupil of the near-eye display to be tested.
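The boundary-clarity search described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the variance of a discrete Laplacian (a common focus measure) stands in for the unspecified boundary recognition algorithm, and `images_by_z` is a hypothetical mapping from candidate sensor positions to captured illumination images.

```python
import numpy as np

def boundary_sharpness(image: np.ndarray) -> float:
    """Variance of a discrete Laplacian: a common focus measure that
    grows as the boundary of the illumination image gets crisper."""
    img = image.astype(float)
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return float(lap.var())

def find_exit_pupil_plane(images_by_z: dict) -> float:
    """Pick the sensor position whose image boundary is sharpest; per
    the text, that plane is the plane of the exit pupil."""
    return max(images_by_z, key=lambda z: boundary_sharpness(images_by_z[z]))
```

In practice the scan positions would come from the transmission device; here the sharpest image simply wins.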
  • 1 is the near-eye display to be tested
  • 2 is the object plane of the near-eye display to be tested
  • 3 is the optical axis of the near-eye display to be tested
  • 5 is the light emitted by the image-side picture output by the near-eye display to be tested, received as an illumination distribution image by a sufficiently large area array photoelectric sensor
  • 6 is the area array photoelectric sensor
  • 7 is the sensitive area of the area array photoelectric sensor.
  • the exit pupil parameters include: spatial position information of the exit pupil, eye point, and optical axis, and two-dimensional size information of the exit pupil boundary.
  • the exit pupil is the optical image of the entrance pupil of the optical system of the near-eye display to be tested, i.e. the common exit of the imaging beams from every point on the object plane (the original emitting surface of the displayed image).
  • the exit pupil is an important parameter of the near-eye display; the eye point lies on the plane of the exit pupil at its center, and is also the foot of the perpendicular from the optical axis to the exit pupil; the optical axis is the straight line passing through the eye point and perpendicular to the plane of the exit pupil.
  • 1 is the near-eye display to be tested
  • 2 is the object plane of the near-eye display to be tested
  • 3 is the optical axis of the near-eye display to be tested
  • 4 is the exit pupil
  • 8 is the image-side picture output by the near-eye display to be tested (image surface; also referred to in this patent as the output picture of the near-eye display to be tested)
  • 9 is an eye box
  • 10 is an eye point.
  • the images output by the near-eye display all refer to the image-side images emitted from the object plane of the near-eye display and imaged by the optical system of the near-eye display.
  • the center of the illumination distribution image obtained by the area array photoelectric sensor is found with an image center recognition algorithm; this gives the center position of the exit pupil, which is the eye point of the near-eye display to be tested.
  • the optical axis of the near-eye display to be tested is obtained as follows: after the image center recognition algorithm gives the center position of the exit pupil, a perpendicular to the exit pupil is drawn through that center; the straight line containing this perpendicular is the optical axis of the near-eye display to be tested.
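As a small worked example of this definition (the coordinates are hypothetical, not from the patent), the optical axis can be represented as the line through the eye point along the unit normal of the exit-pupil plane:

```python
import numpy as np

def optical_axis(eye_point, pupil_normal):
    """Return a parametric line t -> eye_point + t * n, where n is the
    unit normal of the exit-pupil plane: the optical axis as defined in
    the text (through the eye point, perpendicular to the exit pupil)."""
    p = np.asarray(eye_point, dtype=float)
    n = np.asarray(pupil_normal, dtype=float)
    n = n / np.linalg.norm(n)
    return lambda t: p + t * n
```

For example, `optical_axis((0, 0, 0), (0, 0, 1))(10.0)` gives the point 10 units in front of the eye point along the axis.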
  • the image boundary recognition algorithm includes but is not limited to a boundary sharpness recognition algorithm and/or a boundary contrast recognition algorithm.
  • the clearest boundary is obtained when the sharpness and/or contrast of the image boundary reaches an extreme value.
  • the algorithm is used to judge the boundary clarity of the illumination distribution images obtained at the various positions and orientations.
  • the image center recognition algorithm includes but is not limited to the gray-scale centroid method or the geometric method.
  • the gray-scale centroid method obtains the coordinates of the gray-scale weighted center from the gray-scale distribution of the illumination distribution image obtained by the area array photoelectric sensor.
  • the geometric method takes the figure enclosed by the boundary of the illumination distribution image obtained by the area array photoelectric sensor and calculates its geometric center.
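The two center-recognition approaches named above can be sketched as follows. This is an illustration only: the 0.5 threshold separating illuminated from dark pixels is an assumption (the patent does not specify segmentation), and the geometric center is approximated here by the bounding box of the illuminated region.

```python
import numpy as np

def gray_centroid(image: np.ndarray):
    """Gray-scale centroid method: intensity-weighted mean (x, y)
    coordinate of the illumination distribution image."""
    img = image.astype(float)
    y, x = np.mgrid[:img.shape[0], :img.shape[1]]
    total = img.sum()
    return (float((x * img).sum() / total), float((y * img).sum() / total))

def geometric_center(image: np.ndarray, thresh: float = 0.5):
    """Geometric method: center of the figure enclosed by the image
    boundary, approximated by the midpoint of the bounding box of
    above-threshold pixels."""
    ys, xs = np.nonzero(image > thresh)
    return ((float(xs.min()) + float(xs.max())) / 2,
            (float(ys.min()) + float(ys.max())) / 2)
```

For a uniform, symmetric exit-pupil spot the two methods agree; they diverge when the illumination is non-uniform, which is one reason the text allows either.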
  • if the area array photoelectric sensor cannot capture the complete illumination pattern at once, it can be acquired step by step at different positions.
  • the illumination distribution image received by the area array photoelectric sensor at each step is judged with the image boundary recognition algorithm.
  • in this way it is determined that the area array photoelectric sensor is located on the exit pupil of the near-eye display to be tested.
  • by means of the transmission device, the area array photoelectric sensor and the near-eye display to be tested are translated relative to each other, and a complete exit pupil image is obtained by stitching the partial images. The eye point and optical axis of the near-eye display to be tested can then be obtained from the center position of the exit pupil.
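The stitching step can be illustrated as below. This is a simplified sketch that assumes the stage offset of each capture is already known in sensor-pixel units; registration and overlap blending, which a real system would need, are omitted.

```python
import numpy as np

def stitch(tiles):
    """Paste partial illumination images, captured at known (row, col)
    stage offsets, into one mosaic covering the complete exit pupil.
    `tiles` is a list of (row_offset, col_offset, image) triples."""
    h = max(r + t.shape[0] for r, c, t in tiles)
    w = max(c + t.shape[1] for r, c, t in tiles)
    mosaic = np.zeros((h, w))
    for r, c, t in tiles:
        region = mosaic[r:r + t.shape[0], c:c + t.shape[1]]
        np.maximum(region, t, out=region)  # keep the brighter value where tiles overlap
    return mosaic
```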
  • the transmission device moves the area array photoelectric sensor or the near-eye display to be tested so that the two move relative to each other; the sensor obtains illumination distribution images at different spatial positions, which are analyzed with the image boundary recognition algorithm.
  • when the area of the illumination distribution image obtained by the area array photoelectric sensor is the smallest relative to other positions,
  • the plane where the area array photoelectric sensor is located is the plane of the exit pupil of the near-eye display to be tested, and
  • the illuminated area obtained by the area array photoelectric sensor is the exit pupil of the near-eye display to be tested.
  • the spatial position of the exit pupil obtained by this method and that obtained by using the boundary clarity of the illumination distribution image as the criterion can be mutually verified.
  • the relative movement between the area array photoelectric sensor and the near-eye display to be tested includes translation in the up-down, left-right and front-back directions, rotation about two or more mutually perpendicular rotation axes, or compound motion with multiple degrees of freedom.
  • the picture output by the near-eye display to be tested includes, but is not limited to, an all-white picture or a black-background white-frame picture.
  • the exit pupil is the common exit of the imaging beams from every point on the object plane of the near-eye display; all outgoing light passes through this surface together, so every point on it contains the illumination information of every point of the complete image.
  • the type of picture output by the near-eye display to be tested therefore does not affect the illumination uniformity or boundary characteristics of the exit pupil.
  • the transmission device controls the area array photoelectric sensor and the near-eye display to be tested so that the two move relative to each other. Specifically: the illumination distribution image obtained at the current position of the sensor is processed and analyzed to obtain position-and-attitude (hereafter "pose") adjustment information, and the transmission device then adjusts the relative pose of the area array photoelectric sensor and the near-eye display to be tested.
  • the invention also discloses a method for measuring a near-eye display, which uses an area array photoelectric sensor without an imaging lens, a transmission device and an imaging device to obtain the optical axis position of the near-eye display to be tested, specifically comprising:
  • the near-eye display to be tested outputs a first detection picture with or without an image center mark;
  • the area array photoelectric sensor is placed in the human-eye viewing area of the near-eye display to be tested, directly receives the light output signal of the near-eye display to be tested, and obtains an illumination distribution image;
  • the near-eye display to be tested outputs a second detection picture with an image center mark; the imaging device focuses on the second detection picture, and the transmission device moves the imaging device or the near-eye display to be tested relative to each other until the center of the receiving target surface of the imaging device coincides with the center of the acquired second detection picture; the imaging device and its optical axis position at this moment are recorded;
  • the near-eye display to be tested outputs the first detection picture, and the transmission device controls the area array photoelectric sensor and/or the imaging device and/or the near-eye display to be tested so that the area array photoelectric sensor is perpendicular to the recorded optical axis and is adjusted along it, obtaining an illumination distribution image;
  • step S5: repeat steps S2 to S4 until the boundary clarity of the illumination distribution image obtained in step S4 reaches the set range; the recorded optical axis position is then the optical axis position of the near-eye display to be tested, the area of the clear-boundary illumination distribution image obtained by the area array photoelectric sensor is the exit pupil of the near-eye display, and the center of that image is the eye point.
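Steps S2 to S5 form a measure-adjust loop. A skeleton of that loop is sketched below with hypothetical callbacks: `measure_sharpness` stands in for sensor readout plus boundary recognition, `adjust_pose` for the transmission device, and convergence is assumed to be declared when the clarity change falls within a set tolerance.

```python
def iterate_alignment(measure_sharpness, adjust_pose, tol, max_iter=20):
    """Repeat measurement and pose adjustment until the boundary clarity
    of the illumination image stops changing by more than `tol`.
    Returns True if the set range was reached within `max_iter` rounds."""
    prev = measure_sharpness()
    for _ in range(max_iter):
        adjust_pose()
        cur = measure_sharpness()
        if abs(cur - prev) <= tol:  # clarity within the set range: done
            return True
        prev = cur
    return False
```

The `max_iter` cap simply guards against a pose search that never settles; the patent itself does not state a termination bound.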
  • when the area array photoelectric sensor obtains the illumination distribution image with the clearest boundary, the area enclosed by that image's boundary is the smallest compared with the spot images received when the sensor is at any other position. Therefore, in step S4 the area of the illumination distribution image may be used as the criterion instead of the boundary clarity, and the exit pupil parameters obtained under the two criteria can be mutually verified.
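The minimum-area criterion can be sketched the same way. This is illustrative only; the 0.5 threshold that separates illuminated from dark pixels is an assumption, and `images_by_z` is again a hypothetical map from candidate positions to captured images.

```python
import numpy as np

def illuminated_area(image: np.ndarray, thresh: float = 0.5) -> int:
    """Pixel count of the region enclosed by the illumination boundary."""
    return int(np.count_nonzero(image > thresh))

def find_exit_pupil_by_area(images_by_z: dict) -> float:
    """Alternative criterion for step S4: the exit pupil lies at the
    position where the illuminated area is smallest."""
    return min(images_by_z, key=lambda z: illuminated_area(images_by_z[z]))
```

Running both this and the sharpness-based search on the same captures gives the mutual verification the text describes.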
  • the optical axis of the near-eye display to be tested passes through the eye point perpendicular to the exit pupil; every illuminated area in the eye box parallel to the exit pupil lies in a plane perpendicular to the optical axis, and the optical axis passes through the center of each such illuminated area.
  • the image center recognition algorithm can be applied to the illumination distribution image obtained by the area array photoelectric sensor to find its center; the entrance pupil center of the imaging device is then placed at that center position, the near-eye display to be tested outputs the second detection picture, and, with the entrance pupil as the center of rotation, the transmission device adjusts the relative pose of the imaging device and the near-eye display to be tested until the center of the imaging device coincides with the picture center of the near-eye display to be tested; the optical axis position of the imaging device at this moment is recorded.
  • at this point the optical axis of the imaging device substantially coincides with the optical axis of the near-eye display.
  • verification and adjustment are then required: the area array photoelectric sensor is placed perpendicular to the previously recorded optical axis of the imaging device, and the center of the illumination distribution image is obtained again.
  • according to the center of the illumination distribution image and the image center of the near-eye display to be tested, the relative pose of the imaging device and the near-eye display to be tested is adjusted, and the optical axis position of the imaging device is obtained again.
  • the optical axis position of the imaging device at this point is the optical axis position of the near-eye display to be tested; the area of the clear-boundary illumination distribution image obtained by the area array photoelectric sensor is the exit pupil of the near-eye display, and the center of that image is the eye point.
  • in step S2, the transmission device controls the area array photoelectric sensor and/or the imaging device and/or the near-eye display to be tested. For example, the near-eye display remains stationary while the transmission device adjusts the poses of the area array photoelectric sensor and the imaging device so that the center of the entrance pupil of the imaging device is positioned at the center point of the image obtained by the area array photoelectric sensor.
  • the image center recognition algorithm includes but is not limited to the gray-scale centroid method or the geometric method.
  • the gray-scale centroid method obtains the coordinates of the gray-scale weighted center from the gray-scale distribution of the image.
  • the geometric method identifies the midpoints of the four sides of the illuminated image; the intersection of the two lines joining the midpoints of opposite sides gives the coordinates of the image center.
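For this variant of the geometric method, the image center is the intersection of the two lines joining midpoints of opposite sides. A sketch using a standard two-line intersection formula (the unit-square example is hypothetical):

```python
def line_intersection(p1, p2, p3, p4):
    """Intersection of the line through p1, p2 with the line through
    p3, p4 (2-D points), via the determinant form of the line equations."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if d == 0:
        raise ValueError("lines are parallel")
    a = x1 * y2 - y1 * x2  # cross product of each line's two points
    b = x3 * y4 - y3 * x4
    return ((a * (x3 - x4) - (x1 - x2) * b) / d,
            (a * (y3 - y4) - (y1 - y2) * b) / d)

def image_center_from_side_midpoints(top, bottom, left, right):
    """Image center per the text: intersect the line joining the top and
    bottom side midpoints with the line joining the left and right ones."""
    return line_intersection(top, bottom, left, right)
```

For a distortion-free rectangular spot the two midpoint lines cross at the geometric center; with distortion the intersection is an approximation.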
  • the transmission device described in step S3 controls the imaging device or the near-eye display to be tested so that the two move relative to each other, generally a relative rotation whose center is located at the center of the entrance pupil of the lens of the imaging device.
  • the first detection picture output by the near-eye display to be tested includes but is not limited to an all-white picture or a black-background white-frame picture, and may have a center mark or not.
  • the second detection picture output by the near-eye display to be tested is an image with a center mark, including but not limited to a picture with a central crosshair.
  • the first detection picture and the second detection picture may be the same.
  • the invention also discloses a measuring device for a near-eye display, comprising the near-eye display to be tested, a sample stage for clamping the near-eye display to be tested, an area array photoelectric sensor, a first transmission device and a program control system;
  • the area array photoelectric sensor is placed facing the output direction of the near-eye display to be tested; the area array photoelectric sensor or the sample stage is connected to the first transmission device, which is controlled by the program control system so that the near-eye display to be tested and the area array photoelectric sensor move relative to each other; the program control system is electrically connected with the first transmission device and the area array photoelectric sensor respectively.
  • the relative movement between the area array photoelectric sensor and the near-eye display to be tested includes translation in the up-down, left-right and front-back directions, rotation about two or more mutually perpendicular rotation axes, or compound motion with multiple degrees of freedom.
  • the area array photoelectric sensor includes but is not limited to CMOS and CCD sensors, with corresponding data acquisition and transmission circuits. This is only an example; those skilled in the art can make adjustments according to common knowledge.
  • the first transmission device includes a rotation mechanism and/or a translation mechanism.
  • the first transmission device is a robot device with four or more rotation axes.
  • the second transmission device is connected with the sample stage and drives the sample stage to move;
  • the second transmission device includes a rotation mechanism and a translation mechanism.
  • the invention also discloses another measuring device for a near-eye display, which includes a sample stage for clamping the near-eye display to be tested, an area array photoelectric sensor, an imaging device, a first transmission device, and a program control system; the area array photoelectric sensor and the imaging device are each placed facing the output direction of the near-eye display to be tested and are each connected to the first transmission device, which is controlled by the program control system to adjust the poses of the area array photoelectric sensor and the imaging device.
  • the program control system is electrically connected with the first transmission device, the area array photoelectric sensor and the imaging device respectively.
  • the area array photoelectric sensor includes but is not limited to CMOS and CCD sensors, with corresponding data acquisition and transmission circuits. This is only an example; those skilled in the art can make adjustments according to common knowledge.
  • the first transmission device includes a rotation mechanism and/or a translation mechanism.
  • the first transmission device is a robot device with four or more rotation axes.
  • the second transmission device is connected with the sample stage and drives the sample stage to move;
  • the second transmission device includes a rotation mechanism and a translation mechanism.
  • the sampling device used in the measurement method provided by the present invention is only an ordinary area array photoelectric sensor and needs no lens with a complex optical path design, so the cost of the solution is greatly reduced.
  • the implementation steps of the measurement scheme are simple, which greatly simplifies the measurement of the optical axis and eye point of the near-eye display, improves measurement efficiency, and reduces measurement cost.
  • another near-eye display measurement method provided by the present invention uses an ordinary area array photoelectric sensor and an imaging device in cooperation, which greatly improves measurement accuracy and efficiency and reduces measurement cost.
  • FIG. 3 is a schematic structural diagram of a measuring device for a near-eye display provided in Embodiment 1 of the present invention.
  • 1 is the near-eye display to be tested
  • 2 is the object plane of the near-eye display to be tested
  • 3 is the optical axis of the near-eye display to be tested
  • 4 is the exit pupil
  • 5 is the light emitted by the image-side picture output by the near-eye display to be tested, received as an illumination distribution image by a sufficiently large area array photoelectric sensor
  • 6 is the area array photoelectric sensor
  • 7 is the sensitive area of the area array photoelectric sensor
  • 8 is the image-side picture (image surface) output by the near-eye display to be tested
  • 9 is an eye box
  • 10 is an eye point
  • 11 is a support frame, wherein 11-1 is a support frame for clamping the area array photoelectric sensor and 11-2 is a support frame for clamping the imaging device; 12 is a two-axis rotation stage, 13 is an imaging device, 14 is a four-axis moving platform, 15 is a sample stage, and 16 is a three-dimensional translation stage.
  • This embodiment discloses a measuring device for a near-eye display, as shown in Figure 3, comprising a near-eye display (1) to be tested, a sample stage (15) for clamping the near-eye display (1) to be tested, an area array photoelectric sensor (6), and a support frame (11) for clamping the area array photoelectric sensor (6); the support frame (11) includes a three-dimensional translation stage (16) and a two-axis rotation stage (12). The near-eye display (1) is set on the sample stage (15); the three-dimensional translation stage (16) and the two-axis rotation stage (12) respectively control the movement and rotation of the area array photoelectric sensor (6), changing the positional relationship between the area array photoelectric sensor (6) and the near-eye display (1) to be tested and thereby measuring the illumination distribution at different positions; the optical illuminance distributions received at different positions differ.
  • from these, the exit pupil parameters of the near-eye display to be tested can be obtained.
  • This embodiment discloses a measuring device for a near-eye display, as shown in Figure 4, comprising a near-eye display (1) to be tested, a sample stage (15) for clamping the near-eye display (1) to be tested, an area array photoelectric sensor (6), and a support frame (11) for clamping the area array photoelectric sensor (6); the sample stage includes a three-dimensional translation stage (16), and the support frame (11) includes a two-axis rotation stage (12). The near-eye display (1) to be tested is set on the sample stage (15); the three-dimensional translation stage (16) controls the movement of the near-eye display to be tested and the two-axis rotation stage (12) controls the rotation of the area array photoelectric sensor (6), thereby changing the positional relationship between the area array photoelectric sensor (6) and the near-eye display (1) to be tested and measuring the illuminance distribution at different positions. Boundary recognition algorithm analysis of the illumination distribution images then yields the exit pupil parameters of the near-eye display to be tested.
  • This embodiment discloses a measuring device for the optical axis of a near-eye display, as shown in FIG., comprising a near-eye display (1) to be tested, a sample stage (15), an area array photoelectric sensor (6), a support frame (11-1) for clamping the area array photoelectric sensor (6), an imaging device (13), a support frame (11-2) for clamping the imaging device, a three-dimensional translation stage (16), and a two-axis rotation stage (12).
  • the near-eye display (1) to be tested is arranged on the sample stage (15); the three-dimensional translation stage (16) and the two-axis rotation stage (12) control the movement and rotation of the near-eye display (1) to be tested.
  • the area array photoelectric sensor (6) is set on the support frame (11-1) for clamping the area array photoelectric sensor (6).
  • the imaging device (13) is set on the support frame (11-2) for clamping the imaging device.
  • An area array photoelectric sensor (6) is used in conjunction with an imaging device (13) to obtain exit pupil parameters and an optical axis position of the near-eye display to be tested.
  • This embodiment discloses a measuring device for the optical axis of a near-eye display, as shown in FIG.
  • An area array photoelectric sensor (6) is used in conjunction with an imaging device (13) to obtain exit pupil parameters and an optical axis position of the near-eye display to be tested.
  • This embodiment also discloses a method for measuring a near-eye display, as shown in Figure 7, the measuring steps include:
  • step S4: if the boundary sharpness of the illumination distribution image is judged not to have reached its maximum, obtain new pose adjustment information and repeat step S3; if the illumination distribution image is judged to meet the threshold set by the algorithm, the plane of the area array photoelectric sensor (6) is the plane of the exit pupil of the near-eye display to be tested, the spot area on the area array photoelectric sensor (6) is the exit pupil (4) of the near-eye display to be tested, the center of the exit pupil (4) is the eye point (10), and the axis through the center of the exit pupil (4) perpendicular to it is the optical axis (3) of the near-eye display (1) to be tested.
  • This embodiment also discloses another measurement method for a near-eye display, as shown in Figure 8, the measurement steps include:
  • the near-eye display to be tested outputs the first detection picture;
  • the area array photoelectric sensor (6) is placed in the human-eye viewing area of the near-eye display to be tested (1), and directly receives the light output signal from the near-eye display to be tested (1) to obtain the illumination distribution image;
  • the near-eye display to be tested outputs a second detection picture
  • the imaging device (13) focuses on the second detection picture
  • the two-axis rotation stage (12) controls the near-eye display to be tested (1), rotating the near-eye display to be tested (1) relative to the imaging device (13) until the center of the receiving target surface of the imaging device coincides with the center of the captured second detection picture; the imaging device and its optical-axis position at this moment are recorded;
  • step S4: control the area array photoelectric sensor (6) through the four-axis moving platform (14) so that the area array photoelectric sensor (6) is perpendicular to the optical axis of the imaging device obtained in step S3, then adjust the area array photoelectric sensor (6) back and forth along that optical axis until the illumination distribution image (5) with the sharpest boundary is obtained;
  • the optical-axis position is then the optical-axis position of the near-eye display to be tested, the area of the sharp-boundaried illumination distribution image obtained by the area array photoelectric sensor is the exit pupil of the near-eye display, and the center of the illumination distribution image is the eye point.


Abstract

A measurement method and device for a near-eye display, in which an area array photoelectric sensor (6) without an imaging lens and a transmission device (11) are used to obtain the exit-pupil parameters of the near-eye display to be tested (1), effectively solving the problems of existing techniques for measuring the eye point and optical axis of near-eye displays: large errors, complex and expensive equipment, complicated operation, and low efficiency.

Description

Measurement method and device for a near-eye display
Technical Field
The present invention relates to the field of testing the optical characteristics of near-eye displays, and in particular to a measurement method and device for a near-eye display.
Background Art
Near-eye displays, also known as head-mounted displays or virtual displays, fall mainly into virtual reality (VR) and augmented reality (AR) types. The display principle of a near-eye display is shown in Fig. 2, in which 1 is the near-eye display to be tested, 2 is its object plane, 3 is its optical axis, 4 is the exit pupil, 9 is the eye box, and 10 is the eye point. The eye point lies at the center of the exit pupil, and the axis through the eye point perpendicular to the plane of the exit pupil is the display's optical axis. The basic measurement geometry for the optical characteristics of a near-eye display requires the optical axis of the measuring equipment to coincide with that of the display under test, with the entrance pupil of the measuring lens placed at the display's eye point. Determining the exit pupil, eye point, and optical axis of the display under test is therefore essential for measuring its optical performance.
The spatial region of the display's image space within which a complete image can be observed is called the eye box of the near-eye display, shown as region 9 in Fig. 2. The exit pupil, eye point, and optical axis are determined mainly by measuring the eye box: in general, the largest cross-section of the eye box is taken as the exit pupil, its center point is taken directly as the eye point, and the optical axis is obtained from the eye point and exit pupil. The key to this method therefore lies in measuring the size of the eye box.
The existing eye-box measurement method uses a spot luminance meter fitted with a precision optical lens: the meter measures the image shown by the near-eye display at different positions, and the eye box is judged from the luminance change at the image boundary. The entrance pupil of the spot luminance meter is first placed on the display's optical axis, the meter is aimed at the boundary of the displayed image, and the boundary luminance is measured. The meter is then translated, its angle being adjusted during the move so that the measured image point does not shift, and the luminance at the current position is read. When the luminance falls below 50% of the value at the initial position, that position is judged to be the eye-box boundary in the direction of movement. Repeating these steps yields the eye-box boundaries in the other directions, and further the display's exit pupil, eye point, and optical axis.
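The 50% luminance criterion used in this prior-art scan can be sketched in a few lines. This is an illustrative sketch only, not part of the patent; the function name, positions, and luminance values are hypothetical:

```python
def eyebox_boundary(positions, luminances):
    """Return the first scan position at which luminance drops below 50%
    of the luminance measured at the starting position; None if it never does."""
    threshold = 0.5 * luminances[0]
    for pos, lum in zip(positions, luminances):
        if lum < threshold:
            return pos
    return None

# Hypothetical luminance readings (cd/m^2) along one translation direction (mm):
positions = [0, 1, 2, 3, 4, 5]
luminances = [200.0, 198.0, 190.0, 150.0, 90.0, 30.0]
print(eyebox_boundary(positions, luminances))  # -> 4
```

Repeating such a scan in each translation direction yields the eye-box boundaries the background section describes.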
On the other hand, the existing optical-axis alignment approach mainly aligns the center cross of the measurement imaging system with the center cross of the image output by the display under test, and this is taken to complete the alignment of the measuring equipment's optical axis with the display's optical axis. In practice, however, the crosses can still be aligned even when the alignment angles differ, so this method carries considerable error.
It is evident that the existing approach of measuring the eye-box region to obtain the display's exit pupil, eye point, and optical axis is extremely cumbersome to operate; moreover, eye-box measurement itself is affected by translation accuracy, step size, and alignment accuracy, carries considerable error, and is inefficient.
Summary of the Invention
In view of the shortcomings of the prior art, the present invention provides a measurement method and device for a near-eye display, aimed at solving the prior-art problems of large measurement errors for the exit pupil, eye point, and optical axis of a near-eye display, complex and expensive equipment, complicated operation, and low efficiency.
To achieve the above aim, the invention adopts the following technical solution:
The invention provides a measurement method for a near-eye display in which an area array photoelectric sensor without an imaging lens and a transmission device are used to obtain the exit-pupil parameters of the display under test. Specifically: the display under test outputs a picture, and the area array photoelectric sensor is placed within the display's human-eye viewing region (also called the eye box of the near-eye display), where it directly receives the light output signal from the display and obtains an illumination distribution image. According to the optical imaging principle of near-eye displays, the exit pupil is the common exit of the imaging beams from all points of the display: the output rays of the displayed image all pass through this exit, so every point on it carries the light information of every point of the whole output image, and its illumination distribution is fairly uniform with a distinct boundary. If the area array photoelectric sensor lies at the display's exit pupil and its sensitive area is large enough to cover the entire exit pupil, the sensor receives an exit-pupil image that is fairly uniform in illumination with a clear boundary; the area enclosed by this relatively sharpest boundary is then smaller than the spot-image area received when the sensor is at any other position. If the sensor deviates from the exit pupil, the illumination distribution image it obtains grows larger and its boundary blurs, and not every point in the image can receive rays from every point of the display's output image plane. Therefore, the transmission device controls the sensor or the display so that the two move relative to one another, the sensor obtains illumination distribution images at different spatial positions, and an image boundary recognition algorithm analyzes them; when the obtained image has the sharpest boundary, its area is also the smallest relative to other positions, the plane of the sensor at that moment is the plane of the display's exit pupil, and the illuminated region on the sensor is the display's exit pupil. As shown in Fig. 1, 1 is the display under test, 2 its object plane, 3 its optical axis, 5 the illumination distribution image formed when the rays of the display's image-space output are received by a sufficiently large area array photoelectric sensor, 6 the area array photoelectric sensor, and 7 the sensor's sensitive area.
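The minimal-area criterion described above can be illustrated with a small sketch. The data and names below are hypothetical, for illustration only; a real system would use full-resolution sensor frames:

```python
def spot_area(image, threshold=0.5):
    """Number of pixels above `threshold` -- a proxy for the illuminated spot area."""
    return sum(1 for row in image for v in row if v > threshold)

def best_plane(images, threshold=0.5):
    """Index of the candidate sensor position whose illumination image has the
    smallest spot area; per the principle above, that plane holds the exit pupil."""
    areas = [spot_area(img, threshold) for img in images]
    return areas.index(min(areas))

# Hypothetical 4x4 sensor readouts at three axial positions:
blurred_far  = [[0.6] * 4 for _ in range(4)]   # large, washed-out spot
in_focus     = [[0.0, 0.0, 0.0, 0.0],
                [0.0, 1.0, 1.0, 0.0],
                [0.0, 1.0, 1.0, 0.0],
                [0.0, 0.0, 0.0, 0.0]]          # small, sharp spot
blurred_near = [[0.6] * 4 for _ in range(4)]
print(best_plane([blurred_far, in_focus, blurred_near]))  # -> 1
```

The middle position wins because its illuminated area is smallest, matching the text's observation that the sharpest boundary also encloses the minimal area.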
It should be noted that in the above technical solution the exit-pupil parameters comprise the spatial position information of the exit pupil, eye point, and optical axis, together with the two-dimensional size of the exit-pupil boundary. To avoid ambiguity, the exit pupil, eye point, and optical axis are further explained as follows: the exit pupil is the optical image of the entrance pupil of the display's optical system, the common exit of the beams emitted from all points of the object plane (the original emitting surface of the displayed image) after imaging through the display's optical system; the position and size of the exit pupil are important parameters of a near-eye display. The eye point lies in the plane of the exit pupil at its center, and is also the foot of the perpendicular from the optical axis to the exit pupil. The optical axis is the straight line through the eye point perpendicular to the plane of the exit pupil. As shown in Fig. 2, 1 is the display under test, 2 its object plane, 3 its optical axis, 4 the exit pupil, 8 the image-space picture output by the display (the image plane, also called the display's output picture in this patent), 9 the eye box, and 10 the eye point.
For convenience, in what follows the picture output by the near-eye display always means the image-space picture emitted by the display's object plane and imaged through the display's optical system.
Further, in the above technical solution, once the plane of the area array photoelectric sensor lies in the plane of the display's exit pupil, an image center recognition algorithm obtains the center of the illumination distribution image received by the sensor, and thereby the center position of the exit pupil; that center position is the eye point of the display under test.
Further, in the above technical solution, the optical axis of the display under test is obtained as follows: after the center position of the exit pupil is found with the image center recognition algorithm, the perpendicular to the display's exit pupil is drawn through that center position; the line containing this perpendicular is the display's optical axis.
Further, in the above technical solution, the image boundary recognition algorithm includes, but is not limited to, a boundary sharpness recognition algorithm and/or a boundary contrast recognition algorithm; the sharpest boundary is obtained when boundary sharpness and/or contrast reaches an extremum. Further, the algorithm is used to judge the boundary clarity of the illumination distribution images obtained at each direction and position.
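A boundary sharpness measure of the kind referred to here could, for example, be approximated by the largest difference between neighboring pixels. This is only one possible proxy, sketched with hypothetical names and data, not the patent's prescribed algorithm:

```python
def boundary_sharpness(image):
    """Largest difference between neighboring pixels -- grows as the
    illumination boundary becomes more abrupt (a crude sharpness proxy)."""
    h, w = len(image), len(image[0])
    best = 0.0
    for y in range(h):
        for x in range(w):
            if x + 1 < w:
                best = max(best, abs(image[y][x + 1] - image[y][x]))
            if y + 1 < h:
                best = max(best, abs(image[y + 1][x] - image[y][x]))
    return best

sharp  = [[0.0, 0.0, 1.0, 1.0]] * 4    # abrupt edge: sensor at the exit pupil
blurry = [[0.0, 0.33, 0.66, 1.0]] * 4  # gradual edge: sensor off the pupil
print(boundary_sharpness(sharp) > boundary_sharpness(blurry))  # -> True
```

Scanning positions and keeping the one where this measure peaks implements the extremum criterion stated above.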
Further, in the above technical solution, the image center recognition algorithm includes, but is not limited to, the gray centroid method or the geometric method. The gray centroid method computes the coordinates of the gray-weighted center according to the gray-level distribution of the illumination distribution image obtained by the area array photoelectric sensor. The geometric method extracts the figure enclosed by the boundary of the illumination distribution image obtained by the sensor and computes the geometric center of that figure.
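The gray centroid method described above amounts to a weighted average of pixel coordinates. A minimal sketch with hypothetical names and data:

```python
def gray_centroid(image):
    """Gray-weighted centroid (x, y) of an illumination image -- the
    'gray centroid method' for locating the image center."""
    sx = sy = s = 0.0
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            sx += x * v
            sy += y * v
            s += v
    return (sx / s, sy / s)

# Symmetric 3x3 spot: the centroid lands on the middle pixel.
spot = [[0, 1, 0],
        [1, 4, 1],
        [0, 1, 0]]
print(gray_centroid(spot))  # -> (1.0, 1.0)
```

The geometric method would instead trace the boundary of the spot and take the center of the enclosed figure; both yield the eye point when applied at the exit-pupil plane.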
Further, in the above technical solution, when the photosensitive surface of the area array photoelectric sensor is smaller than the exit pupil of the display under test, the sensor cannot capture the complete illumination pattern; the illumination distribution images received at different positions can then be acquired step by step and judged with the image boundary recognition algorithm. When an illumination distribution image has the sharpest boundary or the relatively smallest area, the sensor lies on the exit pupil of the display under test. The transmission device translates the sensor relative to the display, and the complete exit pupil is obtained by stitching; further, the display's eye point and optical axis can be obtained from the center position of the exit pupil.
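The stitching step can be illustrated by pasting sub-images captured at known sensor offsets onto a single canvas. The offsets, names, and tiny tiles below are hypothetical, for illustration only:

```python
def stitch(tiles):
    """Paste sub-images captured at known sensor offsets into one exit-pupil map.
    `tiles` maps (x_offset, y_offset) in pixels -> 2-D sub-image."""
    width  = max(x + len(tile[0]) for (x, y), tile in tiles.items())
    height = max(y + len(tile) for (x, y), tile in tiles.items())
    canvas = [[0.0] * width for _ in range(height)]
    for (x0, y0), tile in tiles.items():
        for dy, row in enumerate(tile):
            for dx, v in enumerate(row):
                canvas[y0 + dy][x0 + dx] = v
    return canvas

left  = [[1, 2], [3, 4]]
right = [[5, 6], [7, 8]]
print(stitch({(0, 0): left, (2, 0): right}))  # -> [[1, 2, 5, 6], [3, 4, 7, 8]]
```

In practice the offsets come from the transmission device's translation readout, so adjacent captures land at known pixel positions.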
Further, in the above technical solution, the transmission device controls the area array photoelectric sensor or the display under test so that the two move relative to one another, the sensor obtains illumination distribution images at different spatial positions, and the image boundary recognition algorithm analyzes them; when the area of the sensor's illumination distribution image is the smallest relative to other positions, the plane of the sensor is the plane of the display's exit pupil, and the illuminated region on the sensor is the display's exit pupil. Further, the spatial position of the exit pupil obtained by this area criterion and the one obtained with the boundary-clarity criterion described above can serve to verify each other.
Further, in the above technical solution, the relative motion between the sensor and the display under test includes translation in the three directions up-down, left-right, and fore-aft, rotation about two or more mutually perpendicular axes, or compound multi-degree-of-freedom motion.
In some optional embodiments, the picture output by the display under test includes, but is not limited to, an all-white test pattern or a white frame on a black background. According to the optical imaging principle of near-eye displays, the exit pupil is the common exit of the imaging beams from all points of the display's object plane; the output rays all pass through this surface, so every point on it carries the illumination information of every point of the whole image. The type of picture output by the display under test does not affect the illumination uniformity or the boundary characteristics of the exit pupil.
Further, in the above technical solution, the process in which the transmission device controls the sensor and the display under test so that the two move relative to one another specifically includes: processing and analyzing the illumination distribution image obtained at the sensor's current position to derive the corresponding position-and-attitude ("pose") adjustment information, and adjusting the relative position of the sensor and the display through the transmission device.
The invention further discloses a measurement method for a near-eye display in which an area array photoelectric sensor without an imaging lens, a transmission device, and an imaging device are used to obtain the optical-axis position of the display under test, specifically comprising:
S1: the display under test outputs a first detection picture, with or without an image center mark; the area array photoelectric sensor is placed within the display's human-eye viewing region and directly receives the display's light output signal to obtain an illumination distribution image;
S2: an image center recognition algorithm obtains the center position of the illumination distribution image; the transmission device controls the sensor and/or the imaging device and/or the display to adjust their poses so that the center of the imaging device's entrance pupil is placed at the center point of the image previously obtained by the sensor;
S3: the display under test outputs a second detection picture with an image center mark; the imaging device focuses on the second detection picture; the transmission device controls the imaging device or the display so that the two move relative to one another until the center of the imaging device's receiving target surface coincides with the center of the captured second detection picture, and the imaging device and its optical-axis position at this moment are recorded;
S4: the display under test outputs the first detection picture; the transmission device controls the sensor and/or the imaging device and/or the display to adjust their relative poses until the sensor is perpendicular to the optical axis of the imaging device from step S3, then moves the sensor back and forth along that axis until the illumination distribution image with the relatively sharpest boundary is obtained;
S5: steps S2 to S4 are repeated until the boundary clarity of the illumination distribution image obtained in step S4 reaches a set range; the recorded optical-axis position is then the optical-axis position of the display under test, the region of the sharp-boundaried illumination distribution image on the sensor is the display's exit pupil, and the center of that image is the eye point.
It should be noted that in the above technical solution, according to the near-eye imaging principle, when the sensor obtains the illumination distribution image with the relatively sharpest boundary, the area enclosed by that boundary is smaller than the spot-image area received when the sensor is at any other position. For step S4, the size of the illumination-image area can therefore replace boundary clarity as the criterion, and the exit-pupil parameters obtained with the two criteria can verify each other.
According to the optical imaging principle of near-eye displays, the optical axis of the display under test passes through the eye point perpendicular to the exit pupil; every illuminated-region plane in the eye box parallel to the exit pupil is perpendicular to the optical axis, and the axis passes through the center of each such region. The image center recognition algorithm can therefore analyze the sensor's illumination distribution image to obtain its center; the center of the imaging device's entrance pupil is placed at that center; the display under test is made to output the second detection picture; with the entrance pupil as the center of rotation, the transmission device adjusts the relative pose of the imaging device and the display until the imaging device's center coincides with the center of the display's picture, and the imaging device's optical-axis position is recorded. That position now roughly coincides with the display's optical-axis position. To obtain the display's optical-axis position more precisely, it must be verified and adjusted: the sensor is placed perpendicular to the previously recorded optical axis, the center of the illumination distribution image is obtained again, the relative pose of the imaging device and the display is adjusted according to that center and the display's image center, and the imaging device's optical-axis position is recorded again. The transmission device adjusts the relative poses of the sensor, the display, and the imaging device back and forth, converging over multiple cycles until the boundary clarity of the sensor's illumination distribution image reaches the set error range, that is, until all boundaries of the obtained image are at their sharpest. The previously recorded optical-axis position of the imaging device is then the optical-axis position of the display under test, the region of the sharp-boundaried illumination distribution image obtained by the sensor is the display's exit pupil, and the center of that image is the eye point.
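The iterative convergence loop described in this paragraph can be sketched abstractly, with the hardware steps replaced by stand-in callables. All names, the toy convergence model, and the tolerance value are hypothetical:

```python
def refine_axis(measure_sharpness, adjust, max_sharpness, tol, max_iter=50):
    """Iterate the adjust-and-remeasure cycle until boundary sharpness is
    within `tol` of its attainable maximum; returns the iteration count.
    `measure_sharpness` and `adjust` stand in for the real hardware steps."""
    for i in range(max_iter):
        if max_sharpness - measure_sharpness() <= tol:
            return i   # converged: the recorded imaging-device axis = display axis
        adjust()       # re-center sensor / imaging device / display poses
    raise RuntimeError("did not converge")

# Toy stand-in: each pose adjustment halves the residual misalignment.
state = {"err": 1.0}
measure = lambda: 1.0 - state["err"]           # sharpness grows as error shrinks
adjust  = lambda: state.update(err=state["err"] / 2)
print(refine_axis(measure, adjust, max_sharpness=1.0, tol=0.01))  # -> 7
```

The loop mirrors steps S2 to S5: measure, compare against the set range, adjust the relative poses, and stop once the boundary is at its sharpest within tolerance.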
Further, in the above technical solution, the adjustment in step S2 of the relative poses of the area array photoelectric sensor, the imaging device, and the display under test through the transmission device can be implemented in several ways. For example, the display may be held still while the transmission device controls the sensor and the imaging device, adjusting their poses so that the center of the imaging device's entrance pupil is placed at the center point of the image previously obtained by the sensor.
Further, in the above technical solution, the image center recognition algorithm includes, but is not limited to, the gray centroid method or the geometric method. The gray centroid method computes the coordinates of the gray-weighted center from the gray-level distribution of the image. The geometric method identifies the midpoints of the four sides of the illumination image; the intersection of the lines joining the midpoints of opposite sides gives the coordinates of the image center.
Further, in step S3 the relative motion driven by the transmission device between the imaging device and the display under test is generally a relative rotation whose center lies at the center of the entrance pupil of the imaging device's lens.
In some optional embodiments, the first detection picture output by the display under test includes, but is not limited to, an all-white test pattern or a white frame on a black background, with or without a center mark. The second detection picture output by the display under test is an image with a center mark, including but not limited to a full-center crosshair picture.
In some optional embodiments, the first and second detection pictures may be identical.
The invention further discloses a measurement device for a near-eye display, comprising the display under test, a sample stage for holding it, an area array photoelectric sensor, a first transmission device, and a program control system. The sensor faces the output direction of the display; the sensor or the sample stage is connected to the first transmission device, and the program control system controls the first transmission device to produce relative motion between the display and the sensor. The program control system is electrically connected to the first transmission device and to the sensor. The relative motion between the sensor and the display includes translation in the three directions up-down, left-right, and fore-aft, rotation about two or more mutually perpendicular axes, or compound multi-degree-of-freedom motion.
In some optional embodiments, the area array photoelectric sensor includes, but is not limited to, a CMOS or CCD sensor, with corresponding data acquisition and transmission circuitry. This is only an example; those skilled in the art can adapt it according to common knowledge.
In some optional embodiments, the first transmission device comprises a rotation mechanism and/or a translation mechanism.
In some optional embodiments, the first transmission device is a robotic device with four or more rotation axes.
Further, the above technical solution also comprises a second transmission device connected to the sample stage and driving the sample stage's motion; the second transmission device comprises a rotation mechanism and a translation mechanism.
The invention further discloses another measurement device for a near-eye display, comprising a sample stage for holding the display under test, an area array photoelectric sensor, an imaging device, a first transmission device, and a program control system. The sensor and the imaging device each face the output direction of the display and are each connected to the first transmission device; the program control system controls the first transmission device to adjust the poses of the sensor and the imaging device. The program control system is electrically connected to the first transmission device, the sensor, and the imaging device.
In some optional embodiments, the area array photoelectric sensor includes, but is not limited to, a CMOS or CCD sensor, with corresponding data acquisition and transmission circuitry. This is only an example; those skilled in the art can adapt it according to common knowledge.
In some optional embodiments, the first transmission device comprises a rotation mechanism and/or a translation mechanism.
In some optional embodiments, the first transmission device is a robotic device with four or more rotation axes.
Further, the above technical solution also comprises a second transmission device connected to the sample stage and driving the sample stage's motion; the second transmission device comprises a rotation mechanism and a translation mechanism.
Beneficial effects of the invention: the sampling device used by the measurement method is merely an ordinary area array photoelectric sensor, with no need for a lens of complex optical design, so the cost of the solution is greatly reduced. At the same time, the measurement steps are simple and practicable, greatly simplifying measurement of the display's optical axis and eye point, improving measurement efficiency, and lowering measurement cost. In addition, the other measurement method provided by the invention pairs an ordinary area array photoelectric sensor with an imaging device, greatly improving measurement accuracy and efficiency and lowering measurement cost.
Brief Description of the Drawings
Fig. 1 is a schematic diagram of the principle of the technical solution of the invention;
Fig. 2 is a schematic diagram of the exit pupil, eye point, and optical axis of a near-eye display;
Fig. 3 is a structural diagram of a measurement device for a near-eye display according to Embodiment 1 of the invention;
Fig. 4 is a structural diagram of a measurement device for a near-eye display according to Embodiment 2 of the invention;
Fig. 5 is a structural diagram of a measurement device for the optical axis of a near-eye display according to Embodiment 3 of the invention;
Fig. 6 is a structural diagram of a measurement device for the optical axis of a near-eye display according to Embodiment 4 of the invention;
Fig. 7 is a flow chart of a measurement method for a near-eye display provided by the invention;
Fig. 8 is a flow chart of another measurement method for a near-eye display provided by the invention;
In the figures: 1, near-eye display under test; 2, object plane of the display under test; 3, optical axis of the display under test; 4, exit pupil; 5, illumination distribution image formed when the rays of the display's image-space output are received by a sufficiently large area array photoelectric sensor; 6, area array photoelectric sensor; 7, sensitive area of the sensor; 8, image-space picture (image plane) output by the display under test; 9, eye box; 10, eye point; 11, support frame, of which 11-1 is the support frame holding the area array sensor and 11-2 the support frame holding the imaging device; 12, two-axis rotation stage; 13, imaging device; 14, four-axis moving platform; 15, sample stage; 16, three-dimensional translation stage.
Detailed Description
Specific embodiments of the invention are described below with reference to the drawings, but those skilled in the art should understand that the following embodiments are only illustrative and do not limit the scope of the invention; the embodiments may be modified without departing from the scope and spirit of the invention. The scope of protection of the invention is defined by the appended claims.
Embodiment 1
This embodiment discloses a measurement device for a near-eye display, as shown in Fig. 3, comprising the display under test (1), a sample stage (15) for holding it, an area array photoelectric sensor (6), and a support frame (11) for holding the sensor. The support frame (11) comprises a three-dimensional translation stage (16) and a two-axis rotation stage (12). The display (1) is set on the sample stage (15); the translation stage (16) and rotation stage (12) respectively control the movement and rotation of the sensor (6), changing the positional relationship between the sensor (6) and the display (1) so that the illumination distribution can be measured at different positions. At different positions the sensor receives different optical illuminance distributions; analyzing the illumination distribution images with a boundary recognition algorithm yields the exit-pupil parameters of the display under test.
Embodiment 2
This embodiment discloses a measurement device for a near-eye display, as shown in Fig. 4, comprising the display under test (1), a sample stage (15) for holding it, an area array photoelectric sensor (6), and a support frame (11) for holding the sensor. The sample stage includes a three-dimensional translation stage (16), and the support frame (11) includes a two-axis rotation stage (12). The display (1) is set on the sample stage (15); the translation stage (16) moves the display and the rotation stage (12) rotates the sensor (6), changing the positional relationship between the sensor (6) and the display (1) so that the illuminance distribution can be measured at different positions; analyzing the illumination distribution images with a boundary recognition algorithm yields the exit-pupil parameters of the display under test.
Embodiment 3
This embodiment discloses a measurement device for the optical axis of a near-eye display, as shown in Fig. 5, comprising the display under test (1), a sample stage (15) for holding it, an area array photoelectric sensor (6), a support frame (11-1) for holding the sensor, an imaging device (13), a support frame (11-2) for holding the imaging device, a three-dimensional translation stage (16), and a two-axis rotation stage (12). The display (1) is set on the sample stage (15); the translation stage (16) and rotation stage (12) control the movement and rotation of the display (1); the sensor (6) is set on the support frame (11-1) and the imaging device (13) on the support frame (11-2). The sensor (6) is used together with the imaging device (13) to obtain the exit-pupil parameters and optical-axis position of the display under test.
Embodiment 4
This embodiment discloses a measurement device for the optical axis of a near-eye display, as shown in Fig. 6, comprising the display under test (1), a sample stage (15) for holding it, an area array photoelectric sensor (6), an imaging device (13), a four-axis moving platform (14), and a two-axis rotation stage (12). The display (1) is set on the sample stage (15); the rotation stage (12) rotates the display (1); the sensor (6) and the imaging device (13) are each mounted on the four-axis moving platform (14), which controls their movement. The sensor (6) is used together with the imaging device (13) to obtain the exit-pupil parameters and optical-axis position of the display under test.
This embodiment also discloses a measurement method for a near-eye display, as shown in Fig. 7, whose steps include:
S1: control the display under test (1) to output a picture (2);
S2: acquire the display's illumination distribution at the initial spatial position with the sensor (6), run the algorithm on the illumination distribution image, and obtain the corresponding pose adjustment information;
S3: use the three-dimensional translation stage (16) and the two-axis rotation stage (12) to change the relative pose of the sensor (6) and the display (1), acquire the display's illumination distribution again, and run the image algorithm on it again;
S4: if the boundary sharpness of the illumination distribution image is judged not yet at its maximum, obtain the corresponding pose adjustment information again and repeat step S3; if the image is judged to meet the threshold set by the algorithm, the plane of the sensor (6) is the plane of the display's exit pupil, the spot region on the sensor (6) is the exit pupil (4) of the display under test, the center of the exit pupil (4) is the eye point (5), and the axis through the center of the exit pupil (4) perpendicular to the exit pupil (4) of the display (1) is the display's optical axis (3).
This embodiment also discloses another measurement method for a near-eye display, as shown in Fig. 8, whose steps include:
S1: the display under test outputs the first detection picture; the sensor (6) is placed within the human-eye viewing region of the display (1) and directly receives the light output signal from the display (1) to obtain an illumination distribution image;
S2: an image center recognition algorithm obtains the center position of the illumination distribution image and determines the relative pose adjustment information between the imaging device (13) and the display (1); the four-axis moving platform (14) controls the imaging device (13) to adjust its pose;
S3: the display under test outputs the second detection picture; the imaging device (13) focuses on the second detection picture; the two-axis rotation stage (12) controls the display (1), rotating the display (1) relative to the imaging device (13) until the center of the imaging device's receiving target surface coincides with the center of the captured second detection picture, and the imaging device and its optical-axis position at this moment are recorded;
S4: the four-axis moving platform (14) controls the sensor (6) so that the sensor (6) is perpendicular to the imaging device's optical axis obtained in step S3, then moves the sensor (6) back and forth along that axis until the illumination distribution image (5) with the relatively sharpest boundary is obtained;
Steps S1 to S4 are repeated until the boundary clarity of the illumination distribution image obtained in step S4 reaches the set range; the recorded optical-axis position is then the optical-axis position of the display under test, the region of the sharp-boundaried illumination distribution image obtained by the sensor is the display's exit pupil, and the center of the illumination distribution image is the eye point.

Claims (17)

  1. A measurement method for a near-eye display, characterized in that an area array photoelectric sensor without an imaging lens and a transmission device are used to obtain the exit-pupil parameters of the near-eye display under test, specifically comprising: the display under test outputs a picture; the area array photoelectric sensor is placed within the human-eye viewing region of the display, directly receives the light output signal from the display, and obtains an illumination distribution image; the transmission device controls the sensor or the display so that the two move relative to one another, and the exit-pupil parameters of the display are obtained by analyzing the illumination distribution images obtained by the sensor at different spatial positions.
  2. The measurement method for a near-eye display according to claim 1, characterized in that analyzing the illumination distribution images obtained by the sensor at different spatial positions to obtain the display's exit-pupil parameters specifically comprises: judging the boundary of the illumination distribution image with an image boundary recognition algorithm; when the image has the sharpest boundary, the sensor lies in the plane of the display's exit pupil, and the illuminated region on the sensor is the display's exit pupil; obtaining the center of the illumination distribution image with an image center recognition algorithm gives the center position of the exit pupil and thereby the display's eye point.
  3. The measurement method for a near-eye display according to claim 1, characterized in that analyzing the illumination distribution images obtained by the sensor at different spatial positions to obtain the display's exit-pupil parameters specifically comprises: judging the boundary of the illumination distribution image with an image boundary recognition algorithm and computing the area of the image; when the area is relatively smallest, the sensor lies in the plane of the display's exit pupil, and the illuminated region on the sensor is the display's exit pupil; obtaining the center of the illumination distribution image with an image center recognition algorithm gives the center position of the exit pupil and thereby the display's eye point.
  4. The measurement method for a near-eye display according to claim 2 or 3, characterized in that after the center position of the exit pupil is obtained with the image center recognition algorithm, the perpendicular to the display's exit pupil is drawn through that center position, thereby obtaining the display's optical axis.
  5. The measurement method for a near-eye display according to claim 2 or 3, characterized in that the photosensitive surface of the sensor is larger than the display's exit pupil, and when the image boundary recognition algorithm judges that the illumination distribution image has the sharpest boundary or the relatively smallest area, the display's exit pupil is obtained directly.
  6. The measurement method for a near-eye display according to claim 2 or 3, characterized in that the photosensitive surface of the sensor is smaller than the display's exit pupil, and when the image boundary recognition algorithm judges that the illumination distribution image has the sharpest boundary or the relatively smallest area, the transmission device translates the sensor relative to the display and the display's exit pupil is obtained by stitching.
  7. The measurement method for a near-eye display according to claim 2 or 3, characterized in that the image boundary recognition algorithm includes, but is not limited to, a boundary sharpness recognition algorithm and/or a boundary contrast recognition algorithm; the image center recognition algorithm includes, but is not limited to, the gray centroid method or the geometric method.
  8. The measurement method for a near-eye display according to claim 1, 2 or 3, characterized in that the picture output by the display under test includes, but is not limited to, an all-white picture or a white frame on a black background.
  9. The measurement method for a near-eye display according to claim 1, 2 or 3, characterized in that the process in which the transmission device controls the sensor or the display so that the two move relative to one another specifically comprises: processing and analyzing the illumination distribution image obtained at the sensor's position to derive the corresponding pose adjustment information, and adjusting the relative position of the sensor and the display through the transmission device.
  10. The measurement method for a near-eye display according to claim 1, 2 or 3, characterized in that the relative motion between the sensor and the display includes translation in the three directions up-down, left-right, and fore-aft, rotation about two or more mutually perpendicular axes, or compound multi-degree-of-freedom motion.
  11. A measurement method for a near-eye display, characterized in that an area array photoelectric sensor without an imaging lens, a transmission device, and an imaging device are used to obtain the optical-axis position of the near-eye display under test, specifically comprising:
    S1: the display under test outputs a first detection picture; the sensor is placed within the human-eye viewing region of the display and directly receives the display's light output signal to obtain an illumination distribution image;
    S2: an image center recognition algorithm obtains the center position of the illumination distribution image; the transmission device controls the sensor and/or the imaging device and/or the display to adjust their relative poses so that the center of the imaging device's entrance pupil is placed at the center point of the image previously obtained by the sensor;
    S3: the display under test outputs a second detection picture; the imaging device focuses on the second detection picture; the transmission device controls the imaging device or the display so that the two move relative to one another until the center of the imaging device's receiving target surface coincides with the center of the captured second detection picture, and the imaging device and its optical-axis position at this moment are recorded;
    S4: the display under test outputs the first detection picture; the transmission device controls the sensor and/or the imaging device and/or the display to adjust their relative poses until the sensor is perpendicular to the optical axis of the imaging device from step S3, then moves the sensor back and forth along that axis until the boundary of the illumination distribution image is relatively sharpest or the image area is smallest;
    S5: steps S2 to S4 are repeated until the boundary clarity or the area of the illumination distribution image in step S4 reaches the set range, whereupon the exit-pupil parameters of the display under test are obtained.
  12. The measurement method for a near-eye display according to claim 11, characterized in that the image center recognition algorithm includes, but is not limited to, the gray centroid method or the geometric method.
  13. The measurement method for a near-eye display according to claim 11, characterized in that the first detection picture output by the display under test includes, but is not limited to, an all-white test pattern or a white frame on a black background; the second detection picture output by the display under test is a picture with a center mark, including but not limited to a full-center crosshair picture.
  14. A measurement device for a near-eye display, characterized by comprising a sample stage for holding the display under test, an area array photoelectric sensor, a first transmission device, and a program control system; the sensor faces the output direction of the display; the sensor or the sample stage is connected to the first transmission device; the program control system controls the first transmission device to produce relative motion between the display and the sensor.
  15. The measurement device for a near-eye display according to claim 14, characterized by further comprising an imaging device; the sensor and the imaging device each face the output direction of the display and are each connected to the first transmission device; the program control system controls the first transmission device to adjust the poses of the sensor and the imaging device.
  16. The measurement device for a near-eye display according to claim 14 or 15, characterized in that the first transmission device comprises a rotation mechanism and/or a translation mechanism.
  17. The measurement device for a near-eye display according to claim 14 or 15, characterized in that the first transmission device is a robotic device with four or more rotation axes.
PCT/CN2022/140316 2022-03-04 2022-12-20 Measurement method and device for near-eye display WO2023165223A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210205733.3 2022-03-04
CN202210205733.3A CN114593897B (zh) 2022-03-04 2022-03-04 Measurement method and device for near-eye display

Publications (1)

Publication Number Publication Date
WO2023165223A1 true WO2023165223A1 (zh) 2023-09-07

Family

ID=81807087

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/140316 WO2023165223A1 (zh) 2022-03-04 2022-12-20 Measurement method and device for near-eye display

Country Status (2)

Country Link
CN (1) CN114593897B (zh)
WO (1) WO2023165223A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117760705A (zh) * 2024-02-22 2024-03-26 武汉精立电子技术有限公司 Measurement method and system for the eyebox of an AR product

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114593897B (zh) * 2022-03-04 2023-07-14 杭州远方光电信息股份有限公司 一种近眼显示器的测量方法及装置
CN114813061B (zh) * 2022-06-23 2022-09-20 武汉精立电子技术有限公司 一种近眼成像设备的光学参数检测方法及系统

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107884160A (zh) * 2017-09-25 2018-04-06 杭州浙大三色仪器有限公司 Virtual image photoelectric measuring instrument
CN109387131A (zh) * 2018-10-22 2019-02-26 中国航空工业集团公司洛阳电光设备研究所 Head-up display eye-box measurement device and measurement method
CN110264408A (zh) * 2019-07-05 2019-09-20 芋头科技(杭州)有限公司 Measurement method, device and system for a near-eye display, controller, and medium
CN113125114A (zh) * 2020-01-16 2021-07-16 舜宇光学(浙江)研究院有限公司 Detection method for a near-eye display optical system, and system, platform, and electronic device therefor
CN113252309A (zh) * 2021-04-19 2021-08-13 苏州市计量测试院 Test method, test device, and storage medium for near-eye display equipment
WO2021250179A1 (de) * 2020-06-10 2021-12-16 Instrument Systems Optische Messtechnik Gmbh Device and method for measuring display systems
CN114593897A (zh) * 2022-03-04 2022-06-07 杭州远方光电信息股份有限公司 Measurement method and device for near-eye display

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11513349B2 (en) * 2008-03-13 2022-11-29 Everysight Ltd. Optical see-through (OST) near-eye display (NED) system integrating ophthalmic correction
CN107607294B (zh) * 2017-09-14 2020-01-31 歌尔科技有限公司 Industrial camera entrance-pupil position detection method and system
CN111678674B (zh) * 2020-06-09 2022-08-19 中佳盛建设股份有限公司 Near-eye display measurement method and near-eye display measurement system
CN112816183B (zh) * 2021-03-03 2022-09-06 广州计量检测技术研究院 VR head-mounted display device movement-characteristic detection device and method


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117760705A (zh) * 2024-02-22 2024-03-26 武汉精立电子技术有限公司 Measurement method and system for the eyebox of an AR product
CN117760705B (zh) * 2024-02-22 2024-05-14 武汉精立电子技术有限公司 Measurement method and system for the eyebox of an AR product

Also Published As

Publication number Publication date
CN114593897A (zh) 2022-06-07
CN114593897B (zh) 2023-07-14

Similar Documents

Publication Publication Date Title
WO2023165223A1 (zh) Measurement method and device for near-eye display
CN105675266B (zh) 无限共轭光路测量光学镜头的调制传递函数的装置及方法
CA2819292C (en) Robotic surveying instrument and method for the automated autocollimation of a telescope of a surveying instrument comprising an autocollimation target
JP5858433B2 (ja) 注視点検出方法及び注視点検出装置
US7204596B2 (en) Projector with tilt angle measuring device
CN107024339B (zh) 一种头戴显示设备的测试装置及方法
US10931924B2 (en) Method for the generation of a correction model of a camera for the correction of an aberration
CN110967166B (zh) 近眼显示光学系统的检测方法、检测装置和检测系统
CN105258710B (zh) 一种高精度相机主点标定方法
US10142621B2 (en) Mass production MTF testing machine
TW201100779A (en) System and method for inspecting a wafer (3)
JP2010259605A (ja) 視線測定装置および視線測定プログラム
CN113034612B (zh) 一种标定装置、方法及深度相机
US9990739B1 (en) Method and device for fisheye camera automatic calibration
CN210322247U (zh) 一种光学模组装调测试装置
CN102313525B (zh) 激光束调节平行系统及其调节方法
CN111609995A (zh) 一种光学模组装调测试方法及装置
CN115494652A (zh) 一种对头显设备进行装配的方法、装置、设备及存储介质
CN111044262A (zh) 近眼显示光机模组检测装置
CN110779469B (zh) 一种地平式光电跟踪系统的轴系垂直度检测装置及方法
CN116907380A (zh) 基于图像信息的点衍射干涉仪被测镜精确对准方法及系统
US10685448B2 (en) Optical module and a method for objects' tracking under poor light conditions
US11622104B2 (en) Camera holder for economical and simplified test alignment
TW202232074A (zh) 用於測定遠焦光學系統的調制轉換函數的量測設備與方法
CN111707446A (zh) 光斑中心与探测器接收面中心对准的调节方法及系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22929650

Country of ref document: EP

Kind code of ref document: A1