CN110264408B - Near-eye display measurement method, device, system, controller and medium - Google Patents


Info

Publication number
CN110264408B
CN110264408B (application CN201910605246.4A)
Authority
CN
China
Prior art keywords
imaging element
image
virtual image
pixel
movement amount
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910605246.4A
Other languages
Chinese (zh)
Other versions
CN110264408A (en)
Inventor
郭凯凯 (Guo Kaikai)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yutou Technology Hangzhou Co Ltd
Original Assignee
Yutou Technology Hangzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yutou Technology Hangzhou Co Ltd filed Critical Yutou Technology Hangzhou Co Ltd
Priority to CN201910605246.4A
Publication of CN110264408A
Application granted
Publication of CN110264408B

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01M TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M 11/00 Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M 11/02 Testing optical properties
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4053 Scaling of whole images or parts thereof based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
    • G06T 3/4069 Scaling based on super-resolution by subpixel displacements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4053 Scaling of whole images or parts thereof based on super-resolution, i.e. the output image resolution being higher than the sensor resolution
    • G06T 3/4076 Scaling based on super-resolution using the original low-resolution images to iteratively correct the high-resolution images

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Eye Examination Apparatus (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a measurement method, device and system for a near-eye display, and to a controller and a medium. The method comprises the following steps: acquiring the pixel ratio between the imaging element and the virtual image, and the ratio between the movement of the virtual image's image on the imaging element and the movement of the virtual image; determining the positions to which the virtual image is to be moved; controlling the virtual image to move to each such position, acquiring an imaging element image at each position to obtain an imaging element image sequence, and acquiring the relative movement between every two adjacent imaging element images; and performing image recombination based on the imaging element image sequence and the relative movements between adjacent imaging element images to obtain a high-fidelity image of the virtual image under test. The invention requires no mechanical movement of the measurement system: the test pattern is simply scanned to different positions of the microdisplay and a corresponding set of images is collected for recombination, yielding a high-fidelity image of the virtual image with simple computation, low cost and high precision.

Description

Near-eye display measurement method, device, system, controller and medium
Technical Field
The invention relates to the technical field of near-eye display, in particular to a measuring method, a measuring device, a measuring system, a controller and a medium of a near-eye display.
Background
Existing near-eye displays include virtual reality (VR) displays and augmented reality (AR) displays. As shown in fig. 1, a near-eye display typically comprises a microdisplay and a lens, with the distance between the microdisplay and the lens slightly smaller than the focal length of the lens. Light emitted by one pixel on the microdisplay is therefore collimated into parallel rays after passing through the lens; these rays enter the eye through the pupil and form an image on the retina. Extending the rays entering the eye backwards yields the virtual image point corresponding to those rays, and the image on the whole microdisplay forms a corresponding virtual image at the same distance. To measure the quality of the virtual image of a near-eye display, an imaging device similar to the human eye is usually required to acquire an image of the virtual image under test; this imaging device is called the measurement system.
As shown in fig. 2, the measurement system, which typically includes a camera lens with a front aperture and an imaging element, is used to acquire a high-fidelity image of the virtual image. In practice, the virtual image of a near-eye display has a large field of view and high resolution. For example, to give the user a sense of immersion, the field of view needs to be 100° × 100° in the horizontal and vertical directions at a resolution of 120 pixels per degree, so the virtual image contains 12000 × 12000 pixels over the full field. The number and density of pixels in such a virtual image thus exceed the imaging capability of the test systems currently on the market. To address this, the prior art commonly measures near-eye displays in one of two ways. Method 1: use an imaging system with a small field of view that captures one part of the virtual image at a time, scan it mechanically to photograph different parts of the virtual image, and stitch the resulting small-field, high-resolution images into an image of the whole virtual image. Method 2: use an imaging system with a large field of view whose imaging element has fewer pixels than the virtual image under test; move the relative positions of the imaging element and the image of the virtual image with a mechanical device (such as piezoelectric ceramics or a parallel glass plate), keeping each relative movement smaller than one imaging-element pixel, to acquire several different low-resolution images of the virtual image; and finally obtain a high-fidelity image of the virtual image by image reconstruction.
However, the existing methods have the following disadvantages: 1) they require a precision mechanical moving device to change the position of the entire measurement system or of the imaging element; 2) because of the frequent mechanical scanning during measurement, the mechanical scanning component of the measurement system needs regular recalibration; 3) different near-eye displays under test have different forms and thus require different matching scanning devices, which is costly.
Disclosure of Invention
The invention aims to provide a measurement method, device and system for a near-eye display, together with a controller and a medium, which obtain a high-fidelity image of the virtual image without mechanically moving any part of the measurement system: the test pattern is scanned to different positions of the microdisplay, a corresponding set of images is acquired, and the images are recombined. The computation is simple, the cost is low and the measurement precision is high.
In order to solve the above technical problem, according to a first embodiment of the present invention, there is provided a measurement method of a near-eye display, including:
acquiring pixel proportions of the imaging element and the virtual image;
acquiring, according to the pixel ratio, the ratio between the movement of the image of the virtual image on the imaging element and the movement of the virtual image;
determining the position to be moved of the virtual image according to the pixel proportion and the movement amount proportion;
controlling the virtual image to move to each position to be moved, correspondingly acquiring an imaging element image corresponding to each position to be moved on an imaging element, acquiring an imaging element image sequence, and acquiring the relative movement amount between every two adjacent imaging element images in the imaging element image sequence; and carrying out image recombination based on the imaging element image sequence and the relative movement amount between two adjacent imaging element images to obtain the image of the virtual image to be detected.
Further, the acquiring pixel proportions of the imaging element and the virtual image includes:
lighting two pixel points along a first direction of the virtual image, wherein their position coordinates in the first direction are A and B respectively and the distance between them is (B-A)*P_hr-x, wherein B>A and P_hr-x represents the pixel size of the virtual image in the first direction;
correspondingly acquiring, on the imaging element, the images of the two pixel points lit along the first direction of the virtual image, wherein their position coordinates on the imaging element are a and b respectively and the distance between them is (b-a)*P_lr-x, wherein b>a and P_lr-x represents the pixel size of the imaging element in the first direction;
according to (B-A)*P_hr-x = (b-a)*P_lr-x, obtaining the ratio between the pixel sizes of the imaging element and the virtual image in the first direction, P_lr-x/P_hr-x = (B-A)/(b-a);
obtaining the ratio between the pixel sizes of the imaging element and the virtual image in the second direction, P_lr-y/P_hr-y = (D-C)/(d-c), wherein P_lr-y denotes the pixel size of the imaging element in the second direction, P_hr-y denotes the pixel size of the virtual image in the second direction, D and C denote the position coordinates of two pixel points lit along the second direction of the virtual image with D>C, d and c denote the position coordinates on the imaging element of the images of those two pixel points with d>c, and the second direction is perpendicular to the first direction.
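The pixel-size ratio step above can be sketched as follows (a minimal illustration; the concrete coordinates A, B, a, b are hypothetical values, not taken from the patent):

```python
def pixel_size_ratio(A, B, a, b):
    """Ratio P_lr / P_hr of imaging-element to virtual-image pixel size
    along one axis, from two lit virtual-image pixels at coordinates
    A and B (B > A) and their images at a and b (b > a) on the imaging
    element.  Derived from (B - A) * P_hr = (b - a) * P_lr."""
    return (B - A) / (b - a)

# Hypothetical example: virtual-image pixels 0 and 100 image to element
# coordinates 10.0 and 35.0, so one element pixel spans four
# virtual-image pixels.
ratio_x = pixel_size_ratio(0, 100, 10.0, 35.0)  # 4.0
```

The same function applies unchanged in the second direction with D, C, d, c.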
Further, the acquiring, according to the pixel ratio, a movement amount ratio between the image of the virtual image on the imaging element and the virtual image includes:
when the virtual image is moved in the first direction by 1 virtual-image pixel, the corresponding image of the virtual image moves on the imaging element by Sx imaging-element pixels,
Sx = P_hr-x / P_lr-x = (b-a)/(B-A)   (1);
when the virtual image is moved in the second direction by 1 virtual-image pixel, the corresponding image of the virtual image moves on the imaging element by Sy imaging-element pixels,
Sy = P_hr-y / P_lr-y = (d-c)/(D-C)   (2).
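Formulas (1) and (2) are simply the inverses of the pixel-size ratios; a sketch with a hypothetical 4:1 ratio:

```python
def movement_ratio(pixel_size_ratio):
    """Sx (or Sy) from formula (1)/(2): how many imaging-element pixels
    the image moves when the virtual image moves by one of its own
    pixels, i.e. P_hr / P_lr = (b - a) / (B - A)."""
    return 1.0 / pixel_size_ratio

# Hypothetical 4:1 pixel-size ratio in both directions:
Sx = movement_ratio(4.0)  # 0.25 element pixels per virtual-image pixel
Sy = movement_ratio(4.0)  # 0.25
```

Because the element pixel is larger than the virtual-image pixel, Sx and Sy are sub-pixel amounts, which is what makes super-resolution recombination possible.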
Further, the determining the positions to be moved of the virtual image according to the pixel ratio and the movement amount ratio includes:
rounding P_lr-x / P_hr-x to obtain an integer value Rx;
rounding P_lr-y / P_hr-y to obtain an integer value Ry;
establishing a coordinate system with the initial position of the virtual image as the origin and the first and second directions as coordinate axes; in any coordinate quadrant, drawing a first straight line perpendicular to the first direction at each integer coordinate in [0, Rx] from the origin along the first direction, and a second straight line perpendicular to the second direction at each integer coordinate in [0, Ry] from the origin along the second direction; all intersection positions of the first and second straight lines are the positions to be moved of the virtual image.
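The intersections described above form the integer grid [0, Rx] × [0, Ry]; a sketch:

```python
def positions_to_move(Rx, Ry):
    """All (first-direction, second-direction) offsets, in virtual-image
    pixels, that the virtual image must visit: the intersections of the
    Rx + 1 lines along one axis with the Ry + 1 lines along the other,
    (1 + Rx) * (1 + Ry) positions in total."""
    return [(x, y) for y in range(Ry + 1) for x in range(Rx + 1)]

grid = positions_to_move(4, 4)  # (1 + 4) * (1 + 4) = 25 positions
```

The order of traversal (row scan vs. column scan, cf. figs. 11 (a) and 11 (b)) does not change the set of positions, only the sequence of captures.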
Further, the acquiring a relative movement amount between every two adjacent imaging element images in the imaging element image sequence includes:
acquiring a first movement amount of a virtual image corresponding to every two adjacent imaging element images in the imaging element image sequence in a first direction and a second movement amount in a second direction;
and acquiring a first relative movement amount in the first direction and a second relative movement amount in the second direction between every two adjacent imaging element images in the imaging element image sequence according to the first movement amount, the second movement amount and formulas (1) and (2).
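This conversion is a direct application of formulas (1) and (2); a sketch with hypothetical Sx and Sy values:

```python
def relative_shifts(virtual_moves, Sx, Sy):
    """For each pair of adjacent frames, convert the virtual image's
    movement (dx, dy), given in virtual-image pixels, into the relative
    shift of its image on the imaging element, in imaging-element
    pixels, using formulas (1) and (2)."""
    return [(dx * Sx, dy * Sy) for dx, dy in virtual_moves]

# Hypothetical: two steps of 1 virtual-image pixel with Sx = Sy = 0.25
shifts = relative_shifts([(1, 0), (0, 1)], 0.25, 0.25)
# sub-pixel shifts on the imaging element: (0.25, 0.0) and (0.0, 0.25)
```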
Further, the image reorganizing based on the imaging element image sequence and the relative movement amount between two adjacent imaging element images to obtain an image of a virtual image to be measured includes:
and based on the imaging element image sequence and the relative movement amount between two adjacent imaging element images, performing image recombination by adopting a super-resolution image reconstruction algorithm to obtain an image of the virtual image to be detected.
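The patent names super-resolution reconstruction but does not fix a particular algorithm. The sketch below shows the simplest member of that family, a shift-and-add interleave; it assumes (for illustration only) that the relative shifts are exact multiples of 1/factor of an element pixel, whereas practical pipelines add registration refinement and deconvolution:

```python
def shift_and_add(frames, shifts, factor):
    """Naive shift-and-add reconstruction: each low-resolution frame was
    captured with a known sub-pixel shift (sx, sy), given in low-res
    pixel units; its samples are interleaved onto a grid `factor` times
    denser.  frames: list of equally sized 2-D lists."""
    h, w = len(frames[0]), len(frames[0][0])
    hi = [[0.0] * (w * factor) for _ in range(h * factor)]
    for frame, (sx, sy) in zip(frames, shifts):
        ox, oy = round(sx * factor), round(sy * factor)
        for r in range(h):
            for c in range(w):
                hi[r * factor + oy][c * factor + ox] = frame[r][c]
    return hi
```

With factor = 2 and the four shifts (0, 0), (0.5, 0), (0, 0.5), (0.5, 0.5), four 2×2 frames fill every cell of a 4×4 high-resolution grid exactly once.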
Further, the method further comprises: controlling the virtual image to move to each of the positions to be moved by controlling movement of a test image on the microdisplay.
According to a second embodiment of the present invention, there is provided a measuring apparatus of a near-eye display, including:
a pixel ratio acquisition module configured to acquire pixel ratios of the imaging element and the virtual image;
a movement amount ratio acquisition module configured to acquire a movement amount ratio of an image of the virtual image formed on the imaging element and the virtual image according to the pixel ratio;
the to-be-moved position determining module is configured to determine the to-be-moved position of the virtual image according to the pixel proportion and the movement amount proportion;
an image sequence acquisition module configured to control the virtual image to move to each of the positions to be moved, correspondingly acquire an imaging element image corresponding to each of the positions to be moved on an imaging element, acquire an imaging element image sequence, and acquire a relative movement amount between every two adjacent imaging element images in the imaging element image sequence;
and the image recombination module is configured to perform image recombination based on the imaging element image sequence and the relative movement amount between two adjacent imaging element images to obtain an image of a virtual image to be detected.
Further, the pixel proportion obtaining module comprises a first pixel proportion obtaining unit and a second pixel proportion obtaining unit, wherein,
the first pixel proportion acquisition unit is configured to:
lighting two pixel points along a first direction of the virtual image, wherein their position coordinates in the first direction are A and B respectively and the distance between them is (B-A)*P_hr-x, wherein B>A and P_hr-x represents the pixel size of the virtual image in the first direction;
correspondingly acquiring, on the imaging element, the images of the two pixel points lit along the first direction of the virtual image, wherein their position coordinates on the imaging element are a and b respectively and the distance between them is (b-a)*P_lr-x, wherein b>a and P_lr-x represents the pixel size of the imaging element in the first direction;
according to (B-A)*P_hr-x = (b-a)*P_lr-x, obtaining the ratio between the pixel sizes of the imaging element and the virtual image in the first direction, P_lr-x/P_hr-x = (B-A)/(b-a);
The second pixel proportion acquisition unit is configured to:
execute the steps of the first pixel proportion acquisition unit to acquire the ratio between the pixel sizes of the imaging element and the virtual image in the second direction, P_lr-y/P_hr-y = (D-C)/(d-c), wherein P_lr-y denotes the pixel size of the imaging element in the second direction, P_hr-y denotes the pixel size of the virtual image in the second direction, D and C denote the position coordinates of two pixel points lit along the second direction of the virtual image with D>C, d and c denote the position coordinates on the imaging element of the images of those two pixel points with d>c, and the second direction is perpendicular to the first direction.
Further, the movement amount ratio acquisition module includes a first movement amount ratio acquisition unit and a second movement amount ratio acquisition unit, wherein,
the first movement amount proportion acquisition unit is configured to:
when the virtual image is moved in the first direction by 1 virtual-image pixel, the corresponding image of the virtual image moves on the imaging element by Sx imaging-element pixels,
Sx = P_hr-x / P_lr-x = (b-a)/(B-A)   (1);
the second movement amount proportion acquisition unit is configured to:
when the virtual image is moved in the second direction by 1 virtual-image pixel, the corresponding image of the virtual image moves on the imaging element by Sy imaging-element pixels,
Sy = P_hr-y / P_lr-y = (d-c)/(D-C)   (2).
further, the to-be-moved position determining module is specifically configured to:
rounding P_lr-x / P_hr-x to obtain an integer value Rx;
rounding P_lr-y / P_hr-y to obtain an integer value Ry;
establishing a coordinate system with the initial position of the virtual image as the origin and the first and second directions as coordinate axes; in any coordinate quadrant, drawing a first straight line perpendicular to the first direction at each integer coordinate in [0, Rx] from the origin along the first direction, and a second straight line perpendicular to the second direction at each integer coordinate in [0, Ry] from the origin along the second direction; all intersection positions of the first and second straight lines are the positions to be moved of the virtual image.
Further, the image sequence acquisition module includes a relative movement amount acquisition unit configured to:
acquiring a first movement amount of a virtual image corresponding to every two adjacent imaging element images in the imaging element image sequence in a first direction and a second movement amount in a second direction;
and acquiring a first relative movement amount in the first direction and a second relative movement amount in the second direction between every two adjacent imaging element images in the imaging element image sequence according to the first movement amount, the second movement amount and formulas (1) and (2).
Further, the image recombination module adopts a super-resolution image reconstruction algorithm to carry out image recombination to obtain an image of the virtual image to be detected.
Further, the image sequence acquisition module is also configured to control the virtual image to move to each of the positions to be moved by controlling the movement of the test image on the microdisplay.
According to a third embodiment of the invention, there is provided a controller comprising a memory and a processor, the memory storing a computer program which, when executed by the processor, implements the steps of the method.
According to a fourth embodiment of the invention, there is provided a computer-readable storage medium storing a computer program which, when executed by a computer or processor, performs the steps of the method.
According to a fifth embodiment of the present invention, there is provided a measurement system of a near-eye display, including:
a measurement assembly comprising a front aperture, a camera lens, an imaging element and a controller,
a near-eye display that forms an image on the imaging element;
the controller comprises a memory and a processor, the memory storing a computer program which, when executed by the processor, is capable of implementing the steps of the method of measurement of the near-eye display.
Further, the field angle of the imaging element is larger than that of the near-eye display, and the resolution of the imaging element is lower than that of the near-eye display.
Compared with the prior art, the invention has obvious advantages and beneficial effects. By means of the above technical scheme, the measurement method, device, system, controller and medium for a near-eye display achieve considerable technical progress and practicability, have wide industrial value, and offer at least the following advantages:
(1) The method is low in cost and can be used with existing low-resolution test equipment to obtain high-fidelity images of the virtual image;
(2) The imaging element image sequence is acquired by controlling the movement of the test pattern on the microdisplay, so no precision mechanical device is needed to move the imaging element or the whole test equipment;
(3) The relative movement between every two adjacent images in the imaging element image sequence is calculated from the pixel ratio between the imaging element and the virtual image, making the calculation more accurate and the measurement precision higher.
The foregoing description is only an overview of the technical solutions of the present invention, and in order to make the technical means of the present invention more clearly understood, the present invention may be implemented in accordance with the content of the description, and in order to make the above and other objects, features, and advantages of the present invention more clearly understood, the following preferred embodiments are described in detail with reference to the accompanying drawings.
Drawings
FIG. 1 is a schematic diagram of a conventional imaging principle of a near-eye display;
FIG. 2 is a schematic diagram of a measurement system of a conventional near-eye display;
FIG. 3 (a) is a schematic diagram of an image on a microdisplay of a prior art near-eye display;
FIG. 3 (b) is a schematic diagram of a virtual image presented by a conventional near-eye display;
FIG. 3 (c) is a schematic diagram of an image captured by an imaging element of a conventional near-eye display;
fig. 4 is a flowchart of a measuring method of a near-eye display according to an embodiment of the invention;
fig. 5 (a) is a schematic diagram of a pixel position of a virtual image according to an embodiment of the present invention;
fig. 5 (b) is a schematic diagram of a pixel location of an imaging element according to an embodiment of the invention;
fig. 6 (a) is a schematic diagram of two lighted pixels in a first direction of a virtual image according to an embodiment of the present invention;
fig. 6 (b) is a schematic diagram of pixel positions of two lighted pixels on the imaging element in the first direction of the virtual image according to an embodiment of the present invention;
fig. 7 (a) is a schematic diagram of two lighted pixels in a second direction of a virtual image according to an embodiment of the present invention;
fig. 7 (b) is a schematic diagram of pixel positions of two lighted pixels on the imaging element in the second direction of the virtual image according to an embodiment of the present invention;
FIG. 8 is a schematic diagram illustrating the positions of all the virtual images that need to be moved according to an embodiment of the present invention;
FIG. 9 is a diagram of the amount of movement of the image of the virtual image in units of the imaging element pixel size for the example shown in FIG. 8;
FIG. 10 (a) is a schematic diagram of a virtual image moving process according to an embodiment of the present invention;
FIG. 10 (b) is a schematic view of the image of the corresponding virtual image on the imaging element corresponding to the movement process of FIG. 10 (a);
FIG. 11 (a) is a schematic view of a virtual image moving path along a line scan according to an embodiment of the present invention;
FIG. 11 (b) is a schematic diagram of a virtual image moving path along a column scan according to an embodiment of the present invention;
fig. 12 (a) is a schematic diagram of a virtual image during a measurement process of a near-eye display according to an embodiment of the present invention;
FIG. 12 (b) is a schematic diagram of an image of a virtual image acquired by an imaging element during measurement of a near-eye display according to an embodiment of the invention;
FIG. 12 (c) is a schematic diagram of a virtual image reconstructed from a plurality of low resolution images during a measurement of a near-eye display according to an embodiment of the invention;
fig. 13 is a schematic view of a measuring apparatus of a near-eye display according to an embodiment of the invention.
[Reference numerals]
1: pixel ratio acquisition module  2: movement amount ratio acquisition module
3: to-be-moved position determination module  4: image sequence acquisition module
5: image recombination module
Detailed Description
To further illustrate the technical means and effects of the present invention for achieving the predetermined objects, the following detailed description of the embodiments and effects of a method, an apparatus, a system, a controller and a medium for measuring a near-eye display according to the present invention will be made with reference to the accompanying drawings and preferred embodiments.
The number and density of pixels in the virtual image of existing near-eye display devices exceed the imaging capability of the test systems currently on the market: as shown in figs. 3 (a), 3 (b) and 3 (c), the image acquired by the imaging element is not a faithful image of the virtual image but one of lower resolution, and in the prior art a faithful image can only be acquired at considerable cost, through mechanical scanning or precise movement of measurement system components. An embodiment of the invention provides a measurement method for a near-eye display that requires no mechanical movement of the measurement system: the test pattern is scanned to different positions of the microdisplay, a corresponding set of images is acquired, and image recombination then yields a high-fidelity image of the virtual image. As shown in fig. 4, the method comprises the following steps:
s1, acquiring pixel proportions of an imaging element and a virtual image;
As an embodiment, as shown in fig. 5, in fig. 5 (a) the triangular array represents the pixel positions of the virtual image, and in fig. 5 (b) the array of open circles represents the pixel positions of the imaging element. The distance between adjacent pixels is defined as the pixel size, and pixel position coordinates increase from left to right; the pixel size in the first direction and the pixel size in the second direction may be the same or different. In fig. 5, the pixel size of the virtual image is P_hr-x in the first direction and P_hr-y in the second direction; correspondingly, the pixel size of the imaging element is P_lr-x in the first direction and P_lr-y in the second direction. The first and second directions are perpendicular; as an example, the first direction is horizontal and the second direction is vertical.
The step S1 includes:
S11, lighting two pixel points along a first direction of the virtual image, wherein their position coordinates in the first direction are A and B respectively and the distance between them is (B-A)*P_hr-x, as shown in fig. 6 (a), wherein B>A and P_hr-x represents the pixel size of the virtual image in the first direction;
Step S12, correspondingly acquiring, on the imaging element, the images of the two pixel points lit along the first direction of the virtual image, wherein their position coordinates on the imaging element are a and b respectively and the distance between them is (b-a)*P_lr-x, as shown in fig. 6 (b), wherein b>a and P_lr-x represents the pixel size of the imaging element in the first direction;
Step S13, according to (B-A)*P_hr-x = (b-a)*P_lr-x, the ratio between the pixel sizes of the imaging element and the virtual image in the first direction is P_lr-x/P_hr-x = (B-A)/(b-a). Note that, because the virtual-image pixel size differs from the imaging-element pixel size, the values of a and b are not necessarily integers even when A and B are;
Step S14, repeating steps S11 to S13 in the second direction to obtain the ratio between the pixel sizes of the imaging element and the virtual image in the second direction, P_lr-y/P_hr-y = (D-C)/(d-c), where pixel position coordinates increase from top to bottom, as shown in figs. 7 (a) and 7 (b). Here P_lr-y denotes the pixel size of the imaging element in the second direction, P_hr-y denotes the pixel size of the virtual image in the second direction, D and C denote the position coordinates of the two pixel points lit along the second direction of the virtual image with D>C, d and c denote the position coordinates on the imaging element of the images of those two pixel points with d>c, and the second direction is perpendicular to the first direction.
S2, acquiring, according to the pixel ratio, the ratio between the movement of the image of the virtual image on the imaging element and the movement of the virtual image;
as an example, the step S2 includes:
step S21, when the virtual image is moved in the first direction by 1 pixel of the virtual image, the corresponding image of the virtual image moves on the imaging element by Sx pixels of the imaging element,
Sx = P_hr-x/P_lr-x = (b-a)/(B-A) (1);
step S22, when the virtual image is moved in the second direction by 1 pixel of the virtual image, the corresponding image of the virtual image moves on the imaging element by Sy pixels of the imaging element,
Sy = P_hr-y/P_lr-y = (d-c)/(D-C) (2).
Thus, when the virtual image is controlled to move along the first direction or the second direction, the movement amount, in imaging-element pixels, of the corresponding image of the virtual image on the imaging element can be accurately obtained. For example, if the virtual image is shifted by 2 pixels of the virtual image in the first direction and 3 pixels of the virtual image in the second direction, the corresponding image moves on the imaging element by 2×Sx pixels in the first direction and 3×Sy pixels in the second direction.
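Formulas (1) and (2) can be applied as in the following sketch, continuing the hypothetical calibration values from above (the function name and numbers are illustrative, not from the patent):

```python
def movement_ratio(A, B, a, b):
    """Sx (or Sy): imaging-element pixels traversed when the virtual
    image moves by one of its own pixels, per formula (1)/(2)."""
    return (b - a) / (B - A)

Sx = movement_ratio(0, 90, 0.0, 20.0)  # (b-a)/(B-A) = 2/9 pixel
Sy = movement_ratio(0, 88, 0.0, 20.0)  # 20/88 = 5/22 pixel
# Moving the virtual image 2 of its pixels in x and 3 in y shifts the
# image on the imaging element by:
shift = (2 * Sx, 3 * Sy)
```

Note that each single-pixel step of the virtual image produces a sub-pixel shift on the imaging element, which is exactly what the later super-resolution step exploits.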
S3, determining a position to be moved of the virtual image according to the pixel proportion and the movement amount proportion;
as an example, the step S3 includes:
step S31, rounding P_lr-x/P_hr-x to obtain an integer value Rx;
step S32, rounding P_lr-y/P_hr-y to obtain an integer value Ry;
step S33, establishing a coordinate system with the initial position of the virtual image as the origin and the first and second directions as coordinate axes; in any coordinate quadrant, drawing a first straight line perpendicular to the first direction at each integer coordinate in [0, Rx] from the origin along the first direction, and drawing a second straight line perpendicular to the second direction at each integer coordinate in [0, Ry] from the origin along the second direction; all intersection positions of the first straight lines and the second straight lines are the positions to be moved of the virtual image, so there are (1+Rx)×(1+Ry) such positions. FIG. 8 shows the case where Rx equals 4 and Ry equals 4; the intersections of the dashed lines in FIG. 8, i.e., the positions of the solid triangles, are all the positions to which the virtual image needs to be moved. Since the pixel sizes in the horizontal and vertical directions are unrelated, the values of Rx and Ry may also differ. Correspondingly, FIG. 9 shows that, with Rx equal to 4 and Ry equal to 4, as the virtual image moves in the horizontal and vertical directions with a step of 1 virtual-image pixel, the corresponding image of the virtual image moves on the imaging element by an amount expressed in units of the imaging-element pixel size.
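The grid of positions to be moved in step S33 can be enumerated as below. This is a sketch using the hypothetical ratios from the calibration example; note that Python's built-in `round` applies banker's rounding to .5 values, whereas the patent only says "rounding".

```python
def positions_to_move(ratio_x, ratio_y):
    """All (1+Rx)*(1+Ry) intersection positions of step S33, expressed
    in virtual-image pixels relative to the initial position."""
    Rx = round(ratio_x)  # round P_lr-x / P_hr-x
    Ry = round(ratio_y)  # round P_lr-y / P_hr-y
    return [(x, y) for y in range(Ry + 1) for x in range(Rx + 1)]

grid = positions_to_move(4.5, 4.4)  # Rx = 4, Ry = 4, as in FIG. 8
# (1 + 4) * (1 + 4) = 25 positions, from (0, 0) to (4, 4)
```

Traversal order is left open, matching the remark below that any scan path (row-wise or column-wise) is acceptable as long as every position is visited.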
S4, controlling the virtual image to move to each position to be moved, correspondingly acquiring an imaging element image corresponding to each position to be moved on the imaging element to obtain an imaging element image sequence, and acquiring the relative movement amount between every two adjacent imaging element images in the sequence; FIG. 10(a) shows the movement process of the virtual image in one embodiment, and FIG. 10(b) shows the corresponding movement of the image of the virtual image on the imaging element.
As an example, the virtual image is controlled to move to each position to be moved by controlling the movement of the test image on the microdisplay; thus, the embodiment of the invention can acquire a plurality of images on the imaging element without mechanical scanning, simply by controlling the movement of the test image on the microdisplay, and by performing image recombination can acquire the image of the virtual image to be measured with high fidelity.
It should be noted that the specific moving path of the virtual image is not limited, and may be, for example, scanning along a row or scanning along a column, as shown in fig. 11 (a) and 11 (b), as long as all the positions to be moved are finally traversed.
As an example, in step S4, the acquiring a relative movement amount between every two adjacent imaging element images in the imaging element image sequence includes:
step S41, acquiring a first movement amount of a virtual image corresponding to every two adjacent imaging element images in the imaging element image sequence in a first direction and a second movement amount in a second direction;
step S42, obtaining a first relative movement amount in the first direction and a second relative movement amount in the second direction between every two adjacent imaging element images in the imaging element image sequence according to the first movement amount, the second movement amount, and formulas (1) and (2).
As can be seen from the above calculation, the relative movement amount between every two adjacent imaging element images is smaller than one imaging-element pixel.
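Steps S41 and S42 can be sketched as follows: given the scan path in virtual-image pixels and the ratios Sx and Sy, the relative movement between consecutive imaging-element images follows from formulas (1) and (2). The path and ratio values here are illustrative assumptions.

```python
def relative_movements(path, Sx, Sy):
    """Relative movement (first direction, second direction), in
    imaging-element pixels, between every two adjacent imaging element
    images along a scan path given in virtual-image pixels."""
    return [((x1 - x0) * Sx, (y1 - y0) * Sy)
            for (x0, y0), (x1, y1) in zip(path, path[1:])]

# Row-wise scan over a 2x2 corner of the grid, with Sx = Sy = 2/9:
moves = relative_movements([(0, 0), (1, 0), (0, 1), (1, 1)], 2 / 9, 2 / 9)
# Every component of every step is a fraction of one imaging-element pixel.
```

Since each virtual-image pixel step maps to 2/9 of an imaging-element pixel here, every relative movement in the sequence is sub-pixel, as the paragraph above states.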
And S5, carrying out image recombination based on the imaging element image sequence and the relative movement amount between two adjacent imaging element images to obtain a high-fidelity image of the virtual image to be detected.
As an example, in step S5, image recombination is performed with a super-resolution image reconstruction algorithm based on the imaging element image sequence and the relative movement amount between adjacent imaging element images, so as to obtain a high-fidelity image of the virtual image to be measured. The imaging element image sequence can be acquired with existing measuring equipment even if its resolution is low: a low-resolution imaging element image sequence is obtained by the method of the embodiment of the invention, and image reconstruction is then performed with a super-resolution image reconstruction algorithm to obtain a high-fidelity image of the virtual image to be measured; the effect images are shown in FIGS. 12(a), 12(b) and 12(c). The corresponding near-eye display device is then inspected by checking the quality of the image of the virtual image to be measured.
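As a toy stand-in for the super-resolution reconstruction of step S5 (the patent does not commit to a particular algorithm), a shift-and-add scheme interleaves the low-resolution frames onto an upsampled grid using their known sub-pixel offsets:

```python
import numpy as np

def shift_and_add(frames, offsets, scale):
    """Interleave low-resolution frames onto a grid upsampled by
    `scale`, using each frame's sub-pixel offset (in low-res pixels).
    A minimal sketch, not the patent's specific reconstruction."""
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    cnt = np.zeros_like(acc)
    for img, (dx, dy) in zip(frames, offsets):
        ix, iy = int(round(dx * scale)), int(round(dy * scale))
        acc[iy::scale, ix::scale][:h, :w] += img
        cnt[iy::scale, ix::scale][:h, :w] += 1
    return acc / np.maximum(cnt, 1)  # unvisited cells stay 0

# Two 2x2 frames, the second shifted by half a low-res pixel in x:
frames = [np.ones((2, 2)), 2 * np.ones((2, 2))]
hi = shift_and_add(frames, [(0.0, 0.0), (0.5, 0.0)], scale=2)
# hi has shape (4, 4); its even columns sample the first frame and the
# adjacent columns sample the second.
```

This illustrates why the sub-pixel relative movements computed in step S4 matter: they determine where each low-resolution sample lands on the high-resolution grid.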
The embodiment of the present invention further provides a measuring apparatus for a near-eye display, as shown in FIG. 13, including a pixel ratio acquisition module 1, a movement amount ratio acquisition module 2, a to-be-moved position determining module 3, an image sequence acquisition module 4, and an image recombination module 5. The pixel ratio acquisition module 1 is configured to acquire the pixel ratios of the imaging element and the virtual image; the movement amount ratio acquisition module 2 is configured to acquire the ratio of the movement amounts of the image formed by the virtual image on the imaging element and of the virtual image according to the pixel ratio; the to-be-moved position determining module 3 is configured to determine the positions to be moved of the virtual image according to the pixel ratio and the movement amount ratio; the image sequence acquisition module 4 is configured to control the virtual image to move to each of the positions to be moved, correspondingly acquire an imaging element image corresponding to each of the positions to be moved on the imaging element to obtain an imaging element image sequence, and acquire the relative movement amount between every two adjacent imaging element images in the sequence; the image recombination module 5 is configured to perform image recombination based on the imaging element image sequence and the relative movement amount between adjacent imaging element images, so as to obtain a high-fidelity image of the virtual image to be measured.
As an example, the pixel ratio acquisition module 1 includes a first pixel ratio acquisition unit and a second pixel ratio acquisition unit. The first pixel ratio acquisition unit is configured to: as shown in FIG. 6(a), light two pixel points along a first direction of the virtual image, whose position coordinates in the first direction are A and B respectively, with distance (B-A)*P_hr-x, wherein B > A and P_hr-x represents the pixel size of the virtual image in the first direction; as shown in FIG. 6(b), correspondingly acquire, on the imaging element, the images corresponding to the two pixel points lit along the first direction of the virtual image, whose position coordinates on the imaging element in the first direction are a and b respectively, with distance (b-a)*P_lr-x, wherein b > a and P_lr-x represents the pixel size of the imaging element in the first direction; and, according to (B-A)*P_hr-x = (b-a)*P_lr-x, obtain the ratio between the pixel sizes of the imaging element and of the virtual image in the first direction as P_lr-x/P_hr-x = (B-A)/(b-a).
As shown in FIGS. 7(a) and 7(b), the second pixel ratio acquisition unit is configured to obtain, following the operation of the first pixel ratio acquisition unit, the ratio between the pixel sizes of the imaging element and of the virtual image in the second direction as P_lr-y/P_hr-y = (D-C)/(d-c), wherein P_lr-y represents the pixel size of the imaging element in the second direction, P_hr-y represents the pixel size of the virtual image in the second direction, D and C represent the position coordinates of the two pixel points lit along the second direction of the virtual image, D > C, d and c represent the position coordinates, in the second direction on the imaging element, of the images corresponding to the two pixel points lit along the second direction of the virtual image, d > c, and the second direction is perpendicular to the first direction. As an example, the first direction is the horizontal direction and the second direction is the vertical direction.
As an example, the movement amount ratio acquisition module 2 includes a first movement amount ratio acquisition unit and a second movement amount ratio acquisition unit. The first movement amount ratio acquisition unit is configured such that, when the virtual image is moved in the first direction by 1 pixel of the virtual image, the corresponding image of the virtual image moves on the imaging element by Sx pixels of the imaging element,
Sx = P_hr-x/P_lr-x = (b-a)/(B-A) (1);
the second movement amount ratio acquisition unit is configured such that, when the virtual image is moved in the second direction by 1 pixel of the virtual image, the corresponding image of the virtual image moves on the imaging element by Sy pixels of the imaging element,
Sy = P_hr-y/P_lr-y = (d-c)/(D-C) (2).
Thus, when the virtual image is controlled to move along the first direction or the second direction, the movement amount, in imaging-element pixels, of the corresponding image of the virtual image on the imaging element can be accurately obtained. For example, if the virtual image is shifted by 2 pixels of the virtual image in the first direction and 3 pixels of the virtual image in the second direction, the corresponding image moves on the imaging element by 2×Sx pixels in the first direction and 3×Sy pixels in the second direction.
As an example, the to-be-moved position determining module 3 is specifically configured to: round P_lr-x/P_hr-x to obtain an integer value Rx; round P_lr-y/P_hr-y to obtain an integer value Ry; establish a coordinate system with the initial position of the virtual image as the origin and the first and second directions as coordinate axes; in any coordinate quadrant, draw a first straight line perpendicular to the first direction at each integer coordinate in [0, Rx] from the origin along the first direction, and draw a second straight line perpendicular to the second direction at each integer coordinate in [0, Ry] from the origin along the second direction; all intersection positions of the first straight lines and the second straight lines are the positions to be moved of the virtual image, so there are (1+Rx)×(1+Ry) such positions. In the example shown in FIG. 8, Rx equals 4 and Ry equals 4; the intersections of the dashed lines in FIG. 8, i.e., the positions of the solid triangles, are the positions to which the virtual image needs to be moved. Since the pixel sizes in the horizontal and vertical directions are unrelated, the values of Rx and Ry may also differ. Correspondingly, FIG. 9 shows that, with Rx equal to 4 and Ry equal to 4, as the virtual image moves in the horizontal and vertical directions with a step of 1 virtual-image pixel, the corresponding image of the virtual image moves on the imaging element by an amount expressed in units of the imaging-element pixel size.
As an example, the image sequence acquisition module 4 is further configured to control the virtual image to move to each of the positions to be moved by controlling the movement of a test image on the microdisplay; thus, in the embodiment of the present invention, without mechanical scanning, a plurality of images can be acquired on the imaging element simply by controlling the movement of the test image on the microdisplay, and by performing image recombination the image of the virtual image to be measured can be acquired with high fidelity. FIG. 10(a) shows the movement process of the virtual image in one embodiment, and FIG. 10(b) shows the corresponding movement of the image of the virtual image on the imaging element. It should be noted that the specific moving path of the virtual image is not limited, and may be, for example, scanning along rows or along columns, as shown in FIGS. 11(a) and 11(b), as long as all the positions to be moved are finally traversed.
The image sequence acquisition module 4 includes a relative movement amount acquisition unit configured to: acquire a first movement amount in the first direction and a second movement amount in the second direction of the virtual image corresponding to every two adjacent imaging element images in the imaging element image sequence; and acquire a first relative movement amount in the first direction and a second relative movement amount in the second direction between every two adjacent imaging element images according to the first movement amount, the second movement amount, and formulas (1) and (2). As can be seen from the above calculation, the relative movement amount between every two adjacent imaging element images is smaller than one imaging-element pixel.
As an example, the image recombination module 5 performs image recombination with a super-resolution image reconstruction algorithm to obtain a high-fidelity image of the virtual image to be measured. The imaging element image sequence can be acquired with existing measuring equipment even if its resolution is low: a low-resolution imaging element image sequence is obtained by the apparatus of the embodiment of the invention, and image reconstruction is then performed with a super-resolution image reconstruction algorithm to obtain a high-fidelity image of the virtual image to be measured; the effect images are shown in FIGS. 12(a), 12(b) and 12(c). The corresponding near-eye display device is then inspected by checking the quality of the image of the virtual image to be measured.
An embodiment of the present invention further provides a controller, which includes a memory and a processor, where the memory stores a computer program, and the program, when executed by the processor, can implement the steps of the measurement method of the near-eye display.
Embodiments of the present invention also provide a computer-readable storage medium for storing a computer program, which when executed by a computer or a processor implements the steps of the method for measuring a near-eye display.
The embodiment of the invention also provides a measuring system of the near-eye display, which comprises a measuring component and the near-eye display, wherein the measuring component comprises a front aperture, a camera lens, an imaging element and a controller; the near-eye display forms an image on the imaging element; the controller comprises a memory and a processor, the memory storing a computer program which, when executed by the processor, is capable of implementing the steps of the method of measurement of the near-eye display.
As an example, the measurement system of the near-eye display may be applied to the following scenarios: the field angle of the imaging element is greater than the field angle of the near-eye display, and the resolution of the imaging element is lower than the resolution of the near-eye display.
According to the embodiment of the invention, the resolution of the near-eye display test system is enhanced: a low-resolution imaging element can be used directly to collect images, and a plurality of low-resolution images are used to reconstruct a high-fidelity image of the virtual image of the near-eye display. An image is displayed in the virtual image and moved using the pixel-size relationship between the virtual image and the imaging element; each time the image moves to a position, a low-resolution image is acquired with the imaging element, so that the imaging element obtains a plurality of low-resolution images, with a movement amount smaller than one imaging-element pixel between them. Compared with traditional super-resolution methods, the embodiment of the invention can obtain a controllable movement amount without a precise mechanical movement device.
The embodiment of the invention has a lower cost: high-fidelity virtual image images can still be obtained with an existing low-resolution test system combined with super-resolution reconstruction; the image is scanned on the microdisplay, without precise mechanical devices to move the imaging element or the entire test system; and through the calibration stage, the precise movement amount can be obtained, making the calculation more efficient and highly accurate.
Although the present invention has been described with reference to a preferred embodiment, it should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (18)

1. A method of measuring a near-eye display, comprising:
acquiring pixel proportions of the imaging element and the virtual image;
acquiring the ratio of the movement amount of the virtual image on the imaging element and the movement amount of the virtual image according to the pixel ratio;
determining the position to be moved of the virtual image according to the pixel proportion and the movement amount proportion;
controlling the virtual image to move to each position to be moved, correspondingly acquiring an imaging element image corresponding to each position to be moved on an imaging element, acquiring an imaging element image sequence, and acquiring the relative movement amount between every two adjacent imaging element images in the imaging element image sequence;
and carrying out image recombination based on the imaging element image sequence and the relative movement amount between two adjacent imaging element images to obtain an image of a virtual image to be detected.
2. The method of measuring a near-eye display of claim 1,
the acquiring of the pixel proportions of the imaging element and the virtual image comprises:
lighting two pixel points along a first direction of the virtual image, wherein the position coordinates in the first direction are A and B respectively, and the distance is (B-A)*P_hr-x, wherein B > A, and P_hr-x represents the pixel size of the virtual image in the first direction;
correspondingly acquiring, on the imaging element, images corresponding to the two pixel points lit along the first direction of the virtual image, wherein the position coordinates on the imaging element in the first direction are a and b respectively, and the distance is (b-a)*P_lr-x, wherein b > a, and P_lr-x represents the pixel size of the imaging element in the first direction;
according to (B-A)*P_hr-x = (b-a)*P_lr-x, obtaining the ratio between the pixel sizes of the imaging element and of the virtual image in the first direction as P_lr-x/P_hr-x = (B-A)/(b-a);
and acquiring the ratio between the pixel sizes of the imaging element and of the virtual image in the second direction as P_lr-y/P_hr-y = (D-C)/(d-c), wherein P_lr-y represents the pixel size of the imaging element in the second direction, P_hr-y represents the pixel size of the virtual image in the second direction, D and C represent the position coordinates of the two pixel points lit along the second direction of the virtual image, D > C, d and c represent the position coordinates, in the second direction on the imaging element, of the images corresponding to the two pixel points lit along the second direction of the virtual image, d > c, and the second direction is perpendicular to the first direction.
3. The method of measuring a near-eye display of claim 2,
the acquiring of the ratio of the moving amount of the virtual image on the imaging element and the virtual image according to the pixel ratio includes:
moving the virtual image in the first direction by 1 pixel of the virtual image, the corresponding image of the virtual image moving on the imaging element by Sx pixels of the imaging element,
Sx = P_hr-x/P_lr-x = (b-a)/(B-A) (1);
moving the virtual image in the second direction by 1 pixel of the virtual image, the corresponding image of the virtual image moving on the imaging element by Sy pixels of the imaging element,
Sy = P_hr-y/P_lr-y = (d-c)/(D-C) (2).
4. the method of measuring a near-eye display of claim 3,
determining the position to be moved of the virtual image according to the pixel proportion and the movement amount proportion, wherein the step comprises the following steps:
rounding P_lr-x/P_hr-x to obtain an integer value Rx;
rounding P_lr-y/P_hr-y to obtain an integer value Ry;
establishing a coordinate system with the initial position of the virtual image as the origin and the first direction and the second direction as coordinate axes, drawing, in any coordinate quadrant, a first straight line perpendicular to the first direction at each integer coordinate in [0, Rx] from the origin along the first direction, and a second straight line perpendicular to the second direction at each integer coordinate in [0, Ry] from the origin along the second direction, wherein all intersection positions of the first straight lines and the second straight lines are the positions to be moved of the virtual image.
5. The method of measuring a near-eye display of claim 4,
the acquiring of the relative movement amount between every two adjacent imaging element images in the imaging element image sequence comprises:
acquiring a first movement amount of a virtual image corresponding to every two adjacent imaging element images in the imaging element image sequence in a first direction and a second movement amount in a second direction;
and acquiring a first relative movement amount in a first direction and a second relative movement amount in a second direction between every two adjacent imaging element images in the imaging element image sequence according to the first movement amount, the second movement amount and the formula (1) and the formula (2).
6. The method of measuring a near-eye display of claim 1,
the image reconstruction based on the imaging element image sequence and the relative movement amount between two adjacent imaging element images to obtain the image of the virtual image to be measured comprises the following steps:
and based on the imaging element image sequence and the relative movement amount between two adjacent imaging element images, performing image recombination by adopting a super-resolution image reconstruction algorithm to obtain an image of the virtual image to be detected.
7. The method of measuring a near-eye display of any one of claims 1 to 6,
the method further comprises the following steps: controlling the virtual image to move to each of the positions to be moved by controlling movement of a test image on the microdisplay.
8. A measurement device for a near-eye display, comprising:
a pixel proportion acquisition module configured to acquire pixel proportions of the imaging element and the virtual image;
a moving amount ratio acquisition module configured to acquire an image of the virtual image on the imaging element and a moving amount ratio of the virtual image according to the pixel ratio;
the to-be-moved position determining module is configured to determine the to-be-moved position of the virtual image according to the pixel proportion and the movement amount proportion;
an image sequence acquisition module configured to control the virtual image to move to each of the positions to be moved, correspondingly acquire an imaging element image corresponding to each of the positions to be moved on an imaging element, acquire an imaging element image sequence, and acquire a relative movement amount between every two adjacent imaging element images in the imaging element image sequence;
and the image recombination module is configured to perform image recombination based on the imaging element image sequence and the relative movement amount between two adjacent imaging element images to obtain an image of a virtual image to be detected.
9. The near-eye display measuring device according to claim 8,
the pixel proportion acquisition module comprises a first pixel proportion acquisition unit and a second pixel proportion acquisition unit, wherein,
the first pixel proportion acquisition unit is configured to:
light two pixel points along a first direction of the virtual image, wherein the position coordinates in the first direction are A and B respectively, and the distance is (B-A)*P_hr-x, wherein B > A, and P_hr-x represents the pixel size of the virtual image in the first direction;
correspondingly acquire, on the imaging element, images corresponding to the two pixel points lit along the first direction of the virtual image, wherein the position coordinates on the imaging element in the first direction are a and b respectively, and the distance is (b-a)*P_lr-x, wherein b > a, and P_lr-x represents the pixel size of the imaging element in the first direction;
and, according to (B-A)*P_hr-x = (b-a)*P_lr-x, obtain the ratio between the pixel sizes of the imaging element and of the virtual image in the first direction as P_lr-x/P_hr-x = (B-A)/(b-a);
the second pixel proportion acquisition unit is configured to:
following the operation of the first pixel proportion acquisition unit, acquire the ratio between the pixel sizes of the imaging element and of the virtual image in the second direction as P_lr-y/P_hr-y = (D-C)/(d-c), wherein P_lr-y represents the pixel size of the imaging element in the second direction, P_hr-y represents the pixel size of the virtual image in the second direction, D and C represent the position coordinates of the two pixel points lit along the second direction of the virtual image, D > C, d and c represent the position coordinates, in the second direction on the imaging element, of the images corresponding to the two pixel points lit along the second direction of the virtual image, d > c, and the second direction is perpendicular to the first direction.
10. The measurement arrangement of a near-eye display of claim 9,
the movement amount ratio acquisition module includes a first movement amount ratio acquisition unit and a second movement amount ratio acquisition unit, wherein,
the first movement amount proportion acquisition unit is configured to:
move the virtual image in the first direction by 1 pixel of the virtual image, the corresponding image of the virtual image moving on the imaging element by Sx pixels of the imaging element,
Sx = P_hr-x/P_lr-x = (b-a)/(B-A) (1);
the second movement amount proportion acquisition unit is configured to:
move the virtual image in the second direction by 1 pixel of the virtual image, the corresponding image of the virtual image moving on the imaging element by Sy pixels of the imaging element,
Sy = P_hr-y/P_lr-y = (d-c)/(D-C) (2).
11. the measurement arrangement of a near-eye display of claim 10,
the to-be-moved position determining module is specifically configured to:
round P_lr-x/P_hr-x to obtain an integer value Rx;
round P_lr-y/P_hr-y to obtain an integer value Ry;
establish a coordinate system with the initial position of the virtual image as the origin and the first direction and the second direction as coordinate axes, draw, in any coordinate quadrant, a first straight line perpendicular to the first direction at each integer coordinate in [0, Rx] from the origin along the first direction, and a second straight line perpendicular to the second direction at each integer coordinate in [0, Ry] from the origin along the second direction, wherein all intersection positions of the first straight lines and the second straight lines are the positions to be moved of the virtual image.
12. The measurement arrangement of a near-eye display of claim 11,
the image sequence acquisition module includes a relative movement amount acquisition unit configured to:
acquiring a first movement amount of a virtual image corresponding to every two adjacent imaging element images in the imaging element image sequence in a first direction and a second movement amount in a second direction;
and acquiring a first relative movement amount in a first direction and a second relative movement amount in a second direction between every two adjacent imaging element images in the imaging element image sequence according to the first movement amount, the second movement amount and the formula (1) and the formula (2).
13. The measurement arrangement of a near-eye display of claim 8,
and the image recombination module adopts a super-resolution image reconstruction algorithm to carry out image recombination to obtain the image of the virtual image to be detected.
14. The near-eye display measurement device according to any one of claims 8 to 13,
the image sequence acquisition module is further configured to control the virtual image to move to each of the positions to be moved by controlling movement of a test image on a microdisplay.
15. A controller comprising a memory and a processor, characterized in that the memory stores a computer program which, when executed by the processor, is capable of carrying out the steps of the method of any one of claims 1 to 7.
16. A computer-readable storage medium for storing a computer program, characterized in that the program realizes the steps of the method according to any one of claims 1 to 7 when executed by a computer or processor.
17. A measurement system for a near-eye display, comprising:
a measurement assembly comprising a pre-aperture, a camera lens, an imaging element, and a controller,
a near-eye display that forms an image on the imaging element;
the controller comprises a memory and a processor, the memory storing a computer program which, when executed by the processor, is capable of implementing the steps of the method of any one of claims 1 to 7.
18. The measurement system of the near-eye display of claim 17,
the field angle of the imaging element is greater than the field angle of the near-eye display, and the resolution of the imaging element is lower than the resolution of the near-eye display.
CN201910605246.4A 2019-07-05 2019-07-05 Near-eye display measurement method, device, system, controller and medium Active CN110264408B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910605246.4A CN110264408B (en) 2019-07-05 2019-07-05 Near-eye display measurement method, device, system, controller and medium

Publications (2)

Publication Number Publication Date
CN110264408A CN110264408A (en) 2019-09-20
CN110264408B true CN110264408B (en) 2022-12-06

Family

ID=67924637

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910605246.4A Active CN110264408B (en) 2019-07-05 2019-07-05 Near-eye display measurement method, device, system, controller and medium

Country Status (1)

Country Link
CN (1) CN110264408B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111629197B (en) * 2020-06-10 2022-07-26 芋头科技(杭州)有限公司 Method, apparatus, controller and medium for improving resolution of light field near-eye display
CN114593897B (en) * 2022-03-04 2023-07-14 杭州远方光电信息股份有限公司 Measuring method and device of near-eye display
WO2024159386A1 (en) * 2023-01-31 2024-08-08 上海显耀显示科技有限公司 Defect detection method for near-eye display

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102164298B (en) * 2011-05-18 2012-10-03 长春理工大学 Method for acquiring element image based on stereo matching in panoramic imaging system
JP5755571B2 (en) * 2012-01-11 2015-07-29 シャープ株式会社 Virtual viewpoint image generation device, virtual viewpoint image generation method, control program, recording medium, and stereoscopic display device
JP6846165B2 (en) * 2016-11-01 2021-03-24 日本放送協会 Image generator, image display system and program
CN108769462B (en) * 2018-06-06 2020-05-05 北京邮电大学 Free visual angle scene roaming method and device

Similar Documents

Publication Publication Date Title
CN110264408B (en) Near-eye display measurement method, device, system, controller and medium
CN110809786B (en) Calibration device, calibration chart, chart pattern generation device, and calibration method
CN110967166B (en) Detection method, detection device and detection system of near-eye display optical system
WO2012020760A1 (en) Gaze point detection method and gaze point detection device
CN108063940B (en) Correction system and method for human eye tracking naked eye 3D display system
CN113252309A (en) Testing method and testing device for near-to-eye display equipment and storage medium
CN109862345B (en) Method and system for testing field angle
CN110361167B (en) Testing method of head-up display
CN116519257B (en) Three-dimensional flow field testing method and system based on double-view background schlieren of single-light-field camera
CN110971791B (en) Method for adjusting consistency of optical axis of camera zoom optical system and display instrument
US20190050671A1 (en) Image breathing correction systems and related methods
CN113298886A (en) Calibration method of projector
KR100596976B1 (en) apparatus and method for correcting distorted image and image display system using it
CN111311659A (en) Calibration method based on three-dimensional imaging of oblique plane mirror
JP2006227774A (en) Image display method
CN110189603A (en) A kind of EXPERIMENT OF NEWTON ' S device observed using digital camera
JP2015102532A (en) Three-dimensional shape measurement device
WO2009107365A1 (en) Test method and test device for compound-eye distance measuring device and chart used therefor
CN114972526A (en) Method and device for measuring angle of field, computer device and medium
JP4695557B2 (en) Element image group correction apparatus, element image group acquisition system, element image group correction method, and element image group correction program
KR20150119770A (en) Method for measuring 3-dimensional cordinates with a camera and apparatus thereof
CN116309854A (en) Method, device, equipment, system and storage medium for calibrating augmented reality equipment
KR102295987B1 (en) Calibration method and apparatus of stereo camera module, computer readable storage medium
CN117455919B (en) Background schlieren method, device, equipment and medium based on virtual knife edge
CN112822481A (en) Detection method and detection system for correction quality of stereo camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant