CN113031262B - Integrated imaging system display end pixel value calculation method and system - Google Patents

Integrated imaging system display end pixel value calculation method and system

Info

Publication number
CN113031262B
CN113031262B
Authority
CN
China
Prior art keywords
display
lens
intersection point
pixel
collecting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110325606.2A
Other languages
Chinese (zh)
Other versions
CN113031262A (en)
Inventor
闫兴鹏
毛岩
王维锋
蒋晓瑜
荆涛
刘云鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Academy of Armored Forces of PLA
Original Assignee
Academy of Armored Forces of PLA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Academy of Armored Forces of PLA filed Critical Academy of Armored Forces of PLA
Priority to CN202110325606.2A priority Critical patent/CN113031262B/en
Publication of CN113031262A publication Critical patent/CN113031262A/en
Application granted granted Critical
Publication of CN113031262B publication Critical patent/CN113031262B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)
  • Measurement Of Radiation (AREA)

Abstract

The invention discloses a method and a system for calculating display end pixel values in an integrated imaging system. The improved SPOC method fully accounts for the converging effect of the display lens array on light: the value of each display end pixel is computed as the average of the acquisition end pixels of all collecting lenses that rays through that pixel point can cover. This resolves the problem that the single acquisition end pixel addressed by one ray may carry a large error or be a 'hole' pixel. Moreover, because the improved SPOC method draws on the pixels of every collecting lens the rays can cover, more of the captured information is used.

Description

Integrated imaging system display end pixel value calculation method and system
Technical Field
The invention relates to the field of integrated imaging systems, in particular to a method and a system for calculating a pixel value of a display end of an integrated imaging system.
Background
In an integrated imaging system, the SPOC method sets a reference plane and treats the display lens array as an equivalent pinhole array. A ray from each display end pixel passes through the pinhole corresponding to the elementary image containing that pixel and intersects the reference plane and a collecting lens; the intersection point on the reference plane and the optical center of the corresponding collecting lens determine the corresponding acquisition end pixel, and the pixel value of that acquisition end pixel is taken as the pixel value of the display end pixel.
The SPOC method is simple and practical and solves the problems of mismatched acquisition and display parameters and inverted display depth. Its drawbacks are that treating the display lens array as a pinhole array clearly fails to reflect the converging effect of the display lenses on light, and that only a single ray is used to look up the corresponding acquisition end pixel. When the view images are captured with real cameras, mechanical or human error may make the acquisition end pixel addressed by that ray carry a large error or be a 'hole' pixel, so the elementary image generated at the display end by the SPOC algorithm likewise carries a large error or contains 'hole' pixels.
Disclosure of Invention
In order to solve the above problems, the present invention provides a method and a system for calculating display end pixel values in an integrated imaging system, in which the value of each display end pixel point is computed as the average of the acquisition end pixels of the collecting lenses that rays through that pixel point can cover.
In order to achieve the purpose, the invention provides the following scheme:
an integrated imaging system display side pixel value calculation method, the integrated imaging system comprising: the device comprises a display end, a display lens array, a reference surface, a collecting lens array and a collecting end; the reference surface is located intermediate the display lens array and the collection lens array, the display lens array including a plurality of display lenses and the collection lens array including a plurality of collection lenses;
the pixel value calculation method includes:
calculating the x-axis coordinate of the point where a ray passing through a display end elementary image pixel point and the display lens array intersects the reference plane, and recording it as the first intersection point coordinate;
determining the rays that pass through the reference-plane intersection point and through the upper edge and the lower edge of the collecting lens array;
determining, based on the first intersection point coordinates, the upper edge collecting lens corresponding to the upper edge rays and the lower edge lens corresponding to the lower edge rays, the upper edge collecting lens and the lower edge lens both being collecting lenses in the collecting lens array;
calculating the pixel values at the points where the rays passing through the upper edge collecting lens and the lower edge lens respectively intersect the plane of the acquisition end elementary images, these being the acquisition end pixel values;
and averaging all the acquisition end pixel values to obtain the pixel value of the display end elementary image pixel point, namely the display end pixel value.
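To make the geometry behind these five steps concrete, the following is a minimal Python sketch of the improved SPOC mapping for a single display end pixel. It is an illustration under stated assumptions rather than the patent's reference implementation: the lens-centre convention (lens k centred at (k + 0.5) times the lens pitch), the explicit formulas, and the image-access helper capture_pixel are all introduced here and are not taken from the patent, whose equations appear only as images.

```python
import math
from typing import Callable


def improved_spoc_pixel(
    j: int, m: int,                       # elementary image index and pixel index at the display end
    p_S: float, n_S: int, g_S: float,     # display lens size, pixels per elementary image, image-to-lens gap
    p_D: float, n_D: int, g_D: float,     # collecting lens size, pixels per elementary image, image-to-lens gap
    y: float, d_S: float,                 # lens-array separation, reference plane to display lens array
    num_collecting_lenses: int,
    capture_pixel: Callable[[int, int], float],  # assumed helper: value of pixel v in captured elementary image u
) -> float:
    """Average the acquisition end pixels of every collecting lens covered by the
    ray bundle through display end pixel (j, m).  The lens-centre convention
    (lens k centred at (k + 0.5) * pitch) and these explicit formulas are
    illustrative assumptions, not the patent's own equations."""
    d_D = y - d_S                                     # reference plane to collecting lens array

    # Step 1: chief ray through the pixel and the optical centre of display lens j
    # meets the reference plane at delta_O (first intersection coordinate).
    lens_S = (j + 0.5) * p_S
    x_S = j * p_S + (m + 0.5) * p_S / n_S
    delta_O = lens_S + (lens_S - x_S) * d_S / g_S

    # Steps 2-3: edge rays through the reference-plane point and the upper/lower
    # edges of display lens j; where they hit the collecting lens array fixes the
    # range of covered collecting lenses.
    hits = [delta_O + (delta_O - edge) * d_D / d_S for edge in ((j + 1) * p_S, j * p_S)]
    i_lo, i_hi = sorted(int(math.floor(h / p_D)) for h in hits)
    i_lo, i_hi = max(i_lo, 0), min(i_hi, num_collecting_lenses - 1)

    # Steps 4-5: for every covered collecting lens, trace the ray through the
    # reference-plane point and the lens centre to the captured elementary image,
    # read that pixel, and average the values.
    values = []
    for u in range(i_lo, i_hi + 1):
        lens_D = (u + 0.5) * p_D
        x_Du = lens_D + (lens_D - delta_O) * g_D / d_D        # second intersection coordinate
        v = int(math.floor((x_Du - u * p_D) / p_D * n_D))     # pixel index inside elementary image u
        if 0 <= v < n_D:
            values.append(capture_pixel(u, v))
    return sum(values) / len(values) if values else 0.0
```

Out-of-range lens and pixel indices are simply skipped in this sketch; the patent text does not discuss boundary handling, so that choice is also an assumption.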
Further, the calculation formula of the first intersection point coordinate is as follows:
[formula images not reproduced]
wherein Δ_O is the first intersection coordinate, n_S is the number of pixels per elementary image at the display end, d_S is the distance from the reference plane to the display lens array, g_S is the distance from the plane of the display end elementary image to the display lens array, p_S is the display lens size, x_S is the x-axis coordinate of the mth pixel of the jth elementary image at the display end, and S represents the display end.
Further, the determining, based on the first intersection coordinates, an upper edge collecting lens corresponding to the upper edge light and a lower edge lens corresponding to the lower edge light specifically includes:
calculating intersection point coordinates of the upper edge light rays, the lower edge light rays and the collecting lens array based on the first intersection point coordinates, and taking the intersection point coordinates as upper edge intersection point coordinates and lower edge intersection point coordinates;
and respectively determining an upper edge collecting lens and a lower edge collecting lens according to the upper edge intersection point coordinate and the lower edge intersection point coordinate.
Further, the calculation formula of the upper edge intersection point coordinate is as follows:
[formula image not reproduced]
wherein the quantity on the left-hand side is the upper edge intersection point coordinate, j is the index of the jth elementary image at the display end, p_S is the display lens size, d_S is the distance from the reference plane to the display lens array, y is the distance from the collecting lens array to the display lens array, and Δ_O is the first intersection coordinate;
the calculation formula of the lower edge intersection point coordinate is as follows:
[formula image not reproduced]
wherein the quantity on the left-hand side is the lower edge intersection point coordinate.
Further, the determination formula of the upper edge collecting lens is as follows:
[formula image not reproduced]
wherein i_top is the number of the upper edge collecting lens, p_D is the size of the collecting lens, and D represents the acquisition end;
the lower edge collecting lens is determined by the following formula:
[formula image not reproduced]
wherein i_bottom is the number of the lower edge collecting lens.
Further, the calculating the pixel value of the intersection point of the light ray passing through the upper edge collecting lens and the lower edge lens respectively and the plane where the primitive image of the collecting end is located specifically includes:
calculating the x-axis coordinate of the intersection point of the plane where the light ray passes through the upper edge collecting lens and the lower edge lens and the elementary image of the collecting end respectively to be a second intersection point coordinate;
calculating the pixel sequence number of the size of the primitive image where the second intersection point is located according to the second intersection point coordinate; the second intersection point is the intersection point of the light and the plane where the primitive image of the acquisition end is located;
and determining the pixel value of the acquisition end according to the pixel sequence number.
Further, the calculation formula of the second intersection point coordinate is as follows:
[formula image not reproduced]
wherein x_Du is the second intersection coordinate, Δ_O is the first intersection coordinate, g_D is the distance from the plane of the acquisition end elementary image to the collecting lens array, y is the distance from the collecting lens array to the display lens array, d_S is the distance from the reference plane to the display lens array, p_D is the size of the collecting lens, u is the number of a collecting lens in the collecting lens array, i_top is the number of the upper edge collecting lens, i_bottom is the number of the lower edge collecting lens, and i_top ≤ u ≤ i_bottom.
Further, the calculation formula of the pixel sequence number is as follows:
[formula image not reproduced]
wherein v_jmu is the pixel sequence number and n_D is the number of pixels per elementary image at the acquisition end.
Further, the calculation formula of the pixel value of a display end elementary image pixel point is as follows:
[formula image not reproduced]
wherein the quantity on the left-hand side is the pixel value of the display end pixel point, u is the number of a collecting lens in the collecting lens array, i_top is the number of the upper edge collecting lens, i_bottom is the number of the lower edge collecting lens, i_top ≤ u ≤ i_bottom,
[symbol image not reproduced]
is the acquisition end pixel value, y is the distance from the collecting lens array to the display lens array, and v is the pixel sequence number within the elementary image.
The invention also provides a system for calculating the pixel value of the display end of the integrated imaging system, which comprises the following components:
the first intersection point coordinate calculation module is used for calculating the x-axis coordinate of the intersection point of the light ray passing through the elementary image pixel point of the display end and the display lens array and the reference surface and recording the x-axis coordinate as a first intersection point coordinate;
the upper and lower edge light ray module is used for determining the light rays passing through the intersection point of the reference surface and passing through the upper edge and the lower edge of the collecting lens array;
the upper edge and lower edge collecting lens determining module is used for determining an upper edge collecting lens corresponding to the upper edge light rays and a lower edge lens corresponding to the lower edge light rays on the basis of the first intersection point coordinates; the upper edge collecting lens and the lower edge lens are both collecting lenses in the collecting lens array;
the acquisition end pixel value calculating module is used for calculating the pixel value of the intersection point of the plane where the light ray passes through the upper edge acquisition lens and the lower edge lens and the elementary image of the acquisition end respectively to obtain an acquisition end pixel value;
and the display end pixel value calculating module is used for averaging all the acquisition end pixel values to obtain the pixel value of the pixel point of the elementary image of the display end, and the pixel value is the display end pixel value.
According to the specific embodiment provided by the invention, the invention discloses the following technical effects:
the invention provides a method and a system for calculating a pixel value of a display end of an integrated imaging system. The method comprises the following steps: calculating the x-axis coordinate of the intersection point of the light ray passing through the elementary image pixel point of the display end and the display lens array and the reference surface, and recording the x-axis coordinate as a first intersection point coordinate; determining the light rays passing through the intersection point of the reference plane and passing through the upper edge and the lower edge of the collecting lens array; determining an upper edge collecting lens corresponding to the upper edge light rays and a lower edge lens corresponding to the lower edge light rays on the basis of the first intersection point coordinates; the upper edge collecting lens and the lower edge lens are both collecting lenses in the collecting lens array; calculating pixel values of intersection points of the light rays passing through the upper edge collecting lens and the lower edge lens respectively and the plane where the element images of the collecting end are located, wherein the pixel values are pixel values of the collecting end; and averaging the pixel values of all the acquisition ends to obtain the pixel value of the pixel point of the elementary image of the display end, namely the pixel value of the display end. The improved SPOC method fully considers the convergence effect of the display lens array on light, converts the pixel value of the display end pixel into the average value of the pixel corresponding to the collecting lens which can be covered by the light through the pixel point, and solves the problem that the pixel of the collecting end corresponding to the light has larger error or is a 'hollow' pixel. In addition, the improved SPOC method includes all the information of the pixels corresponding to the collecting lens which can be covered by the pixel point, which means that more information can be obtained.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings without inventive exercise.
FIG. 1 is a flowchart illustrating a method for calculating a pixel value at a display end of an integrated imaging system according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the SPOC algorithm;
fig. 3 is a schematic diagram of the improved SPOC algorithm.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The SPOC method is simple and practical and solves the problems of mismatched acquisition and display parameters and inverted display depth. Its drawbacks are that treating the display lens array as a pinhole array clearly fails to reflect the converging effect of the display lenses on light, and that only a single ray is used to look up the corresponding acquisition end pixel; when the view images are captured with real cameras, mechanical or human error may make that acquisition end pixel carry a large error or be a 'hole' pixel, so the SPOC algorithm in turn produces large errors.
First, the SPOC method is introduced. As shown in fig. 2, a straight line perpendicular to the optical axis of the collecting lens array is taken as an x-axis, a straight line passing through a certain pinhole of the display pinhole array is taken as a z-axis, and an intersection point of the z-axis and a plane where the collecting end element image is located is taken as an origin o, so as to establish a coordinate system.
Let the pinhole pitch of the display pinhole array be p_S and the x coordinate of the mth pixel of the jth elementary image (counting from 0) at the display end be x_S; then
[formula image not reproduced]
wherein n_S is the number of pixels per elementary image at the display end and S represents the display end. The collecting lenses are arranged compactly without gaps. The distance from the plane of the acquisition end elementary image to the collecting lens array is g_D (D represents the acquisition end), the distance from the plane of the display end elementary image to the display pinhole array is g_S, and the distance from the collecting lens array to the display pinhole array is y. A reference plane is selected between the collecting lens array and the display pinhole array, at distances d_D and d_S from the collecting lens array and the display pinhole array respectively. The x coordinate of the point where the ray through an elementary image pixel and the corresponding display pinhole intersects the reference plane is Δ_O; then
[formula image not reproduced]
The x coordinate of the intersection point of this ray with the collecting lens array is Δ_D; then
[formula image not reproduced]
The corresponding collecting lens number is u_jm; then
[formula image not reproduced]
wherein p_D is the size of the collecting lens. The x coordinate of the point where the ray through the reference-plane intersection point and the optical center of the corresponding collecting lens intersects the plane of the acquisition end elementary image is x_D; then
[formula image not reproduced]
The corresponding elementary image number is the same as the collecting lens number, u_jm. The pixel sequence number within that elementary image of the intersection point of the ray with the plane of the acquisition end elementary image is v_jm:
[formula image not reproduced]
wherein n_D is the number of pixels per elementary image at the acquisition end. The corresponding pixel value
[symbol image not reproduced]
is the pixel value at the intersection of the ray with the plane of the acquisition end elementary image. The pixel value of the display end pixel
[symbol image not reproduced]
is simply the acquisition end pixel value
[symbol image not reproduced]
found along this single ray, namely
[formula image not reproduced]
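Because the equation images in the passage above are not reproduced, the following LaTeX fragment writes out one plausible explicit form of the single-ray SPOC mapping. The conventions used (display pinhole j and collecting lens u centred at (j + 1/2)p_S and (u + 1/2)p_D, elementary image k spanning [k·p, (k+1)·p], and d_D = y − d_S) are assumptions introduced here for illustration; they are not the patent's own equations.

```latex
\begin{aligned}
  x_S &= \Bigl(j + \tfrac{m + 0.5}{n_S}\Bigr) p_S, &
  \Delta_O &= \bigl(j+\tfrac12\bigr)p_S + \frac{d_S}{g_S}\Bigl(\bigl(j+\tfrac12\bigr)p_S - x_S\Bigr),\\
  \Delta_D &= \Delta_O + \frac{d_D}{d_S}\Bigl(\Delta_O - \bigl(j+\tfrac12\bigr)p_S\Bigr), &
  u_{jm} &= \Bigl\lfloor \frac{\Delta_D}{p_D} \Bigr\rfloor,\\
  x_D &= \bigl(u_{jm}+\tfrac12\bigr)p_D + \frac{g_D}{d_D}\Bigl(\bigl(u_{jm}+\tfrac12\bigr)p_D - \Delta_O\Bigr), &
  v_{jm} &= \Bigl\lfloor \frac{x_D - u_{jm}\,p_D}{p_D}\, n_D \Bigr\rfloor .
\end{aligned}
```

Under these assumptions the display end pixel value is simply the acquisition end value at elementary image u_jm, pixel v_jm; this single-ray lookup is what the improved method replaces with an average.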
The invention aims to provide a method and a system for calculating display end pixel values in an integrated imaging system, in which the pixel value of a display end pixel point is computed as the average of the acquisition end pixels of the collecting lenses that rays through that pixel point can cover. The invention improves the SPOC method (as shown in fig. 3) and applies it to an integrated imaging system, thereby improving the integrated imaging display effect.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
The integrated imaging system includes: the device comprises a display end, a display lens array, a reference surface, a collecting lens array and a collecting end; the reference surface is located between the display lens array and the collection lens array, the display lens array includes a plurality of display lenses, and the collection lens array includes a plurality of collection lenses. As shown in fig. 1, a method for calculating a display-side pixel value of an integrated imaging system includes the following steps:
step 101: and calculating the x-axis coordinate of the intersection point of the light ray passing through the elementary image pixel point of the display end and the display lens array and the reference surface, and recording as a first intersection point coordinate.
As shown in fig. 3, a coordinate system is established with a straight line perpendicular to the optical axis of the display lens array as the x-axis, the optical axis of the lens at the bottom of the display lens array as the z-axis, and the intersection of this axis with the plane of the acquisition end elementary image as the origin o. The display lens size is p_S and the x coordinate of the mth pixel of the jth elementary image at the display end is x_S; then
[formula image not reproduced]
wherein n_S is the number of pixels per elementary image at the display end and S represents the display end. The collecting lenses and display lenses are arranged compactly without gaps. The distance from the plane of the acquisition end elementary image to the collecting lens array is g_D (D represents the acquisition end), the distance from the plane of the display end elementary image to the display lens array is g_S, and the distance from the collecting lens array to the display lens array is y. A reference plane is selected between the collecting lens array and the display lens array, at distances d_D and d_S from the collecting lens array and the display lens array respectively. The x coordinate of the point where the ray through an elementary image pixel and the optical center of the corresponding display lens intersects the reference plane is Δ_O (the first intersection coordinate); then
[formula image not reproduced]
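As a concrete illustration of this step, the short snippet below evaluates the chief-ray intersection for one example pixel. The numerical values and the convention that display lens j is centred at (j + 0.5)·p_S are assumptions chosen here, not parameters from the patent.

```python
# All numbers below are assumed example values, not parameters from the patent.
p_S, n_S, g_S, d_S = 1.0, 10, 3.0, 30.0      # display lens size (mm), pixels per elementary image, gaps (mm)
j, m = 4, 7                                   # pixel m of elementary image j at the display end

lens_centre = (j + 0.5) * p_S                 # assumed centre of display lens j
x_S = j * p_S + (m + 0.5) * p_S / n_S         # pixel-centre x coordinate: 4.75 mm
delta_O = lens_centre + (lens_centre - x_S) * d_S / g_S
print(delta_O)                                # first intersection coordinate: 2.0 mm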
Step 102: and determining the light rays passing through the reference plane intersection point and passing through the upper edge and the lower edge of the collecting lens array.
Step 103: determining an upper edge collecting lens corresponding to the upper edge light rays and a lower edge lens corresponding to the lower edge light rays on the basis of the first intersection point coordinates; the upper edge collection lens and the lower edge lens are both collection lenses in the collection lens array. The method specifically comprises the following steps: calculating intersection point coordinates of the upper edge light rays, the lower edge light rays and the collecting lens array based on the first intersection point coordinates, and taking the intersection point coordinates as upper edge intersection point coordinates and lower edge intersection point coordinates; and respectively determining an upper edge collecting lens and a lower edge collecting lens according to the upper edge intersection point coordinate and the lower edge intersection point coordinate.
The rays l_top (upper edge ray) and l_bottom (lower edge ray) pass through the upper and lower edges of the corresponding display lens, and their x coordinates at the collecting lens array,
[symbol images not reproduced]
are given by
[formula images not reproduced]
The rays l_top and l_bottom correspond to the collecting lenses numbered i_top (the upper edge collecting lens) and i_bottom (the lower edge collecting lens), where
[formula images not reproduced]
and p_D is the size of the collecting lens. Note that here i_top ≤ i_bottom.
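The floor-based lens indexing described here can be sketched as follows. The edge-position convention (display lens j spanning [j·p_S, (j+1)·p_S]), the explicit ray formula, and the example values are assumptions introduced for illustration only.

```python
import math

# Assumed example values, not parameters from the patent.
p_S, p_D = 1.0, 2.0            # display / collecting lens sizes (mm)
d_S, y = 30.0, 80.0            # reference-plane distance and lens-array separation (mm)
delta_O, j = 10.0, 4           # first intersection coordinate (mm) and display lens index
d_D = y - d_S                  # reference plane to collecting lens array


def lens_index(edge_x: float) -> int:
    """Collecting lens hit by the ray through (edge_x, display lens plane) and delta_O."""
    x_hit = delta_O + (delta_O - edge_x) * d_D / d_S
    return int(math.floor(x_hit / p_D))


i_top = lens_index((j + 1) * p_S)    # ray through the upper edge of display lens j -> lens 9
i_bottom = lens_index(j * p_S)       # ray through the lower edge of display lens j -> lens 10
print(i_top, i_bottom)               # i_top <= i_bottom, as noted in the text
```

With these example numbers the ray bundle covers collecting lenses 9 and 10, so the improved method averages over two captured elementary images instead of reading a single pixel.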
Step 104: and calculating the pixel value of the intersection point of the plane where the light ray passes through the upper edge collecting lens and the plane where the lower edge lens and the elementary image of the collecting end are located respectively as the pixel value of the collecting end. The method specifically comprises the following steps:
calculating the x-axis coordinate of the intersection point of the plane where the light ray passes through the upper edge collecting lens and the lower edge lens and the elementary image of the collecting end respectively to be a second intersection point coordinate; calculating the pixel sequence number of the size of the primitive image where the second intersection point is located according to the second intersection point coordinate; the second intersection point is the intersection point of the light and the plane where the primitive image of the acquisition end is located; and determining the pixel value of the acquisition end according to the pixel sequence number.
The rays pass through the reference-plane intersection point and through the optical centers of the collecting lenses numbered from i_top to i_bottom. The x coordinate of the intersection of such a ray with the plane of the acquisition end elementary image is x_Du (i_top ≤ u ≤ i_bottom), where u is the number of the collecting lens in the collecting lens array; then
[formula image not reproduced]
The pixel sequence number within the elementary image of the intersection point of the ray with the plane of the acquisition end elementary image is v_jmu (i_top ≤ u ≤ i_bottom):
[formula image not reproduced]
wherein the elementary image number is the same as the collecting lens number u, and n_D is the number of pixels per elementary image at the acquisition end. The acquisition end pixel value corresponding to this pixel sequence number,
[symbol image not reproduced]
is the pixel value at the intersection of the ray with the plane of the acquisition end elementary image.
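A sketch of this step for a single covered collecting lens u is given below. The centre and span conventions for collecting lens u, the explicit formulas, and the example values are assumptions for illustration, continuing the numbers used above.

```python
import math

# Assumed example values, continuing the numbers used above; not from the patent.
p_D, n_D, g_D = 2.0, 20, 4.0   # collecting lens size (mm), pixels per elementary image, image-to-lens gap (mm)
d_S, y = 30.0, 80.0            # reference-plane distance and lens-array separation (mm)
d_D = y - d_S
delta_O, u = 10.0, 9           # first intersection coordinate (mm) and one covered collecting lens

lens_centre = (u + 0.5) * p_D                                # assumed centre of collecting lens u
x_Du = lens_centre + (lens_centre - delta_O) * g_D / d_D     # second intersection coordinate (19.72 mm)
v_jmu = int(math.floor((x_Du - u * p_D) / p_D * n_D))        # pixel sequence number inside elementary image u
print(x_Du, v_jmu)                                           # -> roughly 19.72 and 17
```

Repeating this for every u between i_top and i_bottom yields one acquisition end pixel value per covered collecting lens, which step 105 then averages.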
Step 105: and averaging the pixel values of all the acquisition ends to obtain the pixel value of the pixel point of the elementary image of the display end, namely the pixel value of the display end.
The pixel value of the display end pixel point,
[symbol image not reproduced]
is the average of the acquisition end pixel values
[symbol image not reproduced]
over all collecting lenses that the rays through the display end pixel can cover, namely
[formula image not reproduced]
where v represents the pixel sequence number within the elementary image.
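Since the averaging formula itself appears only as an image, the following LaTeX line gives one plausible explicit form of this step; the names I_S and I_D for the display end and acquisition end pixel values are introduced here for illustration and are not the patent's notation.

```latex
I_S(j,m) \;=\; \frac{1}{\,i_{\mathrm{bottom}} - i_{\mathrm{top}} + 1\,}
\sum_{u = i_{\mathrm{top}}}^{i_{\mathrm{bottom}}} I_D\bigl(u,\; v_{jmu}\bigr)
```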
The invention also provides a system for calculating the pixel value of the display end of the integrated imaging system, which comprises the following components:
the first intersection point coordinate calculation module is used for calculating the x-axis coordinate of the intersection point of the light ray passing through the elementary image pixel point of the display end and the display lens array and the reference surface and recording the x-axis coordinate as a first intersection point coordinate;
the upper and lower edge light ray module is used for determining the light rays passing through the intersection point of the reference surface and passing through the upper edge and the lower edge of the collecting lens array;
the upper edge and lower edge collecting lens determining module is used for determining an upper edge collecting lens corresponding to the upper edge light rays and a lower edge lens corresponding to the lower edge light rays on the basis of the first intersection point coordinates; the upper edge collecting lens and the lower edge lens are both collecting lenses in the collecting lens array;
the acquisition end pixel value calculating module is used for calculating the pixel values at the points where the rays passing through the upper edge collecting lens and the lower edge lens respectively intersect the plane of the acquisition end elementary images, these being the acquisition end pixel values;
and the display end pixel value calculating module is used for averaging all the acquisition end pixel values to obtain the pixel value of the pixel point of the elementary image of the display end, and the pixel value is the display end pixel value.
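The five modules listed above map naturally onto a small class; the sketch below shows one possible decomposition. The class name, method signatures, and the capture_pixel accessor are assumptions introduced here, and the geometric formulas are left as stubs because the patent's equations are available only as images (illustrative versions are sketched earlier in this text).

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple


@dataclass
class DisplayPixelValueSystem:
    """One possible decomposition into the five modules described above.
    Geometry methods are stubs; the concrete formulas appear only as images
    in the source."""
    params: Dict[str, float]                       # geometric parameters such as p_S, p_D, g_S, g_D, d_S, y
    capture_pixel: Callable[[int, int], float]     # assumed accessor for the captured elementary images

    def first_intersection(self, j: int, m: int) -> float:
        """Module 1: reference-plane x coordinate of the chief ray through pixel (j, m)."""
        raise NotImplementedError

    def edge_rays(self, j: int, delta_o: float) -> Tuple[float, float]:
        """Module 2: x coordinates where the upper/lower edge rays meet the collecting lens array."""
        raise NotImplementedError

    def edge_lenses(self, edge_hits: Tuple[float, float]) -> Tuple[int, int]:
        """Module 3: numbers of the upper edge and lower edge collecting lenses."""
        raise NotImplementedError

    def capture_values(self, delta_o: float, lens_range: Tuple[int, int]) -> List[float]:
        """Module 4: acquisition end pixel values for every covered collecting lens."""
        raise NotImplementedError

    def display_value(self, j: int, m: int) -> float:
        """Module 5: average of the acquisition end values gives the display end pixel value."""
        delta_o = self.first_intersection(j, m)
        lens_range = self.edge_lenses(self.edge_rays(j, delta_o))
        values = self.capture_values(delta_o, lens_range)
        return sum(values) / len(values)
```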
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. For the system disclosed by the embodiment, the description is relatively simple because the system corresponds to the method disclosed by the embodiment, and the relevant points can be referred to the method part for description.
The principles and embodiments of the present invention have been described herein using specific examples, which are provided only to help understand the method and the core concept of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, the specific embodiments and the application range may be changed. In view of the above, the present disclosure should not be construed as limiting the invention.

Claims (6)

1. An integrated imaging system display-side pixel value calculation method, the integrated imaging system comprising: the device comprises a display end, a display lens array, a reference surface, a collecting lens array and a collecting end; the reference surface is located intermediate the display lens array and the collection lens array, the display lens array including a plurality of display lenses and the collection lens array including a plurality of collection lenses;
the pixel value calculation method includes:
calculating the x-axis coordinate of the intersection point of the light ray passing through the elementary image pixel point of the display end and the display lens array and the reference surface, and recording the x-axis coordinate as a first intersection point coordinate;
determining the light rays passing through the intersection point of the reference plane and passing through the upper edge and the lower edge of the collecting lens array;
determining an upper edge collecting lens corresponding to the upper edge light rays and a lower edge lens corresponding to the lower edge light rays on the basis of the first intersection point coordinates; the upper edge collecting lens and the lower edge lens are both collecting lenses in the collecting lens array;
calculating pixel values of intersection points of the light rays passing through the upper edge collecting lens and the lower edge lens respectively and the plane where the element images of the collecting end are located, wherein the pixel values are pixel values of the collecting end;
averaging the pixel values of all the acquisition ends to obtain the pixel value of the pixel point of the elementary image of the display end, wherein the pixel value is the pixel value of the display end;
wherein the calculation formula of the first intersection point coordinate is as follows:
[formula images not reproduced]
wherein Δ_O is the first intersection coordinate, n_S is the number of pixels per elementary image at the display end, d_S is the distance from the reference plane to the display lens array, g_S is the distance from the plane of the display end elementary image to the display lens array, p_S is the display lens size, x_S is the x-axis coordinate of the mth pixel of the jth elementary image at the display end, and S represents the display end;
wherein determining, based on the first intersection point coordinates, the upper edge collecting lens corresponding to the upper edge light rays and the lower edge lens corresponding to the lower edge light rays specifically comprises:
calculating intersection point coordinates of the upper edge light rays, the lower edge light rays and the collecting lens array based on the first intersection point coordinates, and taking the intersection point coordinates as upper edge intersection point coordinates and lower edge intersection point coordinates;
respectively determining an upper edge collecting lens and a lower edge collecting lens according to the upper edge intersection point coordinate and the lower edge intersection point coordinate;
wherein, the calculating the pixel value of the intersection point of the light ray passing through the upper edge collecting lens and the lower edge lens respectively and the plane where the element image of the collecting end is located specifically comprises:
calculating the x-axis coordinate of the intersection point of the plane where the light ray respectively passes through the upper edge collecting lens and the lower edge lens and the elementary image of the collecting end, and taking the x-axis coordinate as a second intersection point coordinate;
calculating the pixel sequence number of the size of the primitive image where the second intersection point is located according to the second intersection point coordinate; the second intersection point is the intersection point of the light and the plane where the primitive image of the acquisition end is located;
determining a pixel value of an acquisition end according to the pixel sequence number;
the calculation formula of the pixel value of a display end elementary image pixel point is as follows:
[formula image not reproduced]
wherein the quantity on the left-hand side is the pixel value of the display end pixel point, u is the number of a collecting lens in the collecting lens array, i_top is the number of the upper edge collecting lens, i_bottom is the number of the lower edge collecting lens, i_top ≤ u ≤ i_bottom,
[symbol image not reproduced]
is the acquisition end pixel value, and v is the pixel sequence number within the elementary image.
2. The integrated imaging system display-side pixel value calculation method according to claim 1, wherein the calculation formula of the upper edge intersection point coordinate is as follows:
[formula image not reproduced]
wherein the quantity on the left-hand side is the upper edge intersection point coordinate, j is the index of the jth elementary image at the display end, p_S is the display lens size, d_S is the distance from the reference plane to the display lens array, y is the distance from the collecting lens array to the display lens array, and Δ_O is the first intersection coordinate;
the calculation formula of the lower edge intersection point coordinate is as follows:
[formula image not reproduced]
wherein the quantity on the left-hand side is the lower edge intersection point coordinate.
3. The integrated imaging system display-side pixel value calculation method according to claim 2, wherein the upper edge collecting lens is determined by the following formula:
[formula image not reproduced]
wherein i_top is the number of the upper edge collecting lens, p_D is the size of the collecting lens, and D represents the acquisition end;
the lower edge collecting lens is determined by the following formula:
[formula image not reproduced]
wherein i_bottom is the number of the lower edge collecting lens.
4. The integrated imaging system display-side pixel value calculation method according to claim 1, wherein the calculation formula of the second intersection point coordinate is as follows:
[formula image not reproduced]
wherein x_Du is the second intersection coordinate, Δ_O is the first intersection coordinate, g_D is the distance from the plane of the acquisition end elementary image to the collecting lens array, y is the distance from the collecting lens array to the display lens array, d_S is the distance from the reference plane to the display lens array, p_D is the size of the collecting lens, u is the number of a collecting lens in the collecting lens array, i_top is the number of the upper edge collecting lens, i_bottom is the number of the lower edge collecting lens, and i_top ≤ u ≤ i_bottom.
5. The integrated imaging system display-side pixel value calculation method according to claim 4, wherein the calculation formula of the pixel sequence number is as follows:
[formula image not reproduced]
wherein v_jmu is the pixel sequence number and n_D is the number of pixels per elementary image at the acquisition end.
6. An integrated imaging system display-side pixel value calculation system using the integrated imaging system display-side pixel value calculation method according to claim 1, comprising:
the first intersection point coordinate calculation module is used for calculating the x-axis coordinate of the intersection point of the light ray passing through the element image pixel point of the display end and the display lens array and the reference surface and recording the x-axis coordinate as a first intersection point coordinate;
the upper and lower edge light ray module is used for determining the light rays passing through the intersection point of the reference surface and passing through the upper edge and the lower edge of the collection lens array;
the upper edge and lower edge collecting lens determining module is used for determining an upper edge collecting lens corresponding to the upper edge light rays and a lower edge lens corresponding to the lower edge light rays on the basis of the first intersection point coordinates; the upper edge collecting lens and the lower edge lens are both collecting lenses in the collecting lens array;
the acquisition end pixel value calculating module is used for calculating pixel values of intersection points of planes of the elementary images of the acquisition end and the light rays passing through the upper edge acquisition lens and the lower edge lens respectively, and is the acquisition end pixel value;
and the display end pixel value calculating module is used for averaging all the acquisition end pixel values to obtain the pixel value of the pixel point of the elementary image of the display end, and the pixel value is the display end pixel value.
CN202110325606.2A 2021-03-26 2021-03-26 Integrated imaging system display end pixel value calculation method and system Active CN113031262B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110325606.2A CN113031262B (en) 2021-03-26 2021-03-26 Integrated imaging system display end pixel value calculation method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110325606.2A CN113031262B (en) 2021-03-26 2021-03-26 Integrated imaging system display end pixel value calculation method and system

Publications (2)

Publication Number Publication Date
CN113031262A CN113031262A (en) 2021-06-25
CN113031262B true CN113031262B (en) 2022-06-07

Family

ID=76474364

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110325606.2A Active CN113031262B (en) 2021-03-26 2021-03-26 Integrated imaging system display end pixel value calculation method and system

Country Status (1)

Country Link
CN (1) CN113031262B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019112096A1 (en) * 2017-12-07 2019-06-13 전자부품연구원 Viewpoint image mapping method for integrated imaging system using hexagonal lens
CN108965853A (en) * 2018-08-15 2018-12-07 张家港康得新光电材料有限公司 A kind of integration imaging 3 D displaying method, device, equipment and storage medium
CN110276823A (en) * 2019-05-24 2019-09-24 中国人民解放军陆军装甲兵学院 The integration imaging generation method and system that can be interacted based on ray tracing and in real time
CN110225329A (en) * 2019-07-16 2019-09-10 中国人民解放军陆军装甲兵学院 A kind of artifact free cell picture synthetic method and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Wang Yu, et al., "Depth extraction method for integral imaging based on fitting of multiple disparity functions" (基于多视差函数拟合的集成成像深度提取方法), Acta Optica Sinica (光学学报), Vol. 35, No. 4, 30 April 2015, pp. 0411002-1 to 0411002-7. *

Also Published As

Publication number Publication date
CN113031262A (en) 2021-06-25

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant