CN118071836A - Grating visible area calibration method and system based on image definition, electronic equipment and storage medium - Google Patents
- Publication number
- CN118071836A (application CN202410109923.4A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/26—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
- G02B30/27—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/20—Image enhancement or restoration using local operators
- G06T5/30—Erosion or dilatation, e.g. thinning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
Abstract
The invention discloses a grating visible-area calibration method, system, electronic device and storage medium based on image definition. The calibration system comprises an initial value setting module, a grating visible-area acquisition module, an optimal value acquisition module, an image processing and reasoning module, a position parameter normalization module and a visible-area position acquisition module. The grating visible-area acquisition module acquires images of the alternating bright and dark stripes formed on a pure light-colored plate; the optimal value acquisition module obtains the optimal grating parameters at different depths; the image processing and reasoning module deduces the grating visible-area parameters at the current depth; the visible-area position acquisition module calculates the visible-area positions on both sides of the reference point from the bright-stripe center lines and the spacing between adjacent center lines, and outputs the visible-area positions at the current depth. The invention can accurately calculate the mapping relation between positions in the world coordinate system acquired by the camera and the grating parameters.
Description
Technical Field
The invention belongs to the technical field of naked eye 3D display, relates to a visual area calibration method, and particularly relates to a raster visual area calibration method, a system, electronic equipment and a storage medium based on image definition.
Background
With the development and progress of naked eye 3D and deep learning technologies, displays combining deep-learning-based human eye detection with naked eye 3D display technology have become a mainstream direction of naked eye 3D product applications and are maturing day by day. A naked eye 3D display with human eye detection can deliver a higher-resolution viewing experience, and the viewing parameters of the grating can be adjusted according to the eye position, so that the viewer obtains the best effect at any position.
According to the detected eye position, the grating parameter is computed from calibration data that relates positions in the world coordinate system acquired by the camera to the grating parameter; accurately establishing the mapping between eye position and grating parameter has therefore become one of the research directions of naked eye 3D technology.
In view of this, there is an urgent need to design a new method for calculating grating parameters that overcomes at least some of the above drawbacks of existing methods.
Disclosure of Invention
The invention provides a grating visible-area calibration method, calibration system, electronic device and storage medium based on image definition, which can accurately calculate the mapping relation between positions in the world coordinate system acquired by a camera and the grating parameters.
In order to solve the technical problems, according to one aspect of the present invention, the following technical scheme is adopted:
A grating visible-area calibration method based on image definition, the method comprising:
Step S1, vertically fixing the 3D display with the attached grating and the pure light-colored plate required for calibration, and keeping the grating-attached 3D display parallel to the pure light-colored plate;
Step S2, setting an initial value of the grating parameter k according to the calibration distance, and capturing, through a camera device mounted on the machine, a visible-area image of the alternating bright and dark stripes formed on the pure light-colored plate; wherein the k value represents the number of sub-pixels corresponding to one lenticular lens unit of the grating;
Step S3, traversing the k value up and down around the initial value within a set range while capturing with the camera the bright-dark alternating stripe images on the pure light-colored plate, calculating the sharpness of those images, and finding the optimal k value at the current depth from the prior relation between k and depth and from the stripe-image sharpness;
Step S4, binarizing the bright-dark alternating stripe image corresponding to the optimal k value, and then using an image thinning algorithm to determine the center-line position pos of the bright stripes and the spacing T between the center lines of two adjacent bright stripes;
Step S5, normalizing the grating visible-area position parameters corresponding to the optimal k value to be centered on the reference point, so that interpolation between the calibrated visible-area positions is more accurate;
Step S6, calculating the visible-area positions on both sides of the reference point from the obtained bright-stripe center-line positions and the spacing between adjacent center lines, and outputting all optimal visible-area positions at the current depth;
Step S7, judging whether the next depth needs to be calibrated; if so, moving the pure light-colored plate to the corresponding depth and repeating from step S2; otherwise ending the calibration.
As an embodiment of the present invention, in step S2, two colors with distinct contrast are selected as the calibration colors;
In step S3, the grating parameter k and the depth d satisfy an inverse proportional relation of the form k = a/d + b,
where a and b are constant coefficients and d is the depth from the pure light-colored plate to the grating. Before calibration begins, the optimal k values at several sampled depths d are found according to sharpness, the values of a and b are fitted from those samples, and the initial k value for the start of calibration is set accordingly;
In step S3, during the k-value traversal, the sharpness of the bright-dark alternating stripe image corresponding to each k value is calculated, and the k with the highest sharpness is the optimal k value at the current depth. The sharpness S_i of the image corresponding to the i-th traversed k value is judged from three no-reference sharpness measures D1(f), D2(f) and D3(f), where f(x, y) denotes the gray value of image f at pixel (x, y), G(x, y) is the Laplacian convolution response at pixel (x, y), M and N are the numbers of rows and columns of the image, df is the gray-level variation amplitude, and dx is the distance increment between pixels.
In step S4, the collected bright-dark alternating stripe image corresponding to the optimal k value is binarized, an image thinning algorithm is then used to find the center-line position pos of the bright stripes, and the spacing T between two adjacent bright stripes is calculated. The position pos_center of the midpoint between the two eyes corresponding to the visible area can then be calculated as:
pos_center = pos + T/4;
In step S5, the camera zero point is selected as the reference point;
considering that the calibrated visible areas form an interpolatable whole, the visible-area position at each depth is unified to be centered on the camera zero point;
The actual zero position of the image is obtained from the parameters of the camera's intrinsic matrix; the currently determined visible-area positions are traversed to find the one closest to the zero point; the horizontal pixel offset between that visible-area position and the zero point is computed, and the viewpoint number pos_0 corresponding to the grating visible-area sub-pixel offset is calculated from the linear relation between pixel offset and grating horizontal offset;
In step S7, the calibration range of the machine is judged; if data at the next depth needs to be calibrated, the depth value d is changed and step S2 is repeated after the visible-area calibration of the current depth is completed.
As one embodiment of the present invention, the reference point represents the projection, on the visible-area plane, of the intersection of the camera's optical axis with the pure light-colored plate.
According to another aspect of the invention, the following technical scheme is adopted: a grating visible-area calibration system based on image definition, the system comprising:
The initial value setting module is used for setting the initial value of the grating parameter k according to the depth; the k value represents the number of sub-pixels corresponding to one lenticular lens unit of the grating;
The grating visible region acquisition module is used for acquiring visible region images of light-dark alternate stripes formed on the pure light color plate;
The optimal value acquisition module is used for obtaining the optimal grating parameter k at different depths: the k value is traversed up and down around the initial value within a set range, the sharpness of the set-color image acquired by the camera is calculated, and the k value giving the highest image sharpness is taken as the optimal k value at the current depth;
The image processing and reasoning module is used for binarizing the bright-dark alternating stripe image corresponding to the optimal value, determining the center-line position pos of the bright stripes with an image thinning algorithm, calculating the spacing T between the center lines of two adjacent bright stripes, and deducing the grating visible-area parameters at the current depth from pos and T;
The position parameter normalization module is used for normalizing the located grating visible-area position parameters to be centered on the reference point, so that interpolation between the calibrated visible-area positions is more accurate;
The visible-area position acquisition module is used for calculating the visible-area positions on both sides of the reference point from the bright-stripe center lines and the spacing between two adjacent center lines, and outputting the visible-area positions at the current depth.
As one embodiment of the invention, the grating visible-area acquisition module selects two colors with distinct contrast as the calibration colors;
the grating parameter k and the depth d obtained by the optimal value acquisition module satisfy an inverse proportional relation of the form k = a/d + b,
where a and b are constant coefficients and d is the depth from the pure light-colored plate to the grating. Before calibration begins, the optimal k values at several sampled depths d are found according to sharpness, the values of a and b are fitted, and the initial k value for the start of calibration is set;
During the k-value traversal, the optimal value acquisition module calculates the sharpness of the bright-dark alternating stripe image corresponding to each k value; the k with the highest sharpness is the optimal k value at the current depth. The sharpness S_i of the image corresponding to the i-th traversed k value is judged from three no-reference sharpness measures D1(f), D2(f) and D3(f), where f(x, y) denotes the gray value of image f at pixel (x, y), G(x, y) is the Laplacian convolution response at pixel (x, y), M and N are the numbers of rows and columns of the image, df is the gray-level variation amplitude, and dx is the distance increment between pixels.
As one embodiment of the invention, the image processing and reasoning module binarizes the collected bright-dark alternating stripe image corresponding to the optimal k value, then finds the center-line position pos of the bright stripes with an image thinning algorithm and calculates the spacing T between two adjacent bright stripes, from which the position pos_center of the midpoint between the two eyes corresponding to the visible area can be calculated:
pos_center = pos + T/4;
The position parameter normalization module selects the camera zero point as the reference point; the calibrated visible areas form an interpolatable whole, and the visible-area position at each depth is unified to be centered on the camera zero point;
The position parameter normalization module obtains the actual zero position of the image from the parameters of the camera's intrinsic matrix, traverses the currently determined visible-area positions, and finds the one closest to the zero point; it computes the horizontal pixel offset between that visible-area position and the zero point, and calculates the viewpoint number pos_0 corresponding to the grating visible-area sub-pixel offset from the linear relation between pixel offset and grating horizontal offset;
The visible-area position acquisition module judges the calibration range of the machine; if data at the next depth needs to be calibrated, the depth value d is changed after the visible-area calibration of the current depth is completed.
As one embodiment of the present invention, the reference point represents the projection, on the visible-area plane, of the intersection of the camera's optical axis with the pure light-colored plate.
According to a further aspect of the invention, the following technical scheme is adopted: an electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the above method when executing the computer program.
According to a further aspect of the invention, the following technical scheme is adopted: a storage medium having stored thereon computer program instructions which, when executed by a processor, perform the steps of the above method.
The invention has the beneficial effects that: the grating visible-area calibration method, system, electronic device and storage medium based on image definition can accurately calculate the mapping relation between positions in the world coordinate system acquired by the camera and the grating parameters.
Drawings
Fig. 1 is a flowchart of a grating visible-area calibration method based on image definition in an embodiment of the present invention.
FIG. 2 is a schematic diagram of a grating visible-area calibration system based on image definition in an embodiment of the present invention.
Fig. 3 is a schematic diagram of an electronic device according to an embodiment of the invention.
Detailed Description
Preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
For a further understanding of the present invention, preferred embodiments of the invention are described below in conjunction with the examples, but it should be understood that these descriptions are merely intended to illustrate further features and advantages of the invention, and are not limiting of the claims of the invention.
The description of this section is intended to be illustrative of only a few exemplary embodiments and the invention is not to be limited in scope by the description of the embodiments. It is also within the scope of the description and claims of the invention to interchange some of the technical features of the embodiments with other technical features of the same or similar prior art.
The description of the steps in the various embodiments in the specification is merely for convenience of description, and the implementation of the present application is not limited by the order in which the steps are implemented.
"Connected" in the specification includes both direct and indirect connections.
The invention discloses a grating visible-area calibration method based on image definition; fig. 1 is a flowchart of the method in an embodiment of the invention. Referring to fig. 1, the calibration method includes:
Step S1, vertically fixing the 3D display with the attached grating and the pure light-colored plate (such as a white plate) required for calibration, and keeping the two parallel;
Step S2, setting an initial value of the grating parameter k according to the calibration distance, and capturing, through a camera device mounted on the machine, a visible-area image of the alternating bright and dark stripes (such as black and white stripes) formed on the pure light-colored plate; the k value represents the number of sub-pixels corresponding to one lenticular lens unit of the grating. In one embodiment of the invention, two colors with distinct contrast are selected as the calibration colors.
Step S3, traversing the k value up and down around the initial value within a set range while capturing with the camera the bright-dark alternating stripe images on the pure light-colored plate, calculating the sharpness of those images, and finding the optimal k value at the current depth from the prior relation between k and depth and from the stripe-image sharpness.
In one embodiment, the grating parameter k and the depth d satisfy an inverse proportional relation of the form k = a/d + b,
where a and b are constant coefficients and d is the depth from the pure light-colored plate to the grating. Before calibration begins, the optimal k values at several sampled depths d are found according to sharpness, the values of a and b are fitted from those samples, and the initial k value for the start of calibration is set.
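As a minimal sketch of the fitting step above — assuming the inverse proportional form k = a/d + b, which is implied by the constants a and b (the formula image itself is not reproduced here) — the coefficients can be estimated by least squares from a few (depth, optimal-k) samples:

```python
import numpy as np

def fit_k_depth(depths, best_ks):
    """Fit k = a/d + b (assumed inverse proportional form) by least squares."""
    d = np.asarray(depths, dtype=float)
    k = np.asarray(best_ks, dtype=float)
    # Design matrix [1/d, 1] so that k ~ a*(1/d) + b
    A = np.column_stack([1.0 / d, np.ones_like(d)])
    (a, b), *_ = np.linalg.lstsq(A, k, rcond=None)
    return float(a), float(b)

def initial_k(depth, a, b):
    """Predict the initial grating parameter k for a new calibration depth."""
    return a / depth + b
```

With three or more sampled depths the fit is overdetermined, and the predicted initial_k then seeds the k-value traversal of step S3.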
During the k-value traversal, the sharpness of the bright-dark alternating stripe image corresponding to each k value is calculated; the k with the highest sharpness is the optimal k value at the current depth. The sharpness S_i of the image corresponding to the i-th traversed k value is judged from three no-reference sharpness measures D1(f), D2(f) and D3(f), where f(x, y) denotes the gray value of image f at pixel (x, y), G(x, y) is the Laplacian convolution response at pixel (x, y), M and N are the numbers of rows and columns of the image, df is the gray-level variation amplitude, and dx is the distance increment between pixels.
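Since the formula images for S_i and D1–D3 are not reproduced in the text, the sketch below substitutes three common no-reference sharpness measures matching the stated ingredients — gray values f(x, y), a Laplacian response G(x, y), and a mean gray-level gradient over an M-by-N image; the patent's actual formulas may differ:

```python
import numpy as np

def brenner(f):
    """D1-style measure: squared gray-level difference at a 2-pixel shift."""
    f = np.asarray(f, dtype=float)
    return float(np.sum((f[:, 2:] - f[:, :-2]) ** 2))

def laplacian_energy(f):
    """D2-style measure: energy of the Laplacian response G(x, y)."""
    f = np.asarray(f, dtype=float)
    # 4-neighbour Laplacian evaluated on the interior of the image
    g = (f[:-2, 1:-1] + f[2:, 1:-1] + f[1:-1, :-2] + f[1:-1, 2:]
         - 4.0 * f[1:-1, 1:-1])
    return float(np.sum(g ** 2))

def mean_gradient(f):
    """D3-style measure: average |df/dx| over the M-by-N image."""
    f = np.asarray(f, dtype=float)
    m, n = f.shape
    return float(np.sum(np.abs(np.diff(f, axis=1))) / (m * n))
```

A sharper stripe image scores higher on such measures, so the optimal k is the argmax of the chosen score over the traversed k values.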
In step S4, the bright-dark alternating stripe image corresponding to the optimal k value is binarized, and an image thinning algorithm is then used to determine the center-line position pos of the bright stripes (such as white bright stripes) and the spacing T between the center lines of two adjacent bright stripes.
In an embodiment of the present invention, in step S4, the collected bright-dark alternating stripe image corresponding to the optimal k value is binarized, the center-line position pos of the bright stripes is found with an image thinning algorithm, and the spacing T between two adjacent bright stripes is calculated. The position pos_center of the midpoint between the two eyes corresponding to the visible area can then be calculated:
pos_center = pos + T/4.
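A simplified 1-D stand-in for the binarize-and-thin step illustrates how pos, T and pos_center relate; the hypothetical helper below reduces each bright run of a binarized profile to its midpoint rather than applying a full 2-D thinning algorithm:

```python
import numpy as np

def stripe_centerlines(binary_row):
    """Return centre positions of the bright runs in a binarized stripe profile."""
    b = np.asarray(binary_row).astype(bool)
    # Pad with False so every bright run has a rising and a falling edge
    padded = np.concatenate([[False], b, [False]])
    edges = np.flatnonzero(np.diff(padded.astype(int)))
    starts, ends = edges[0::2], edges[1::2]   # run start / one-past-end indices
    return (starts + ends - 1) / 2.0

def viewing_zone_center(pos, T):
    """Midpoint between the two eyes for a stripe centre pos with spacing T."""
    return pos + T / 4.0
```

The spacing T is simply the difference between two consecutive centerline positions.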
Step S5, normalizing the grating visible-area position parameters corresponding to the optimal k value to be centered on the reference point, so that interpolation between the calibrated visible-area positions is more accurate. In one embodiment, the reference point represents the projection, on the visible-area plane, of the intersection of the camera's optical axis with the pure light-colored plate.
In one embodiment, the camera zero point is selected as the reference point; considering that the calibrated visible areas form an interpolatable whole, the visible-area position at each depth is unified to be centered on the camera zero point. The actual zero position of the image is obtained from the parameters of the camera's intrinsic matrix; the currently determined visible-area positions are traversed to find the one closest to the zero point; the horizontal pixel offset between that visible-area position and the zero point is computed, and the viewpoint number pos_0 corresponding to the grating visible-area sub-pixel offset is calculated from the linear relation between pixel offset and grating horizontal offset.
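The zero-point normalization just described can be sketched as follows; the conversion factor `px_to_view` (viewpoints per pixel of horizontal offset) is a hypothetical placeholder for the linear pixel-offset/grating-offset relation, whose actual value would come from the calibration itself:

```python
import numpy as np

def normalize_to_zero_point(zone_px, K, px_to_view):
    """Re-centre calibrated visible-zone positions on the camera zero point.

    zone_px    : horizontal pixel positions of the calibrated zones
    K          : 3x3 camera intrinsic matrix; K[0, 2] is the principal point cx
    px_to_view : assumed linear factor mapping pixel offset to viewpoint number
    """
    cx = K[0, 2]                                     # image zero position
    zones = np.asarray(zone_px, dtype=float)
    nearest = zones[np.argmin(np.abs(zones - cx))]   # zone closest to zero point
    offset_px = nearest - cx                         # horizontal pixel offset
    pos0 = offset_px * px_to_view                    # corresponding viewpoint number
    return float(nearest), float(offset_px), float(pos0)
```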
Step S6, calculating the visible-area positions on both sides of the reference point from the obtained bright-stripe center-line positions and the spacing between adjacent center lines, and outputting all optimal visible-area positions at the current depth.
Step S7, judging whether the next depth needs to be calibrated; if so, moving the pure light-colored plate to the corresponding depth and repeating from step S2; otherwise ending the calibration.
In one embodiment, the calibration range of the machine is judged; if data at the next depth needs to be calibrated, the depth value d is changed and step S2 is repeated after the visible-area calibration of the current depth is completed.
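The outer loop of steps S2–S7 can be sketched as below; `capture`, `sharpness` and `k_candidates` are hypothetical callables standing in for the camera capture, the clarity calculation, and the traversal range around the initial k:

```python
def calibrate_all_depths(depths, capture, sharpness, k_candidates):
    """For each calibration depth, traverse k and keep the sharpest result."""
    best_k_per_depth = {}
    for d in depths:                              # S7: move plate to the next depth
        best_k, best_score = None, float("-inf")
        for k in k_candidates(d):                 # S3: traverse k around initial value
            score = sharpness(capture(d, k))      # clarity of the stripe image
            if score > best_score:
                best_k, best_score = k, score
        best_k_per_depth[d] = best_k              # steps S4-S6 then use the best image
    return best_k_per_depth
```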
The invention also discloses a grating visible-area calibration system based on image definition; FIG. 2 is a schematic diagram of the system in an embodiment of the invention. Referring to fig. 2, the calibration system includes: an initial value setting module 1, a grating visible-area acquisition module 2, an optimal value acquisition module 3, an image processing and reasoning module 4, a position parameter normalization module 5, and a visible-area position acquisition module 6.
The initial value setting module 1 is used for setting an initial value of a grating parameter k according to depth; wherein, the k value represents the number of the sub-pixels corresponding to the grating lenticular lens unit.
The grating visual area acquisition module 2 is used for acquiring a visible area image of light-dark alternate stripes formed on a pure light color plate.
The optimal value obtaining module 3 is configured to obtain optimal grating parameters k at different depths, traverse the initial value k up and down within a set range, perform sharpness calculation on an image with a set color acquired by the camera, and take the k value with the highest sharpness as the optimal k value at the current depth.
The image processing and reasoning module 4 is used for determining the central line position pos of the bright stripes in the bright-dark alternate stripe images corresponding to the optimal values by adopting an image refinement algorithm after binarizing the images; and calculating the distance T between the central lines of two adjacent bright stripes, and deducing the grating visual area parameters on the current depth according to pos and T.
The position parameter normalization module 5 is configured to normalize the located grating visible-area position parameters to be centered on the reference point, so that interpolation between the calibrated visible-area positions is more accurate.
The visual area position obtaining module 6 is configured to calculate the visual area positions on both sides of the reference point according to the bright stripe center line and the interval between two adjacent bright stripe center lines, and output the visual area position of the current depth.
In one embodiment of the present invention, the grating visible-area acquisition module 2 selects two colors with distinct contrast as the calibration colors.
The grating parameter k and the depth d obtained by the optimal value acquisition module 3 satisfy an inverse proportional relation of the form k = a/d + b,
where a and b are constant coefficients and d is the depth from the pure light-colored plate to the grating. Before calibration begins, the optimal k values at several sampled depths d are found according to sharpness, the values of a and b are fitted, and the initial k value for the start of calibration is set.
During the k-value traversal, the optimal value acquisition module 3 calculates the sharpness of the bright-dark alternating stripe image corresponding to each k value; the k with the highest sharpness is the optimal k value at the current depth. The sharpness S_i of the image corresponding to the i-th traversed k value is judged from three no-reference sharpness measures D1(f), D2(f) and D3(f), where f(x, y) denotes the gray value of image f at pixel (x, y), G(x, y) is the Laplacian convolution response at pixel (x, y), M and N are the numbers of rows and columns of the image, df is the gray-level variation amplitude, and dx is the distance increment between pixels.
In an embodiment of the present invention, the image processing and reasoning module 4 binarizes the collected bright-dark alternating stripe image corresponding to the optimal k value, then finds the center-line position pos of the bright stripes with an image thinning algorithm and calculates the spacing T between two adjacent bright stripes, from which the position pos_center of the midpoint between the two eyes corresponding to the visible area can be calculated:
pos_center = pos + T/4;
The position parameter normalization module 5 selects the camera zero point as the reference point; the calibrated visible areas form an interpolatable whole, and the visible-area position at each depth is unified to be centered on the camera zero point;
The position parameter normalization module 5 obtains the actual zero position of the image from the parameters of the camera's intrinsic matrix, traverses the currently determined visible-area positions, and finds the one closest to the zero point; it computes the horizontal pixel offset between that visible-area position and the zero point, and calculates the viewpoint number pos_0 corresponding to the grating visible-area sub-pixel offset from the linear relation between pixel offset and grating horizontal offset;
The visible-area position acquisition module 6 judges the calibration range of the machine; if data at the next depth needs to be calibrated, the depth value d is changed after the visible-area calibration of the current depth is completed.
The invention also discloses an electronic device; FIG. 3 is a schematic diagram of the electronic device in an embodiment of the invention. Referring to fig. 3, the electronic device includes, at the hardware level, a memory, a processor, and at least one network interface; the processor may be a microprocessor, and the memory may include random access memory (RAM), non-volatile memory, and so on. Of course, the electronic device may also be provided with other hardware as desired.
The processor, network interface, and memory may be interconnected by an internal bus, which may be an ISA (Industry Standard Architecture) bus, a PCI (Peripheral Component Interconnect) bus, or an EISA (Extended Industry Standard Architecture) bus, among others; the bus may comprise an address bus, a data bus, a control bus, and so on. The memory is used for storing programs (which may comprise operating-system programs and application programs); a program may include program code containing computer operating instructions. The memory may include RAM and non-volatile storage, and provides instructions and data to the processor.
In one embodiment, the processor may read the corresponding program from the nonvolatile memory to the memory and then run the program; the processor is capable of executing the program stored in the memory and is specifically configured to perform the following operations (as shown in fig. 1):
Step S1, vertically fixing the 3D display with the grating and a pure light-colored plate required by calibration, and keeping parallelism between the 3D display and the pure light-colored plate;
Step S2, setting an initial value of a grating parameter k according to a calibration distance, and capturing a visible region image of light and dark alternate stripes formed on a pure light color plate through a camera device arranged on a machine; wherein, the k value represents the number of the sub-pixels corresponding to the grating cylindrical lens unit;
Step S3, traversing the initial k value up and down within a set range, simultaneously capturing the bright-dark alternate stripe images on the pure light color plate by the camera, calculating the definition of the bright-dark alternate stripe images, and finding out the optimal k value under the current depth according to the prior value, the depth relation and the definition of the bright-dark alternate stripe images;
Step S4, in the bright-dark alternate stripe image corresponding to the optimal k value, performing binarization operation on the bright-dark alternate stripe image, and then adopting an image refinement algorithm to determine the center line position pos of the bright stripes and the interval T between the center lines of two adjacent bright stripes;
step S5, normalizing the grating visible region position parameters corresponding to the optimal k value to the position taking the datum point as the center, so that interpolation calculation between the calibrated visible region positions is more accurate;
step S6, calculating the positions of the visible areas on two sides of the datum point according to the obtained positions of the center lines of the bright stripes and the distances between the adjacent bright stripes, and outputting all the optimal visible area positions on the current depth;
Step S7, judging whether the next depth needs to be calibrated; if so, the pure light-colored plate is moved to the corresponding depth and step S2 is repeated; otherwise, the calibration ends.
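The k-value sweep of step S3 can be sketched as follows. The patent names three no-reference definition measures D_1–D_3 without fixing one here, so the Brenner gradient is used as one common choice; `capture` is a hypothetical stand-in for grabbing a camera frame of the stripe pattern at a given k:

```python
import numpy as np

def brenner(img):
    """No-reference sharpness (Brenner gradient): sum of squared
    differences between pixels two rows apart -- one common choice
    for the definition measure used in step S3."""
    diff = img[2:, :] - img[:-2, :]
    return float(np.sum(diff * diff))

def best_k(k0, capture, span=0.05, steps=11):
    """Traverse k up and down around the initial value k0 within a set
    range, score each captured stripe image by sharpness, and return
    the k giving the sharpest image (the optimal k at this depth)."""
    ks = np.linspace(k0 - span, k0 + span, steps)
    scores = [brenner(capture(k)) for k in ks]
    return ks[int(np.argmax(scores))]
```

In step S7 the same sweep would simply be rerun after moving the plate to the next depth d, with k0 re-seeded from the fitted k–d relation.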
The invention further discloses a storage medium having stored thereon computer program instructions which, when executed by a processor, perform the following steps of the method of the invention (as shown in FIG. 1):
Step S1, vertically fixing the 3D display with the grating and a pure light-colored plate required by calibration, and keeping parallelism between the 3D display and the pure light-colored plate;
Step S2, setting an initial value of a grating parameter k according to a calibration distance, and capturing a visible region image of light and dark alternate stripes formed on a pure light color plate through a camera device arranged on a machine; wherein, the k value represents the number of the sub-pixels corresponding to the grating cylindrical lens unit;
Step S3, traversing the initial k value up and down within a set range, simultaneously capturing the bright-dark alternate stripe images on the pure light color plate by the camera, calculating the definition of the bright-dark alternate stripe images, and finding out the optimal k value under the current depth according to the prior value, the depth relation and the definition of the bright-dark alternate stripe images;
Step S4, in the bright-dark alternate stripe image corresponding to the optimal k value, performing binarization operation on the bright-dark alternate stripe image, and then adopting an image refinement algorithm to determine the center line position pos of the bright stripes and the interval T between the center lines of two adjacent bright stripes;
step S5, normalizing the grating visible region position parameters corresponding to the optimal k value to the position taking the datum point as the center, so that interpolation calculation between the calibrated visible region positions is more accurate;
step S6, calculating the positions of the visible areas on two sides of the datum point according to the obtained positions of the center lines of the bright stripes and the distances between the adjacent bright stripes, and outputting all the optimal visible area positions on the current depth;
Step S7, judging whether the next depth needs to be calibrated; if so, the pure light-colored plate is moved to the corresponding depth and step S2 is repeated; otherwise, the calibration ends.
In summary, the grating visible area calibration method, system, electronic device, and storage medium based on image definition of the present invention can accurately calculate the mapping relation between positions in the world coordinate system acquired by the camera and the grating parameters.
It should be noted that the present application may be implemented in software and/or a combination of software and hardware; for example, an Application Specific Integrated Circuit (ASIC), a general purpose computer, or any other similar hardware device may be employed. In some embodiments, the software program of the present application may be executed by a processor to implement the above steps or functions. Likewise, the software program of the present application (including the related data structures) may be stored in a computer-readable recording medium; such as RAM memory, magnetic or optical drives or diskettes, and the like. In addition, some steps or functions of the present application may be implemented in hardware; for example, as circuitry that cooperates with the processor to perform various steps or functions.
The technical features of the above embodiments may be combined arbitrarily; for brevity of description, not all possible combinations of these technical features are enumerated, but any combination of them that involves no contradiction should be considered within the scope of this description.
The description and applications of the present invention herein are illustrative and are not intended to limit the scope of the invention to the embodiments described above. Effects or advantages referred to in the embodiments may not be embodied in the embodiments due to interference of various factors, and description of the effects or advantages is not intended to limit the embodiments. Variations and modifications of the embodiments disclosed herein are possible, and alternatives and equivalents of the various components of the embodiments are known to those of ordinary skill in the art. It will be clear to those skilled in the art that the present invention may be embodied in other forms, structures, arrangements, proportions, and with other assemblies, materials, and components, without departing from the spirit or essential characteristics thereof. Other variations and modifications of the embodiments disclosed herein may be made without departing from the scope and spirit of the invention.
Claims (10)
1. A method for calibrating a grating visible area based on image definition, characterized by comprising the following steps:
s1, vertically fixing a 3D display attached with a grating and a pure light-colored plate required by calibration, and keeping parallelism between the 3D display attached with the grating and the pure light-colored plate;
S2, setting an initial value of a grating parameter k according to a calibration distance, and capturing a visible region image of light and dark alternate stripes formed on a pure light color plate through an imaging device arranged on a machine; wherein, the k value represents the number of the sub-pixels corresponding to the grating cylindrical lens unit;
Step S3, traversing the initial k value up and down within a set range, simultaneously capturing the bright-dark alternate stripe images corresponding to the pure light color plate by the camera, calculating the definition of the bright-dark alternate stripe images, and finding out the optimal k value under the current depth according to the prior value, the depth relation and the definition of the bright-dark alternate stripe images;
s4, in the bright-dark alternate stripe image corresponding to the optimal k value, performing binarization operation on the bright-dark alternate stripe image, and then adopting an image refinement algorithm to determine the center line position pos of the bright stripes and the interval T between the center lines of two adjacent bright stripes;
s5, normalizing the grating visible region position parameters corresponding to the optimal k value to the position taking the datum point as the center, so that interpolation calculation between the calibrated visible region positions is more accurate;
s6, calculating the positions of the visible areas on two sides of the datum point according to the obtained positions of the center lines of the bright stripes and the distances between the adjacent bright stripes, and outputting all the optimal visible area positions on the current depth;
S7, judging whether the next depth needs to be calibrated; if so, the pure light-colored plate is moved to the corresponding depth and step S2 is repeated; otherwise, the calibration ends.
2. The method for calibrating a grating visible area based on image definition according to claim 1, wherein:
in the step S2, two colors with obvious contrast are selected as the calibrated colors;
In the step S3, the inverse proportion relation between the grating parameter k and the depth d is satisfied, and the formula is as follows:
wherein a and b are constant coefficients and d is the depth from the pure light-colored plate to the grating; before starting calibration, the optimal k values corresponding to several randomly chosen depths d are found according to definition, and the values of a and b are then fitted, so that the initial k value at the start of calibration can be set;
In the step S3, in the k-value traversal process, the definition of the bright-dark alternating stripe image corresponding to each k value is calculated, and the k with the highest definition is the optimal k value at the current depth; the image definition S_i corresponding to the k value is judged by the following formula:
wherein D_1(f), D_2(f), and D_3(f) each denote a no-reference image definition calculation method, and i denotes the i-th k value traversed at the current depth;
wherein f (x, y) represents the gray value of the corresponding pixel point (x, y) of the image f;
wherein G (x, y) is the convolution of Laplacian operator at pixel point (x, y);
wherein M and N are the number of rows and columns of the image, df is the gray scale variation amplitude, and dx is the distance increment between pixels.
3. The method for calibrating a grating visible area based on image definition according to claim 1, wherein:
In the step S4, binarizing the collected bright-dark alternate stripe image corresponding to the optimal k value, finding out the position pos of the center line of the bright stripe by adopting an image refinement algorithm, and calculating the interval T between two adjacent bright stripes;
The position pos_center of the center point between the two eyes corresponding to the visible area can then be calculated:
pos_center = pos + T/4;
in the step S5, the camera zero point is selected as the reference point;
considering that the calibrated visible areas form an interpolatable whole, the visible-area position at each depth is unified to be centered on the camera zero point;
the actual zero-point position of the image is obtained from the parameters of the camera's intrinsic matrix, and the currently determined visible-area positions are traversed to find the one closest to the zero point; the horizontal pixel offset between that visible-area position and the zero point is then computed, and the viewpoint number pos0 corresponding to the sub-pixel offset of the grating visible area is calculated according to the linear relation between the pixel offset and the grating horizontal offset value;
In the step S7, a judgment is made on the calibration range of the machine; if the next depth still needs to be calibrated, the depth value d is changed and step S2 is repeated, completing the visible-area calibration of the current depth.
4. The method for calibrating a grating visible area based on image definition according to claim 1, wherein:
The datum point represents the projection position of the intersection point of the optical axis of the camera and the pure light color plate on the visible area plane.
5. A grating visible area calibration system based on image definition, characterized in that the calibration system comprises:
The initial value setting module is used for setting the initial value of the grating parameter k according to the depth; wherein, the k value represents the number of the sub-pixels corresponding to the grating cylindrical lens unit;
The grating visible region acquisition module is used for acquiring visible region images of light-dark alternate stripes formed on the pure light color plate;
The optimal value acquisition module is used for acquiring optimal grating parameters k under different depths, traversing the initial value k up and down in a set range, calculating the definition of an image with a set color acquired by the camera, and taking the k value with the highest definition of the image as the optimal k value under the current depth;
The image processing and reasoning module is used for determining the central line position pos of the bright stripes in the bright-dark alternate stripe images corresponding to the optimal values by adopting an image thinning algorithm after binarizing the images; calculating the distance T between the central lines of two adjacent bright stripes, and deducing the grating visible region parameters on the current depth according to pos and T;
the position parameter normalization module is used for normalizing the located grating visible-area position parameters to be centered on the reference point, so that interpolation calculation between the calibrated visible-area positions is more accurate; and
The visual area position acquisition module is used for calculating the visual area positions on two sides of the datum point according to the bright stripe central line and the interval between two adjacent bright stripe central lines and outputting the visual area position of the current depth.
6. The grating visible area calibration system based on image definition according to claim 5, wherein:
the grating visual area acquisition module selects two colors with obvious contrast as calibration colors;
the grating parameter k and the depth d acquired by the optimal value acquisition module meet the inverse proportion relation, and the formula is as follows:
wherein a and b are constant coefficients and d is the depth from the pure light-colored plate to the grating; before starting calibration, the optimal k values corresponding to several randomly chosen depths d are found according to definition, and the values of a and b are then fitted to set the initial k value at the start of calibration;
In the k-value traversal process, the optimal value acquisition module calculates the definition of the bright-dark alternating stripe image corresponding to each k value; the k with the highest definition is the optimal k value at the current depth; the image definition S_i corresponding to the k value is judged by the following formula:
wherein D_1(f), D_2(f), and D_3(f) each denote a no-reference image definition calculation method, and i denotes the i-th k value traversed at the current depth;
wherein f (x, y) represents the gray value of the corresponding pixel point (x, y) of the image f;
wherein G (x, y) is the convolution of Laplacian operator at pixel point (x, y);
wherein M and N are the number of rows and columns of the image, df is the gray scale variation amplitude, and dx is the distance increment between pixels.
7. The grating visible area calibration system based on image definition according to claim 5, wherein:
The image processing and reasoning module performs binarization processing on the collected bright-dark alternating stripe image corresponding to the optimal k value, then finds the position pos of the center line of each bright stripe by an image refinement algorithm and calculates the interval T between two adjacent bright stripes, from which the position pos_center of the center point between the two eyes corresponding to the visible area can be calculated:
pos_center = pos + T/4;
The position parameter normalization module selects the camera zero point as the reference point; the calibrated visible areas are combined into an interpolatable whole, and the visible-area position at each depth is unified to be centered on the camera zero point;
the position parameter normalization module obtains the actual zero-point position of the image from the parameters of the camera's intrinsic matrix, traverses the currently determined visible-area positions, and finds the one closest to the zero point; it then computes the horizontal pixel offset between that visible-area position and the zero point, and calculates the viewpoint number pos0 corresponding to the sub-pixel offset of the grating visible area according to the linear relation between the pixel offset and the grating horizontal offset value;
and the visible-area position acquisition module judges the calibration range of the machine; if the next depth still needs to be calibrated, the depth value d is changed, completing the visible-area calibration of the current depth.
8. The grating visible area calibration system based on image definition according to claim 5, wherein:
The datum point represents the projection position of the intersection point of the optical axis of the camera and the pure light color plate on the visible area plane.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the steps of the method of any of claims 1 to 4 when the computer program is executed.
10. A storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the method of any of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410109923.4A CN118071836A (en) | 2024-01-26 | 2024-01-26 | Grating visible area calibration method and system based on image definition, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN118071836A true CN118071836A (en) | 2024-05-24 |