CN117419895A - Digital visibility detection method - Google Patents

Digital visibility detection method

Info

Publication number
CN117419895A
CN117419895A
Authority
CN
China
Prior art keywords
light source
imaging
parallel light
light path
optical instrument
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311302545.3A
Other languages
Chinese (zh)
Inventor
宋宁
叶兵
谢佳玫
袁玉芬
张斌杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuxi Xingdi Instrument Co ltd
Original Assignee
Wuxi Xingdi Instrument Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuxi Xingdi Instrument Co ltd filed Critical Wuxi Xingdi Instrument Co ltd
Priority to CN202311302545.3A
Publication of CN117419895A
Legal status: Pending

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01MTESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M11/02Testing optical properties
    • G01M11/0242Testing optical properties by measuring geometrical properties or aberrations
    • G01M11/0257Testing optical properties by measuring geometrical properties or aberrations by analyzing the image formed by the object to be tested

Abstract

The invention relates to the technical field of optical detection, and in particular to a digital visibility detection method, which comprises the following steps: when the parallel light source is in a working state, determining, by the image acquisition device, first imaging coordinates formed by the light emitted from the parallel light source after passing through the two light path conduction devices; then, with the parallel light source still in a working state, placing the optical instrument to be measured between the light path conduction module and the parallel light source, determining, by the image acquisition device, second imaging coordinates formed by the light after passing through the two light path conduction devices, and determining the visibility corresponding to the optical instrument to be measured. The image acquisition device thus records the propagation of the light emitted by the parallel light source in two states, without and with the optical instrument to be measured in the light path; the processing module determines the visibility from the coordinate deviation between the two images, eliminating human-eye reading error and thereby improving the accuracy of visibility detection of the optical instrument to be measured.

Description

Digital visibility detection method
Technical Field
The application relates to the technical field of optical detection, and in particular to a digital visibility detection method suitable for use with multiple apertures.
Background
Visibility is an important basic parameter of an optical system. To adapt to the different vision requirements of human eyes, the eyepiece of a telescopic system is usually made axially adjustable so that the beam leaving the eyepiece converges or diverges; to adjust the visibility quantitatively, a visibility detection device is particularly important.
Traditional visibility detection mostly uses an ordinary visibility tube, a wide-range visibility tube, or a semi-transparent-mirror visibility tube, with the following operating steps: 1. adjust the visibility of the visibility tube eyepiece so that the tester sees its reticle clearly; 2. press the rest of the visibility tube tightly against the guard ring of the eyepiece being checked and move the visibility tube objective axially until the user sees both the visibility tube reticle and the reticle of the instrument being checked clearly; 3. read the measured visibility value from the visibility scale on the side wall of the visibility tube.
However, this traditional visibility tube detection method has low detection precision: it relies on interpretation by the human eye, so human-eye error enters the result, and its detection efficiency is low.
Disclosure of Invention
The invention aims to overcome the shortcomings of the prior art and provides a digital visibility detection method that improves the efficiency of detecting the visibility of an optical instrument. The method is applied to a digital visibility detection system, and the digital visibility detection system comprises a parallel light source, an optical instrument to be measured, a light path conduction module, an image acquisition device, and a processing module;
the light path conduction module comprises two light path conduction devices with mirror image structures, and the light output end of the light path conduction device corresponds to the image acquisition equipment;
the light input end of the light path conduction device corresponds to the position of the parallel light source;
the method comprises the following steps:
when the parallel light source is in a working state, determining, by the image acquisition device, first imaging coordinates formed by the light emitted from the parallel light source after passing through the two light path conduction devices, wherein the first imaging coordinates comprise two first imaging sub-coordinates;
when the parallel light source is in a working state, placing the optical instrument to be measured between the light path conduction module and the parallel light source, and determining, by the image acquisition device, second imaging coordinates formed by the light emitted from the parallel light source after passing through the two light path conduction devices, wherein the second imaging coordinates comprise two second imaging sub-coordinates;
and determining, by the processing module, the visibility corresponding to the optical instrument to be measured based on the deviation between the first imaging coordinates and the second imaging coordinates.
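The three steps above reduce to a short calculation. The sketch below is illustrative only (function and parameter names are not from the patent); it assumes the per-path coordinate deviations are averaged, that pixel size, focal length, and eyepiece aperture are all given in millimetres, and the relations ε = d/f′ (implied by the variable definitions of formula 1 below) and SD = 1000ε/D (formula 2):

```python
import math

def visibility_from_coords(first_coords, second_coords,
                           pixel_size_mm, focal_len_mm, eyepiece_aperture_mm):
    """Illustrative sketch: visibility (diopters) from two sets of imaging
    coordinates, captured without and with the instrument in the light path.

    first_coords / second_coords: the two imaging sub-coordinates, each an
    (x, y) pair in pixels, one per light path conduction device.
    """
    # Coordinate deviation of each light path, averaged over the two paths
    devs = [math.hypot(s[0] - f[0], s[1] - f[1])
            for f, s in zip(first_coords, second_coords)]
    d_px = sum(devs) / len(devs)

    d_mm = d_px * pixel_size_mm                     # deviation on the sensor, mm
    epsilon = d_mm / focal_len_mm                   # angular parallax, radians
    return 1000.0 * epsilon / eyepiece_aperture_mm  # SD = 1000*eps/D
```

For example, a 5-pixel average shift with a 0.01 mm pixel, a 50 mm objective focal length, and a 10 mm eyepiece aperture yields SD = 0.1 diopter.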
In an alternative embodiment, the parallel light source is implemented as a combination of a backlight and a telephoto lens.
In an alternative embodiment, the light path conduction device comprises an aperture stop, a rhombic prism, and a cemented objective lens;
the aperture stop is connected to the rhombic prism;
the rhombic prism is connected to the cemented objective lens;
the aperture stop is the light input end of the light path conduction device, and the cemented objective lens is the light output end of the light path conduction device;
the first imaging sub-coordinates and the second imaging sub-coordinates are realized as the center coordinates of the parallel light source imaged through the aperture stop.
In an alternative embodiment, the image acquisition device is implemented as a complementary metal-oxide-semiconductor (CMOS) sensor;
when the parallel light source is in a working state, determining, by the image acquisition device, the first imaging coordinates formed by the light emitted from the parallel light source through the two light path conduction devices comprises:
when the parallel light source is in a working state, determining, by the image acquisition device, the positions at which the light emitted from the parallel light source falls on the CMOS sensor after passing through the light path conduction devices;
determining the first imaging coordinates by means of a cross reticle.
In an alternative embodiment, when the parallel light source is in a working state, placing the optical instrument to be measured between the light path conduction module and the parallel light source, and determining, by the image acquisition device, the second imaging coordinates formed by the light emitted from the parallel light source through the two light path conduction devices comprises:
after the first imaging coordinates are determined, fixing the position of the cross reticle;
when the parallel light source is in a working state, determining, by the image acquisition device, the positions at which the light emitted from the parallel light source falls on the CMOS sensor after passing through the optical instrument to be measured and the light path conduction devices;
determining the second imaging coordinates by means of the cross reticle.
In an alternative embodiment, the light path conduction module is further provided with a motorized guide rail, and the motorized guide rail is used to control the spacing between the aperture stops;
the method further comprises:
generating, by the processing module, an aperture-spacing adjustment instruction, wherein the adjustment instruction is used to control the motorized guide rail to move so as to adjust the spacing between the aperture stops.
In an alternative embodiment, determining, by the processing module, the visibility corresponding to the optical instrument to be measured based on the deviation between the first imaging coordinates and the second imaging coordinates comprises:
determining a coordinate deviation between the first imaging coordinates and the second imaging coordinates;
determining angular parallax data based on the coordinate deviation and the device parameters of the image acquisition device;
and determining the visibility corresponding to the optical instrument to be measured based on the angular parallax data and the eyepiece aperture of the optical instrument to be measured.
In an alternative embodiment, determining the angular parallax data based on the coordinate deviation and the device parameters of the image acquisition device comprises:
determining the angular parallax data based on the coordinate deviation, the pixel size of the CMOS sensor, and the focal length of the cemented objective lens.
The application at least comprises the following beneficial effects:
in the process of detecting the visibility of the optical instrument to be measured, the image acquisition device records the light emitted by the parallel light source in two states, without and with the optical instrument to be measured in the light path; the processing module determines the visibility corresponding to the optical instrument from the coordinate deviation between the two images, so that human-eye error is eliminated from the determination process and the accuracy of visibility detection of the optical instrument to be measured is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, are included to provide a further understanding of the invention; they illustrate the invention and, together with the description, serve to explain it without limiting it. In the drawings:
fig. 1 is a schematic structural diagram of a digital visibility detection system according to an exemplary embodiment of the present application.
Fig. 2 is a schematic structural diagram of another digital visibility detection system according to an exemplary embodiment of the present application.
Fig. 3 is a schematic structural view of an optical path conducting device according to an exemplary embodiment of the present application.
Fig. 4 is a flowchart of a method for detecting digital visibility according to an exemplary embodiment of the present application.
Fig. 5 is a flow chart illustrating another method for detecting digital visibility according to an exemplary embodiment of the present application.
Detailed Description
It should be noted that, without conflict, the embodiments of the present invention and features of the embodiments may be combined with each other. The invention will be described in detail below with reference to the drawings in connection with embodiments.
In order that those skilled in the art may better understand the present invention, the technical solution in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate in order to describe the embodiments of the invention herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Fig. 1 is a schematic structural diagram of a digital visibility detection system according to an exemplary embodiment of the present application. Referring to fig. 1, the digital visibility detection system includes a parallel light source 110, an optical instrument to be measured 120, a light path conduction module 130, an image acquisition device 140, and a processing module 150.
The optical path conduction module 130 comprises two optical path conduction devices 131 with mirror image structures, and the light output end of each optical path conduction device 131 corresponds to the image acquisition equipment 140;
the light input end of the light path guiding device 131 corresponds to the position of the parallel light source 110.
In the embodiment of the present application, the parallel light source is the starting point of the light path, i.e., the emitting end; correspondingly, the image acquisition device is the end point of the light path, i.e., the receiving end. After the image acquisition device receives the light, the light path forms an image on it. The image acquisition device is communicatively connected to the processing module, and the processing module can acquire the imaging on the image acquisition device and process it so as to detect the visibility of the optical instrument to be measured.
In one embodiment of the present application, referring to fig. 2, the parallel light source 110 is implemented as a combination of a backlight 111 and a telephoto lens 112, which together produce the parallel-light effect.
In one embodiment of the present application, referring to fig. 2 and 3, the light path conduction device includes an aperture stop 131, a rhombic prism 132, and a cemented objective lens 133. The aperture stop 131 is connected to the rhombic prism 132; the rhombic prism 132 is connected to the cemented objective lens 133; the aperture stop 131 is the light input end of the light path conduction device, and the cemented objective lens 133 is the light output end; the first imaging sub-coordinates and the second imaging sub-coordinates are realized as the center coordinates of the parallel light source imaged through the aperture stop.
It should be noted that, in some embodiments of the present application, the image capturing device 140 may be implemented as a CMOS sensor.
In some embodiments of the present application, referring to fig. 2, the light path conduction module 130 is further configured with a motorized guide rail 160, where the motorized guide rail 160 is used to control the spacing between the aperture stops 131.
Fig. 4 is a schematic flowchart of a digital visibility detection method according to an exemplary embodiment of the present application; the method is described, for illustration, as applied to the digital visibility detection system shown in fig. 1 or fig. 2, and comprises:
step 401, when the parallel light source is in a working state, determining, by the image acquisition device, a first imaging coordinate obtained by light emitted by the parallel light source through the two light path conduction devices.
In the embodiment of the application, after the parallel light source passes through the two light path conduction devices in the working state, two images are obtained on the image acquisition equipment, the positions of the two images are determined, and then the first imaging coordinates can be obtained, wherein the first imaging coordinates comprise two first imaging sub-coordinates respectively corresponding to the two images.
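The patent locates each image by aligning a cross reticle; in a purely digital pipeline, one common way (an assumption here, not something the patent specifies) to obtain each spot's sub-coordinate from the CMOS frame is an intensity-weighted centroid:

```python
import numpy as np

def spot_centroid(frame):
    """Intensity-weighted centroid (x, y), in pixels, of a bright spot.

    `frame` is a 2-D array of sensor intensities; the uniform background
    offset is removed before weighting so dark-level bias does not shift
    the centroid.
    """
    img = np.array(frame, dtype=float)    # copy so the caller's frame is untouched
    img -= img.min()                      # remove uniform background offset
    ys, xs = np.indices(img.shape)
    total = img.sum()
    return (xs * img).sum() / total, (ys * img).sum() / total
```

Applying this to the frames captured with and without the instrument in the path yields the first and second imaging sub-coordinates used in the deviation calculation.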
Step 402, when the parallel light source is in a working state, the optical instrument to be measured is placed between the light path conduction module and the parallel light source, and the second imaging coordinates obtained by the light emitted by the parallel light source through the two light path conduction devices are determined through the image acquisition equipment.
In the embodiment of the application, after the optical instrument to be measured is placed between the optical path conduction module and the parallel light source, light emitted from the parallel light source enters the optical path conduction module through the optical instrument to be measured, and at this time, the second imaging coordinate can be obtained in the same way.
Step 403, determining, by the processing module, a visibility corresponding to the optical instrument to be measured based on the deviation of the first imaging coordinate and the second imaging coordinate.
The process is the visibility determination process of the optical instrument to be measured.
In summary, in the method provided by the embodiment of the present application, during visibility detection of the optical instrument to be measured, the image acquisition device records the light emitted by the parallel light source in two states, without and with the optical instrument to be measured in the light path; the processing module determines the visibility corresponding to the optical instrument from the coordinate deviation between the two images. Human-eye error is eliminated from the determination process, so the accuracy of visibility detection of the optical instrument to be measured is improved.
Fig. 5 is a flowchart of another digital visibility detection method according to an exemplary embodiment of the present application, and the method is used in the digital visibility detection system shown in fig. 2 for illustration, and includes:
in step 501, when the parallel light source is in an operating state, the position of the light emitted by the parallel light source on the CMOS sensor through the optical path conducting device is determined by the image capturing device.
In the embodiment of the application, the image acquisition device is implemented as a CMOS sensor.
At step 502, first imaging coordinates are determined by a cross reticle.
In the embodiment of the present application, the determination of the two first imaging sub-coordinates is performed by setting the position of the cross reticle, so as to determine the first imaging coordinates.
After determining the first imaging coordinates, the position of the cross reticle is fixed, step 503.
Optionally, any necessary error adjustment is performed before the cross reticle is used to determine the position of the first imaging coordinates.
After the first imaging coordinates are determined, the position of the cross reticle must be kept fixed so that the subsequent deviation is measured against the same reference.
And step 504, when the parallel light source is in a working state, determining the position of the light emitted by the parallel light source on the CMOS sensor through the optical instrument to be tested and the optical path conducting device by the image acquisition equipment.
In step 505, a second imaging coordinate is determined by a cross reticle.
This process is a process of determining the second imaging coordinates.
In step 506, a coordinate deviation of the first imaging coordinate from the second imaging coordinate is determined.
In the embodiment of the present application, the coordinate deviation of the first imaging coordinate and the second imaging coordinate may be implemented as a distance value.
In step 507, angular parallax data is determined based on the coordinate deviations and device parameters of the image acquisition device.
In the embodiment of the present application, the angular parallax is calculated as shown in the following formula 1:
Equation 1: ε = d / f′m
where ε is the angular parallax of the visual optical system to be measured, d is the product of the cross reticle center coordinate difference and the pixel size of the CMOS image sensor, and f′m is the focal length of the cemented objective lens of the image acquisition module. That is, the angular parallax data is determined based on the coordinate deviation, the pixel size of the CMOS sensor, and the focal length of the cemented objective lens.
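Formula 1 translates directly into code. A minimal sketch (names are illustrative) under the assumption that the coordinate difference is given in pixels and that the pixel size and focal length share the same length unit:

```python
def angular_parallax(coord_diff_px, pixel_size_mm, focal_len_mm):
    """Formula 1: epsilon = d / f'_m, with d = coord_diff_px * pixel_size_mm."""
    d_mm = coord_diff_px * pixel_size_mm   # deviation on the CMOS sensor, mm
    return d_mm / focal_len_mm             # angular parallax, radians
```

For example, a 10-pixel coordinate difference with a 0.002 mm pixel and a 100 mm cemented-objective focal length gives ε = 2 × 10⁻⁴ rad.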
Step 508, determining the visibility corresponding to the optical instrument to be measured based on the angular parallax data and the aperture of the eyepiece of the optical instrument to be measured.
In the embodiment of the present application, the visibility is calculated as shown in the following formula 2:
Equation 2: SD = 1000ε / D
where SD is the visibility value of the optical instrument to be measured and D is the eyepiece aperture of the optical instrument to be measured.
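Formula 2 in code form (illustrative names; ε in radians, D in millimetres, result in diopters):

```python
def visibility_diopters(epsilon_rad, eyepiece_aperture_mm):
    """Formula 2: SD = 1000 * epsilon / D."""
    return 1000.0 * epsilon_rad / eyepiece_aperture_mm
```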
In summary, in the method provided by the embodiment of the present application, during visibility detection of the optical instrument to be measured, the image acquisition device records the light emitted by the parallel light source in two states, without and with the optical instrument to be measured in the light path; the processing module determines the visibility corresponding to the optical instrument from the coordinate deviation between the two images. Human-eye error is eliminated from the determination process, so the accuracy of visibility detection of the optical instrument to be measured is improved.
It is to be understood that the above embodiments are merely illustrative of the application of the principles of the present invention, but not in limitation thereof. Various modifications and improvements may be made by those skilled in the art without departing from the spirit and substance of the invention, and are also considered to be within the scope of the invention.

Claims (8)

1. A digital visibility detection method, characterized by being applied to a digital visibility detection system, wherein the digital visibility detection system comprises a parallel light source, an optical instrument to be measured, a light path conduction module, an image acquisition device, and a processing module;
the light path conduction module comprises two light path conduction devices with mirror image structures, and the light output end of the light path conduction device corresponds to the image acquisition equipment;
the light input end of the light path conduction device corresponds to the position of the parallel light source;
the method comprises the following steps:
when the parallel light source is in a working state, determining, by the image acquisition device, first imaging coordinates formed by the light emitted from the parallel light source after passing through the two light path conduction devices, wherein the first imaging coordinates comprise two first imaging sub-coordinates;
when the parallel light source is in a working state, placing the optical instrument to be measured between the light path conduction module and the parallel light source, and determining, by the image acquisition device, second imaging coordinates formed by the light emitted from the parallel light source after passing through the two light path conduction devices, wherein the second imaging coordinates comprise two second imaging sub-coordinates;
and determining, by the processing module, the visibility corresponding to the optical instrument to be measured based on the deviation between the first imaging coordinates and the second imaging coordinates.
2. The method of claim 1, wherein the parallel light source is implemented as a combination of a backlight and a telephoto lens.
3. The method of claim 1, wherein the light path conduction device comprises an aperture stop, a rhombic prism, and a cemented objective lens;
the aperture stop is connected to the rhombic prism;
the rhombic prism is connected to the cemented objective lens;
the aperture stop is the light input end of the light path conduction device, and the cemented objective lens is the light output end of the light path conduction device;
the first imaging sub-coordinates and the second imaging sub-coordinates are realized as the center coordinates of the parallel light source imaged through the aperture stop.
4. The method of claim 1, wherein the image acquisition device is implemented as a complementary metal-oxide-semiconductor (CMOS) sensor;
when the parallel light source is in a working state, determining, by the image acquisition device, the first imaging coordinates formed by the light emitted from the parallel light source through the two light path conduction devices comprises:
when the parallel light source is in a working state, determining, by the image acquisition device, the positions at which the light emitted from the parallel light source falls on the CMOS sensor after passing through the light path conduction devices;
determining the first imaging coordinates by means of a cross reticle.
5. The method according to claim 4, wherein, when the parallel light source is in a working state, placing the optical instrument to be measured between the light path conduction module and the parallel light source, and determining, by the image acquisition device, the second imaging coordinates formed by the light emitted from the parallel light source through the two light path conduction devices comprises:
after the first imaging coordinates are determined, fixing the position of the cross reticle;
when the parallel light source is in a working state, determining, by the image acquisition device, the positions at which the light emitted from the parallel light source falls on the CMOS sensor after passing through the optical instrument to be measured and the light path conduction devices;
determining the second imaging coordinates by means of the cross reticle.
6. The method according to claim 3, wherein the light path conduction module is further provided with a motorized guide rail for controlling the spacing between the aperture stops;
the method further comprises:
generating, by the processing module, an aperture-spacing adjustment instruction, wherein the adjustment instruction is used to control the motorized guide rail to move so as to adjust the spacing between the aperture stops.
7. The method of claim 3, wherein determining, by the processing module, the visibility corresponding to the optical instrument to be measured based on the deviation between the first imaging coordinates and the second imaging coordinates comprises:
determining a coordinate deviation between the first imaging coordinates and the second imaging coordinates;
determining angular parallax data based on the coordinate deviation and the device parameters of the image acquisition device;
and determining the visibility corresponding to the optical instrument to be measured based on the angular parallax data and the eyepiece aperture of the optical instrument to be measured.
8. The method of claim 7, wherein determining the angular parallax data based on the coordinate deviation and the device parameters of the image acquisition device comprises:
determining the angular parallax data based on the coordinate deviation, the pixel size of the CMOS sensor, and the focal length of the cemented objective lens.
CN202311302545.3A 2023-10-09 2023-10-09 Digital visibility detection method Pending CN117419895A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311302545.3A CN117419895A (en) 2023-10-09 2023-10-09 Digital visibility detection method


Publications (1)

Publication Number Publication Date
CN117419895A true CN117419895A (en) 2024-01-19

Family

ID=89525607

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311302545.3A Pending CN117419895A (en) 2023-10-09 2023-10-09 Digital visibility detection method

Country Status (1)

Country Link
CN (1) CN117419895A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination