CN105812790B - Method for evaluating verticality between photosensitive surface and optical axis of image sensor and optical test card - Google Patents
- Publication number: CN105812790B
- Application number: CN201610188600.4A
- Authority: CN (China)
- Prior art keywords: image, boundary information, image sensor, optical axis, target area
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
Abstract
The invention discloses a method for evaluating the perpendicularity between the photosensitive surface of an image sensor and the optical axis, together with an optical test card, which are used to quantitatively calibrate that perpendicularity. The evaluation method comprises the following steps: collecting N frames of images of the test card while the camera lens focusing ring is rotated through one full turn; extracting target areas from at least three identical positions in each of the N frames according to a preset coordinate system; calculating the boundary information value of each target area in each frame; and calculating the perpendicularity between the photosensitive surface and the optical axis of the image sensor from the boundary information values.
Description
Technical Field
The invention relates to the technical field of cameras, and in particular to a method for evaluating the perpendicularity between the photosensitive surface of an image sensor and the optical axis, and to an optical test card.
Background
In an ideal optical imaging system, the photosensitive surface of the image sensor is perpendicular to the optical axis, so that a completely clear image is obtained when focusing the camera lens places the photosensitive surface at the image focal plane. In actual production, however, limited machining precision, installation errors and other factors may leave the photosensitive surface tilted with respect to the optical axis. The surface then cannot lie entirely in the image focal plane, a completely clear image cannot be formed, and image quality suffers.
Disclosure of Invention
In view of this, the present invention provides a method for evaluating the perpendicularity between the photosensitive surface and the optical axis of an image sensor, which can perform quantitative calibration on the perpendicularity between the photosensitive surface and the optical axis of the image sensor.
The invention provides a method for evaluating the perpendicularity between the photosensitive surface of an image sensor and the optical axis, comprising the following steps: collecting N frames of images of the test card while the camera lens focusing ring is rotated through one full turn; extracting target areas from at least three identical positions in each of the N frames according to a preset coordinate system; calculating the boundary information value of each target area in each frame; and calculating the perpendicularity between the photosensitive surface and the optical axis from the boundary information values. The last step comprises: selecting a reference target area; normalizing the boundary information values of the other target areas in the frame where the reference target area attains its maximum boundary information value; calculating the ratio of each normalized boundary information value to the maximum boundary information value of the reference target area; and outputting a perpendicularity grade of the photosensitive surface and the optical axis according to the ratios, a lower ratio indicating a lower grade.
Another embodiment of the present invention provides an optical test card comprising a gray background and at least three black sub-patterns disposed on the gray background, each black sub-pattern containing rich detail information and being centrally symmetric.
The evaluation method according to the embodiments of the invention can judge the perpendicularity between the photosensitive surface and the optical axis of the image sensor objectively and accurately, replacing the subjective judgment of workers on a production line; this improves test accuracy while freeing up labor.
Drawings
Fig. 1 is a flowchart illustrating a method for evaluating perpendicularity between a photosensitive surface and an optical axis of an image sensor according to an embodiment of the invention.
Fig. 2 is a schematic diagram of coordinates of a test card according to an embodiment of the present invention.
Fig. 3 is a diagram illustrating an optical test card according to an embodiment of the invention.
Fig. 4 is a schematic diagram of sub-patterns on an optical test card according to an embodiment of the invention.
Fig. 5 is a block diagram of a system for detecting perpendicularity between a light-sensing surface and an optical axis of an image sensor according to an embodiment of the invention.
Fig. 6 is a schematic diagram illustrating a target area selection box of a test card according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a flowchart illustrating a method for evaluating perpendicularity between a photosensitive surface and an optical axis of an image sensor according to an embodiment of the invention. As can be seen from the figure, the method comprises:
Step 101: acquire N frames of images of the test card while the camera lens focusing ring is rotated through one full turn. For a manual-focus lens, the focusing ring is rotated slowly through one turn at a constant speed; for an auto-focus lens, software can drive the focusing ring through one turn in fixed steps. In either case the image passes from blurred to sharp and back to blurred during the rotation, and the video data of the whole process is collected as the basis for the evaluation.
If the photosensitive surface of the image sensor is perpendicular to the optical axis, focusing the lens can place the surface exactly on the image focal plane (which is perpendicular to the optical axis), producing a completely clear image. Once the photosensitive surface is tilted, no amount of focusing can place the whole surface on the image focal plane, and a blurred region appears in the image. Therefore, if one completely clear frame can be found during one rotation of the focusing ring, the photosensitive surface can lie on the image focal plane, i.e. it is perpendicular to the optical axis; if no completely clear frame can be found, the surface cannot lie on the image focal plane, i.e. it is not perpendicular to the optical axis.
Step 102: extract target areas from at least three identical positions in each of the N frames. Since three points determine a plane, a minimum of three target areas is selected, and whether the photosensitive surface is perpendicular to the optical axis can be judged by whether the three target areas can all be imaged clearly at the same time. Subsequent processing operates only on the image data inside the target areas of the N frames; in this embodiment the target areas all contain the same image information.
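As a sketch of this extraction step (assuming the frames arrive as NumPy arrays; the region names and pixel coordinates below are illustrative and do not appear in the original):

```python
import numpy as np

def extract_regions(frame, boxes):
    """Crop the target areas (e.g. center and four corners) from one frame.

    `boxes` maps a region name to (row, col, height, width) in a preset
    coordinate system; names and coordinates here are hypothetical.
    """
    return {name: frame[r:r + h, c:c + w].copy()
            for name, (r, c, h, w) in boxes.items()}

# Hypothetical 1080p layout: center plus four 100x100 px corner regions.
boxes = {
    "center": (490, 910, 100, 100),
    "LT": (40, 40, 100, 100),   "RT": (40, 1780, 100, 100),
    "LB": (940, 40, 100, 100),  "RB": (940, 1780, 100, 100),
}
frame = np.zeros((1080, 1920), dtype=np.uint8)
regions = extract_regions(frame, boxes)
```

Because every frame uses the same boxes, the same physical patch of the test card is compared across the whole focus sweep.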
Step 103: calculate the boundary information value of each target area in each frame. The boundary information value reflects image sharpness: the more boundary information, i.e. the larger the value, the clearer the image. The abstract notion of sharpness is thus reduced to a concrete boundary information value, from which the degree of perpendicularity between the photosensitive surface and the optical axis can be judged.
Step 104: calculate the perpendicularity between the photosensitive surface and the optical axis from the boundary information values. If differences in resolving power are ignored, regions containing the same image information attain the same maximum boundary information value. When a given target area in some frame reaches its maximum boundary information value, the corresponding area of the photosensitive surface lies exactly on the image focal plane. The larger the boundary information values of the other target areas in that frame, the closer those areas are to the image focal plane, i.e. the higher the perpendicularity of the photosensitive surface relative to the optical axis; the smaller those values, the farther the areas are from the focal plane and the lower the perpendicularity.
The evaluation method described above can determine objectively and accurately whether the photosensitive surface of the image sensor is perpendicular to the optical axis and can output a detailed perpendicularity grade, replacing subjective judgment by production-line workers, improving test accuracy, and freeing up labor.
In one embodiment, step 103 specifically includes:
First, a brightness normalization process is performed on each target region in each frame. The purpose of luminance normalization is to compensate for image luminance non-uniformity caused by the shading of the lens itself. Specifically, the mean luminance values of the five target areas, Y_avg_center, Y_avg_LT, Y_avg_RT, Y_avg_LB and Y_avg_RB, are computed, and the image is then normalized according to the following formulas:

Y_LT_norm(i,j) = Y_LT(i,j) × Y_avg_center / Y_avg_LT
Y_RT_norm(i,j) = Y_RT(i,j) × Y_avg_center / Y_avg_RT
Y_LB_norm(i,j) = Y_LB(i,j) × Y_avg_center / Y_avg_LB
Y_RB_norm(i,j) = Y_RB(i,j) × Y_avg_center / Y_avg_RB

where Y_LT_norm(i,j), Y_RT_norm(i,j), Y_LB_norm(i,j) and Y_RB_norm(i,j) are the normalized pixel values of the upper left, upper right, lower left and lower right corners, respectively, and Y_LT(i,j), Y_RT(i,j), Y_LB(i,j) and Y_RB(i,j) are the corresponding original input pixel values.
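The luminance normalization above can be sketched as follows (a minimal illustration assuming each target area is available as a NumPy array; the flat toy regions are invented for demonstration):

```python
import numpy as np

def normalize_luminance(regions):
    """Compensate lens shading: scale each corner region by
    Y_avg_center / Y_avg_XX so its mean luminance matches the center's."""
    y_avg_center = regions["center"].mean()
    out = {"center": regions["center"].astype(np.float64)}
    for name in ("LT", "RT", "LB", "RB"):
        region = regions[name].astype(np.float64)
        out[name] = region * (y_avg_center / region.mean())
    return out

# Toy data: darker corners are brightened until every region's mean
# luminance equals the center's mean (120 here).
regions = {
    "center": np.full((4, 4), 120.0),
    "LT": np.full((4, 4), 60.0), "RT": np.full((4, 4), 80.0),
    "LB": np.full((4, 4), 100.0), "RB": np.full((4, 4), 120.0),
}
norm = normalize_luminance(regions)
```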
Then, for each target area in each frame, the average of the absolute values of the high-pass filtering results over all pixels is calculated. This average is the boundary information value. The filtering may use a 3 × 3 high-pass filter (the specific kernel is not reproduced here), or a high-pass filter of another form.
Finally, a logarithmic transformation is applied to the average obtained from each target area in each frame. The purpose of the logarithmic transformation is to make the boundary information value match the subjective impression of the human eye: the larger the value, the clearer the image. The log-transformed value is used as the final boundary information value of each region.
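A minimal sketch of the boundary information value (the patent's exact 3 × 3 kernel is not reproduced in the text, so a standard Laplacian-style high-pass kernel is assumed here; `log1p` is used as one concrete choice of logarithmic transform):

```python
import numpy as np

# Assumed kernel: the patent's 3x3 high-pass filter is not given,
# so a standard 8-neighbor Laplacian stands in for illustration.
KERNEL = np.array([[-1, -1, -1],
                   [-1,  8, -1],
                   [-1, -1, -1]], dtype=np.float64)

def boundary_information(region, kernel=KERNEL):
    """Mean absolute high-pass response over interior pixels,
    log-transformed so the score tracks perceived sharpness."""
    region = region.astype(np.float64)
    h, w = region.shape
    acc, count = 0.0, 0
    for i in range(1, h - 1):          # valid (interior) pixels only
        for j in range(1, w - 1):
            patch = region[i - 1:i + 2, j - 1:j + 2]
            acc += abs(np.sum(patch * kernel))
            count += 1
    return np.log1p(acc / count)       # log transform of the mean

# A checkerboard (many edges) scores higher than a flat patch.
sharp = np.indices((8, 8)).sum(axis=0) % 2 * 255.0
flat = np.full((8, 8), 128.0)
```

A perfectly flat region has no edges, so its high-pass response and hence its boundary information value is zero.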
In one embodiment, step 104 specifically includes:
step 1041, selecting a reference target area. The reference target area is selected for determining a frame of image on which the verticality calculation is performed, and the reference target area is selected to be clearest, namely, the frame of image with the largest boundary information value is selected for the verticality calculation.
Step 1042: normalize the boundary information values of the other target areas in the frame where the reference target area attains its maximum boundary information value. Because the resolving power of the lens differs between the center and the four corners, the center and corners yield different boundary information values even when they contain the same image information and lie on the image focal plane simultaneously. To eliminate this difference, the boundary information of each corner is normalized by its maximum value. For example, if the maximum boundary information value of the center is 2 and that of the upper left corner is 1.8, the per-frame boundary information values of the upper left corner are multiplied by 2/1.8, eliminating the difference caused by unequal resolving power.
The following describes the process of normalizing the boundary information value in step 1042 by taking "the target area includes the center and four corners of the image, and the center is selected as the reference target area" as an example.
The boundary information values of the four corner regions can be normalized using the following formulas:

Edge_LT_norm = Edge_LT × Edge_max_center / Edge_max_LT
Edge_RT_norm = Edge_RT × Edge_max_center / Edge_max_RT
Edge_LB_norm = Edge_LB × Edge_max_center / Edge_max_LB
Edge_RB_norm = Edge_RB × Edge_max_center / Edge_max_RB

where Edge_LT_norm, Edge_RT_norm, Edge_LB_norm and Edge_RB_norm are the normalized boundary information values of the upper left, upper right, lower left and lower right corners; Edge_LT, Edge_RT, Edge_LB and Edge_RB are the original boundary information values of those corners; and Edge_max_center, Edge_max_LT, Edge_max_RT, Edge_max_LB and Edge_max_RB are the maximum boundary information values of the center, upper left, upper right, lower left and lower right corners, respectively.
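Steps 1041–1043 can be sketched as follows, assuming the per-frame boundary information values of each region are collected in lists (the toy numbers are invented for illustration):

```python
def perpendicularity_ratios(edge_series, reference="center"):
    """Pick the frame where the reference (center) region peaks, normalize
    the other regions' values in that frame by Edge_max_center / Edge_max_XX,
    and return each region's ratio to the reference maximum."""
    ref = edge_series[reference]
    k = max(range(len(ref)), key=ref.__getitem__)   # frame of reference maximum
    ref_max = ref[k]
    ratios = {}
    for name, series in edge_series.items():
        if name == reference:
            continue
        normalized = series[k] * ref_max / max(series)  # Edge_XX_norm in frame k
        ratios[name] = normalized / ref_max
    return ratios

# Toy data: LT peaks in the same frame as the center (ratio 1.0);
# RB peaks in an earlier frame, so its ratio falls below 1.
edges = {
    "center": [1.0, 1.8, 2.0, 1.7, 1.1],
    "LT":     [0.9, 1.6, 1.8, 1.5, 1.0],
    "RB":     [1.9, 1.6, 1.4, 1.1, 0.8],
}
ratios = perpendicularity_ratios(edges)
```

When the photosensitive surface is perpendicular to the optical axis, every region peaks in the same frame and all ratios approach 1.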
Step 1043: calculate the ratio of each normalized boundary information value to the maximum boundary information value of the reference target area. This ratio measures the perpendicularity between the photosensitive surface and the optical axis: if the surface is perpendicular to the axis, all four ratios are close to 1; the smaller a ratio, the larger the front-to-back offset of the corresponding corner relative to the central area.
Step 1044: output the perpendicularity grade of the photosensitive surface and the optical axis according to the ratios.
In one embodiment, the perpendicularity grade may be one of A, B and C, representing excellent, good and unqualified, respectively. For example, if the ratios obtained for the four corners, sorted in ascending order, are K1 ≤ K2 ≤ K3 ≤ K4, the grading may be performed as follows:
if K1 ≥ 0.9, the grade is A;
if 0.9 > K1 ≥ 0.8 and (K4 − K1) < 0.1, the grade is B;
otherwise, the grade is C.
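The grading rule can be sketched as a small function (a minimal sketch; the upper bound of grade B is assumed to equal the grade-A threshold of 0.9, since the text's intermediate threshold appears garbled):

```python
def verticality_grade(ratios):
    """Map the four corner ratios to A (excellent), B (good) or
    C (unqualified), per the thresholds given in the text."""
    k = sorted(ratios)             # K1 <= K2 <= K3 <= K4
    k1, k4 = k[0], k[-1]
    if k1 >= 0.9:
        return "A"
    if 0.9 > k1 >= 0.8 and (k4 - k1) < 0.1:
        return "B"                 # assumed 0.9 upper bound (see lead-in)
    return "C"

examples = [
    [0.95, 0.96, 0.97, 0.99],      # all corners near 1 -> A
    [0.85, 0.88, 0.90, 0.92],      # slightly low but tight spread -> B
    [0.60, 0.70, 0.80, 0.90],      # one corner far off -> C
]
grades = [verticality_grade(r) for r in examples]
```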
In one embodiment, the method of fig. 1 further includes, after step 104, step 105: outputting the index of the image frame in which each target area attains its maximum boundary information value. The position of the photosensitive surface can then be adjusted according to the relation between the indices of the different target areas. For example, with the three-dimensional coordinate system of fig. 2 preset between the camera and the test card, when the focusing ring is rotated from Far to Near, the smaller the index of the frame in which a corner attains its maximum boundary information, the farther back that corner of the image sensor sits; if two or more corners attain their maxima in the same frame, their front-to-back positions are considered consistent. The photosensitive surface can thus be adjusted according to the frame indices corresponding to the four corners' maxima. Moreover, with an auto-focus lens the rotation step per frame is fixed, so the relation between the difference in the corners' peak-frame indices and the required structural adjustment can be obtained in advance by experimental training, and the front-to-back error distance of the four corners can then be calculated accurately.
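Step 105 reduces to an argmax over each region's per-frame series (the toy series below are invented for illustration):

```python
def peak_frame_indices(edge_series):
    """Return, for each target area, the index of the frame in which its
    boundary information value is largest; comparing corner indices shows
    which corner of the sensor sits farther forward or back."""
    return {name: max(range(len(series)), key=series.__getitem__)
            for name, series in edge_series.items()}

# If LT peaks at frame 1 but RT already at frame 0, the RT corner reaches
# best focus earlier in the Far-to-Near sweep than the LT corner.
edges = {"LT": [0.5, 1.0, 0.7], "RT": [0.9, 0.6, 0.4]}
indices = peak_frame_indices(edges)
```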
Fig. 3 is a diagram illustrating an optical test card according to an embodiment of the invention. The test card can be used for evaluating the perpendicularity of the light sensing surface and the optical axis of the image sensor, and as can be seen from the figure, the test card is of a square plane structure as a whole and comprises a gray background 31, a black sub-pattern 32 and a reference mark 33.
A black sub-pattern 32 is arranged at the center and at each of the four corners of the gray background 31; the corner sub-patterns are symmetric top-to-bottom and left-to-right. Each sub-pattern 32 contains rich detail information and is centrally symmetric; in addition to the sub-pattern of fig. 3, fig. 4 gives six exemplary sub-patterns for reference. It will be understood by those skilled in the art that the locations and number of sub-patterns given here are exemplary only and not limiting.
The reference mark 33 comprises a white triangle and a black triangle sharing one edge, which serve as a scale for adjusting the lens position. To ensure that the test card fills the entire field of view of the image sensor, the black triangle must be inside the field of view and the white triangle outside it during the test.
In one embodiment, the aspect ratio of the test card is set to 4:3 or 16:9. The aspect ratio can be chosen according to the image resolution of the camera under test; for example, for a camera with an image resolution of 800 × 600, the aspect ratio is set to 4:3.
In one embodiment, a single test card supports multiple resolutions by scaling. As shown in fig. 3, the test card can support both the 1080P and 720P resolutions. The sub-patterns for different resolutions may be the same or different.
It will be understood by those skilled in the art that the shape of the test card given here is merely exemplary; it should be arranged appropriately for the camera lens and is not limited thereto.
According to the optical test card of the embodiment, the boundary information of the image center and the four corner areas can be conveniently extracted, and the optical test card can be used for detecting the verticality of the photosensitive surface and the optical axis of the image sensor.
The following describes a specific example of evaluating the perpendicularity between the photosensitive surface and the optical axis of the image sensor by using the optical test card shown in fig. 3 and the evaluating method shown in fig. 1.
Fig. 5 is a block diagram of a system for detecting perpendicularity between a light-sensing surface and an optical axis of an image sensor according to an embodiment of the invention. As can be seen, the system comprises a test card 51, a camera 52 (with image sensor mounted, resolution 1080P), and an evaluation unit 53. The test process according to the system is specifically as follows:
First, the positions of the camera 52 and the test card 51 are adjusted so that the test card 51 fills the entire field of view while the target areas on the test card 51 remain within the selection boxes. The selection boxes are superimposed on the test card image by the evaluating unit to accurately calibrate the target areas (e.g., the center and four corners of the test card); the superimposed effect is shown in fig. 6.
Then, the focusing ring of the lens of camera 52 is rotated through one full turn, and the video data of the whole process is input to the evaluating unit.
Finally, the evaluating unit 53 calculates the perpendicularity between the photosensitive surface and the optical axis of the image sensor according to the boundary information of the target area, and outputs an evaluation result.
In a preferred embodiment, the test environment provides an illuminance of not less than 250 lx, distributed uniformly over the test card 51.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and the like that are within the spirit and principle of the present invention are included in the present invention.
Claims (6)
1. An evaluation method for perpendicularity of a light sensing surface and an optical axis of an image sensor is characterized by comprising the following steps:
collecting N frames of images of the test card in the process of one circle of rotation of a camera lens focusing ring;
extracting target regions from at least three same positions of the N frames of images respectively;
respectively calculating the boundary information value of each target area in each frame of image;
calculating the verticality between the photosensitive surface and the optical axis of the image sensor according to the boundary information value;
calculating the perpendicularity between the photosensitive surface and the optical axis of the image sensor according to the boundary information value comprises the following steps: selecting a reference target area; normalizing the boundary information values of other target areas of the image where the maximum boundary information value of the reference target area is located; calculating the ratio of the normalized boundary information value of the other target area to the maximum boundary information value of the reference target area; and outputting the verticality grade of the light sensing surface and the optical axis of the image sensor according to the ratio, wherein the lower the ratio is, the lower the verticality grade is.
2. The method of claim 1, further comprising, after outputting a level of perpendicularity of a photosurface of the image sensor with respect to the optical axis according to the ratio:
and outputting the label of the image frame where the maximum boundary information value corresponding to each target area is located.
3. The method of claim 2, wherein the calculating the boundary information value of each target region in each frame of image comprises:
respectively calculating the average value of the absolute values of the filtering results of all pixel points in each target area in each frame of image;
and carrying out logarithmic transformation on the average value.
4. The method of claim 3, further comprising performing a brightness normalization process on each target region in each frame of image before calculating an average of absolute values of the filtered results of all pixels in each target region in each frame of image.
5. The method of claim 4, wherein the target area comprises a center of each frame of image, and four corners of the upper, lower, left and right.
6. The method of claim 5, wherein the reference target region is a central region.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610188600.4A CN105812790B (en) | 2016-03-29 | 2016-03-29 | Method for evaluating verticality between photosensitive surface and optical axis of image sensor and optical test card |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105812790A CN105812790A (en) | 2016-07-27 |
CN105812790B true CN105812790B (en) | 2020-02-11 |
Family
ID=56454976
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610188600.4A Active CN105812790B (en) | 2016-03-29 | 2016-03-29 | Method for evaluating verticality between photosensitive surface and optical axis of image sensor and optical test card |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105812790B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106488224B (en) * | 2016-10-21 | 2019-03-01 | 上海乐愚智能科技有限公司 | A kind of calibration method and calibrating installation of camera |
CN106803953B (en) * | 2017-02-21 | 2018-10-16 | 上海集成电路研发中心有限公司 | A kind of device and method whether assessment camera focal plane is parallel |
CN108037642A (en) * | 2017-12-27 | 2018-05-15 | 四川大学 | The calibration method of excimer lithography lighting system coherence factor |
CN108037643A (en) * | 2018-03-27 | 2018-05-15 | 四川大学 | A kind of optimum image plane adjusting process based on CCD coherence factor detection devices |
CN108303855A (en) * | 2018-03-27 | 2018-07-20 | 四川大学 | A kind of litho machine coherence factor measurement method based on CCD imagings |
CN108833912A (en) * | 2018-08-22 | 2018-11-16 | 高新兴科技集团股份有限公司 | A kind of measurement method and system of video camera machine core optical axis center and field angle |
CN109509168B (en) * | 2018-08-30 | 2019-06-25 | 易诚博睿(南京)科技有限公司 | A kind of details automatic analysis method for picture quality objective evaluating dead leaf figure |
CN113820100A (en) * | 2021-09-30 | 2021-12-21 | 彩晶光电科技(昆山)有限公司 | Lens detection system for detecting lens |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0961933A (en) * | 1995-08-24 | 1997-03-07 | Victor Co Of Japan Ltd | Camcorder with still camera |
CN1794032A (en) * | 2005-12-30 | 2006-06-28 | 北京中星微电子有限公司 | Automatic focusing method |
CN1932631A (en) * | 2006-10-10 | 2007-03-21 | 北京中星微电子有限公司 | Automatic focusing method for digital image pickup device starting |
CN104639894A (en) * | 2014-12-11 | 2015-05-20 | 北京中星微电子有限公司 | Image focusing method and device for surveillance camera |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160027852A (en) * | 2014-09-02 | 2016-03-10 | 삼성전기주식회사 | System and method for detecting and compensating tilt angle of lens |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105812790B (en) | Method for evaluating verticality between photosensitive surface and optical axis of image sensor and optical test card | |
WO2020253827A1 (en) | Method and apparatus for evaluating image acquisition accuracy, and electronic device and storage medium | |
CN110044405B (en) | Automatic automobile instrument detection device and method based on machine vision | |
CN103185728B (en) | Image processing apparatus and image processing method | |
AU2007324081B2 (en) | Focus assist system and method | |
CN106604005B (en) | A kind of projection TV Atomatic focusing method and system | |
CN112033965B (en) | 3D arc surface defect detection method based on differential image analysis | |
CN1429013A (en) | Image processing device and method | |
CN110261069B (en) | Detection method for optical lens | |
CN112669394A (en) | Automatic calibration method for vision detection system | |
CN111025701B (en) | Curved surface liquid crystal screen detection method | |
CN105049734A (en) | License camera capable of giving shooting environment shooting prompt and shooting environment detection method | |
WO2019105433A1 (en) | Image distortion detection method and system | |
CN112394064A (en) | Point-line measuring method for screen defect detection | |
CN114140521A (en) | Method, device and system for identifying projection position and storage medium | |
US20170048518A1 (en) | Method and apparatus for adjusting installation flatness of lens in real time | |
CN114993614A (en) | AR head-mounted equipment testing equipment and testing method thereof | |
CN105391998B (en) | Automatic detection method and apparatus for resolution of low-light night vision device | |
CN113375555A (en) | Power line clamp measuring method and system based on mobile phone image | |
CN111044261B (en) | Method, device, storage medium and system for detecting illumination uniformity of eye fundus camera | |
JP6885764B2 (en) | Lens meter | |
JP5339070B2 (en) | Displacement measuring apparatus and measuring method | |
CN114219758A (en) | Defect detection method, system, electronic device and computer readable storage medium | |
CN109565544B (en) | Position designating device and position designating method | |
CN105989587A (en) | Automatic calibration method of multifunctional OCT (optical coherence tomography) system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||