CN116320747A - Method for horizontally checking image sensor and lens - Google Patents
- Publication number: CN116320747A
- Application number: CN202310566084.4A
- Authority: CN (China)
- Prior art keywords: lens, image sensor, monitoring area, diagonal
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Abstract
The invention belongs to the field of horizontal verification between an image sensor and a lens in a camera and provides a method for horizontally verifying the image sensor and the lens, comprising the following steps: a second test beam passes through the lens, the cross section of the second test beam being a square circumscribed about the outer edge of the lens; any one diagonal of the square is taken, the two intersection points of the diagonal with the outer edge of the lens are recorded, the intersection closest to the diagonal's start point is taken as a third reference point, and the other intersection is taken as a fourth reference point; a first monitoring area, a second monitoring area, a third monitoring area and a fourth monitoring area are acquired respectively; and the angle between the lens and the image sensor is adjusted based on the differences in image sharpness among the four monitoring areas, so that the lens and the image sensor reach a horizontal state. The invention can complete verification of the horizontal state at the initial time, and can then accurately verify whether the lens and the image sensor are horizontal according to the differences in image sharpness among the monitoring areas.
Description
Technical Field
The invention relates to the field of horizontal verification between an image sensor and a lens in a camera, in particular to a method for horizontally verifying the image sensor and the lens.
Background
Assembly errors and device errors are introduced when a high-resolution image sensor and a lens are produced and assembled, and whether the two are horizontal with respect to each other cannot be detected during their individual assembly. If locally blurred images appear in actual use after the whole camera has been assembled, the camera must be disassembled to readjust the position and angle of the lens or the image sensor, which incurs high time and labor costs for rework and degrades the user experience.
Disclosure of Invention
The invention aims to provide a method for horizontally checking an image sensor and a lens, which can complete the horizontal check of the lens and the image sensor before the whole camera is assembled, reducing the time and labor cost of performing the check after assembly.
To solve this technical problem, the invention adopts the following technical scheme:
the invention discloses a method for horizontally checking an image sensor and a lens, which comprises the following steps:
acquiring the prescribed distance between the lens and the image sensor according to the camera model;
passing a first test beam through the center of the lens and focusing it on the image sensor to form a lens-center focal point, performing analog-to-digital conversion on the lens-center focal point, and calculating the distance between the lens center and the image sensor from the lens-center focal point so that it equals the prescribed distance;
selecting a first reference point at any position on the lens other than its center, connecting the first reference point to the lens center with a straight line segment, taking the midpoint of that segment as a second reference point, passing the first test beam through the first reference point and the second reference point respectively, and focusing it on the image sensor to form a first focal point corresponding to the first reference point and a second focal point corresponding to the second reference point;
performing analog-to-digital conversion on the first focal point and the second focal point respectively to obtain the distances from the first reference point and the second reference point to the image sensor; if these two distances are equal, the lens and the image sensor are in a horizontal state at the initial time;
passing a second test beam through the lens and focusing it on the image sensor, the cross section of the second test beam being a square circumscribed about the outer edge of the lens; taking any one diagonal of the square, recording the two intersection points of the diagonal with the outer edge of the lens, taking the intersection closest to the diagonal's start point as a third reference point and the other intersection as a fourth reference point;
taking the line from the diagonal's start point to the third reference point as the diagonal of a first monitoring area, the line from the diagonal's start point to the lens center as the diagonal of a second monitoring area, the line from the diagonal's start point to the fourth reference point as the diagonal of a third monitoring area, and the line from the diagonal's start point to its end point as the diagonal of a fourth monitoring area, thereby acquiring the first, second, third and fourth monitoring areas respectively;
and calculating the image sharpness of each of the four monitoring areas, and adjusting the angle between the lens and the image sensor based on the differences in sharpness among the four monitoring areas, so that the lens and the image sensor are in a horizontal state.
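The initial level check in the steps above amounts to comparing three measured lens-to-sensor distances (at the lens center and the two reference points) against the prescribed distance. A minimal sketch follows; the tolerance value and the function name are illustrative assumptions, not part of the patent.

```python
def is_level_at_initial_time(d_center, d_first, d_second, prescribed, tol=0.01):
    """Return True when all three measured lens-to-sensor distances match
    the prescribed distance within an allowed error (tol is illustrative)."""
    return all(abs(d - prescribed) <= tol
               for d in (d_center, d_first, d_second))

# Hypothetical distances in millimetres for some camera model.
print(is_level_at_initial_time(4.000, 4.002, 3.999, prescribed=4.0))  # True
print(is_level_at_initial_time(4.000, 4.050, 3.999, prescribed=4.0))  # False
```

In practice each distance would come from the analog-to-digital conversion of the corresponding focal point, as the steps describe.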
As a further optimization, cameras of different models have different prescribed distances between the lens and the image sensor.
As a further optimization, the cross section of the first test beam is circular, and its diameter is smaller than the radius of the lens.
As a further optimization, after the distance between the lens center and the image sensor is calculated from the lens-center focal point, the difference between this distance and the prescribed distance is computed; if the difference is within a prescribed threshold range, the distance between the lens center and the image sensor is regarded as equal to the prescribed distance.
As a further optimization, if the difference is not within the prescribed threshold range, the lens and/or the image sensor is adjusted until the distance between the lens center and the image sensor equals the prescribed distance.
As a further optimization, when the lens and the image sensor are in a horizontal state at the initial time, the distance between the lens center and the image sensor, the distance between the first reference point and the image sensor, and the distance between the second reference point and the image sensor are equal and are all the prescribed distance.
As a further optimization, acquiring the first monitoring area, the second monitoring area, the third monitoring area and the fourth monitoring area respectively specifically means:
taking the line from the diagonal's start point to the third reference point as the diagonal of the first monitoring area, the first monitoring area being a first square area located within the cross section of the second test beam;
taking the line from the diagonal's start point to the lens center as the diagonal of the second monitoring area, the second monitoring area being the area located within the cross section of the second test beam and outside the first square area;
taking the line from the diagonal's start point to the fourth reference point as the diagonal of the third monitoring area, the third monitoring area being the area located within the cross section of the second test beam and outside the first and second monitoring areas;
taking the line from the diagonal's start point to its end point as the diagonal of the fourth monitoring area, the fourth monitoring area being the area located within the cross section of the second test beam and outside the first, second and third monitoring areas.
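The third and fourth reference points defined above can be computed analytically: for a circular lens of radius r centred at the origin, the circumscribed square has side 2r, and the diagonal from corner (-r, -r) to (r, r) crosses the lens edge where y = x meets x² + y² = r², i.e. at (±r/√2, ±r/√2). A hedged sketch with illustrative names:

```python
import math

def diagonal_reference_points(r):
    """Return the two intersections of the circumscribed square's diagonal
    (running from corner (-r, -r) to (r, r)) with a lens of radius r centred
    at the origin. The point nearer the diagonal's start corner plays the
    role of the third reference point; the other, the fourth."""
    t = r / math.sqrt(2.0)   # solve y = x together with x**2 + y**2 = r**2
    third = (-t, -t)         # closest to the diagonal's start corner (-r, -r)
    fourth = (t, t)
    return third, fourth

third, fourth = diagonal_reference_points(1.0)
# Both reference points lie exactly on the lens's outer edge.
assert all(math.isclose(math.hypot(*p), 1.0) for p in (third, fourth))
```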
As a further optimization, the first monitoring area does not overlap the lens area, while the second, third and fourth monitoring areas each overlap the lens area.
As a further optimization, calculating the image sharpness of the four monitoring areas respectively and adjusting the angle between the lens and the image sensor based on the differences in sharpness among the four monitoring areas, so that the lens and the image sensor are in a horizontal state, specifically means:
setting a first number of feature points in the first monitoring area and calculating the average image sharpness of the first monitoring area from the pixels at each feature point;
calculating the area ratio of the second, third and fourth monitoring areas and, based on this ratio, setting a second number of feature points in the second monitoring area, a third number of feature points in the third monitoring area and a fourth number of feature points in the fourth monitoring area;
calculating the average image sharpness of the second monitoring area from the pixels at its second number of feature points, the average image sharpness of the third monitoring area from the pixels at its third number of feature points, and the average image sharpness of the fourth monitoring area from the pixels at its fourth number of feature points;
judging whether the average image sharpness of the four monitoring areas is the same; if so, the four monitoring areas are parallel to the image sensor, that is, the lens and the image sensor are in a horizontal state; if not, calculating the angle between the lens and the image sensor from the differences among the four average sharpness values;
and adjusting the angle between the lens and the image sensor until the average image sharpness of the four monitoring areas is the same, ensuring that the lens and the image sensor are in a horizontal state.
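The area-proportional allocation of feature points described above can be sketched as follows. The counts and areas are illustrative, and since the patent does not fix a particular sharpness metric, the per-point sharpness samples are taken as given (a real system might derive each from local gradient or Laplacian energy around the feature point).

```python
def allocate_feature_points(areas, base_count=100):
    """Distribute a budget of feature points across monitoring areas in
    proportion to their areas; every area gets at least one point.
    (base_count and the area values are illustrative assumptions.)"""
    total = sum(areas)
    return [max(1, round(base_count * a / total)) for a in areas]

def mean_sharpness(samples):
    """Average the per-feature-point sharpness samples of one area."""
    return sum(samples) / len(samples)

# Hypothetical relative areas of the second, third and fourth monitoring areas.
counts = allocate_feature_points([3.0, 2.0, 1.0], base_count=60)
print(counts)  # [30, 20, 10]
```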
The beneficial effects of the invention are as follows: with this method for verifying that the image sensor and the lens are level, the horizontal state at the initial time can be verified according to the prescribed distance between the lens and the image sensor, and whether the lens and the image sensor are in a horizontal state can then be verified accurately according to the differences in image sharpness among the monitoring areas.
Drawings
FIG. 1 is a flow chart of a method for checking the level of an image sensor and a lens according to an embodiment of the invention;
FIG. 2 is a schematic diagram of each monitoring area in an embodiment of the present invention.
Where 101 is a first monitoring area, 102 is a second monitoring area, 103 is a third monitoring area, and 104 is a fourth monitoring area.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. The components of the embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
The embodiment provides a method for checking the level of an image sensor and a lens, the flow chart of which is shown in fig. 1, wherein the method comprises the following steps:
S1, acquiring the prescribed distance between the lens and the image sensor according to the camera model;
S2, passing a first test beam through the center of the lens and focusing it on the image sensor to form a lens-center focal point, performing analog-to-digital conversion on the lens-center focal point, and calculating the distance between the lens center and the image sensor from the lens-center focal point so that it equals the prescribed distance;
S3, selecting a first reference point at any position on the lens other than its center, connecting the first reference point to the lens center with a straight line segment, taking the midpoint of that segment as a second reference point, passing the first test beam through the first reference point and the second reference point respectively, and focusing it on the image sensor to form a first focal point corresponding to the first reference point and a second focal point corresponding to the second reference point;
S4, performing analog-to-digital conversion on the first focal point and the second focal point respectively to obtain the distances from the first reference point and the second reference point to the image sensor; if these two distances are equal, the lens and the image sensor are in a horizontal state at the initial time;
S5, passing a second test beam through the lens and focusing it on the image sensor, the cross section of the second test beam being a square circumscribed about the outer edge of the lens; taking any one diagonal of the square, recording the two intersection points of the diagonal with the outer edge of the lens, taking the intersection closest to the diagonal's start point as a third reference point and the other intersection as a fourth reference point;
S6, taking the line from the diagonal's start point to the third reference point as the diagonal of a first monitoring area, the line from the diagonal's start point to the lens center as the diagonal of a second monitoring area, the line from the diagonal's start point to the fourth reference point as the diagonal of a third monitoring area, and the line from the diagonal's start point to its end point as the diagonal of a fourth monitoring area, thereby acquiring the first, second, third and fourth monitoring areas respectively;
S7, calculating the image sharpness of each of the four monitoring areas, and adjusting the angle between the lens and the image sensor based on the differences in sharpness among the four monitoring areas, so that the lens and the image sensor are in a horizontal state.
According to this embodiment, the lens and the image sensor are first brought into a horizontal state at the initial time, and the average sharpness of each divided monitoring area can then be calculated, so that whether the lens and the image sensor are horizontal is accurately verified from the differences in average sharpness among the monitoring areas. This largely eliminates the need to check whether the lens and the image sensor are horizontal after the whole camera is assembled, greatly saving time and labor cost and improving the user experience of the camera product.
In the above method, because cameras of different models are used in different shooting scenarios, they impose different requirements on the distance between the lens and the image sensor. The method of this embodiment can meet the horizontal verification requirements between the lens and the image sensor in these different cameras, so the prescribed distance between the lens and the image sensor may differ across camera models.
Generally, for the first test beam, since the only points of interest on the lens are the lens center, the first reference point and the second reference point, and only the pixels at these three points need to be obtained, the size and shape of the beam's cross section are not strictly limited; however, the cross section should not be made too large, or selecting the pixels for these three points becomes difficult.
It should be noted that the prescribed distance is a theoretical value set according to the camera model. Because systematic errors and measurement errors may exist in the actual method of calculating the distance between the lens center and the image sensor from the lens-center focal point, the calculated actual distance may differ from the theoretical distance. In this case an error threshold can be set in practical application, that is, a difference between the calculated distance and the theoretical distance is allowed. Therefore, in this embodiment, after the distance between the lens center and the image sensor is calculated from the lens-center focal point, the difference between the calculated distance and the prescribed distance is computed; if the difference is within a prescribed threshold range, the distance between the lens center and the image sensor can be regarded as equal to the prescribed distance.
Here, if the difference is not within the prescribed threshold range, the lens and/or the image sensor can be adjusted until the distance between the lens center and the image sensor equals the prescribed distance, ensuring that the lens and the image sensor are in a horizontal state at the initial time.
It should be noted that, when the lens and the image sensor are in a horizontal state at the initial time, the distance between the lens center and the image sensor, the distance between the first reference point and the image sensor, and the distance between the second reference point and the image sensor should be equal, and should all be the prescribed distance mentioned in this embodiment.
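The adjust-until-within-threshold behaviour described in the preceding paragraphs can be sketched as a simple closed loop. The measurement and actuator hooks, the tolerance and the step policy are all illustrative assumptions; a real rig would measure via the lens-center focal point and move the lens or sensor mechanically.

```python
def adjust_to_prescribed(measure, move, prescribed, tol=0.005, max_steps=50):
    """Repeatedly measure the lens-center-to-sensor distance and move the
    lens (or sensor) until it matches the prescribed distance within tol.
    Returns True on convergence, False if max_steps is exhausted."""
    for _ in range(max_steps):
        error = measure() - prescribed
        if abs(error) <= tol:
            return True
        move(-error)  # step opposite to the signed error
    return False

# Simulated rig: the lens starts 0.1 mm too far from the sensor.
state = {"d": 4.1}
ok = adjust_to_prescribed(lambda: state["d"],
                          lambda delta: state.__setitem__("d", state["d"] + delta),
                          prescribed=4.0)
assert ok and abs(state["d"] - 4.0) <= 0.005
```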
Further, referring to FIG. 2, acquiring the first monitoring area 101, the second monitoring area 102, the third monitoring area 103 and the fourth monitoring area 104 respectively specifically means:
taking the line from the diagonal's start point to the third reference point as the diagonal of the first monitoring area 101, the first monitoring area 101 being a first square area located within the cross section of the second test beam;
taking the line from the diagonal's start point to the lens center as the diagonal of the second monitoring area 102, the second monitoring area 102 being the area located within the cross section of the second test beam and outside the first square area;
taking the line from the diagonal's start point to the fourth reference point as the diagonal of the third monitoring area 103, the third monitoring area 103 being the area located within the cross section of the second test beam and outside the first monitoring area 101 and the second monitoring area 102;
taking the line from the diagonal's start point to its end point as the diagonal of the fourth monitoring area 104, the fourth monitoring area 104 being the area located within the cross section of the second test beam and outside the first monitoring area 101, the second monitoring area 102 and the third monitoring area 103.
In this embodiment, to accurately judge whether the lens and the image sensor are horizontal, at least one monitoring area must contain no part of the lens area while several monitoring areas must overlap the lens area; therefore the first monitoring area 101 does not overlap the lens area, and the second monitoring area 102, the third monitoring area 103 and the fourth monitoring area 104 each overlap the lens area.
It should be added that calculating the image sharpness of the four monitoring areas respectively and adjusting the angle between the lens and the image sensor based on the differences in sharpness among the four monitoring areas, so that the lens and the image sensor are in a horizontal state, specifically means:
setting a first number of feature points in the first monitoring area 101 and calculating the average image sharpness of the first monitoring area 101 from the pixels at each feature point;
calculating the area ratio of the second monitoring area 102, the third monitoring area 103 and the fourth monitoring area 104 and, based on this ratio, setting a second number of feature points in the second monitoring area 102, a third number of feature points in the third monitoring area 103 and a fourth number of feature points in the fourth monitoring area 104;
calculating the average image sharpness of the second monitoring area 102 from the pixels at its second number of feature points, the average image sharpness of the third monitoring area 103 from the pixels at its third number of feature points, and the average image sharpness of the fourth monitoring area 104 from the pixels at its fourth number of feature points;
judging whether the average image sharpness of the four monitoring areas is the same; if so, the four monitoring areas are parallel to the image sensor, that is, the lens and the image sensor are in a horizontal state; if not, calculating the angle between the lens and the image sensor from the differences among the four average sharpness values;
and adjusting the angle between the lens and the image sensor until the average image sharpness of the four monitoring areas is the same, ensuring that the lens and the image sensor are in a horizontal state.
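The final equality test on the four average sharpness values can be sketched as follows. The relative tolerance is an illustrative assumption: the embodiment only requires the values to be the same, and any real sharpness measurement needs some allowed noise band.

```python
def is_level(sharpness_values, rel_tol=0.02):
    """Treat the lens/sensor pair as level when the four monitoring areas'
    average sharpness values agree within a relative tolerance
    (rel_tol is an illustrative assumption, not from the patent)."""
    lo, hi = min(sharpness_values), max(sharpness_values)
    return (hi - lo) <= rel_tol * hi

print(is_level([0.81, 0.80, 0.81, 0.80]))  # True: essentially uniform
print(is_level([0.90, 0.70, 0.88, 0.69]))  # False: one side clearly blurrier
```

When the test fails, the sign and size of the per-area differences indicate which direction and by how much the angle between the lens and the image sensor should be adjusted.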
The above is only a preferred embodiment of the present invention, and is not intended to limit the present invention, but various modifications and variations can be made to the present invention by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (9)
1. A method for checking the level of an image sensor and a lens, characterized by comprising the following steps:
acquiring the prescribed distance between the lens and the image sensor according to the camera model;
passing a first test beam through the center of the lens and focusing it on the image sensor to form a lens-center focal point, performing analog-to-digital conversion on the lens-center focal point, and calculating the distance between the lens center and the image sensor from the lens-center focal point so that it equals the prescribed distance;
selecting a first reference point at any position on the lens other than its center, connecting the first reference point to the lens center with a straight line segment, taking the midpoint of that segment as a second reference point, passing the first test beam through the first reference point and the second reference point respectively, and focusing it on the image sensor to form a first focal point corresponding to the first reference point and a second focal point corresponding to the second reference point;
performing analog-to-digital conversion on the first focal point and the second focal point respectively to obtain the distances from the first reference point and the second reference point to the image sensor; if these two distances are equal, the lens and the image sensor are in a horizontal state at the initial time;
passing a second test beam through the lens and focusing it on the image sensor, the cross section of the second test beam being a square circumscribed about the outer edge of the lens; taking any one diagonal of the square, recording the two intersection points of the diagonal with the outer edge of the lens, taking the intersection closest to the diagonal's start point as a third reference point and the other intersection as a fourth reference point;
taking the line from the diagonal's start point to the third reference point as the diagonal of a first monitoring area, the line from the diagonal's start point to the lens center as the diagonal of a second monitoring area, the line from the diagonal's start point to the fourth reference point as the diagonal of a third monitoring area, and the line from the diagonal's start point to its end point as the diagonal of a fourth monitoring area, thereby acquiring the first, second, third and fourth monitoring areas respectively;
and calculating the image sharpness of each of the four monitoring areas, and adjusting the angle between the lens and the image sensor based on the differences in sharpness among the four monitoring areas, so that the lens and the image sensor are in a horizontal state.
2. The method of claim 1, wherein cameras of different models have different prescribed distances between the lens and the image sensor.
3. The method of claim 1, wherein the first test beam has a circular cross section whose diameter is smaller than the radius of the lens.
4. The method for checking the level of an image sensor and a lens according to claim 1, wherein after the distance between the lens center and the image sensor is calculated from the lens-center focal point, the difference between this distance and the prescribed distance is computed, and if the difference is within a prescribed threshold range, the distance between the lens center and the image sensor is regarded as equal to the prescribed distance.
5. The method according to claim 4, wherein if the difference is not within the prescribed threshold range, the lens and/or the image sensor is adjusted until the distance between the lens center and the image sensor equals the prescribed distance.
6. The method for verifying the level of an image sensor and a lens according to claim 1, wherein when the lens and the image sensor are in a horizontal state at the initial time, the distance between the lens center and the image sensor, the distance between the first reference point and the image sensor, and the distance between the second reference point and the image sensor are equal and are all the prescribed distance.
7. The method for checking the level of an image sensor and a lens according to claim 1, wherein acquiring the first monitoring area, the second monitoring area, the third monitoring area and the fourth monitoring area respectively specifically means:
taking the line from the diagonal's start point to the third reference point as the diagonal of the first monitoring area, the first monitoring area being a first square area located within the cross section of the second test beam;
taking the line from the diagonal's start point to the lens center as the diagonal of the second monitoring area, the second monitoring area being the area located within the cross section of the second test beam and outside the first square area;
taking the line from the diagonal's start point to the fourth reference point as the diagonal of the third monitoring area, the third monitoring area being the area located within the cross section of the second test beam and outside the first and second monitoring areas;
taking the line from the diagonal's start point to its end point as the diagonal of the fourth monitoring area, the fourth monitoring area being the area located within the cross section of the second test beam and outside the first, second and third monitoring areas.
8. The method of claim 7, wherein the first monitoring area does not overlap the lens area, and the second, third and fourth monitoring areas each overlap the lens area.
9. The method for checking the level of an image sensor and a lens according to claim 1, 7 or 8, wherein calculating the image sharpness of the four monitoring areas respectively and adjusting the angle between the lens and the image sensor based on the differences in sharpness among the four monitoring areas, so that the lens and the image sensor are in a horizontal state, specifically means:
setting a first number of feature points in the first monitoring area and calculating the average image sharpness of the first monitoring area from the pixels at each feature point;
calculating the area ratio of the second, third and fourth monitoring areas and, based on this ratio, setting a second number of feature points in the second monitoring area, a third number of feature points in the third monitoring area and a fourth number of feature points in the fourth monitoring area;
calculating the average image sharpness of the second monitoring area from the pixels at its second number of feature points, the average image sharpness of the third monitoring area from the pixels at its third number of feature points, and the average image sharpness of the fourth monitoring area from the pixels at its fourth number of feature points;
judging whether the average image sharpness of the four monitoring areas is the same; if so, the four monitoring areas are parallel to the image sensor, that is, the lens and the image sensor are in a horizontal state; if not, calculating the angle between the lens and the image sensor from the differences among the four average sharpness values;
and adjusting the angle between the lens and the image sensor until the average image sharpness of the four monitoring areas is the same, ensuring that the lens and the image sensor are in a horizontal state.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310566084.4A CN116320747A (en) | 2023-05-19 | 2023-05-19 | Method for horizontally checking image sensor and lens |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116320747A (en) | 2023-06-23 |
Family
ID=86785305
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310566084.4A Pending CN116320747A (en) | 2023-05-19 | 2023-05-19 | Method for horizontally checking image sensor and lens |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116320747A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110091101A1 (en) * | 2009-10-20 | 2011-04-21 | Apple Inc. | System and method for applying lens shading correction during image processing |
CN106482640A (en) * | 2016-12-07 | 2017-03-08 | Beijing Hanbang Gaoke Digital Technology Co., Ltd. | Apparatus and method for optical-axis correction of an all-in-one camera core |
CN107655421A (en) * | 2016-07-25 | 2018-02-02 | Chromasens GmbH | Process and apparatus for scanning surfaces with a stereo scanning camera |
CN107702695A (en) * | 2017-09-26 | 2018-02-16 | Goertek Inc. | Method for testing the relative position of a camera module lens and an image sensor |
CN108375370A (en) * | 2018-07-02 | 2018-08-07 | Jiangsu Institute of Intelligent Science and Technology Application, Chinese Academy of Sciences | Integrated navigation system for an intelligent patrol UAV |
CN111947896A (en) * | 2020-08-05 | 2020-11-17 | Shanghai Ankon Medical Technologies Co., Ltd. | System and method for aligning the optical center of a lens with the center of the photosensitive surface of an imaging sensor |
US20200393645A1 (en) * | 2017-07-28 | 2020-12-17 | Denso Wave Incorporated | Optical information reader and method of manufacturing the optical information reader |
CN112288822A (en) * | 2020-09-22 | 2021-01-29 | Suzhou Aiweishi Image Technology Co., Ltd. | Camera active alignment method combined with calibration |
CN113422885A (en) * | 2020-03-02 | 2021-09-21 | Jiangxi Jingchao Optical Co., Ltd. | Lens adjustment method and camera module assembly system |
Non-Patent Citations (1)
Title |
---|
LI, Zhiyuan: "Quick Test of Three New Digital Cameras", Digital Photography (数码摄影), no. 07 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109767476B (en) | Automatic focusing binocular camera calibration and depth calculation method | |
CN102169573B (en) | Real-time distortion correction method and system of lens with high precision and wide field of view | |
CN109001224B (en) | Welding seam detection method and detection device | |
CN102063718A (en) | Field calibration and precision measurement method for spot laser measuring system | |
JP5548310B2 (en) | Imaging device, imaging system including imaging device, and imaging method | |
CN107941153B (en) | Visual system for optimizing calibration of laser ranging | |
CN111080705B (en) | Calibration method and device for automatic focusing binocular camera | |
CN103163725A (en) | Camera module detection device and detection method | |
CN105205806B (en) | Precision compensation method based on machine vision | |
CN112255758B (en) | Device and method for realizing simultaneous focusing of screen and workpiece in deflection measurement | |
CN114022370B (en) | Galvanometer laser processing distortion correction method and system | |
CN112308934B (en) | Calibration detection method and device, storage medium and computing equipment | |
CN112381847A (en) | Pipeline end head space pose measuring method and system | |
CN107063644B (en) | Finite object distance distortion measuring method and system | |
CN110827360B (en) | Photometric stereo measurement system and method for calibrating light source direction thereof | |
CN211374003U (en) | Lens testing device | |
CN113163122B (en) | Focusing method | |
CN116320747A (en) | Method for horizontally checking image sensor and lens | |
CN110986760B (en) | Three-dimensional reconstruction-based method and system for checking size of special-shaped structure | |
CN101793515A (en) | Device and method for aiming of micro target pellet with diagnostic device | |
CN112747692A (en) | Three-dimensional measurement method and device for precise small hole | |
CN108286960B (en) | Focusing type light tube array device and photographic detection method | |
CN107462187B (en) | Method and device for determining light spot circle center during coaxiality detection of ceramic ferrule | |
CN112489141B (en) | Production line calibration method and device for single-board single-image strip relay lens of vehicle-mounted camera | |
CN112782201B (en) | Lobster eye optical device calibration method based on X-ray focusing image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||