CN114236745A - Lens alignment method based on three-axis movement - Google Patents

Lens alignment method based on three-axis movement

Info

Publication number
CN114236745A
CN114236745A (application CN202111572046.7A)
Authority
CN
China
Prior art keywords
lens
definition
value
focusing
axis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111572046.7A
Other languages
Chinese (zh)
Other versions
CN114236745B (en)
Inventor
吴方
杨文冠
卢庆杰
赵治平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Hongjing Optoelectronics Technology Co Ltd
Original Assignee
Guangdong Hongjing Optoelectronics Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Hongjing Optoelectronics Technology Co Ltd filed Critical Guangdong Hongjing Optoelectronics Technology Co Ltd
Priority to CN202111572046.7A priority Critical patent/CN114236745B/en
Publication of CN114236745A publication Critical patent/CN114236745A/en
Application granted granted Critical
Publication of CN114236745B publication Critical patent/CN114236745B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G02B7/38Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals measured at different points on the optical axis, e.g. focussing on two or more planes and comparing image data

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Automatic Focus Adjustment (AREA)
  • Studio Devices (AREA)
  • Lens Barrels (AREA)

Abstract

The invention provides a lens alignment method based on three-axis movement, comprising the following steps: the imaging sensor is moved along the Z axis toward the lens to a focusing initial position, then moved away from the lens from that position to perform a central focusing step, during which multiple groups of image sharpness values and the corresponding Z-axis positions are collected; the Z-axis position FocusBestPosition corresponding to the maximum image sharpness is recorded, and the imaging sensor is moved to FocusBestPosition after focusing is completed. After central focusing, the sensitive lens of the lens is moved in the XY plane to perform lens alignment. After lens alignment, the imaging sensor returns along the Z axis to the focusing initial position and moves away from the lens from that position to perform a defocusing step. The invention provides a dedicated alignment algorithm for the alignment equipment and the alignment procedure, so that the equipment performs the lens alignment operation automatically, effectively improving alignment efficiency and the lens resolution yield.

Description

Lens alignment method based on three-axis movement
The present application is a divisional application of Chinese application No. 202110790104.7, filed on July 13, 2021, and entitled "Optical lens alignment method and optical lens alignment system".
Technical Field
The invention relates to a lens alignment method based on three-axis movement.
Background
An optical lens is composed of a plurality of lens elements, among which the sensitive lens identified in the optical design has a large influence on the overall resolution of the lens: small changes in its position change the resolution significantly. In production, lens alignment equipment therefore moves the sensitive lens in the horizontal plane to find the position with the best resolution, improving both the yield and the quality of the lens resolution.
Disclosure of Invention
The invention provides a lens alignment method based on three-axis movement, applied to lens alignment equipment and realized by the following technical solution:
the lens alignment method based on three-axis movement comprises the following steps:
moving the imaging sensor along the Z axis toward the lens to a focusing initial position, and then moving it away from the lens from the focusing initial position to perform a central focusing step;
in the central focusing step, collecting multiple groups of image sharpness values and the corresponding Z-axis positions, finally recording the Z-axis position FocusBestPosition corresponding to the maximum image sharpness, and moving the imaging sensor to FocusBestPosition after focusing is completed; that is to say,
calculating the sharpness value FocusVal of the central area of the current image before each movement;
comparing the sharpness value FocusVal with a preset maximum sharpness value FocusRef to control the distance of the next movement:
when the difference between the sharpness value FocusVal and the preset maximum sharpness value FocusRef is greater than or equal to the threshold FocusDiff1, the imaging sensor moves away from the lens by a distance MoveDistance1;
when the difference between the sharpness value FocusVal and the preset maximum sharpness value FocusRef is greater than or equal to the threshold FocusDiff2 and less than the threshold FocusDiff1, the imaging sensor moves away from the lens by a distance MoveDistance2;
when the sharpness value FocusVal approaches or exceeds the preset maximum sharpness value FocusRef, the imaging sensor moves away from the lens by a distance MoveDistance3, where "approaches" means that the difference between the sharpness value FocusVal and the preset maximum sharpness value FocusRef is less than the threshold FocusDiff3;
repeating the above operations, and recording the maximum sharpness FocusMax reached by the sharpness value FocusVal and the Z-axis position FocusBestPosition at which it occurs;
when the difference between the sharpness value FocusVal and the maximum sharpness FocusMax is greater than or equal to the threshold FocusDrop, focusing is completed; the imaging sensor is moved to the position FocusBestPosition after focusing is completed;
after central focusing is completed, moving the sensitive lens of the lens in the XY plane to perform lens alignment; the lens alignment step comprises a coarse adjustment step, a fine adjustment step and a finer adjustment step that are executed in sequence; wherein:
the coarse adjustment step, centered on the position FocusBestPosition, moves the sensitive lens within a set movement area to obtain the image sharpness values corresponding to the coordinate points, and finally records the position FocusWholeBestPosition1 corresponding to the maximum image sharpness; the sensitive lens is moved to the position FocusWholeBestPosition1 after coarse adjustment is completed;
the fine adjustment step, centered on the position FocusWholeBestPosition1, reduces the movement area of the sensitive lens to obtain the image sharpness values corresponding to the coordinate points in that area, and finally records the position FocusWholeBestPosition2 corresponding to the maximum image sharpness; the sensitive lens is moved to the position FocusWholeBestPosition2 after fine adjustment is completed;
the finer adjustment step, centered on the position FocusWholeBestPosition2, further reduces the movement area of the sensitive lens to obtain the image sharpness values corresponding to the coordinate points in that area, and finally records the position FocusWholeBestPosition3 corresponding to the maximum image sharpness; the sensitive lens is moved to the position FocusWholeBestPosition3 after the finer adjustment is completed;
after lens alignment is completed, the imaging sensor returns along the Z axis to the focusing initial position and starts to move away from the lens from the focusing initial position to execute a defocusing step; in the defocusing step, collecting multiple groups of image sharpness values and the corresponding Z-axis positions, and finally recording the Z-axis position FocusWholeBestPosition4 corresponding to the maximum image sharpness; the imaging sensor is moved to the position FocusWholeBestPosition4 after defocusing is completed.
In one or more embodiments of the present invention, in the central focusing step, the position or the movement distance of the Z axis in the focusing process is taken as the abscissa, and the sharpness value FocusVal of the central area is taken as the ordinate, so as to output the central focusing curve.
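For illustration only, the following minimal sketch (in Python, which the patent does not prescribe) shows one way such a central focusing curve could be output; the function name and the sample data are invented for the example and are not taken from the patent.

```python
# Sketch only: plot a central focusing curve (Z-axis position vs. FocusVal).
import matplotlib.pyplot as plt

def plot_center_focus_curve(z_positions, focus_vals):
    """Abscissa: Z-axis position (or movement distance); ordinate: FocusVal."""
    plt.plot(z_positions, focus_vals, marker="o")
    plt.xlabel("Z-axis position / movement distance")
    plt.ylabel("Central-area sharpness FocusVal")
    plt.title("Central focusing curve")
    plt.show()

# Invented sample values, for demonstration only:
plot_center_focus_curve([0, 10, 20, 30, 40, 50],
                        [12.1, 30.5, 88.2, 140.7, 95.3, 41.0])
```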
In one or more embodiments of the present invention, the defocusing step performs the following operations:
acquiring an image before each movement, and calculating the sharpness value FocusValCen of the central area of the image and the sharpness values FocusValAround(n) of a plurality of peripheral areas, where n = 0, 1, 2, …;
calculating the overall sharpness value FocusWhole of the image: FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1];
comparing the overall sharpness value FocusWhole with a preset maximum sharpness value FocusWholeRefB to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefB is greater than or equal to the threshold FocusWholeDiffB1, the imaging sensor moves away from the lens by a distance MoveDistanceB1;
when the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefB is greater than or equal to the threshold FocusWholeDiffB2 and less than the threshold FocusWholeDiffB1, the imaging sensor moves away from the lens by a distance MoveDistanceB2;
when the overall sharpness value FocusWhole approaches or exceeds the preset maximum sharpness value FocusWholeRefB, the imaging sensor moves away from the lens by a distance MoveDistanceB3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefB is less than the threshold FocusWholeDiffB3;
repeating the above operations, and recording the maximum overall sharpness FocusWholeMax reached by FocusWhole and the Z-axis position FocusWholeBestPosition4 at which it occurs;
when the difference between the recorded overall sharpness value FocusWhole and the maximum sharpness FocusWholeMax is greater than or equal to the threshold FocusWholeDrop, or when the total movement length exceeds the threshold MoveDistanceMax, defocusing is completed;
the imaging sensor is moved to the position FocusWholeBestPosition4 after defocusing is completed.
In one or more embodiments of the present invention, in the defocusing step, a defocusing curve is output with the Z-axis position or movement distance during focusing as the abscissa and the sharpness value as the ordinate; the curves corresponding to the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) are marked with different colors or line types.
In one or more embodiments of the present invention, the imaging sensor moves continuously toward the lens along the Z axis from the initial position; the central area of the image passes from blurred to sharp and back to blurred, the movement stops when the central area turns from sharp to blurred, and the current position is defined as the focusing initial position.
The invention has the beneficial effects that a dedicated alignment algorithm is provided for the alignment equipment and the alignment procedure, so that the equipment automatically executes the lens alignment operation. The three operations comprise lens central focusing, lens alignment and lens defocusing, and the lens alignment itself is divided into coarse, fine and finer adjustment steps, which effectively improves alignment efficiency and the lens resolution yield.
Drawings
Fig. 1 is a general flow chart of the present invention.
FIG. 2 is a flowchart illustrating a lens center focusing method according to the present invention.
Fig. 3 is a flow chart of the coarse adjustment of lens alignment of the present invention.
Fig. 4 is a flow chart of the fine adjustment of lens alignment of the present invention.
Fig. 5 is a flow chart of the finer adjustment of lens alignment of the present invention.
Fig. 6 is a flow chart of the lens defocusing of the present invention.
Fig. 7 is a structural diagram of the optical lens aligning system of the present invention.
Fig. 8 is an architecture diagram of a lens center focusing module, a lens alignment module and a lens defocusing module of the optical lens alignment system of the present invention.
Fig. 9 is a screenshot of an image captured by the image capture module of the present invention.
Fig. 10 is a defocus curve map of the present invention.
Detailed Description
The scheme of the present application is further described below with reference to Figures 1 to 10.
Referring to Figs. 1 to 6, the lens alignment method based on three-axis movement includes the following steps:
S1, the imaging sensor is moved continuously from its initial position toward the lens by the Z-axis mechanism; the central area of the image passes from blurred to sharp and back to blurred, the Z-axis mechanism stops when the central area of the image turns from sharp to blurred, and the current position is defined as the focusing initial position FocusStartPosition;
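For illustration only, the following minimal sketch shows one way step S1 could be implemented in Python. The stage/camera interface, the Laplacian-variance focus measure, and the 5% drop tolerance are assumptions made for the example; the patent specifies neither a sharpness metric nor a motion API.

```python
import numpy as np

def sharpness(gray) -> float:
    """Assumed focus measure (the patent does not specify one): variance of a
    finite-difference Laplacian of the grayscale region."""
    g = np.asarray(gray, dtype=float)
    lap = (np.roll(g, 1, 0) + np.roll(g, -1, 0)
           + np.roll(g, 1, 1) + np.roll(g, -1, 1) - 4.0 * g)
    return float(lap.var())

def find_focus_start_position(stage, camera, step, center):
    """Move the imaging sensor toward the lens until the image center goes
    sharp -> blurred, and return that Z position as FocusStartPosition.
    Assumed interfaces: stage.move_z(delta) (negative delta = toward the lens),
    stage.z(), camera.grab() returning a grayscale array; `center` is a tuple
    of slices selecting the central area, e.g. (slice(400, 600), slice(500, 700))."""
    best = -1.0
    while True:
        val = sharpness(camera.grab()[center])
        if val >= 0.95 * best:          # still blurred -> sharp (5% tolerance, arbitrary)
            best = max(best, val)
            stage.move_z(-step)         # keep approaching the lens
        else:                           # sharpness fell: sharp -> blurred
            return stage.z()            # FocusStartPosition
```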
S2, the imaging sensor is moved away from the lens by the Z-axis mechanism, and the central focusing step is executed:
S21, calculating the sharpness value FocusVal of the central area of the current image before each movement;
S22, comparing the sharpness value FocusVal with a preset maximum sharpness value FocusRef (the maximum sharpness value that could be attained) to control the distance of the next movement:
when the difference between the sharpness value FocusVal and the preset maximum sharpness value FocusRef is greater than or equal to the threshold FocusDiff1, the imaging sensor moves away from the lens by a distance MoveDistance1;
when the difference between the sharpness value FocusVal and the preset maximum sharpness value FocusRef is greater than or equal to the threshold FocusDiff2 and less than the threshold FocusDiff1, the imaging sensor moves away from the lens by a distance MoveDistance2;
when the sharpness value FocusVal approaches or exceeds the preset maximum sharpness value FocusRef, the imaging sensor moves away from the lens by a distance MoveDistance3, where "approaches" means that the difference between the sharpness value FocusVal and the preset maximum sharpness value FocusRef is less than the threshold FocusDiff3;
S23, repeating steps S21 and S22, and recording the maximum sharpness FocusMax reached by the sharpness value FocusVal and the Z-axis position FocusBestPosition at which it occurs;
S24, when the difference between the sharpness value FocusVal and the maximum sharpness FocusMax is greater than or equal to the threshold FocusDrop, focusing is completed; the imaging sensor is moved to the position FocusBestPosition after focusing is completed;
S25, outputting a central focusing curve with the Z-axis position or movement distance during focusing as the abscissa and the central-area sharpness FocusVal as the ordinate;
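For illustration only, the following sketch of steps S21 to S25 reuses the sharpness() helper and the hypothetical stage/camera interface from the previous sketch (plus an assumed stage.move_z_to()); the dictionary p simply carries the parameters named above (FocusRef, FocusDiff1-3, MoveDistance1-3, FocusDrop), and the safety bound MaxSteps is an addition not found in the patent.

```python
def center_focus(stage, camera, center, p):
    """Sketch of the central focusing loop S21-S25 (curve plotting omitted)."""
    focus_max, best_z, trace = -1.0, stage.z(), []
    for _ in range(int(p.get("MaxSteps", 1000))):        # safety bound (assumption)
        focus_val = sharpness(camera.grab()[center])      # S21: central sharpness
        trace.append((stage.z(), focus_val))              # data for the S25 focusing curve
        if focus_val > focus_max:                         # S23: track the peak
            focus_max, best_z = focus_val, stage.z()
        if focus_max - focus_val >= p["FocusDrop"]:       # S24: sharpness fell off the peak
            break
        gap = p["FocusRef"] - focus_val                   # S22: choose the next step
        if gap >= p["FocusDiff1"]:
            step = p["MoveDistance1"]       # far below FocusRef: large step
        elif gap >= p["FocusDiff2"]:
            step = p["MoveDistance2"]       # closer: medium step
        else:
            step = p["MoveDistance3"]       # near or above FocusRef: small step
            # (the band FocusDiff3 <= gap < FocusDiff2 is not specified in the
            #  patent and falls through to the small step here)
        stage.move_z(step)                  # positive delta assumed = away from the lens
    stage.move_z_to(best_z)                 # S24: return to FocusBestPosition
    return best_z, trace
```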
S3, the sensitive lens of the lens is moved in the horizontal plane by the XY-axis mechanism to execute the coarse adjustment step of lens alignment:
S31, with the current position FocusBestPosition as the center of the plane, setting the movement area of the X and Y axes to [XRange1, YRange1] and the single-step movement of each axis to MoveStep1;
S32, dividing the movement area [XRange1, YRange1] into a grid with the movement amount MoveStep1 as the unit, each node of the grid being a stop point of the movement;
S33, moving the sensitive lens to each stop point, and collecting the sharpness value FocusValCen of the central area of the image and the sharpness values FocusValAround(n) of a plurality of peripheral areas when the sensitive lens is at the stop point, where n = 0, 1, 2, …;
S34, calculating the overall sharpness value FocusWhole of the image: FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1];
S35, comparing the overall sharpness value FocusWhole with a preset maximum sharpness value FocusWholeRefA (the maximum sharpness value that could be attained) to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA1, the sensitive lens moves by a distance MoveDistanceA1;
when the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA2 and less than the threshold FocusWholeDiffA1, the sensitive lens moves by a distance MoveDistanceA2;
when the overall sharpness value FocusWhole approaches or exceeds the preset maximum sharpness value FocusWholeRefA, the sensitive lens moves by a distance MoveDistanceA3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefA is less than the threshold FocusWholeDiffA3;
S36, traversing every stop point, and recording the maximum overall sharpness FocusWholeMax reached by FocusWhole during the traversal and the corresponding position FocusWholeBestPosition1; after every stop point has been traversed, the coarse adjustment is completed and the sensitive lens is moved to the position FocusWholeBestPosition1;
S37, outputting a coarse-adjustment alignment curve with the traversal sequence number of each stop point in the coarse adjustment step as the abscissa and the sharpness value as the ordinate; the curves corresponding to the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) are marked with different colors or line types;
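For illustration only, the following sketch shows one possible form of the weighted overall sharpness of S34 and of the grid traversal of S31 to S36, reusing the sharpness() helper and the hypothetical interfaces introduced above (plus an assumed stage.move_xy_to()); the adaptive step selection of S35 is omitted, and the region and weight arguments are illustrative assumptions.

```python
import numpy as np

def focus_whole(image, center, around, cen_power, around_powers):
    """S34: FocusWhole = FocusValCen*CenPower + sum_n FocusValAround(n)*AroundPower(n)."""
    val = sharpness(image[center]) * cen_power
    for region, power in zip(around, around_powers):
        val += sharpness(image[region]) * power
    return val

def grid_align(stage, camera, center_xy, x_range, y_range, move_step,
               center, around, cen_power, around_powers):
    """One XY grid search (S31-S36): visit every grid node of pitch move_step
    inside [x_range, y_range] centered on center_xy and keep the node whose
    FocusWhole is largest (curve output of S37 omitted)."""
    cx, cy = center_xy
    xs = np.arange(cx - x_range / 2.0, cx + x_range / 2.0 + move_step / 2.0, move_step)
    ys = np.arange(cy - y_range / 2.0, cy + y_range / 2.0 + move_step / 2.0, move_step)
    best_val, best_xy = -1.0, center_xy
    for y in ys:
        for x in xs:                                   # S33: each stop point
            stage.move_xy_to(x, y)
            val = focus_whole(camera.grab(), center, around, cen_power, around_powers)
            if val > best_val:                         # S36: track FocusWholeMax
                best_val, best_xy = val, (x, y)
    stage.move_xy_to(*best_xy)                         # move to the best position found
    return best_xy
```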
S4, the sensitive lens of the lens is moved in the horizontal plane by the XY-axis mechanism to execute the fine adjustment step of lens alignment:
S41, with the current position FocusWholeBestPosition1 as the center of the plane, setting the movement area of the X and Y axes to [XRange2, YRange2] and the single-step movement of each axis to MoveStep2, where XRange2 is smaller than XRange1, YRange2 is smaller than YRange1, and MoveStep2 is smaller than MoveStep1;
S42, dividing the movement area [XRange2, YRange2] into a grid with the movement amount MoveStep2 as the unit, each node of the grid being a stop point of the movement;
S43, moving the sensitive lens to each stop point, and collecting the sharpness value FocusValCen of the central area of the image and the sharpness values FocusValAround(n) of a plurality of peripheral areas when the sensitive lens is at the stop point, where n = 0, 1, 2, …;
S44, calculating the overall sharpness value FocusWhole of the image: FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1];
S45, comparing the overall sharpness value FocusWhole with the preset maximum sharpness value FocusWholeRefA to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA1, the sensitive lens moves by a distance MoveDistanceA1;
when the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA2 and less than the threshold FocusWholeDiffA1, the sensitive lens moves by a distance MoveDistanceA2;
when the overall sharpness value FocusWhole approaches or exceeds the preset maximum sharpness value FocusWholeRefA, the sensitive lens moves by a distance MoveDistanceA3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefA is less than the threshold FocusWholeDiffA3;
S46, traversing every stop point, and recording the maximum overall sharpness FocusWholeMax reached during the traversal and the corresponding position FocusWholeBestPosition2; after every stop point has been traversed, the sensitive lens is moved to the position FocusWholeBestPosition2;
S47, outputting a fine-adjustment alignment curve with the traversal sequence number of each stop point in the fine adjustment step as the abscissa and the sharpness value as the ordinate; the curves corresponding to the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) are marked with different colors or line types;
S5, the sensitive lens of the lens is moved in the horizontal plane by the XY-axis mechanism to execute the finer adjustment step of lens alignment:
S51, with the current position FocusWholeBestPosition2 as the center of the plane, setting the movement area of the X and Y axes to [XRange3, YRange3] and the single-step movement of each axis to MoveStep3, where XRange3 is smaller than XRange2, YRange3 is smaller than YRange2, and MoveStep3 is smaller than MoveStep2;
S52, dividing the movement area [XRange3, YRange3] into a grid with the movement amount MoveStep3 as the unit, each node of the grid being a stop point of the movement;
S53, moving the sensitive lens to each stop point, and collecting the sharpness value FocusValCen of the central area of the image and the sharpness values FocusValAround(n) of a plurality of peripheral areas when the sensitive lens is at the stop point, where n = 0, 1, 2, …;
S54, calculating the overall sharpness value FocusWhole of the image: FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1];
S55, comparing the overall sharpness value FocusWhole with the preset maximum sharpness value FocusWholeRefA to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA1, the sensitive lens moves by a distance MoveDistanceA1;
when the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefA is greater than or equal to the threshold FocusWholeDiffA2 and less than the threshold FocusWholeDiffA1, the sensitive lens moves by a distance MoveDistanceA2;
when the overall sharpness value FocusWhole approaches or exceeds the preset maximum sharpness value FocusWholeRefA, the sensitive lens moves by a distance MoveDistanceA3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefA is less than the threshold FocusWholeDiffA3;
S56, traversing every stop point, and recording the maximum overall sharpness FocusWholeMax reached during the traversal and the corresponding position FocusWholeBestPosition3; after every stop point has been traversed, the sensitive lens is moved to the position FocusWholeBestPosition3;
S57, outputting a finer-adjustment alignment curve with the traversal sequence number of each stop point in the finer adjustment step as the abscissa and the sharpness value as the ordinate; the curves corresponding to the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) are marked with different colors or line types;
After the coarse, fine and finer adjustment steps, the sensitive lens of the lens is positioned with progressively higher accuracy.
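For illustration only, the three-stage search can be expressed as repeated calls to the grid_align() sketch above with shrinking ranges and step sizes; the numeric values below are invented placeholders (the patent gives no concrete values or units), and only the ordering constraints XRange3 < XRange2 < XRange1, YRange3 < YRange2 < YRange1 and MoveStep3 < MoveStep2 < MoveStep1 are taken from the description.

```python
XY_SEARCH_STAGES = [
    # (XRange, YRange, MoveStep): each stage smaller than the previous one.
    (0.400, 0.400, 0.050),   # coarse adjustment -> FocusWholeBestPosition1
    (0.100, 0.100, 0.010),   # fine adjustment   -> FocusWholeBestPosition2
    (0.020, 0.020, 0.002),   # finer adjustment  -> FocusWholeBestPosition3
]

def align_sensitive_lens(stage, camera, start_xy, center, around, cen_power, around_powers):
    """Chain the coarse, fine and finer grid searches, each centered on the
    best position found by the previous stage."""
    best_xy = start_xy
    for x_range, y_range, move_step in XY_SEARCH_STAGES:
        best_xy = grid_align(stage, camera, best_xy, x_range, y_range, move_step,
                             center, around, cen_power, around_powers)
    return best_xy    # final sensitive-lens position (FocusWholeBestPosition3)
```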
S6, the imaging sensor is moved along the Z axis to the focusing initial position FocusStartPosition and, starting from that position, moves continuously away from the lens while the defocusing step is executed:
S61, acquiring an image before each movement, and calculating the sharpness value FocusValCen of the central area of the image and the sharpness values FocusValAround(n) of a plurality of peripheral areas, where n = 0, 1, 2, …;
S62, calculating the overall sharpness value FocusWhole of the image: FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1];
S63, comparing the overall sharpness value FocusWhole with a preset maximum sharpness value FocusWholeRefB (the maximum sharpness value that could be attained) to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefB is greater than or equal to the threshold FocusWholeDiffB1, the imaging sensor moves away from the lens by a distance MoveDistanceB1;
when the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefB is greater than or equal to the threshold FocusWholeDiffB2 and less than the threshold FocusWholeDiffB1, the imaging sensor moves away from the lens by a distance MoveDistanceB2;
when the overall sharpness value FocusWhole approaches or exceeds the preset maximum sharpness value FocusWholeRefB, the imaging sensor moves away from the lens by a distance MoveDistanceB3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefB is less than the threshold FocusWholeDiffB3;
S64, repeating steps S61 to S63, and recording the maximum overall sharpness FocusWholeMax reached by FocusWhole during the process and the Z-axis position FocusWholeBestPosition4 at which it occurs;
S65, when the difference between the recorded overall sharpness value FocusWhole and the maximum sharpness FocusWholeMax is greater than or equal to the threshold FocusWholeDrop, or when the total movement length exceeds the threshold MoveDistanceMax, defocusing is completed;
S66, the imaging sensor is moved to the position FocusWholeBestPosition4 after defocusing is completed;
S67, outputting a defocusing curve with the Z-axis position or movement distance during focusing as the abscissa and the sharpness value as the ordinate; the curves corresponding to the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) are marked with different colors or line types.
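For illustration only, the following sketch of the defocusing loop S61 to S66 reuses focus_whole() and the hypothetical interfaces from the earlier sketches; the B-series parameters are carried by name in the dictionary p, and the curve output of S67 is omitted.

```python
def defocus_search(stage, camera, center, around, cen_power, around_powers, p):
    """Sketch of S61-S66: move away from the lens, tracking FocusWhole, until the
    sharpness drops FocusWholeDrop below its peak or travel exceeds MoveDistanceMax."""
    best_val, best_z, travelled = -1.0, stage.z(), 0.0
    while travelled <= p["MoveDistanceMax"]:                      # S65: travel limit
        val = focus_whole(camera.grab(), center, around,
                          cen_power, around_powers)               # S61-S62
        if val > best_val:                                        # S64: track FocusWholeMax
            best_val, best_z = val, stage.z()
        if best_val - val >= p["FocusWholeDrop"]:                 # S65: past the peak
            break
        gap = p["FocusWholeRefB"] - val                           # S63: choose the next step
        if gap >= p["FocusWholeDiffB1"]:
            step = p["MoveDistanceB1"]
        elif gap >= p["FocusWholeDiffB2"]:
            step = p["MoveDistanceB2"]
        else:
            step = p["MoveDistanceB3"]                            # near or above FocusWholeRefB
        stage.move_z(step)                                        # away from the lens
        travelled += step
    stage.move_z_to(best_z)                                       # S66: FocusWholeBestPosition4
    return best_z
```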
Referring to Figs. 7 and 8, an optical lens alignment system for performing the above method includes:
an image acquisition module, connected to the imaging sensor and used for selecting the center of the image and/or a plurality of peripheral areas to capture images of the same characteristic shape;
a motion control module, used for controlling the XY-axis mechanism to drive the sensitive lens of the lens to move horizontally and controlling the Z-axis mechanism to drive the imaging sensor to move vertically;
a lens central focusing module, used for processing the acquired image data to complete lens focusing and to generate central focusing data;
a lens alignment module, used for processing the acquired image data to complete lens alignment and to generate alignment data;
and a lens defocusing module, used for processing the acquired image data to complete lens defocusing and to generate defocusing data.
Specifically, the lens central focusing module comprises a first sharpness calculation unit, a first imaging-sensor vertical-movement calculation and control unit, a first sharpness-peak judgment and positioning unit, and a focusing-curve output unit. The first sharpness calculation unit calculates the sharpness value of the central area of the current image; the first imaging-sensor vertical-movement calculation and control unit generates, from the comparison result of the current sharpness value, the instruction that drives the imaging sensor over the next movement distance; the first sharpness-peak judgment and positioning unit judges the sharpness peak and generates the instruction that drives the imaging sensor to the Z-axis position corresponding to the sharpness peak; and the focusing-curve output unit outputs the central focusing curve.
The lens alignment module comprises a second sharpness calculation unit, a lens plane-movement calculation and control unit, a lens coarse adjustment unit, a lens fine adjustment unit, and an alignment-curve output unit. The second sharpness calculation unit calculates the sharpness values of the center and the peripheral areas of the current image; the lens plane-movement calculation and control unit generates, from the comparison result of the current overall sharpness value, the instruction that drives the sensitive lens over the next movement distance; the lens coarse adjustment unit executes the lens coarse adjustment operation; the lens fine adjustment unit executes the lens fine adjustment operation; and the alignment-curve output unit outputs the alignment curve.
The lens defocusing module comprises a third sharpness calculation unit, a second imaging-sensor vertical-movement calculation and control unit, a second sharpness-peak judgment and positioning unit, and a defocusing-curve output unit. The third sharpness calculation unit calculates the sharpness values of the center and the peripheral areas of the current image; the second imaging-sensor vertical-movement calculation and control unit generates, from the comparison result of the current overall sharpness value, the instruction that drives the imaging sensor over the next movement distance; the second sharpness-peak judgment and positioning unit judges the sharpness peak and generates the instruction that drives the imaging sensor to the Z-axis position corresponding to the sharpness peak; and the defocusing-curve output unit outputs the defocusing curve.
The above preferred embodiments should be regarded as examples of the embodiments of the present application; technical deductions, substitutions, improvements and the like that are similar to or based on the embodiments of the present application should be regarded as falling within the protection scope of this patent.

Claims (5)

1. A lens alignment method based on three-axis movement, characterized by comprising the following steps:
moving the imaging sensor along the Z axis toward the lens to a focusing initial position, and then moving it away from the lens from the focusing initial position to perform a central focusing step;
in the central focusing step, collecting multiple groups of image sharpness values and the corresponding Z-axis positions, finally recording the Z-axis position FocusBestPosition corresponding to the maximum image sharpness, and moving the imaging sensor to FocusBestPosition after focusing is completed; that is to say,
calculating the sharpness value FocusVal of the central area of the current image before each movement;
comparing the sharpness value FocusVal with a preset maximum sharpness value FocusRef to control the distance of the next movement:
when the difference between the sharpness value FocusVal and the preset maximum sharpness value FocusRef is greater than or equal to the threshold FocusDiff1, the imaging sensor moves away from the lens by a distance MoveDistance1;
when the difference between the sharpness value FocusVal and the preset maximum sharpness value FocusRef is greater than or equal to the threshold FocusDiff2 and less than the threshold FocusDiff1, the imaging sensor moves away from the lens by a distance MoveDistance2;
when the sharpness value FocusVal approaches or exceeds the preset maximum sharpness value FocusRef, the imaging sensor moves away from the lens by a distance MoveDistance3, where "approaches" means that the difference between the sharpness value FocusVal and the preset maximum sharpness value FocusRef is less than the threshold FocusDiff3;
repeating the above operations, and recording the maximum sharpness FocusMax reached by the sharpness value FocusVal and the Z-axis position FocusBestPosition at which it occurs;
when the difference between the sharpness value FocusVal and the maximum sharpness FocusMax is greater than or equal to the threshold FocusDrop, focusing is completed; the imaging sensor is moved to the position FocusBestPosition after focusing is completed;
after central focusing is completed, moving the sensitive lens of the lens in the XY plane to perform lens alignment; the lens alignment step comprises a coarse adjustment step, a fine adjustment step and a finer adjustment step that are executed in sequence; wherein:
the coarse adjustment step, centered on the position FocusBestPosition, moves the sensitive lens within a set movement area to obtain the image sharpness values corresponding to the coordinate points, and finally records the position FocusWholeBestPosition1 corresponding to the maximum image sharpness; the sensitive lens is moved to the position FocusWholeBestPosition1 after coarse adjustment is completed;
the fine adjustment step, centered on the position FocusWholeBestPosition1, reduces the movement area of the sensitive lens to obtain the image sharpness values corresponding to the coordinate points in that area, and finally records the position FocusWholeBestPosition2 corresponding to the maximum image sharpness; the sensitive lens is moved to the position FocusWholeBestPosition2 after fine adjustment is completed;
the finer adjustment step, centered on the position FocusWholeBestPosition2, further reduces the movement area of the sensitive lens to obtain the image sharpness values corresponding to the coordinate points in that area, and finally records the position FocusWholeBestPosition3 corresponding to the maximum image sharpness; the sensitive lens is moved to the position FocusWholeBestPosition3 after the finer adjustment is completed;
after lens alignment is completed, the imaging sensor returns along the Z axis to the focusing initial position and starts to move away from the lens from the focusing initial position to execute a defocusing step; in the defocusing step, collecting multiple groups of image sharpness values and the corresponding Z-axis positions, and finally recording the Z-axis position FocusWholeBestPosition4 corresponding to the maximum image sharpness; the imaging sensor is moved to the position FocusWholeBestPosition4 after defocusing is completed.
2. The lens alignment method based on three-axis movement of claim 1, wherein: in the central focusing step, a central focusing curve is output with the Z-axis position or movement distance during focusing as the abscissa and the central-area sharpness value FocusVal as the ordinate.
3. The lens alignment method based on three-axis movement of claim 1, wherein the defocusing step performs the following operations:
acquiring an image before each movement, and calculating the sharpness value FocusValCen of the central area of the image and the sharpness values FocusValAround(n) of a plurality of peripheral areas, where n = 0, 1, 2, …;
calculating the overall sharpness value FocusWhole of the image: FocusWhole = FocusValCen × CenPower + ∑n FocusValAround(n) × AroundPower(n), n = 0, 1, 2, …; where CenPower and AroundPower(n) are the weighting coefficients of the respective sharpness values, each in the range [0, 1];
comparing the overall sharpness value FocusWhole with a preset maximum sharpness value FocusWholeRefB to control the distance of the next movement:
when the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefB is greater than or equal to the threshold FocusWholeDiffB1, the imaging sensor moves away from the lens by a distance MoveDistanceB1;
when the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefB is greater than or equal to the threshold FocusWholeDiffB2 and less than the threshold FocusWholeDiffB1, the imaging sensor moves away from the lens by a distance MoveDistanceB2;
when the overall sharpness value FocusWhole approaches or exceeds the preset maximum sharpness value FocusWholeRefB, the imaging sensor moves away from the lens by a distance MoveDistanceB3, where "approaches" means that the difference between the overall sharpness value FocusWhole and the preset maximum sharpness value FocusWholeRefB is less than the threshold FocusWholeDiffB3;
repeating the above operations, and recording the maximum overall sharpness FocusWholeMax reached by FocusWhole and the Z-axis position FocusWholeBestPosition4 at which it occurs;
when the difference between the recorded overall sharpness value FocusWhole and the maximum sharpness FocusWholeMax is greater than or equal to the threshold FocusWholeDrop, or when the total movement length exceeds the threshold MoveDistanceMax, defocusing is completed;
the imaging sensor is moved to the position FocusWholeBestPosition4 after defocusing is completed.
4. The lens alignment method based on three-axis movement of claim 3, wherein: in the defocusing step, a defocusing curve is output with the Z-axis position or movement distance during focusing as the abscissa and the sharpness value as the ordinate; the curves corresponding to the central-area sharpness value FocusValCen and the peripheral-area sharpness values FocusValAround(n) are marked with different colors or line types.
5. The lens alignment method based on three-axis movement of claim 1, wherein: the imaging sensor moves continuously toward the lens along the Z axis from the initial position; the central area of the image passes from blurred to sharp and back to blurred, the movement stops when the central area turns from sharp to blurred, and the current position is defined as the focusing initial position.
CN202111572046.7A 2021-07-13 2021-07-13 Lens core adjusting method based on triaxial movement Active CN114236745B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111572046.7A CN114236745B (en) 2021-07-13 2021-07-13 Lens core adjusting method based on triaxial movement

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111572046.7A CN114236745B (en) 2021-07-13 2021-07-13 Lens core adjusting method based on triaxial movement
CN202110790104.7A CN113406764B (en) 2021-07-13 2021-07-13 Optical lens aligning method and system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN202110790104.7A Division CN113406764B (en) 2021-07-13 2021-07-13 Optical lens aligning method and system

Publications (2)

Publication Number Publication Date
CN114236745A true CN114236745A (en) 2022-03-25
CN114236745B CN114236745B (en) 2024-04-05

Family

ID=77686097

Family Applications (5)

Application Number Title Priority Date Filing Date
CN202111603028.0A Active CN114236747B (en) 2021-07-13 2021-07-13 Adjusting method for optimal resolution position of lens
CN202110790104.7A Active CN113406764B (en) 2021-07-13 2021-07-13 Optical lens aligning method and system
CN202111572046.7A Active CN114236745B (en) 2021-07-13 2021-07-13 Lens core adjusting method based on triaxial movement
CN202111577417.0A Active CN114355556B (en) 2021-07-13 2021-07-13 Optical lens core adjusting system
CN202111585699.9A Active CN114236746B (en) 2021-07-13 2021-07-13 Full-automatic core adjusting method for sensitive lens of optical lens

Family Applications Before (2)

Application Number Title Priority Date Filing Date
CN202111603028.0A Active CN114236747B (en) 2021-07-13 2021-07-13 Adjusting method for optimal resolution position of lens
CN202110790104.7A Active CN113406764B (en) 2021-07-13 2021-07-13 Optical lens aligning method and system

Family Applications After (2)

Application Number Title Priority Date Filing Date
CN202111577417.0A Active CN114355556B (en) 2021-07-13 2021-07-13 Optical lens core adjusting system
CN202111585699.9A Active CN114236746B (en) 2021-07-13 2021-07-13 Full-automatic core adjusting method for sensitive lens of optical lens

Country Status (1)

Country Link
CN (5) CN114236747B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114745539A (en) * 2022-04-11 2022-07-12 苏州华星光电技术有限公司 Camera calibration method and calibration device
CN115278101B (en) * 2022-07-29 2024-02-27 维沃移动通信有限公司 Shooting method and device and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR19990039780A (en) * 1997-11-14 1999-06-05 전주범 How to adjust auto focus of digital camera
DE102013004120A1 (en) * 2012-03-09 2013-09-12 Htc Corporation Electronic device e.g. smart phone has lens module whose focus is adjusted when shift in focus condition of image frame is satisfied, and preprocessing unit is provided to process image frame to provide focus information of frame
CN106534676A (en) * 2016-11-02 2017-03-22 西安电子科技大学 Automatic focus adjustment method for zoom camera systems
CN111432125A (en) * 2020-03-31 2020-07-17 合肥英睿系统技术有限公司 Focusing method and device, electronic equipment and storage medium

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6360416A (en) * 1986-09-01 1988-03-16 Konica Corp Automatic focusing device
TWI292851B (en) * 2005-08-16 2008-01-21 Premier Image Technology Corp Focusing method for image-grabbing device
CN101762232B (en) * 2008-12-23 2012-01-25 鸿富锦精密工业(深圳)有限公司 Multi-surface focusing system and method
TWI459063B (en) * 2009-01-09 2014-11-01 Hon Hai Prec Ind Co Ltd System and method for focusing on multiple surface of an object
CN102308242B (en) * 2009-12-07 2014-08-20 松下电器产业株式会社 Imaging device and imaging method
CN103776831B (en) * 2012-10-18 2016-12-21 苏州惠生电子科技有限公司 A kind of micro-imaging detecting instrument and automatic focusing method thereof
CN103246131B (en) * 2013-05-20 2016-06-01 爱佩仪光电技术(深圳)有限公司 Utilization can control the focusing motor that camera lens tilts and realize the method for 3 dimension multi-region auto-focusing
CN103543574A (en) * 2013-10-23 2014-01-29 翔德电子科技(深圳)有限公司 Focusing and fine-adjusting method and system for variable-focal-length optical lens
CN106896622B (en) * 2015-12-21 2019-08-30 宁波舜宇光电信息有限公司 Bearing calibration based on multiple spurs from auto-focusing
CN108668118A (en) * 2017-03-31 2018-10-16 中强光电股份有限公司 Autofocus system, the projector with autofocus system and Atomatic focusing method
JP2018185453A (en) * 2017-04-27 2018-11-22 オリンパス株式会社 Microscope system
CN107529011B (en) * 2017-08-23 2018-07-10 珠海安联锐视科技股份有限公司 A kind of motorized zoom lens control method
CN110632735A (en) * 2019-08-16 2019-12-31 俞庆平 Method for searching optimal focal plane in laser direct imaging system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR19990039780A (en) * 1997-11-14 1999-06-05 전주범 How to adjust auto focus of digital camera
DE102013004120A1 (en) * 2012-03-09 2013-09-12 Htc Corporation Electronic device e.g. smart phone has lens module whose focus is adjusted when shift in focus condition of image frame is satisfied, and preprocessing unit is provided to process image frame to provide focus information of frame
CN106534676A (en) * 2016-11-02 2017-03-22 西安电子科技大学 Automatic focus adjustment method for zoom camera systems
CN111432125A (en) * 2020-03-31 2020-07-17 合肥英睿系统技术有限公司 Focusing method and device, electronic equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LI HENG; HUANG YONGMEI; AN TAO: "Research on the autofocus algorithm of a tracking telescope based on image sharpness evaluation", Optical Instruments, No. 05, 15 October 2010 (2010-10-15) *

Also Published As

Publication number Publication date
CN114355556A (en) 2022-04-15
CN113406764A (en) 2021-09-17
CN113406764B (en) 2022-01-28
CN114355556B (en) 2024-04-05
CN114236745B (en) 2024-04-05
CN114236747A (en) 2022-03-25
CN114236747B (en) 2024-04-05
CN114236746A (en) 2022-03-25
CN114236746B (en) 2024-04-05

Similar Documents

Publication Publication Date Title
CN113406764B (en) Optical lens aligning method and system
DE68905051T2 (en) Device for detecting tremors in the picture.
CN100550989C (en) Self focusing device, image capture apparatus and auto-focus method
CN108156371B (en) Infrared automatic focusing fast searching method
CN111899164B (en) Image splicing method for multi-focal-segment scene
DE102016120954A1 (en) Imaging device and imaging method
CN103984199B (en) Adjustment and application method for tilting compensation control of lens of automatic focusing camera module
CN108600638B (en) Automatic focusing system and method for camera
CN108519654A (en) A kind of Atomatic focusing method based on electro-hydraulic adjustable zoom lens
CN117555123B (en) Automatic focusing method and device for electron microscope
JP2833836B2 (en) Autofocus method for scanning electron microscope
JP3981778B2 (en) Visual feedback method of mechanical device using imaging device and mechanical device
CN103529544A (en) Nano membrane thickness measuring instrument capable of automatically positioning and focusing
DE102018123402A1 (en) IMAGE FOG CORRECTION DEVICE AND CONTROL PROCEDURE
JP2019161553A (en) Imaging system, imaging apparatus, control method of them, program, and storage medium
CN114815211B (en) Microscope automatic focusing method based on image processing
JP3407366B2 (en) camera
JP7467084B2 (en) Image capture device, image capture device control method and program
JP4072226B2 (en) Imaging device, control method thereof, and storage medium
CN117041710B (en) Coke follower and control method thereof
KR102323136B1 (en) 3D EDOF scanning apparatus adapting flow scan
CN116300265A (en) Camera automatic focusing adjustment method of manipulator
JP3610097B2 (en) Imaging device
CN115696043A (en) Automatic focusing system based on LabVIEW
CN117170081A (en) Automatic focusing method based on image definition identification

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant