CN111726527A - Focus testing method and device - Google Patents

Focus testing method and device Download PDF

Info

Publication number
CN111726527A
CN111726527A CN202010579698.2A CN202010579698A CN111726527A CN 111726527 A CN111726527 A CN 111726527A CN 202010579698 A CN202010579698 A CN 202010579698A CN 111726527 A CN111726527 A CN 111726527A
Authority
CN
China
Prior art keywords
focus
camera module
far
edge
limit value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010579698.2A
Other languages
Chinese (zh)
Other versions
CN111726527B (en
Inventor
董龙声
余赞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Moqin Intelligent Technology Co ltd
Original Assignee
Shanghai Moqin Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Moqin Intelligent Technology Co ltd filed Critical Shanghai Moqin Intelligent Technology Co ltd
Priority to CN202010579698.2A priority Critical patent/CN111726527B/en
Publication of CN111726527A publication Critical patent/CN111726527A/en
Application granted granted Critical
Publication of CN111726527B publication Critical patent/CN111726527B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/002Diagnosis, testing or measuring for television systems or their details for television cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/671Focus control based on electronic image sensor signals in combination with active ranging signals, e.g. using light or sound signals emitted toward objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

The embodiment of the invention provides a focus test method and a device, wherein the method comprises the following steps: and calling a driving code of the target camera module, wherein the driving code is provided with a near-focus edge upper limit value and a far-focus edge lower limit value, and the near-focus edge upper limit value and the far-focus edge lower limit value are determined by the three-attitude data of the batch of original camera modules. And the focal length of the target camera module in the horizontal direction is not recorded. And then driving a lens of the target camera module to search the focus of the target camera module according to the upper limit value of the near-focus edge and the lower limit value of the far-focus edge. According to the technical scheme, the upper limit value of the near-focus edge and the lower value of the far-focus edge are determined according to the three-posture data, and therefore the accurate focus test range is determined. The method and the device avoid errors caused by the fact that the target camera module carries out focus test according to an inaccurate focus test range in the prior art, and improve the accuracy of the focus test. And this technical scheme does not burn the focus of the module horizontal direction of making a video recording, has practiced thrift module manufacturing cost.

Description

Focus testing method and device
Technical Field
The invention relates to the field of automatic focusing of cameras, in particular to a focus testing method and device.
Background
The focus test method for the camera in the existing electronic equipment by using the open-loop motor automatic focusing module comprises the following steps: and selecting partial classic modules from the modules of a plurality of batches, and determining the clearest focus value by the lenses in all the modules according to the near-focus value and the far-focus value of the classic modules in the horizontal direction.
However, the above-mentioned debugging method only considers the focal length data in the horizontal direction, so that errors caused by gravity to the focal point test of the lens cannot be avoided, for example, due to gravity, the near-focus value and the far-focus value in the horizontal direction of the lens are inaccurate, and thus an accurate focal point test range cannot be obtained, which causes that some modules are mistaken as unqualified products because the clearest focal point value cannot be found in the actually correct focal point test range.
Therefore, a focus test method and apparatus are needed to reduce focus test errors, accurately determine focus values of sharp points of each lens, and improve focus test accuracy.
Disclosure of Invention
The embodiment of the invention provides a focus testing method and device, which are used for achieving the effect of accurately determining the focus value of a clear point of each lens.
In a first aspect, an embodiment of the present invention provides a focus test method, including:
the client calls a driving code of the target camera module, wherein the driving code is provided with a near-focus edge upper limit value and a far-focus edge lower limit value, and the near-focus edge upper limit value in the driving code and the far-focus edge lower limit value in the driving code are determined by three-posture data of the batch of original camera modules, wherein the three-posture data comprise a horizontal-direction focal length, an upward-direction focal length and a downward-direction focal length of each original camera module. And the focal length of the target camera module in the horizontal direction is not recorded in the target camera module. Further, the client drives the lens of the target camera module through the driving code so as to search the focus of the target camera module.
In the embodiment of the invention, the client determines the upper limit value of the near-focus edge in the driving code and the lower limit value of the far-focus edge in the driving code through the three-attitude data of the batch of original camera modules, then the two data are input into the corresponding driving codes, and the driving codes drive the target camera module to search for the clear focus value, so that the situation that the target camera module cannot reach the position of the focus of the clearest point due to the fact that the focus of the original camera module in the upward direction and the focus of the original camera module in the downward direction are not collected in the prior art is reduced. Meanwhile, in the prior art, the focus in the horizontal direction of each camera module is usually programmed into the register. The manufacturing cost who leads to is high, and should not used by the focus of the horizontal direction of burning record usually, so still have among the prior art to burn record extravagant problem, improved this in the scheme of this application, do not burn the focus of horizontal direction, improve the production efficiency of module, reduce the manufacturing cost of module.
In one possible embodiment, the focal length in the horizontal direction includes a horizontal near focus, a horizontal far focus; the focal length in the upward direction at least comprises an upward near focus; the focal length in the downward direction includes at least a downward far focus.
Further, the upper limit value of the near-focus edge in the driver code and the lower limit value of the far-focus edge in the driver code are determined by three-attitude data of the original camera modules of the batch, and the method includes:
the client side determines the near focus edge value of the original camera module according to the horizontal near focus, the horizontal far focus and the downward near focus of the camera module aiming at each original camera module. And further, the client determines the far focus edge value of the original camera module according to the horizontal near focus, the horizontal far focus and the upward far focus of the original camera module. And finally, the client determines the upper limit value of the near focus edge in the drive code and the lower limit value of the far focus edge in the drive code according to the near focus edge value and the far focus edge value of each original camera module.
In the embodiment of the invention, the client determines the near focus edge value corresponding to each original camera module of the batch according to the three-attitude data of each camera module of the batch, such as the horizontal near focus, the horizontal far focus and the downward near focus, and then determines the far focus edge value according to the horizontal near focus, the horizontal far focus and the upward far focus of each original camera module of the batch. And finally, the client determines a near-focus edge upper limit value and a far-focus edge lower limit value from the near-focus edge values and the far-focus edge values of the original camera modules in the batch. In the technical scheme, the client calculates the near-focus edge value and the far-focus edge value by collecting the batch of three-posture data, provides the maximum and reasonable near-focus edge upper limit value and far-focus edge value (corresponding to the focus test range of the driving target camera module) for each original camera module, reduces the problem that in the prior art, the focus test range of the driving code is determined according to the horizontal near focus and the horizontal far focus of partial modules in the batch of original camera modules, so that the partial camera modules cannot find the clearest focus, and improves the accuracy of focus test.
Further, in a possible embodiment, the determining, by the client, the far focus edge value Xi of the original camera module according to the horizontal near focus, the horizontal far focus, and the downward far focus of the original camera module includes:
Figure BDA0002551954510000031
FD (INF) is downward far focus, FF (INF) is horizontal far focus, and FF (MARCRO) is horizontal near focus.
In the embodiment of the invention, the client determines the far focus edge value Xi of the original camera module through the first formula, so that preparation is made for determining the accurate focus test range of the drive code in the subsequent steps, and the problem that the focus test range of the drive code determined according to the horizontal near focus and the horizontal far focus of the classic module in the prior art is inaccurate is solved.
Further, in a possible embodiment, the determining, by the client, a near-focus edge value Yi of the original camera module according to the horizontal near focus, the horizontal far focus, and the upward near focus of the original camera module includes:
Figure BDA0002551954510000032
wherein FU (MARCRO) is upward near focus, FF (MARCRO) is horizontal near focus, and FF (INF) is horizontal far focus.
In the embodiment of the invention, the client determines the near-focus edge value Yi of the original camera module through the formula II, so that preparation is made for determining the accurate focus test range of the drive code in the subsequent steps, and the problem that the focus test range of the drive code determined according to the horizontal near focus and the horizontal far focus of the classical module in the prior art is inaccurate is solved.
In a possible embodiment, the determining, by the client, the near-focus edge upper limit value and the far-focus edge lower limit value according to the near-focus edge value and the far-focus edge value of each original camera module includes:
the client determines the maximum value in the near-focus edge values of each original camera module as the upper limit value of the near-focus edge; and the client determines the minimum value in the far focus edge values of each original camera module as the far focus edge lower limit value.
In the embodiment of the invention, the client determines the accurate focus test range of the drive code according to the determined near focus edge value and far focus edge value of each module in the batch of original camera modules, so that the problem that the focus test range of the drive code determined according to the horizontal near focus and the horizontal far focus of the classical module in the prior art is inaccurate is solved, the focus test efficiency of the camera modules is improved, and the focus test error caused by the inaccurate focus test range of the drive code is reduced.
In one possible embodiment, the raw camera modules of a batch satisfy the following conditions:
the far focus edge value of the original camera module is in a first interval, and the near focus edge value of the original camera module is in a second interval.
In the embodiment of the invention, before the client performs the focus test on the target camera module, the original camera modules of the batch meeting the requirements are determined according to the formula I and the formula II, so that the near-focus edge value and the far-focus edge value of each module in the original camera modules of the subsequent batch meet the requirements, the accurate focus test range of the drive codes determined in the subsequent step is prepared, and the focus test efficiency of the target camera module is improved.
In a possible embodiment, after the client drives the lens of the target camera module through the driver code so as to find the focus of the target camera module, the method further includes:
and if the focus of the target camera shooting module is not found, determining that the target camera shooting module fails the test.
In the embodiment of the invention, the focus test is carried out on each target camera module according to the focus test range input in the drive code, the focus value of the clearest point is searched, and when the focus value is not searched, the corresponding target camera module is determined to fail the test. Through the focus test method, the target camera shooting module is screened, the camera shooting module meeting the requirement is further determined, and the quality of the obtained camera shooting module is improved.
In a second aspect, an embodiment of the present invention further provides a focus testing apparatus, including:
and the calling unit is used for calling the driving code of the target camera shooting module, which is provided with the near focus edge value and the far focus edge value. And determining the upper limit value of the near-focus edge and the lower limit value of the far-focus edge according to the three-attitude data of the original camera modules of the batch. The three-posture data comprise a focal length in the horizontal direction, a focal length in the upward direction and a focal length in the downward direction of each original camera module; and the focal length of the target camera module in the horizontal direction is not recorded in the target camera module.
And the processing unit is used for driving the lens of the target camera shooting module through the driving code so as to search the focus of the target camera shooting module.
The processing unit is specifically used for determining a near focus edge value of each original camera module according to a horizontal near focus, a horizontal far focus and a downward near focus of the camera module; and determining the far focus edge value of the original camera module according to the horizontal near focus, the horizontal far focus and the upward far focus of the original camera module. Further, according to a near-focus edge value and a far-focus edge value of each original camera module, determining the near-focus edge upper limit value and the far-focus edge lower limit value.
The processing unit is specifically configured to determine a far focus edge value Xi of the original camera module according to a horizontal near focus, a horizontal far focus, and a downward far focus of the original camera module, and includes:
Figure BDA0002551954510000051
FD (INF) is downward far focus, FF (INF) is horizontal far focus, and FF (MARCRO) is horizontal near focus.
The processing unit is specifically configured to determine a near focus edge value Yi of the original camera module according to a horizontal near focus, a horizontal far focus, and an upward near focus of the original camera module, and includes:
Figure BDA0002551954510000052
wherein FU (MARCRO) is upward near focus, FF (MARCRO) is horizontal near focus, and FF (INF) is horizontal far focus.
The processing unit is specifically configured to determine a maximum value of the near-focus edge values of each original camera module as the near-focus edge upper limit value, and determine a minimum value of the far-focus edge values of each original camera module as the far-focus edge lower limit value.
The original camera modules of the batch meet the following conditions: the far focus edge value of the original camera module is in a first interval, and the near focus edge value of the original camera module is in a second interval.
The processing unit is further configured to determine that the target camera module fails the test when the focus of the target camera module is not found.
In a third aspect, another embodiment of the present invention provides a computing device, which includes a memory and a processor, wherein the memory is used for storing a computer program, and the processor is used for calling the program stored in the memory, and executing any one of the methods in the first aspect according to the obtained program.
In a fourth aspect, another embodiment of the present invention provides a computer storage medium, where a computer-executable program is stored, and the computer-executable program is used to make a computer execute any one of the methods in the first aspect.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic diagram illustrating a system architecture of a focus test according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart illustrating a method for focus testing according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of driver code provided in an embodiment of the present application;
fig. 4 is a schematic diagram of a driving architecture according to an embodiment of the present application;
fig. 5 is a schematic flow chart illustrating a method for testing a target camera module according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram illustrating an apparatus for focus testing according to an embodiment of the present disclosure;
fig. 7 is a schematic diagram of a computing device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application clearer, the present application will be described in further detail with reference to the accompanying drawings, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the prior art, a method for calculating auto focus debugging of a camera module in an equipment research and development stage is as follows: the method comprises the steps of selecting a part of classical camera modules in a large batch of produced modules, determining a focus test range of a drive code according to a far focus value of a clearest point in the horizontal direction of the part of classical camera modules and a near focus value of the clearest point in the horizontal direction, and determining a focus value of the clearest point by each target camera module according to the focus test range of the drive code. However, in the above focus test method, the influence of gravity on the accuracy of the clearest far focus value in the horizontal direction and the clearest near focus value in the horizontal direction of the camera module is ignored, so that the focus test range in the obtained drive code is inaccurate, and further, the clearest focus value cannot be found in the focus test range by some target camera modules, so that the focus test result is influenced, and some target camera modules are mistakenly regarded as unqualified camera modules, so that the accuracy of focus test is reduced.
Illustratively, an embodiment of the present invention provides a system architecture diagram of a focus test, as shown in fig. 1, including a client 10 and a driving device 20, where the client 10 is connected to the driving device 20, the client 10 instructs the driving device 20 to control a driving motor therein to push a lens in a target camera module according to a driving code, so as to perform focusing, and when the lens is focused at a specified distance to a clear point, the client 10 records a focus value of the lens at that time.
In order to solve the problems in the prior art, an embodiment of the present application provides a flowchart of a method for focus test, as shown in fig. 2, including:
step 201, the client 10 calls a drive code of the target camera module, which is provided with a near-focus edge value and a far-focus edge value, wherein an upper limit value of the near-focus edge and a lower limit value of the far-focus edge are determined by three-attitude data of the batch of original camera modules; the three-posture data comprise a focal length in the horizontal direction, a focal length in the upward direction and a focal length in the downward direction of each original camera module; the focal length of the horizontal direction of the camera module is not recorded in each camera module.
For example, the prior art typically records the focal length of the camera module in the horizontal direction into the register. The manufacturing cost who leads to is high, and should not used by the focus of the horizontal direction of burning record usually, so have the extravagant problem of burning record among the prior art, improved this in the scheme of this application, do not burn the focus of horizontal direction, improve the production efficiency of the module of making a video recording, reduce the manufacturing cost of module.
Illustratively, the client 10 invokes the driver code as shown in fig. 3, in which the near-focus edge value MACRO _ MARGI set in the driver code is 0.4 and the far-focus edge value INF _ MARGI is-0.2.
In step 202, the client 10 drives the lens of the target camera module through the driving code so as to find the focus of the target camera module.
Illustratively, the client 10 instructs the driving device 20 to generate different currents according to the driving code in fig. 3 and the near focus edge value 0.4 and the far focus edge value-0.2, so as to control the driving motor therein to push the lens in the target camera module to move to the corresponding position, and further, when the lens is focused to a clear point at a specified distance, determines the focus of the lens at that time.
Illustratively, when a user performs code scanning payment, the payment two-dimensional code of the mobile phone is close to the lens of the code scanning machine, the code scanning distance used by the user is generally between 10cm and 20cm, and the optimal code scanning distance is set to be 15cm in the embodiment of the invention. That is to say, when the focus test is performed on the camera module, the position 15cm away from the right front of the lens is set as an ideal focus, when the target camera module realizes clear focusing at the position, the focus of the lens at the moment is recorded, so that the focus test process of the camera module is completed, and the camera module is determined to be a qualified product.
Illustratively, in conjunction with the schematic diagram of the driving architecture shown in fig. 4, the driving motor in the driving device 20 pushes the lens in the target camera module a, and when the lens is focused to a sharp point at a specified distance, such as 15cm, directly in front of the lens, the focus of the lens at that time is determined. The above steps realize that the client 10 pushes the lens according to the focus test range input in advance in the driver code, and help to accurately determine the focus of the lens.
Optionally, each camera module includes a lens and a focusing motor, and the driving device 20 can adjust the position of the lens by controlling the current in the focusing motor, so as to implement the focus test on the lens. Alternatively, the focus motor may be driven to a different position by a different current, i.e., the focus of the lens can be controlled by changing the value of the current input to the focus motor.
In a possible embodiment, before step 101, the determining, by the client 10, the near-focus edge upper limit value and the far-focus edge lower limit value according to the three-attitude data of the batch of original camera modules includes:
the client 10 determines, for each original camera module, a near focus edge value of the original camera module according to the horizontal near focus, the horizontal far focus, and the downward near focus of the original camera module; the far focus edge value of the original camera module is determined according to the horizontal near focus, the horizontal far focus and the upward far focus of the original camera module, and further, the client 10 determines the near focus edge upper limit value and the far focus edge lower limit value according to the near focus edge value and the far focus edge value of each original camera module. The focal length in the horizontal direction comprises a horizontal near focus and a horizontal far focus, the focal length in the upward direction at least comprises an upward near focus, and the focal length in the downward direction at least comprises a downward far focus.
Illustratively, the client 10 first imports three-pose data of each original camera module of the first batch, as shown in table 1 below, which includes three-pose data of 5 original camera modules of the first batch (where the three-pose data of the lens may include a near focus and a far focus of the lens in the horizontal direction, and a near focus and a far focus of the lens in the horizontal direction).
TABLE 1
Figure BDA0002551954510000091
The client 10 determines the far focus edge value and the near focus edge value of each module according to the three-attitude data of the original camera module. And determining an upper limit value of a near focus edge and a lower limit value of the far focus edge according to each far focus edge value and each near focus edge value of the original camera modules of the first batch in the subsequent step, namely obtaining a focus test range of each target camera module in the subsequent step, realizing accurate test of the focus of each target camera module, and improving the test accuracy.
Further, in a possible embodiment, the determining, by the client 10, the far focus edge value Xi of the original camera module according to the horizontal near focus, the horizontal far focus, and the downward far focus of the original camera module includes:
Figure BDA0002551954510000092
FD (INF) is downward far focus, FF (INF) is horizontal far focus, and FF (MARCRO) is horizontal near focus.
Exemplarily, in combination with the three-attitude data of the original camera module shown in table 1, for the camera module 1 therein, the far focus edge value of the camera module 1 is determined according to the formula one
Figure BDA0002551954510000101
Similarly, the far focus edge value of the camera module 2 is calculated
Figure BDA0002551954510000102
Far focus edge value of camera module 3
Figure BDA0002551954510000103
Far focus edge value of camera module 4
Figure BDA0002551954510000104
Far focus edge value of camera module 5
Figure BDA0002551954510000105
Through the steps, the corresponding far focus edge value is determined according to the downward far focus of the original camera shooting module, the accurate focus test range is determined in the subsequent steps, the situation that a clear focus cannot be found at a specified position (such as a position 15cm right in front of a lens) due to the inaccurate focus test range of part of target camera shooting modules is avoided, and the accuracy of focus test is improved.
Further, in a possible embodiment, the determining, by the client 10, a near-focus edge value Yi of the original camera module according to the horizontal near focus, the horizontal far focus, and the upward near focus of the original camera module includes:
Figure BDA0002551954510000106
wherein FU (MARCRO) is upward near focus, FF (MARCRO) is horizontal near focus, and FF (INF) is horizontal far focus.
Exemplarily, in combination with the three-attitude data of the original camera module shown in table 1, for the camera module 1 therein, the near-focus edge value of the camera module 1 is determined according to formula one
Figure BDA0002551954510000107
Similarly, the near focus edge value of the camera module 2 is calculated
Figure BDA0002551954510000108
Near focus edge value of camera module 3
Figure BDA0002551954510000109
Near-focus edge value of camera module 4
Figure BDA00025519545100001010
Near-focus edge value of camera module 5
Figure BDA00025519545100001011
Through the steps, the corresponding near focus edge value is determined according to the upward near focus of the original camera shooting module, the accurate focus test range is determined in the follow-up steps, the situation that a clear focus cannot be found at a specified position (such as a position 15cm right in front of a lens) due to the inaccurate focus test range of a part of target camera shooting modules is avoided, and the accuracy of focus test is improved.
Illustratively, based on table 1, according to the formula one and the formula two, the far focus edge value and the near focus edge value of each original camera module in the batch are obtained as shown in table 2.
TABLE 2
Figure BDA0002551954510000111
Further, in a possible embodiment, the determining, by the client 10, a near-focus edge upper limit value and a far-focus edge lower limit value according to the near-focus edge value and the far-focus edge value of each original camera module includes:
and determining the maximum value in the near-focus edge values of each original camera module as a near-focus edge upper limit value, and determining the minimum value in the far-focus edge values of each original camera module as a far-focus edge lower limit value.
Illustratively, the maximum value among the near-focus edge values, that is, the near-focus edge value 0.29 of the camera module 1 is determined as the near-focus edge upper limit value, while the minimum value among the far-focus edge values, that is, the far-focus edge value-0.33 of the camera module 3 is determined as the far-focus edge lower limit value. That is, the focus test range of the driver code is determined as: -0.33 to 0.29. And each target camera module in the subsequent step is helped to determine the focus of a clear point according to the focus test range in the step, so that the accuracy of focus test is improved.
Further, in a possible embodiment, each camera module may satisfy the following condition: the far focus edge value of the camera module is in the first interval, and the near focus edge value of the camera module is in the second interval.
Exemplarily, before focus test is carried out on each camera module, the original camera modules are screened, and the original camera modules meeting the requirements are determined. For example, for the camera module 1, determining that the corresponding far focus edge value-3.1 is in a first interval-0.6 to 0 according to a formula I; and determining that the corresponding near-focus edge value is within the second interval of 0 to 0.6 according to a formula II, and further determining that the camera module 1 is an original camera module meeting the requirements and performing focus test. Through carrying out above-mentioned screening to original module of making a video recording, get rid of unqualified original module of making a video recording, follow-up can be according to qualified each original module of making a video recording and determine accurate focus test scope, the help improves focus test accuracy.
Further, in a possible embodiment, after the step 102, the method further includes: if the client 10 does not find the focus of the target camera module, it is determined that the target camera module fails the test.
Illustratively, after determining an accurate focus test range according to a batch of original camera modules, the client 10 performs a focus test on each target camera module, where the focus test range is input into a drive code to drive the drive code, so as to control the current magnitude in the drive device 20, further push a lens in the target camera module according to the current magnitude, if the lens is pushed to a certain position, implement clear focusing at a specified distance, record the position of the lens at that time and determine a corresponding focus, thereby completing the focus test. However, when all positions to which the lens is pushed according to the focus test range cannot achieve sharp focusing at the specified distance, it is determined that the target camera module fails the focus test, and the target camera module may have problems and needs to be further analyzed by the user. Through the steps, the focus test of the target camera module is accurately carried out, qualified products are found out, and the improvement of the product quality is facilitated.
Based on the above description, an embodiment of the present invention further provides a schematic flow chart of a method for testing a target camera module, as shown in fig. 5, where the method includes:
in step 501, the client 10 collects three-pose data of the original camera module.
For example, the client 10 may directly collect the three-pose data of each original camera module as shown in table 1.
In step 502, the client 10 determines whether the three-gesture data is valid, and if so, proceeds to step 503, otherwise, proceeds to step 506.
For example, the client 10 processes the three-pose data of the original camera module, for example, determines a far focus edge value and a near focus edge value of the camera module according to a formula i and a formula ii, and further determines that the three-pose data of the original camera module is valid when the far focus edge value is within a first interval and the near focus edge value is within a second interval.
In step 503, the client 10 drives the lens of the target camera module to find the focus according to the focus test range in the driver code.
Illustratively, the upper limit value of the near-focus edge is determined to be 0.29 and the lower limit value of the far-focus edge is determined to be-0.33 according to the batch original camera module, and then the focus test range in the driver code is-0.33 to 0.29. Further, the client 10 drives a preset driver code, and the driver code instructs the driving device 20 to push the lens of the target camera module according to the focus test range-0.33 to 0.29 input therein, so that the lens searches for a focus.
In step 504, the client 10 determines whether the focus test has passed. When passing, step 505 is entered, otherwise step 506 is entered.
Illustratively, the lens looks to see whether the lens achieves sharp focusing at a preset distance according to a focus test range of-0.33 to 0.29 in the driver code, for example, as shown in fig. 4 in conjunction with the above description, when the lens in the camera module a can achieve sharp focusing on an image b located 15cm directly in front of the lens, the focus at this time is determined, and the test is considered to pass, and the target camera module is qualified. Otherwise, the focus test of the target camera module is not passed, and the process proceeds to step 506.
And 505, when the focus test of the target camera module passes, using the target camera module to perform subsequent production.
Step 506, further analyzing the camera module, and analyzing the camera module for possible problems.
Optionally, the user can check the camera module, and it may be that there is a foreign object stuck inside the camera module to cause the test to fail.
Based on the above description, the embodiment of the present invention further provides a focus test apparatus as shown in fig. 6, which includes a calling unit 601 and a processing unit 602.
The calling unit 601 is configured to call a drive code of the target camera module, where the drive code is provided with a near focus edge value and a far focus edge value. And the upper limit value of the near-focus edge and the lower limit value of the far-focus edge are determined by the three-attitude data of the batch of camera modules. The three-posture data comprise a focal length in the horizontal direction, a focal length in the upward direction and a focal length in the downward direction of each camera module; the focal length of the horizontal direction of the camera module is not recorded in the camera module.
The processing unit 602 is configured to drive the lens of the target camera module through the driving code so as to find a focus of the target camera module.
The processing unit 602 is specifically configured to determine, for each original camera module, a near-focus edge value of the original camera module according to a horizontal near focus, a horizontal far focus, and a downward near focus of the original camera module; and determining the far focus edge value of the original camera module according to the horizontal near focus, the horizontal far focus and the upward far focus of the original camera module. Further, according to the near-focus edge value and the far-focus edge value of each original camera module, determining a near-focus edge upper limit value and a far-focus edge lower limit value.
The processing unit 602 is specifically configured to determine a far focus edge value Xi of the original camera module according to a horizontal near focus, a horizontal far focus, and a downward far focus of the original camera module:
Figure BDA0002551954510000141
FD (INF) is downward far focus, FF (INF) is horizontal far focus, and FF (MARCRO) is horizontal near focus.
The processing unit 602 is specifically configured to determine a near focus edge value Yi of the original camera module according to a horizontal near focus, a horizontal far focus, and an upward near focus of the original camera module:
Figure BDA0002551954510000142
wherein FU (MARCRO) is upward near focus, FF (MARCRO) is horizontal near focus, and FF (INF) is horizontal far focus.
The processing unit 602 is specifically configured to determine a maximum value of the near-focus edge values of each original camera module as a near-focus edge upper limit value, and determine a minimum value of the far-focus edge values of each original camera module as a far-focus edge lower limit value.
The original camera modules of the batch meet the following conditions: the far focus edge value of the original camera module is in the first interval, and the near focus edge value of the original camera module is in the second interval.
The processing unit 602 is further configured to determine that the target camera module fails the test when the focus of the target camera module is not found.
Fig. 7 is a schematic diagram of a computing device provided in an embodiment of the present application, including:
one or more processors 710 and a memory 720, one processor 710 being illustrated in fig. 7. The electronic device performing the focus test method may further include: an input device 730 and an output device 740.
The processor 710, the memory 720, the input device 730, and the output device 740 may be connected by a bus or other means, such as the bus connection in fig. 7.
The memory 720, which is a non-volatile computer-readable storage medium, may be used for storing non-volatile software programs, non-volatile computer-executable programs, and modules, such as program instructions/modules corresponding to the focus test method in the embodiments of the present application. The processor 710 executes various functional applications and data processing of the server by executing nonvolatile software programs, instructions and modules stored in the memory 720, so as to implement the above-described method embodiment focus test method.
The memory 720 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of the focus test apparatus, and the like. Further, the memory 720 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some embodiments, memory 720 optionally includes memory remotely located from processor 710, and these remote memories may be connected to the frame injection focus test apparatus via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 730 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the focus test device. The output device 740 may include a display device such as a display screen.
The one or more modules are stored in the memory 720 and, when executed by the one or more processors 710, perform a focus test method in any of the method embodiments described above.
The product can execute the method provided by the embodiment of the application, and has the corresponding functional modules and beneficial effects of the execution method. For technical details that are not described in detail in this embodiment, reference may be made to the methods provided in the embodiments of the present application.
As will be appreciated by one skilled in the art, embodiments of the present application are provided as a method, apparatus (device), or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (devices) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. A focus test method, the method comprising:
calling a driving code of the target camera module; the driving code is provided with a near focus edge upper limit value and a far focus edge lower limit value; the upper limit value of the near-focus edge and the lower limit value of the far-focus edge are determined by three-attitude data of the original camera modules of the batch; the three-posture data comprise a focal length in the horizontal direction, a focal length in the upward direction and a focal length in the downward direction of each original camera module; the focal length of the target camera module in the horizontal direction is not recorded in the target camera module;
and driving the lens of the target camera shooting module through the driving code so as to search the focus of the target camera shooting module.
2. The method of claim 1, wherein the focal length in the horizontal direction comprises a horizontal near focus, a horizontal far focus; the focal length in the upward direction at least comprises an upward near focus; the focal length in the downward direction includes at least a downward far focus;
the upper limit value of the near-focus edge and the lower limit value of the far-focus edge are determined by three-attitude data of the original camera modules of the batch, and the method comprises the following steps:
for each original camera module, determining a near focus edge value of the original camera module according to a horizontal near focus, a horizontal far focus and a downward near focus of the original camera module; determining a far focus edge value of the original camera module according to the horizontal near focus, the horizontal far focus and the upward far focus of the original camera module;
and determining the upper limit value of the near focus edge and the lower limit value of the far focus edge according to the near focus edge value and the far focus edge value of each original camera module.
3. The method as claimed in claim 1, wherein the determining the far focus edge value Xi of the original camera module according to the horizontal near focus, the horizontal far focus and the downward far focus of the original camera module comprises:
Figure FDA0002551954500000011
FD (INF) is downward far focus, FF (INF) is horizontal far focus, and FF (MARCRO) is horizontal near focus.
4. The method according to claim 1, wherein the determining the near focus edge value Yi of the original camera module from the horizontal near focus, the horizontal far focus, and the upward near focus of the original camera module,
Figure FDA0002551954500000021
wherein FU (MARCRO) is upward near focus, FF (MARCRO) is horizontal near focus, and FF (INF) is horizontal far focus.
5. The method of claim 2, wherein determining the near edge upper limit value and the far edge lower limit value according to the near edge value and the far edge value of each original camera module comprises:
determining the maximum value in the near-focus edge values of each original camera module as the upper limit value of the near-focus edge;
and determining the minimum value in the far focus edge values of each original camera module as the far focus edge lower limit value.
6. The method of claim 2, wherein the batches of raw camera modules satisfy the following condition:
the far focus edge value of the original camera module is in a first interval, and the near focus edge value of the original camera module is in a second interval.
7. The method according to any one of claims 1 to 6, wherein after the driving the lens of the target camera module by the driver code to find the focus of the target camera module, further comprises:
and if the focus of the target camera shooting module is not found, determining that the target camera shooting module fails the test.
8. A focus test apparatus, comprising:
the calling unit is used for calling a driving code of the target camera module; the driving code is provided with a near focus edge upper limit value and a far focus edge lower limit value; the upper limit value of the near-focus edge and the lower limit value of the far-focus edge are determined by three-attitude data of the original camera modules of the batch; the three-posture data comprise a focal length in the horizontal direction, a focal length in the upward direction and a focal length in the downward direction of each original camera module; the focal length of the target camera module in the horizontal direction is not recorded in the target camera module;
and the processing unit is used for driving the lens of the target camera shooting module through the driving code so as to search the focus of the target camera shooting module.
9. A computing device, comprising:
a memory for storing a computer program;
a processor for calling a computer program stored in said memory, for executing the method of any one of claims 1 to 7 in accordance with the obtained program.
10. A computer-readable non-transitory storage medium including a computer-readable program which, when read and executed by a computer, causes the computer to perform the method of any one of claims 1 to 7.
CN202010579698.2A 2020-06-23 2020-06-23 Focus testing method and device Active CN111726527B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010579698.2A CN111726527B (en) 2020-06-23 2020-06-23 Focus testing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010579698.2A CN111726527B (en) 2020-06-23 2020-06-23 Focus testing method and device

Publications (2)

Publication Number Publication Date
CN111726527A true CN111726527A (en) 2020-09-29
CN111726527B CN111726527B (en) 2021-11-16

Family

ID=72568314

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010579698.2A Active CN111726527B (en) 2020-06-23 2020-06-23 Focus testing method and device

Country Status (1)

Country Link
CN (1) CN111726527B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113452991A (en) * 2021-08-31 2021-09-28 南昌龙旗信息技术有限公司 Lens focusing state determining method, device, equipment and storage medium
CN114339033A (en) * 2021-12-07 2022-04-12 珠海视熙科技有限公司 Dynamic calibration method and device based on camera far focus

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100295948A1 (en) * 2009-05-21 2010-11-25 Vimicro Corporation Method and device for camera calibration
CN106657786A (en) * 2016-12-26 2017-05-10 昆山丘钛微电子科技有限公司 Compensation method for improving shooting defocus of camera module on non-horizontal direction
CN109348129A (en) * 2018-11-13 2019-02-15 信利光电股份有限公司 A kind of the clarity detection method and system of cameras with fixed focus
CN109451237A (en) * 2018-11-13 2019-03-08 信利光电股份有限公司 A kind of focusing AA method and apparatus of camera module

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100295948A1 (en) * 2009-05-21 2010-11-25 Vimicro Corporation Method and device for camera calibration
CN106657786A (en) * 2016-12-26 2017-05-10 昆山丘钛微电子科技有限公司 Compensation method for improving shooting defocus of camera module on non-horizontal direction
CN109348129A (en) * 2018-11-13 2019-02-15 信利光电股份有限公司 A kind of the clarity detection method and system of cameras with fixed focus
CN109451237A (en) * 2018-11-13 2019-03-08 信利光电股份有限公司 A kind of focusing AA method and apparatus of camera module

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113452991A (en) * 2021-08-31 2021-09-28 南昌龙旗信息技术有限公司 Lens focusing state determining method, device, equipment and storage medium
CN113452991B (en) * 2021-08-31 2021-11-23 南昌龙旗信息技术有限公司 Lens focusing state determining method, device, equipment and storage medium
CN114339033A (en) * 2021-12-07 2022-04-12 珠海视熙科技有限公司 Dynamic calibration method and device based on camera far focus

Also Published As

Publication number Publication date
CN111726527B (en) 2021-11-16

Similar Documents

Publication Publication Date Title
CN111726527B (en) Focus testing method and device
CN107950018B (en) Image generation method and system, and computer readable medium
CN1450398A (en) Image pick up apparatus and camera system thereof
US10230903B2 (en) Image pickup apparatus having image sensor capable of outputting parallax image signal, control method therefor, and storage medium storing control program therefor
Abuolaim et al. Revisiting autofocus for smartphone cameras
CN110784659B (en) Exposure control method and device and storage medium
CN1595219A (en) Image-taking apparatus and focus control program for image-taking apparatus
CN103731604A (en) Tracking apparatus and tracking method
CN103004179A (en) Tracking device, and tracking method
EP3641297A1 (en) Focusing control method and device, computer storable medium and mobile terminal
WO2016200734A1 (en) Optimizing capture of focus stacks
CN107786806A (en) Camera device and focus adjusting method
CN105678242A (en) Focusing method and apparatus in the mode of holding certificate in hands
CN100440935C (en) Control apparatus, control method, computer program, and vidicon
CN104135607A (en) Image pickup apparatus and method for controlling image pickup apparatus
US9094601B2 (en) Image capture device and audio hinting method thereof in focusing
CN103577520A (en) Object searching apparatus, object searching method and computer-readable recording medium
US20130208176A1 (en) Digital photographing apparatus and method for controlling thereof
CN103108123A (en) Digital photographing apparatus and method of controlling the same
CN108833786B (en) Mode control method and electronic equipment
CN102087401A (en) Auto focusing method, recording medium for recording the method, and auto focusing apparatus
CN107124547B (en) Double-camera shooting method and device
CN105681668A (en) Control method, control device and electronic device
CN103988107A (en) Imaging device, method for controlling same, and program
CN105120153A (en) Image photographing method and device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant