CN111756989A - Method and device for controlling focusing of lens - Google Patents

Method and device for controlling focusing of lens

Info

Publication number
CN111756989A
Authority
CN
China
Prior art keywords
image
determining
lens group
focusing
preset
Prior art date
Legal status
Pending
Application number
CN201910252889.5A
Other languages
Chinese (zh)
Inventor
豆子飞
杜慧
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201910252889.5A
Publication of CN111756989A
Legal status: Pending

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/67 - Focus control based on electronic image sensor signals
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/62 - Control of parameters via user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Automatic Focus Adjustment (AREA)
  • Studio Devices (AREA)

Abstract

The disclosure relates to a method and a device for controlling focusing of a lens, which can effectively suppress moire fringes generated during image acquisition. The method includes: determining a first focusing position while driving the lens group to move within a preset interval; determining a second focusing position based on the first focusing position and a preset offset value; and driving the lens group to move to the second focusing position. The technical solution of the disclosure can reduce the coherence between the detail resolution of the photographed object and the spatial frequency of the repetitive units of the image collector's photosensitive element, thereby effectively suppressing the moire generated when the image is acquired.

Description

Method and device for controlling focusing of lens
Technical Field
The present disclosure relates to the field of internet technologies, and in particular, to a method and an apparatus for controlling focusing of a lens.
Background
As mobile phones have become ubiquitous and easy to carry, taking pictures with a mobile phone has become the choice of most people. However, when a scene displayed on the display screen of a computer, a television, or a similar device is photographed, moire patterns inevitably appear, which degrades the quality of the captured image to some extent. How to effectively reduce the occurrence of moire is therefore a technical problem that needs to be solved.
Disclosure of Invention
In order to overcome the problems in the related art, embodiments of the present disclosure provide a method and an apparatus for controlling focusing of a lens, which can effectively suppress moire fringes generated when an image is captured.
According to a first aspect of the embodiments of the present disclosure, there is provided a method for controlling focusing of a lens, including:
determining a first focusing position in the process of driving the lens group to move in a preset interval;
determining a second focus position based on the first focus position and a preset offset value;
driving the lens group to move to the second focus position.
In one embodiment, the determining the first focus position during the driving of the lens group within the preset interval may include:
collecting at least one first image in the process that the lens group moves in the preset interval;
determining a first image with highest definition in the at least one first image;
and determining the position of the lens group corresponding to the first image with the highest definition as the first focusing position.
In one embodiment, before determining the first focus position during the process of driving the lens group to move within the preset interval, the method may further include:
determining a third focusing position during the far-focus shooting;
determining a fourth focusing position during close-focus shooting;
and respectively determining the third focusing position and the fourth focusing position as two end points of the preset interval.
In one embodiment, the determining the third focusing position in the far-focus shooting may include:
acquiring at least one second image in the process of shooting a shot object with the object distance larger than a first preset distance;
determining a second image with highest definition in the at least one second image;
and determining the position of the lens group corresponding to the second image with the highest definition as the third focusing position.
In one embodiment, the determining the fourth focusing position in the close-focus shooting may include:
acquiring at least one third image in the process of shooting the shot object with the object distance smaller than the second preset distance;
determining a third image with highest definition in the at least one third image;
and determining the position of the lens group corresponding to the third image with the highest definition as the fourth focusing position.
In one embodiment, before determining the second focus position based on the first focus position and a preset offset value, the method may further include:
analyzing the first image with the highest definition in a frequency domain to obtain the gray distribution of the first image with the highest definition;
determining whether Moire exists in the first image with the highest definition based on the gray level distribution; wherein the step of determining a second focus position based on the first focus position and a preset offset value is performed when moire is present.
In one embodiment, before determining the first focus position during the process of driving the lens group to move within the preset interval, the method may further include:
determining whether a preset button in a camera interface is triggered; and when the preset button in the camera interface is triggered, determining a first focusing position in the process of driving the lens group to move in the preset interval.
According to a second aspect of the embodiments of the present disclosure, there is provided an apparatus for controlling focusing of a lens, including:
the first driving module is configured to drive the lens group to move within a preset interval;
a first determination module configured to determine a first focus position during a process in which the first drive module drives the lens group to move within the preset interval;
a second determination module configured to determine a second focus position based on the first focus position and a preset offset value;
a second driving module configured to drive the lens group to move to the second focus position.
In one embodiment, the first determining module may include:
a first acquisition submodule configured to acquire at least one first image during movement of the lens group within the preset interval;
a first determining submodule configured to determine a first image with highest definition in the at least one first image;
a second determination submodule configured to determine a lens group position corresponding to the first image with the highest sharpness as the first focus position.
In one embodiment, the apparatus may further include:
a third determination module configured to determine a third focused position at the time of telephoto shooting;
a fourth determination module configured to determine a fourth in-focus position at the time of the close-focus shooting;
a fifth determining module configured to determine the third focusing position and the fourth focusing position as two end points of the preset interval, respectively.
In one embodiment, the third determining module may include:
the second acquisition sub-module is configured to acquire at least one second image in the process of shooting the shot object with the object distance greater than the first preset distance;
a third determining submodule configured to determine a second image with highest sharpness in the at least one second image;
a fourth determination submodule configured to determine a lens group position corresponding to the second image with the highest sharpness as the third focusing position.
In one embodiment, the fourth determining module may include:
the third acquisition sub-module is configured to acquire at least one third image in the process of shooting the shot object with the object distance smaller than the second preset distance;
a fifth determining submodule configured to determine a third image with highest definition in the at least one third image;
a sixth determining submodule configured to determine a lens group position corresponding to the third image with the highest sharpness as the fourth focusing position.
In one embodiment, the apparatus may further include:
the analysis module is configured to analyze the first image with the highest definition in a frequency domain to obtain the gray distribution of the first image with the highest definition;
a sixth determination module configured to determine whether moire exists in the first image with the highest sharpness based on the gray scale distribution; wherein the second determination module is triggered when a Moire exists.
In one embodiment, the apparatus may further include:
a seventh determining module configured to determine whether a preset button in the camera interface is triggered; and triggering the first driving module after determining that a preset button in the camera interface is triggered.
According to a third aspect of the embodiments of the present disclosure, there is provided an apparatus for controlling focusing of a lens, including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the method described above.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method described above.
The technical solution provided by the embodiments of the present disclosure can have the following beneficial effects. The lens group is driven to move within the preset interval, and during this movement the first focusing position at which the captured image is clearest is determined. A second focusing position can then be determined from the preset offset value and the first focusing position, such that the distance between the second focusing position and the first focusing position equals the preset offset value, and the lens group is driven to move to the second focusing position for image acquisition. This reduces the coherence between the detail resolution of the photographed object and the spatial frequency of the repetitive units of the photosensitive element of the image collector, thereby effectively suppressing the moire generated when the image is acquired.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
Fig. 1A is a flowchart illustrating a method of controlling focusing of a lens according to an exemplary embodiment.
FIG. 1B is a schematic diagram of a camera interface according to an exemplary embodiment.
FIG. 2 is a flowchart illustrating a method of controlling focus of a lens according to an exemplary embodiment.
Fig. 3 to 5 are flowcharts illustrating a method for controlling focusing of a lens according to an exemplary embodiment.
Fig. 6 is a flowchart illustrating a method of controlling focusing of a lens according to a third exemplary embodiment.
Fig. 7 is a flowchart illustrating a method of controlling focusing of a lens according to an exemplary embodiment.
Fig. 8A is a block diagram illustrating an apparatus for controlling focusing of a lens according to an exemplary embodiment.
Fig. 8B is a block diagram illustrating an apparatus for controlling focusing of a lens according to an exemplary embodiment.
Fig. 8C is a block diagram illustrating an apparatus for controlling focusing of a lens according to an exemplary embodiment.
Fig. 8D is a block diagram illustrating an apparatus for controlling focusing of a lens according to an exemplary embodiment.
Fig. 8E is a block diagram illustrating an apparatus for controlling focusing of a lens according to an exemplary embodiment.
Fig. 8F is a block diagram illustrating an apparatus for controlling focusing of a lens according to an exemplary embodiment.
Fig. 9 is a block diagram illustrating an apparatus for controlling focusing of a lens according to an exemplary embodiment.
Fig. 10 is a block diagram illustrating an apparatus for controlling focusing of a lens according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
As mobile phones have become ubiquitous and easy to carry, taking pictures with a mobile phone has become the choice of most people. However, when a scene displayed on the display screen of a computer, a television, or a similar device is photographed, moire patterns inevitably appear, which degrades the quality of the captured image to some extent. How to effectively reduce the occurrence of moire is therefore a technical problem that needs to be solved.
At present, there is no established way to eliminate moire fringes when taking pictures with a mobile phone. For single-lens reflex cameras, an optical low-pass filter mounted in front of the image sensor filters out the high spatial frequencies in the captured image so that the exposure satisfies the sampling spatial frequency, which removes moire fairly well. However, while the moire is reduced, the high-frequency detail information of the captured image is also filtered out, so the sharpness of the captured image is greatly reduced.
Moreover, with the arrival of full-screen mobile phones, camera modules are designed to be as compact as possible, and adding a low-pass filter increases both the space it occupies and the hardware cost. A low-pass filter is therefore still not a practical way to remove moire patterns on the mobile phone side.
The present disclosure provides a method and an apparatus for controlling lens focusing that, without adding any hardware, combine the camera's low-level driver with a focusing algorithm through software interaction and reduce the moire phenomenon by means of a slight focus adjustment.
Fig. 1A is a flowchart illustrating a method of controlling focusing of a lens according to an exemplary embodiment, and fig. 1B is a schematic diagram of a camera interface according to an exemplary embodiment. The method for controlling lens focusing may be applied to a terminal device (e.g., a smartphone or a tablet computer) and, as shown in fig. 1A, includes the following steps S101 to S103:
in step S101, a first focus position is determined while the lens group is driven to move within a preset section.
In step S102, a second focus position is determined based on the first focus position and a preset offset value.
In step S103, the lens group is driven to move to the second focus position.
In one embodiment, when the terminal device uses the image capturing apparatus to capture a picture displayed on a display screen of an electronic device such as a computer, a television, a mobile phone, etc., if a user finds that moire fringes exist in a preview image of a captured image, the method for controlling focusing of a lens shown in fig. 1A may be manually activated. In the present embodiment, as shown in fig. 1B, the camera interface of the terminal device 11 when taking a photograph includes a moiré elimination button 19 in addition to an image preview area 12, a flash option button 13, an HDR (High-Dynamic Range) option button 14, a front-rear camera switching option button 15, an album preview button 16, a shutter button 17, and a recording button 18. When the user uses the terminal device 11 to shoot a picture displayed on the display screen of the electronic device, if moire fringes are found in the preview image, the moiré elimination button 19 may be triggered to start the method for controlling the focusing of the lens shown in fig. 1A, and the process proceeds to step S101.
In another embodiment, the terminal device may perform moire detection on the preview image, and when moire is detected in the preview image, automatically start the method for controlling focusing of the lens shown in fig. 1A, and enter step S101.
In one embodiment, the preset interval in step S101 is a stroke interval over which the lens group moves during focusing, preset for shooting a picture displayed on the display screen of an electronic device such as a computer, a television, or a mobile phone. Of the two endpoints of the preset interval, one is a third focusing position at which the captured image is clearest during far-focus focusing, and the other is a fourth focusing position at which the captured image is clearest during near-focus focusing. How the third focusing position and the fourth focusing position are determined is described in the exemplary embodiments below. Searching for the first focusing position within the preset interval rather than over the maximum stroke interval of the lens group saves focusing time. Of course, in practical applications, the preset interval may also be the maximum stroke interval of the lens group. The lens assembly may include a plurality of lenses, for example 5 to 6 lenses.
In step S101, the lens group may be driven to move within the preset interval in preset steps, and an image corresponding to the current lens group position is acquired each time the lens group moves by one step. When the lens group has moved from one endpoint of the preset interval to the other, the images acquired at the various lens group positions can be compared to find the position corresponding to the clearest image, and that position is determined as the first focusing position. In one exemplary embodiment, the clearest image may be determined by comparing the modulation transfer function (MTF) values of the images, where the image with the largest MTF value is the sharpest.
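For illustration only, the traversal search of step S101 can be sketched as follows. The helpers move_lens, capture, and sharpness are hypothetical stand-ins for the camera driver and the MTF-based sharpness measure; they are assumptions made for the sketch and not part of the disclosure or of any specific API.

```python
from typing import Any, Callable

def find_first_focus_position(
    start: int,
    end: int,
    step: int,
    move_lens: Callable[[int], None],
    capture: Callable[[], Any],
    sharpness: Callable[[Any], float],
) -> int:
    """Step the lens group through the preset interval [start, end] and return
    the position whose captured image scores highest (e.g. the largest MTF)."""
    best_position, best_score = start, float("-inf")
    for position in range(start, end + 1, step):
        move_lens(position)           # drive the lens group by one step
        score = sharpness(capture())  # acquire a first image and score its sharpness
        if score > best_score:
            best_position, best_score = position, score
    return best_position
```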
In an exemplary embodiment, a driver IC may drive a voice coil motor (VCM) to move the lens group through a traversal search of the preset interval and find the lens group position at which the captured image is clearest as the first focusing position. The output current value of the driver chip corresponds one-to-one to the lens group position. The output current can be controlled by a digital-to-analog converter (DAC): the DAC reads a value stored in advance in a register (referred to below as the DAC value), converts it into the control signal of the driver chip, and thereby controls the position of the lens group. That is, the register stores in advance the DAC values corresponding to the third focusing position and the fourth focusing position; for example, the DAC value of the third focusing position stored in the register is the Infinity code, and the DAC value of the fourth focusing position is the Macro code. The driver chip output current corresponding to the Infinity code is a first current value, and the output current corresponding to the Macro code is a second current value. The digital-to-analog converter can sweep the output current of the driver chip from the first current value to the second current value, thereby driving the lens group to move from the third focusing position to the fourth focusing position.
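As a minimal sketch, and assuming a simple linear spacing of register codes between the two stored endpoints (the actual code-to-current mapping is device specific), the sweep between the Infinity code and the Macro code might look like this; the numeric values are made up for the example.

```python
from typing import List

def sweep_dac_codes(infinity_code: int, macro_code: int, steps: int) -> List[int]:
    """Return evenly spaced DAC codes from the far-focus endpoint (Infinity code)
    to the near-focus endpoint (Macro code); each code maps to one driver-chip
    output current and hence to one lens group position."""
    if steps < 2:
        raise ValueError("need at least the two endpoint codes")
    span = macro_code - infinity_code
    return [round(infinity_code + i * span / (steps - 1)) for i in range(steps)]

# Example with made-up register values: 11 positions across the preset interval.
print(sweep_dac_codes(infinity_code=120, macro_code=480, steps=11))
```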
In steps S102 to S103, a second focus position may be determined according to a preset offset value and the first focus position, and the lens group is driven to move to the second focus position. And the distance between the second focusing position and the first focusing position is a preset offset value. In one exemplary embodiment, the second focus position is farther from a photosensitive element of the image pickup device than the first focus position. In another exemplary embodiment, the second focus position is closer to the photosensitive element than the first focus position.
Continuing with the exemplary embodiment described above, the register also stores in advance the DAC value corresponding to the preset offset value. After the first focusing position is determined, the digital-to-analog converter may subtract the DAC value corresponding to the preset offset value from the DAC value corresponding to the first focusing position to obtain the DAC value corresponding to the second focusing position, and then control the lens group to move to the second focusing position based on that DAC value. Of course, the DAC value corresponding to the second focusing position may instead be obtained by adding the DAC value corresponding to the preset offset value to the DAC value corresponding to the first focusing position.
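A sketch of this offset step, under the assumption of a 10-bit DAC code range; the clamp bounds and the example values are illustrative and not taken from the disclosure.

```python
def apply_preset_offset(first_code: int, offset_code: int,
                        subtract: bool = True,
                        code_min: int = 0, code_max: int = 1023) -> int:
    """Derive the DAC code of the second focusing position from the code of the
    first focusing position and the preset offset code, clamped to the assumed
    valid code range of the driver chip."""
    second_code = first_code - offset_code if subtract else first_code + offset_code
    return max(code_min, min(code_max, second_code))

# e.g. shift the focus slightly away from the sharpest position
second_focus_code = apply_preset_offset(first_code=310, offset_code=8)
```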
The technical solution provided by the embodiments of the present disclosure can have the following beneficial effects. The lens group is driven to move within the preset interval, and during this movement the first focusing position at which the captured image is clearest is determined. A second focusing position can then be determined from the preset offset value and the first focusing position, such that the distance between the second focusing position and the first focusing position equals the preset offset value, and the lens group is driven to move to the second focusing position for image acquisition. This reduces the coherence between the detail resolution of the photographed object and the spatial frequency of the repetitive units of the photosensitive element of the image collector, thereby effectively suppressing the moire generated when the image is acquired.
In one embodiment, the determining the first focus position during the driving of the lens group within the preset interval may include:
collecting at least one first image in the process that the lens group moves in the preset interval;
determining a first image with highest definition in the at least one first image;
and determining the position of the lens group corresponding to the first image with the highest definition as the first focusing position.
In one embodiment, before determining the first focus position during the process of driving the lens group to move within the preset interval, the method may further include:
determining a third focusing position during the far-focus shooting;
determining a fourth focusing position during close-focus shooting;
and respectively determining the third focusing position and the fourth focusing position as two end points of the preset interval.
In one embodiment, the determining the third focusing position in the far-focus shooting may include:
acquiring at least one second image in the process of shooting a shot object with the object distance larger than a first preset distance;
determining a second image with highest definition in the at least one second image;
and determining the position of the lens group corresponding to the second image with the highest definition as the third focusing position.
In one embodiment, the determining the fourth focusing position in the close-focus shooting may include:
acquiring at least one third image in the process of shooting the shot object with the object distance smaller than the second preset distance;
determining a third image with highest definition in the at least one third image;
and determining the position of the lens group corresponding to the third image with the highest definition as the fourth focusing position.
In one embodiment, before determining the second focus position based on the first focus position and a preset offset value, the method may further include:
analyzing the first image with the highest definition in a frequency domain to obtain the gray distribution of the first image with the highest definition;
determining whether Moire exists in the first image with the highest definition based on the gray level distribution; wherein the step of determining a second focus position based on the first focus position and a preset offset value is performed when moire is present.
In one embodiment, before determining the first focus position during the process of driving the lens group to move within the preset interval, the method may further include:
determining whether a preset button in a camera interface is triggered; and when the preset button in the camera interface is triggered, determining a first focusing position in the process of driving the lens group to move in the preset interval.
Please refer to the following embodiments for specific examples of how to control the lens focusing.
Therefore, in the method provided by the embodiments of the present disclosure, the lens group may be driven to move within the preset interval, and during this movement the first focusing position at which the captured image is clearest is determined. A second focusing position may then be determined from the preset offset value and the first focusing position, such that the distance between the second focusing position and the first focusing position equals the preset offset value, and the lens group is driven to move to the second focusing position for image acquisition. This reduces the coherence between the detail resolution of the photographed object and the spatial frequency of the repetitive units of the photosensitive element of the image collector, thereby effectively suppressing the moire generated when the image is acquired.
The technical solutions provided by the embodiments of the present disclosure are described below with specific embodiments.
FIG. 2 is a flow chart illustrating a method of controlling focus of a lens according to an exemplary embodiment; in this embodiment, an example of how to determine the stroke interval of lens movement is described by using the above method provided in the embodiment of the present disclosure, as shown in fig. 2, the method includes the following steps:
in step S201, a third in-focus position at the time of telephoto shooting is determined.
In this embodiment, far-focus shooting may be performed by shooting a subject whose object distance is greater than the first preset distance. In practical applications, far-focus shooting may be performed by shooting a subject at infinity. During far-focus shooting, at least one second image may be acquired, the clearest second image may be determined from the at least one second image, and the lens group position corresponding to that clearest second image may then be determined as the third focusing position.
In step S202, a fourth in-focus position at the time of close-focus shooting is determined.
In this embodiment, the close-focus shooting may be performed by shooting an object whose object distance is smaller than a second preset distance, where the second preset distance is smaller than the first preset distance. In the process of close-focus shooting, at least one third image can be acquired, the clearest third image is determined from the at least one third image, and then the lens group position corresponding to the clearest third image is determined as the fourth focusing position.
In step S203, the third focusing position and the fourth focusing position are respectively determined as two end points of the preset interval.
In this embodiment, the stroke interval of the lens group can be determined by taking the third focusing position and the fourth focusing position as two end points, and is used as the preset interval.
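For illustration only, the calibration in steps S201 to S203 can be sketched as follows; search_far and search_near are hypothetical callables that each run the sharpest-image search described above while a suitably far or near subject is framed.

```python
from typing import Callable, Tuple

def calibrate_preset_interval(
    search_far: Callable[[], int],
    search_near: Callable[[], int],
) -> Tuple[int, int]:
    """search_far runs the sharpest-image search while a subject farther than the
    first preset distance is framed (S201); search_near does the same for a
    subject closer than the second preset distance (S202)."""
    third_position = search_far()    # far-focus endpoint, e.g. the Infinity code
    fourth_position = search_near()  # near-focus endpoint, e.g. the Macro code
    # S203: the two focusing positions bound the preset interval used in S101
    return (min(third_position, fourth_position),
            max(third_position, fourth_position))
```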
In step S204, a first focus position is determined while the lens group is driven to move within the preset section.
In step S205, a second focus position is determined based on the first focus position and a preset offset value.
In step S206, the lens group is driven to move to the second focus position.
Steps S204 to S206 in this embodiment are similar to steps S101 to S103 shown in fig. 1A, and are not repeated here.
In this embodiment, the stroke interval of the lens group can be determined by taking the third focusing position during the telephoto shooting and the fourth focusing position during the near-focus shooting as two end points, so as to obtain the preset interval. The first focusing position is searched in the preset interval instead of the maximum stroke interval of the lens group, so that the focusing time can be saved.
Figs. 3 to 5 are flowcharts illustrating a method of controlling focusing of a lens according to an exemplary embodiment; in this embodiment, how to determine a focusing position is described by way of example using the above method provided by the embodiments of the present disclosure. As shown in figs. 3 to 5, the method includes the following steps:
in step S301, at least one second image is acquired during shooting of the object whose object distance is greater than the first preset distance.
In this embodiment, a subject whose object distance is greater than the first preset distance may first be determined, and focusing is then performed with that subject as the focusing target. During focusing, the lens group can move in preset steps, and at least one second image is acquired at each step.
In step S302, the second image with the highest definition in the at least one second image is determined.
In this embodiment, the second image with the highest sharpness may be determined by comparing the modulation transfer function (MTF) values of the at least one second image, where the second image with the largest MTF value has the highest sharpness. Of course, in practical applications, the second image with the highest sharpness may also be determined in other ways.
In step S303, the lens group position corresponding to the second image with the highest sharpness is determined as the third focusing position.
In this embodiment, the lens group position corresponding to the second image with the highest definition can be determined as the third focusing position, and the DAC value corresponding to the third focusing position is stored in the register for subsequent use.
In step S401, at least one third image is acquired during shooting of the object whose object distance is less than the second preset distance.
In this embodiment, a subject whose object distance is smaller than the second preset distance may first be determined, and focusing is then performed with that subject as the focusing target. During focusing, the lens group can move in preset steps, and at least one third image is acquired at each step.
In step S402, the third image with the highest definition in the at least one third image is determined.
In this embodiment, the third image with the highest sharpness may be determined by comparing the MTF of at least one third image. Wherein, the third image with the largest MTF value has the highest definition. Of course, in practical application, the third image with the highest definition may be determined in other manners.
In step S403, the lens group position corresponding to the third image with the highest sharpness is determined as the fourth focusing position.
In this embodiment, the lens group position corresponding to the third image with the highest definition can be determined as the fourth focusing position, and the DAC value corresponding to the fourth focusing position is stored in the register for subsequent use.
In step S501, at least one first image is collected during the movement of the lens group in the preset interval.
In this embodiment, the lens group may be driven to move within the preset interval in preset steps, and at least one first image is acquired at each step.
In step S502, the first image with the highest definition in the at least one first image is determined.
In this embodiment, the first image with the highest sharpness may be determined by comparing the MTF of at least one first image. Wherein, the definition of the first image with the maximum MTF value is the highest. Of course, in practical application, the first image with the highest definition may be determined in other manners.
In step S503, the lens group position corresponding to the first image with the highest sharpness is determined as the first focus position.
In this embodiment, the lens group position corresponding to the first image with the highest sharpness may be determined as the first focus position, and the DAC value corresponding to the first focus position may be determined for determining the DAC value of the second focus position.
In this embodiment, when a focusing position is determined, the focusing target is first determined; during focusing on that target, at least one image is collected, and the lens group position corresponding to the clearest of the collected images is taken as the focusing position, so focusing is accurate and the error is small.
FIG. 6 is a flowchart illustrating a method of controlling lens focus in accordance with an exemplary embodiment; the present embodiment utilizes the above method provided by the embodiments of the present disclosure to exemplarily explain how to automatically suppress moire in an image, as shown in fig. 6, including the following steps:
in step S601, at least one first image is collected during the process of driving the lens group to move within the preset interval.
In step S602, the first image with the highest definition in the at least one first image is determined.
In step S603, the lens group position corresponding to the first image with the highest sharpness is determined as the first focus position.
S601 to S603 in this embodiment are similar to steps S501 to S503 shown in fig. 5, and are not described again here.
In step S604, the first image with the highest definition is analyzed in a frequency domain, so as to obtain a gray distribution of the first image with the highest definition.
In this embodiment, the first image with the highest sharpness may be analyzed in the frequency domain to obtain its gray distribution. In practical applications, the discrete distribution of the gray values of the first image can be transformed into the frequency domain to obtain a spectrum image. The spectral amplitude of the spectrum image is then calculated, and the region to be detected is determined on the spectral amplitude image. A global optimal segmentation threshold is then calculated from the spectral amplitude of the region to be detected. The calculation formula for the global optimal segmentation threshold (given in the original as an image) is expressed in terms of the following quantities: threshold is an initial threshold, Amplitude is the spectral amplitude of the region to be detected in the frequency domain, min(Amplitude) is the minimum of that spectral amplitude, and max(Amplitude) is its maximum. Gray segmentation is then performed on the first image according to the global optimal segmentation threshold to obtain the gray distribution of the first image. Of course, in practical applications, other methods may also be used to determine the gray distribution of the first image.
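As a minimal, hedged sketch of this analysis: the choice of region of interest and the normalised form of the global optimal segmentation threshold below are assumptions, since the disclosure gives the formula only as an image.

```python
import numpy as np

def gray_distribution(gray_image: np.ndarray, initial_threshold: float = 0.5) -> np.ndarray:
    """Transform the image to the frequency domain, compute the spectral amplitude,
    derive a global segmentation threshold from the amplitude range, and return
    the resulting binary gray segmentation."""
    spectrum = np.fft.fftshift(np.fft.fft2(gray_image))  # spectrum image
    amplitude = np.abs(spectrum)                         # spectral amplitude
    roi = amplitude                                      # region to be detected (assumed: whole spectrum)
    a_min, a_max = float(roi.min()), float(roi.max())
    # assumed normalised form of the global optimal segmentation threshold
    global_threshold = a_min + initial_threshold * (a_max - a_min)
    return (roi >= global_threshold).astype(np.uint8)    # gray segmentation result
```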
In step S605, it is determined, based on the gray distribution, whether moire exists in the first image with the highest sharpness. When moire exists, step S606 is performed.
In this embodiment, if the gray scale distribution of the first image meets a preset condition, for example, is located in a preset empirical interval, it is determined that moire exists in the first image, and if the gray scale distribution of the first image does not meet the preset condition, it is determined that moire does not exist in the first image.
It should be noted that, in practical applications, other methods may also be used to determine whether moire exists in the first image, and are not limited to the methods provided in the present disclosure.
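A corresponding sketch of the decision in step S605; the statistic computed from the gray distribution and the bounds of the empirical interval are illustrative assumptions rather than values from the disclosure.

```python
from typing import Tuple
import numpy as np

def moire_present(segmented: np.ndarray,
                  empirical_interval: Tuple[float, float] = (0.02, 0.30)) -> bool:
    """Report moire when the fraction of strong spectral components in the
    segmented gray distribution falls inside the preset empirical interval."""
    ratio = float(segmented.mean())   # illustrative statistic over the segmentation
    low, high = empirical_interval
    return low <= ratio <= high
```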
In step S606, a second focus position is determined based on the first focus position and a preset offset value.
In step S607, the lens group is driven to move to the second focusing position.
S606 to S607 in this embodiment are similar to steps S102 to S103 shown in fig. 1A, and are not described again here.
In this embodiment, the gray scale distribution of the first image can be obtained by analyzing the first image, and then whether moire fringes exist in the first image is determined, and when moire fringes are determined to exist, the focusing position is automatically adjusted, generation of moire is suppressed, and image quality is improved.
FIG. 7 is a flowchart illustrating a method of controlling lens focus according to a fourth exemplary embodiment; the present embodiment utilizes the above method provided by the embodiments of the present disclosure to exemplarily explain how to manually suppress moire in an image, as shown in fig. 7, including the following steps:
in step S701, it is determined whether a preset button in the camera interface is triggered. When it is determined that the preset button in the camera interface is triggered, step S702 is performed.
In the present embodiment, as shown in fig. 1B, a moiré elimination button 19 is included in the camera interface, and when the user takes a picture displayed on the display screen of the electronic device using the terminal device 11, if a moiré is found to exist in the preview image, the moiré elimination button 19 may be triggered. When the moiré elimination button 19 is activated, the following steps S702 to S704 are executed.
In step S702, a first focus position is determined while the lens group is driven to move within a preset section.
In step S703, a second focus position is determined based on the first focus position and a preset offset value.
In step S704, the lens group is driven to move to the second focus position.
Steps S702 to S704 in this embodiment are similar to steps S101 to S103 shown in fig. 1A, and are not described again here.
In an exemplary embodiment, the terminal device 11 is a handset based on a Qualcomm platform. When the moiré elimination button 19 is triggered, the Qualcomm HAL (hardware abstraction layer) may be entered, and the HAL layer may start the driver chip to execute a CAF (continuous auto focus) algorithm, determine the first focusing position by means of a full-sweep (traversal) search, and perform a fine adjustment based on the first focusing position to obtain the second focusing position, so as to drive the lens group to move to the second focusing position. In this way, without adding any hardware, by combining the camera's low-level driver with the focusing algorithm through software interaction, a slight focus adjustment can suppress the moire phenomenon well without noticeably reducing the sharpness of the image, giving the user a better visual experience when shooting such display-screen scenes.
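Tying the pieces together, the manual path of Fig. 7 could be sketched as below; the parameters are simplified versions of the hypothetical helpers used in the earlier sketches, not the Qualcomm HAL or CAF API.

```python
from typing import Callable, Tuple

def on_moire_button_pressed(
    preset_interval: Tuple[int, int],
    step: int,
    offset_code: int,
    find_focus: Callable[[int, int, int], int],
    apply_offset: Callable[[int, int], int],
    move_lens: Callable[[int], None],
) -> None:
    """Run the manual moire-suppression path: full-sweep search for the first
    focusing position (S702), apply the preset offset (S703), and drive the
    lens group to the second focusing position (S704)."""
    start, end = preset_interval
    first_code = find_focus(start, end, step)             # S702
    second_code = apply_offset(first_code, offset_code)   # S703
    move_lens(second_code)                                # S704
```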
In this embodiment, whether to start the moire suppression function may be determined by determining whether a preset button in the camera interface is triggered, so as to provide a way for a user to manually suppress moire.
Fig. 8A is a block diagram illustrating an apparatus for controlling focusing of a lens according to an exemplary embodiment, the apparatus including:
a first driving module 81 configured to drive the lens group to move within a preset interval;
a first determination module 82 configured to determine a first focus position during the process that the first driving module 81 drives the lens group to move within the preset interval;
a second determination module 83 configured to determine a second focus position based on the first focus position and a preset offset value;
a second driving module 84 configured to drive the lens groups to move to the second focus position.
Fig. 8B is a block diagram illustrating an apparatus for controlling focusing of a lens according to an exemplary embodiment, where the first determination module 82 may include:
a first acquisition submodule 811 configured to acquire at least one first image during movement of the lens group within the preset interval;
a first determining sub-module 812 configured to determine a first image with highest sharpness of the at least one first image;
a second determination sub-module 813 configured to determine the lens group position corresponding to the first image with the highest sharpness as the first focus position.
Fig. 8C is a block diagram illustrating an apparatus for controlling focusing of a lens according to an exemplary embodiment, the apparatus may further include:
a third determination module 85 configured to determine a third focused position at the time of telephoto shooting;
a fourth determination module 86 configured to determine a fourth in-focus position at the time of close-focus shooting;
a fifth determining module 87 configured to determine the third focusing position and the fourth focusing position as two end points of the preset interval, respectively.
Fig. 8D is a block diagram illustrating an apparatus for controlling focusing of a lens according to an exemplary embodiment, where the third determining module 85 may include:
the second acquisition sub-module 851 configured to acquire at least one second image during the process of shooting the shot object whose object distance is greater than the first preset distance;
a third determining submodule 852 configured to determine a second image with the highest definition among the at least one second image;
a fourth determining sub-module 853 configured to determine a lens group position corresponding to the second image with the highest sharpness as the third focusing position.
Fig. 8E is a block diagram illustrating an apparatus for controlling focusing of a lens according to an exemplary embodiment, where the fourth determining module 86 may include:
a third acquisition sub-module 861 configured to acquire at least one third image during the process of photographing the object whose distance to the photographed object is smaller than the second preset distance;
a fifth determining sub-module 862 configured to determine a third image with the highest definition among the at least one third image;
a sixth determining submodule 863 configured to determine a lens group position corresponding to the third image with the highest sharpness as the fourth focusing position.
Fig. 8F is a block diagram illustrating an apparatus for controlling focusing of a lens according to an exemplary embodiment, the apparatus may further include:
the analysis module 88 is configured to analyze the first image with the highest definition in a frequency domain to obtain a gray distribution of the first image with the highest definition;
a sixth determining module 89 configured to determine whether moire exists in the first image with the highest definition based on the gray scale distribution; wherein the second determination module is triggered when a Moire exists.
Fig. 9 is a block diagram illustrating an apparatus for controlling focusing of a lens according to an exemplary embodiment, the apparatus may further include:
a seventh determining module 91 configured to determine whether a preset button in the camera interface is triggered; and triggering the first driving module after determining that a preset button in the camera interface is triggered.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 10 is a block diagram illustrating an apparatus for controlling focusing of a lens according to an exemplary embodiment. For example, the apparatus 1000 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 10, the apparatus 1000 may include one or more of the following components: processing component 1002, memory 1004, power component 1006, multimedia component 1008, audio component 1010, input/output (I/O) interface 1012, sensor component 1014, and communications component 1016.
The processing component 1002 generally controls the overall operation of the device 1000, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing elements 1002 may include one or more processors 1020 to execute instructions to perform all or a portion of the steps of the methods described above. Further, processing component 1002 may include one or more modules that facilitate interaction between processing component 1002 and other components. For example, the processing component 1002 can include a multimedia module to facilitate interaction between the multimedia component 1008 and the processing component 1002.
The memory 1004 is configured to store various types of data to support operation at the device 1000. Examples of such data include instructions for any application or method operating on device 1000, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1004 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power components 1006 provide power to the various components of device 1000. Power components 1006 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 1000.
The multimedia component 1008 includes a screen that provides an output interface between the device 1000 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1008 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 1000 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focus and optical zoom capability.
The audio component 1010 is configured to output and/or input audio signals. For example, audio component 1010 includes a Microphone (MIC) configured to receive external audio signals when apparatus 1000 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in the memory 1004 or transmitted via the communication component 1016. In some embodiments, audio component 1010 also includes a speaker for outputting audio signals.
I/O interface 1012 provides an interface between processing component 1002 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 1014 includes one or more sensors for providing various aspects of status assessment for the device 1000. For example, the sensor assembly 1014 may detect an open/closed state of the device 1000 and the relative positioning of components, such as a display and keypad of the apparatus 1000; the sensor assembly 1014 may also detect a change in position of the apparatus 1000 or a component of the apparatus 1000, the presence or absence of user contact with the apparatus 1000, the orientation or acceleration/deceleration of the apparatus 1000, and a change in temperature of the apparatus 1000. The sensor assembly 1014 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 1014 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1014 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1016 is configured to facilitate communications between the apparatus 1000 and other devices in a wired or wireless manner. The device 1000 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1016 receives a broadcast signal or broadcast associated information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communications component 1016 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 1000 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 1004 comprising instructions, executable by the processor 1020 of the device 1000 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (16)

1. A method for controlling focusing of a lens, the method comprising:
determining a first focusing position in the process of driving the lens group to move in a preset interval;
determining a second focus position based on the first focus position and a preset offset value;
driving the lens group to move to the second focus position.
2. The method of claim 1, wherein determining the first focus position during the driving of the lens group to move within the preset interval comprises:
collecting at least one first image in the process that the lens group moves in the preset interval;
determining a first image with highest definition in the at least one first image;
and determining the position of the lens group corresponding to the first image with the highest definition as the first focusing position.
3. The method of claim 1, wherein before determining the first focus position during the driving of the lens group to move within the preset interval, the method further comprises:
determining a third focusing position during the far-focus shooting;
determining a fourth focusing position during close-focus shooting;
and respectively determining the third focusing position and the fourth focusing position as two end points of the preset interval.
4. The method of claim 3, wherein determining the third focus position for the far focus shot comprises:
acquiring at least one second image in the process of shooting a shot object with the object distance larger than a first preset distance;
determining a second image with highest definition in the at least one second image;
and determining the position of the lens group corresponding to the second image with the highest definition as the third focusing position.
5. The method of claim 3, wherein determining the fourth focus position for the close-focus shot comprises:
acquiring at least one third image in the process of shooting the shot object with the object distance smaller than the second preset distance;
determining a third image with highest definition in the at least one third image;
and determining the position of the lens group corresponding to the third image with the highest definition as the fourth focusing position.
6. The method of claim 2, wherein, before determining the second focus position based on the first focus position and the preset offset value, the method further comprises:
analyzing the first image with the highest sharpness in the frequency domain to obtain a gray-scale distribution of the first image with the highest sharpness;
and determining, based on the gray-scale distribution, whether moire is present in the first image with the highest sharpness; wherein the step of determining the second focus position based on the first focus position and the preset offset value is performed when moire is present.
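Claim 6 leaves the moire test open; one plausible heuristic, sketched below, flags frames whose 2-D spectrum contains a strong, isolated peak away from the DC term, since moire tends to appear as narrow-band interference. The threshold is an arbitrary placeholder, not a value from the disclosure.

    import numpy as np

    def has_moire(gray: np.ndarray, peak_ratio: float = 8.0) -> bool:
        """Flag frames whose spectrum has a peak far above the background level."""
        spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray.astype(np.float64))))
        cy, cx = spectrum.shape[0] // 2, spectrum.shape[1] // 2
        spectrum[cy - 2:cy + 3, cx - 2:cx + 3] = 0.0   # ignore the DC neighbourhood
        background = float(np.median(spectrum)) + 1e-9
        return bool(spectrum.max() / background > peak_ratio)
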
7. The method of claim 1, wherein, before determining the first focus position while driving the lens group to move within the preset interval, the method further comprises:
determining whether a preset button in a camera interface is triggered; and, when the preset button in the camera interface is triggered, determining the first focus position while driving the lens group to move within the preset interval.
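Claim 7 simply gates the flow on a UI control; a minimal sketch with hypothetical names:

    def on_capture_request(moire_button_enabled, run_offset_focus, run_normal_focus):
        """Run the offset-focus flow only when the preset button is active."""
        if moire_button_enabled:
            run_offset_focus()
        else:
            run_normal_focus()
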
8. An apparatus for controlling focusing of a lens, the apparatus comprising:
a first driving module configured to drive a lens group to move within a preset interval;
a first determination module configured to determine a first focus position while the first driving module drives the lens group to move within the preset interval;
a second determination module configured to determine a second focus position based on the first focus position and a preset offset value;
a second driving module configured to drive the lens group to move to the second focus position.
9. The apparatus of claim 8, wherein the first determining module comprises:
a first acquisition submodule configured to acquire at least one first image while the lens group moves within the preset interval;
a first determination submodule configured to determine the first image with the highest sharpness among the at least one first image;
a second determination submodule configured to determine the lens group position corresponding to the first image with the highest sharpness as the first focus position.
10. The apparatus of claim 8, further comprising:
a third determination module configured to determine a third focus position during far-focus shooting;
a fourth determination module configured to determine a fourth focus position during close-focus shooting;
a fifth determination module configured to determine the third focus position and the fourth focus position as the two end points of the preset interval, respectively.
11. The apparatus of claim 10, wherein the third determining module comprises:
a second acquisition submodule configured to acquire at least one second image while shooting a subject whose object distance is greater than a first preset distance;
a third determination submodule configured to determine the second image with the highest sharpness among the at least one second image;
a fourth determination submodule configured to determine the lens group position corresponding to the second image with the highest sharpness as the third focus position.
12. The apparatus of claim 10, wherein the fourth determining module comprises:
a third acquisition submodule configured to acquire at least one third image while shooting a subject whose object distance is smaller than a second preset distance;
a fifth determination submodule configured to determine the third image with the highest sharpness among the at least one third image;
a sixth determination submodule configured to determine the lens group position corresponding to the third image with the highest sharpness as the fourth focus position.
13. The apparatus of claim 9, further comprising:
an analysis module configured to analyze the first image with the highest sharpness in the frequency domain to obtain a gray-scale distribution of the first image with the highest sharpness;
a sixth determination module configured to determine, based on the gray-scale distribution, whether moire is present in the first image with the highest sharpness; wherein the second determination module is triggered when moire is present.
14. The apparatus of claim 8, further comprising:
a seventh determination module configured to determine whether a preset button in a camera interface is triggered, and to trigger the first driving module after determining that the preset button in the camera interface is triggered.
15. An apparatus for controlling focusing of a lens, the apparatus comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the method of any one of claims 1 to 7.
16. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any one of claims 1 to 7.
CN201910252889.5A 2019-03-29 2019-03-29 Method and device for controlling focusing of lens Pending CN111756989A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910252889.5A CN111756989A (en) 2019-03-29 2019-03-29 Method and device for controlling focusing of lens

Publications (1)

Publication Number Publication Date
CN111756989A true CN111756989A (en) 2020-10-09

Family

ID=72672720

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910252889.5A Pending CN111756989A (en) 2019-03-29 2019-03-29 Method and device for controlling focusing of lens

Country Status (1)

Country Link
CN (1) CN111756989A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107493407A (en) * 2016-06-08 2017-12-19 深圳富泰宏精密工业有限公司 Camera arrangement and photographic method
CN106412423A (en) * 2016-09-19 2017-02-15 珠海格力电器股份有限公司 Focusing method and device
CN108769533A (en) * 2018-06-27 2018-11-06 上海理工大学 A kind of auto-focusing algorithm

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112351205A (en) * 2020-10-28 2021-02-09 维沃移动通信有限公司 Shooting control method and device
WO2022088883A1 (en) * 2020-10-30 2022-05-05 中兴通讯股份有限公司 Photographing method and apparatus, terminal, and computer-readable storage medium
WO2022088882A1 (en) * 2020-10-30 2022-05-05 中兴通讯股份有限公司 Photographing method and apparatus, and terminal, and computer readable storage medium
CN113362300A (en) * 2021-06-03 2021-09-07 豪威科技(武汉)有限公司 Training, detecting and focusing method and shooting device
CN114286011A (en) * 2022-01-06 2022-04-05 维沃移动通信有限公司 Focusing method and device
CN114286011B (en) * 2022-01-06 2024-01-23 维沃移动通信有限公司 Focusing method and device
WO2023221119A1 (en) * 2022-05-20 2023-11-23 北京小米移动软件有限公司 Image processing method and apparatus and storage medium

Similar Documents

Publication Publication Date Title
CN111756989A (en) Method and device for controlling focusing of lens
CN110557547B (en) Lens position adjusting method and device
CN110493526B (en) Image processing method, device, equipment and medium based on multiple camera modules
CN108154465B (en) Image processing method and device
CN110769147B (en) Shooting method and electronic equipment
US20160119560A1 (en) Imaging device, imaging method, and image processing device
CN107948510B (en) Focal length adjusting method and device and storage medium
CN108259759A (en) focusing method, device and storage medium
CN111741187B (en) Image processing method, device and storage medium
CN110827219B (en) Training method, device and medium of image processing model
CN110266914B (en) Image shooting method, device and computer readable storage medium
CN110620871B (en) Video shooting method and electronic equipment
CN107241535B (en) Flash lamp adjusting device and terminal equipment
CN111461950B (en) Image processing method and device
CN112188096A (en) Photographing method and device, terminal and storage medium
CN112866555B (en) Shooting method, shooting device, shooting equipment and storage medium
CN114666490A (en) Focusing method and device, electronic equipment and storage medium
CN115134517A (en) Shooting control method and device and storage medium
CN114244999A (en) Automatic focusing method and device, camera equipment and storage medium
CN114339017B (en) Distant view focusing method, device and storage medium
CN112203015B (en) Camera control method, device and medium system
CN114339018B (en) Method and device for switching lenses and storage medium
CN109447929B (en) Image synthesis method and device
CN114339015B (en) Photographing processing method, photographing processing device and storage medium
CN110876000B (en) Camera module, image correction method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201009