CN108769538B - Automatic focusing method and device, storage medium and terminal - Google Patents


Info

Publication number
CN108769538B
Authority
CN
China
Prior art keywords
shooting
focusing
area
quality
preset
Prior art date
Legal status (assumed; not a legal conclusion)
Active
Application number
CN201810935304.5A
Other languages
Chinese (zh)
Other versions
CN108769538A (en)
Inventor
王宇鹭
Current Assignee (listing may be inaccurate)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (assumed; not a legal conclusion)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810935304.5A priority Critical patent/CN108769538B/en
Publication of CN108769538A publication Critical patent/CN108769538A/en
Application granted granted Critical
Publication of CN108769538B publication Critical patent/CN108769538B/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals

Abstract

An embodiment of the application discloses an automatic focusing method and device, a storage medium, and a terminal. The method includes: acquiring a captured image and determining the area where the shooting target is located; scoring the quality of that area; and, if the quality score does not fall within a preset score range, performing the focusing operation on the shooting target again. The preset score range is determined in advance from the shooting results obtained after at least two cameras synchronously focus on and shoot different areas of the shooting field of view, which can improve the efficiency and accuracy of automatic focusing.

Description

Automatic focusing method and device, storage medium and terminal
Technical Field
The embodiment of the application relates to the technical field of mobile terminals, in particular to an automatic focusing method, an automatic focusing device, a storage medium and a terminal.
Background
At present, a photographing function is standard on most terminal devices, and a user can easily and quickly take photographs with a portable mobile terminal.
Terminal devices are becoming increasingly intelligent, and to improve image sharpness a typical terminal device focuses on the captured image automatically. However, the real-time performance of such automatic focusing is poor: the device cannot detect in time that the current focusing result is inaccurate and keeps capturing images with the original focusing parameters, producing out-of-focus photographs. Improvement is therefore urgently needed.
Disclosure of Invention
An object of the embodiments of the present application is to provide an auto-focusing method, an auto-focusing device, a storage medium, and a terminal, which can improve the efficiency and accuracy of auto-focusing.
In a first aspect, an embodiment of the present application provides an auto-focusing method, including:
acquiring a captured image and determining the area where a shooting target is located;
performing quality scoring on the area where the shooting target is located;
if the quality score does not fall within a preset score range, performing the focusing operation on the shooting target again;
wherein the preset score range is determined in advance according to the shooting results obtained after at least two cameras synchronously focus on and shoot different areas of the shooting field of view.
In a second aspect, an embodiment of the present application provides an automatic focusing apparatus, including:
an acquisition module, configured to acquire a captured image;
an area determination module, configured to determine the area where the shooting target is located in the captured image acquired by the acquisition module;
a scoring module, configured to score the quality of the area, determined by the area determination module, where the shooting target is located;
a judging module, configured to perform the focusing operation on the shooting target again if the quality score given by the scoring module does not fall within a preset score range;
wherein the preset score range is determined in advance according to the shooting results obtained after at least two cameras synchronously focus on and shoot different areas of the shooting field of view.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements the auto-focusing method described in the first aspect.
In a fourth aspect, an embodiment of the present application provides a terminal, including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the auto-focusing method according to the first aspect when executing the computer program.
According to the automatic focusing scheme provided by the embodiments of the application, a captured image is first acquired and the area where the shooting target is located is determined; the quality of that area is then scored; finally, if the quality score does not fall within the preset score range, the focusing operation is performed on the shooting target again. Because the preset score range is determined in advance from the shooting results obtained after at least two cameras synchronously focus on and shoot different areas of the shooting field of view, the efficiency and accuracy of automatic focusing can be improved.
Drawings
Fig. 1 is a schematic flowchart of an auto-focusing method according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of another auto-focusing method according to an embodiment of the present application;
fig. 3 is a schematic flowchart of another auto-focusing method according to an embodiment of the present application;
fig. 4 is a schematic flowchart of another auto-focusing method according to an embodiment of the present application;
fig. 5 is a schematic flowchart of another auto-focusing method according to an embodiment of the present application;
fig. 6 is a schematic flowchart of another auto-focusing method according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an automatic focusing apparatus according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
The technical solution of the application is further explained below through specific embodiments with reference to the accompanying drawings. It should be understood that the specific embodiments described here merely illustrate the application and do not limit it. It should further be noted that, for convenience of description, the drawings show only the structures related to the present application rather than all structures.
Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the steps as a sequential process, many of the steps can be performed in parallel, concurrently or simultaneously. In addition, the order of the steps may be rearranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
At present, a photographing function is standard on most mobile terminals, and a user can easily and quickly take photographs with a portable terminal device. Terminal devices are becoming increasingly intelligent, and to improve image sharpness a typical terminal device focuses on the captured image automatically. However, the real-time performance of automatic focusing is poor. For example, if the position of the shooting target changes, the current focusing result becomes inaccurate, but the terminal device cannot detect this in time and keeps capturing images with the original focusing parameters, producing out-of-focus photographs. Improvement is therefore urgently needed.
An embodiment of the application provides an automatic focusing method that determines the area where the shooting target is located in an acquired captured image, scores the quality of that area, and performs the focusing operation on the shooting target again if the quality score does not fall within a preset score range. Whether refocusing is needed is thus decided from the quality of the shooting target in the captured image, which simplifies the focusing operation and improves focusing efficiency and accuracy. The specific scheme is as follows:
Fig. 1 is a schematic flowchart of an auto-focusing method provided in an embodiment of the present application. The method applies to the situation in which a terminal device automatically focuses on a shooting target while the camera is capturing, and may be executed by a terminal device with an image-capturing function or one on which an image-capturing application is installed; the terminal device may be a smartphone, a tablet computer, a wearable device, a notebook computer, or the like. The method specifically includes the following steps:
and step 110, acquiring a shot image and determining the area where the shot target is located.
For example, in photographing mode, acquiring the captured image may mean that, in the image preview stage, after the camera completes automatic focusing, the captured image is acquired according to the focusing parameters determined by that automatic focusing. A captured image may also be acquired every preset number of image frames using the focusing parameters of the previous frame. Optionally, the preset number of frames may be 0, i.e. a captured image is acquired for every frame, or a value greater than 0, e.g. once every 3 frames. After the captured image is acquired, the area where the shooting target is located is determined.
Optionally, the area where the shooting target is located may be the position range occupied by the shooting target in the acquired captured image. Embodiments of the application admit many ways of determining this area. One is to extract the edge contour of the shooting target and take the area enclosed by that contour as the area where the target is located. Another is to divide the camera's shooting field of view into several sub-areas in advance and, when determining the area, find which of the pre-divided sub-areas the shooting target falls in. Optionally, when dividing the field of view, it may be split into at least two sub-areas of identical shape and size, for example four equal grid areas arranged like the Chinese character 'tian' (田); or it may be split into at least two sub-areas according to the importance of each part of the field of view: for example, since end users habitually place the shooting target at the centre of the field of view, the field may be divided into several concentric rings from the centre outward. Alternatively, the centre position of the shooting target may be determined first, and an area within a preset range of that centre taken as the target area, e.g. a range of 10 × 10 pixels around the centre of the shooting target.
Optionally, the specific preset range may vary with the size of the shooting target; it may be adjusted manually by the end user as needed, or the system may resize it automatically based on detection of the shooting target.
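The region-determination options above can be sketched in Python. This is an illustrative sketch only: the function names (`region_around_center`, `grid_cell`), the clipping behaviour, and the 2 × 2 grid default are assumptions, not the patent's implementation.

```python
def region_around_center(cx, cy, half, width, height):
    """Return the (left, top, right, bottom) box of a preset range
    around the target centre, clipped to the image bounds."""
    left = max(0, cx - half)
    top = max(0, cy - half)
    right = min(width, cx + half)
    bottom = min(height, cy + half)
    return left, top, right, bottom


def grid_cell(cx, cy, width, height, rows=2, cols=2):
    """Return which cell of a rows x cols grid (e.g. the four-grid
    'tian'-shaped split) the target centre falls in."""
    row = min(rows - 1, cy * rows // height)
    col = min(cols - 1, cx * cols // width)
    return row, col
```

For a 640 × 480 preview, a target centred at (5, 5) with a 10 × 10-pixel preset range clips to the box (0, 0, 10, 10), and a target at (600, 400) falls in the bottom-right grid cell (1, 1).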
Step 120: perform quality scoring on the area where the shooting target is located.
The quality score may be an evaluation score for the shooting effect of the area where the shooting target is located.
In the embodiments of the present application, the area where the shooting target is located may be scored from its statistical characteristics, specifically from its pixel mean, standard deviation, and average gradient. The mean reflects the average brightness of the area; within a certain range, a larger mean gives a higher quality score. The standard deviation reflects how widely the grey values in the area are spread around the mean; within a certain range, a larger standard deviation gives a higher quality score. The average gradient reflects the sharpness of the area; within a certain range, a larger average gradient means higher sharpness and a correspondingly higher quality score. Alternatively, the area may be analysed from one or more quality parameters such as sharpness, contrast, and white balance, and the quality score (or a weighted quality score) obtained by combining these analyses, for example by designing a scoring system for each quality parameter and combining them to complete the quality scoring of the area where the shooting target is located.
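A minimal sketch of scoring from these three statistics follows; the normalisation constants and the weights are illustrative assumptions, not values taken from the patent.

```python
def quality_score(gray, w_mean=0.2, w_std=0.3, w_grad=0.5):
    """Score a grayscale region (list of rows, pixel values 0-255).

    The mean reflects brightness, the standard deviation reflects
    contrast, and the average horizontal gradient reflects sharpness;
    each is normalised to roughly [0, 1] and combined with
    illustrative weights into a 0-100 score.
    """
    h, w = len(gray), len(gray[0])
    n = h * w
    mean = sum(sum(row) for row in gray) / n
    std = (sum((p - mean) ** 2 for row in gray for p in row) / n) ** 0.5
    grad = sum(abs(row[x + 1] - row[x])
               for row in gray for x in range(w - 1)) / (h * (w - 1))
    return 100 * (w_mean * mean / 255 + w_std * std / 128 + w_grad * grad / 255)
```

A sharp, high-contrast patch then scores well above a flat grey patch, matching the intuition that a defocused target area loses gradient energy.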
Step 130: if the quality score does not fall within the preset score range, perform the focusing operation on the shooting target again.
The preset score range is determined in advance according to the shooting results obtained after at least two cameras synchronously focus on and shoot different areas of the shooting field of view, and is used to measure whether the quality of the shooting target in the image meets requirements, i.e. whether the focusing operation must be performed again. In the embodiments of the application, using at least two cameras to focus on different areas of the field of view synchronously raises the efficiency of determining the preset score range; dividing the field of view into several different areas and combining the post-focus shooting results of those areas strengthens the accuracy of the range. This way of determining the preset score range therefore improves the accuracy of the automatic focusing method.
In the embodiments of the application, if the quality score does not fall within the preset score range, the shooting effect of the area where the shooting target is located does not meet requirements, i.e. the image was captured with the target out of focus, and the focusing operation must be performed on the shooting target again. If the quality score falls within the preset score range, the shooting effect of that area meets requirements, focusing is accurate, and capturing can continue with the current focusing parameters.
It should be noted that the method for refocusing on the shooting target is not limited in the embodiments of the present application and may be any automatic focusing method in the art. For example, the distance between the shooting target and the terminal device may be measured with ultrasonic waves, a distance sensor, or at least two cameras, and automatic focusing then performed according to that distance.
The automatic focusing method provided by the embodiments of the application first acquires a captured image and determines the area where the shooting target is located; it then scores the quality of that area; finally, if the quality score does not fall within the preset score range, it performs the focusing operation on the shooting target again, the preset score range having been determined in advance from the shooting results obtained after at least two cameras synchronously focus on and shoot different areas of the shooting field of view. The embodiments can decide whether refocusing is needed from the relation between the quality score of the target area and the preset score range, solving the problem that an inaccurate focusing result cannot be detected in time, simplifying the focusing operation, and improving the efficiency and accuracy of automatic focusing.
Fig. 2 is a schematic flow chart of another auto-focusing method provided in an embodiment of the present application, which is used to further describe the above embodiment, and includes:
and 210, dividing a shooting visual field into at least two sub focusing areas in advance.
The shooting field of view may be the range that a camera on the terminal device can capture. Dividing it into at least two sub-focusing areas in advance may mean splitting it into at least two sub-focusing areas of identical shape and size, for example four equal grid areas arranged like the Chinese character 'tian' (田); or splitting it according to the importance of each part of the field of view: for example, since end users habitually place the shooting target at the centre of the field of view, the field may be divided into several concentric rings from the centre outward. Optionally, the shapes and sizes of the sub-focusing areas may be the same or different; the application does not limit this.
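Both division schemes above can be sketched as follows. This is an illustrative Python sketch; the 2 × 2 grid default, the three-ring default, and the box format are assumptions rather than the patent's specification.

```python
def grid_subregions(width, height, rows=2, cols=2):
    """Split the shooting field of view into rows x cols equal boxes;
    rows = cols = 2 gives the four-grid 'tian'-shaped split.
    Each box is (left, top, right, bottom)."""
    return [(c * width // cols, r * height // rows,
             (c + 1) * width // cols, (r + 1) * height // rows)
            for r in range(rows) for c in range(cols)]


def ring_index(x, y, width, height, rings=3):
    """Index (0 = centre) of the concentric ring containing a point,
    for the centre-to-edge importance-based split."""
    dx, dy = x - width / 2, y - height / 2
    radius = max(width, height) / 2
    return min(rings - 1, int((dx * dx + dy * dy) ** 0.5 / (radius / rings)))
```

For a 640 × 480 field, `grid_subregions` yields four 320 × 240 boxes, the image centre falls in ring 0, and a far corner falls in the outermost ring.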
Step 220: allocate different sub-focusing areas to the at least two cameras according to a preset rule.
Illustratively, the terminal device is provided with at least two cameras, and each camera is allocated different sub-focusing areas according to a preset rule. Optionally, the cameras on the terminal device may be of the same kind and model or of different kinds and models; for example, there may be one wide-angle camera and one telephoto camera. This is not limited and may be set according to actual needs.
In the embodiments of the application, the preset rule may be set in advance according to the characteristics of the cameras and/or how they are mounted on the terminal device. Specifically, if the cameras are identical, the rule may be random assignment or assignment by the cameras' positions on the device: for example, with two identical cameras mounted left and right, the left-hand sub-focusing areas divided in step 210 may be allocated to the left camera and the right-hand ones to the right camera. If the cameras differ in kind or model, sub-focusing areas can be allocated according to each camera's specific attributes: for example, with one wide-angle and one telephoto camera, the sub-focusing areas at the edge of the field of view may be allocated to the wide-angle camera and those at the centre to the telephoto camera.
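The left/right rule for two identical cameras can be sketched as below; the dictionary keys `left_cam`/`right_cam` and the (left, top, right, bottom) box format are illustrative assumptions.

```python
def allocate_left_right(boxes, width):
    """Give each of two identical, side-by-side cameras the
    sub-focusing areas on its own side of the field of view.
    A box whose horizontal centre lies in the left half goes to
    the left camera, otherwise to the right camera."""
    assignment = {"left_cam": [], "right_cam": []}
    for box in boxes:
        centre_x = (box[0] + box[2]) / 2
        side = "left_cam" if centre_x < width / 2 else "right_cam"
        assignment[side].append(box)
    return assignment
```

With the four-grid split of a 640-wide field, each camera receives the two boxes on its own side.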
Step 230: control the at least two cameras to synchronously focus on and shoot their allocated sub-focusing areas.
For example, after each camera has been allocated its sub-focusing areas, the cameras are controlled to focus on and shoot those areas synchronously. Taking one camera as an example: the camera performs the focusing operation on its first allocated sub-focusing area and, once focusing is complete, captures an image of that area; it then focuses on its second allocated sub-focusing area and captures an image of it once focusing is complete, and so on. Each camera executes these focusing-and-shooting operations on its allocated sub-focusing areas synchronously. This improves the efficiency of focused shooting, and matching each camera's characteristics and position in the terminal device to its sub-focusing areas improves shooting quality.
Step 240: perform quality scoring on the shooting results of the different focused areas respectively.
For example, the shooting result of each sub-focusing area captured in focus by each camera is quality-scored. Optionally, this scoring may follow the same process as the quality scoring of the area where the shooting target is located in the foregoing embodiment, or another quality scoring method may be used; the application does not limit this.
Step 250: determine the preset score range according to the quality scores.
For example, the preset score range may be formed from the lowest and highest of the sub-focusing areas' quality scores, or derived from statistics of those scores (the mean, variance, or standard deviation), for example by taking the mean plus or minus the standard deviation (or variance) as the range.
It should be noted that the preset score range should be a bounded range; higher is not always better. For example, when image quality is measured by contrast, excessive contrast distorts the captured image and worsens the result. The best shooting effect is achieved only within a certain range.
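Both ways of forming the range, together with the refocus check from step 130, can be sketched as follows; the mode names and the one-standard-deviation width are illustrative assumptions.

```python
def preset_score_range(scores, mode="min_max"):
    """Derive the preset score range from the quality scores of the
    focused sub-areas: either the lowest/highest score, or the mean
    plus/minus one (population) standard deviation."""
    if mode == "min_max":
        return min(scores), max(scores)
    mean = sum(scores) / len(scores)
    std = (sum((s - mean) ** 2 for s in scores) / len(scores)) ** 0.5
    return mean - std, mean + std


def needs_refocus(score, score_range):
    """Refocus only when the quality score falls outside the range;
    within the range the current focusing parameters are kept."""
    low, high = score_range
    return not (low <= score <= high)
```

For sub-area scores of 80, 90, and 100, the min/max range is (80, 100); a target-area score of 70 then triggers refocusing while 85 does not.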
Step 260: acquire a captured image and determine the area where the shooting target is located.
Step 270: perform quality scoring on the area where the shooting target is located.
Step 280: if the quality score does not fall within the preset score range, perform the focusing operation on the shooting target again.
The preset score range is determined in advance according to the shooting results obtained after at least two cameras synchronously focus on and shoot different areas of the shooting field of view.
The automatic focusing method provided by this embodiment divides the shooting field of view into several sub-focusing areas in advance, controls several cameras to focus on and shoot their corresponding sub-focusing areas synchronously, determines the preset score range from the focused shooting results, and judges from that range and the quality score of the area where the shooting target is located whether the target must be refocused. The method improves the speed and accuracy of determining the score range, simplifies the automatic focusing operation, and ensures the efficiency and accuracy of automatic focusing.
Fig. 3 is a schematic flowchart of another auto-focusing method provided in an embodiment of the present application, which is used to further describe the above embodiment, and includes:
and step 310, determining the position of the shooting target in the shooting field of view.
For example, the position of the shooting target in the shooting field of view may be selected manually by the user in the preview interface on the terminal's display (e.g. by tapping or box selection), or determined automatically after the system recognises and analyses the captured image. Manual and automatic modes may also be combined: for example, the end user may correct the position manually when the system's determination is inaccurate.
Step 320: control the corresponding camera from when the preset score range was determined to acquire the captured image.
When selecting the camera used to acquire this captured image, after the position of the shooting target in the field of view has been determined in step 310, the sub-focusing area that this position belonged to when the preset score range was determined can be identified, and the camera allocated to that sub-focusing area is chosen to acquire the captured image.
For example, suppose that when the preset score range was determined, the field of view was divided into six concentric sub-focusing areas from centre to edge, with the areas of rings 1-3 near the centre focused by the telephoto camera and those of rings 4-6 near the edge focused by the wide-angle camera. If the shooting target lies in the area of the 2nd ring, the telephoto camera that handled the 2nd ring when the preset score range was determined is selected to acquire the captured image.
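The six-ring example might be realised as below. This is a hedged sketch: the ring geometry, the 3/3 split between cameras, and the camera labels are assumptions, not the patent's prescription.

```python
def select_camera(x, y, width, height, rings=6, tele_rings=3):
    """Pick the capture camera from the target position: rings 1-3
    (nearest the centre) were focused by the telephoto camera when
    the preset score range was determined, rings 4-6 by the
    wide-angle camera."""
    dx, dy = x - width / 2, y - height / 2
    radius = max(width, height) / 2
    ring = min(rings, int((dx * dx + dy * dy) ** 0.5 / (radius / rings)) + 1)
    return "telephoto" if ring <= tele_rings else "wide-angle"
```

A target at the centre of a 640 × 480 field selects the telephoto camera; one in a far corner selects the wide-angle camera.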
Step 330: determine the area where the shooting target is located.
Step 340: perform quality scoring on the area where the shooting target is located.
Step 350: if the quality score does not fall within the preset score range, perform the focusing operation on the shooting target again.
The preset score range is determined in advance according to the shooting results obtained after at least two cameras synchronously focus on and shoot different areas of the shooting field of view.
According to the automatic focusing method provided by this embodiment, the camera corresponding to the position of the shooting target in the field of view is selected to acquire the captured image, and the quality score of the area where the target is located is then compared with the preset score range to judge whether the target must be refocused. Selecting different cameras according to the target's position in the field of view further strengthens focusing accuracy and improves image quality.
Fig. 4 is a schematic flowchart of another auto-focusing method provided in an embodiment of the present application, which is used to further describe the above embodiment, and includes:
and step 410, acquiring a shot image and determining the area where the shot target is located.
And step 420, analyzing quality parameters of the area where the shooting target is located.
A quality parameter is a parameter for measuring the quality of the image-capturing effect and may include, for example, one or more of exposure time, white balance, contrast, and sharpness.
When the shooting target is out of focus, the main effect is on image sharpness, so when analysing the quality of the area where the target is located, the sharpness among the quality parameters can be analysed to judge whether the target was captured in an out-of-focus state.
Optionally, many factors affect image sharpness; for example, exposure time, white balance, and contrast settings may also influence it. Therefore, to determine accurately whether poor image quality is caused by defocus, the sharpness can be analysed first and, if the quality is poor, the other quality parameters such as exposure time, white balance, and contrast analysed to decide whether defocus is the cause.
Step 430: perform quality scoring on the area where the shooting target is located according to the analysis results.
Optionally, a scoring system may be set in advance for each quality parameter. For example, the system for sharpness may divide image sharpness into several grades by degree of blur, a higher grade meaning worse sharpness. As another example, the system for white balance may hold that the closer the grey-value ratio of the image's red, green, and blue primaries is to 1 : 1 : 1, the closer the area where the shooting target is located is to the real target, and the closer the quality score is to the preset score range.
In the embodiment of the application, if there is a single quality parameter, its scoring result is used as the final quality score of the area where the shooting target is located. If there are multiple quality parameters, their scoring results may be averaged to obtain the final quality score of the area; alternatively, a weight value may be set for each quality parameter, and the weighted scores summed to obtain the quality score of the area. Optionally, the weight value of each quality parameter may be set by default according to the characteristics of out-of-focus images: for example, sharpness is the most important parameter for judging whether an image is out of focus, while exposure time matters relatively little, so the weight of sharpness may be set larger and the weight of exposure time smaller. The weights may also be set manually by the user as needed.
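The weighted-sum option above can be sketched as follows. The parameter names, the 0–100 scale, and the particular weight values are illustrative assumptions; the patent only requires that sharpness receive the largest weight.

```python
# Hypothetical default weights: sharpness dominates because it is the
# strongest indicator of defocus, exposure time the weakest.
WEIGHTS = {"sharpness": 0.6, "contrast": 0.2, "white_balance": 0.1, "exposure": 0.1}

def quality_score(scores: dict) -> float:
    """Weighted sum of per-parameter scores for the target region."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

score = quality_score({"sharpness": 40.0, "contrast": 80.0,
                       "white_balance": 90.0, "exposure": 85.0})
# With a single quality parameter, its score would simply be used directly.
```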
And 440, if the quality score does not meet the preset score range, carrying out focusing operation on the shooting target again.
The preset scoring range is determined according to the shooting results of different focused areas after focusing shooting is performed on different areas of a shooting visual field synchronously through at least two cameras in advance.
According to the automatic focusing method provided by the embodiment of the application, the quality of the area where the shooting target is located is analyzed, a quality score of the area is determined, and the quality score is compared with the preset score range to judge whether the shooting target needs to be refocused. This simplifies the focusing operation and improves focusing efficiency and accuracy.
Fig. 5 is a schematic flowchart of another auto-focusing method provided in an embodiment of the present application, which is used to further describe the above embodiment, and includes:
and step 510, acquiring a shot image, and determining the area where the shot target is located.
And step 520, performing quality scoring on the area where the shooting target is located.
And step 530, if the quality score does not meet the preset score range, re-acquiring the shot image for quality scoring.
For example, a quality score that does not satisfy the preset score range may be caused not by inaccurate focusing but by external environmental factors. For instance, shooting may be blurred because the user's hand shakes while holding the terminal device, or because a gust of wind makes the flowers on a tree sway while they are being photographed. In such cases the quality score fails the preset score range, yet this does not indicate that autofocus was inaccurate; the failure is caused by uncontrollable external environmental factors. Therefore, when the quality score does not satisfy the preset score range, instead of immediately refocusing the shooting target, steps 510 to 520 may be executed again: the captured image is re-acquired with the same focusing parameters and scored again. Scoring images acquired multiple times rules out blur caused by external environmental factors rather than by inaccurate autofocus, reducing unnecessary focusing operations. For example, if the quality score of the re-acquired image satisfies the preset score range on the second attempt, shooting can continue with the original focusing parameters, saving one unnecessary focusing operation.
And 540, if the quality score of the re-acquired shot image does not meet the preset score range, re-focusing the shot target.
For example, if the quality score of the re-acquired captured image still does not satisfy the preset score range, this indicates that the failing score is caused by the shooting target being out of focus rather than by uncontrollable external environmental factors, and the focusing operation needs to be performed on the shooting target again.
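The retry logic of steps 510–540 can be sketched as below. The pass predicate, the retry count, and the function names are assumptions for illustration; the patent only specifies "re-acquire with the same focusing parameters, score again, refocus only if it still fails."

```python
def refocus_needed(capture, score, passes=lambda s: 60.0 <= s <= 100.0, retries=1):
    """Re-capture with the same focus parameters before declaring defocus,
    so transient blur (hand shake, subject motion) does not trigger refocusing."""
    for _ in range(retries + 1):
        if passes(score(capture())):
            return False          # a pass on any attempt: keep original focus
    return True                   # consistently failing score: refocus the target
```

With `retries=1`, a first frame blurred by hand shake followed by a sharp second frame keeps the original focusing parameters; two failing frames in a row trigger refocusing.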
According to the automatic focusing method provided by the embodiment of the application, when the quality score of the area where the shooting target is located does not satisfy the preset score range, the image of that area is acquired multiple times and each quality score is compared with the preset score range; refocusing is performed only if the score still fails. This avoids starting a refocusing operation because of a failing score caused by uncontrollable external factors, reduces unnecessary refocusing operations, and improves the efficiency and accuracy of automatic focusing.
Fig. 6 is a schematic flowchart of another auto-focusing method provided in an embodiment of the present application, which is used to further describe the above embodiment, and includes:
and step 610, acquiring a shot image and determining the area where the shot target is located.
And step 620, performing quality scoring on the area where the shooting target is located.
Step 630, judging whether the quality score of the area where the shooting target is located meets a preset score range, if so, executing step 670, and if not, executing step 640.
The quality score may fail to satisfy the preset score range because the determined area of the shooting target deviates from the target's actual position. Therefore, when the quality score does not satisfy the preset score range, the shooting target need not be refocused immediately; instead, step 640 is performed to enlarge the area where the shooting target is located, and step 620 is performed again to score the quality of the enlarged area. If the quality score of the enlarged area satisfies the preset score range, step 670 is executed and the image is still captured with the original focusing parameters.
And step 640, enlarging the area where the shooting target is located, and performing quality scoring again.
Optionally, the area where the shooting target is located may be enlarged in many ways, and the application is not limited in this respect. For example, when the area is a region surrounded by the edges of the shooting target, enlarging it may mean extending the original area upward, downward, leftward, and rightward by a preset range (for example, 10 pixels in each direction) to obtain a new area. When the area belongs to a sub-area of a pre-divided shooting field of view, the area may be enlarged by merging the original sub-area with each adjacent sub-area. For example, when the shooting field of view is divided in advance into a plurality of sub-areas in the form of concentric circles from the center to the edge, and the area determined in step 610 is the innermost circle, the first and second concentric circles may be merged, and the merged region used as the enlarged area where the shooting target is located.
In the embodiment of the present application, after the area where the shooting target is located is enlarged, step 620 is performed again, and quality scoring is performed on the enlarged area where the shooting target is located.
Step 650, judging whether the quality score of the area where the shooting target is located after the range is expanded meets a preset score range, if so, executing step 670, and if not, executing step 660.
For example, if the quality score of the enlarged area still does not satisfy the preset score range, this indicates that the failing score is caused by the shooting target being out of focus rather than by a deviation in the determined target area, and step 660 is executed to refocus the shooting target. If the quality score of the enlarged area satisfies the preset score range, autofocus on the shooting target was accurate and the earlier failing score resulted from a deviation in the determined position of the shooting target; step 670 is then executed to capture the image with the original focusing parameters.
And 660, carrying out focusing operation on the shooting target again.
And step 670, shooting the image by the original focusing parameters.
The original focusing parameters may be the focusing parameters used when the captured image is obtained in step 610.
According to the automatic focusing method provided by the embodiment of the application, when the quality score of the area where the shooting target is located does not satisfy the preset score range, the area is enlarged, re-scored, and compared with the preset score range again; refocusing is performed only if the score still fails. This avoids starting a refocusing operation because the determined target area deviated from the target's actual position, reduces unnecessary refocusing operations, and improves focusing efficiency and accuracy.
Fig. 7 is a schematic structural diagram of an auto-focusing apparatus according to an embodiment of the present disclosure. As shown in fig. 7, the apparatus includes: an obtaining module 710, an area determining module 720, a scoring module 730, and a determining module 740.
An acquisition module 710 for acquiring a photographed image;
an area determining module 720, configured to determine an area where a shooting target is located in the shooting image obtained by the obtaining module 710;
a scoring module 730, configured to perform quality scoring on the area where the shooting target is located, as determined by the area determining module 720;
a determining module 740, configured to perform focusing operation on the shooting target again if the quality score of the scoring module 730 does not meet a preset scoring range;
the preset scoring range is determined according to the shooting results of the focused different areas after the focusing shooting is performed on the different areas of the shooting visual field synchronously through at least two cameras in advance.
Further, the above apparatus further comprises:
the focusing area dividing module is used for dividing the shooting visual field into at least two sub-focusing areas in advance;
the focusing area distribution module is used for distributing different sub focusing areas for the at least two cameras according to a preset rule;
and the focusing shooting control module is used for controlling the at least two cameras to synchronously focus and shoot each distributed sub focusing area.
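The division and assignment performed by these modules can be sketched as below. The round-robin rule is only one possible "preset rule," and the names are illustrative assumptions; the patent leaves the rule and the sub-area shapes open.

```python
def assign_sub_areas(sub_areas, cameras):
    """Round-robin assignment of pre-divided sub-focus areas to cameras.
    Each camera then focuses and shoots its assigned areas while the
    other cameras cover theirs in parallel."""
    plan = {cam: [] for cam in cameras}
    for i, area in enumerate(sub_areas):
        plan[cameras[i % len(cameras)]].append(area)
    return plan

# Four sub-focus areas split across two cameras.
plan = assign_sub_areas(["A", "B", "C", "D"], ["cam0", "cam1"])
```

The per-area shots produced this way are what the scoring-range module later scores to derive the preset scoring range.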
Further, the scoring module 730 is further configured to perform quality scoring on the focused shooting results of different areas respectively;
the above-mentioned device still includes: and the scoring range determining module is used for determining a preset scoring range according to each quality score.
Further, the obtaining module 710 is configured to determine a position of the shooting target in the shooting field of view;
and controlling the corresponding camera of the position to acquire the shot image when the preset scoring range is determined.
Further, the scoring module 730 is configured to perform quality parameter analysis on the area where the shooting target is located;
and performing quality scoring on the area where the shooting target is located according to the analysis result.
Further, the determining module 740 is configured to, if the quality score does not meet a preset scoring range, re-acquire the captured image for quality scoring;
and if the quality score of the re-acquired shot image does not meet the preset score range, carrying out focusing operation on the shot target again.
Further, the determining module 740 is configured to, if the quality score does not meet a preset score range, enlarge an area where the shooting target is located and perform quality scoring again;
and if the repeated quality grading still does not meet the preset grading range, performing focusing operation on the shooting target again.
In the auto-focusing apparatus provided in the embodiment of the present application, the obtaining module 710 first obtains a captured image, and the area determining module 720 determines the area where the shooting target is located in that image; the scoring module 730 then scores the quality of the area determined by the area determining module 720; finally, if the quality score from the scoring module 730 does not satisfy the preset scoring range, the judging module 740 performs the focusing operation on the shooting target again. The preset scoring range is determined according to the shooting results of different focused areas after focusing shooting is performed on different areas of the shooting field of view synchronously through at least two cameras in advance. According to the embodiment of the application, whether refocusing is needed can be determined from the relation between the quality score of the area where the shooting target is located and the preset scoring range, which solves the problem that an inaccurate focusing result cannot be detected in time, simplifies the focusing operation, and improves the efficiency and accuracy of automatic focusing.
The device can execute the methods provided by all the embodiments of the application, and has corresponding functional modules and beneficial effects for executing the methods. For details of the technology not described in detail in this embodiment, reference may be made to the methods provided in all the foregoing embodiments of the present application.
Fig. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 8, the terminal may include: a housing (not shown), a memory 801, a Central Processing Unit (CPU) 802 (also called a processor, hereinafter referred to as CPU), a computer program stored in the memory 801 and operable on the processor 802, a circuit board (not shown), and a power circuit (not shown). The circuit board is arranged in a space enclosed by the shell; the CPU802 and the memory 801 are provided on the circuit board; the power supply circuit is used for supplying power to each circuit or device of the terminal; the memory 801 is used for storing executable program codes; the CPU802 executes a program corresponding to the executable program code by reading the executable program code stored in the memory 801.
The terminal further comprises: a peripheral interface 803, RF (Radio Frequency) circuitry 805, audio circuitry 806, a speaker 811, a power management chip 808, an input/output (I/O) subsystem 809, a touch screen 812, other input/control devices 810, and an external port 804, which communicate over one or more communication buses or signal lines 807.
It should be understood that the illustrated terminal device 800 is merely one example of a terminal, and that the terminal device 800 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The following describes in detail a terminal device provided in this embodiment, where the terminal device is a smart phone as an example.
A memory 801, which is accessible by the CPU 802, the peripheral interface 803, and the like. The memory 801 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices.
A peripheral interface 803, said peripheral interface 803 allowing input and output peripherals of the device to be connected to the CPU802 and the memory 801.
I/O subsystem 809, which I/O subsystem 809 may connect input and output peripherals on the device, such as touch screen 812 and other input/control devices 810, to peripheral interface 803. The I/O subsystem 809 may include a display controller 8091 and one or more input controllers 8092 for controlling other input/control devices 810. Where one or more input controllers 8092 receive electrical signals from or transmit electrical signals to other input/control devices 810, other input/control devices 810 may include physical buttons (push buttons, rocker buttons, etc.), dials, slide switches, joysticks, click wheels. It is worth noting that the input controller 8092 may be connected to any of the following: a keyboard, an infrared port, a USB interface, and a pointing device such as a mouse.
Classified by operating principle and by the medium used to transmit information, the touch screen 812 may be a resistive type, a capacitive type, an infrared type, or a surface acoustic wave type. Classified by installation method, the touch screen 812 may be external, internal, or integrated. Classified by technical principle, the touch screen 812 may be a vector pressure sensing, resistive, capacitive, infrared, or surface acoustic wave touch screen.
A touch screen 812, which touch screen 812 is an input interface and an output interface between the user terminal and the user, displays visual output to the user, which may include graphics, text, icons, video, and the like. Optionally, the touch screen 812 sends an electrical signal (e.g., an electrical signal of the touch surface) triggered by the user on the touch screen to the processor 802.
The display controller 8091 in the I/O subsystem 809 receives electrical signals from the touch screen 812 or sends electrical signals to the touch screen 812. The touch screen 812 detects a contact on the touch screen, and the display controller 8091 converts the detected contact into an interaction with a user interface object displayed on the touch screen 812, that is, implements a human-computer interaction, and the user interface object displayed on the touch screen 812 may be an icon for running a game, an icon networked to a corresponding network, or the like. It is worth mentioning that the device may also comprise a light mouse, which is a touch sensitive surface that does not show visual output, or an extension of the touch sensitive surface formed by the touch screen.
The RF circuit 805 is mainly used to establish communication between the terminal and a wireless network (i.e., the network side) and to receive and transmit data between the terminal and the wireless network, for example sending and receiving short messages, e-mails, and the like.
The audio circuit 806 is mainly used to receive audio data from the peripheral interface 803, convert the audio data into an electric signal, and transmit the electric signal to the speaker 811.
Speaker 811 is used to convert the voice signals received by the terminal from the wireless network through RF circuit 805 into sound and play the sound to the user.
And the power management chip 808 is used for supplying power and managing power to the hardware connected with the CPU802, the I/O subsystem and the peripheral interface.
In this embodiment, the CPU 802 is configured to:
acquiring a shot image and determining the area where a shot target is located;
performing quality scoring on the area where the shooting target is located;
if the quality score does not meet the preset score range, carrying out focusing operation on the shooting target again;
the preset scoring range is determined according to the shooting results of the focused different areas after the focusing shooting is performed on the different areas of the shooting visual field synchronously through at least two cameras in advance.
Further, the synchronous focusing shooting of different areas of the shooting visual field by at least two cameras in advance comprises:
dividing a shooting visual field into at least two sub-focusing areas in advance;
distributing different sub-focusing areas for at least two cameras according to a preset rule;
and controlling the at least two cameras to synchronously focus and shoot each distributed sub-focusing area.
Further, determining a preset scoring range according to the focused shooting results of different areas, including:
respectively carrying out quality grading on the shot results of the different focused areas;
and determining a preset grading range according to each quality grade.
Further, the acquiring the shot image includes:
determining the position of a shooting target in a shooting field of view;
and controlling the corresponding camera of the position to acquire the shot image when the preset scoring range is determined.
Further, the quality scoring of the area where the shooting target is located includes:
analyzing quality parameters of the area where the shooting target is located;
and performing quality scoring on the area where the shooting target is located according to the analysis result.
Further, if the quality score does not satisfy a preset score range, performing focusing operation on the shooting target again includes:
if the quality score does not meet the preset scoring range, re-acquiring the shot image for quality scoring;
and if the quality score of the re-acquired shot image does not meet the preset score range, carrying out focusing operation on the shot target again.
Further, if the quality score does not satisfy a preset score range, performing focusing operation on the shooting target again includes:
if the quality score does not meet the preset scoring range, enlarging the area where the shooting target is located and re-scoring the quality;
and if the repeated quality grading still does not meet the preset grading range, performing focusing operation on the shooting target again.
Embodiments of the present application further provide a storage medium containing terminal device executable instructions, which when executed by a terminal device processor, are configured to perform an auto-focusing method, where the method includes:
acquiring a shot image and determining the area where a shot target is located;
performing quality scoring on the area where the shooting target is located;
if the quality score does not meet the preset score range, carrying out focusing operation on the shooting target again;
the preset scoring range is determined according to the shooting results of the focused different areas after the focusing shooting is performed on the different areas of the shooting visual field synchronously through at least two cameras in advance.
Further, the synchronous focusing shooting of different areas of the shooting visual field by at least two cameras in advance comprises:
dividing a shooting visual field into at least two sub-focusing areas in advance;
distributing different sub-focusing areas for at least two cameras according to a preset rule;
and controlling the at least two cameras to synchronously focus and shoot each distributed sub-focusing area.
Further, determining a preset scoring range according to the focused shooting results of different areas, including:
respectively carrying out quality grading on the shot results of the different focused areas;
and determining a preset grading range according to each quality grade.
Further, the acquiring the shot image includes:
determining the position of a shooting target in a shooting field of view;
and controlling the corresponding camera of the position to acquire the shot image when the preset scoring range is determined.
Further, the quality scoring of the area where the shooting target is located includes:
analyzing quality parameters of the area where the shooting target is located;
and performing quality scoring on the area where the shooting target is located according to the analysis result.
Further, if the quality score does not satisfy a preset score range, performing focusing operation on the shooting target again includes:
if the quality score does not meet the preset scoring range, re-acquiring the shot image for quality scoring;
and if the quality score of the re-acquired shot image does not meet the preset score range, carrying out focusing operation on the shot target again.
Further, if the quality score does not satisfy a preset score range, performing focusing operation on the shooting target again includes:
if the quality score does not meet the preset scoring range, enlarging the area where the shooting target is located and re-scoring the quality;
and if the repeated quality grading still does not meet the preset grading range, performing focusing operation on the shooting target again.
The computer storage media of the embodiments of the present application may take any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, or the like, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
Of course, the storage medium provided in the embodiments of the present application contains computer-executable instructions, and the computer-executable instructions are not limited to the autofocus operations described above, and may also perform related operations in the autofocus method provided in any embodiments of the present application.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present application and the technical principles employed. It will be understood by those skilled in the art that the present application is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the application. Therefore, although the present application has been described in more detail with reference to the above embodiments, the present application is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present application, and the scope of the present application is determined by the scope of the appended claims.

Claims (9)

1. An auto-focusing method, comprising:
acquiring a shot image and determining the area where a shot target is located;
performing quality scoring on the area where the shooting target is located;
if the quality score does not meet the preset score range, carrying out focusing operation on the shooting target again;
the preset scoring range is determined according to the focused shooting results of different areas after focusing shooting is performed on the different areas of the shooting visual field synchronously through at least two cameras in advance; the shooting view is a range which can be shot by a camera on the terminal equipment;
wherein performing focusing shooting on different areas of the shooting field of view synchronously through at least two cameras in advance comprises:
dividing a shooting visual field into at least two sub-focusing areas in advance;
distributing different sub-focusing areas for at least two cameras according to a preset rule;
and controlling the at least two cameras to synchronously focus and shoot each distributed sub-focusing area.
2. The auto-focusing method of claim 1, wherein determining the preset score range according to the focused shooting results of the different regions comprises:
performing quality scoring on each of the focused shooting results of the different regions; and
determining the preset score range according to the respective quality scores.
3. The auto-focusing method of claim 1, wherein acquiring the captured image comprises:
determining a position of the shooting target in the shooting field of view; and
controlling the camera that corresponded to that position when the preset score range was determined to acquire the captured image.
4. The auto-focusing method of claim 1, wherein performing quality scoring on the region in which the shooting target is located comprises:
analyzing quality parameters of the region in which the shooting target is located; and
performing quality scoring on the region in which the shooting target is located according to the analysis result.
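Claim 4's two steps, analyzing quality parameters and then scoring from the analysis, could look like the sketch below. The chosen parameters (brightness, a variance-based contrast proxy) and the weights are hypothetical; the patent does not specify which parameters are used.

```python
def analyze_parameters(pixels):
    """Step 1: compute simple quality parameters for a flat list of pixel
    values from the target area."""
    mean = sum(pixels) / len(pixels)                               # brightness
    variance = sum((p - mean) ** 2 for p in pixels) / len(pixels)  # contrast proxy
    return {"brightness": mean, "contrast": variance}

def score_area(pixels, weights=None):
    """Step 2: fold the analyzed parameters into a single quality score
    using (assumed) fixed weights."""
    if weights is None:
        weights = {"brightness": 0.3, "contrast": 0.7}
    params = analyze_parameters(pixels)
    return sum(weights[k] * v for k, v in params.items())
```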
5. The auto-focusing method of claim 1, wherein performing the focusing operation on the shooting target again if the quality score does not satisfy the preset score range comprises:
if the quality score does not satisfy the preset score range, re-acquiring the captured image and performing quality scoring again; and
if the quality score of the re-acquired captured image still does not satisfy the preset score range, performing the focusing operation on the shooting target again.
6. The auto-focusing method of claim 1, wherein performing the focusing operation on the shooting target again if the quality score does not satisfy the preset score range comprises:
if the quality score does not satisfy the preset score range, enlarging the region in which the shooting target is located and performing quality scoring again; and
if the repeated quality score still does not satisfy the preset score range, performing the focusing operation on the shooting target again.
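The two retry strategies of claims 5 and 6 (re-capture and re-score, or enlarge the target area and re-score, before triggering a refocus) can be combined into one hedged sketch. The callbacks `score`, `capture`, `enlarge`, and `refocus` are illustrative stand-ins supplied by the caller; the patent claims each strategy separately rather than chained as shown here.

```python
def refocus_with_retry(score, capture, enlarge, refocus, area, preset_range):
    """Return a label describing which stage satisfied the preset score
    range; call refocus() only if every retry fails."""
    lo, hi = preset_range
    if lo <= score(capture(), area) <= hi:
        return "in-focus"
    # Claim 5: re-acquire the captured image and score again.
    if lo <= score(capture(), area) <= hi:
        return "in-focus-on-retry"
    # Claim 6: enlarge the region where the target is located and score again.
    if lo <= score(capture(), enlarge(area)) <= hi:
        return "in-focus-enlarged"
    refocus()  # all checks failed: perform the focusing operation again
    return "refocused"
```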
7. An auto-focusing apparatus, comprising:
an acquiring module, configured to acquire a captured image;
a region determining module, configured to determine a region in which a shooting target is located in the captured image acquired by the acquiring module;
a scoring module, configured to perform quality scoring on the region in which the shooting target is located, as determined by the region determining module; and
a judging module, configured to perform a focusing operation on the shooting target again if the quality score from the scoring module does not satisfy a preset score range;
wherein the preset score range is determined, according to focused shooting results of different regions, after the different regions of a shooting field of view are synchronously focused on and shot in advance by at least two cameras; the shooting field of view is the range that a camera on the terminal device is able to capture; and synchronously focusing on and shooting the different regions of the shooting field of view in advance by the at least two cameras comprises: dividing the shooting field of view in advance into at least two sub-focusing regions; assigning different sub-focusing regions to the at least two cameras according to a preset rule; and controlling the at least two cameras to synchronously focus on and shoot their respectively assigned sub-focusing regions.
8. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the auto-focusing method as claimed in any one of claims 1 to 6.
9. A terminal comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the auto-focusing method as claimed in any one of claims 1 to 6 when executing the computer program.
CN201810935304.5A 2018-08-16 2018-08-16 Automatic focusing method and device, storage medium and terminal Active CN108769538B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810935304.5A CN108769538B (en) 2018-08-16 2018-08-16 Automatic focusing method and device, storage medium and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810935304.5A CN108769538B (en) 2018-08-16 2018-08-16 Automatic focusing method and device, storage medium and terminal

Publications (2)

Publication Number Publication Date
CN108769538A CN108769538A (en) 2018-11-06
CN108769538B true CN108769538B (en) 2020-09-29

Family

ID=63966425

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810935304.5A Active CN108769538B (en) 2018-08-16 2018-08-16 Automatic focusing method and device, storage medium and terminal

Country Status (1)

Country Link
CN (1) CN108769538B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110381259B (en) * 2019-08-13 2021-08-31 广州欧科信息技术股份有限公司 Mural image acquisition method and device, computer equipment and storage medium
CN112585941A (en) * 2019-12-30 2021-03-30 深圳市大疆创新科技有限公司 Focusing method and device, shooting equipment, movable platform and storage medium
CN111935479B (en) * 2020-07-30 2023-01-17 浙江大华技术股份有限公司 Target image determination method and device, computer equipment and storage medium

Citations (4)

Publication number Priority date Publication date Assignee Title
CN101339347A (en) * 2007-07-06 2009-01-07 鸿富锦精密工业(深圳)有限公司 Automatic focusing method and system
CN104270562A (en) * 2014-08-15 2015-01-07 广东欧珀移动通信有限公司 Method and device for focusing when photographing
CN105611158A (en) * 2015-12-23 2016-05-25 北京奇虎科技有限公司 Automatic focus following method and device and user equipment
CN106031155A (en) * 2014-09-26 2016-10-12 深圳市大疆创新科技有限公司 System and method for automatic focusing based on statistic data

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
CN103856708B (en) * 2012-12-03 2018-06-29 原相科技股份有限公司 The method and photographic device of auto-focusing
US9282235B2 (en) * 2014-05-30 2016-03-08 Apple Inc. Focus score improvement by noise correction
JP6525934B2 (en) * 2016-10-14 2019-06-05 キヤノン株式会社 Image processing apparatus and control method
CN108184070B (en) * 2018-03-23 2020-09-08 维沃移动通信有限公司 Shooting method and terminal

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
CN101339347A (en) * 2007-07-06 2009-01-07 鸿富锦精密工业(深圳)有限公司 Automatic focusing method and system
CN104270562A (en) * 2014-08-15 2015-01-07 广东欧珀移动通信有限公司 Method and device for focusing when photographing
CN106031155A (en) * 2014-09-26 2016-10-12 深圳市大疆创新科技有限公司 System and method for automatic focusing based on statistic data
CN105611158A (en) * 2015-12-23 2016-05-25 北京奇虎科技有限公司 Automatic focus following method and device and user equipment

Also Published As

Publication number Publication date
CN108769538A (en) 2018-11-06

Similar Documents

Publication Publication Date Title
CN108668086B (en) Automatic focusing method and device, storage medium and terminal
CN111885294B (en) Shooting method, device and equipment
CN109547701B (en) Image shooting method and device, storage medium and electronic equipment
US9456141B2 (en) Light-field based autofocus
KR102566998B1 (en) Apparatus and method for determining image sharpness
CN108769538B (en) Automatic focusing method and device, storage medium and terminal
CN109194866B (en) Image acquisition method, device, system, terminal equipment and storage medium
CN108647351B (en) Text image processing method and device, storage medium and terminal
CN111182212B (en) Image processing method, image processing apparatus, storage medium, and electronic device
CN109040523B (en) Artifact eliminating method and device, storage medium and terminal
US9838594B2 (en) Irregular-region based automatic image correction
CN103188434A (en) Method and device of image collection
CN108665510B (en) Rendering method and device of continuous shooting image, storage medium and terminal
CN114913121A (en) Screen defect detection system and method, electronic device and readable storage medium
CN110463177A (en) The bearing calibration of file and picture and device
TW201310322A (en) Electronic book display adjustment system and method
US8983227B2 (en) Perspective correction using a reflection
CN109040729B (en) Image white balance correction method and device, storage medium and terminal
JP6283329B2 (en) Augmented Reality Object Recognition Device
CN112449165B (en) Projection method and device and electronic equipment
CN111050081B (en) Shooting method and electronic equipment
CN112948048A (en) Information processing method, information processing device, electronic equipment and storage medium
CN111917986A (en) Image processing method, medium thereof, and electronic device
CN109672829B (en) Image brightness adjusting method and device, storage medium and terminal
CN112584110B (en) White balance adjusting method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant