WO2021217427A1 - Image processing method and apparatus, photographing device, movable platform, and storage medium - Google Patents


Info

Publication number
WO2021217427A1
WO2021217427A1 · PCT/CN2020/087530 · CN2020087530W
Authority
WO
WIPO (PCT)
Prior art keywords: filter, image, scene, frame, value
Prior art date
Application number
PCT/CN2020/087530
Other languages
English (en)
French (fr)
Inventor
郑子翔
韩守谦
梁大奖
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/087530
Priority to CN202080004975.5A
Publication of WO2021217427A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/20: Image enhancement or restoration using local operators
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10016: Video; Image sequence

Definitions

  • The embodiments of the present invention relate to the field of image processing technology, and in particular to an image processing method and apparatus, a photographing device, a movable platform, and a storage medium.
  • a filter can be used to filter the captured image to assist in the realization of the autofocus function.
  • There are many types of filters, and each type has many adjustable parameters.
  • Before a camera leaves the factory, filters are often selected for it based on manual experience. This takes a lot of time, and the selected filter may not perform well, leading to a poor shooting effect.
  • the embodiments of the present invention provide an image processing method, device, photographing equipment, movable platform, and storage medium to solve the technical problem of low efficiency in selecting filters for cameras in the prior art.
  • the first aspect of the embodiments of the present invention provides an image processing method, including:
  • acquiring multiple frames of images to be processed, where the multiple frames of images to be processed include multiple frames of images taken during the movement of the lens, the multiple frames of images to be processed are multiple frames of images corresponding to a scene, and the multiple frames of images corresponding to the scene are obtained by shooting scenery that matches the scene;
  • filtering the multiple frames of images to be processed through a first filter to be analyzed, to obtain a filtering result corresponding to each frame of image;
  • evaluating the first filter according to the filtering result corresponding to each frame of image; and
  • determining, according to the evaluation result corresponding to the first filter, whether the first filter is a filter suitable for the scene.
  • a second aspect of the embodiments of the present invention provides an image processing device, including:
  • a memory, configured to store a computer program; and
  • a processor, configured to run the computer program stored in the memory to implement:
  • acquiring multiple frames of images to be processed, where the multiple frames of images to be processed include multiple frames of images taken during the movement of the lens, the multiple frames of images to be processed are multiple frames of images corresponding to a scene, and the multiple frames of images corresponding to the scene are obtained by shooting scenery that matches the scene;
  • filtering the multiple frames of images to be processed through a first filter to be analyzed, to obtain a filtering result corresponding to each frame of image;
  • evaluating the first filter according to the filtering result corresponding to each frame of image; and
  • determining, according to the evaluation result corresponding to the first filter, whether the first filter is a filter suitable for the scene.
  • a third aspect of the embodiments of the present invention provides an image processing device, including:
  • an acquisition circuit, configured to acquire multiple frames of images to be processed, where the multiple frames of images to be processed include multiple frames of images taken during the movement of the lens, the multiple frames of images to be processed are multiple frames of images corresponding to a scene, and the multiple frames of images corresponding to the scene are obtained by shooting scenery that matches the scene;
  • a filter circuit configured to filter the multiple frames of images to be processed through the first filter to be analyzed, to obtain a filtering result corresponding to each frame of image
  • An evaluation circuit configured to evaluate the first filter according to the filtering result corresponding to each frame of image
  • the determining circuit is configured to determine whether the first filter is a filter suitable for the scene according to the evaluation result corresponding to the first filter.
  • a fourth aspect of the embodiments of the present invention provides a photographing device, including the image processing device described in the second aspect.
  • a fifth aspect of the embodiments of the present invention provides a photographing device, including the image processing device described in the third aspect.
  • a sixth aspect of the embodiments of the present invention provides a movable platform, including the photographing device described in the fourth aspect.
  • a seventh aspect of the embodiments of the present invention provides a movable platform, including the photographing device described in the fifth aspect.
  • An eighth aspect of the embodiments of the present invention provides a computer-readable storage medium having program instructions stored in the computer-readable storage medium, and the program instructions are used to implement the image processing method described in the first aspect.
  • The image processing method and apparatus, photographing device, movable platform, and storage medium provided by the embodiments of the present invention can quickly determine whether the first filter is suitable for the current scene, improve the efficiency of selecting a filter for the scene, and improve the shooting effect.
  • FIG. 1 is a schematic flowchart of an image processing method provided by an embodiment of the present invention
  • FIG. 2 is a schematic flowchart of another image processing method provided by an embodiment of the present invention.
  • FIG. 3 is a schematic diagram of a process for evaluating a first filter according to an embodiment of the present invention
  • FIG. 4 is a schematic diagram of a focal value curve formed by numerical points according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of a process for determining a score corresponding to a first filter according to a focal value curve according to an embodiment of the present invention
  • Fig. 6 is a schematic diagram of a focal value curve provided by an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of another focal value curve provided by an embodiment of the present invention.
  • FIG. 8 is a schematic flowchart of yet another image processing method provided by an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of an image processing device provided by an embodiment of the present invention.
  • FIG. 10 is a schematic structural diagram of another image processing apparatus provided by an embodiment of the present invention.
  • the embodiment of the present invention provides an image processing method.
  • the execution subject of the method may be an image processing device in a photographing device. It is understood that the photographing device may be any device with a photographing function such as a camera.
  • the processing device may be implemented as software or a combination of software and hardware.
  • The image processing method provided by the embodiment of the present invention can obtain multiple frames of images to be processed corresponding to a scene, filter the multiple frames of images to be processed through the first filter to be analyzed to obtain the filtering result corresponding to each frame of image, and evaluate the first filter according to the obtained filtering results to determine whether the first filter is a filter suitable for the scene.
  • the corresponding filter can be determined for a scene by processing the captured image.
  • There are many types of filters, each with many parameters, and different filters may lead to different final shooting effects. Therefore, different filters can be set for different shooting scenes, so as to better present the photographed scenery to the user and meet the shooting needs of various scenes.
  • FIG. 1 is a schematic flowchart of an image processing method provided by an embodiment of the present invention. As shown in Figure 1, the image processing method in this embodiment may include:
  • Step 101 Acquire multiple frames of images to be processed, where the multiple frames of images to be processed include multiple frames of images taken during the lens movement process, where the multiple frames of images to be processed are multiple frames of images corresponding to a scene,
  • The multiple frames of images corresponding to the scene are obtained by shooting scenery that matches the scene.
  • the method in the embodiment of the present invention can be used to determine a corresponding filter for a scene.
  • The scene can be any scene, such as an indoor scene or an outdoor scene, and is recorded as the scene to be analyzed; the scenery photographed can be any scenery that conforms to that scene. Assuming that the scene to be analyzed is an indoor scene, that is, a filter corresponding to the indoor scene needs to be determined, the scenery may be any scenery that conforms to the indoor scene, such as a living room, a bedroom, or a kitchen.
  • the photographing device may include a lens, a focus motor, an image sensor, etc.
  • the light reflected by a scene passes through the lens and then condenses on the image sensor.
  • the image sensor converts the light signal into an electrical signal, thereby forming an image.
  • the focus motor can drive the lens to move, thereby changing the object distance, resulting in continuous changes in the clarity of the shot scene in the image.
  • the movable range of the lens can be determined by the stroke of the focus motor, and the movement of the lens can be done manually or automatically.
  • Multiple frames of images corresponding to the scene taken during the movement of the lens can be acquired as the images to be processed, where the multiple frames may be all the images taken within the movable range of the lens, or only part of them.
  • the image may be a PNG image or an image in other formats.
  • Step 102 Filter the multiple frames of images to be processed through the first filter to be analyzed, to obtain a filtering result corresponding to each frame of image.
  • the first filter may be an infinite impulse response (Infinity Impulse Response, IIR) filter or a finite impulse response (Finite Impulse Response, FIR) filter.
  • A filter may be selected from a filter set including a plurality of filters as the first filter to be analyzed. The filters in the set differ from one another in type, in parameters, or in both.
  • The function of the first filter is to capture the effective signal in the image; subsequent focus processing can then be completed according to the filtering result. If the design of the first filter is unreasonable, the focusing result may be inaccurate, which can easily lead to focus failure and a blurred final image, affecting the user experience.
  • Step 103 Evaluate the first filter according to the filtering result corresponding to each frame of image.
  • the multi-frame images may be respectively subjected to filtering processing through the first filter, and then the first filter is evaluated according to the result obtained after filtering.
  • The first filter may be scored according to the filtering result: the better the filtering result meets expectations, the higher the score, and vice versa. Alternatively, it may be judged whether the first filter is qualified according to the filtering result: if a preset condition is met, the first filter is considered qualified; otherwise, it is considered unqualified.
  • the specific evaluation strategy for the first filter such as how to score or how to judge whether it is qualified, can be set according to actual needs. It is understandable that different cameras may have different shooting indicators or shooting requirements, and the shooting indicators or shooting requirements may be referred to when determining the specific evaluation strategy of the first filter. In addition, the specific evaluation strategies corresponding to different scenarios may be the same or different, which is not limited in this embodiment.
  • the sharpness of the image can be determined according to the filtering result of each frame of image, and the first filter can be evaluated according to the sharpness.
  • For example, if the final presentation effect of the image is what matters, the sharpness of each frame of image can be determined according to the filtering result, the image with the highest sharpness can be found among the frames, and it can be determined whether that sharpness meets the requirements. If the sharpest image does not meet the requirements, the first filter can be considered unqualified; otherwise, it can be considered qualified.
  • The filter parameters of the first filter to be analyzed are adjusted, the first filter with the adjusted parameters is used to filter the above multiple frames of images to be processed to obtain a new filtering result corresponding to each image, and the new filtering results are evaluated to determine whether the first filter with the adjusted parameters is a filter suitable for the scene.
  • Alternatively, another filter is selected from a filter set including a plurality of filters, the other filter is used to filter the above-mentioned multiple frames of images to be processed to obtain a new filtering result corresponding to each image, and the results are evaluated to determine whether the other filter is a filter suitable for the scene.
  • If the condition is not met, the first filter is considered unqualified; otherwise, the first filter is considered qualified.
  • Judging whether the sharpest image can be quickly selected from the frames can be done according to the method used during focusing. For example, if the hill-climbing method is used for focusing in actual shooting, the ideal trend of the sharpness should be a monotonic increase to a peak followed by a monotonic decrease. If this condition is met, it can be considered that the sharpest image can be quickly selected. If another focusing method is used in actual shooting, the corresponding criterion can likewise be chosen to determine whether the sharpest image can be quickly selected, completing the evaluation of the first filter.
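The hill-climbing criterion described above can be sketched as a simple check in Python (an illustrative sketch, not part of the claimed method; the noise tolerance `tol` is an assumed parameter):

```python
def is_unimodal(focus_values, tol=0.0):
    """Return True if the sequence rises to a single peak and then falls.

    Hill-climbing (contrast) focusing assumes the sharpness curve over
    lens positions is unimodal; a candidate filter whose focus values
    violate this is unlikely to let the sharpest frame be found quickly.
    `tol` absorbs small measurement noise (an illustrative parameter).
    """
    peak = focus_values.index(max(focus_values))
    rising = all(focus_values[i] <= focus_values[i + 1] + tol
                 for i in range(peak))
    falling = all(focus_values[i] + tol >= focus_values[i + 1]
                  for i in range(peak, len(focus_values) - 1))
    return rising and falling
```

A sequence such as [1, 2, 5, 3, 1] passes the check, while one with two local peaks fails, marking the filter as a poor candidate under this criterion.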
  • The sharpness obtained after filtering the images with the first filter thus enables the evaluation of the first filter. The sharpness of an image can be determined from the filtering result in any manner, which is not limited in this embodiment.
  • the smoothing degree of each frame of image after being filtered can be determined according to the filtering result obtained by the first filter, and whether the requirement is met can be determined according to the smoothing degree.
  • In some scenes, it is hoped that the filtered image is as smooth as possible and the edges of objects in the image are not too sharp; in other scenes, it is hoped that the filtered image is as sharp as possible and the edges of objects are not blurry. The first filter can then be evaluated according to the smoothing requirements corresponding to the scene to be analyzed.
  • The smoothness of the filtered image can be determined according to the filtering result; for example, adjacent pixels can be compared to determine the degree of smoothness. If the smoothness of each frame meets the preset smoothing requirements, the first filter can be considered qualified; otherwise, it can be considered unqualified.
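A minimal sketch of the adjacent-pixel comparison, assuming gray values in a 2-D list (the specific measure is illustrative; the embodiment does not prescribe a smoothness formula):

```python
def mean_adjacent_difference(image):
    """Smoothness proxy: mean absolute difference between horizontally
    adjacent pixels of a filtered image (2-D list of gray values).

    A smaller value means a smoother result; whether smooth or sharp is
    desirable depends on the scene being analyzed.
    """
    total = 0
    count = 0
    for row in image:
        for a, b in zip(row, row[1:]):
            total += abs(a - b)
            count += 1
    return total / count

flat = [[10, 10, 10], [10, 10, 10]]     # perfectly smooth
edgy = [[0, 255, 0], [255, 0, 255]]     # maximally non-smooth
```

Comparing the value against a preset threshold would then yield the qualified/unqualified decision described above.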
  • Step 104 Determine whether the first filter is a filter suitable for the scene according to the evaluation result corresponding to the first filter.
  • The evaluation result can be expressed, for example, as a specific score, or as qualified/unqualified, and so on.
  • the corresponding relationship between the evaluation result and whether it is applicable can be set according to actual needs.
  • The evaluation result may be qualified or unqualified. If the evaluation result of the filter is qualified, it is determined that the first filter is a filter suitable for the scene; if the evaluation result is unqualified, it is determined that the first filter is not suitable for the scene.
  • The evaluation result may include a score for the first filter. If the score exceeds a certain score threshold (for example, score A), the first filter is determined to be a filter suitable for the scene; otherwise, it is determined not to be suitable for the scene.
  • The captured images can be continuously acquired during the lens movement and filtered through the filter. According to the frequency-domain filtering information of the images, it can be judged whether the lens has reached the in-focus position.
  • Different filters are sensitive to different frequency-domain ranges for focusing, and this difference has a great influence on whether the in-focus position can be found. Therefore, the selection of the filter is very important for the actual shooting effect.
  • The above-mentioned method can be used to evaluate the first filter and determine whether it is suitable for the scene to be analyzed. If the first filter is suitable for the scene to be analyzed, this indicates that the first filter meets the shooting requirements in the scene, and the correspondence between the first filter and the scene can be saved in the photographing device.
  • The first filter corresponding to the scene is then used to filter the captured images, so that the filtering effect meets the shooting requirements in the scene and the shooting effect is improved.
  • the method described above can be applied to the actual shooting process. Specifically, in the actual shooting process of the user, the method described in step 101 to step 104 above can be used to implement the evaluation of the filter. Further, after step 104, if the first filter is a filter suitable for the scene, focus processing may be performed according to the filtering result corresponding to each frame of image, so as to obtain a clear image.
  • the focusing processing may specifically refer to determining the in-focus position according to the filtering result.
  • the object in the captured image usually undergoes a process from blurry to clear to blurry, where the position of the lens when the object is clearest can be regarded as the in-focus position.
  • There are many specific implementation methods for determining the in-focus position based on the filtering result, which is not limited in this embodiment. The above method can be applied to a contrast-detection autofocus (CDAF) scheme, and can also be applied to other focusing schemes.
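In its simplest form, the contrast-detection idea can be sketched as follows (illustrative only; real CDAF implementations typically refine the peak, e.g. by interpolating between lens positions):

```python
def in_focus_position(lens_positions, focus_values):
    """Contrast-detection autofocus in its simplest form: the in-focus
    position is the lens position whose frame yields the peak focus
    value, matching the blurry-to-clear-to-blurry progression described
    above.
    """
    best = max(range(len(focus_values)), key=focus_values.__getitem__)
    return lens_positions[best]
```

Given focus values sampled at several lens positions, the function returns the position at which the photographed object appears clearest.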
  • The image processing method provided in this embodiment can obtain multiple frames of images to be processed, including multiple frames taken during the movement of the lens, where the multiple frames of images to be processed correspond to a scene and are obtained by shooting scenery that matches the scene; filter the multiple frames of images through the first filter to be analyzed to obtain the filtering result corresponding to each frame; evaluate the first filter according to the filtering result corresponding to each frame; and determine, according to the evaluation result corresponding to the first filter, whether the first filter is a filter suitable for the scene. It is thus possible to quickly determine whether the first filter is suitable for the current scene, improve the efficiency of selecting a filter for the scene, and improve the shooting effect.
  • the solution provided by the embodiment of the present invention can determine whether a certain filter is a filter suitable for the current scene, and on this basis, a filter suitable for the current scene can be selected from the multiple filters.
  • A filter suitable for the scene can then be selected from them. It is understandable that there are many types of filters, and the specific parameters of a filter can also be adjusted; by changing the type and/or parameters, different filters can be designed, and a filter suitable for the current scene can be selected from the designed filters. Therefore, the process of selecting a filter for a scene can be understood as a process of tuning the parameters of the filter.
  • In the prior art, the selection of filters for scenes relies on the empirical debugging of equipment manufacturers, and there is no complete set of evaluation methods for debugging the filters for different scenes and optimizing their parameters. Selecting filters based on experience consumes a lot of time, and the finally selected filter may not perform well.
  • With the method of this embodiment, the performance of different filters can be evaluated according to images corresponding to the current scene, which effectively reduces the time spent debugging filters; the finally selected filter can effectively improve the focus accuracy, greatly improve the robustness of the focusing algorithm, and shorten the filter debugging cycle.
  • FIG. 2 is a schematic flowchart of another image processing method provided by an embodiment of the present invention.
  • Figure 2 uses two filters as an example to illustrate the implementation method of selecting a filter suitable for the current scene from a plurality of filters.
  • the image processing method may include:
  • Step 201 Acquire multiple frames of images to be processed, where the multiple frames of images to be processed include multiple frames of images taken during the lens movement process, where the multiple frames of images to be processed are multiple frames of images corresponding to a scene.
  • the multi-frame images corresponding to the scene are multi-frame images obtained by shooting a scene that matches the scene.
  • Step 202 Filter the multiple frames of images to be processed through the first filter to be analyzed, and evaluate the first filter according to the filtering result corresponding to the first filter.
  • the multiple frames of images to be processed may be filtered by the first filter to obtain a filtering result corresponding to each frame of image, and according to the filtering result corresponding to each frame of image, the first A filter is evaluated.
  • Step 203 Filter the multiple frames of images to be processed through the second filter to be analyzed, and evaluate the second filter according to the filtering result corresponding to the second filter.
  • the second filter may be any filter different from the first filter. It can be understood that different filters can be obtained by changing the type and/or parameters of the filter.
  • the type of the filter may include, but is not limited to: Butterworth filter, Chebyshev filter, Bessel filter, Elliptic filter, etc. .
  • the parameter of the filter may be the frequency response coefficient of the filter, and the parameters may specifically include, but are not limited to: passband cutoff frequency, stopband cutoff frequency, passband attenuation, stopband attenuation, low cutoff frequency, high Cutoff frequency, passband width, stopband width, etc.
  • Filters of different types and/or parameters can be considered as different filters.
  • a second filter different from the first filter can be evaluated. Specifically, the multiple frames of images to be processed may be filtered by the second filter to obtain a filtering result corresponding to each frame of image, and according to the filtering result of each frame of image by the second filter, The second filter is evaluated.
  • the specific implementation method of evaluating the second filter is similar to the method of evaluating the first filter, and will not be repeated here.
  • step 202 and step 203 can be adjusted according to actual needs.
  • the second filter can be evaluated first, and then the first filter can be evaluated.
  • the first filter and the second filter can be evaluated at the same time.
  • Step 204 According to the evaluation result corresponding to the first filter and the evaluation result corresponding to the second filter, select a filter suitable for the scene from the first filter and the second filter .
  • step 204 it is possible to determine whether the first filter is a filter suitable for the scene according to the evaluation result corresponding to the first filter. It can be understood that if the first filter is selected as the filter suitable for the scene from the first filter and the second filter, it indicates that the first filter is suitable for the scene If the second filter is selected as the filter suitable for the scene, then it can be considered that the first filter is not a filter suitable for the scene.
  • The one with the better evaluation result can be selected from the first filter and the second filter as the filter suitable for the scene.
  • a filter with a higher score may be selected from the first filter and the second filter as the filter suitable for the scene.
  • If the evaluation result is expressed as qualified or unqualified, it is determined whether one and only one of the first filter and the second filter is qualified; if so, the qualified filter is selected as the filter suitable for the scene.
  • If the scores of the first filter and the second filter are the same, or both filters are qualified, other strategies may be combined, such as selecting the filter with the lower implementation cost as the filter suitable for the scene.
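The selection logic described above, including the lower-cost tie-break, can be sketched as follows (the score and cost representations, and the filter names, are illustrative assumptions):

```python
def select_filter(scores, costs):
    """Pick the filter with the highest evaluation score; on a tie,
    fall back to the one with the lower implementation cost, as
    suggested above.

    `scores` and `costs` map candidate filter names to numbers; both
    the naming and the cost model are illustrative assumptions.
    """
    best = max(scores.values())
    tied = [name for name, s in scores.items() if s == best]
    return min(tied, key=lambda name: costs[name])
```

With two candidates scored 0.8 and 0.9, the higher-scoring one wins; with equal scores, the cheaper one is chosen.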
  • Figure 2 uses two filters as an example to describe the implementation method of selecting a filter suitable for the scene to be analyzed from a plurality of filters.
  • the filters to be analyzed may not be limited to two.
  • For each filter to be analyzed, the images to be processed can be filtered through that filter to obtain the filtering result corresponding to each frame of image, and the filter can be evaluated according to those filtering results; finally, the filter suitable for the scene is selected from the multiple filters according to their evaluation results.
  • a Butterworth filter can be selected as the filter to be analyzed.
  • The frequency response curve of the Butterworth filter is flat in the passband, and the filter is simple and easy to implement.
  • The parameters of the Butterworth filter can include passband gain, passband width, and low cutoff frequency.
  • Multiple Butterworth filters can be pre-designed, each with different specific parameters; for example, the cut-off frequency of the first filter is f1 and that of the second filter is f2. Then scenery that matches the scene is shot to obtain the images to be processed, the pre-designed Butterworth filters are used to filter the images to be processed respectively, each filter is evaluated according to its corresponding filtering results, and a filter suitable for the scene is selected from them according to the evaluation results.
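As an illustration of how such candidate filters could be generated, the coefficients of a second-order digital Butterworth low-pass filter can be computed via the bilinear transform (a sketch under the standard Q = 1/sqrt(2) Butterworth assumption; the patent does not prescribe this derivation, and libraries such as SciPy provide equivalent ready-made designs):

```python
import math

def butter2_lowpass(cutoff_hz, sample_rate_hz):
    """Second-order Butterworth low-pass biquad coefficients (b, a)
    via the bilinear transform. Q = 1/sqrt(2) gives the maximally
    flat (Butterworth) passband response.
    """
    w0 = math.tan(math.pi * cutoff_hz / sample_rate_hz)
    q = 1.0 / math.sqrt(2.0)
    norm = 1.0 / (1.0 + w0 / q + w0 * w0)
    b0 = w0 * w0 * norm
    b = (b0, 2.0 * b0, b0)
    a = (1.0,
         2.0 * (w0 * w0 - 1.0) * norm,
         (1.0 - w0 / q + w0 * w0) * norm)
    return b, a

# Two candidate filters with different cut-off frequencies f1 and f2,
# mirroring the example above (the numeric values are placeholders).
b1, a1 = butter2_lowpass(5.0, 100.0)   # cut-off f1
b2, a2 = butter2_lowpass(20.0, 100.0)  # cut-off f2
```

Varying the cut-off frequency (or filter order) in this way yields the set of pre-designed candidates from which a scene-appropriate filter is then chosen by evaluation.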
  • In this embodiment, the first filter and the second filter may be used to filter the images to be processed respectively, and the two filters may be evaluated respectively according to their corresponding filtering results, so that a filter suitable for the scene can be selected from them. It is thus possible to analyze whether multiple filters are suitable for the scene and to select a corresponding filter for the scene from among them, effectively improving the efficiency and accuracy of selecting filters for the scene.
  • FIG. 3 is a schematic diagram of a process for evaluating a first filter according to an embodiment of the present invention.
  • the score corresponding to the first filter may be determined according to the filtering result corresponding to each frame of image.
  • determining the score corresponding to the first filter according to the filtering result corresponding to each frame of image may include:
  • Step 301 Determine a focus value corresponding to each frame of image according to the filtering result corresponding to each frame of image.
  • Fourier transform is performed on the gray value of the pixel in the image to obtain the frequency component (for example, including the amplitude of the frequency) related to the gray value of the image.
  • According to the frequency range corresponding to the region of interest (that is, the focus region), corresponding digital filtering is performed on that frequency range. After filtering, the components at each frequency in the range are accumulated, and the accumulated value is the focal value. It should be noted that the frequency components indicate the energy of the image.
  • the focal value may be used to indicate the sharpness of the image, and the sharpness may refer to the overall sharpness of the image or the sharpness of a part of the image.
  • the sharpness may refer to the overall sharpness of the image or the sharpness of a part of the image.
  • the filtering result corresponding to each frame of image may include a signal value corresponding to each pixel in at least a part of the pixels of the image after the filtering process.
  • Determining the focal value corresponding to each frame of image according to the filtering result corresponding to each frame of image may include: for each frame of image, accumulating the signal values corresponding to the at least part of the pixels in the image to obtain the focal value corresponding to the image.
  • the at least part of the pixel points are pixel points of the region of interest.
  • the corresponding signal value is a frequency value corresponding to the at least part of the pixel points.
  • the filtering result may include a signal value corresponding to each pixel among all the pixels of the image.
  • the signal value corresponding to the pixel is the value corresponding to the pixel after being filtered, and the value can be used to represent the gradient information corresponding to the pixel.
  • the signal values corresponding to all pixels of the image after being filtered are accumulated to obtain the focal value corresponding to the image, thereby reflecting the overall clarity of the image.
  • the filtering result may include a signal value corresponding to each pixel in the partial pixels of the image.
  • the signal values corresponding to some pixels of the image are accumulated to obtain the focal value corresponding to the image.
  • The partial pixels may be a plurality of pixels uniformly distributed in the image; the overall sharpness of the image can also be reflected by these pixels, while the amount of calculation is reduced, the image processing speed is increased, and the efficiency of focusing is improved.
  • the part of the pixels may include pixels in the region of interest of the user in the image.
  • The region of interest can be determined in various ways. For example, when taking an image, the user can tap the area to be focused on, and the region of interest can be determined by recognizing where the user taps; alternatively, the image can be analyzed semantically to determine the region of interest, for example, in selfie mode the area where the face is located can be used as the region of interest.
  • the pixel points in the region of interest can be used as pixels for calculating the focal value, and the signal values corresponding to the pixels in the region of interest are accumulated to obtain the focal value corresponding to the image. Calculating the focus value by the signal value corresponding to the pixel point in the region of interest can make the focus value reflect the clarity of the region of interest in the image and improve the focus accuracy.
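The accumulation over region-of-interest pixels can be sketched as below, assuming NumPy; the Laplacian kernel is a stand-in for the unspecified spatial filter, and the `(row0, row1, col0, col1)` ROI format is a convention chosen here for illustration:

```python
import numpy as np

def focal_value_roi(gray, roi, kernel=None):
    """Sum |filter response| over region-of-interest pixels as a focal value."""
    if kernel is None:
        # Laplacian: responds to local gradient/edge information
        kernel = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], dtype=np.float64)
    g = gray.astype(np.float64)
    # 3x3 correlation written out with shifts (no SciPy needed; edges wrap)
    resp = np.zeros_like(g)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            resp += kernel[dy + 1, dx + 1] * np.roll(np.roll(g, -dy, 0), -dx, 1)
    r0, r1, c0, c1 = roi
    # Accumulate per-pixel signal values inside the region of interest only
    return float(np.abs(resp[r0:r1, c0:c1]).sum())
```

A flat region produces a zero response (the kernel sums to zero), while an edge inside the ROI contributes to the focal value.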
  • The above method calculates the focal value of the image by accumulating signal values. In other embodiments, other methods can also be used to calculate the focal value; for example, the sharpness of the image can be evaluated by an evaluation function, and the output value of the evaluation function can be used as the focal value of the image.
  • the specific expression form of the evaluation function can be set according to actual needs, which is not limited in this embodiment.
  • Step 302 Determine a score corresponding to the first filter according to the focal value corresponding to each frame of image.
  • Determining the score corresponding to the first filter according to the focal value corresponding to each frame of image may include: determining a focal value curve according to the focal value corresponding to each frame of image; and determining the score corresponding to the first filter according to the focal value curve.
  • the focal value curve may refer to a curve formed by connecting the focal values corresponding to each frame of image.
  • Determining the focal value curve according to the focal value corresponding to each frame of image may include: for each frame of image, determining the numerical point corresponding to the image in the focal value curve, where the abscissa of the numerical point is the serial number of the image and the ordinate is the focal value corresponding to the image; and generating the focal value curve from the determined numerical points.
  • the sequence number of the image can be determined according to the order of shooting.
  • FIG. 4 is a schematic diagram of a focal value curve formed by numerical points according to an embodiment of the present invention.
  • The abscissa runs from 1 to 41, indicating that 41 frames of images are taken for a scene; a corresponding focal value is calculated for each frame to form the focal value curve, and the ordinate is the focal value corresponding to each image.
  • Serial numbers 1 to 41 are used to represent the lens positions in FIG. 4; that is, the abscissa in FIG. 4 corresponds to lens position 1 to lens position 41, which respectively correspond to frame 1 to frame 41.
  • The strategy for scoring based on the focal value curve can be determined according to the actual focus strategy. For example, if accurate focusing requires steep slopes on both sides of the position with the largest focal value, then when evaluating the first filter, the slopes on both sides of the highest value in the focal value curve can be used as a reference for scoring, so that the finally selected filter meets the requirements of focusing.
  • The image processing method provided in this embodiment determines the focal value corresponding to each frame of image according to the filtering result corresponding to that image, determines the focal value curve according to the focal values, and determines the score corresponding to the first filter according to the focal value curve. The focal value curve intuitively reflects the changing trend of image sharpness, so that the first filter can be evaluated quickly and accurately, further improving the efficiency and accuracy of selecting a filter for the scene.
  • Determining the score corresponding to the first filter according to the focal value curve may include determining the score through at least one of the following: curve contrast, first ratio, second ratio, maximum opening size, and curve monotonicity.
  • the first ratio is the proportion of numerical points whose focal value is higher than the first threshold
  • the second ratio is the proportion of numerical points whose focal value is lower than the second threshold.
  • FIG. 5 is a schematic diagram of a process for determining a score corresponding to a first filter according to a focal value curve according to an embodiment of the present invention.
  • The score corresponding to the first filter is determined through five dimensions: curve contrast, first ratio, second ratio, maximum opening size, and curve monotonicity. The score of each of these dimensions can be determined from the focal value curve, and the final score of the first filter can then be determined from the five scores obtained.
  • determining the score corresponding to the first filter according to the focal value curve may include:
  • Step 501 In the focal value curve, determine N numerical points with the largest focal value and M numerical points with the smallest focal value.
  • the M and N are both positive integers.
  • Step 502 Determine the score of the curve contrast corresponding to the first filter according to the ratio between the mean value of the focal values corresponding to the N numerical points and the mean value of the focal values corresponding to the M numerical points.
  • step 501 to step 502 can be used to score the curve contrast of the focal value curve corresponding to the first filter.
  • the curve contrast may refer to the ratio of the mean value of the largest N values to the mean value of the smallest M values in the curve, and the ratio may reflect the quality of the focal value curve to a certain extent.
  • the ratio between the mean value of the largest N focal values and the mean value of the smallest M focal values in the focal value curve, and the score of the curve contrast corresponding to the first filter may be in a positive correlation.
  • Positive correlation means that when one variable increases, the other also increases; that is, the two variables change in the same direction.
  • That is, the larger the ratio between the mean of the largest N focal values and the mean of the smallest M focal values, the higher the score of the curve contrast; the smaller the ratio, the lower the score.
  • the relationship between the ratio and the score may be a proportional function.
  • Generally, the focal value of an image near the in-focus position is relatively large, while the focal value of an out-of-focus image is relatively small. The greater the difference in focal value between the in-focus and out-of-focus states, the better the shooting effect of the first filter for the current scene, and the higher the score of the curve contrast; if the difference is small, the first filter has a poor filtering effect for the current scene, and the corresponding score may be lower.
  • the values of M and N can be set according to actual needs.
  • The M may be equal to the number of frames of the image to be processed multiplied by a first coefficient and then rounded, and the N may be equal to the number of frames multiplied by a second coefficient and then rounded, where the first coefficient and the second coefficient are both greater than 0, and the rounding may be any rounding method such as rounding up, rounding down, or rounding to the nearest integer. The first coefficient may be greater than the second coefficient, because among the multiple frames taken during the lens movement there are more out-of-focus images and fewer images close to the in-focus state; setting M greater than N therefore better reflects the contrast between the in-focus and out-of-focus states. For example, with the 41-frame sequence above, if the first coefficient is 20% and the second coefficient is 10%, then M may be 8 and N may be 4: the 4 numerical points with the largest focal value and the 8 numerical points with the smallest focal value in the focal value curve are used to score the curve contrast.
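A minimal sketch of the curve-contrast measure, using the example above (8 smallest and 4 largest points for 41 frames); taking the ratio itself as the score is an assumption, since the text only requires a positive correlation between the ratio and the score:

```python
def curve_contrast(focal_values, m_coef=0.20, n_coef=0.10):
    """Mean of the N largest focal values divided by the mean of the M smallest."""
    vals = sorted(focal_values)
    frames = len(vals)
    m = max(1, round(frames * m_coef))  # M smallest (out-of-focus) points
    n = max(1, round(frames * n_coef))  # N largest (near in-focus) points
    mean_low = sum(vals[:m]) / m
    mean_high = sum(vals[-n:]) / n
    return mean_high / mean_low
```

A curve with a pronounced peak yields a ratio well above 1, while a flat curve yields exactly 1.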
  • Step 503 Determine the highest value of the focal value curve, and count the number of numerical points whose focal value is greater than a first threshold, where the first threshold is the product of the highest value and a first proportional coefficient.
  • The highest value is the maximum focal value in the focal value curve, and a focal value greater than the first threshold is a focal value numerically close to the highest value. The first proportional coefficient can be set according to actual needs; for example, it may be 92%, in which case the number of focal values between 92% and 100% of the highest value is counted.
  • Step 504 Determine the score of the first ratio corresponding to the first filter according to the ratio of the number of numerical points with the focal value greater than the first threshold to the number of all numerical points in the focal value curve.
  • step 503 to step 504 can be used to score the first ratio of the focal value curve corresponding to the first filter.
  • the first ratio is the proportion of numerical points whose focal value is higher than the first threshold.
  • Optionally, the ratio of the number of numerical points whose focal value is greater than the first threshold to the number of all numerical points in the focal value curve may be in a negative correlation with the score of the first ratio corresponding to the first filter.
  • Negative correlation means that when one variable increases, the other decreases; that is, the two variables change in opposite directions.
  • That is, the larger the proportion of numerical points whose focal value is greater than the first threshold relative to all numerical points in the focal value curve, the lower the score of the first ratio.
  • the relationship between the ratio and the corresponding score may be an inverse proportional function.
  • Generally, the position corresponding to the highest value can be regarded as the in-focus position. The more focal values close to the highest value, the more difficult it is to quickly and accurately find the in-focus position during focusing; therefore, the proportion of numerical points with a focal value greater than the first threshold may be negatively correlated with the score of the first ratio. This enables the filter selected for the current scene to better assist focusing and improve the shooting effect of the image.
  • Step 505 Determine the lowest value of the focal value curve, and count the number of numerical points whose focal value is less than a second threshold, where the second threshold is the product of the lowest value and a second proportionality coefficient.
  • The lowest value is the minimum focal value in the focal value curve, and a focal value smaller than the second threshold is a focal value numerically close to the lowest value. The second proportional coefficient can be set according to actual needs; for example, it may be 120%, in which case the number of focal values between 100% and 120% of the lowest value is counted.
  • Step 506 Determine the score of the second ratio corresponding to the first filter according to the ratio of the number of numerical points with the focal value less than the second threshold to the number of all numerical points in the focal curve.
  • step 505 to step 506 can be used to score the second ratio of the focal value curve corresponding to the first filter.
  • the second ratio is the proportion of numerical points whose focal value is lower than the second threshold.
  • the ratio of the number of numerical points whose focus value is less than the second threshold to the number of all numerical points in the focus value curve may have a positive correlation with the score of the second ratio.
  • That is, the larger the proportion of numerical points whose focal value is smaller than the second threshold, the higher the score of the second ratio, which can further improve the shooting effect of the image.
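The first and second ratios of steps 503 to 506 can be sketched together; the 92% and 120% proportional coefficients are the example values from the text, and returning the raw ratios (rather than mapped scores) is a simplification left to the caller:

```python
def plateau_ratios(focal_values, first_scale=0.92, second_scale=1.20):
    """Fraction of points near the peak and fraction near the floor of the curve."""
    total = len(focal_values)
    peak = max(focal_values)
    floor = min(focal_values)
    first_threshold = peak * first_scale     # points "close to" the highest value
    second_threshold = floor * second_scale  # points "close to" the lowest value
    first_ratio = sum(v > first_threshold for v in focal_values) / total
    second_ratio = sum(v < second_threshold for v in focal_values) / total
    return first_ratio, second_ratio
```

Per the text, a low first ratio (sharp, unambiguous peak) and a high second ratio (long out-of-focus floor) both indicate a better curve.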
  • Step 507 Determine the curvature corresponding to the highest value of the focal value curve.
  • Step 508 Determine the score of the highest value opening size corresponding to the first filter according to the curvature.
  • step 507 to step 508 may be used to score the maximum opening size of the focal value curve corresponding to the first filter.
  • The opening size of the focal value curve at the highest value can be represented by curvature. Specifically, the curve formed by the numerical point at the highest value and several nearby numerical points can be fitted with a quadratic equation, and the curvature of the curve at the highest value can be determined from the coefficients of the quadratic equation.
  • The curvature and the score of the maximum opening size may be in a positive correlation.
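A sketch of steps 507 and 508, assuming NumPy: fit a quadratic through the highest point and its neighbors and use the magnitude of its second derivative as the opening measure. The window width is a free choice not fixed by the text:

```python
import numpy as np

def peak_curvature(focal_values, half_window=3):
    """Fit y = a*x^2 + b*x + c around the peak; |2a| approximates its curvature.

    A sharper, narrower opening at the peak gives a larger |a|.
    """
    i = int(np.argmax(focal_values))
    lo = max(0, i - half_window)
    hi = min(len(focal_values), i + half_window + 1)
    x = np.arange(lo, hi, dtype=np.float64)
    y = np.asarray(focal_values[lo:hi], dtype=np.float64)
    a, b, c = np.polyfit(x, y, 2)  # quadratic fit near the peak
    return abs(2.0 * a)            # second derivative of the fitted parabola
```

Since the score is positively correlated with the curvature, a sharper peak scores higher than a broad one.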
  • Step 509 Count the number of monotonically increasing intervals and/or the number of monotonically decreasing intervals in the focus value curve.
  • Step 510 Determine the monotonicity score of the curve corresponding to the first filter according to the number of monotonically increasing intervals and/or the number of monotonically decreasing intervals.
  • Step 509 to step 510 may be used to score the curve monotonicity of the focal value curve corresponding to the first filter.
  • The ideal focal value curve should increase monotonically to the left of the highest value and decrease monotonically to the right of it.
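Steps 509 and 510 can be sketched as a count of maximal monotone runs; an ideal curve yields exactly one increasing and one decreasing run, and additional runs indicate fluctuation (how the counts map to a score is left open by the text):

```python
def count_monotone_runs(focal_values):
    """Count maximal monotonically increasing and decreasing intervals."""
    inc = dec = 0
    direction = 0  # +1 rising, -1 falling, 0 undecided
    for prev, cur in zip(focal_values, focal_values[1:]):
        if cur > prev and direction != 1:
            inc += 1      # a new increasing interval begins
            direction = 1
        elif cur < prev and direction != -1:
            dec += 1      # a new decreasing interval begins
            direction = -1
    return inc, dec
```

A fluctuating curve like the one in Fig. 6 produces several runs of each kind, so its monotonicity score would be lower.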
  • the execution order of the above steps in this embodiment is not limited to the order defined by the above sequence numbers.
  • For example, the monotonicity of the curve may be evaluated before the maximum opening size, or the maximum opening size may be evaluated before the monotonicity of the curve, or the two may be evaluated at the same time. Those skilled in the art can make any configuration according to specific application requirements and design requirements, which will not be repeated here.
  • Step 511 Determine the score of the first filter according to the curve contrast, the first ratio, the second ratio, the maximum opening size, and the score corresponding to the monotonicity of the curve.
  • Specifically, a weight value can be assigned to each dimension, and the scores corresponding to the five dimensions can be weighted and summed to obtain the score of the first filter.
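The weighted combination of step 511 might look like the following; the equal default weights are an assumption, as the text only states that each dimension is assigned a weight:

```python
def filter_score(dim_scores, weights=None):
    """Weighted sum of the five per-dimension scores of a filter."""
    dims = ("contrast", "first_ratio", "second_ratio", "opening", "monotonicity")
    if weights is None:
        # Equal weighting is a placeholder; the patent leaves the weights open
        weights = {d: 1.0 / len(dims) for d in dims}
    return sum(weights[d] * dim_scores[d] for d in dims)
```

The same function can then be applied to every candidate filter, and the filter with the highest score selected for the scene.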
  • In practical applications, the above method can be used to score the first filter through the focal value curve, so as to determine whether the first filter is a filter suitable for the current scene. The same method can also be used to evaluate other filters: for example, multiple filters can be set in advance, each filter scored by the above method, and the filter with the highest score selected as the one suitable for the current scene.
  • Fig. 6 is a schematic diagram of a focal value curve provided by an embodiment of the present invention.
  • The definitions of the abscissa and ordinate in Fig. 6 are similar to those in Fig. 4. As shown in Fig. 6, the focal value curve fluctuates considerably, so the focus easily stays at the fluctuating part, that is, the peak between serial numbers 10 and 20 in the figure. Moreover, in the curve shown in Fig. 6 there are two highest points whose focal values are relatively close, which greatly interferes with the judgment of the in-focus position. Therefore, the score of the focal value curve shown in Fig. 6 may be lower.
  • FIG. 7 is a schematic diagram of another focal value curve provided by an embodiment of the present invention.
  • The definitions of the abscissa and ordinate in Fig. 7 are similar to those in Fig. 4 and are not repeated here for brevity. As shown in Fig. 7, the focal value curve fluctuates less and the highest value is more distinct, so it is better than the focal value curve shown in Fig. 6.
  • In this embodiment, the score corresponding to the first filter can be determined according to the focal value curve through five dimensions: curve contrast, first ratio, second ratio, maximum opening size, and curve monotonicity. This better realizes the evaluation of the first filter, so that the filter selected for the current scene better meets the shooting requirements of that scene. Using the selected filter to filter actually captured images makes the sharpness variation of the filtered images better, and the focus curve formed by the filtering results better meets the requirements of focus processing, so that focusing can be achieved quickly and accurately and the shooting effect of the image can be improved.
  • FIG. 8 is a schematic flowchart of another image processing method provided by an embodiment of the present invention. As shown in FIG. 8, the image processing method may include:
  • Step 801 Acquire multiple frames of to-be-processed images, where the multiple frames of to-be-processed images include multiple frames of images taken during the lens movement, the multiple frames of to-be-processed images are multiple frames of images corresponding to a scene, and the multiple frames of images corresponding to the scene are multiple frames of images obtained by shooting a first scene that matches the scene.
  • Step 802 Filter the multiple frames of images to be processed through the first filter to be analyzed to obtain a filtering result corresponding to each frame of image.
  • For step 801 to step 802, reference may be made to the foregoing embodiments; details are not repeated here.
  • Step 803 Acquire multiple frames of second images, where the multiple frames of second images are multiple frames of images obtained by shooting a second scene that matches the scene.
  • Step 804 Filter the multiple frames of second images through the first filter to obtain a filtering result corresponding to each frame of the second image.
  • For example, the scene may be a human face scene, and different faces may be regarded as different scenes that conform to the scene. The first scene may be the face of user A, and the second scene may be the face of user B. By shooting the first scene, multiple frames of images taken during the lens movement can be obtained and recorded as the images to be processed; by shooting the second scene, multiple frames of images taken during the lens movement can be obtained and recorded as the second images.
  • the image to be processed and the second image may be respectively filtered by the first filter.
  • Step 805 Evaluate the first filter according to the filtering result corresponding to the image to be processed and the filtering result corresponding to the second image.
  • Step 806 Determine whether the first filter is a filter suitable for the scene according to the evaluation result corresponding to the first filter.
  • Specifically, the first filter may be evaluated according to the filtering result corresponding to the images to be processed to obtain a first score, and evaluated according to the filtering result corresponding to the second images to obtain a second score; the first score and the second score are added to obtain the total score corresponding to the first filter. According to the total score corresponding to the first filter, it is determined whether the first filter is a filter suitable for the scene.
  • the implementation method of determining whether the first filter is suitable for the scene according to the images of multiple scenes is described.
  • the scenes used to evaluate the filter may not be limited to two.
  • Two or more scenes conforming to the scene may be selected; each scene is photographed separately, the first filter is scored according to the captured images, and the total score of the first filter is determined according to the score corresponding to each scene.
  • Combining the solutions provided by the foregoing embodiments yields an optional solution for configuring filters for the shooting device, which may involve multiple shooting scenes, multiple scenes per shooting scene, and multiple filters. Specifically, multiple shooting scenes can first be set up, such as indoor scenes and outdoor scenes. For each shooting scene, multiple scenes that fit it are photographed; for example, for the indoor scene, the indoor scenes of multiple houses are photographed separately to obtain images corresponding to multiple scenes, where the image corresponding to each scene may include multiple frames of images taken during the lens movement when that scene is photographed. Multiple filters can then be designed. For each filter, the images corresponding to the multiple scenes in a shooting scene are filtered by that filter, and the filter is scored according to the filtering results to obtain the filter's total score for the multiple scenes in the shooting scene. After all filters are evaluated, a total score corresponding to each filter is obtained, and the filter with the highest total score is selected from all the filters as the filter suitable for the shooting scene. By performing the above steps for each shooting scene, the filter corresponding to each scene can be obtained.
  • the corresponding filter can be called to filter the captured image, so that the filtering result meets the requirements of the current scene.
  • Alternatively, a plurality of pre-designed filters can be used to filter the currently captured image separately, each filter can be evaluated according to its filtering result, a filter suitable for the current scene can thereby be determined among the multiple filters, and subsequent focusing processing can be performed according to the filtering result corresponding to that filter.
  • The image processing method provided by this embodiment can shoot multiple scenes corresponding to the scene to be analyzed, comprehensively evaluate the first filter according to the images corresponding to the multiple scenes, and determine whether the first filter is a filter applicable to the scene, so that the selected filter has better adaptability to the current scene.
  • the scene corresponding to each image can be manually selected, or the scene corresponding to the image can also be determined according to the image to be processed.
  • the scene may be any of the following: a normal light scene, a strong light scene, a low light scene, a point light source scene, a character scene, and the like.
  • Optionally, determining the scene corresponding to the image according to the image to be processed may include: determining environmental brightness information according to brightness information corresponding to the image to be processed and an exposure parameter used when the image to be processed was taken; and determining the scene corresponding to the image to be processed according to the environmental brightness information.
  • The exposure parameter may include the exposure time used when the image was taken, and the like.
  • Generally, the response of the image sensor per unit time can be used to characterize the environmental brightness, and different responses represent different environmental brightness. Therefore, the environmental brightness information can be determined from the brightness information of the captured image and the exposure time.
  • For example, the brightness information of the image may be the average of the brightness values corresponding to the pixels in the image, and the environmental brightness information may be equal to the brightness information of the image divided by the exposure time.
  • Using the brightness information and exposure parameters of the image, the environmental brightness information can be detected quickly and accurately, and normal light scenes, strong light scenes, and low light scenes can be effectively distinguished.
  • If the environmental brightness information is greater than a first environmental brightness threshold, the current scene can be considered a strong light scene; if it is less than a second environmental brightness threshold, the current scene can be considered a low light scene; and if it is less than the first environmental brightness threshold and greater than the second, the current scene can be considered a normal light scene.
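The brightness-based scene decision above can be sketched as below; the ambient-brightness formula follows the text (image brightness divided by exposure time), while the two threshold values are illustrative placeholders not given in the text:

```python
def classify_light_scene(mean_pixel_brightness, exposure_time,
                         strong_threshold=200.0, low_threshold=20.0):
    """Classify a scene by ambient brightness = image brightness / exposure time."""
    ambient = mean_pixel_brightness / exposure_time
    if ambient > strong_threshold:   # first (upper) environmental brightness threshold
        return "strong light"
    if ambient < low_threshold:      # second (lower) environmental brightness threshold
        return "low light"
    return "normal light"
```

Because the sensor integrates light over the exposure time, the same image brightness at a shorter exposure implies a brighter environment, which is why the division normalizes for exposure.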
  • In another optional implementation, determining the scene corresponding to the image according to the image to be processed may include: detecting the brightness information corresponding to a foreground object and the brightness information corresponding to a background object in the image to be processed; and determining the scene corresponding to the image according to the difference or ratio between the brightness information corresponding to the foreground object and that corresponding to the background object.
  • the foreground object in the image may be an object in the image that is closest to the photographing device, and the background object in the image may be an object other than the foreground object.
  • the distance between the object and the shooting device can be determined by the change trend of the sharpness of the image area where the object is located.
  • If the difference or ratio between the brightness information corresponding to the foreground object and that corresponding to the background object is large, the brightness of the foreground object in the image is much greater than that of the background object, and the current scene can be determined to be a point light source scene.
  • In yet another optional implementation, determining the scene corresponding to the image according to the image to be processed may include: determining the scene corresponding to the image by detecting whether a preset object exists in the image to be processed. For example, if there is a human face in the image, the current scene can be regarded as a human face scene.
  • the scene detection can be realized quickly and accurately.
  • The above scene detection schemes can also be combined. For example, it can first be detected whether a preset object exists in the image; if it exists, the current scene is determined to be the scene corresponding to the preset object. If it does not exist, the brightness information of the foreground object and the background object in the image can be detected to determine whether the current scene is a point light source scene; if it is not, the environmental brightness information can be used to determine whether the current scene is a strong light scene, a low light scene, or a normal light scene.
  • the filters to be analyzed are strictly and scientifically evaluated, and filters suitable for different scenarios are determined.
  • the current scene can be detected through the captured image, and the filter corresponding to the scene can be selected for shooting, which effectively improves the efficiency and accuracy of focusing, and improves the shooting effect.
  • FIG. 9 is a schematic structural diagram of an image processing device provided by an embodiment of the present invention.
  • the image processing device may execute the image processing method corresponding to FIG. 1 to FIG. 8.
  • the image processing device may include:
  • the memory 11 is used to store computer programs
  • the processor 12 is configured to run a computer program stored in the memory to realize:
  • acquiring multiple frames of images to be processed, where the multiple frames of images to be processed include multiple frames of images taken during the lens movement, the multiple frames of images to be processed are multiple frames of images corresponding to a scene, and the multiple frames of images corresponding to the scene are multiple frames of images obtained by shooting a scene that matches the scene;
  • filtering the multiple frames of images to be processed through a first filter to be analyzed to obtain a filtering result corresponding to each frame of image, and evaluating the first filter according to the filtering result corresponding to each frame of image;
  • determining, according to the evaluation result corresponding to the first filter, whether the first filter is a filter suitable for the scene.
  • the structure of the image processing apparatus may further include a communication interface 13 for communicating with other devices or a communication network.
  • processor 12 is further configured to:
  • if the first filter is a filter suitable for the scene, perform focus processing according to the filtering result corresponding to each frame of image.
  • processor 12 is further configured to:
  • filter the multiple frames of images to be processed through a second filter to be analyzed, and evaluate the second filter according to the filtering result corresponding to the second filter.
  • the processor 12 when determining whether the first filter is a filter suitable for the scene according to the evaluation result corresponding to the first filter, the processor 12 is specifically configured to:
  • a filter suitable for the scene is selected from the first filter and the second filter.
  • the processor 12 when evaluating the first filter according to the filtering result corresponding to each frame of image, the processor 12 is specifically configured to:
  • the score corresponding to the first filter is determined.
  • When determining the score corresponding to the first filter according to the filtering result corresponding to each frame of image, the processor 12 is specifically configured to:
  • the score corresponding to the first filter is determined according to the focal value corresponding to each frame of image.
  • the filtering result corresponding to each frame of image includes a signal value corresponding to each pixel in at least some of the pixels of the image after the filtering process;
  • When determining the focal value corresponding to each frame of image according to the filtering result corresponding to each frame of image, the processor 12 is specifically configured to:
  • the signal values corresponding to the at least part of the pixels in the image are accumulated to obtain the focal value corresponding to the image.
  • When determining the score corresponding to the first filter according to the focal value corresponding to each frame of image, the processor 12 is specifically configured to:
  • the score corresponding to the first filter is determined according to the focal value curve.
  • When determining the focal value curve according to the focal value corresponding to each frame of image, the processor 12 is specifically configured to:
  • the abscissa of the numerical point is the serial number of the image, and the ordinate is the focal value corresponding to the image;
  • the focal value curve is generated according to the determined multiple numerical points.
  • When determining the score corresponding to the first filter according to the focal value curve, the processor 12 is specifically configured to:
  • the score corresponding to the first filter is determined by at least one of the following:
  • curve contrast, the first ratio, the second ratio, the highest-value opening size, and curve monotonicity,
  • the first ratio is the proportion of numerical points whose focal value is higher than the first threshold; the second ratio is the proportion of numerical points whose focal value is lower than the second threshold.
  • When the score corresponding to the first filter is determined by curve contrast, the processor 12 is specifically configured to:
  • the M and N are positive integers.
  • the M is equal to the value obtained by multiplying the number of frames of the image to be processed by the first coefficient and then rounding
  • the N is equal to the value obtained by multiplying the number of frames of the image to be processed by the second coefficient and then rounding.
  • the ratio between the mean value of the focal values corresponding to the N numerical points and the mean value of the focal values corresponding to the M numerical points has a positive correlation with the score of the curve contrast.
  • When the score corresponding to the first filter is determined by the first ratio, the processor 12 is specifically configured to:
  • the score of the first ratio corresponding to the first filter is determined according to the ratio of the number of numerical points with the focal value greater than the first threshold to the number of all numerical points in the focal value curve.
  • the ratio of the number of numerical points with the focal value greater than the first threshold to the number of all numerical points in the focal value curve has a negative correlation with the score of the first ratio.
  • When the score corresponding to the first filter is determined by the second ratio, the processor 12 is specifically configured to:
  • the score of the second ratio corresponding to the first filter is determined according to the ratio of the number of numerical points whose focus value is less than the second threshold to the number of all numerical points in the focus value curve.
  • the ratio of the number of numerical points with the focal value less than the second threshold to the number of all numerical points in the focal value curve is positively correlated with the score of the second ratio.
  • When the score corresponding to the first filter is determined by the highest-value opening size, the processor 12 is specifically configured to:
  • the score of the highest value opening size corresponding to the first filter is determined according to the curvature.
  • the curvature and the score of the highest value opening size have a positive correlation.
  • When the score corresponding to the first filter is determined by the monotonicity of the curve, the processor 12 is specifically configured to:
  • the score of the monotonicity of the curve corresponding to the first filter is determined according to the number of monotonically increasing intervals and/or the number of monotonically decreasing intervals.
  • The number of monotonically increasing intervals and/or monotonically decreasing intervals is negatively correlated with the score of the monotonicity of the curve.
  • the scene is any one of the following: a normal light scene, a strong light scene, a low light scene, a point light source scene, and a character scene.
  • processor 12 is further configured to:
  • When the first filter is evaluated according to the filtering result corresponding to each frame of image, the processor 12 is specifically configured to:
  • the first filter is evaluated according to the filtering result corresponding to the image to be processed and the filtering result corresponding to the second image.
  • processor 12 is further configured to:
  • a scene corresponding to the image is determined.
  • When determining the scene corresponding to the image according to the image to be processed, the processor 12 is specifically configured to:
  • the scene corresponding to the image to be processed is determined.
  • When determining the scene corresponding to the image according to the image to be processed, the processor 12 is specifically configured to:
  • the scene corresponding to the image to be processed is determined.
  • When determining the scene corresponding to the image according to the image to be processed, the processor 12 is specifically configured to:
  • the scene corresponding to the image to be processed is determined by detecting whether there is a preset object in the image to be processed.
  • the image processing device shown in FIG. 9 can execute the methods of the embodiments shown in FIG. 1 to FIG. 8. For parts that are not described in detail in this embodiment, reference may be made to the related descriptions of the embodiments shown in FIG. 1 to FIG. 8. For the implementation process and technical effects of this technical solution, please refer to the description in the embodiment shown in FIG. 1 to FIG. 8, which will not be repeated here.
  • FIG. 10 is a schematic structural diagram of another image processing apparatus provided by an embodiment of the present invention.
  • the image processing device may execute the image processing method corresponding to FIG. 1 to FIG. 8.
  • the image processing device may include:
  • the acquisition circuit 21 is configured to acquire multiple frames of images to be processed, the multiple frames of images to be processed including multiple frames of images taken during the movement of the lens, wherein the multiple frames of images to be processed are multiple frames of images corresponding to a scene, and the multiple frames of images corresponding to the scene are obtained by photographing a subject that matches the scene;
  • the filter circuit 22 is configured to filter the multiple frames of images to be processed through the first filter to be analyzed to obtain a filtering result corresponding to each frame of image;
  • the evaluation circuit 23 is configured to evaluate the first filter according to the filtering result corresponding to each frame of image
  • the determining circuit 24 is configured to determine whether the first filter is a filter suitable for the scene according to the evaluation result corresponding to the first filter.
  • the determining circuit 24 is also used for:
  • the first filter is a filter suitable for the scene
  • focus processing is performed on the filtering result corresponding to each frame of image.
  • the filter circuit 22 is also used for:
  • the second filter is evaluated according to the filtering result corresponding to the second filter.
  • When determining whether the first filter is a filter suitable for the scene according to the evaluation result corresponding to the first filter, the determining circuit 24 is specifically configured to:
  • a filter suitable for the scene is selected from the first filter and the second filter.
  • When the first filter is evaluated according to the filtering result corresponding to each frame of image, the evaluation circuit 23 is specifically configured to:
  • the score corresponding to the first filter is determined.
  • When determining the score corresponding to the first filter according to the filtering result corresponding to each frame of image, the evaluation circuit 23 is specifically configured to:
  • the score corresponding to the first filter is determined according to the focal value corresponding to each frame of image.
  • the filtering result corresponding to each frame of image includes a signal value corresponding to each pixel in at least some of the pixels of the image after the filtering process;
  • When determining the focal value corresponding to each frame of image according to the filtering result corresponding to each frame of image, the evaluation circuit 23 is specifically configured to:
  • the signal values corresponding to the at least part of the pixels in the image are accumulated to obtain the focal value corresponding to the image.
  • When determining the score corresponding to the first filter according to the focal value corresponding to each frame of image, the evaluation circuit 23 is specifically configured to:
  • the score corresponding to the first filter is determined according to the focal value curve.
  • the evaluation circuit 23 is specifically configured to:
  • the abscissa of the numerical point is the serial number of the image, and the ordinate is the focal value corresponding to the image;
  • the focal value curve is generated according to the determined multiple numerical points.
  • When determining the score corresponding to the first filter according to the focal value curve, the evaluation circuit 23 is specifically configured to:
  • the score corresponding to the first filter is determined by at least one of the following:
  • curve contrast, the first ratio, the second ratio, the highest-value opening size, and curve monotonicity,
  • the first ratio is the proportion of numerical points whose focal value is higher than the first threshold; the second ratio is the proportion of numerical points whose focal value is lower than the second threshold.
  • the evaluation circuit 23 is specifically configured to:
  • the M and N are positive integers.
  • the M is equal to the value obtained by multiplying the number of frames of the image to be processed by the first coefficient and then rounding
  • the N is equal to the number of frames of the image to be processed multiplied by the second coefficient.
  • the ratio between the mean value of the focal values corresponding to the N numerical points and the mean value of the focal values corresponding to the M numerical points has a positive correlation with the score of the curve contrast.
  • When the score corresponding to the first filter is determined by the first ratio, the evaluation circuit 23 is specifically configured to:
  • the score of the first ratio corresponding to the first filter is determined according to the ratio of the number of numerical points with the focal value greater than the first threshold to the number of all numerical points in the focal value curve.
  • the ratio of the number of numerical points with the focal value greater than the first threshold to the number of all numerical points in the focal value curve has a negative correlation with the score of the first ratio.
  • When the score corresponding to the first filter is determined by the second ratio, the evaluation circuit 23 is specifically configured to:
  • the score of the second ratio corresponding to the first filter is determined according to the ratio of the number of numerical points whose focus value is less than the second threshold to the number of all numerical points in the focus value curve.
  • the ratio of the number of numerical points with the focal value less than the second threshold to the number of all numerical points in the focal value curve is positively correlated with the score of the second ratio.
  • When the score corresponding to the first filter is determined by the highest-value opening size, the evaluation circuit 23 is specifically configured to:
  • the score of the highest value opening size corresponding to the first filter is determined according to the curvature.
  • the curvature and the score of the highest value opening size have a positive correlation.
  • the evaluation circuit 23 is specifically configured to:
  • the score of the monotonicity of the curve corresponding to the first filter is determined according to the number of monotonically increasing intervals and/or the number of monotonically decreasing intervals.
  • The number of monotonically increasing intervals and/or monotonically decreasing intervals is negatively correlated with the score of the monotonicity of the curve.
  • the scene is any one of the following: a normal light scene, a strong light scene, a low light scene, a point light source scene, and a character scene.
  • the filter circuit 22 is also used for:
  • When the first filter is evaluated according to the filtering result corresponding to each frame of image, the evaluation circuit 23 is specifically configured to:
  • the first filter is evaluated according to the filtering result corresponding to the image to be processed and the filtering result corresponding to the second image.
  • evaluation circuit 23 is also used for:
  • a scene corresponding to the image is determined.
  • When determining the scene corresponding to the image according to the image to be processed, the evaluation circuit 23 is specifically configured to:
  • the scene corresponding to the image to be processed is determined.
  • When determining the scene corresponding to the image according to the image to be processed, the evaluation circuit 23 is specifically configured to:
  • the scene corresponding to the image to be processed is determined.
  • When determining the scene corresponding to the image according to the image to be processed, the evaluation circuit 23 is specifically configured to:
  • the scene corresponding to the image to be processed is determined by detecting whether there is a preset object in the image to be processed.
  • the image processing apparatus shown in FIG. 10 can execute the methods of the embodiments shown in FIGS. 1 to 8. It can be understood that the methods of the embodiments shown in FIG. 1 to FIG. 8 can be implemented by hardware circuits. For example, calculating the focus value according to the filtering result can be realized by an accumulator; calculating the score of the filter can be realized by an arithmetic unit corresponding to the scoring method; judging whether the score meets the requirements can be realized by a comparator; and focus processing can be performed by outputting a step signal to the focus motor.
  • An embodiment of the present invention also provides a photographing device, including the image processing device described in any one of the foregoing embodiments.
  • the photographing device may further include a lens, a focus motor, and an image sensor.
  • the focus motor is used to drive the lens to move to change the object distance or image distance.
  • the image sensor is used to convert the light signal passing through the lens into an electric signal to form an image.
  • the shooting device may be a mobile phone, a sports camera, a professional camera, an infrared camera, and the like.
  • the embodiment of the present invention also provides a movable platform including the above-mentioned photographing equipment.
  • the movable platform may be an unmanned aerial vehicle, an unmanned vehicle, or a cloud platform.
  • the movable platform may also include a body and a power system. The photographing equipment and the power system are provided on the body, and the power system is used to provide power for the movable platform.
  • an embodiment of the present invention also provides a computer-readable storage medium having program instructions stored in the computer-readable storage medium, and the program instructions are used to implement the image processing method described in any of the foregoing embodiments.
  • the disclosed related devices and methods can be implemented in other ways.
  • the device embodiments described above are merely illustrative.
  • the division of the modules or units is only a logical function division.
  • there may be other division methods; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place, or they may be distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer readable storage medium.
  • the technical solution of the present invention essentially or the part that contributes to the existing technology or all or part of the technical solution can be embodied in the form of a software product, and the computer software product is stored in a storage medium.
  • the aforementioned storage media include: a USB flash drive, a removable hard disk, read-only memory (Read-Only Memory, ROM), random access memory (Random Access Memory, RAM), magnetic disks, optical disks, and other media that can store program code.


Abstract

Embodiments of the present invention provide an image processing method, apparatus, photographing device, movable platform, and storage medium. The method includes: acquiring multiple frames of images to be processed, the multiple frames including images captured while the lens moves, where the multiple frames to be processed correspond to one scene, the multiple frames corresponding to the scene being obtained by photographing a subject that matches the scene; filtering the multiple frames of images to be processed through a first filter to be analyzed, to obtain a filtering result corresponding to each frame; evaluating the first filter according to the filtering result corresponding to each frame; and determining, according to the evaluation result corresponding to the first filter, whether the first filter is a filter suitable for the scene. The image processing method, apparatus, photographing device, movable platform, and storage medium provided by the embodiments of the present invention can improve the efficiency of selecting a filter for a scene and improve the shooting effect.

Description

Image processing method, apparatus, photographing device, movable platform, and storage medium — Technical field
Embodiments of the present invention relate to the field of image processing technology, and in particular to an image processing method, apparatus, photographing device, movable platform, and storage medium.
Background
With the continuous development of imaging technology, cameras are used ever more widely. At present, most cameras provide an autofocus function, sparing users the process of manual focusing and offering them convenience.
In actual shooting, a filter can be applied to the captured images to assist the autofocus function. Filters come in many types and with many parameters. In the prior art, before a camera leaves the factory, a filter is usually selected for it based on manual experience, which takes considerable time; moreover, the selected filter may perform poorly, leading to poor shooting results.
Summary
Embodiments of the present invention provide an image processing method, apparatus, photographing device, movable platform, and storage medium, to solve the prior-art technical problem of low efficiency in selecting a filter for a camera.
A first aspect of the embodiments of the present invention provides an image processing method, including:
acquiring multiple frames of images to be processed, the multiple frames including images captured while the lens moves, where the multiple frames to be processed correspond to one scene, the multiple frames corresponding to the scene being obtained by photographing a subject that matches the scene;
filtering the multiple frames of images to be processed through a first filter to be analyzed, to obtain a filtering result corresponding to each frame;
evaluating the first filter according to the filtering result corresponding to each frame; and
determining, according to the evaluation result corresponding to the first filter, whether the first filter is a filter suitable for the scene.
A second aspect of the embodiments of the present invention provides an image processing apparatus, including:
a memory for storing a computer program; and
a processor for running the computer program stored in the memory to implement:
acquiring multiple frames of images to be processed, the multiple frames including images captured while the lens moves, where the multiple frames to be processed correspond to one scene, the multiple frames corresponding to the scene being obtained by photographing a subject that matches the scene;
filtering the multiple frames of images to be processed through a first filter to be analyzed, to obtain a filtering result corresponding to each frame;
evaluating the first filter according to the filtering result corresponding to each frame; and
determining, according to the evaluation result corresponding to the first filter, whether the first filter is a filter suitable for the scene.
A third aspect of the embodiments of the present invention provides an image processing apparatus, including:
an acquisition circuit for acquiring multiple frames of images to be processed, the multiple frames including images captured while the lens moves, where the multiple frames to be processed correspond to one scene, the multiple frames corresponding to the scene being obtained by photographing a subject that matches the scene;
a filter circuit for filtering the multiple frames of images to be processed through a first filter to be analyzed, to obtain a filtering result corresponding to each frame;
an evaluation circuit for evaluating the first filter according to the filtering result corresponding to each frame; and
a determining circuit for determining, according to the evaluation result corresponding to the first filter, whether the first filter is a filter suitable for the scene.
A fourth aspect of the embodiments of the present invention provides a photographing device including the image processing apparatus of the second aspect.
A fifth aspect of the embodiments of the present invention provides a photographing device including the image processing apparatus of the third aspect.
A sixth aspect of the embodiments of the present invention provides a movable platform including the photographing device of the fourth aspect.
A seventh aspect of the embodiments of the present invention provides a movable platform including the photographing device of the fifth aspect.
An eighth aspect of the embodiments of the present invention provides a computer-readable storage medium storing program instructions for implementing the image processing method of the first aspect.
The image processing method, apparatus, photographing device, movable platform, and storage medium provided by the embodiments of the present invention can quickly judge whether the first filter is suitable for the current scene, improving the efficiency of selecting a filter for a scene and improving the shooting effect.
Brief description of the drawings
The drawings described here are provided for further understanding of the present invention and form a part of it; the exemplary embodiments of the present invention and their descriptions are used to explain the present invention and do not unduly limit it. In the drawings:
FIG. 1 is a schematic flowchart of an image processing method provided by an embodiment of the present invention;
FIG. 2 is a schematic flowchart of another image processing method provided by an embodiment of the present invention;
FIG. 3 is a schematic flowchart of evaluating a first filter provided by an embodiment of the present invention;
FIG. 4 is a schematic diagram of a focus value curve formed from numeric points provided by an embodiment of the present invention;
FIG. 5 is a schematic flowchart of determining the score corresponding to a first filter from a focus value curve provided by an embodiment of the present invention;
FIG. 6 is a schematic diagram of a focus value curve provided by an embodiment of the present invention;
FIG. 7 is a schematic diagram of another focus value curve provided by an embodiment of the present invention;
FIG. 8 is a schematic flowchart of yet another image processing method provided by an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of an image processing apparatus provided by an embodiment of the present invention;
FIG. 10 is a schematic structural diagram of another image processing apparatus provided by an embodiment of the present invention.
Detailed description
To make the objectives, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Unless otherwise defined, all technical and scientific terms used herein have the same meanings as commonly understood by those skilled in the technical field of the present invention. The terms used in the specification are only for describing specific embodiments and are not intended to limit the present invention.
Embodiments of the present invention provide an image processing method. The method may be executed by an image processing apparatus in a photographing device. It can be understood that the photographing device may be any device with a shooting function, such as a camera, and the image processing apparatus may be implemented as software or a combination of software and hardware.
The image processing method provided by the embodiments of the present invention may acquire multiple frames of images to be processed corresponding to a scene, filter the multiple frames through a first filter to be analyzed to obtain a filtering result for each frame, and evaluate the first filter according to the obtained filtering results, so as to determine whether the first filter is a filter suitable for the scene.
With the image processing method described above, the filter corresponding to a scene can be determined by processing captured images. Filters vary widely in model and parameters, and different filters may yield different effects in the finally captured images. Therefore, different filters can be set for different shooting scenes, so as to better present the photographed subject to the user and meet the shooting needs of each scene.
Some embodiments of the present invention are described in detail below with reference to the drawings. Where the embodiments do not conflict, the following embodiments and their features may be combined with one another.
FIG. 1 is a schematic flowchart of an image processing method provided by an embodiment of the present invention. As shown in FIG. 1, the image processing method of this embodiment may include:
Step 101: Acquire multiple frames of images to be processed, the multiple frames including images captured while the lens moves, where the multiple frames to be processed correspond to one scene, the multiple frames corresponding to the scene being obtained by photographing a subject that matches the scene.
The method in this embodiment can be used to determine the filter corresponding to a scene. The scene may be any scene, such as an indoor or outdoor scene, denoted the scene to be analyzed, and the subject may be anything that matches the scene. Suppose the scene to be analyzed is an indoor scene, i.e., a filter for indoor scenes needs to be determined; the subject may then be any subject matching an indoor scene, such as a living room, bedroom, or kitchen.
After the scene to be analyzed is determined, a subject matching it can be chosen and photographed with a photographing device. The photographing device may include a lens, a focus motor, an image sensor, and the like. Light reflected by the subject converges on the image sensor after passing through the lens, and the image sensor converts the light signal into an electric signal to form an image.
While photographing a subject, the focus motor may drive the lens to move, changing the object distance, so that the sharpness of the photographed subject in the image keeps changing. The movable range of the lens may be determined by the travel of the focus motor, and the lens may be moved manually or automatically.
In this embodiment, multiple frames of the subject captured while the lens moves can be acquired as the images to be processed, where the multiple frames may be all of the images captured within the movable range of the lens or only some of them. The images may be PNG images or images in other formats.
Step 102: Filter the multiple frames of images to be processed through a first filter to be analyzed, to obtain a filtering result corresponding to each frame.
The first filter may be an infinite impulse response (IIR) filter or a finite impulse response (FIR) filter. In one implementation, one filter is selected from a filter set including multiple filters as the first filter to be analyzed; the filters in the set differ in type or in parameters.
The role of the first filter is to capture the effective signal in the image; subsequent focus processing can be completed based on the filtered result. If the first filter is poorly designed, the focusing result may be inaccurate, easily causing focus failure, so that the finally captured image is blurred and the user experience suffers.
Step 103: Evaluate the first filter according to the filtering result corresponding to each frame.
After the multiple frames to be processed are acquired, they can each be filtered through the first filter, and the first filter is then evaluated according to the filtered results.
Evaluating the first filter according to the filtering results can be implemented in multiple ways. Optionally, the first filter can be scored according to the filtering results: the more the results match expectations, the higher the score, and vice versa. Alternatively, the filtering results can be used to judge whether the first filter is qualified: if a preset condition is satisfied, the first filter is considered qualified; otherwise it is considered unqualified.
The specific evaluation strategy for the first filter, for example how to score it or how to judge whether it is qualified, can be set according to actual needs. It can be understood that different cameras may have different shooting indicators or requirements, which can be referenced when determining the evaluation strategy. In addition, the evaluation strategies for different scenes may be the same or different; this embodiment does not limit this.
In an optional implementation, the sharpness of each frame can be determined from its filtering result, and the first filter is evaluated according to the sharpness.
For example, if the final presentation of the image matters, the sharpness of each frame can be determined from the filtering results, the sharpest frame among them is found, and whether its sharpness meets the requirement is judged; if even the sharpest frame does not meet the requirement, the first filter can be considered unqualified, and otherwise qualified.
In one implementation, if the first filter to be analyzed does not meet the requirement, its filter parameters are adjusted, the adjusted first filter is applied to the above multiple frames to be processed to obtain a new filtering result for each image, and the new filtering results are evaluated to determine whether the adjusted first filter is suitable for the scene. In another implementation, if the first filter does not meet the requirement, another filter is selected from the filter set, applied to the multiple frames to obtain a new filtering result for each image, and evaluated to determine whether that filter is suitable for the scene.
If focusing speed matters, the sharpness of each frame can be determined from the filtering results, and from the sharpness changes between frames it is judged whether the sharpest frame can be selected quickly; if not, the first filter is considered unqualified, and otherwise qualified.
Judging whether the sharpest frame can be selected quickly can be implemented according to the method chosen for focusing. For example, if hill climbing is used for focusing in actual shooting, the ideal sharpness trend should increase monotonically to a peak and then decrease monotonically. If this condition is satisfied, the sharpest frame can be considered quickly selectable. If another focusing method is used in actual shooting, a corresponding method can likewise be chosen to judge whether the sharpest frame can be selected quickly, completing the evaluation of the first filter.
The sharpness obtained after filtering the images through the first filter, as above, enables evaluation of the first filter. The sharpness of an image can be determined from the filtering result in any way; this embodiment does not limit this.
In another optional implementation, the smoothness of each filtered frame can be determined from the filtering results of the first filter, and whether the requirement is satisfied is determined according to the smoothness.
In some scenes, the smoother the filtered image the better, and object edges in the image should not be too sharp; in other scenes, the sharper the filtered image the better, and object edges should not be too blurred. The first filter can then be evaluated against the smoothness requirement corresponding to the scene to be analyzed.
Specifically, after the multiple frames to be processed are filtered through the first filter, the smoothness of the filtered images can be determined from the filtering results; for example, adjacent pixels can be compared to determine the smoothness of an image. If the smoothness of every frame meets the preset smoothness requirement, the first filter can be considered qualified; otherwise it is considered unqualified.
Step 104: Determine, according to the evaluation result corresponding to the first filter, whether the first filter is a filter suitable for the scene.
From the foregoing analysis, the evaluation result can take many forms, for example a specific score, or qualified versus unqualified. The correspondence between the evaluation result and suitability can be set according to actual needs.
In an optional implementation, the evaluation result may be qualified or unqualified. If the filter's evaluation result is qualified, the first filter is determined to be a filter suitable for the scene; if the evaluation result is unqualified, the first filter is determined to be unsuitable for the scene.
In another optional implementation, the evaluation result may include a score for the first filter. If the score exceeds a certain score threshold (for example, value A), the first filter is determined to be suitable for the scene; otherwise it is determined to be unsuitable.
In practical applications, when shooting in a certain scene, captured images can be continuously acquired while the lens moves, the images can be filtered through a filter, and whether the in-focus position has been reached can be judged from the image's frequency-domain filtering information. However, the frequency band sensitive to focus differs across scenes, and this difference has a great influence on whether the in-focus position can be found; therefore, the choice of filter is crucial to the actual shooting effect.
In this embodiment, the method described above can be used to evaluate the first filter and determine whether it is suitable for the scene to be analyzed. If the first filter is suitable for the scene to be analyzed, it meets the shooting needs of that scene, and the correspondence between the first filter and the scene can be stored in the photographing device. When the user shoots in that scene with the device, the first filter corresponding to the scene is used to filter the captured images, so that the filtering effect meets the shooting needs of the scene and the shooting effect improves.
Alternatively, the method described above can be applied during actual shooting. Specifically, during the user's actual shooting, the method of steps 101 to 104 can be used to evaluate the filter. Further, after step 104, if the first filter is suitable for the scene, focus processing can be performed according to the filtering result of each frame, so as to obtain a sharp image.
The focus processing may specifically mean determining the in-focus position from the filtering results. While the lens moves, objects in the captured images usually go from blurred to sharp to blurred again; the lens position where the object is sharpest can be regarded as the in-focus position. There are many specific ways to determine the in-focus position from the filtering results, which this embodiment does not limit. The above method can be applied in contrast-detection autofocus (CDAF) schemes as well as other focusing schemes.
With the image processing method provided by this embodiment, multiple frames of images to be processed can be acquired, the multiple frames including images captured while the lens moves, where the multiple frames correspond to one scene and are obtained by photographing a subject that matches the scene; the multiple frames are filtered through a first filter to be analyzed to obtain a filtering result for each frame; the first filter is evaluated according to the filtering result of each frame; and whether the first filter is suitable for the scene is determined according to its evaluation result. Whether the first filter suits the current scene can thus be judged quickly, improving the efficiency of selecting a filter for the scene and the shooting effect.
The solution provided by the embodiments of the present invention can judge whether a certain filter is suitable for the current scene; on this basis, a filter suitable for the current scene can be selected from multiple filters.
Specifically, when selecting a filter for a scene, multiple filters can be designed in advance, and one suitable for the scene is selected from them. It can be understood that there are many filter types, and a filter's specific parameters can also be adjusted; different filters can be designed by changing the type and/or parameters, and a filter suitable for the current scene can then be selected from the designed filters. Therefore, the process of selecting a filter for a scene can be understood as a process of tuning filter parameters.
In some solutions, selecting a filter for a scene relies on the device manufacturer's empirical tuning; there is no complete evaluation method for tuning filters and optimizing their parameters for different scenes. This experience-based selection takes considerable time, and the finally selected filter does not perform well.
In the embodiments of the present invention, the performance of different filters can be evaluated from images corresponding to the current scene, effectively reducing the time spent on tuning; moreover, the finally selected filter can effectively increase focusing accuracy, greatly improving the robustness of the focusing algorithm and shortening the filter tuning cycle.
FIG. 2 is a schematic flowchart of another image processing method provided by an embodiment of the present invention. FIG. 2 takes two filters as an example to illustrate how to select, from multiple filters, a filter suitable for the current scene. As shown in FIG. 2, the image processing method may include:
Step 201: Acquire multiple frames of images to be processed, the multiple frames including images captured while the lens moves, where the multiple frames to be processed correspond to one scene, the multiple frames corresponding to the scene being obtained by photographing a subject that matches the scene.
Step 202: Filter the multiple frames of images to be processed through a first filter to be analyzed, and evaluate the first filter according to its filtering results.
Specifically, the multiple frames to be processed can be filtered through the first filter to obtain a filtering result for each frame, and the first filter is evaluated according to the filtering result of each frame.
For the specific implementation of evaluating the first filter, refer to the foregoing embodiments; it is not repeated here.
Step 203: Filter the multiple frames of images to be processed through a second filter to be analyzed, and evaluate the second filter according to its filtering results.
The second filter may be any filter different from the first. It can be understood that different filters can be obtained by changing the filter type and/or parameters.
Optionally, the filter type may include, but is not limited to: Butterworth filters, Chebyshev filters, Bessel filters, elliptic filters, and so on.
The filter's parameters may be its frequency-response coefficients, and may specifically include, but are not limited to: passband cutoff frequency, stopband cutoff frequency, passband attenuation, stopband attenuation, low cutoff frequency, high cutoff frequency, passband width, stopband width, and so on.
Filters of different types and/or parameters can be regarded as different filters. In step 203, a second filter different from the first can be evaluated. Specifically, the multiple frames to be processed can be filtered through the second filter to obtain a filtering result for each frame, and the second filter is evaluated according to its filtering result for each frame.
The specific implementation of evaluating the second filter is similar to that of evaluating the first filter and is not repeated here.
It can be understood that the order of the steps in the embodiments of the present invention is only an example; the order of steps 202 and 203 can be adjusted according to actual needs. For example, the second filter can be evaluated first and then the first, or the first and second filters can be evaluated simultaneously.
Step 204: Select, from the first filter and the second filter, a filter suitable for the scene according to the evaluation result of the first filter and the evaluation result of the second filter.
Step 204 realizes determining, according to the first filter's evaluation result, whether the first filter is a filter suitable for the scene. It can be understood that if the first filter is selected from the two as the one suitable for the scene, the first filter is suitable for the scene; if the second filter is selected as the one suitable for the scene, the first filter can be considered unsuitable for the scene.
According to the evaluation results of the first and second filters, the one with the better evaluation result can be selected from the two as the filter suitable for the scene.
For example, if the evaluation result is a score, the higher-scoring filter can be selected from the first and second filters as the one suitable for the scene.
If the evaluation result takes the form of qualified or unqualified, it is judged whether exactly one of the first and second filters is qualified; if so, the qualified filter is selected as the one suitable for the scene.
Optionally, if the first and second filters have the same score, or both are qualified, other strategies can also be combined, for example selecting the filter with lower implementation cost as the one suitable for the scene.
FIG. 2 takes two filters as an example to describe selecting, from multiple filters, one filter suitable for the scene to be analyzed. In practical applications, the filters to be analyzed are not limited to two. When a filter suitable for the scene needs to be selected from more than two filters, each filter can be applied to the images to be processed to obtain a filtering result for each frame, each filter is evaluated according to its per-frame filtering results, and finally a filter suitable for the scene is selected from the multiple filters according to their evaluation results.
Optionally, a Butterworth filter can be chosen as the filter to be analyzed. The Butterworth filter has a flat frequency-response curve in the passband and is simple and easy to implement. Its parameters may include passband gain, passband width, low cutoff band, and so on.
When selecting a filter for a scene, multiple Butterworth filters can be designed in advance, with different specific parameters for different filters; for example, the cutoff frequency of the first filter is f1 and that of the second is f2. Then a subject matching the scene is photographed to obtain the images to be processed; the pre-designed Butterworth filters each filter the images to be processed, each filter is evaluated according to its filtering results, and a filter suitable for the scene is selected from the multiple filters according to the evaluation results.
With the image processing method provided by this embodiment, after the images to be processed corresponding to the scene to be analyzed are acquired, the first filter and the second filter can each filter the images to be processed, and the first and second filters can each be evaluated according to their respective filtering results, so that a filter suitable for the scene is selected from the first and second filters. Whether multiple filters suit the scene can thus be analyzed, and a corresponding filter can be selected for the scene from the multiple filters, effectively improving the efficiency and accuracy of selecting a filter for a scene.
FIG. 3 is a schematic flowchart of evaluating the first filter provided by an embodiment of the present invention. Based on the technical solutions of the foregoing embodiments, optionally, the score corresponding to the first filter can be determined according to the filtering result of each frame. As shown in FIG. 3, determining the first filter's score according to the filtering result of each frame may include:
Step 301: Determine the focus value corresponding to each frame according to its filtering result.
In one implementation, a Fourier transform is applied to the grayscale values of the image's pixels to obtain frequency components of the image grayscale (for example, including the amplitudes of the frequencies). The frequency components (e.g., the amplitudes) are then normalized. In the normalized frequency components, the frequency range (or frequencies) corresponding to the region of interest (i.e., the focus area) is determined. Corresponding digital filtering is then applied to that frequency range (or frequencies). After filtering, the frequencies in the range are accumulated, and the accumulated value is the focus value. It should be noted that the frequency components indicate the image's energy.
The focus value can represent the sharpness of an image; the sharpness may refer to the overall sharpness of the image or the sharpness of part of it. When photographing a subject, multiple frames captured while the lens moves can be acquired; for a single frame, the larger its focus value, the sharper the image.
Optionally, the filtering result of each frame may include a signal value corresponding to each pixel, after filtering, in at least some of the image's pixels. Accordingly, determining the focus value of each frame from its filtering result may include: for each frame, accumulating the signal values corresponding to the at least some pixels in the image to obtain the focus value corresponding to the image. The at least some pixels are pixels of a region of interest. In one implementation, the corresponding signal values are the frequency values corresponding to those pixels.
In an optional implementation, the filtering result may include a signal value corresponding to each of all pixels of the image. A pixel's signal value is the value of that pixel after filtering, and the value can represent the gradient information corresponding to the pixel.
Accumulating the filtered signal values of all pixels of the image gives the focus value corresponding to the image, thereby reflecting the overall sharpness of the image.
In another optional implementation, the filtering result may include a signal value corresponding to each of some pixels of the image. Accumulating the signal values corresponding to those pixels gives the focus value corresponding to the image.
The partial pixels may be multiple pixels evenly distributed in the image; they can also reflect the overall sharpness of the image while reducing computation and speeding up image processing, thereby improving focusing efficiency.
Alternatively, the partial pixels may include pixels in a region of interest to the user. The region of interest can be determined in many ways. For example, when shooting, the user may tap the area of the frame to focus on, and the region of interest is determined by identifying the tapped position; or semantic analysis can be performed on the image to determine the region of interest, e.g., in selfie mode the area containing a face can serve as the region of interest.
The pixels in the region of interest can serve as the pixels used to compute the focus value; accumulating the signal values corresponding to the pixels in the region of interest gives the focus value corresponding to the image. Computing the focus value from the signal values of the pixels in the region of interest makes the focus value reflect the sharpness of the region of interest in the image, improving focusing accuracy.
The above computes the focus value by summing signal values; besides this, other methods can also be used to compute the focus value. For example, the image's sharpness can be evaluated with an evaluation function whose output serves as the image's focus value. The specific form of the evaluation function can be set according to actual needs; this embodiment does not limit it.
Step 302: Determine the score corresponding to the first filter according to the focus value of each frame.
Optionally, determining the first filter's score according to the focus value of each frame may include: determining a focus value curve according to the focus values corresponding to the frames; and determining the first filter's score according to the focus value curve.
The focus value curve may be the curve formed by connecting the focus values corresponding to the frames. Specifically, determining the focus value curve according to the frames' focus values may include: for each frame, determining the numeric point corresponding to the image in the focus value curve, the abscissa of the point being the serial number of the image and the ordinate the focus value corresponding to the image; and generating the focus value curve from the determined numeric points. The serial numbers of the images can be assigned in shooting order.
FIG. 4 is a schematic diagram of a focus value curve formed from numeric points provided by an embodiment of the present invention. As shown in FIG. 4, the abscissa runs from 1 to 41, indicating that 41 frames were captured of a subject; the focus value of each frame can be computed, forming the focus value curve, with the ordinate being the focus value corresponding to the image. It should be noted that since each frame corresponds to a different lens position, serial numbers 1 to 41 in FIG. 4 denote lens positions; that is, the abscissa corresponds to lens positions 1 to 41, which correspond to frames 1 to 41 respectively.
There can be many ways to determine the first filter's score from the focus value curve. Optionally, the scoring strategy based on the focus value curve can be determined from the actual focusing strategy. For example, if accurate focusing requires steep slopes on both sides of the position with the largest focus value, then when evaluating the first filter, the slopes on both sides of the curve's highest value can serve as a reference for scoring, so that the finally selected filter can satisfy the focusing requirements.
With the image processing method provided by this embodiment, the focus value of each frame can be determined from its filtering result, a focus value curve is determined from the frames' focus values, and the first filter's score is determined from the curve. The focus value curve intuitively reflects the changing trend of image sharpness, enabling fast and accurate evaluation of the first filter and further improving the efficiency and accuracy of selecting a filter for the scene.
在上述实施例提供的技术方案的基础上,可选的是,根据所述焦值曲线确定所述第一滤波器对应的评分,可以包括:根据所述焦值曲线,通过下述至少一项确定所述第一滤波器对应的评分:曲线对比度、第一比值、第二比值、最高值开口大小、曲线单调性。其中,所述第一比值为焦值高于第一阈值的数值点的占比;所述第二比值为焦值低于第二阈值的数值点的占比。下面以图5为例进行说明。
图5为本发明实施例提供的一种根据焦值曲线确定第一滤波器对应的评分的流程示意图。图5所示的方案中,通过曲线对比度、第一比值、第二比值、最高值开口大小以及曲线单调性五个维度来确定所述第一滤波器对应的评分。
具体来说,可以通过所述焦值曲线,确定曲线对比度、第一比值、第二比值、最高值开口大小以及曲线单调性各自对应的评分,并根据得到的五个评分进一步确定所述第一滤波器最终的评分。
如图5所示,根据所述焦值曲线确定所述第一滤波器对应的评分,可以包括:
步骤501、在所述焦值曲线中,确定焦值最大的N个数值点以及焦值最小的M个数值点。
其中,所述M和N均为正整数。
步骤502、根据所述N个数值点对应的焦值的均值与所述M个数值点对应的焦值的均值之间的比值,确定所述第一滤波器对应的曲线对比度的评分。
本实施例中,可以通过步骤501至步骤502来实现对所述第一滤波器对应焦值曲线的曲线对比度进行评分。
其中,所述曲线对比度可以是指曲线中最大的N个值的均值与最小的M个值的均值的比值,所述比值可以在一定程度上反映焦值曲线的好坏程度。
可选的,所述焦值曲线中最大的N个焦值的均值与最小的M个焦值的均值之间的比值,与所述第一滤波器对应的曲线对比度的评分可以为正相关关系。
其中,正相关关系是指,当变量x增大时,变量y也随之增大,即,两个变量的变动方向相同,一个变量x由大到小/由小到大变化时,另一个变量y也由大到小/由小到大变化,那么变量x和变量y可以认为是正相关关系。
在本实施例中,最大的N个焦值的均值与最小的M个焦值的均值之间的比值,与对应的曲线对比度的评分可以为正相关关系,也就是说,所述比值越大,所述曲线对比度的评分可以越高,所述比值越小,所述曲线对比度的评分可以越低,例如所述比值与所述评分之间的关系可以为正比例函数。
在实际拍摄过程中,合焦位置附近的图像的焦值比较大,失焦位置的图像的焦值比较小,合焦和失焦之间的焦值差异越大,对应的拍摄效果就越好,这种情况下第一滤波器对应的曲线对比度的评分较高;如果合焦和失焦之间的焦值差异较小,说明第一滤波器针对当前场景的滤波效果较差,那么对应的评分可以较低。
本实施例中,M和N的取值可以根据实际需要来设置。可选的,所述M可以等于所述待处理的图像的帧数乘以第一系数后取整得到的值,所述N可以等于所述待处理的图像的帧数乘以第二系数后取整得到的值,其中,所述第一系数和所述第二系数均大于0,取整可以为向上取整、向下取整、四舍五入等任意取整方法。
在一些实施方式中,所述第二系数可以大于所述第一系数,因为在镜头移动过程中拍摄的多帧图像中,失焦的图像较多,而接近合焦状态的图像较少,将N设置为大于M,能够有效反映合焦状态和失焦状态下的对比度。
例如，所述第一系数为10%，所述第二系数为20%，假设图像共有41帧，那么所述M可以为4，所述N可以为8，从焦值曲线中找到焦值最大的8个数值点和焦值最小的4个数值点，并以此来对曲线对比度进行评分。
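曲线对比度的计算可以用如下草图示意。其中取整采用四舍五入（前文允许任意取整方法），并假设评分即取最大 N 个焦值的均值与最小 M 个焦值的均值之比本身，实际可替换为任意与该比值正相关的映射：

```python
def contrast_score(focus_values, coef_m=0.10, coef_n=0.20):
    """曲线对比度评分的示意实现。

    M = 帧数 × 第一系数（四舍五入），对应焦值最小的 M 个数值点；
    N = 帧数 × 第二系数（四舍五入），对应焦值最大的 N 个数值点；
    返回两组均值之比，比值越大评分越高（正相关）。
    """
    frames = len(focus_values)
    m = max(1, round(frames * coef_m))
    n = max(1, round(frames * coef_n))
    ordered = sorted(focus_values)
    low_mean = sum(ordered[:m]) / m       # 最小 M 个焦值的均值
    high_mean = sum(ordered[-n:]) / n     # 最大 N 个焦值的均值
    return high_mean / low_mean
```

例如 41 帧时 M=4、N=8，与上文的示例一致。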
步骤503、确定所述焦值曲线的最高值,并统计焦值大于第一阈值的数值点的数量,所述第一阈值为所述最高值与第一比例系数的乘积。
其中,所述最高值可以是所述焦值曲线中的最大焦值,大于所述第一阈值的焦值可以是在数值上接近所述最高值的焦值,所述第一比例系数可以根据实际需要来设置,例如可以为92%,那么,可以统计在最高值的92%至100%之间的焦值数量。
步骤504、根据所述焦值大于第一阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,确定所述第一滤波器对应的所述第一比值的评分。
本实施例中,可以通过步骤503至步骤504来实现对所述第一滤波器对应焦值曲线的第一比值进行评分。其中,所述第一比值为焦值高于第一阈值的数值点的占比。
可选的,所述焦值大于第一阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,与所述第一滤波器对应的第一比值的评分可以为负相关关系。
其中,负相关关系是指,当变量x增大时,变量y随之减小,即,两个变量的变动方向相反,一个变量x由大到小/由小到大变化时,另一个变量y由小到大/由大到小变化,那么变量x和变量y可以认为是负相关关系。
在本实施例中,所述焦值大于第一阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,与所述第一比值的评分为负相关关系,也就是说,所述焦值大于第一阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值越大,对应的评分可以越低,所述比值越小,对应的评分可以越高,例如所述比值与对应的评分之间的关系可以为反比例函数。
在实际拍摄过程中,最高值对应的位置可以认为是合焦位置,接近最高值的焦值越多,在对焦过程中就越难快速准确地找到合焦位置,所以焦值大于第一阈值的数值点的占比与所述第一比值的评分可以为负相关关系,能够使为当前场景选取的滤波器更好地实现辅助对焦的功能,提高图像的拍摄效果。
步骤505、确定所述焦值曲线的最低值,并统计焦值小于第二阈值的数值点的数量,所述第二阈值为所述最低值与第二比例系数的乘积。
其中,所述最低值可以是所述焦值曲线中的最小焦值,小于所述第二阈值的焦值可以是在数值上接近所述最低值的焦值,所述第二比例系数可以根据实际需要来设置,例如可以为120%,那么,可以统计在最低值的100%至120%之间的焦值数量。
步骤506、根据所述焦值小于第二阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,确定所述第一滤波器对应的第二比值的评分。
本实施例中,可以通过步骤505至步骤506来实现对所述第一滤波器对应焦值曲线的第二比值进行评分。其中,所述第二比值为焦值低于第二阈值的数值点的占比。
可选的,所述焦值小于第二阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,与所述第二比值的评分可以为正相关关系。
在实际拍摄过程中，焦值接近最低值的图像越多，说明失焦图像对应的焦值越小，表示第一滤波器对图像滤波的效果越好，因此，所述焦值小于第二阈值的数值点的占比与所述第二比值的评分可以为正相关关系，能够进一步提高图像的拍摄效果。
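步骤503至步骤506所述的第一比值与第二比值，可以按如下草图计算。其中比例系数取前文示例的92%与120%，而"占比到评分"的映射（1 减占比、占比本身）仅是满足负相关/正相关关系的一种假设取法：

```python
def ratio_scores(focus_values, k1=0.92, k2=1.20):
    """第一比值与第二比值评分的示意实现。

    第一比值：焦值大于 最高值×k1 的数值点占比，占比越小评分越高（负相关）；
    第二比值：焦值小于 最低值×k2 的数值点占比，占比越大评分越高（正相关）。
    """
    total = len(focus_values)
    peak = max(focus_values)
    valley = min(focus_values)
    r1 = sum(v > peak * k1 for v in focus_values) / total
    r2 = sum(v < valley * k2 for v in focus_values) / total
    score1 = 1.0 - r1   # 负相关的一种简单取法（假设）
    score2 = r2         # 正相关的一种简单取法（假设）
    return score1, score2
```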
步骤507、确定所述焦值曲线在最高值处对应的曲率。
步骤508、根据所述曲率确定所述第一滤波器对应的最高值开口大小的评分。
本实施例中,可以通过步骤507至步骤508来实现对所述第一滤波器对应焦值曲线的最高值开口大小进行评分。
具体来说,所述焦值曲线在最高值处的开口大小可以通过曲率来表示,可选的,可以通过二次方程对最高值所在的数值点及附近的若干个数值点连成的曲线进行拟合,根据二次方程的特性,可以确定曲线在最高值处的曲率。
可以理解的是，焦值曲线在最高值处对应的曲率越大，说明所述焦值曲线在最高值处的开口越小，开口越小，说明曲线越陡，有利于快速准确地实现对焦。因此，所述曲率与所述最高值开口大小的评分可以为正相关关系。
步骤509、统计所述焦值曲线中单调递增区间的数量和/或单调递减区间的数量。
其中,若相邻两个区间同为单调递增区间,或者同为单调递减区间,那么这两个区间合并为一个区间。
步骤510、根据所述单调递增区间的数量和/或单调递减区间的数量,确定所述第一滤波器对应的曲线单调性的评分。
本实施例中,可以通过步骤509至步骤510来实现对所述第一滤波器对应焦值曲线的曲线单调性进行评分。
理想的焦值曲线应该是在最高值的左边单调上升,在最高值的右边单调下降。所述焦值曲线的波动越多,就越难实现准确对焦,因此,所述单调递增区间和/或单调递减区间的数量与所述曲线单调性的评分可以为负相关关系,从而可以快速、准确地实现对焦。
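步骤509所述的单调区间统计可以按如下草图实现，其中方向相同的相邻区间按步骤509后的说明进行合并：

```python
def monotonic_intervals(focus_values):
    """统计焦值曲线中单调递增区间与单调递减区间的数量。

    方向相同的相邻区间合并为一个区间；焦值相等的相邻点
    不改变方向。区间数量越多，曲线单调性评分越低（负相关）。
    """
    inc = dec = 0
    direction = 0  # 1 表示递增, -1 表示递减, 0 表示未定
    for prev, cur in zip(focus_values, focus_values[1:]):
        if cur > prev:
            d = 1
        elif cur < prev:
            d = -1
        else:
            continue
        if d != direction:       # 方向变化，开启新区间
            if d == 1:
                inc += 1
            else:
                dec += 1
            direction = d
    return inc, dec
```

理想曲线（先升后降）恰好得到一个递增区间和一个递减区间。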
可以理解的是,本实施例中的上述步骤的执行顺序并不限于上述序号所限定的顺序,例如,可以先评价曲线单调性再评价最高值开口大小,也可以先评价最高值开口大小再评价曲线单调性,或者,对曲线单调性和最高值开口大小的评价可以同时进行,本领域技术人员可以根据具体的应用需求和设计需求进行任意配置,在此不再赘述。
步骤511、根据曲线对比度、第一比值、第二比值、最高值开口大小以及曲线单调性对应的评分，确定所述第一滤波器的评分。
可选的,在曲线对比度、第一比值、第二比值、最高值开口大小以及曲线单调性这五个维度中,可以为每个维度分配一权重值,将五个维度对应的评分进行加权求和,得到所述第一滤波器的评分。
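五个维度评分的加权求和可以按如下草图示意，其中等权重仅是一种假设的权重分配，实际权重值可按场景需求设置：

```python
def filter_score(scores, weights=None):
    """将五个维度的评分加权求和，得到第一滤波器的评分。

    scores 依次为：曲线对比度、第一比值、第二比值、
    最高值开口大小、曲线单调性的评分；
    weights 缺省取等权重，仅作示意（假设）。
    """
    if weights is None:
        weights = [0.2] * 5
    return sum(s * w for s, w in zip(scores, weights))
```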
以上给出了通过曲线对比度、第一比值、第二比值、最高值开口大小以及曲线单调性五个维度对第一滤波器进行评价的具体实现原理,在其它可选的实施方式中,也可以仅选择其中部分维度对所述第一滤波器进行评价。
在实际应用中,在对第一滤波器进行评价时,可以利用焦值曲线,采用上述方法实现对第一滤波器的评分,从而确定所述第一滤波器是否为适用于当前场景的滤波器。同理,所述方法也可以用于对其它滤波器进行评价,例如,可以预先设置多个滤波器,通过上述方法对每个滤波器进行评分,并从中选择评分最高的作为适用于当前场景的滤波器。
图6为本发明实施例提供的一种焦值曲线的示意图。图6和图4中的横坐标和纵坐标的定义相似。为求简洁,不再赘述。如图6所示,该焦值曲线的波动较多,应用到实际对焦场景时,对焦很容易停留在波动的部位,即图中序号10至20之间的峰值部分。此外,图6所示的焦值曲线中,曲线最高的地方存在两个点的焦值较为接近,对合焦位置的判断有很大的影响。因此,图6所示的对焦曲线的评分可以较低。
图7为本发明实施例提供的另一种焦值曲线的示意图。图7和图4中的横坐标和纵坐标的定义相似。为求简洁,不再赘述。如图7所示,焦值曲线的波动较少,最高值比较明显,优于图6所示的焦值曲线,在图6和图7对应的滤波器中,可以选择图7对应的滤波器作为适用于当前场景的滤波器。
本实施例提供的根据焦值曲线确定第一滤波器对应的评分的方法,可以基于所述焦值曲线的曲线对比度、第一比值、第二比值、最高值开口大小以及曲线单调性五个维度确定所述第一滤波器对应的评分,从而更好地实现对第一滤波器的评价,使得为当前场景筛选出的滤波器更加符合所述场景的拍摄需求,通过筛选出的滤波器对实际拍摄的图像进行滤波处理,能够使滤波后的图像清晰程度变化趋势较好,滤波结果形成的焦值曲线更加符合对焦处理的需求,从而快速准确地实现对焦,提高图像的拍摄效果。
在上述实施例提供的技术方案的基础上，可选的是，针对一个场景，可以对多个符合该场景的景物进行拍摄，从而提高滤波器对所述场景的适应性。
图8为本发明实施例提供的又一种图像处理方法的流程示意图。如图8所示,所述图像处理方法可以包括:
步骤801、获取多帧待处理的图像,所述多帧待处理的图像包括在镜头移动过程中拍摄的多帧图像,其中,所述多帧待处理的图像为一场景对应的多帧图像,所述场景对应的多帧图像为对符合该场景的第一景物进行拍摄得到的多帧图像。
步骤802、通过待分析的第一滤波器,对所述多帧待处理的图像进行滤波,得到每一帧图像对应的滤波结果。
本实施例中,步骤801至步骤802的具体实现原理和过程可以参见前述实施例,此处不再赘述。
步骤803、获取多帧第二图像,所述多帧第二图像为对符合所述场景的第二景物进行拍摄得到的多帧图像。
步骤804、通过所述第一滤波器,对所述多帧第二图像进行滤波,得到每一帧第二图像对应的滤波结果。
可选的，符合所述场景的景物可以有很多个，假设所述场景为人脸场景，那么不同的脸部可以认为是符合所述场景的不同景物。例如，所述第一景物可以为用户A的脸部，所述第二景物可以为用户B的脸部。
本实施例中,对第一景物进行拍摄,可以得到在镜头移动过程中拍摄的多帧图像,记为待处理的图像;对第二景物进行拍摄,可以得到在镜头移动过程中拍摄的多帧图像,记为第二图像。可以通过所述第一滤波器分别对所述待处理的图像和所述第二图像进行滤波处理。
步骤805、根据所述待处理的图像对应的滤波结果以及所述第二图像对应的滤波结果,对所述第一滤波器进行评价。
步骤806、根据所述第一滤波器对应的评价结果,确定所述第一滤波器是否为适用于所述场景的滤波器。
具体来说,可以根据待处理的图像对应的滤波结果对第一滤波器进行评价,得到第一评分,根据所述第二图像对应的滤波结果,对所述第一滤波器进行评价,得到第二评分,将第一评分与第二评分相加,得到所述第一滤波器对应的总评分。根据所述第一滤波器对应的总评分,确定所述第一滤波器是否为适用于所述场景的滤波器。
以上以两个景物为例,描述了根据多个景物的图像确定第一滤波器是否适用于所述场景的实现方法。在实际应用中,用于对滤波器进行评价的景物也可以不局限于两个。可选的,可以选取符合所述场景的两个以上的景物,对每一个景物分别进行拍摄,并根据拍摄的图像对所述第一滤波器进行评分,根据各个景物对应的评分,确定所述第一滤波器的总评分,并根据所述第一滤波器的总评分确定所述第一滤波器是否为适用于当前场景的滤波器。
将以上各实施例提供的方案结合起来，能够得到一种可选的为拍摄设备配置滤波器的方案，可以涉及多个场景、多个景物和多个滤波器。具体来说，可以首先设置多个拍摄的场景，例如室内场景、室外场景等，针对每一个场景，拍摄符合该场景的多个景物，例如，针对室内场景，分别对多个房屋的室内场景进行拍摄，得到多个景物对应的图像，其中，每个景物对应的图像可以包括对该景物进行拍摄时在镜头移动过程中拍摄的多帧图像。
此外,可以设计多个滤波器,针对每一个滤波器,通过所述滤波器对一场景的多个景物对应的图像进行滤波,并根据滤波结果对所述滤波器进行评分,得到滤波器在该场景下针对多个景物的总评分。在对全部滤波器进行评价后,可以得到每个滤波器对应的总评分,从所述全部滤波器中选择一个总评分最高的作为适用于所述场景的滤波器。针对每个场景实施上述步骤,可以得到每个场景对应的滤波器。
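"对每个场景从多个滤波器中选出总评分最高者"的筛选步骤可以按如下草图示意，其中嵌套字典的数据结构仅为示意性假设：

```python
def select_filters(score_table):
    """为每个场景从多个滤波器中选出总评分最高的一个。

    score_table[scene][filter] 为该滤波器在该场景下
    针对多个景物的总评分（数据结构为示意假设）；
    返回 场景 -> 最优滤波器 的映射。
    """
    return {scene: max(scores, key=scores.get)
            for scene, scores in score_table.items()}
```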
在实际拍摄时,可以根据当前的拍摄场景,调用相应的滤波器对拍摄的图像进行滤波,使得滤波结果符合当前场景的需求。
在另一种可选的为拍摄设备配置滤波器的方案中,可以在实际拍摄过程中,通过预先设计好的多个滤波器分别对当前拍摄的图像进行滤波处理,根据滤波结果对滤波器进行评价,从而在多个滤波器中确定适用于当前场景的滤波器,并根据该滤波器对应的滤波结果进行后续的对焦处理。
本实施例提供的图像处理方法,可以通过对待分析的场景对应的多个景物进行拍摄,从而根据多个景物对应的图像对第一滤波器进行综合评价,确定所述第一滤波器是否为适用于所述场景的滤波器,使得筛选出的滤波器对当前场景有更好的适应性。
在上述实施例提供的技术方案的基础上,各个图像对应的场景可以人工进行选择,或者,也可以根据待处理的图像,确定所述图像对应的场景。
可选的，所述场景可以为下述任意一项：正常光亮场景、强光场景、弱光场景、点光源场景、人物场景等。
在一个可选的实施方式中,根据所述待处理的图像,确定所述图像对应的场景,可以包括:根据所述待处理的图像对应的亮度信息,以及拍摄所述待处理的图像时的曝光参数,确定环境亮度信息;根据所述环境亮度信息,确定所述待处理的图像对应的场景。
其中，所述曝光参数可以包括拍摄图像时对应的通光时间等。在图像拍摄过程中，光打在图像传感器上，可以形成对应的图像。在其它曝光参数不变的情况下，单位时间内图像传感器上的响应可以用于表征环境亮度信息，不同的响应代表环境亮度信息不同。因此，可以通过拍摄的图像的亮度信息以及通光时间确定环境亮度信息。
举例来说,所述图像的亮度信息可以为图像中各像素点对应的亮度值的均值,所述环境亮度信息可以等于所述图像的亮度信息除以通光时间,在所述图像的亮度信息一定的情况下,对应的通光时间越长,说明环境亮度信息越小,对应的通光时间越短,说明环境亮度信息越大。通过图像的亮度信息和曝光参数可以快速、准确地实现对环境亮度信息的检测,能够有效检测出正常光亮场景、强光场景、弱光场景。
其中，若所述环境亮度信息大于第一环境亮度阈值，则可以认为当前场景为强光场景，若所述环境亮度信息小于第二环境亮度阈值，则可以认为当前场景为弱光场景，若所述环境亮度信息小于所述第一环境亮度阈值且大于所述第二环境亮度阈值，则可以认为当前场景为正常光亮场景。
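"环境亮度 = 图像亮度信息 ÷ 通光时间"并按阈值分类的做法可以按如下草图示意，其中两个环境亮度阈值均为示意性假设值：

```python
def classify_by_brightness(mean_luma, exposure_time,
                           strong_th=2000.0, weak_th=200.0):
    """通过图像亮度均值与通光时间估计环境亮度信息并分类。

    环境亮度信息 = 图像亮度均值 / 通光时间；
    strong_th / weak_th 为假设的第一/第二环境亮度阈值。
    """
    ambient = mean_luma / exposure_time
    if ambient > strong_th:
        return "强光场景"
    if ambient < weak_th:
        return "弱光场景"
    return "正常光亮场景"
```

亮度均值相同而通光时间越长，估计出的环境亮度越小，与上文的描述一致。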
在另一可选的实施方式中,根据所述待处理的图像,确定所述图像对应的场景,可以包括:检测所述待处理的图像中前景物体对应的亮度信息和后景物体对应的亮度信息;根据所述前景物体对应的亮度信息和所述后景物体对应的亮度信息之间的差值或比值,确定所述待处理的图像对应的场景。
其中,所述图像中的前景物体可以为所述图像中距离拍摄设备最近的物体,所述图像中的后景物体可以为除前景物体以外的其它物体。物体与拍摄设备的距离远近可以通过物体所在图像区域的清晰程度变化趋势确定。
若所述前景物体对应的亮度信息和所述后景物体对应的亮度信息的差值或比值较大,说明所述图像中前景物体的亮度远大于后景物体,此时可以确定当前场景为点光源场景。
在又一可选的实施方式中，根据所述待处理的图像，确定所述图像对应的场景，可以包括：通过检测所述待处理的图像中是否存在预设物体，确定所述待处理的图像对应的场景。例如，若所述图像中存在人脸，则可以认为当前场景为人脸场景。
通过以上方案,可以快速、准确地实现对场景的检测。在实际应用中,可以将以上的场景检测方案结合起来使用,例如,可以首先检测所述图像中是否存在预设物体,若存在,则确定当前场景为所述预设物体对应的场景,若不存在,则可以检测所述图像中前景物体的亮度信息和后景物体的亮度信息,并判断当前场景是否为点光源场景,若非点光源场景,再通过环境亮度信息判断是否为强光场景、弱光场景或正常光亮场景。
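按"预设物体 → 点光源 → 环境亮度"的顺序级联判断场景，可以用如下草图示意。其中 has_face 作为预设物体检测结果的输入标志，前景/后景亮度比阈值与环境亮度阈值均为示意性假设：

```python
def detect_scene(has_face, fg_luma, bg_luma, mean_luma, exposure_time,
                 point_ratio_th=4.0, strong_th=2000.0, weak_th=200.0):
    """级联式场景检测的示意实现（各阈值均为假设值）。

    先判断是否存在预设物体（此处以人脸为例），
    再以前景/后景亮度比判断点光源场景，
    最后按环境亮度信息区分强光/弱光/正常光亮场景。
    """
    if has_face:
        return "人物场景"
    if bg_luma > 0 and fg_luma / bg_luma > point_ratio_th:
        return "点光源场景"
    ambient = mean_luma / exposure_time
    if ambient > strong_th:
        return "强光场景"
    if ambient < weak_th:
        return "弱光场景"
    return "正常光亮场景"
```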
针对多个场景,按照前述各实施例提供的方法,对待分析的滤波器进行严格、科学地评价,确定适用于各个不同场景的滤波器。在实际拍摄过程中,可以通过拍摄的图像对当前场景进行检测,并选择所述场景对应的滤波器进行拍摄,有效提升对焦的效率和准确率,提高拍摄的效果。
图9为本发明实施例提供的一种图像处理装置的结构示意图。所述图像处理装置可以执行上述图1-图8所对应的图像处理方法,参考附图9所示,所述图像处理装置可以包括:
存储器11,用于存储计算机程序;
处理器12,用于运行所述存储器中存储的计算机程序以实现:
获取多帧待处理的图像,所述多帧待处理的图像包括在镜头移动过程中拍摄的多帧图像,其中,所述多帧待处理的图像为一场景对应的多帧图像,所述场景对应的多帧图像为对符合该场景的景物进行拍摄得到的多帧图像;
通过待分析的第一滤波器,对所述多帧待处理的图像进行滤波,得到每一帧图像对应的滤波结果;
根据所述每一帧图像对应的滤波结果,对所述第一滤波器进行评价;
根据所述第一滤波器对应的评价结果,确定所述第一滤波器是否为适用于所述场景的滤波器。
可选的,该图像处理装置的结构中还可以包括通信接口13,用于与其他设备或通信网络通信。
在一个可实施的方式中,所述处理器12还用于:
若所述第一滤波器为适用于所述场景的滤波器，则对所述每一帧图像对应的滤波结果进行对焦处理。
在一个可实施的方式中,所述处理器12还用于:
通过待分析的第二滤波器,对所述多帧待处理的图像进行滤波;
根据所述第二滤波器对应的滤波结果,对所述第二滤波器进行评价。
在一个可实施的方式中,在根据所述第一滤波器对应的评价结果,确定所述第一滤波器是否为适用于所述场景的滤波器时,所述处理器12具体用于:
根据所述第一滤波器对应的评价结果以及所述第二滤波器对应的评价结果,从所述第一滤波器和所述第二滤波器中选择适用于所述场景的滤波器。
在一个可实施的方式中,在根据所述每一帧图像对应的滤波结果,对所述第一滤波器进行评价时,所述处理器12具体用于:
根据所述每一帧图像对应的滤波结果,确定所述第一滤波器对应的评分。
在一个可实施的方式中,在根据所述每一帧图像对应的滤波结果,确定所述第一滤波器对应的评分时,所述处理器12具体用于:
根据所述每一帧图像对应的滤波结果,确定所述每一帧图像对应的焦值;
根据所述每一帧图像对应的焦值,确定所述第一滤波器对应的评分。
在一个可实施的方式中,每一帧图像对应的滤波结果包括所述图像的至少部分像素点中每个像素点在滤波处理后对应的信号值;
在根据所述每一帧图像对应的滤波结果,确定所述每一帧图像对应的焦值时,所述处理器12具体用于:
针对每一帧图像,将所述图像中所述至少部分像素点对应的信号值进行累加,得到所述图像对应的焦值。
在一个可实施的方式中,在根据所述每一帧图像对应的焦值,确定所述第一滤波器对应的评分时,所述处理器12具体用于:
根据各帧图像对应的焦值,确定焦值曲线;
根据所述焦值曲线确定所述第一滤波器对应的评分。
在一个可实施的方式中，在根据各帧图像对应的焦值，确定焦值曲线时，所述处理器12具体用于：
针对每一帧图像,确定所述图像在焦值曲线中对应的数值点,所述数值点的横坐标为所述图像的序号,纵坐标为图像对应的焦值;
根据所确定的多个数值点生成所述焦值曲线。
在一个可实施的方式中，在根据所述焦值曲线确定所述第一滤波器对应的评分时，所述处理器12具体用于：
根据所述焦值曲线,通过下述至少一项确定所述第一滤波器对应的评分:
曲线对比度、第一比值、第二比值、最高值开口大小、曲线单调性;
其中,所述第一比值为焦值高于第一阈值的数值点的占比;所述第二比值为焦值低于第二阈值的数值点的占比。
在一个可实施的方式中,在通过曲线对比度确定所述第一滤波器对应的评分时,所述处理器12具体用于:
在所述焦值曲线中,确定焦值最大的N个数值点以及焦值最小的M个数值点;
根据所述N个数值点对应的焦值的均值与所述M个数值点对应的焦值的均值之间的比值,确定所述第一滤波器对应的曲线对比度的评分;
其中,所述M和N为正整数。
在一个可实施的方式中,所述M等于所述待处理的图像的帧数乘以第一系数后取整得到的值,所述N等于所述待处理的图像的帧数乘以第二系数后取整得到的值,其中,所述第一系数和所述第二系数均大于0。
在一个可实施的方式中,所述N个数值点对应的焦值的均值与所述M个数值点对应的焦值的均值之间的比值,与所述曲线对比度的评分为正相关关系。
在一个可实施的方式中,在通过第一比值确定所述第一滤波器对应的评分时,所述处理器12具体用于:
确定所述焦值曲线的最高值,并统计焦值大于第一阈值的数值点的数量,所述第一阈值为所述最高值与第一比例系数的乘积;
根据所述焦值大于第一阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,确定所述第一滤波器对应的所述第一比值的评分。
在一个可实施的方式中,所述焦值大于第一阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,与所述第一比值的评分为负相关关系。
在一个可实施的方式中,在通过第二比值确定所述第一滤波器对应的评分时,所述处理器12具体用于:
确定所述焦值曲线的最低值,并统计焦值小于第二阈值的数值点的数量,所述第二阈值为所述最低值与第二比例系数的乘积;
根据所述焦值小于第二阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,确定所述第一滤波器对应的第二比值的评分。
在一个可实施的方式中,所述焦值小于第二阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,与所述第二比值的评分为正相关关系。
在一个可实施的方式中,在通过最高值开口大小确定所述第一滤波器对应的评分时,所述处理器12具体用于:
确定所述焦值曲线在最高值处对应的曲率;
根据所述曲率确定所述第一滤波器对应的最高值开口大小的评分。
在一个可实施的方式中,所述曲率与所述最高值开口大小的评分为正相关关系。
在一个可实施的方式中,在通过曲线单调性确定所述第一滤波器对应的评分时,所述处理器12具体用于:
统计所述焦值曲线中单调递增区间的数量和/或单调递减区间的数量;
根据所述单调递增区间的数量和/或单调递减区间的数量,确定所述第一滤波器对应的曲线单调性的评分。
在一个可实施的方式中,所述单调递增区间和/或单调递减区间的数量与所述曲线单调性的评分为负相关关系。
在一个可实施的方式中,所述场景为下述任意一项:正常光亮场景、强光场景、弱光场景、点光源场景、人物场景。
在一个可实施的方式中,所述处理器12还用于:
获取多帧第二图像,所述多帧第二图像为对符合所述场景的第二景物进行拍摄得到的多帧图像;
通过所述第一滤波器,对所述多帧第二图像进行滤波,得到每一帧第二图像对应的滤波结果。
在一个可实施的方式中,在所述根据所述每一帧图像对应的滤波结果,对所述第一滤波器进行评价时,所述处理器12具体用于:
根据所述待处理的图像对应的滤波结果以及所述第二图像对应的滤波结果,对所述第一滤波器进行评价。
在一个可实施的方式中,所述处理器12还用于:
根据所述待处理的图像,确定所述图像对应的场景。
在一个可实施的方式中,在根据所述待处理的图像,确定所述图像对应的场景时,所述处理器12具体用于:
根据所述待处理的图像对应的亮度信息，以及拍摄所述待处理的图像时的曝光参数，确定环境亮度信息；
根据所述环境亮度信息,确定所述待处理的图像对应的场景。
在一个可实施的方式中,在根据所述待处理的图像,确定所述图像对应的场景时,所述处理器12具体用于:
检测所述待处理的图像中前景物体对应的亮度信息和后景物体对应的亮度信息;
根据所述前景物体对应的亮度信息和所述后景物体对应的亮度信息之间的差值或比值,确定所述待处理的图像对应的场景。
在一个可实施的方式中,在根据所述待处理的图像,确定所述图像对应的场景时,所述处理器12具体用于:
通过检测所述待处理的图像中是否存在预设物体,确定所述待处理的图像对应的场景。
图9所示图像处理装置可以执行图1-图8所示实施例的方法,本实施例未详细描述的部分,可参考对图1-图8所示实施例的相关说明。该技术方案的执行过程和技术效果参见图1-图8所示实施例中的描述,在此不再赘述。
图10为本发明实施例提供的另一种图像处理装置的结构示意图。所述图像处理装置可以执行上述图1-图8所对应的图像处理方法,参考附图10所示,所述图像处理装置可以包括:
获取电路21,用于获取多帧待处理的图像,所述多帧待处理的图像包括在镜头移动过程中拍摄的多帧图像,其中,所述多帧待处理的图像为一场景对应的多帧图像,所述场景对应的多帧图像为对符合该场景的景物进行拍摄得到的多帧图像;
滤波电路22,用于通过待分析的第一滤波器,对所述多帧待处理的图像进行滤波,得到每一帧图像对应的滤波结果;
评价电路23,用于根据所述每一帧图像对应的滤波结果,对所述第一滤波器进行评价;
确定电路24,用于根据所述第一滤波器对应的评价结果,确定所述第一滤波器是否为适用于所述场景的滤波器。
在一个可实施的方式中,所述确定电路24还用于:
若所述第一滤波器为适用于所述场景的滤波器，则对所述每一帧图像对应的滤波结果进行对焦处理。
在一个可实施的方式中,所述滤波电路22还用于:
通过待分析的第二滤波器,对所述多帧待处理的图像进行滤波;
根据所述第二滤波器对应的滤波结果,对所述第二滤波器进行评价。
在一个可实施的方式中,在根据所述第一滤波器对应的评价结果,确定所述第一滤波器是否为适用于所述场景的滤波器时,所述确定电路24具体用于:
根据所述第一滤波器对应的评价结果以及所述第二滤波器对应的评价结果,从所述第一滤波器和所述第二滤波器中选择适用于所述场景的滤波器。
在一个可实施的方式中,在根据所述每一帧图像对应的滤波结果,对所述第一滤波器进行评价时,所述评价电路23具体用于:
根据所述每一帧图像对应的滤波结果,确定所述第一滤波器对应的评分。
在一个可实施的方式中,在根据所述每一帧图像对应的滤波结果,确定所述第一滤波器对应的评分时,所述评价电路23具体用于:
根据所述每一帧图像对应的滤波结果,确定所述每一帧图像对应的焦值;
根据所述每一帧图像对应的焦值,确定所述第一滤波器对应的评分。
在一个可实施的方式中,每一帧图像对应的滤波结果包括所述图像的至少部分像素点中每个像素点在滤波处理后对应的信号值;
在根据所述每一帧图像对应的滤波结果,确定所述每一帧图像对应的焦值时,所述评价电路23具体用于:
针对每一帧图像,将所述图像中所述至少部分像素点对应的信号值进行累加,得到所述图像对应的焦值。
在一个可实施的方式中,在根据所述每一帧图像对应的焦值,确定所述第一滤波器对应的评分时,所述评价电路23具体用于:
根据各帧图像对应的焦值,确定焦值曲线;
根据所述焦值曲线确定所述第一滤波器对应的评分。
在一个可实施的方式中，在根据各帧图像对应的焦值，确定焦值曲线时，所述评价电路23具体用于：
针对每一帧图像,确定所述图像在焦值曲线中对应的数值点,所述数值点的横坐标为所述图像的序号,纵坐标为图像对应的焦值;
根据所确定的多个数值点生成所述焦值曲线。
在一个可实施的方式中,在根据所述焦值曲线确定所述第一滤波器对应的评分时,所述评价电路23具体用于:
根据所述焦值曲线,通过下述至少一项确定所述第一滤波器对应的评分:
曲线对比度、第一比值、第二比值、最高值开口大小、曲线单调性;
其中,所述第一比值为焦值高于第一阈值的数值点的占比;所述第二比值为焦值低于第二阈值的数值点的占比。
在一个可实施的方式中,在通过曲线对比度确定所述第一滤波器对应的评分时,所述评价电路23具体用于:
在所述焦值曲线中,确定焦值最大的N个数值点以及焦值最小的M个数值点;
根据所述N个数值点对应的焦值的均值与所述M个数值点对应的焦值的均值之间的比值,确定所述第一滤波器对应的曲线对比度的评分;
其中,所述M和N为正整数。
在一个可实施的方式中,所述M等于所述待处理的图像的帧数乘以第一系数后取整得到的值,所述N等于所述待处理的图像的帧数乘以第二系数后取整得到的值,其中,所述第一系数和所述第二系数均大于0。
在一个可实施的方式中,所述N个数值点对应的焦值的均值与所述M个数值点对应的焦值的均值之间的比值,与所述曲线对比度的评分为正相关关系。
在一个可实施的方式中,在通过第一比值确定所述第一滤波器对应的评分时,所述评价电路23具体用于:
确定所述焦值曲线的最高值,并统计焦值大于第一阈值的数值点的数量,所述第一阈值为所述最高值与第一比例系数的乘积;
根据所述焦值大于第一阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,确定所述第一滤波器对应的所述第一比值的评分。
在一个可实施的方式中,所述焦值大于第一阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,与所述第一比值的评分为负相关关系。
在一个可实施的方式中,在通过第二比值确定所述第一滤波器对应的评分时,所述评价电路23具体用于:
确定所述焦值曲线的最低值,并统计焦值小于第二阈值的数值点的数量,所述第二阈值为所述最低值与第二比例系数的乘积;
根据所述焦值小于第二阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值，确定所述第一滤波器对应的第二比值的评分。
在一个可实施的方式中,所述焦值小于第二阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,与所述第二比值的评分为正相关关系。
在一个可实施的方式中,在通过最高值开口大小确定所述第一滤波器对应的评分时,所述评价电路23具体用于:
确定所述焦值曲线在最高值处对应的曲率;
根据所述曲率确定所述第一滤波器对应的最高值开口大小的评分。
在一个可实施的方式中,所述曲率与所述最高值开口大小的评分为正相关关系。
在一个可实施的方式中,在通过曲线单调性确定所述第一滤波器对应的评分时,所述评价电路23具体用于:
统计所述焦值曲线中单调递增区间的数量和/或单调递减区间的数量;
根据所述单调递增区间的数量和/或单调递减区间的数量,确定所述第一滤波器对应的曲线单调性的评分。
在一个可实施的方式中,所述单调递增区间和/或单调递减区间的数量与所述曲线单调性的评分为负相关关系。
在一个可实施的方式中,所述场景为下述任意一项:正常光亮场景、强光场景、弱光场景、点光源场景、人物场景。
在一个可实施的方式中,所述滤波电路22还用于:
获取多帧第二图像,所述多帧第二图像为对符合所述场景的第二景物进行拍摄得到的多帧图像;
通过所述第一滤波器,对所述多帧第二图像进行滤波,得到每一帧第二图像对应的滤波结果。
在一个可实施的方式中,在所述根据所述每一帧图像对应的滤波结果,对所述第一滤波器进行评价时,所述评价电路23具体用于:
根据所述待处理的图像对应的滤波结果以及所述第二图像对应的滤波结果,对所述第一滤波器进行评价。
在一个可实施的方式中,所述评价电路23还用于:
根据所述待处理的图像,确定所述图像对应的场景。
在一个可实施的方式中,在根据所述待处理的图像,确定所述图像对应的场景时,所述评价电路23具体用于:
根据所述待处理的图像对应的亮度信息,以及拍摄所述待处理的图像时的曝光参数,确定环境亮度信息;
根据所述环境亮度信息,确定所述待处理的图像对应的场景。
在一个可实施的方式中,在根据所述待处理的图像,确定所述图像对应的场景时,所述评价电路23具体用于:
检测所述待处理的图像中前景物体对应的亮度信息和后景物体对应的亮度信息;
根据所述前景物体对应的亮度信息和所述后景物体对应的亮度信息之间的差值或比值,确定所述待处理的图像对应的场景。
在一个可实施的方式中,在根据所述待处理的图像,确定所述图像对应的场景时,所述评价电路23具体用于:
通过检测所述待处理的图像中是否存在预设物体,确定所述待处理的图像对应的场景。
图10所示图像处理装置可以执行图1-图8所示实施例的方法。可以理解的是,图1-图8所示实施例的方法,可以通过硬件电路来实现。例如,根据滤波结果计算焦值,可以通过累加器来实现;计算滤波器的评分,可以通过与评分方法对应的运算器来实现;判断评分是否满足要求,可以通过比较器来实现;进行对焦处理,可以通过向对焦电机输出步进信号来实现。
本实施例未详细描述的部分,可参考对图1-图8所示实施例的相关说明。该技术方案的执行过程和技术效果参见图1-图8所示实施例中的描述,在此不再赘述。
本发明实施例还提供一种拍摄设备,包括上述任一实施例所述的图像处理装置。
可选的,所述拍摄设备还可以包括镜头、对焦电机和图像传感器。其中,所述对焦电机用于驱动所述镜头移动,以改变物距或像距。所述图像传感器用于将经过所述镜头的光信号转换为电信号,以形成图像。
所述拍摄设备可以为手机、运动相机、专业相机、红外相机等。所述拍摄设备中各部件的结构、功能、执行过程和技术效果可以参见前述实施例中的描述,在此不再赘述。
本发明实施例还提供一种可移动平台，包括以上所述的拍摄设备。所述可移动平台可以为无人机、无人车或者云台等。所述可移动平台还可以包括机体和动力系统。所述拍摄设备和所述动力系统设于所述机体上，所述动力系统用于为所述可移动平台提供动力。
所述可移动平台中各部件的结构、功能、执行过程和技术效果可以参见前述实施例中的描述,在此不再赘述。
另外,本发明实施例还提供一种计算机可读存储介质,所述计算机可读存储介质中存储有程序指令,所述程序指令用于实现上述任一实施例所述的图像处理方法。
以上各个实施例中的技术方案、技术特征在不相冲突的情况下均可以单独使用，或者进行组合，只要未超出本领域技术人员的认知范围，均属于本发明保护范围内的等同实施例。
在本发明所提供的几个实施例中,应该理解到,所揭露的相关装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本发明各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时，可以存储在一个计算机可读取存储介质中。基于这样的理解，本发明的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来，该计算机软件产品存储在一个存储介质中，包括若干指令用以使得计算机处理器(processor)执行本发明各个实施例所述方法的全部或部分步骤。而前述的存储介质包括：U盘、移动硬盘、只读存储器(Read-Only Memory，ROM)、随机存取存储器(Random Access Memory，RAM)、磁盘或者光盘等各种可以存储程序代码的介质。
以上所述仅为本发明的实施例,并非因此限制本发明的专利范围,凡是利用本发明说明书及附图内容所作的等效结构或等效流程变换,或直接或间接运用在其他相关的技术领域,均同理包括在本发明的专利保护范围内。
最后应说明的是:以上各实施例仅用以说明本发明的技术方案,而非对其限制;尽管参照前述各实施例对本发明进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分或者全部技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本发明各实施例技术方案的范围。

Claims (66)

  1. 一种图像处理方法,其特征在于,包括:
    获取多帧待处理的图像,所述多帧待处理的图像包括在镜头移动过程中拍摄的多帧图像,其中,所述多帧待处理的图像为一场景对应的多帧图像,所述场景对应的多帧图像为对符合该场景的景物进行拍摄得到的多帧图像;
    通过待分析的第一滤波器,对所述多帧待处理的图像进行滤波,得到每一帧图像对应的滤波结果;
    根据所述每一帧图像对应的滤波结果,对所述第一滤波器进行评价;
    根据所述第一滤波器对应的评价结果,确定所述第一滤波器是否为适用于所述场景的滤波器。
  2. 根据权利要求1所述的方法,其特征在于,还包括:
    若所述第一滤波器为适用于所述场景的滤波器,则对所述每一帧图像对应的滤波结果进行对焦处理。
  3. 根据权利要求1所述的方法,其特征在于,还包括:
    通过待分析的第二滤波器,对所述多帧待处理的图像进行滤波;
    根据所述第二滤波器对应的滤波结果,对所述第二滤波器进行评价。
  4. 根据权利要求3所述的方法,其特征在于,根据所述第一滤波器对应的评价结果,确定所述第一滤波器是否为适用于所述场景的滤波器,包括:
    根据所述第一滤波器对应的评价结果以及所述第二滤波器对应的评价结果,从所述第一滤波器和所述第二滤波器中选择适用于所述场景的滤波器。
  5. 根据权利要求1所述的方法,其特征在于,根据所述每一帧图像对应的滤波结果,对所述第一滤波器进行评价,包括:
    根据所述每一帧图像对应的滤波结果,确定所述第一滤波器对应的评分。
  6. 根据权利要求5所述的方法,其特征在于,根据所述每一帧图像对应的滤波结果,确定所述第一滤波器对应的评分,包括:
    根据所述每一帧图像对应的滤波结果,确定所述每一帧图像对应的焦值;
    根据所述每一帧图像对应的焦值,确定所述第一滤波器对应的评分。
  7. 根据权利要求6所述的方法,其特征在于,每一帧图像对应的滤波结果包括所述图像的至少部分像素点中每个像素点在滤波处理后对应的信号值;
    根据所述每一帧图像对应的滤波结果,确定所述每一帧图像对应的焦值,包括:
    针对每一帧图像,将所述图像中所述至少部分像素点对应的信号值进行累加,得到所述图像对应的焦值。
  8. 根据权利要求6所述的方法,其特征在于,根据所述每一帧图像对应的焦值,确定所述第一滤波器对应的评分,包括:
    根据各帧图像对应的焦值,确定焦值曲线;
    根据所述焦值曲线确定所述第一滤波器对应的评分。
  9. 根据权利要求8所述的方法,其特征在于,根据各帧图像对应的焦值,确定焦值曲线,包括:
    针对每一帧图像,确定所述图像在焦值曲线中对应的数值点,所述数值点的横坐标为所述图像的序号,纵坐标为图像对应的焦值;
    根据所确定的多个数值点生成所述焦值曲线。
  10. 根据权利要求9所述的方法,其特征在于,根据所述焦值曲线确定所述第一滤波器对应的评分,包括:
    根据所述焦值曲线,通过下述至少一项确定所述第一滤波器对应的评分:
    曲线对比度、第一比值、第二比值、最高值开口大小、曲线单调性;
    其中,所述第一比值为焦值高于第一阈值的数值点的占比;所述第二比值为焦值低于第二阈值的数值点的占比。
  11. 根据权利要求10所述的方法,其特征在于,通过曲线对比度确定所述第一滤波器对应的评分,包括:
    在所述焦值曲线中,确定焦值最大的N个数值点以及焦值最小的M个数值点;
    根据所述N个数值点对应的焦值的均值与所述M个数值点对应的焦值的均值之间的比值,确定所述第一滤波器对应的曲线对比度的评分;
    其中,所述M和N为正整数。
  12. 根据权利要求11所述的方法,其特征在于,所述M等于所述待处理的图像的帧数乘以第一系数后取整得到的值,所述N等于所述待处理的图像的帧数乘以第二系数后取整得到的值,其中,所述第一系数和所述第二系数均大于0。
  13. 根据权利要求11所述的方法，其特征在于，所述N个数值点对应的焦值的均值与所述M个数值点对应的焦值的均值之间的比值，与所述曲线对比度的评分为正相关关系。
  14. 根据权利要求10所述的方法,其特征在于,通过第一比值确定所述第一滤波器对应的评分,包括:
    确定所述焦值曲线的最高值,并统计焦值大于第一阈值的数值点的数量,所述第一阈值为所述最高值与第一比例系数的乘积;
    根据所述焦值大于第一阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,确定所述第一滤波器对应的所述第一比值的评分。
  15. 根据权利要求14所述的方法,其特征在于,所述焦值大于第一阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,与所述第一比值的评分为负相关关系。
  16. 根据权利要求10所述的方法,其特征在于,通过第二比值确定所述第一滤波器对应的评分,包括:
    确定所述焦值曲线的最低值,并统计焦值小于第二阈值的数值点的数量,所述第二阈值为所述最低值与第二比例系数的乘积;
    根据所述焦值小于第二阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,确定所述第一滤波器对应的第二比值的评分。
  17. 根据权利要求16所述的方法,其特征在于,所述焦值小于第二阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,与所述第二比值的评分为正相关关系。
  18. 根据权利要求10所述的方法,其特征在于,通过最高值开口大小确定所述第一滤波器对应的评分,包括:
    确定所述焦值曲线在最高值处对应的曲率;
    根据所述曲率确定所述第一滤波器对应的最高值开口大小的评分。
  19. 根据权利要求18所述的方法,其特征在于,所述曲率与所述最高值开口大小的评分为正相关关系。
  20. 根据权利要求10所述的方法,其特征在于,通过曲线单调性确定所述第一滤波器对应的评分,包括:
    统计所述焦值曲线中单调递增区间的数量和/或单调递减区间的数量;
    根据所述单调递增区间的数量和/或单调递减区间的数量,确定所述第一滤波器对应的曲线单调性的评分。
  21. 根据权利要求20所述的方法,其特征在于,所述单调递增区间和/或单调递减区间的数量与所述曲线单调性的评分为负相关关系。
  22. 根据权利要求1所述的方法,其特征在于,所述场景为下述任意一项:正常光亮场景、强光场景、弱光场景、点光源场景、人物场景。
  23. 根据权利要求22所述的方法,其特征在于,还包括:
    获取多帧第二图像,所述多帧第二图像为对符合所述场景的第二景物进行拍摄得到的多帧图像;
    通过所述第一滤波器,对所述多帧第二图像进行滤波,得到每一帧第二图像对应的滤波结果。
  24. 根据权利要求23所述的方法,其特征在于,所述根据所述每一帧图像对应的滤波结果,对所述第一滤波器进行评价,包括:
    根据所述待处理的图像对应的滤波结果以及所述第二图像对应的滤波结果,对所述第一滤波器进行评价。
  25. 根据权利要求1所述的方法,其特征在于,还包括:
    根据所述待处理的图像,确定所述图像对应的场景。
  26. 根据权利要求25所述的方法,其特征在于,根据所述待处理的图像,确定所述图像对应的场景,包括:
    根据所述待处理的图像对应的亮度信息,以及拍摄所述待处理的图像时的曝光参数,确定环境亮度信息;
    根据所述环境亮度信息,确定所述待处理的图像对应的场景。
  27. 根据权利要求25所述的方法,其特征在于,根据所述待处理的图像,确定所述图像对应的场景,包括:
    检测所述待处理的图像中前景物体对应的亮度信息和后景物体对应的亮度信息;
    根据所述前景物体对应的亮度信息和所述后景物体对应的亮度信息之间的差值或比值,确定所述待处理的图像对应的场景。
  28. 根据权利要求25所述的方法,其特征在于,根据所述待处理的图像,确定所述图像对应的场景,包括:
    通过检测所述待处理的图像中是否存在预设物体,确定所述待处理的图像对应的场景。
  29. 一种图像处理装置,其特征在于,包括:
    存储器,用于存储计算机程序;
    处理器,用于运行所述存储器中存储的计算机程序以实现:
    获取多帧待处理的图像,所述多帧待处理的图像包括在镜头移动过程中拍摄的多帧图像,其中,所述多帧待处理的图像为一场景对应的多帧图像,所述场景对应的多帧图像为对符合该场景的景物进行拍摄得到的多帧图像;
    通过待分析的第一滤波器,对所述多帧待处理的图像进行滤波,得到每一帧图像对应的滤波结果;
    根据所述每一帧图像对应的滤波结果,对所述第一滤波器进行评价;
    根据所述第一滤波器对应的评价结果,确定所述第一滤波器是否为适用于所述场景的滤波器。
  30. 根据权利要求29所述的装置,其特征在于,所述处理器还用于:
    若所述第一滤波器为适用于所述场景的滤波器,则对所述每一帧图像对应的滤波结果进行对焦处理。
  31. 根据权利要求29所述的装置,其特征在于,所述处理器还用于:
    通过待分析的第二滤波器,对所述多帧待处理的图像进行滤波;
    根据所述第二滤波器对应的滤波结果,对所述第二滤波器进行评价。
  32. 根据权利要求31所述的装置,其特征在于,在根据所述第一滤波器对应的评价结果,确定所述第一滤波器是否为适用于所述场景的滤波器时,所述处理器具体用于:
    根据所述第一滤波器对应的评价结果以及所述第二滤波器对应的评价结果,从所述第一滤波器和所述第二滤波器中选择适用于所述场景的滤波器。
  33. 根据权利要求29所述的装置,其特征在于,在根据所述每一帧图像对应的滤波结果,对所述第一滤波器进行评价时,所述处理器具体用于:
    根据所述每一帧图像对应的滤波结果,确定所述第一滤波器对应的评分。
  34. 根据权利要求33所述的装置,其特征在于,在根据所述每一帧图像对应的滤波结果,确定所述第一滤波器对应的评分时,所述处理器具体用于:
    根据所述每一帧图像对应的滤波结果,确定所述每一帧图像对应的焦值;
    根据所述每一帧图像对应的焦值,确定所述第一滤波器对应的评分。
  35. 根据权利要求34所述的装置,其特征在于,每一帧图像对应的滤波结果包括所述图像的至少部分像素点中每个像素点在滤波处理后对应的信号值;
    在根据所述每一帧图像对应的滤波结果,确定所述每一帧图像对应的焦值时,所述处理器具体用于:
    针对每一帧图像,将所述图像中所述至少部分像素点对应的信号值进行累加,得到所述图像对应的焦值。
  36. 根据权利要求34所述的装置,其特征在于,在根据所述每一帧图像对应的焦值,确定所述第一滤波器对应的评分时,所述处理器具体用于:
    根据各帧图像对应的焦值,确定焦值曲线;
    根据所述焦值曲线确定所述第一滤波器对应的评分。
  37. 根据权利要求36所述的装置,其特征在于,在根据各帧图像对应的焦值,确定焦值曲线时,所述处理器具体用于:
    针对每一帧图像,确定所述图像在焦值曲线中对应的数值点,所述数值点的横坐标为所述图像的序号,纵坐标为图像对应的焦值;
    根据所确定的多个数值点生成所述焦值曲线。
  38. 根据权利要求37所述的装置,其特征在于,在根据所述焦值曲线确定所述第一滤波器对应的评分时,所述处理器具体用于:
    根据所述焦值曲线,通过下述至少一项确定所述第一滤波器对应的评分:
    曲线对比度、第一比值、第二比值、最高值开口大小、曲线单调性;
    其中,所述第一比值为焦值高于第一阈值的数值点的占比;所述第二比值为焦值低于第二阈值的数值点的占比。
  39. 根据权利要求38所述的装置,其特征在于,在通过曲线对比度确定所述第一滤波器对应的评分时,所述处理器具体用于:
    在所述焦值曲线中,确定焦值最大的N个数值点以及焦值最小的M个数值点;
    根据所述N个数值点对应的焦值的均值与所述M个数值点对应的焦值的均值之间的比值,确定所述第一滤波器对应的曲线对比度的评分;
    其中,所述M和N为正整数。
  40. 根据权利要求39所述的装置,其特征在于,所述M等于所述待处理的图像的帧数乘以第一系数后取整得到的值,所述N等于所述待处理的图像的帧数乘以第二系数后取整得到的值,其中,所述第一系数和所述第二系数均大于0。
  41. 根据权利要求39所述的装置，其特征在于，所述N个数值点对应的焦值的均值与所述M个数值点对应的焦值的均值之间的比值，与所述曲线对比度的评分为正相关关系。
  42. 根据权利要求38所述的装置,其特征在于,在通过第一比值确定所述第一滤波器对应的评分时,所述处理器具体用于:
    确定所述焦值曲线的最高值,并统计焦值大于第一阈值的数值点的数量,所述第一阈值为所述最高值与第一比例系数的乘积;
    根据所述焦值大于第一阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,确定所述第一滤波器对应的所述第一比值的评分。
  43. 根据权利要求42所述的装置,其特征在于,所述焦值大于第一阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,与所述第一比值的评分为负相关关系。
  44. 根据权利要求38所述的装置,其特征在于,在通过第二比值确定所述第一滤波器对应的评分时,所述处理器具体用于:
    确定所述焦值曲线的最低值,并统计焦值小于第二阈值的数值点的数量,所述第二阈值为所述最低值与第二比例系数的乘积;
    根据所述焦值小于第二阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,确定所述第一滤波器对应的第二比值的评分。
  45. 根据权利要求44所述的装置,其特征在于,所述焦值小于第二阈值的数值点的数量与所述焦值曲线中全部数值点的数量的比值,与所述第二比值的评分为正相关关系。
  46. 根据权利要求38所述的装置,其特征在于,在通过最高值开口大小确定所述第一滤波器对应的评分时,所述处理器具体用于:
    确定所述焦值曲线在最高值处对应的曲率;
    根据所述曲率确定所述第一滤波器对应的最高值开口大小的评分。
  47. 根据权利要求46所述的装置,其特征在于,所述曲率与所述最高值开口大小的评分为正相关关系。
  48. 根据权利要求38所述的装置,其特征在于,在通过曲线单调性确定所述第一滤波器对应的评分时,所述处理器具体用于:
    统计所述焦值曲线中单调递增区间的数量和/或单调递减区间的数量;
    根据所述单调递增区间的数量和/或单调递减区间的数量,确定所述第一滤波器对应的曲线单调性的评分。
  49. 根据权利要求48所述的装置,其特征在于,所述单调递增区间和/或单调递减区间的数量与所述曲线单调性的评分为负相关关系。
  50. 根据权利要求29所述的装置,其特征在于,所述场景为下述任意一项:正常光亮场景、强光场景、弱光场景、点光源场景、人物场景。
  51. 根据权利要求50所述的装置,其特征在于,所述处理器还用于:
    获取多帧第二图像,所述多帧第二图像为对符合所述场景的第二景物进行拍摄得到的多帧图像;
    通过所述第一滤波器,对所述多帧第二图像进行滤波,得到每一帧第二图像对应的滤波结果。
  52. 根据权利要求51所述的装置,其特征在于,在所述根据所述每一帧图像对应的滤波结果,对所述第一滤波器进行评价时,所述处理器具体用于:
    根据所述待处理的图像对应的滤波结果以及所述第二图像对应的滤波结果,对所述第一滤波器进行评价。
  53. 根据权利要求29所述的装置,其特征在于,所述处理器还用于:
    根据所述待处理的图像,确定所述图像对应的场景。
  54. 根据权利要求53所述的装置,其特征在于,在根据所述待处理的图像,确定所述图像对应的场景时,所述处理器具体用于:
    根据所述待处理的图像对应的亮度信息,以及拍摄所述待处理的图像时的曝光参数,确定环境亮度信息;
    根据所述环境亮度信息,确定所述待处理的图像对应的场景。
  55. 根据权利要求53所述的装置,其特征在于,在根据所述待处理的图像,确定所述图像对应的场景时,所述处理器具体用于:
    检测所述待处理的图像中前景物体对应的亮度信息和后景物体对应的亮度信息;
    根据所述前景物体对应的亮度信息和所述后景物体对应的亮度信息之间的差值或比值,确定所述待处理的图像对应的场景。
  56. 根据权利要求53所述的装置,其特征在于,在根据所述待处理的图像,确定所述图像对应的场景时,所述处理器具体用于:
    通过检测所述待处理的图像中是否存在预设物体,确定所述待处理的图像对应的场景。
  57. 一种图像处理装置,其特征在于,包括:
    获取电路,用于获取多帧待处理的图像,所述多帧待处理的图像包括在镜头移动过程中拍摄的多帧图像,其中,所述多帧待处理的图像为一场景对应的多帧图像,所述场景对应的多帧图像为对符合该场景的景物进行拍摄得到的多帧图像;
    滤波电路,用于通过待分析的第一滤波器,对所述多帧待处理的图像进行滤波,得到每一帧图像对应的滤波结果;
    评价电路,用于根据所述每一帧图像对应的滤波结果,对所述第一滤波器进行评价;
    确定电路,用于根据所述第一滤波器对应的评价结果,确定所述第一滤波器是否为适用于所述场景的滤波器。
  58. 根据权利要求57所述的装置,其特征在于,在根据所述每一帧图像对应的滤波结果,对所述第一滤波器进行评价时,所述评价电路具体用于:
    根据所述每一帧图像对应的滤波结果,确定所述第一滤波器对应的评分。
  59. 根据权利要求58所述的装置,其特征在于,在根据所述每一帧图像对应的滤波结果,确定所述第一滤波器对应的评分时,所述评价电路具体用于:
    根据所述每一帧图像对应的滤波结果,确定所述每一帧图像对应的焦值;
    根据所述每一帧图像对应的焦值,确定所述第一滤波器对应的评分。
  60. 根据权利要求59所述的装置,其特征在于,在根据所述每一帧图像对应的焦值,确定所述第一滤波器对应的评分时,所述评价电路具体用于:
    根据各帧图像对应的焦值,确定焦值曲线;
    根据所述焦值曲线确定所述第一滤波器对应的评分。
  61. 根据权利要求60所述的装置,其特征在于,在根据所述焦值曲线确定所述第一滤波器对应的评分时,所述评价电路具体用于:
    根据所述焦值曲线,通过下述至少一项确定所述第一滤波器对应的评分:
    曲线对比度、第一比值、第二比值、最高值开口大小、曲线单调性;
    其中,所述第一比值为焦值高于第一阈值的数值点的占比;所述第二比值为焦值低于第二阈值的数值点的占比。
  62. 一种拍摄设备,其特征在于,包括:权利要求29-56中任一项所述的图像处理装置。
  63. 一种拍摄设备，其特征在于，包括：权利要求57-61中任一项所述的图像处理装置。
  64. 一种可移动平台,其特征在于,包括:权利要求62所述的拍摄设备。
  65. 一种可移动平台,其特征在于,包括:权利要求63所述的拍摄设备。
  66. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质中存储有程序指令,所述程序指令用于实现权利要求1-28中任意一项所述的图像处理方法。
PCT/CN2020/087530 2020-04-28 2020-04-28 图像处理方法、装置、拍摄设备、可移动平台和存储介质 WO2021217427A1 (zh)
