CN114554103A - Image capturing method, image capturing apparatus, and storage medium - Google Patents

Image capturing method, image capturing apparatus, and storage medium

Info

Publication number
CN114554103A
CN114554103A (application number CN202011350468.5A)
Authority
CN
China
Prior art keywords
distance
metering mode
scene
determining
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011350468.5A
Other languages
Chinese (zh)
Inventor
霍文甲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202011350468.5A
Publication of CN114554103A
Legal status: Pending

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 - Circuitry for compensating brightness variation in the scene
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/667 - Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes

Abstract

The present disclosure relates to an image capturing method, an image capturing apparatus, and a storage medium. The image capturing method includes: determining a current distance scene based on a time-of-flight lattice distance, wherein the distance scene corresponds to a metering mode; determining a target metering mode matching the current distance scene; and capturing an image using the target metering mode. According to the embodiments of the present disclosure, the distance scene is identified using the time-of-flight lattice, the target metering mode matching the current distance scene is determined, and image capture is performed in that metering mode, so that the metering mode is adjusted dynamically during image capture, the metering quality is optimized, an ideal image exposure effect is achieved, and user experience is improved.

Description

Image capturing method, image capturing apparatus, and storage medium
Technical Field
The present disclosure relates to the field of terminal technologies, and in particular, to an image capturing method, an image capturing apparatus, and a storage medium.
Background
With the development of electronic technology, the advent of electronically controlled automatic light metering has greatly simplified the once-cumbersome adjustment of shutter speed and aperture during shooting. To obtain a better shooting effect, the metering system of the shooting device measures the brightness of the light reflected by the subject, and different metering modes are selected to automatically meter the image according to different indoor and outdoor shooting scenes. For example, the shooting device provides metering modes such as spot metering, average metering, and center-weighted metering, and adjusts the exposure value of the captured image according to the metering result, thereby obtaining a photograph with a better exposure effect.
In the prior art, a fixed metering window is used during shooting, that is, the entire image is taken as the metering area. For certain scenes this produces a poor exposure effect and cannot meet the shooting requirements.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an image photographing method, an image photographing apparatus, and a storage medium.
According to an aspect of the embodiments of the present disclosure, there is provided an image capturing method including: determining a current distance scene based on the time-of-flight lattice distance, wherein the distance scene and a photometric mode have a corresponding relation; and determining a target light metering mode matched with the current distance scene, and shooting an image by adopting the target light metering mode.
In an embodiment, the current distance scene comprises a first distance scene, a second distance scene, or a third distance scene; determining a current distance scene based on the time-of-flight lattice distance comprises: if the time-of-flight lattice distance is smaller than a first proportional threshold of the time-of-flight lattice limit ranging distance, determining that the current distance scene is the first distance scene; if the time-of-flight lattice distance is larger than a second proportional threshold of the time-of-flight lattice limit ranging distance, determining that the current distance scene is the second distance scene; if the time-of-flight lattice distance is greater than the first proportional threshold of the time-of-flight lattice limit ranging distance and less than the second proportional threshold of the time-of-flight lattice limit ranging distance, determining that the current distance scene is the third distance scene; wherein the first proportional threshold is less than the second proportional threshold.
In an embodiment, the determining the target metering mode matching the current distance scene includes at least one of: when the current distance scene is a second distance scene, determining a local photometric mode or a multipoint photometric mode as the target photometric mode based on the distance distribution concentration degree of all pixels in the image area; when the current distance scene is a third distance scene, determining a central metering mode or a local metering mode as the target metering mode based on the distance distribution concentration degree of all pixels in the central area of the image; and when the current distance scene is the first distance scene, determining that the central light metering mode is the target light metering mode.
In an embodiment, determining that the local metering mode or the multi-point metering mode is the target metering mode based on the concentration degree of the distance distribution of all the pixels in the image area includes: when a first preset condition is met, determining that a local metering mode is the target metering mode, and determining an area corresponding to a pixel meeting the first preset condition as a metering area; when the first preset condition is not met, determining that the multi-point metering mode is the target metering mode, wherein the metering area corresponding to the target metering mode is the whole image area; wherein the first preset condition comprises: the ratio of the number of pixels in the preset distance range to the number of all pixels in the image area is larger than a first preset proportion threshold value.
In an embodiment, determining that the central light metering mode or the local light metering mode is the target light metering mode based on the concentration degree of the distance distribution of all the pixels in the central area of the image includes: when a second preset condition is met, determining that a central metering mode is the target metering mode, and a metering area corresponding to the target metering mode is the image central area; when the second preset condition is not met, determining that the local light metering mode is the target light metering mode; wherein the second preset condition comprises: in the image central area, the ratio of the number of pixels to the number of all pixels in the image area is larger than a second preset ratio threshold.
In an embodiment, before image capturing is performed in the target metering mode, the method further includes: and if the current distance scene is the first distance scene and the current picture zooming value is larger than the threshold value of the preset picture zooming value, adjusting the light metering area based on the ratio of the current picture zooming value to the maximum zooming multiple.
According to still another aspect of the embodiments of the present disclosure, there is provided an image photographing apparatus including: the system comprises a determining module, a light measuring module and a light measuring module, wherein the determining module is used for determining a current distance scene based on a time-of-flight lattice distance, wherein the distance scene and a light measuring mode have a corresponding relation, and determining a target light measuring mode matched with the current distance scene; and the image shooting module is used for shooting images by adopting the target light metering mode.
In an embodiment, the current distance scene comprises a first distance scene, a second distance scene, or a third distance scene; the determining module determines the current distance scene based on the time-of-flight lattice distance in the following manner: if the time-of-flight lattice distance is smaller than a first proportional threshold of the time-of-flight lattice limit ranging distance, determining that the current distance scene is the first distance scene; if the time-of-flight lattice distance is larger than a second proportional threshold of the time-of-flight lattice limit ranging distance, determining that the current distance scene is the second distance scene; if the time-of-flight lattice distance is greater than the first proportional threshold and less than the second proportional threshold of the time-of-flight lattice limit ranging distance, determining that the current distance scene is the third distance scene; wherein the first proportional threshold is less than the second proportional threshold.
In an embodiment, the determining module determines the target light metering mode matching the current distance scene by using at least one of the following modes: when the current distance scene is a second distance scene, determining a local metering mode or a multipoint metering mode as the target metering mode based on the distance distribution concentration degree of all pixels in the image area; when the current distance scene is a third distance scene, determining a central metering mode or a local metering mode as the target metering mode based on the distance distribution concentration degree of all pixels in the central area of the image; and when the current distance scene is the first distance scene, determining that the central light metering mode is the target light metering mode.
In an embodiment, the determining module determines the local metering mode or the multi-point metering mode as the target metering mode based on the concentration degree of the distance distribution of all the pixels in the image area as follows: when a first preset condition is met, determining that a local metering mode is the target metering mode, and determining an area corresponding to a pixel meeting the first preset condition as a metering area; when the first preset condition is not met, determining that the multi-point metering mode is the target metering mode, wherein the metering area corresponding to the target metering mode is the whole image area; wherein the first preset condition comprises: the ratio of the number of pixels in the preset distance range to the number of all pixels in the image area is larger than a first preset proportion threshold value.
In an embodiment, the determining module determines the central metering mode or the local metering mode as the target metering mode based on the concentration degree of the distance distribution of all the pixels in the central area of the image as follows: when a second preset condition is met, determining that a central light metering mode is the target light metering mode, and a light metering area corresponding to the target light metering mode is the central area; when the second preset condition is not met, determining that the local light metering mode is the target light metering mode; wherein the second preset condition comprises: in the central area, the ratio of the number of pixels to the number of all pixels in the image area is greater than a second preset ratio threshold.
In one embodiment, the image photographing apparatus further includes: and the adjusting module is used for adjusting the photometric area based on the ratio of the current image zooming value to the maximum zooming multiple when the current distance scene is the first distance scene and the current image zooming value is greater than the threshold value of the preset image zooming value.
According to still another aspect of the embodiments of the present disclosure, there is provided an image photographing apparatus including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to: performing the image capturing method of any one of the preceding claims.
According to yet another aspect of embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium having instructions stored thereon, which, when executed by a processor of a mobile terminal, enable the mobile terminal to perform any one of the aforementioned image capturing methods.
The technical solution provided by the embodiments of the present disclosure can have the following beneficial effects: the metering mode is adjusted dynamically during image capture, the metering quality is optimized, and an ideal image exposure effect is achieved, thereby improving user experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating an image capturing method according to an exemplary embodiment of the present disclosure.
Fig. 2 is a schematic diagram illustrating a histogram obtained from statistics of the per-pixel distances in a shooting scene according to an exemplary embodiment of the disclosure.
FIG. 3 is a schematic diagram illustrating a distance scene versus field of view according to an exemplary embodiment of the present disclosure.
Fig. 4 is a flowchart illustrating an image capturing method according to still another exemplary embodiment of the present disclosure.
Fig. 5 is a flowchart illustrating an image capturing method according to still another exemplary embodiment of the present disclosure.
Fig. 6 is a flowchart illustrating an image capturing method according to still another exemplary embodiment of the present disclosure.
Fig. 7 is a flowchart illustrating an image capturing method according to still another exemplary embodiment of the present disclosure.
Figs. 8a-8d are schematic diagrams illustrating the relationship between the zoom level and the metering area according to an exemplary embodiment of the present disclosure.
Fig. 9 is a block diagram illustrating an image photographing apparatus according to an exemplary embodiment of the present disclosure.
Fig. 10 is a block diagram illustrating an image photographing apparatus according to still another exemplary embodiment of the present disclosure.
Fig. 11 is a block diagram of an image photographing apparatus according to an exemplary embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
With the development of electronic technology, the advent of electronically controlled automatic light metering has greatly simplified the once-cumbersome adjustment of shutter speed and aperture during shooting. To obtain a better shooting effect, the metering system of the shooting device measures the brightness of the light reflected by the subject, and different metering modes are selected to automatically meter the image according to different indoor and outdoor shooting scenes. For example, the shooting device provides metering modes such as spot metering, average metering, and center-weighted metering, and adjusts the exposure value of the captured image according to the metering result, thereby obtaining a photograph with a better exposure effect.
Time-of-flight ranging performs non-contact distance measurement by the time-of-flight method: near-infrared light emitted by the device is reflected back to the sensor after striking an object, and the distance between the sensor and the object is calculated from the measured round-trip time together with the constant speed of light. The ranging result is accurate, and the anti-interference capability is strong.
In the prior art, a fixed metering window is used during shooting, that is, the entire image is taken as the metering area. For certain scenes this produces a poor exposure effect and cannot meet the shooting requirements.
To address this, the image capturing method of the present disclosure identifies the distance scene, determines the metering mode matching the current distance scene, performs metering in that mode, and thereby dynamically adjusts the metering mode.
Fig. 1 is a flowchart illustrating an image capturing method according to an exemplary embodiment of the present disclosure, the image capturing method including the following steps, as shown in fig. 1.
In step S101, a current distance scene is determined based on the time-of-flight lattice distance, where the distance scene and the light metering mode have a correspondence relationship.
In step S102, a target metering manner that matches the current distance scene is determined, and image capturing is performed using the target metering manner.
In one embodiment of the present disclosure, the image capturing method is applied during shooting, and the time-of-flight lattice is arranged on the shooting device. Based on time-of-flight lattice ranging, the actual position information of each pixel in the field of view can be obtained, yielding the time-of-flight lattice distance. For example, the photosensitive chip of the time-of-flight ranging device may be a planar-array photosensitive chip, which measures the depth information of the surface of the whole three-dimensional object and acquires the surface geometry of the entire shooting scene in real time. The time-of-flight lattice distance corresponds one-to-one with the actual position of each pixel in the field of view; that is, the time-of-flight lattice distance comprises the distance between each pixel in the field of view and the camera.
A metering mode matching the current distance scene is determined, and metering is performed in that mode. The metering mode may be local metering, spot metering, center metering, or the like. Different metering modes may correspond to different metering areas; that is, metering is performed in the determined metering area using the metering mode matching the current distance scene.
According to the embodiments of the present disclosure, the distance scene is identified using the time-of-flight lattice, the metering mode matching the current distance scene is determined, and image capture is performed in that mode, so that the metering mode is adjusted dynamically during image capture, the metering quality is optimized, an ideal image exposure effect is achieved, and user experience is improved.
Fig. 2 is a schematic diagram illustrating a histogram obtained from statistics of the per-pixel distances in a shooting scene according to an exemplary embodiment of the disclosure.
In an embodiment of the present disclosure, the time-of-flight lattice distances are analyzed statistically when determining the current distance scene. For example, the distance of each pixel of the captured scene may be determined based on time of flight, and a histogram of these per-pixel distances represents the distance distribution of the pixels in the scene. A distance distribution threshold may be preset; the pixels whose distribution in the histogram exceeds the preset distance distribution threshold are identified, and the average of the distances of those pixels is taken as the distance of the current scene. Based on the distance of the current scene, the current distance scene is determined. The distance scene and the metering mode have a corresponding relationship.
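As a rough, non-authoritative sketch of this statistical step (the function name estimate_scene_distance, the bin count, and the min_count threshold are illustrative assumptions, not values given in the disclosure), the following Python code builds a histogram of the per-pixel time-of-flight distances, keeps only the pixels falling in histogram bins whose population exceeds a preset threshold, and averages their distances to obtain the current scene distance:
import numpy as np

def estimate_scene_distance(depth_map, num_bins=8, min_count=10):
    # depth_map: 2-D array of per-pixel ToF distances in metres; zeros mark invalid pixels.
    distances = depth_map[depth_map > 0].ravel()
    counts, edges = np.histogram(distances, bins=num_bins)
    dominant = counts > min_count              # bins above the preset distribution threshold
    if not dominant.any():
        return float(distances.mean())         # fall back to a plain average
    bin_idx = np.digitize(distances, edges[1:-1])
    return float(distances[dominant[bin_idx]].mean())

# Example: an 8x8 depth map dominated by a subject about 1.2 m away,
# with one distant background row that the threshold filters out.
demo = np.full((8, 8), 1.2)
demo[0, :] = 4.0
print(estimate_scene_distance(demo))           # approximately 1.2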
In one embodiment of the present disclosure, the current distance scene includes a first distance scene, a second distance scene, or a third distance scene.
The current distance scene is divided according to the limit ranging distance of the time-of-flight lattice. If the time-of-flight lattice distance is smaller than a first proportional threshold of the time-of-flight lattice limit ranging distance, the current distance scene is determined to be the first distance scene. It is understood that the limit ranging distance is the maximum range of the time-of-flight lattice, that is, its calibrated maximum.
If the time-of-flight lattice distance is greater than a second proportional threshold of the time-of-flight lattice limit ranging distance, the current distance scene is determined to be the second distance scene. The first proportional threshold is less than the second proportional threshold.
If the time-of-flight lattice distance is greater than the first proportional threshold and less than the second proportional threshold of the time-of-flight lattice limit ranging distance, the current distance scene is determined to be the third distance scene.
For example, suppose the first proportional threshold of the limit ranging distance is 5% and the second proportional threshold is 30%. If the time-of-flight lattice distance is less than 5% of the time-of-flight lattice limit ranging distance, the current distance scene is determined to be the first distance scene, i.e., a short-distance scene. If the time-of-flight lattice distance is greater than 30% of the limit ranging distance, the current distance scene is determined to be the second distance scene, i.e., a long-distance scene. If the time-of-flight lattice distance is greater than 5% and less than 30% of the limit ranging distance, the current distance scene is determined to be the third distance scene, i.e., a medium-distance scene. The values 5% and 30% are only examples; the first and/or second proportional thresholds may be changed according to the shooting mode of the device, such as telephoto, wide angle, or macro, thereby changing how long, medium, and short distances are divided, as in the sketch below.
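The classification above can be written as a minimal sketch using the 5% and 30% example thresholds (the function name and the returned labels are illustrative, not part of the disclosure):
def classify_distance_scene(tof_distance, limit_distance,
                            first_ratio=0.05, second_ratio=0.30):
    # limit_distance is the calibrated maximum ranging distance of the ToF lattice;
    # first_ratio / second_ratio can be changed per shooting mode (telephoto, wide angle, macro).
    if tof_distance < first_ratio * limit_distance:
        return "first"    # short-distance scene
    if tof_distance > second_ratio * limit_distance:
        return "second"   # long-distance scene
    return "third"        # medium-distance scene

# With a 4 m limit ranging distance, 0.15 m classifies as a short-distance scene
# and 2.0 m classifies as a long-distance scene.
print(classify_distance_scene(0.15, 4.0), classify_distance_scene(2.0, 4.0))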
Fig. 3 is a schematic diagram illustrating the relationship between the distance scene and the field of view according to an exemplary embodiment of the disclosure. As shown in Fig. 3, when a scene is captured by the shooting device, the field of view changes with the object distance: objects in the scene corresponding to a large field of view have widely differing object distances, while objects in the scene corresponding to a small field of view differ less in object distance. Fig. 3 shows, from top to bottom, the field of view at the long distance, the medium distance, and the short distance. Identifying the different distance scenes thus provides a basis for selecting the metering mode.
Fig. 4 is a flowchart illustrating an image capturing method according to still another exemplary embodiment of the present disclosure, the image capturing method including the following steps, as shown in fig. 4.
In step S201, a current distance scene is determined based on the time-of-flight lattice distance, where the distance scene and the photometry mode have a correspondence relationship.
In step S202, when the current distance scene is the second distance scene, the local metering method or the multipoint metering method is determined as the target metering method based on the concentration degree of the distance distribution of all the pixels in the image area.
In step S203, if the current distance scene is the third distance scene, the central metering mode or the partial metering mode is determined as the target metering mode based on the concentration degree of the distance distribution of all the pixels in the central area of the image.
In step S204, in the case where the current distance scene is the first distance scene, it is determined that the center metering mode is the target metering mode.
In step S205, image capturing is performed in the target metering method.
In the embodiment of the present disclosure, when the time-of-flight lattice distance is greater than the second proportional threshold of the time-of-flight lattice limit ranging distance, the current distance scene is determined to be the second distance scene, that is, the long-distance scene. When the distance distribution of all the pixels in the image area is concentrated, the area where the distances are concentrated is selected as the metering area and local metering is performed. When the distance distribution of all pixels in the image area is uniform, the multi-point metering mode is selected, so that the selection of the metering mode is more accurate.
In the embodiment of the present disclosure, when the time-of-flight lattice distance is greater than the first proportional threshold and less than the second proportional threshold of the time-of-flight lattice limit ranging distance, the current distance scene is determined to be the third distance scene, i.e., the medium-distance scene. In the medium-distance scene the corresponding field of view is moderate, so the central metering mode or the local metering mode is determined as the target metering mode based on the concentration degree of the distance distribution of all pixels in the image central area. If the distance distribution of all pixels in the central area is concentrated, the central metering mode is selected; otherwise, the local metering mode is selected. In the embodiment of the present disclosure, the image central area may be an area of a preset size formed by expanding outward in the image plane from the geometric center of the image, as sketched below.
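One possible way to form such a central area is shown below; this is a sketch only, and the 50% window size stands in for the unspecified "preset size":
import numpy as np

def central_region(depth_map, fraction=0.5):
    # Expand a window of the given fraction of the image height and width
    # outward from the geometric centre of the image plane.
    h, w = depth_map.shape
    ch, cw = int(h * fraction), int(w * fraction)
    top, left = (h - ch) // 2, (w - cw) // 2
    return depth_map[top:top + ch, left:left + cw]

print(central_region(np.arange(16).reshape(4, 4)).shape)   # (2, 2) centre window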
In the embodiment of the present disclosure, when the time-of-flight lattice distance is smaller than the first proportional threshold of the time-of-flight lattice limit ranging distance, the current distance scene is determined to be the first distance scene, that is, the short-distance scene. The corresponding field of view is small in the short-distance scene, so the central metering mode is determined as the target metering mode.
According to the embodiment of the present disclosure, based on the time-of-flight lattice distance, the current distance scene is determined to be a long-distance, medium-distance, or short-distance scene, the target metering mode matching the current distance scene is determined, and image capture is performed in that mode, so that the metering mode is adjusted dynamically during shooting, the metering quality is optimized, and an ideal exposure effect is achieved.
Fig. 5 is a flowchart illustrating an image capturing method according to still another exemplary embodiment of the present disclosure, and as shown in fig. 5, the image capturing method includes the following steps.
In step S301, a current distance scene is determined based on the time-of-flight lattice distance, where the distance scene and the photometry mode have a correspondence relationship.
In step S302, if the current distance scene is the second distance scene, when a first preset condition is satisfied, it is determined that the local metering mode is the target metering mode, and an area corresponding to a pixel satisfying the first preset condition is determined as a metering area.
In step S303, if the current distance scene is the second distance scene, when the first preset condition is not satisfied, it is determined that the multi-point metering mode is the target metering mode, and the metering area corresponding to the target metering mode is the entire image area.
In step S304, image capturing is performed in the target metering method.
In an embodiment of the present disclosure, the first preset condition includes: the ratio of the number of pixels within the preset distance range to the number of all pixels in the image area is greater than a first preset proportion threshold. That is, if the pixels lying within the preset distance range account for more than the first preset proportion threshold of all pixels in the image area, the region formed by those pixels is where metering should be performed.
When the time-of-flight lattice distance is greater than the second proportional threshold of the time-of-flight lattice limit ranging distance, the current distance scene is determined to be the second distance scene, that is, the long-distance scene. In the long-distance scene the corresponding field of view is large, so the local metering mode or the multi-point metering mode is selected as the target metering mode based on the concentration degree of the distance distribution of all pixels in the image area.
If, within the preset distance range, the ratio of the number of pixels to the number of all pixels in the image area is greater than the first preset proportion threshold, the pixels within that range are distributed in a concentrated manner; that is, the first preset condition is met, and the area where the distances are concentrated is selected as the metering area for local metering. If the first preset condition is not met, that is, the distance distribution of all pixels in the image area is uniform, multi-point metering is performed over the whole image area. In this way the selection of the metering mode is more accurate. A possible implementation of this decision is sketched below.
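The following hedged sketch illustrates this decision for the long-distance scene; the use of the median as the centre of the "preset distance range", the 0.5 m window, and the 0.6 proportion threshold are all assumptions for illustration rather than values from the disclosure:
import numpy as np

def choose_long_distance_metering(depth_map, window=0.5, ratio_threshold=0.6):
    # Pixels whose distance lies within `window` of the median distance are
    # treated as the concentrated group described by the first preset condition.
    distances = depth_map[depth_map > 0]
    centre = np.median(distances)
    concentrated = (depth_map > 0) & (np.abs(depth_map - centre) <= window)
    if concentrated.sum() / depth_map.size > ratio_threshold:
        return "local", concentrated                           # metering area: concentrated pixels
    return "multi_point", np.ones_like(depth_map, dtype=bool)  # metering area: whole image
Calling it on a depth map returns the chosen mode together with a boolean mask that plays the role of the metering area handed to the exposure routine.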
According to the embodiment of the present disclosure, based on the time-of-flight lattice distance, the current distance scene is determined to be a long-distance scene, the target metering mode matching the long-distance scene is determined, and image capture is performed in that mode, so that the metering mode is adjusted dynamically during shooting, the metering quality is optimized, and an ideal exposure effect is achieved.
Fig. 6 is a flowchart illustrating an image capturing method according to still another exemplary embodiment of the present disclosure, the image capturing method including the following steps, as shown in fig. 6.
In step S401, a current distance scene is determined based on the time-of-flight lattice distance, where the distance scene and the photometry mode have a correspondence relationship.
In step S402, if the current distance scene is the third distance scene, when a second preset condition is satisfied, it is determined that the central metering mode is the target metering mode, and the metering area corresponding to the target metering mode is the image central area.
In step S403, if the current distance scene is the third distance scene, when the second preset condition is not satisfied, it is determined that the local metering mode is the target metering mode.
In step S404, image capturing is performed using a target photometry method.
In an embodiment of the present disclosure, the second preset condition includes: in the image central area, the ratio of the number of pixels to the number of all pixels in the image area is greater than a second preset proportion threshold. When the time-of-flight lattice distance is greater than the first proportional threshold and less than the second proportional threshold of the time-of-flight lattice limit ranging distance, the current distance scene is determined to be the third distance scene, that is, the medium-distance scene. In the medium-distance scene the corresponding field of view is moderate, so the central metering mode or the local metering mode is determined as the target metering mode based on the concentration degree of the distance distribution of all pixels in the image central area. If the distance distribution of all pixels in the image central area is concentrated, the central metering mode is selected; otherwise, the local metering mode is selected. When, in the image central area, the ratio of the number of pixels to the number of all pixels in the image area is greater than the second preset proportion threshold, the distance distribution of the pixels is concentrated; the central metering mode is then set as the target metering mode, the metering area is the image central area, and image capture is performed.
When, in the image central area, the ratio of the number of pixels to the number of all pixels in the image area is smaller than the second preset proportion threshold, the distance distribution of the pixels is uniformly dispersed, and the local metering mode is determined to be the target metering mode. Local metering may then be performed over the distribution range of the pixels to capture the image. A possible implementation of this decision is sketched below.
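The following sketch combines the two branches for the medium-distance scene; treating concentration as closeness to the central area's median distance, together with the 0.5 m window and 0.4 proportion threshold, are illustrative assumptions rather than values from the disclosure:
import numpy as np

def choose_mid_distance_metering(depth_map, centre_fraction=0.5,
                                 window=0.5, ratio_threshold=0.4):
    # Cut out the central area, then compare the number of its pixels that sit
    # near the dominant distance with the total number of pixels in the image.
    h, w = depth_map.shape
    ch, cw = int(h * centre_fraction), int(w * centre_fraction)
    top, left = (h - ch) // 2, (w - cw) // 2
    centre = depth_map[top:top + ch, left:left + cw]
    valid = centre[centre > 0]
    if valid.size == 0:
        return "local"                                    # nothing usable in the centre
    median = np.median(valid)
    concentrated = ((centre > 0) & (np.abs(centre - median) <= window)).sum()
    if concentrated / depth_map.size > ratio_threshold:
        return "central"                                  # meter on the image central area
    return "local"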
According to the embodiment of the present disclosure, based on the time-of-flight lattice distance, the current distance scene is determined to be a medium-distance scene, the target metering mode matching the medium-distance scene is determined, and image capture is performed in that mode, so that the metering mode is adjusted dynamically during shooting, the metering quality is optimized, and an ideal exposure effect is achieved.
Fig. 7 is a flowchart illustrating an image capturing method according to still another exemplary embodiment of the present disclosure, the image capturing method including the following steps, as shown in fig. 7.
In step S501, a current distance scene is determined based on the time-of-flight lattice distance, where the distance scene and the photometry mode have a correspondence relationship.
In step S502, if the current distance scene is the first distance scene, the central metering mode is determined to be the target metering mode, and if the current picture zoom value is greater than the preset zoom threshold, the metering area is adjusted based on the ratio of the current picture zoom value to the maximum zoom factor.
In step S503, image capturing is performed in the target metering mode.
In the embodiment of the present disclosure, when the time-of-flight lattice distance is smaller than the first proportional threshold of the time-of-flight lattice limit ranging distance, the current distance scene is determined to be the first distance scene, i.e., the short-distance scene. The corresponding field of view is small in the short-distance scene, so the central metering mode is determined as the target metering mode.
Figs. 8a-8d are schematic diagrams illustrating the relationship between the zoom level and the metering area according to an exemplary embodiment of the present disclosure. When the zoom level of the picture is large, the field of view is correspondingly small, and the metering area should therefore be correspondingly expanded. Referring to Figs. 8a-8d, the zoom level corresponding to Fig. 8a is greater than that of Fig. 8b; the boxes in the depth maps in Figs. 8a and 8b represent the actual field of view; the solid boxes in Figs. 8c and 8d correspond respectively to the boxes in Figs. 8a and 8b and likewise represent the actual field of view; and the dotted boxes correspond to the metering regions adjusted based on the zoom levels of the pictures in Figs. 8a and 8b.
In the embodiment of the present disclosure, under a short-distance scene, a central light metering mode is selected, and if the current picture zoom value exceeds the threshold of the picture zoom value, the light metering region is adjusted based on the ratio of the current picture zoom value to the maximum zoom multiple. For example, the expansion may be performed at a predetermined ratio with the center region as the center.
In one example, the preset zoom threshold is 5X, the maximum zoom factor is 50X, and the current picture zoom value during shooting is 7X, which exceeds the preset zoom threshold. The metering area can then be dynamically extended according to the following formula.
ROI_region' = ROI_region * (0.95 + roi_thred / max_crop)
In the formula, ROI_region' is the extended metering area, ROI_region is the original metering area before adjustment, roi_thred is the current picture zoom value, and max_crop is the maximum zoom factor. It is understood that the constant 0.95 in the formula may be adjusted according to the parameters of the image capturing device.
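A direct transcription of the formula follows; the function name and the treatment of ROI_region as a single size value are assumptions for illustration, since in practice the region would be a rectangle scaled about the image centre:
def extend_metering_region(roi_region, roi_thred, max_crop, base=0.95):
    # ROI_region' = ROI_region * (base + roi_thred / max_crop);
    # base = 0.95 may be tuned to the parameters of the capture device.
    return roi_region * (base + roi_thred / max_crop)

# Example from the text: zoom threshold 5X, maximum zoom 50X, current zoom 7X.
print(extend_metering_region(roi_region=100, roi_thred=7, max_crop=50))   # 109.0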
According to the embodiment of the present disclosure, based on the time-of-flight lattice distance, the current distance scene is determined to be the first distance scene, that is, the short-distance scene, and the central metering mode is determined to be the metering mode matching it. When the picture is heavily zoomed, the metering region is dynamically corrected and adjusted, and central metering is performed based on the adjusted metering region, which further improves the exposure effect for the short-distance scene.
Based on the same conception, the embodiment of the disclosure also provides an image shooting device.
It is understood that, in order to implement the above functions, the image capturing apparatus provided by the embodiments of the present disclosure includes corresponding hardware structures and/or software modules for performing each function. In combination with the exemplary units and algorithm steps disclosed in the embodiments of the present disclosure, the embodiments can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementations should not be considered to go beyond the scope of the present disclosure.
Fig. 9 is a block diagram illustrating an image photographing apparatus according to an exemplary embodiment of the present disclosure, and referring to fig. 9, the image photographing apparatus 100 includes a determination module 101 and an image photographing module 102.
The determining module 101 determines a current distance scene based on the time-of-flight lattice distance, where the distance scene and the metering mode have a corresponding relationship, and determines a target metering mode matching the current distance scene.
And an image shooting module 102, configured to perform image shooting in a target photometry manner.
In an embodiment, the current distance scene comprises a first distance scene, a second distance scene, or a third distance scene; the determining module 101 determines the current distance scene based on the time-of-flight lattice distance in the following manner: if the time-of-flight lattice distance is smaller than a first proportional threshold of the time-of-flight lattice limit ranging distance, determining that the current distance scene is the first distance scene; if the time-of-flight lattice distance is larger than a second proportional threshold of the time-of-flight lattice limit ranging distance, determining that the current distance scene is the second distance scene; if the time-of-flight lattice distance is larger than the first proportional threshold and smaller than the second proportional threshold of the time-of-flight lattice limit ranging distance, determining that the current distance scene is the third distance scene; wherein the first proportional threshold is less than the second proportional threshold.
In an embodiment, the determining module 101 determines the target metering mode matching the current distance scene by using the following modes: if the current distance scene is a second distance scene, determining that a local photometric mode or a multi-point photometric mode is a target photometric mode based on the distance distribution concentration degree of all pixels in the image area; or if the current distance scene is a third distance scene, determining that the central light metering mode or the local light metering mode is the target light metering mode based on the distance distribution concentration degree of all pixels in the central area of the image; or if the current distance scene is the first distance scene, determining that the central light metering mode is the target light metering mode.
In an embodiment, the determining module 101 determines that the local metering mode or the multi-point metering mode is the target metering mode based on the concentration degree of the distance distribution of all the pixels in the image area as follows: when a first preset condition is met, determining that a local metering mode is a target metering mode, and determining an area corresponding to a pixel meeting the first preset condition as a metering area; when the first preset condition is not met, determining that the multi-point metering mode is a target metering mode, wherein a metering area corresponding to the target metering mode is the whole image area; wherein, the first preset condition comprises: the ratio of the number of pixels in the preset distance range to the number of all pixels in the image area is larger than a first preset proportion threshold value.
In an embodiment, the determining module 101 determines that the central metering mode or the local metering mode is the target metering mode based on the concentration degree of the distance distribution of all the pixels in the central area of the image as follows: when a second preset condition is met, determining that the central light metering mode is a target light metering mode, and a light metering area corresponding to the target light metering mode is an image central area; when the second preset condition is not met, determining that the local light metering mode is the target light metering mode; wherein the second preset condition comprises: in the central area of the image, the ratio of the number of pixels to the number of all pixels in the image area is larger than a second preset proportion threshold.
Fig. 10 is a block diagram illustrating an image photographing apparatus according to an exemplary embodiment, and referring to fig. 10, the image photographing apparatus 100 further includes an adjustment module 103.
The adjusting module 103 is configured to adjust the light metering area based on a ratio of the current frame zoom value to the maximum zoom factor when the current distance scene is the first distance scene and the current frame zoom value is greater than a threshold of a preset frame zoom value.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 11 is a block diagram illustrating an image capture device 800 according to an exemplary embodiment. For example, the apparatus 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 11, the apparatus 800 may include one or more of the following components: a processing component 802, a memory 804, a power component 806, a multimedia component 808, an audio component 810, an input/output (I/O) interface 812, a sensor component 814, and a communication component 816.
The processing component 802 generally controls overall operation of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800. Examples of such data include instructions for any application or method operating on device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power component 806 provides power to the various components of device 800. The power components 806 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 800 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, audio component 810 includes a Microphone (MIC) configured to receive external audio signals when apparatus 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the device 800. For example, the sensor assembly 814 may detect the open/closed status of the device 800, the relative positioning of the components, such as a display and keypad of the device 800, the sensor assembly 814 may also detect a change in the position of the device 800 or a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and a change in the temperature of the device 800. Sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communications between the apparatus 800 and other devices in a wired or wireless manner. The device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the device 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
It is understood that "a plurality" in this disclosure means two or more, and other words are analogous. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. The singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It will be further understood that the terms "first," "second," and the like are used to describe various information and that such information should not be limited by these terms. These terms are only used to distinguish one type of information from another, and do not indicate a particular order or degree of importance. Indeed, the terms "first," "second," and the like are fully interchangeable. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure.
It will be further understood that, unless otherwise specified, "connected" includes direct connections between the two without the presence of other elements, as well as indirect connections between the two with the presence of other elements.
It is further to be understood that while operations are depicted in the drawings in a particular order, this is not to be understood as requiring that such operations be performed in the particular order shown or in serial order, or that all illustrated operations be performed, to achieve desirable results. In certain environments, multitasking and parallel processing may be advantageous.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (14)

1. An image capturing method, characterized by comprising:
determining a current distance scene based on the time-of-flight lattice distance, wherein the distance scene and a photometric mode have a corresponding relation;
and determining a target light metering mode matched with the current distance scene, and shooting an image by adopting the target light metering mode.
2. The image capturing method according to claim 1, characterized in that the current distance scene includes a first distance scene, a second distance scene, or a third distance scene;
determining a current distance scene based on the time-of-flight lattice distance, comprising:
if the time-of-flight lattice distance is smaller than a first proportional threshold of the time-of-flight lattice limit ranging distance, determining that the current distance scene is a first distance scene;
if the time-of-flight lattice distance is larger than a second proportional threshold of the time-of-flight lattice limit ranging distance, determining that the current distance scene is a second distance scene;
if the time-of-flight lattice distance is greater than the first proportional threshold of the time-of-flight lattice limit ranging distance and less than the second proportional threshold of the time-of-flight lattice limit ranging distance, determining that the current distance scene is a third distance scene;
wherein the first proportional threshold is less than the second proportional threshold.
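Purely as a reading aid and not as part of the claimed subject matter, the following Python sketch shows one way the scene classification of claim 2 could be implemented; the 4.0 m limit ranging distance and the 0.25 / 0.75 proportional thresholds are hypothetical example values, not taken from the disclosure.

```python
# Illustrative sketch, not part of the claims. The limit ranging distance and the
# two proportional thresholds below are hypothetical example values.
def classify_distance_scene(tof_lattice_distance_m: float,
                            limit_ranging_distance_m: float = 4.0,
                            first_proportional_threshold: float = 0.25,
                            second_proportional_threshold: float = 0.75) -> str:
    """Map a time-of-flight lattice distance to a distance scene (claim 2)."""
    near_limit = first_proportional_threshold * limit_ranging_distance_m
    far_limit = second_proportional_threshold * limit_ranging_distance_m
    if tof_lattice_distance_m < near_limit:
        return "first_distance_scene"    # close-range scene
    if tof_lattice_distance_m > far_limit:
        return "second_distance_scene"   # long-range scene
    return "third_distance_scene"        # intermediate scene
```

Under these example values, a measured distance of 0.5 m would fall below the first threshold (1.0 m) and be classified as the first distance scene.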
3. The image capturing method according to claim 1 or 2, wherein the determining of the target metering mode that matches the current distance scene includes at least one of:
when the current distance scene is a second distance scene, determining a local metering mode or a multi-point metering mode as the target metering mode based on the distance distribution concentration degree of all pixels in the image area;
when the current distance scene is a third distance scene, determining a central metering mode or a local metering mode as the target metering mode based on the distance distribution concentration degree of all pixels in the central area of the image;
and when the current distance scene is the first distance scene, determining that the central metering mode is the target metering mode.
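As an illustrative companion to claim 3 (again not part of the claims), the sketch below maps each distance scene to a target metering mode; the two boolean inputs stand in for the distance-distribution concentration tests that claims 4 and 5 spell out, and the mode names are placeholders.

```python
# Illustrative sketch, not part of the claims. The boolean arguments stand in for the
# concentration tests of claims 4 and 5; the mode names are placeholders.
def select_target_metering_mode(distance_scene: str,
                                image_area_concentrated: bool,
                                central_area_concentrated: bool) -> str:
    """Choose the target metering mode for the current distance scene (claim 3)."""
    if distance_scene == "first_distance_scene":
        return "central_metering"
    if distance_scene == "second_distance_scene":
        # Local metering when the whole-image distance distribution is concentrated,
        # otherwise multi-point metering over the entire image area.
        return "local_metering" if image_area_concentrated else "multi_point_metering"
    # Third distance scene: central metering when the central area is concentrated,
    # otherwise local metering.
    return "central_metering" if central_area_concentrated else "local_metering"
```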
4. The image capturing method according to claim 3, wherein determining the local metering mode or the multi-point metering mode as the target metering mode based on a degree of concentration of distance distribution of all pixels in an image area includes:
when a first preset condition is met, determining that a local metering mode is the target metering mode, and determining an area corresponding to the pixels meeting the first preset condition as a metering area;
when the first preset condition is not met, determining that the multi-point metering mode is the target metering mode, wherein the metering area corresponding to the target metering mode is the whole image area;
wherein the first preset condition comprises: the ratio of the number of pixels in the preset distance range to the number of all pixels in the image area is larger than a first preset proportion threshold value.
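A minimal sketch of the first preset condition of claim 4, assuming a per-pixel distance list from the time-of-flight lattice; the (3.0 m, 5.0 m) preset distance range and the 0.6 first preset proportion threshold are hypothetical example values.

```python
# Illustrative sketch, not part of the claims. The preset distance range and the first
# preset proportion threshold are hypothetical example values.
from typing import List, Tuple

def first_preset_condition_met(pixel_distances_m: List[float],
                               preset_range_m: Tuple[float, float] = (3.0, 5.0),
                               first_proportion_threshold: float = 0.6) -> bool:
    """True when the distance distribution of all pixels in the image area is concentrated."""
    if not pixel_distances_m:
        return False
    low, high = preset_range_m
    in_range = sum(1 for d in pixel_distances_m if low <= d <= high)
    return in_range / len(pixel_distances_m) > first_proportion_threshold
```

When the condition holds, the area covered by the in-range pixels would serve as the metering area for local metering; otherwise the entire image area is metered in the multi-point mode.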
5. The image capturing method according to claim 3, wherein determining the central metering mode or the partial metering mode as the target metering mode based on a degree of concentration of distance distribution of all pixels in a central area of the image includes:
when a second preset condition is met, determining that a central metering mode is the target metering mode, and a metering area corresponding to the target metering mode is the image central area;
when the second preset condition is not met, determining that the local light metering mode is the target light metering mode;
wherein the second preset condition comprises: in the image central area, the ratio of the number of pixels to the number of all pixels in the image area is larger than a second preset ratio threshold.
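A minimal sketch of the second preset condition of claim 5. The claim states the condition as a pixel-count ratio over the whole image area; read together with the concentration wording of claim 5, this sketch assumes the counted pixels are the central-area pixels whose distances fall within a preset range, and the (1.0 m, 3.0 m) range and the 0.3 threshold are hypothetical example values.

```python
# Illustrative sketch, not part of the claims. Whether a pixel "counts" is assumed here
# to mean its distance falls within a preset range; that range, and the second preset
# proportion threshold, are hypothetical example values.
from typing import List, Tuple

def second_preset_condition_met(central_area_distances_m: List[float],
                                total_pixels_in_image_area: int,
                                preset_range_m: Tuple[float, float] = (1.0, 3.0),
                                second_proportion_threshold: float = 0.3) -> bool:
    """True when central metering should be used for the third distance scene (claim 5)."""
    low, high = preset_range_m
    in_range = sum(1 for d in central_area_distances_m if low <= d <= high)
    return in_range / total_pixels_in_image_area > second_proportion_threshold
```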
6. The image capturing method according to claim 3, wherein before image capturing is performed in the target metering mode, the method further includes:
and if the current distance scene is the first distance scene and the current picture zoom value is greater than a preset picture zoom value threshold, adjusting the metering area based on the ratio of the current picture zoom value to the maximum zoom multiple.
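A minimal sketch of the zoom-based adjustment of claim 6. The claim does not say how the ratio of the current picture zoom value to the maximum zoom multiple reshapes the metering area, so scaling a centred rectangle by that ratio is an assumption made here, and the 2.0x preset zoom threshold is a hypothetical example value.

```python
# Illustrative sketch, not part of the claims. Scaling a centred rectangle by the ratio
# of the current zoom value to the maximum zoom multiple is an assumption; the preset
# zoom threshold of 2.0 is a hypothetical example value.
from typing import Tuple

def adjust_metering_area(frame_width: int, frame_height: int,
                         zoom_value: float, max_zoom_multiple: float,
                         preset_zoom_threshold: float = 2.0) -> Tuple[int, int, int, int]:
    """Return a (left, top, right, bottom) metering rectangle for the first distance scene."""
    if zoom_value <= preset_zoom_threshold:
        return (0, 0, frame_width, frame_height)   # no adjustment below the preset threshold
    ratio = zoom_value / max_zoom_multiple          # ratio named in claim 6
    new_w = max(1, int(frame_width * ratio))
    new_h = max(1, int(frame_height * ratio))
    left = (frame_width - new_w) // 2
    top = (frame_height - new_h) // 2
    return (left, top, left + new_w, top + new_h)
```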
7. An image capturing apparatus characterized by comprising:
the apparatus comprises a determining module and an image shooting module, wherein the determining module is used for determining a current distance scene based on a time-of-flight lattice distance, wherein the distance scene and a metering mode have a corresponding relation, and for determining a target metering mode matched with the current distance scene;
and the image shooting module is used for shooting an image by adopting the target metering mode.
8. The image capturing apparatus according to claim 7, wherein the current distance scene includes a first distance scene, a second distance scene, or a third distance scene;
the determining module determines a current distance scene based on the time-of-flight lattice distance in the following manner:
if the time-of-flight lattice distance is smaller than a first proportional threshold of the time-of-flight lattice limit ranging distance, determining that the current distance scene is a first distance scene;
if the time-of-flight lattice distance is larger than a second proportional threshold of the time-of-flight lattice limit ranging distance, determining that the current distance scene is a second distance scene;
if the time-of-flight lattice distance is greater than the first proportional threshold of the time-of-flight lattice limit ranging distance and less than the second proportional threshold of the time-of-flight lattice limit ranging distance, determining that the current distance scene is a third distance scene;
wherein the first proportional threshold is less than the second proportional threshold.
9. The image capturing apparatus according to claim 7 or 8, wherein the determining module determines the target metering mode that matches the current distance scene using at least one of:
when the current distance scene is a second distance scene, determining a local metering mode or a multi-point metering mode as the target metering mode based on the distance distribution concentration degree of all pixels in the image area; or
when the current distance scene is a third distance scene, determining a central metering mode or a local metering mode as the target metering mode based on the distance distribution concentration degree of all pixels in the central area of the image; or
when the current distance scene is the first distance scene, determining that the central metering mode is the target metering mode.
10. The image capturing apparatus according to claim 9, wherein the determining module determines the local metering mode or the multi-point metering mode as the target metering mode based on the concentration degree of distance distribution of all pixels in the image area in the following manner:
when a first preset condition is met, determining that a local metering mode is the target metering mode, and determining an area corresponding to the pixels meeting the first preset condition as a metering area;
when the first preset condition is not met, determining that the multi-point metering mode is the target metering mode, wherein the metering area corresponding to the target metering mode is the whole image area;
wherein the first preset condition comprises: the ratio of the number of pixels in the preset distance range to the number of all pixels in the image area is larger than a first preset proportion threshold value.
11. The image capturing apparatus according to claim 9, wherein the determining module determines the central metering mode or the local metering mode as the target metering mode based on a degree of concentration of distance distribution of all pixels in a central area of the image as follows:
when a second preset condition is met, determining that a central metering mode is the target metering mode, and a metering area corresponding to the target metering mode is the image central area;
when the second preset condition is not met, determining that the local metering mode is the target metering mode;
wherein the second preset condition comprises: in the image central area, the ratio of the number of pixels to the number of all pixels in the image area is larger than a second preset ratio threshold.
12. The image capturing apparatus according to claim 9, characterized in that the image capturing apparatus further comprises:
and the adjusting module is used for adjusting the metering area based on the ratio of the current image zoom value to the maximum zoom multiple when the current distance scene is the first distance scene and the current image zoom value is greater than a preset image zoom value threshold.
13. An image capturing apparatus, characterized by comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the image capturing method according to any one of claims 1 to 6.
14. A non-transitory computer readable storage medium having instructions therein, which when executed by a processor of a mobile terminal, enable the mobile terminal to perform the image capturing method of any one of claims 1 to 6.
CN202011350468.5A 2020-11-26 2020-11-26 Image capturing method, image capturing apparatus, and storage medium Pending CN114554103A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011350468.5A CN114554103A (en) 2020-11-26 2020-11-26 Image capturing method, image capturing apparatus, and storage medium

Publications (1)

Publication Number Publication Date
CN114554103A true CN114554103A (en) 2022-05-27

Family

ID=81668155

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011350468.5A Pending CN114554103A (en) 2020-11-26 2020-11-26 Image capturing method, image capturing apparatus, and storage medium

Country Status (1)

Country Link
CN (1) CN114554103A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06189188A (en) * 1992-12-22 1994-07-08 Fuji Photo Film Co Ltd Video camera and photometric method therefor
JP2006311060A (en) * 2005-04-27 2006-11-09 Olympus Imaging Corp Imaging device and digital camera
CN101800859A (en) * 2009-02-11 2010-08-11 三星数码影像株式会社 Be used to the image pickup method and the equipment that use multiple metering mode repeatedly to take
JP2011171917A (en) * 2010-02-17 2011-09-01 Nec Corp Mobile terminal, and method and program for setting photographing system
CN104301624A (en) * 2014-10-30 2015-01-21 青岛海信移动通信技术股份有限公司 Image shooting brightness control method and device
CN105578062A (en) * 2014-10-14 2016-05-11 宏碁股份有限公司 Light metering mode selection method and image acquisition device utilizing same
CN108377344A (en) * 2017-01-31 2018-08-07 松下知识产权经营株式会社 Camera system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination