CN112866555A - Shooting method, shooting device, shooting equipment and storage medium


Info

Publication number
CN112866555A
CN112866555A (application CN201911184619.1A)
Authority
CN
China
Prior art keywords
scene
shooting
shot
distance
depth
Prior art date
Legal status
Granted
Application number
CN201911184619.1A
Other languages
Chinese (zh)
Other versions
CN112866555B (en)
Inventor
刘霖
李国盛
潘佳河
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201911184619.1A
Publication of CN112866555A
Application granted
Publication of CN112866555B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/667 Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • H04N 23/67 Focus control based on electronic image sensor signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present disclosure provides a shooting method, apparatus, device, and storage medium. Embodiments of the method provide at least one shooting mode: a full-image clear shooting mode, in which an image without a blurring (bokeh) effect can be shot. The method comprises: acquiring depth-of-field information and distance information of a scene to be shot, and judging, according to the depth-of-field information and the distance information, whether the scene to be shot is a close-range planar shooting scene; and, in response to the scene to be shot being a close-range planar shooting scene, switching to the full-image clear shooting mode, so that a fully sharp image is shot in the planar scene. In this way, a shooting mode without a blurring effect can be selected automatically for such shooting scenes, a fully sharp image is obtained for planar scenes, a better shooting effect is achieved, and the cumbersome operation steps caused by manual switching by the user are avoided.

Description

Shooting method, shooting device, shooting equipment and storage medium
Technical Field
The present application relates to the field of shooting technologies, and in particular, to a shooting method, an apparatus, a device, and a storage medium.
Background
With the rapid development of terminal technology, intelligent terminals such as smartphones and tablet computers are equipped with cameras, through which they can shoot images. To improve the quality of the shot images, some intelligent terminals are provided with multiple cameras and often use the standard camera as the main camera by default. However, the inventor finds that when the main camera shoots a scene at close range, it often focuses inaccurately and produces unclear images.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides a multi-camera module based photographing method, apparatus, device, and storage medium.
According to a first aspect of embodiments of the present disclosure, there is provided a photographing method, provided with a full-image clear photographing mode for photographing an image without a blurring effect, the method including:
acquiring depth-of-field information and distance information of a scene to be shot, wherein the depth-of-field information comprises at least one spacing between one photographed object and another photographed object;
judging, according to the depth-of-field information and the distance information, whether the scene to be shot is a close-range planar shooting scene;
and in response to the scene to be shot being a close-range planar shooting scene, switching to the full-image clear shooting mode for shooting.
In an alternative embodiment, a depth-of-field shooting mode for shooting a locally blurred image is further provided, the method further comprising:
in response to the scene to be shot being a close-range stereoscopic shooting scene, switching to the depth-of-field shooting mode for shooting.
In an optional embodiment, the full-image clear shooting mode is a shooting mode corresponding to a wide-angle camera, and the switching to the full-image clear shooting mode for shooting includes: switching to the wide-angle camera for shooting.
In an optional embodiment, the depth-of-field shooting mode is a shooting mode corresponding to a main camera, and the switching to the depth-of-field shooting mode for shooting includes: switching to the main camera for shooting.
In an alternative embodiment, the close-range planar shooting scene represents a scene that is both a close-range scene and a planar shooting scene; the close-range scene is determined, based on the distance information, as being within a specified distance range.
In an alternative embodiment, the close-range planar shooting scene represents a scene that is both a close-range scene and a planar shooting scene; the close-range stereoscopic shooting scene represents a scene that is both a close-range scene and a stereoscopic shooting scene;
the process of judging between a planar shooting scene and a stereoscopic shooting scene includes:
comparing the acquired depth-of-field information with a preset distance threshold;
if each spacing in the depth-of-field information is less than or equal to the preset distance threshold, judging that the scene to be shot is a planar shooting scene;
and if at least one spacing in the depth-of-field information is greater than the preset distance threshold, judging that the scene to be shot is a stereoscopic shooting scene.
In an optional embodiment, if each spacing in the depth-of-field information is less than or equal to the preset distance threshold, judging that the scene to be shot is a planar shooting scene includes:
if each spacing in the depth-of-field information is less than or equal to the preset distance threshold, performing character recognition on the scene to be shot;
and if the scene to be shot is a character (text) scene, judging that the scene to be shot is a planar shooting scene.
In an alternative embodiment, the depth-of-field information is determined by a phase detection autofocus (PDAF) technique and/or data collected by an N × M distance sensor, and the distance information is determined by a contrast autofocus (CAF) technique and/or data collected by a distance sensor.
According to a second aspect of the embodiments of the present disclosure, there is provided a photographing apparatus configured with a full-image clear photographing mode for photographing an image without a blurring effect, the apparatus including:
an information acquisition module configured to acquire depth-of-field information and distance information of a scene to be shot, wherein the depth-of-field information comprises at least one spacing between one photographed object and another photographed object;
a scene judging module configured to judge, according to the depth-of-field information and the distance information, whether the scene to be shot is a close-range planar shooting scene;
and a scene response module configured to, in response to the scene to be shot being a close-range planar shooting scene, switch to the full-image clear shooting mode for shooting.
In an alternative embodiment, a depth-of-field shooting mode for shooting a locally blurred image is further configured, and the scene response module is further configured to: in response to the scene to be shot being a close-range stereoscopic shooting scene, switch to the depth-of-field shooting mode for shooting.
In an optional embodiment, the full-image clear shooting mode is a shooting mode corresponding to a wide-angle camera, and the scene response module's switching to the full-image clear shooting mode for shooting includes: switching to the wide-angle camera for shooting.
In an optional embodiment, the depth-of-field shooting mode is a shooting mode corresponding to the main camera, and the scene response module's switching to the depth-of-field shooting mode for shooting includes: switching to the main camera for shooting.
In an alternative embodiment, the close-range planar shooting scene represents a scene that is both a close-range scene and a planar shooting scene; the close-range scene is determined, based on the distance information, as being within a specified distance range.
In an alternative embodiment, the close-range planar shooting scene represents a scene that is both a close-range scene and a planar shooting scene; the close-range stereoscopic shooting scene represents a scene that is both a close-range scene and a stereoscopic shooting scene;
the scene response module comprises:
an information comparison submodule configured to compare the acquired depth-of-field information with a preset distance threshold;
a planar scene judging submodule configured to judge that the scene to be shot is a planar shooting scene if each spacing in the depth-of-field information is less than or equal to the preset distance threshold;
and a stereoscopic scene judging submodule configured to judge that the scene to be shot is a stereoscopic shooting scene if at least one spacing in the depth-of-field information is greater than the preset distance threshold.
In an optional embodiment, the planar scene judging submodule is specifically configured to:
if each spacing in the depth-of-field information is less than or equal to the preset distance threshold, perform character recognition on the scene to be shot;
and if the scene to be shot is a character (text) scene, judge that the scene to be shot is a planar shooting scene.
In an alternative embodiment, the depth-of-field information is determined by a phase detection autofocus (PDAF) technique and/or data collected by an N × M distance sensor, and the distance information is determined by a contrast autofocus (CAF) technique and/or data collected by a distance sensor.
According to a third aspect of the embodiments of the present disclosure, there is provided a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of any one of the above methods when executing the program.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium, on which a computer program is stored, which when executed by a processor, implements the steps of any of the methods described above.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
the disclosed embodiment provides at least one shooting mode: and (4) a full-image clear shooting mode. The image without blurring effect can be shot in the full-image clear shooting mode. The method comprises the steps of obtaining depth-of-field information and distance information of a scene to be shot, and judging whether the scene to be shot is a close-range plane shooting scene according to the depth-of-field information and the distance information; and responding to the planar shooting scene with the close distance to be shot, and switching to a full-image clear shooting mode to shoot so as to shoot a full-image clear image in the planar shooting scene. The method has the advantages that the shooting mode without blurring effect can be automatically selected for the screen shooting scene, the clear image of the whole image can be obtained in the plane scene, better shooting effect is achieved, and complicated operation steps caused by manual switching of a user are avoided.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating a photographing method according to an exemplary embodiment of the present disclosure.
Fig. 2 is a flow chart illustrating another photographing method according to an exemplary embodiment of the present disclosure.
Fig. 3 is a block diagram of a camera device shown in accordance with an exemplary embodiment of the present disclosure.
Fig. 4 is a schematic diagram illustrating a configuration of a photographing apparatus according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It is to be understood that although the terms first, second, third, etc. may be used herein to describe various information, such information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. The word "if" as used herein may be interpreted as "upon", "when", or "in response to determining", depending on the context.
Constrained by optics, as pixel sizes and apertures grow larger, the depth of field in close-range shooting becomes shallower and image quality at the lens periphery degrades more noticeably; when the main camera shoots an image of a close-range scene, focusing is therefore often inaccurate and the shot unclear.
Therefore, the embodiments of the present disclosure provide a shooting scheme that offers at least one shooting mode: a full-image clear shooting mode, in which an image without a blurring effect can be shot. The method comprises: acquiring depth-of-field information and distance information of a scene to be shot, and judging, according to the depth-of-field information and the distance information, whether the scene to be shot is a close-range planar shooting scene; and, in response to the scene to be shot being a close-range planar shooting scene, switching to the full-image clear shooting mode, so that a fully sharp image is shot in the planar scene. In this way, a shooting mode without a blurring effect can be selected automatically for such shooting scenes, a fully sharp image is obtained for planar scenes, a better shooting effect is achieved, and the cumbersome operation steps caused by manual switching by the user are avoided.
The embodiments of the present disclosure will be described below with reference to the accompanying drawings.
As shown in fig. 1, fig. 1 is a flowchart illustrating a photographing method according to an exemplary embodiment of the present disclosure; the method is provided with a full-image clear photographing mode for photographing an image without a blurring effect and includes the following steps:
in step 102, acquiring depth-of-field information and distance information of a scene to be shot, wherein the depth-of-field information comprises at least one spacing between one photographed object and another;
in step 104, judging, according to the depth-of-field information and the distance information, whether the scene to be shot is a close-range planar shooting scene;
in step 106, in response to the scene to be shot being a close-range planar shooting scene, switching to the full-image clear shooting mode for shooting.
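The three steps above can be sketched as follows. This is a minimal, hypothetical sketch: the distance range, spacing threshold, mode names, and camera assignments are illustrative assumptions, not values from the disclosure.

```python
# Hypothetical illustration of steps 102-106; all names and thresholds
# are assumptions made for this sketch only.

CLOSE_RANGE_CM = (0, 30)   # assumed "specified distance range"
PLANAR_SPACING_CM = 5      # assumed "preset distance threshold"

def is_close_range(distance_cm):
    """Close-range scene: the measured subject distance falls
    inside the specified distance range."""
    lo, hi = CLOSE_RANGE_CM
    return lo <= distance_cm <= hi

def is_planar(spacings_cm):
    """Planar scene: every spacing between photographed objects is
    within the preset distance threshold."""
    return all(s <= PLANAR_SPACING_CM for s in spacings_cm)

def choose_mode(distance_cm, spacings_cm):
    """Pick the shooting mode from depth-of-field information
    (the spacings) and distance information."""
    if is_close_range(distance_cm) and is_planar(spacings_cm):
        return "full_image_clear"   # e.g. wide-angle camera, no bokeh
    if is_close_range(distance_cm):
        return "depth_of_field"     # e.g. main camera, local bokeh
    return "default"

print(choose_mode(15, [1.0, 2.5]))   # close-range planar: full_image_clear
print(choose_mode(15, [1.0, 40.0]))  # close-range stereo: depth_of_field
```

A real implementation would feed `distance_cm` and `spacings_cm` from the ranging/focus subsystems described below.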
The shooting method provided by this embodiment can be implemented by software, by a combination of software and hardware, or by hardware; the related hardware may consist of two or more physical entities, or of a single physical entity. For example, the method of this embodiment can be applied to an intelligent terminal with shooting capability that is provided with at least a full-image clear shooting mode for shooting images without a blurring effect. The intelligent terminal may be a device such as a PC, tablet computer, notebook computer, or desktop computer; a portable device such as a smartphone, smart camera, or PDA (Personal Digital Assistant); or a wearable device such as a smart band or smart watch.
In the embodiments of the present disclosure, at least one shooting mode is configured: a full-image clear shooting mode. This mode is used for shooting a fully sharp image, in particular an image without a blurring effect at close range.
The inventor finds that in some scenes a user may desire a picture with a blurring (bokeh) effect, while in other scenes a fully sharp picture is desired; previously, the user could only select different cameras manually to achieve these different purposes, which is inefficient. To this end, in another embodiment, a depth-of-field shooting mode is also provided, used for shooting a locally blurred image. Depth of Field (DoF) generally refers to the object-distance range within which a camera lens can image the scene to be photographed sharply; the area within this range is called in focus, and the area outside it is called out of focus. The in-focus area forms a sharp image, while the out-of-focus area forms a sharp or blurred image depending on how deep the depth of field is. For example, with a deep depth of field, both in-focus and out-of-focus areas can image sharply; with a shallow depth of field, the in-focus area images sharply while the out-of-focus area is blurred.
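The in-focus/out-of-focus distinction above can be illustrated with a toy function; the focus-range numbers are made up for illustration and carry no meaning from the disclosure.

```python
# Toy illustration of the depth-of-field concept: objects inside the
# in-focus range render sharp; objects outside may blur when the
# depth of field is shallow. All distances are illustrative.

def render_region(object_distance, focus_near, focus_far):
    """Classify a subject as in focus (sharp) or out of focus
    (possibly blurred, if the depth of field is shallow)."""
    if focus_near <= object_distance <= focus_far:
        return "sharp"          # inside the depth of field
    return "possibly_blurred"   # outside the depth of field

print(render_region(50, 40, 60))   # inside the range
print(render_region(90, 40, 60))   # outside the range
```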
It should be understood that other capture modes may also be provided, and need not be described herein.
Regarding the full-image clear shooting mode and the depth-of-field shooting mode: in one example, each may be a shooting mode corresponding to one camera or to several cameras in combination; for instance, the depth-of-field shooting mode may be obtained by combining two cameras. Further, a wide-angle camera (ultra-wide) combined with a main camera (wide) can provide the depth-of-field shooting mode, while the wide-angle camera alone, or the wide-angle camera assisted by the main camera, can provide the full-image clear shooting mode. The main camera may also be referred to as the standard camera. A wide-angle lens is a photographic lens whose focal length is shorter, and angle of view wider, than a standard lens, but whose focal length is longer, and angle of view narrower, than a fisheye lens; wide-angle lenses are divided into ordinary wide-angle and ultra-wide-angle types, without limitation here. For example, for a planar text shooting scene, the wide-angle lens alone, or the wide-angle lens assisting the main lens, can be used for enhancement and image fusion; for a blurring scene (depth-of-field shooting scene), the wide-angle lens and the main lens are combined, and the depth information is input to an image fusion module to achieve a subject-prominent blurring effect. In these ways, close-range shooting can overcome the difficulty of focusing and obtain a clear image.
In another example, the full-image clear shooting mode and the depth-of-field shooting mode may correspond to different cameras: the full-image clear shooting mode to a first camera and the depth-of-field shooting mode to a second camera. The first camera is one that can shoot a fully sharp image (in other words, an image without a blurring effect) in a close-range shooting scene; the second camera is one that can shoot a locally blurred image in a close-range shooting scene. For example, the focal length of the first camera is smaller than that of the second camera. Accordingly, switching the shooting mode may switch the lens: switching to the full-image clear shooting mode for shooting includes switching to the first camera, and switching to the depth-of-field shooting mode for shooting includes switching to the second camera.
In this embodiment, different cameras are used for different scenes in a targeted manner, so the shooting effect is better. It should be understood that the full-image clear shooting mode may be only one of the first camera's shooting modes, and the first camera may also be configured with other modes; the same applies to the second camera, which is not repeated here.
In one example, the terminal device is configured with a wide-angle camera, and the first camera may be that wide-angle camera. Further, the terminal device is also configured with a main camera, and the second camera may be that main camera. Accordingly, switching the shooting mode may switch the lens: switching to the full-image clear shooting mode for shooting includes switching to the wide-angle camera, and switching to the depth-of-field shooting mode for shooting includes switching to the main camera.
This embodiment chooses a more suitable lens for each scene, achieving a better shooting effect. When shooting an image of a planar scene, the problem of inaccurate focusing at close range can be overcome and a clear image obtained.
It should be understood that the first and second cameras may also be other cameras, as long as the first camera can shoot a fully sharp picture, and the second camera a locally blurred picture, in a close-range shooting scene; this is not repeated here.
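The mode-to-camera mapping of this example can be sketched as follows; the scene-type labels, mode names, and camera identifiers are hypothetical stand-ins, not APIs from the disclosure.

```python
# Illustrative dispatch from judged scene type to shooting mode and
# camera, per this embodiment: full-image clear -> wide-angle camera,
# depth of field -> main camera. Identifiers are hypothetical.

CAMERAS = {"full_image_clear": "wide_angle", "depth_of_field": "main"}

def switch_shooting_mode(scene_type):
    """Map the judged scene type to a shooting mode and its camera."""
    if scene_type == "close_range_planar":
        mode = "full_image_clear"   # whole image sharp, no bokeh
    elif scene_type == "close_range_stereo":
        mode = "depth_of_field"     # locally blurred (bokeh) image
    else:
        mode = "depth_of_field"     # assumed default: main camera
    return mode, CAMERAS[mode]

print(switch_shooting_mode("close_range_planar"))
```

A real terminal would replace the returned camera name with a call into its camera HAL to perform the actual lens switch.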
The scene to be shot may be the scene corresponding to the viewfinder frame. "Close-range planar shooting scene" is short for a scene that is both a close-range scene and a planar shooting scene; "close-range stereoscopic shooting scene" is short for a scene that is both a close-range scene and a stereoscopic shooting scene. A close-range scene indicates that the shooting scene is currently at close range. For example, a close-range scene may be determined, based on the distance information, as lying within a specified distance range. The distance information may be the distance between the photographed object and a ranging module (e.g., a distance sensor) on the camera or terminal. When the acquired distance information is within the specified distance range, the scene can be judged to be currently at close range; when it is outside that range, the scene can be judged not to be. The specified distance range may be a preconfigured, small range between the object to be photographed and the camera/ranging module, so that distance information within this range characterizes the object as being close to the camera. The specified distance range can be configured as required.
The embodiment of the disclosure can acquire the depth of field information and the distance information of the scene to be shot, and judge whether the scene to be shot is a close-range planar shooting scene or not according to the depth of field information and the distance information, and further judge whether the scene to be shot is a close-range stereoscopic shooting scene or not.
Regarding the depth-of-field information: it may include at least one spacing. Any object in the scene to be shot can be taken as a photographed object; for example, the entity corresponding to any pixel in the frame can be taken as a photographed object. A spacing is the distance between one photographed object and another. For example, one may be a near-point photographed object and the other a far-point photographed object. A near-point photographed object is one of the objects closer to the camera/ranging module (e.g., one of the P objects closest to it among all photographed objects), and a far-point photographed object is one of the objects farther away (e.g., one of the Q objects farthest from it). Pairing photographed objects as near-point and far-point objects can improve the accuracy of scene determination. The depth-of-field information may include one or more spacings; judging a planar or stereoscopic scene from multiple spacings can improve the accuracy of the judgment.
As to how to obtain the depth-of-field information: for example, it may be determined by the Phase Detection Auto Focus (PDAF) technique. The principle of PDAF is to reserve some shielded pixel points on the image sensor dedicated to phase detection, and to determine the focus offset from the distances between pixels and their variation, thereby achieving accurate focusing. Phase focusing can therefore be used to determine the distances between different photographed objects. Some terminals are also provided with distance sensors, and the depth-of-field information can likewise be determined from information collected by a distance sensor. The distance sensor may be a TOF (Time of Flight) sensor or the like: time-of-flight 3D imaging continuously sends light pulses toward the target, receives the light returning from the object with a sensor, and finds the target distance by measuring the round-trip time of flight of the light pulses. The depth-of-field information may thus be determined from data collected by the distance sensor.
In order to obtain multiple spacings, the distance sensor may be an N × M distance sensor. To improve the accuracy of the depth-of-field information, in one example, it may also be determined by combining the data collected by the distance sensor with the PDAF technique; for example, the depth-of-field information is determined from the data collected by the N × M distance sensor together with PDAF, thereby improving its accuracy.
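One way the spacings might be derived from an N × M distance-sensor grid is sketched below; the pairing of the P nearest and Q farthest readings follows the near-point/far-point description above, while the grid values and the P/Q counts are illustrative assumptions.

```python
# Sketch: deriving depth-of-field "spacings" from an N x M distance
# sensor grid. Pairs each of the p nearest readings (near-point
# subjects) with each of the q farthest (far-point subjects).

def spacings_from_grid(depth_grid, p=2, q=2):
    """Flatten the N x M depth readings, take the p nearest and q
    farthest points, and return the pairwise near-to-far spacings."""
    flat = sorted(d for row in depth_grid for d in row)
    near, far = flat[:p], flat[-q:]
    return [f - n for n in near for f in far]

grid = [[10.0, 11.0, 12.0],
        [10.5, 30.0, 31.0]]           # cm readings from a 2 x 3 sensor
print(max(spacings_from_grid(grid)))  # largest near/far spacing
```

Feeding these spacings to the planar/stereoscopic comparison against the preset distance threshold then yields the scene judgment.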
It should be understood that the depth information may also be obtained in other manners as long as the distance between different objects can be determined, which is not described herein again.
Regarding how the distance information is obtained: in one example, it may be determined using the contrast autofocus (CAF) technique. Some terminals are also provided with distance sensors, and the distance information can likewise be determined from information they collect; the distance sensor may be a TOF (Time of Flight) sensor or the like. To improve the accuracy of the distance information, in one example, it may also be determined by combining the data collected by the distance sensor with the CAF technique.
A planar shooting scene may be an approximately planar scene: when the spacings between different objects in the scene are within an allowable range, the scene can be considered planar. For example, text and pictures are often shot as planar scenes, for which the user desires a fully sharp image. For example, the planar shooting scene represents that each spacing in the depth-of-field information is less than or equal to a preset distance threshold. Specifically, the process of determining a planar shooting scene may include: comparing the acquired depth-of-field information with the preset distance threshold; and if each spacing in the depth-of-field information is less than or equal to the preset distance threshold, judging that the scene to be shot is a planar shooting scene.
In this embodiment, a scene whose spacings are all less than or equal to the preset distance threshold is treated as a planar shooting scene, so the planar shooting scene can be determined.
In some applications, a clear text plan is to be shot at close range. For this reason, in one embodiment, the planar shooting scene represents that each spacing in the depth-of-field information is less than or equal to the preset distance threshold and that the scene to be shot is a text scene. For example, if each spacing in the depth-of-field information is less than or equal to the preset distance threshold, judging that the scene to be shot is a planar shooting scene may include:
if each spacing in the depth-of-field information is less than or equal to the preset distance threshold, performing character recognition on the scene to be shot;
and if the scene to be shot is a character (text) scene, judging that the scene to be shot is a planar shooting scene.
In this embodiment, the scene to be shot is required not only to be a close-range, planar scene but also to be a text scene, so that a sharp plan of the full text can be shot.
Regarding the stereoscopic shooting scene: when the distances between different objects in the scene are not within the allowable range, the scene may be considered a stereoscopic shooting scene. For example, when a doll, petals, or a similar subject is shot, the scene is considered stereoscopic. When a stereoscopic shooting scene is shot at close range, the user is presumed to want an image with a blurring (bokeh) effect, and the shooting mode can therefore be switched to the depth-of-field shooting mode. For example, the stereoscopic shooting scene is characterized in that at least one distance in the depth of field information is greater than the preset distance threshold. The process of determining a stereoscopic shooting scene may include: comparing the acquired depth of field information with the preset distance threshold; and if at least one distance in the depth of field information is greater than the preset distance threshold, determining that the scene to be shot is a stereoscopic shooting scene.
In this embodiment, a scene to be shot in which a distance exceeds the preset distance threshold is treated as a stereoscopic shooting scene, so that stereoscopic shooting scenes can be identified. In some examples, it may further be required that X distances exceed the preset distance threshold before the scene to be shot is judged stereoscopic, the value of X being configured as needed.
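The planar/stereoscopic decision described above can be sketched as follows (the function name, millimetre units, and default threshold are illustrative assumptions; the configurable X from this paragraph appears as `min_far_count`):

```python
def classify_scene(depth_distances_mm, threshold_mm=50, min_far_count=1):
    """Classify a scene from its depth of field information, i.e. the
    distances between pairs of shot objects (here taken in millimetres).

    Returns "stereo" when at least `min_far_count` distances exceed the
    preset threshold, otherwise "planar" (all distances within range).
    """
    far = sum(1 for d in depth_distances_mm if d > threshold_mm)
    return "stereo" if far >= min_far_count else "planar"
```

For instance, a document lying flat yields only small inter-object distances and is classified planar, while a doll against a more distant background produces at least one distance above the threshold and is classified stereoscopic.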
When shooting at close range, the user expects the entire image to be sharp if a flat picture or a page of text is shot, but expects pleasing background blur if a stereoscopic scene such as a doll or petals is shot. The disclosed embodiments enable the terminal to handle both cases intelligently.
The technical features of the above embodiments may be combined arbitrarily, provided the combination involves no conflict or contradiction. For brevity, the combinations are not enumerated one by one, but any such combination of technical features also falls within the scope disclosed in this specification.
One such combination is exemplified below.
As shown in fig. 2, fig. 2 is a flowchart illustrating another photographing method according to an exemplary embodiment of the present disclosure. The device is provided with a wide-angle camera, which can shoot an image without a blurring effect at close range, and a main camera, which can shoot a partially blurred image. The method includes:
in step 202, obtaining depth of field information and distance information of a scene to be shot, wherein the depth of field information includes the distance between at least one shot object and another shot object;
in step 204, determining, according to the depth of field information and the distance information, whether the scene to be shot is a close-range planar shooting scene or a close-range stereoscopic shooting scene;
in step 206, in response to the scene to be shot being a close-range planar shooting scene, switching to the wide-angle camera for shooting;
in step 208, in response to the scene to be shot being a close-range stereoscopic shooting scene, switching to the main camera for shooting.
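Steps 202 to 208 can be sketched as a single decision function (the camera names follow fig. 2; the close-range bound and the depth threshold values are assumptions for illustration only):

```python
def choose_camera(subject_distance_mm, depth_distances_mm,
                  close_range_mm=300, depth_threshold_mm=50):
    """Hypothetical sketch of steps 202-208: combine the distance to the
    scene (distance information) with the inter-object distances (depth
    of field information) to pick a camera.
    """
    if subject_distance_mm > close_range_mm:
        # Not a close-range scene: no special switching in this sketch.
        return "main"
    if all(d <= depth_threshold_mm for d in depth_distances_mm):
        return "wide_angle"   # close-range planar: whole image sharp
    return "main"             # close-range stereoscopic: background blur
```

The two-input structure matters: distance information alone cannot distinguish a close-up document from a close-up doll, which is why the depth of field information is consulted once the scene is known to be close range.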
For the parts of fig. 2 that are the same as fig. 1, reference may be made to the description of fig. 1 above; the details are not repeated here.
When a planar shooting scene or a stereoscopic shooting scene is faced, the wide-angle camera or the main camera is selected accordingly: a sharp full image is obtained in the planar shooting scene, and a background-blurred image is obtained in the stereoscopic shooting scene. The cameras are thus switched automatically, achieving a better shooting effect while sparing the user complicated operations.
For example, depth of field information is added to the shooting logic; combined with the distance information, the logic can determine whether the current scene is a stereoscopic or a planar shooting scene. When a stereoscopic scene such as a doll or petals is shot, the shooting logic can select the main camera, obtaining a good background-blurring effect. When a planar scene is shot, such as a page of text, the shooting logic can select the wide-angle lens so that the text across the whole frame is sharper.
While, for purposes of simplicity of explanation, the foregoing method embodiments have been described as a series of acts or a combination of acts, it will be appreciated by those skilled in the art that the present disclosure is not limited by the order of acts described, as some steps may, in accordance with the present disclosure, occur in other orders or concurrently.
Further, those skilled in the art should also appreciate that the embodiments described in the specification are exemplary embodiments and that acts and modules referred to are not necessarily required by the disclosure.
Corresponding to the embodiment of the shooting method, the disclosure also provides an embodiment of a shooting device, equipment applied by the device and a storage medium.
As shown in fig. 3, fig. 3 is a block diagram of a photographing apparatus according to an exemplary embodiment of the present disclosure. The apparatus is configured with a full-image clear shooting mode for shooting an image without a blurring effect, and includes:
the information acquisition module 32, configured to acquire depth of field information and distance information of a scene to be shot, wherein the depth of field information includes the distance between at least one shot object and another shot object;
the scene determination module 34, configured to determine, according to the depth of field information and the distance information, whether the scene to be shot is a close-range planar shooting scene;
and the scene response module 36, configured to, in response to the scene to be shot being a close-range planar shooting scene, switch to the full-image clear shooting mode for shooting.
In an alternative embodiment, a depth-of-field shooting mode for shooting a locally blurred image is further configured, and the scene response module 36 is further configured to: in response to the scene to be shot being a close-range stereoscopic shooting scene, switch to the depth-of-field shooting mode for shooting.
Correspondingly, the scene determination module 34 is further configured to determine, according to the depth of field information and the distance information, whether the scene to be shot is a close-range stereoscopic shooting scene.
In an optional embodiment, the full-image clear shooting mode is a shooting mode corresponding to a wide-angle camera, and switching by the scene response module 36 to the full-image clear shooting mode for shooting includes: switching to the wide-angle camera for shooting.
In an optional embodiment, the depth-of-field shooting mode is a shooting mode corresponding to the main camera, and switching by the scene response module 36 to the depth-of-field shooting mode for shooting includes: switching to the main camera for shooting.
In an alternative embodiment, the close-range planar shooting scene characterizes the scene as both a close-range scene and a planar shooting scene; the close-range scene is determined, based on the distance information, as lying within a specified distance range.
In an alternative embodiment, the close-range planar shooting scene characterizes the scene as both a close-range scene and a planar shooting scene, and the close-range stereoscopic shooting scene characterizes the scene as both a close-range scene and a stereoscopic shooting scene;
the scene response module 36 includes (not shown in fig. 3):
an information comparison submodule, configured to compare the acquired depth of field information with a preset distance threshold;
a planar scene determination submodule, configured to determine that the scene to be shot is a planar shooting scene if the distances in the depth of field information are less than or equal to the preset distance threshold;
and a stereoscopic scene determination submodule, configured to determine that the scene to be shot is a stereoscopic shooting scene if at least one distance in the depth of field information is greater than the preset distance threshold.
In an optional embodiment, the planar scene determination submodule is specifically configured to:
if the distances in the depth of field information are less than or equal to the preset distance threshold, perform text recognition on the scene to be shot;
and if the scene to be shot is a text scene, determine that the scene to be shot is a planar shooting scene.
In an alternative embodiment, the depth of field information is determined by a phase-detection autofocus (PDAF) technique and/or data collected by an N×M (multi-zone) distance sensor, and the distance information is determined by a contrast autofocus (CAF) technique and/or data collected by a distance sensor.
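One way an N×M multi-zone sensor could yield the inter-object distances that make up the depth of field information is sketched below (the flat zone list, millimetre units, and the pairwise scheme are all assumptions for illustration; the disclosure does not specify how the distances are derived):

```python
import itertools

def inter_object_distances(zone_depths_mm):
    """Hypothetical derivation of depth of field information from an
    N x M multi-zone distance sensor: treat each zone reading as the
    depth of one shot object and compute the pairwise differences.
    `zone_depths_mm` is a flat list of per-zone depths in millimetres.
    """
    return [abs(a - b)
            for a, b in itertools.combinations(zone_depths_mm, 2)]
```

These pairwise differences are then what the embodiments above compare against the preset distance threshold.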
The specific details of the implementation process of the functions and actions of each module in the device are referred to the implementation process of the corresponding step in the method, and are not described herein again.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, wherein the modules described as separate parts may or may not be physically separate, and the parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules can be selected according to actual needs to achieve the purpose of the disclosed solution. One of ordinary skill in the art can understand and implement it without inventive effort.
Accordingly, the present disclosure also provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the steps of any of the above-described photographing methods.
The present disclosure may take the form of a computer program product embodied on one or more storage media including, but not limited to, disk storage, CD-ROM, optical storage, and the like, having program code embodied therein. Computer-usable storage media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of the storage medium of the computer include, but are not limited to: phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technologies, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium, may be used to store information that may be accessed by a computing device.
Accordingly, the present disclosure provides a computer device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of any of the above-mentioned shooting methods when executing the program.
Fig. 4 is a schematic diagram illustrating the structure of an apparatus 400 for photographing according to an exemplary embodiment. For example, the apparatus 400 may be user equipment with a shooting function, such as a mobile phone, a computer, a digital broadcast terminal, a tablet device, a medical device, a fitness device, a personal digital assistant, or a wearable device such as a smart watch, smart glasses, a smart band, or smart running shoes.
Referring to fig. 4, the apparatus 400 may include one or more of the following components: processing components 402, memory 404, power components 406, multimedia components 408, audio components 410, input/output (I/O) interfaces 412, sensor components 414, and communication components 416.
The processing component 402 generally controls overall operation of the apparatus 400, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 402 may include one or more processors 420 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 402 can include one or more modules that facilitate interaction between the processing component 402 and other components. For example, the processing component 402 can include a multimedia module to facilitate interaction between the multimedia component 408 and the processing component 402.
The memory 404 is configured to store various types of data to support operations at the device 400. Examples of such data include instructions for any application or method operating on the device 400, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 404 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power supply components 406 provide power to the various components of device 400. The power components 406 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 400.
The multimedia component 408 includes a screen that provides an output interface between the device 400 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of the touch or slide action but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 408 includes a front facing camera and/or a rear facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the device 400 is in an operational mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 410 is configured to output and/or input audio signals. For example, audio component 410 includes a Microphone (MIC) configured to receive external audio signals when apparatus 400 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 404 or transmitted via the communication component 416. In some embodiments, audio component 410 also includes a speaker for outputting audio signals.
The I/O interface 412 provides an interface between the processing component 402 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor component 414 includes one or more sensors for providing various aspects of status assessment for the apparatus 400. For example, the sensor assembly 414 may detect the open/closed state of the device 400 and the relative positioning of components, such as the display and keypad of the apparatus 400; the sensor assembly 414 may also detect a change in the position of the apparatus 400 or of a component of the apparatus 400, the presence or absence of user contact with the apparatus 400, the orientation or acceleration/deceleration of the apparatus 400, and a change in the temperature of the apparatus 400. The sensor assembly 414 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 414 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 414 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 416 is configured to facilitate wired or wireless communication between the apparatus 400 and other devices. The apparatus 400 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, 4G LTE, 5G NR, or a combination thereof. In an exemplary embodiment, the communication component 416 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 416 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 400 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, there is also provided a non-transitory computer-readable storage medium, such as the memory 404 including instructions that, when executed by the processor 420 of the apparatus 400, enable the apparatus 400 to perform a photographing method, the method including:
acquiring depth of field information and distance information of a scene to be shot, wherein the depth of field information includes the distance between at least one shot object and another shot object;
determining, according to the depth of field information and the distance information, whether the scene to be shot is a close-range planar shooting scene;
and in response to the scene to be shot being a close-range planar shooting scene, switching to the full-image clear shooting mode for shooting.
The non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (16)

1. A photographing method provided with a full-picture clear photographing mode for photographing an image without blurring effect, the method comprising:
acquiring depth information and distance information of a scene to be shot, wherein the depth information comprises the distance between at least one shot object and another shot object;
judging whether the scene to be shot is a close-range plane shooting scene or not according to the depth of field information and the distance information;
and responding to the scene to be shot as a close-range plane shooting scene, and switching to the full-image clear shooting mode for shooting.
2. The method of claim 1, wherein a depth shooting mode is further provided for shooting the locally blurred image, the method further comprising:
and responding to the scene to be shot which is a close-range three-dimensional shooting scene, and switching to the depth-of-field shooting mode for shooting.
3. The method according to claim 2, wherein the full-view-resolution shooting mode is a shooting mode corresponding to a wide-angle camera, and the depth-of-field shooting mode is a shooting mode corresponding to a main camera;
the switching to the full-image clear shooting mode for shooting comprises: switching to the wide-angle camera for shooting;
the switching to the depth-of-field shooting mode for shooting includes: and switching to the main camera for shooting.
4. The method of claim 1, wherein the close-range planar capture scene is used to represent a scene as: a close-range scene and a planar shooting scene; the close-up scene is determined within a specified distance range based on the distance information.
5. The method of claim 2, wherein the close-range planar capture scene is used to represent a scene as: a close-range scene and a planar shooting scene; the close-range stereo shooting scene is used for representing the scene as follows: a close-range scene and a stereo shooting scene;
the judging process of the plane shooting scene and the three-dimensional shooting scene comprises the following steps:
comparing the acquired depth of field information with a preset distance threshold;
if the distances in the depth of field information are less than or equal to a preset distance threshold value, judging that the scene to be shot is a plane shooting scene;
and if at least one distance in the depth of field information is larger than a preset distance threshold value, judging that the scene to be shot is a three-dimensional shooting scene.
6. The method according to claim 5, wherein the judging that the scene to be shot is a plane shooting scene if the distances in the depth of field information are less than or equal to the preset distance threshold value comprises:
if the distances in the depth of field information are less than or equal to the preset distance threshold value, performing text recognition on the scene to be shot;
and if the scene to be shot is a text scene, judging that the scene to be shot is a plane shooting scene.
7. The method according to any one of claims 1 to 6, wherein the depth of field information is determined by data acquired by an N×M distance sensor and/or a phase-detection autofocus PDAF technique, and wherein the distance information is determined by data acquired by a distance sensor and/or a contrast autofocus CAF technique.
8. A photographing apparatus configured with a full-picture clear photographing mode for photographing an image without blurring effect, the apparatus comprising:
the information acquisition module is configured to acquire depth information and distance information of a scene to be shot, wherein the depth information comprises a distance between at least one shot object and another shot object;
the scene judging module is configured to judge whether the scene to be shot is a close-range plane shooting scene according to the depth information and the distance information;
and the scene response module is configured to respond to the planar shooting scene with the scene to be shot being a close range, and switch to the full-image clear shooting mode for shooting.
9. The apparatus of claim 8, further configured with a depth capture mode for capturing the partially blurred image, the scene response module further configured to: and responding to the scene to be shot which is a close-range three-dimensional shooting scene, and switching to the depth-of-field shooting mode for shooting.
10. The apparatus according to claim 9, wherein the full-view-resolution shooting mode is a shooting mode corresponding to a wide-angle camera, and the depth-of-field shooting mode is a shooting mode corresponding to a main camera;
the scene response module is switched to the full-image clear shooting mode for shooting, and comprises the following steps: switching to the wide-angle camera for shooting;
the scene response module switching to the depth-of-field shooting mode for shooting comprises: and switching to the main camera for shooting.
11. The apparatus of claim 8, wherein the close-range planar shooting scene is used to represent a scene: a close-range scene and a planar shooting scene; the close-up scene is determined within a specified distance range based on the distance information.
12. The apparatus of claim 9, wherein the close-range planar shooting scene is used to represent a scene: a close-range scene and a planar shooting scene; the close-range stereo shooting scene is used for representing the scene as follows: a close-range scene and a stereo shooting scene; the scene response module comprises:
an information comparison submodule configured to compare the acquired depth of field information with a preset distance threshold;
the plane scene judging submodule is configured to judge that the scene to be shot is a plane shooting scene if the distances in the depth of field information are less than or equal to a preset distance threshold;
and the stereoscopic scene judging submodule is configured to judge that the scene to be shot is a stereoscopic shooting scene if at least one distance in the depth information is larger than a preset distance threshold.
13. The apparatus according to claim 12, wherein the plane scene judging submodule is specifically configured to:
if the distances in the depth of field information are less than or equal to the preset distance threshold value, perform text recognition on the scene to be shot;
and if the scene to be shot is a text scene, judge that the scene to be shot is a plane shooting scene.
14. The apparatus according to any one of claims 8 to 13, wherein the depth of field information is determined by data collected by an N×M distance sensor and/or a phase-detection autofocus PDAF technique, and wherein the distance information is determined by data collected by a distance sensor and/or a contrast autofocus CAF technique.
15. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the steps of the method of any one of claims 1 to 7 are implemented when the program is executed by the processor.
16. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN201911184619.1A 2019-11-27 2019-11-27 Shooting method, shooting device, shooting equipment and storage medium Active CN112866555B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911184619.1A CN112866555B (en) 2019-11-27 2019-11-27 Shooting method, shooting device, shooting equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112866555A true CN112866555A (en) 2021-05-28
CN112866555B CN112866555B (en) 2022-08-05

Family

ID=75985819

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911184619.1A Active CN112866555B (en) 2019-11-27 2019-11-27 Shooting method, shooting device, shooting equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112866555B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117354624A (en) * 2023-12-06 2024-01-05 荣耀终端有限公司 Camera switching method, device and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009284404A (en) * 2008-05-26 2009-12-03 Nec Corp Captured image synthesis method and digital camera
US20100079589A1 (en) * 2008-09-26 2010-04-01 Sanyo Electric Co., Ltd. Imaging Apparatus And Mode Appropriateness Evaluating Method
CN105959555A (en) * 2016-06-02 2016-09-21 广东欧珀移动通信有限公司 Shooting mode automatic adjustment method and device and mobile terminal
CN106851113A (en) * 2017-03-30 2017-06-13 努比亚技术有限公司 A kind of photographic method and mobile terminal based on dual camera
CN106990646A (en) * 2016-01-20 2017-07-28 深圳富泰宏精密工业有限公司 Many lens systems, its method of work and portable electron device
CN107231529A (en) * 2017-06-30 2017-10-03 努比亚技术有限公司 Image processing method, mobile terminal and storage medium
CN108419008A (en) * 2018-01-30 2018-08-17 努比亚技术有限公司 A kind of image pickup method, terminal and computer readable storage medium
CN109729266A (en) * 2018-12-25 2019-05-07 努比亚技术有限公司 A kind of image capturing method, terminal and computer readable storage medium
CN109923850A (en) * 2017-07-07 2019-06-21 华为技术有限公司 A kind of image capture device and method
CN110300267A (en) * 2019-07-19 2019-10-01 维沃移动通信有限公司 Photographic method and terminal device

Also Published As

Publication number Publication date
CN112866555B (en) 2022-08-05

Similar Documents

Publication Publication Date Title
CN110493526B (en) Image processing method, device, equipment and medium based on multiple camera modules
CN108419016B (en) Shooting method and device and terminal
CN110557547B (en) Lens position adjusting method and device
CN106210496B (en) Photo shooting method and device
CN109889724B (en) Image blurring method and device, electronic equipment and readable storage medium
CN108154465B (en) Image processing method and device
CN108040204B (en) Image shooting method and device based on multiple cameras and storage medium
CN111756989A (en) Method and device for controlling focusing of lens
KR101701814B1 (en) Method and apparatus for displaying framing information
CN110827219B (en) Training method, device and medium of image processing model
CN112866555B (en) Shooting method, shooting device, shooting equipment and storage medium
CN111461950B (en) Image processing method and device
CN112188096A (en) Photographing method and device, terminal and storage medium
CN106469446B (en) Depth image segmentation method and segmentation device
CN114422687B (en) Preview image switching method and device, electronic equipment and storage medium
CN114666490B (en) Focusing method, focusing device, electronic equipment and storage medium
CN111698414B (en) Image signal processing method and device, electronic device and readable storage medium
CN107707819B (en) Image shooting method, device and storage medium
CN114244999A (en) Automatic focusing method and device, camera equipment and storage medium
CN114390189A (en) Image processing method, device, storage medium and mobile terminal
CN114268731B (en) Camera switching method, camera switching device and storage medium
CN114339017B (en) Distant view focusing method, device and storage medium
CN113852752B (en) Photo taking method, photo taking device and storage medium
CN109447929B (en) Image synthesis method and device
US11838655B2 (en) Image acquiring method and apparatus, electronic device, and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant