CN113810598B - Photographing method, electronic device and storage medium

Info

Publication number
CN113810598B
Authority
CN
China
Prior art keywords
image
camera
blurring
blur
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110919953.8A
Other languages
Chinese (zh)
Other versions
CN113810598A (en)
Inventor
乔晓磊 (Qiao Xiaolei)
肖斌 (Xiao Bin)
丁大钧 (Ding Dajun)
朱聪超 (Zhu Congchao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202110919953.8A priority Critical patent/CN113810598B/en
Publication of CN113810598A publication Critical patent/CN113810598A/en
Priority to PCT/CN2022/093613 priority patent/WO2023016025A1/en
Application granted granted Critical
Publication of CN113810598B publication Critical patent/CN113810598B/en

Classifications

    • H — ELECTRICITY; H04 — ELECTRIC COMMUNICATION TECHNIQUE; H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/951 — Computational photography systems, e.g. light-field imaging systems, by using two or more images to influence resolution, frame rate or aspect ratio
    • H04N 23/631 — Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters (under H04N 23/63, control of cameras or camera modules by using electronic viewfinders)
    • H04M 1/72403 — User interfaces specially adapted for cordless or mobile telephones, with means for local support of applications that increase the functionality
    • H04N 23/45 — Cameras or camera modules generating image signals from two or more image sensors of different type or operating in different modes, e.g. a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H04N 23/57 — Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses a photographing method, an electronic device, and a storage medium. It relates to the field of photography and addresses the problem that, when two images captured by two cameras are fused into one photograph, the fusion boundary of the resulting image is obvious. The scheme is as follows: the electronic device starts the camera; displays a preview interface, where the preview interface includes a first control; detects a first operation on the first control; in response to the first operation, the first camera captures a first image and the second camera captures a second image, where the sharpness of the second image is higher than that of the first image; blurs the second image to obtain a third image; fuses the third image with the first image to obtain a fourth image; and saves the fourth image.

Description

Photographing method, electronic device and storage medium
Technical Field
The present disclosure relates to the field of photographing, and in particular, to a photographing method, an electronic device, and a storage medium.
Background
With the continuous optimization of mobile phone cameras, the photographing capability of mobile phones has grown ever more powerful, and more and more users take photos with their phones. At present, to take photos with a longer focal length and/or a larger field of view, mobile phones are equipped with a telephoto camera and/or an ultra-wide-angle camera in addition to the conventional main camera. When a photo is taken, the phone typically selects the camera corresponding to the current zoom factor. In some phones, within certain zoom ranges, two cameras photograph the subject simultaneously and the two captured images are fused, improving the imaging quality of the photos the phone produces.
However, because images captured by different cameras differ, when the phone photographs a subject with two cameras simultaneously, the sharpness of the two captured images can differ substantially. The fused image then shows a large sharpness difference at the fusion boundary, making the boundary obvious and giving the final fused image a strong stitching artifact.
Disclosure of Invention
The application provides a photographing method, an electronic device, and a storage medium, which address the problem that, when two images captured by two cameras are fused into a photograph, the fusion boundary of the photograph is obvious.
To achieve this, the application adopts the following technical scheme:
In a first aspect, the application provides a photographing method applicable to an electronic device. The electronic device includes a first camera and a second camera whose fields of view differ. The method includes: the electronic device starts the camera; displays a preview interface, where the preview interface includes a first control; detects a first operation on the first control; in response to the first operation, the first camera captures a first image and the second camera captures a second image, where the sharpness of the second image is higher than that of the first image; blurs the second image to obtain a third image; fuses the third image with the first image to obtain a fourth image; and saves the fourth image.
With this scheme, when the electronic device photographs through two cameras and fuses the captured images into one photograph, it reduces the sharpness of the sharper of the two images by blurring it. This shrinks the sharpness gap between the two cameras' images that arises from differences in resolution and noise-reduction capability. Fusing two images whose sharpness differs only slightly then yields a photograph with an inconspicuous fusion boundary and little stitching artifact.
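To make the flow concrete, the following is a minimal sketch of the capture-blur-fuse scheme in Python with OpenCV. It is an illustration, not the patent's implementation: the feathered center-paste stands in for the unspecified fusion step, Gaussian blur is just one of the blur types named later, and all names, sizes, and parameters are assumptions.
```python
import cv2
import numpy as np

def capture_and_fuse(first_img: np.ndarray, second_img: np.ndarray,
                     blur_strength: int = 5) -> np.ndarray:
    """Sketch of the claimed flow on two uint8 BGR frames.

    first_img  -- from the first camera (larger field of view), already
                  digitally zoomed to the current zoom factor
    second_img -- from the second camera (smaller field of view, sharper),
                  assumed here to cover a centered sub-region of first_img
    """
    # Blur the sharper image to get the "third image"; Gaussian blur is
    # one of the blur types the patent lists, and the odd kernel size
    # stands in for the determined blur strength.
    k = 2 * blur_strength + 1
    third_img = cv2.GaussianBlur(second_img, (k, k), 0)

    # Fuse: as a toy stand-in for the patent's (unspecified) fusion step,
    # paste the blurred region over the matching center crop through a
    # feathered alpha mask, which softens the fusion boundary further.
    fourth_img = first_img.copy()
    h, w = third_img.shape[:2]
    H, W = first_img.shape[:2]
    y0, x0 = (H - h) // 2, (W - w) // 2
    mask = np.zeros((h, w), np.float32)
    cv2.rectangle(mask, (8, 8), (w - 9, h - 9), 1.0, -1)
    mask = cv2.GaussianBlur(mask, (17, 17), 0)[..., None]
    roi = fourth_img[y0:y0 + h, x0:x0 + w].astype(np.float32)
    fused = mask * third_img.astype(np.float32) + (1.0 - mask) * roi
    fourth_img[y0:y0 + h, x0:x0 + w] = fused.astype(np.uint8)
    return fourth_img
```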
In one possible implementation, the field of view of the first camera is greater than the field of view of the second camera.
Generally, the camera with the smaller field of view produces the sharper image. Thus, when the field of view of the first camera is larger than that of the second camera, blurring the second image reduces its sharpness, narrowing the sharpness gap between the first image and the blurred second image (i.e., the third image) and weakening the stitching artifact in the fused image.
In another possible implementation, blurring the second image to obtain the third image includes: determining the blur strength according to the similarity between the second image and the first image and a preset correspondence between similarity and blur strength; and blurring the second image with the determined blur strength.
The sharpness difference between the first and second images can be inferred from their similarity: the higher the similarity, the smaller the difference. Determining the blur strength from the similarity therefore lets the degree of blurring track the actual sharpness gap. This avoids over-blurring the second image to the point where the processed second image is less sharp than the first image, in which case the fused image would be no sharper than the first image.
In another possible implementation, the similarity is a structural similarity (SSIM) value.
Expressing the similarity as a structural similarity value measures the similarity between the first and second images more accurately in terms of image structure.
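For reference, the standard definition of SSIM between two aligned image patches x and y (the patent does not restate it, but this is the accepted formula) is:
$$\mathrm{SSIM}(x,y)=\frac{(2\mu_x\mu_y + C_1)(2\sigma_{xy} + C_2)}{(\mu_x^2+\mu_y^2+C_1)(\sigma_x^2+\sigma_y^2+C_2)}$$
where μx and μy are the patch means, σx² and σy² the variances, σxy the covariance, and C1, C2 small constants that stabilize the division; the value approaches 1 for identical images.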
In another possible implementation, the similarity is inversely proportional to the blur strength.
The higher the similarity, the smaller the sharpness gap between the first and second images. Making the blur strength inversely proportional to the similarity ensures that, when the similarity is high (i.e., the gap is small), the second image is not over-blurred to the point of becoming less sharp than the first image, which would leave the fused image no sharper than the first image.
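As a sketch of how such an inverse correspondence might look in code (the patent gives no concrete table, so the thresholds below are invented for illustration; scikit-image's structural_similarity is assumed for the SSIM computation):
```python
from skimage.metrics import structural_similarity

def blur_strength_from_ssim(first_crop, second_img) -> int:
    """Higher similarity -> smaller sharpness gap -> weaker blur.

    first_crop and second_img must be aligned arrays of the same shape
    (e.g. the center crop of the first image matching the second).
    """
    ssim = structural_similarity(first_crop, second_img, channel_axis=-1)
    # Invented preset correspondence (similarity threshold -> strength).
    table = [(0.95, 1), (0.90, 2), (0.80, 3), (0.70, 4)]
    for threshold, strength in table:
        if ssim >= threshold:
            return strength
    return 5  # largest sharpness gap -> strongest blur
```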
In another possible implementation, blurring the second image to obtain the third image includes: determining the blur strength according to the sensitivity (ISO) with which the second image was captured and a preset correspondence between sensitivity and blur strength; and blurring the second image with the determined blur strength.
In general, the camera that produces the sharper image also suppresses noise more strongly, so when rising sensitivity increases image noise, the second image becomes sharper relative to the first: the higher the sensitivity, the larger the sharpness gap. Determining the blur strength from the sensitivity therefore lets the degree of blurring track the sharpness gap, again avoiding over-blurring the second image until it is less sharp than the first image and the fused image is no sharper than the first.
In another possible implementation, the sensitivity is proportional to the blur strength.
Since a higher sensitivity means a larger sharpness gap between the first and second images, making the sensitivity proportional to the blur strength ensures that, at low sensitivity (i.e., a small gap), the second image is not over-blurred until it is less sharp than the first image, which would leave the fused image no sharper than the first image.
In another possible implementation, blurring the second image to obtain the third image includes: determining the blur strength according to the ambient brightness under which the second image was captured and a preset correspondence between ambient brightness and blur strength; and blurring the second image with the determined blur strength.
In general, the camera that produces the sharper image also denoises more strongly, so when image noise grows with ambient brightness, the second image becomes sharper relative to the first: the higher the ambient brightness, the larger the sharpness gap. Determining the blur strength from the ambient brightness therefore lets the degree of blurring track the sharpness gap and avoids over-blurring the second image until it is less sharp than the first image.
In another possible implementation, the ambient brightness is proportional to the blur strength.
Since higher ambient brightness means a larger sharpness gap between the first and second images, making the ambient brightness proportional to the blur strength ensures that, under low brightness (i.e., a small gap), the second image is not over-blurred until it is less sharp than the first image, which would leave the fused image no sharper than the first image.
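The sensitivity-based and brightness-based variants reduce to the same kind of monotone lookup, this time increasing with the input. A sketch under the same caveat that every break-point is invented:
```python
def blur_strength_from_iso(iso: int) -> int:
    """Higher ISO -> more noise for the weaker camera to keep -> larger
    sharpness gap -> stronger blur (the proportional relation)."""
    table = [(200, 1), (400, 2), (800, 3), (1600, 4)]  # invented ISO steps
    for threshold, strength in table:
        if iso <= threshold:
            return strength
    return 5

def blur_strength_from_brightness(lux: float) -> int:
    """Ambient brightness is likewise proportional to blur strength."""
    table = [(50, 1), (200, 2), (1000, 3), (5000, 4)]  # invented lux steps
    for threshold, strength in table:
        if lux <= threshold:
            return strength
    return 5
```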
In another possible implementation, the blurring is any one of: Gaussian blur, surface blur, box blur, Kawase blur, dual blur, bokeh blur, tilt-shift blur, iris blur, grainy blur, radial blur, or directional blur.
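Several of the listed blur types map to one-liners in common image libraries. As an illustration (OpenCV assumed; the bilateral filter standing in for surface blur is this sketch's choice, not the patent's):
```python
import cv2

def apply_blur(img, strength: int, kind: str = "gaussian"):
    k = 2 * strength + 1  # odd kernel size derived from blur strength
    if kind == "gaussian":
        return cv2.GaussianBlur(img, (k, k), 0)
    if kind == "box":
        return cv2.blur(img, (k, k))  # box (mean) blur
    if kind == "surface":
        # Edge-preserving smoothing; the bilateral filter is this
        # sketch's stand-in for "surface blur", not the patent's choice.
        return cv2.bilateralFilter(img, k, 50, 50)
    raise ValueError(f"unsupported blur type: {kind}")
```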
In another possible implementation, the first image is an image captured by the first camera and then adjusted to the current zoom factor by digital zoom.
In another possible implementation, the first image is the image as directly captured by the first camera, and fusing the third image with the first image to obtain the fourth image includes: digitally zooming the first image to the current zoom factor; and fusing the third image with the digitally zoomed first image to obtain the fourth image.
In a second aspect, the application provides a photographing apparatus applicable to an electronic device that includes a first camera and a second camera whose fields of view differ. The apparatus implements the method of the first aspect. Its functions may be realized by hardware, or by hardware executing corresponding software; the hardware or software includes one or more modules corresponding to the functions above, for example a processing module and a display module.
The display module may be used to display a preview interface when the electronic device starts the camera, where the preview interface includes a first control. The processing module may be configured to detect a first operation on the first control; in response to the first operation, capture a first image through the first camera and a second image through the second camera, where the sharpness of the second image is higher than that of the first image; blur the second image to obtain a third image; fuse the third image with the first image to obtain a fourth image; and save the fourth image.
In one possible implementation, the field of view of the first camera is greater than the field of view of the second camera.
In another possible implementation, the processing module is specifically configured to determine the blur strength according to the similarity between the second image and the first image and a preset correspondence between similarity and blur strength, and to blur the second image with the determined blur strength.
In another possible implementation, the similarity is a structural similarity (SSIM) value.
In another possible implementation, the similarity is inversely proportional to the blur strength.
In another possible implementation, the processing module is specifically configured to determine the blur strength according to the sensitivity corresponding to the second image and a preset correspondence between sensitivity and blur strength, and to blur the second image with the determined blur strength.
In another possible implementation, the sensitivity is proportional to the blur strength.
In another possible implementation, the processing module is specifically configured to determine the blur strength according to the ambient brightness corresponding to the second image and a preset correspondence between ambient brightness and blur strength, and to blur the second image with the determined blur strength.
In another possible implementation, the ambient brightness is proportional to the blur strength.
In another possible implementation, the blurring is any one of: Gaussian blur, surface blur, box blur, Kawase blur, dual blur, bokeh blur, tilt-shift blur, iris blur, grainy blur, radial blur, or directional blur.
In another possible implementation, the first image is an image captured by the first camera and digitally zoomed to the current zoom factor.
In another possible implementation, the first image is the image as directly captured by the first camera; the processing module is specifically configured to digitally zoom the first image to the current zoom factor, and to fuse the third image with the digitally zoomed first image to obtain the fourth image.
In a third aspect, an embodiment of the application provides an electronic device including a processor and a memory for storing instructions executable by the processor. The processor is configured to execute the instructions so that the electronic device implements the photographing method of the first aspect or any of its possible implementations.
In a fourth aspect, an embodiment of the application provides a computer-readable storage medium storing computer program instructions. When executed by an electronic device, the instructions cause the electronic device to implement the photographing method of the first aspect or any of its possible implementations.
In a fifth aspect, the application provides a computer program product comprising computer-readable code which, when run on an electronic device, causes the electronic device to implement the photographing method of the first aspect or any of its possible implementations.
It should be understood that the beneficial effects of the second to fifth aspects follow from the description of the first aspect and are not repeated here.
Drawings
Fig. 1 is a schematic diagram of photographing with two cameras in the related art;
FIG. 2 is a schematic diagram of an application of image fusion provided in the related art;
FIG. 3 is a schematic diagram of another image fusion application provided in the related art;
fig. 4 is a schematic application diagram of a photographing method according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 6 is a schematic diagram illustrating a system architecture of an electronic device according to an embodiment of the present disclosure;
fig. 7 is a schematic flowchart of a photographing method according to an embodiment of the present disclosure;
fig. 8 is a scene schematic diagram of a photographing operation according to an embodiment of the present application;
FIG. 9 is a schematic diagram of the relationship between structural similarity and blur strength according to an embodiment of the present disclosure;
fig. 10 is a schematic structural diagram of a photographing device according to an embodiment of the present application.
Detailed Description
With the continuous development of mobile phones, their cameras keep being upgraded and their photographing capability grows ever more powerful. At present, besides the conventional main camera (generally a wide-angle camera), many phones carry additional cameras with focal lengths different from the main camera's (i.e., cameras with different fields of view, where a larger field of view corresponds to a shorter focal length), such as a telephoto camera and an ultra-wide-angle camera. When a user takes photos with such a phone, it can provide a longer focal length through the telephoto camera for better telephoto results, or a larger field of view through the ultra-wide-angle camera for better wide-angle results. Accordingly, when the user adjusts the zoom factor to enlarge or shrink the framed image, the phone selects the camera corresponding to the chosen zoom factor. For example, when the user raises the zoom factor to enlarge the image, the phone may shoot with the telephoto camera, obtaining higher quality at the enlarged framing. Conversely, when the user lowers the zoom factor to widen the framing, the phone may shoot with the ultra-wide-angle camera, obtaining higher quality at the wider framing. Moreover, most cameras in current phones are fixed-focus (i.e., of fixed focal length), so each camera yields high imaging quality only at the particular zoom factor corresponding to its focal length.
For example, the zoom factor corresponding to the main camera's focal length may be set to 1.0x, that of the ultra-wide-angle camera to 0.4x, and that of the telephoto camera to 3.5x. That is, at 1.0x the phone images with high quality through the main camera, at 0.4x through the ultra-wide-angle camera, and at 3.5x through the telephoto camera. Accordingly, when the user's zoom factor is at least 0.4x and below 1.0x the phone may shoot with the ultra-wide-angle camera, when it is at least 1.0x and below 3.5x with the main camera, and when it is 3.5x or above with the telephoto camera.
Based on the above example, when the user's zoom factor is not 0.4x, 1.0x, or 3.5x, the imaging quality of the photograph may degrade. For example, at 1.0x the phone captures a high-quality photograph with the main camera; at 2.5x it may keep using the main camera, but because the main camera's focal length is fixed, the photograph is produced by digital zoom of the main camera's output (i.e., the captured image is enlarged to match the zoom factor), and its sharpness is lower than that of the image the main camera captured. Likewise, at 0.9x the phone may shoot with the ultra-wide-angle camera, but because 0.9x exceeds the zoom factor corresponding to that camera's focal length, the photograph is again a digital zoom of the ultra-wide-angle capture and loses sharpness, whereas at 0.4x the ultra-wide-angle camera delivers a high-quality photograph directly.
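Digital zoom as described here is simply a center crop followed by interpolated upscaling, which is where the sharpness loss comes from. A minimal sketch (OpenCV assumed):
```python
import cv2

def digital_zoom(img, native_zoom: float, target_zoom: float):
    """Center-crop by native_zoom/target_zoom and upscale back to full
    size, e.g. a main-camera frame (native 1.0x) enlarged to 2.5x."""
    assert target_zoom >= native_zoom
    h, w = img.shape[:2]
    keep = native_zoom / target_zoom          # fraction of frame kept
    ch, cw = int(h * keep), int(w * keep)
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = img[y0:y0 + ch, x0:x0 + cw]
    # The interpolated upscaling is what costs sharpness relative to a
    # camera whose focal length matches the target zoom factor.
    return cv2.resize(crop, (w, h), interpolation=cv2.INTER_LINEAR)
```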
At present, to improve the imaging quality of the photograph, when the user's zoom factor does not correspond to the focal length of any camera, the phone may shoot simultaneously with the two cameras of adjacent focal lengths and then fuse the two captured images into the final photograph.
Illustratively, continue with the main camera's focal length corresponding to 1.0x, the ultra-wide-angle camera's to 0.4x, and the telephoto camera's to 3.5x. As shown in fig. 1, when the user adjusts the zoom factor within 2.0x-3.5x, the phone may shoot with the telephoto camera in addition to the main camera, and fuse the digitally zoomed main-camera image with the telephoto image to obtain the final photograph. When the user adjusts the zoom factor within 0.6x-0.9x, the phone may shoot with the main camera in addition to the ultra-wide-angle camera, and fuse the digitally zoomed ultra-wide-angle image with the main-camera image to obtain the final photograph.
For example, when the user adjusts the zoom factor to 2.5x, the phone may shoot with the telephoto camera and the main camera simultaneously. Because the zoom factor corresponding to the telephoto focal length is 3.5x, larger than the user's 2.5x, the telephoto image (fig. 2 (b)) covers only part of the main-camera image once the latter is digitally zoomed to 2.5x (fig. 2 (a)). The phone therefore fuses the digitally zoomed main-camera image with the telephoto image (fig. 2 (c)), raising the sharpness of the region where the two overlap and thus the sharpness of the final photograph.
For another example, when the user adjusts the zoom factor to 0.7x, the phone may shoot with the main camera and the ultra-wide-angle camera simultaneously. Because the zoom factor corresponding to the main camera's focal length is 1.0x, larger than the user's 0.7x, the main-camera image covers only part of the ultra-wide-angle image once the latter is digitally zoomed to 0.7x. The phone therefore fuses the digitally zoomed ultra-wide-angle image with the main-camera image, raising the sharpness of the overlapping region and thus of the final photograph.
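In both examples the narrower camera's image lands in a centered sub-rectangle of the digitally zoomed wider image, with each side scaled by the ratio of zoom factors. A small helper makes the geometry explicit (illustrative only):
```python
def narrow_region_in_fused(fused_w: int, fused_h: int,
                           current_zoom: float = 2.5,
                           narrow_zoom: float = 3.5):
    """Centered rectangle the narrower camera's image occupies inside
    the fused frame; each side scales by current_zoom / narrow_zoom
    (about 0.714 for a 3.5x telephoto frame inside a 2.5x photograph)."""
    frac = current_zoom / narrow_zoom
    w, h = int(fused_w * frac), int(fused_h * frac)
    x0, y0 = (fused_w - w) // 2, (fused_h - h) // 2
    return x0, y0, w, h
```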
However, when the photograph is obtained by fusing images captured by two different cameras to improve imaging quality (sharpness, color saturation, etc.), the two source images come from cameras that differ in resolution, noise-reduction capability, and so on. As shown in fig. 3, their sharpness therefore differs substantially, producing a large sharpness difference at the fusion boundary; the boundary becomes obvious and the final photograph (i.e., the fused image) shows a strong stitching artifact.
To address this problem, an embodiment of the present application provides a photographing method applicable to scenarios in which an electronic device with a photographing function shoots through multiple cameras.
In this embodiment, as shown in fig. 4, the electronic device may shoot through two cameras with different focal lengths (i.e., different fields of view) to obtain two images, for example a first image and a second image. One of them, for example the second image, is then blurred, and the blurred image (the third image) is fused with the first image; the fused result serves as the photograph (also called the fourth image).
For example, since the camera with the longer focal length generally yields the sharper image, the second image may be the one obtained by the longer-focal-length camera of the two (i.e., the camera with the smaller field of view), so that it is this image that is blurred. Of course, in other embodiments the second image may instead be the one obtained by the shorter-focal-length camera (the camera with the larger field of view); this is not limited here. In general, which of the two images to blur is determined by which is sharper.
The photographing method can be applied when the user's zoom factor does not correspond to the focal length of any camera of the electronic device. When the device has three or more cameras with different focal lengths, the two cameras involved can be chosen according to the user's zoom factor. For example, one camera whose focal length corresponds to a zoom factor above the user's setting and one whose focal length corresponds to a zoom factor below it may serve as the two cameras. Alternatively, different camera pairs can be preset for different zoom ranges, and the pair is selected by the range containing the user's zoom factor. Of course, in other embodiments the device may always shoot through two fixed cameras of different focal lengths. Thus, this embodiment places no restriction on when the device applies the photographing method to shoot through the first and second cameras together; this can be set according to actual requirements.
The two cameras with different focal lengths, used simultaneously, may for example have adjacent focal lengths: an ultra-wide-angle camera and a main camera, or a main camera and a telephoto camera, and so on.
The photograph (i.e., the fourth image) is the image the phone finally produces when the user shoots with it, or the image it finally produces and displays to the user.
It should be noted that the image captured by the shorter-focal-length camera of the two, such as the first image above, is generally the corresponding camera's capture adjusted to the user's zoom factor (i.e., the current zoom factor) by digital zoom, so that the fused image matches the zoom factor the user set. Of course, in some other embodiments, the capture may be used directly as the first image; at fusion time, the electronic device digitally zooms the first image to the current zoom factor and then fuses it with the third image to form the photograph. The specific way of making the final photograph match the user's zoom factor is not limited here.
In this way, when the electronic device shoots through two cameras and fuses the captured images into a photograph, it reduces the sharpness of the sharper of the two images by blurring it, shrinking the sharpness gap caused by differences in resolution and noise-reduction capability between the cameras. Fusing two images with a small sharpness gap then yields a photograph with an inconspicuous fusion boundary and little stitching artifact.
Hereinafter, a photographing method provided in an embodiment of the present application will be described with reference to the accompanying drawings.
In this embodiment, the electronic device with a photographing function may be a mobile phone, a tablet computer, a handheld computer, a PC, a cellular phone, a personal digital assistant (PDA), a wearable device (e.g., a smart watch or smart band), a smart home device (e.g., a television), a vehicle-mounted device (e.g., a car computer), a smart screen, a game console, or an augmented reality (AR)/virtual reality (VR) device. The specific device form of the electronic device is not particularly limited in this embodiment.
In this embodiment, the electronic device is provided with at least two cameras with different focal lengths (i.e., two cameras with different fields of view). Illustratively, the electronic device is provided with a main camera (typically a wide-angle camera), a telephoto camera with a longer focal length than the main camera, and an ultra-wide-angle camera with a shorter focal length than the main camera.
Exemplarily, taking the electronic device to be a mobile phone, fig. 5 shows a schematic structural diagram of the electronic device provided in this embodiment of the present application.
As shown in fig. 5, the electronic device may include a processor 510, an external memory interface 520, an internal memory 521, a Universal Serial Bus (USB) interface 530, a charging management module 540, a power management module 541, a battery 542, an antenna 1, an antenna 2, a mobile communication module 550, a wireless communication module 560, an audio module 570, a speaker 570A, a receiver 570B, a microphone 570C, a headset interface 570D, a sensor module 580, a button 590, a motor 591, an indicator 592, a camera 593, a display 594, a Subscriber Identification Module (SIM) card interface 595, and the like. Among them, the sensor module 580 may include a pressure sensor 580A, a gyro sensor 580B, an air pressure sensor 580C, a magnetic sensor 580D, an acceleration sensor 580E, a distance sensor 580F, a proximity light sensor 580G, a fingerprint sensor 580H, a temperature sensor 580J, a touch sensor 580K, an ambient light sensor 580L, a bone conduction sensor 580M, and the like.
It is to be understood that the illustrated structure of the present embodiment does not constitute a specific limitation to the electronic device. In other embodiments, an electronic device may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 510 may include one or more processing units, such as: processor 510 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), among others. Wherein, the different processing units may be independent devices or may be integrated in one or more processors.
The controller may be a neural center and a command center of the electronic device. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 510 for storing instructions and data. In some embodiments, the memory in processor 510 is a cache memory. The memory may hold instructions or data that have just been used or recycled by processor 510. If the processor 510 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 510, thereby increasing the efficiency of the system.
In some embodiments, processor 510 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The wireless communication function of the electronic device may be implemented by the antenna 1, the antenna 2, the mobile communication module 550, the wireless communication module 560, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in an electronic device may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 550 may provide a solution including 2G/3G/4G/5G wireless communication applied on the electronic device. The mobile communication module 550 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 550 can receive electromagnetic waves from the antenna 1, and can perform filtering, amplification, and other processing on the received electromagnetic waves, and transmit the electromagnetic waves to the modem processor for demodulation. The mobile communication module 550 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 550 may be disposed in the processor 510. In some embodiments, at least some of the functional modules of the mobile communication module 550 may be disposed in the same device as at least some of the modules of the processor 510.
The wireless communication module 560 may provide solutions for wireless communication applied to electronic devices, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (BT), global Navigation Satellite System (GNSS), frequency Modulation (FM), near Field Communication (NFC), infrared (IR), and the like. The wireless communication module 560 may be one or more devices integrating at least one communication processing module. The wireless communication module 560 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on electromagnetic wave signals, and transmits the processed signals to the processor 510. The wireless communication module 560 may also receive a signal to be transmitted from the processor 510, frequency-modulate it, amplify it, and convert it into electromagnetic waves via the antenna 2 to radiate it.
In some embodiments, antenna 1 of the electronic device is coupled to the mobile communication module 550 and antenna 2 is coupled to the wireless communication module 560 so that the electronic device can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), general Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device implements display functions via the GPU, the display screen 594, and the application processor. The GPU is a microprocessor for image processing, coupled to a display screen 594 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 510 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 594 is used for displaying images, video, and the like. The display screen 594 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device may include 1 or N display screens 594, N being a positive integer greater than 1.
The electronic device may implement a capture function via the ISP, the camera 593, the video codec, the GPU, the display screen 594, and the application processor. In some embodiments, the electronic device may include 1 or N cameras 593, N being a positive integer greater than 1. For example, in the embodiment of the present application, the electronic device may include three cameras, where one is a main camera, one is a telephoto camera, and one is an ultra-wide angle camera.
The internal memory 521 may be used to store computer-executable program code, which includes instructions. The processor 510 executes various functional applications of the electronic device and data processing by executing instructions stored in the internal memory 521. The internal memory 521 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area can store data (such as audio data, phone book and the like) created in the using process of the electronic device. In addition, the internal memory 521 may include a high speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a Universal Flash Storage (UFS), and the like.
Of course, it should be understood that fig. 5 above only illustrates the electronic device in the form of a mobile phone by way of example. If the electronic device is a tablet computer, a handheld computer, a PC, a PDA, a wearable device (e.g., a smart watch or smart band), a smart home device (e.g., a television), a vehicle-mounted device (e.g., a car computer), a smart screen, a game console, or an AR/VR device, its structure may include fewer or more components than shown in fig. 5, which is not limited here.
The methods in the following embodiments may be implemented in an electronic device having the above hardware structure.
In this embodiment of the application, as shown in fig. 6, the system architecture of the electronic device may include an application layer, a framework layer (framework), a hardware abstraction layer (HAL), a driver layer (driver), a firmware layer (FW), and a hardware layer (hardware, HW). The application layer is used to deploy application programs; for example, in this embodiment, a camera application may be deployed in the application layer. The framework layer may be any system framework (the specific frameworks appear as embedded figures in the original publication) and is not limited here. The hardware abstraction layer can expose a uniform interface for each piece of hardware; for example, in this embodiment, a camera hardware abstraction layer (camera HAL 3) may be deployed in the hardware abstraction layer. A module implementing the photographing method provided by this embodiment, the camera algorithm module (libcamra algo), may also be deployed in the hardware abstraction layer. The driver layer may be configured to deploy the driver components of each hardware device; for example, in this embodiment, the driver layer may contain a video device driver (V4L2 Driver), an image video processor (IVP) driver (IVP Driver) or a DSP driver (DSP Driver), an NPU driver (NPU Driver), a GPU driver (GPU Driver), and the like. The firmware layer can be used to deploy the firmware of each hardware device; for example, in this embodiment, the firmware layer may deploy internet-of-things firmware (lite-OS FW) to drive the image sensor, the time-of-flight (TOF) sensor, the ISP, and the like. The hardware layer comprises the various hardware of the electronic device; for example, in this embodiment, it may include the image sensor, the TOF sensor, the ISP, the IVP or DSP, the NPU, the GPU, and the like.
For example, the module implementing the photographing method provided by this embodiment (the camera algorithm module) may be initialized in the hardware abstraction layer when the user opens the camera application deployed in the application layer. When the user adjusts the zoom factor in the camera application to one that requires the electronic device to shoot through two cameras simultaneously, then in response to the user's photographing operation (for example, the camera application includes a preview interface with a first control, or shutter control, and the photographing operation is the user's first operation on that control, such as tapping it), the camera application in the application layer sends a photographing instruction to the image sensor through the framework layer, the camera hardware abstraction layer, the video device driver, and the internet-of-things firmware in sequence, so that the image sensor captures an image in response. The camera application issues the photographing instruction to the image sensors of the cameras it needs to use. For example, when the electronic device has a main camera, a telephoto camera, and an ultra-wide-angle camera: if it needs to shoot with the main and telephoto cameras together, the camera application sends photographing instructions to the image sensor of the main camera and the image sensor of the telephoto camera respectively; if it needs the main and ultra-wide-angle cameras simultaneously, it sends the instructions to the image sensor of the main camera and the image sensor of the ultra-wide-angle camera respectively. When the two image sensors receive the instruction and capture images, they pass the images to the ISP. After the ISP processes the received images in a preset manner, the two processed images are sent up through the internet-of-things firmware and the video device driver to the camera hardware abstraction layer, which forwards them to the camera algorithm module implementing the photographing method of this embodiment. On receiving the two images, the camera algorithm module can invoke the corresponding hardware (such as the IVP, DSP, NPU, or GPU) through the corresponding drivers (such as the IVP Driver, DSP Driver, NPU Driver, or GPU Driver) to blur the image captured by the camera with the longer focal length and fuse the blurred image with the other image to obtain the photograph. Finally, the camera algorithm module obtains the photograph from the hardware that performed the fusion and sends it through the camera hardware abstraction layer and the framework layer to the camera application deployed in the application layer, which displays and/or stores it.
In the following, take the electronic device to be a mobile phone provided with a main camera (wide-angle camera), a telephoto camera, and an ultra-wide-angle camera, where the zoom factor corresponding to the main camera's focal length is 1.0x, that of the ultra-wide-angle camera is 0.4x, and that of the telephoto camera is 3.5x. When the user adjusts the zoom factor to 2.0x-3.5x, the phone shoots with the main camera and the telephoto camera; when the user adjusts it to 0.6x-0.9x, the phone shoots with the ultra-wide-angle camera and the main camera. This setup illustrates a specific implementation of the photographing method provided by the embodiment of the present application.
Fig. 7 shows a flowchart of a photographing method provided in an embodiment of the present application. As shown in fig. 7, the photographing method may include the following S701-S703.
When the user opens the phone's photographing interface and adjusts the zoom factor, if the adjusted factor is not one corresponding to the focal length of any camera of the phone (i.e., not 0.4x, 1.0x, or 3.5x in this example) and falls within a preset zoom range in which two cameras are used, then when the user performs a photographing operation the phone shoots with the two corresponding cameras of different focal lengths to obtain each camera's image. For example, the phone executes the following S701.
S701: when the zoom factor is within the preset range, in response to the user's photographing operation, the phone captures a first image through the first camera and a second image through the second camera.
The first camera may capture the first image at the same moment the second camera captures the second image; in some other embodiments the two captures may be separated by a small interval, which is not limited here.
Note that the focal lengths of the first and second cameras differ. In this embodiment of the application, the focal length of the second camera is greater than that of the first camera (taken as the example below): the first camera may be the main camera of this example, with the second camera the telephoto camera, or the first camera may be the ultra-wide-angle camera of this example, with the second camera the main camera. The second image, captured by the camera with the relatively long focal length, is thus contained within the first image, captured by the camera with the relatively short focal length, so that the two images can subsequently be fused.
Generally, a camera combination composed of the first camera and the second camera corresponds to a preset range; that is, different preset ranges correspond to different camera combinations. For example, in the embodiment of the present application, when the preset range is 2.0x-3.5x, the first camera may be the main camera and the second camera the telephoto camera; when the preset range is 0.6x-0.9x, the first camera may be the ultra-wide-angle camera and the second camera the main camera.
For example, when the user adjusts the zoom factor to 2.5x, which is within the preset range 2.0x-3.5x, the mobile phone, in response to the user's photographing operation, captures the first image through the main camera (i.e., the first camera is the main camera) and the second image through the telephoto camera (i.e., the second camera is the telephoto camera). When the user adjusts the zoom factor to 0.7x, which is within the preset range 0.6x-0.9x, the mobile phone, in response to the user's photographing operation, captures the first image through the ultra-wide-angle camera (i.e., the first camera is the ultra-wide-angle camera) and the second image through the main camera (i.e., the second camera is the main camera).
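This range-to-camera-pair mapping can be captured in a few lines. The sketch below uses the example ranges above; all names are illustrative rather than taken from the patent:

```python
def select_camera_pair(zoom_factor: float):
    """Illustrative mapping from the user-set zoom factor to the
    (first camera, second camera) combination used for dual capture."""
    if 2.0 <= zoom_factor <= 3.5:
        return ("main", "telephoto")
    if 0.6 <= zoom_factor <= 0.9:
        return ("ultra_wide", "main")
    return None  # outside both preset ranges: no dual-camera fusion

assert select_camera_pair(2.5) == ("main", "telephoto")
assert select_camera_pair(0.7) == ("ultra_wide", "main")
```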
The mobile phone may further display a preview interface, where the preview interface includes a first control (also called a shutter control or photographing control), and the user's photographing operation may be a first operation (such as a click operation or a long-press operation) performed on the first control. Illustratively, as shown in fig. 8, the mobile phone displays a preview interface that includes a preview frame, a photographing control 801, and a zoom control 802. The preview frame displays the preview image of the photographed object at the current zoom factor and shooting mode. The photographing control 801 triggers the photographing action of the mobile phone. The zoom control 802 is used to adjust the zoom factor, and the current zoom factor may be displayed on it. As shown in fig. 8, when the user adjusts the zoom factor to 2.5x, a preview image at a zoom factor of 2.5x is displayed in the preview frame. At this point, the user can perform the photographing operation by clicking the photographing control 801.
Optionally, in other embodiments of the present application, the photographing operation may also be a press of a preset key (such as a power key or a volume key). The embodiment of the present application therefore does not limit the user's photographing operation: any operation used to trigger the mobile phone to take a picture qualifies.
The first image obtained by the mobile phone through the first camera may be an image captured by the first camera that matches the zoom factor corresponding to the first camera's focal length (for example, the frame acquired by the first camera after ISP processing), or that same image after digital zoom, so that it matches the zoom factor adjusted and set by the user (i.e., the current zoom factor). The second image obtained through the second camera may be an image captured by the second camera that matches the zoom factor corresponding to the second camera's focal length (for example, the frame acquired by the second camera after ISP processing). When the first image is the image matching the zoom factor of the first camera's focal length, the mobile phone can, during the subsequent fusion, first adjust the first image to the user-set zoom factor (i.e., the current zoom factor) by digital zoom and then fuse it with the third image, so that the fused image matches the user-set zoom factor and can conveniently be used as the captured image.
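A minimal sketch of that digital-zoom step, assuming a simple center crop followed by upscaling (the patent does not prescribe how the digital zoom is realized, so both the cropping scheme and the function name are assumptions):

```python
import cv2
import numpy as np

def digital_zoom(img: np.ndarray, native_zoom: float,
                 target_zoom: float) -> np.ndarray:
    """Center-crop a frame captured at native_zoom and upscale it back
    to full resolution so that it matches target_zoom (>= native_zoom),
    e.g. mapping a 1.0x main-camera frame to the user-set 2.5x."""
    scale = target_zoom / native_zoom
    h, w = img.shape[:2]
    ch, cw = int(round(h / scale)), int(round(w / scale))
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = img[y0:y0 + ch, x0:x0 + cw]
    return cv2.resize(crop, (w, h), interpolation=cv2.INTER_LINEAR)
```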
After the mobile phone captures the first image and the second image, the mobile phone may perform the following S702.
S702: blur the second image according to a preset rule to obtain a third image.
The blurring may be Gaussian blur, surface blur, box blur, Kawase blur, dual blur, bokeh blur, tilt-shift blur, iris blur, grainy blur, radial blur, directional blur, and the like; that is, the second image may be blurred with any of these blurring algorithms.
As an example, in the embodiment of the present application, the preset rule for blurring the second image may be to determine the corresponding blur strength from the similarity between the first image and the second image, and then blur the second image with a blurring algorithm at that strength. The similarity can represent the magnitude of the sharpness difference between the first image and the second image: the higher the similarity, the smaller the sharpness difference; the lower the similarity, the larger the difference. In this way, when the sharpness difference between the two images is small (i.e., when the similarity is high), the second image is not over-blurred, which would otherwise leave the fused image no sharper than the first image.
For example, the similarity between the first image and the second image may be expressed by structural similarity (SSIM). The SSIM value of the first image (image x) and the second image (image y) can be calculated with the following formula:
$$\mathrm{SSIM}(x, y) = \frac{(2\mu_x \mu_y + c_1)(2\sigma_{xy} + c_2)}{(\mu_x^2 + \mu_y^2 + c_1)(\sigma_x^2 + \sigma_y^2 + c_2)}$$

where $x$ is image x (e.g., the first image), $y$ is image y (e.g., the second image), $\mu_x$ is the mean of $x$, $\mu_y$ is the mean of $y$, $\sigma_x^2$ is the variance of $x$, $\sigma_y^2$ is the variance of $y$, $\sigma_{xy}$ is the covariance of $x$ and $y$, $c_1 = (k_1 L)^2$, $c_2 = (k_2 L)^2$, $L$ is the dynamic range of the pixel values (i.e., the maximum span of image pixel values; for example, for an image with 8-bit channels the pixel value range is 0-255, so $L = 255$), $k_1 = 0.01$, and $k_2 = 0.03$.
SSIM values range from 0 to 1. When the two images are identical, the SSIM value is equal to 1.
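A minimal sketch of computing this value, directly from the formula above with $k_1 = 0.01$ and $k_2 = 0.03$; note that production implementations usually compute SSIM over local windows and average, which is a refinement beyond what the text specifies:

```python
import numpy as np

def ssim(x: np.ndarray, y: np.ndarray, L: float = 255.0) -> float:
    """Global SSIM of two same-sized grayscale images."""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    c1 = (0.01 * L) ** 2
    c2 = (0.03 * L) ** 2
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))
```

For two identical images, the numerator and denominator coincide and the function returns 1, matching the property stated above.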
Therefore, the maximum blur strength and the minimum blur strength can be calibrated against the SSIM values of the first image and the second image, yielding a correspondence between SSIM value and blur strength from which the blur strength to use can be determined for any given SSIM value.
For example, based on empirical data (e.g., data obtained from test verification), one can determine a low SSIM value below which the similarity of the first image and the second image is considered too low, and a high SSIM value above which their similarity is considered high. The maximum blur strength is then calibrated at the low SSIM value and the minimum blur strength at the high SSIM value. Specifically, for an image pair whose SSIM equals the low SSIM value, the blur strength is adjusted linearly until the third image obtained by blurring the second image, once fused with the first image, gives a clear sharpness improvement over the first image without an obvious fusion boundary; the strength reached at that point is taken as the maximum blur strength. Likewise, for an image pair whose SSIM equals the high SSIM value, the blur strength is adjusted linearly until the same criterion is met, and the strength reached is taken as the minimum blur strength. SSIM values below the low SSIM value are then assigned the maximum blur strength, SSIM values above the high SSIM value the minimum blur strength, and SSIM values between the two are mapped linearly onto blur strengths between the maximum and the minimum. For example, with a low SSIM value of 0.25 and a high SSIM value of 0.38, if calibration at SSIM 0.25 gives a maximum blur strength of 9 and calibration at SSIM 0.38 gives a minimum blur strength of 1, the correspondence curve between blur strength and SSIM value shown in fig. 9 is obtained.
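A minimal sketch of that clamped piecewise-linear mapping, using the example calibration values above (the function name and default parameters are illustrative):

```python
def blur_strength_from_ssim(ssim_value: float,
                            low_ssim: float = 0.25, high_ssim: float = 0.38,
                            max_blur: float = 9.0, min_blur: float = 1.0) -> float:
    """Clamp to max_blur below low_ssim, to min_blur above high_ssim,
    and interpolate linearly in between (the curve of fig. 9)."""
    if ssim_value <= low_ssim:
        return max_blur
    if ssim_value >= high_ssim:
        return min_blur
    t = (ssim_value - low_ssim) / (high_ssim - low_ssim)
    return max_blur + t * (min_blur - max_blur)

assert blur_strength_from_ssim(0.20) == 9.0
assert blur_strength_from_ssim(0.38) == 1.0
```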
As another example: since the second camera, which captures the second image, has a larger focal length than the first camera and typically better noise suppression, the second image remains clearer than the first image as image noise grows with increasing sensitivity (ISO). Hence, with the two images captured at similar sensitivities, the higher the sensitivity, the larger the sharpness difference between the second image and the first image. Therefore, in the embodiment of the present application, the preset rule for blurring the second image may be to determine the corresponding blur strength from the sensitivity (ISO) of the second image, i.e., the sensitivity of the second camera when the second image was captured, and then blur the second image with a blurring algorithm at that strength. This avoids over-blurring the second image — and thus a fused image no sharper than the first image — when the sharpness difference between the two images is small.
For example, the sensitivity range may be divided into segments, and the blur strengths for the segments set from low to high following the rule that higher sensitivity corresponds to larger blur strength. Beyond a certain blur strength, to avoid over-blurring the second image (which would leave the fused image no sharper than the first image), that strength can be taken as the maximum blur strength, with all higher sensitivity segments mapped to it.
For example, the sensitivity may be segmented into 100-1000, 1000-2000, 2000-3000, 3000-4000, 4000-5000, 5000-6000, and so on, with blur strength 1 for 100-1000, 3 for 1000-2000, 5 for 2000-3000, 7 for 3000-4000, and 9 for 4000-5000. For example, the specific parameter settings for the sensitivity segments and their corresponding blur strengths in the above example may be as follows:
<?xml version="1.0" encoding="GB2312"?>
<iso100>
    <blur>1</blur>
</iso100>
<iso1000>
    <blur>3</blur>
</iso1000>
<iso2000>
    <blur>5</blur>
</iso2000>
<iso3000>
    <blur>7</blur>
</iso3000>
<iso4000>
    <blur>9</blur>
</iso4000>
therein, <? xml version = "1.0" encoding = "GB2312"? It is indicated that the parameter setting is configured in an extensible markup language (XML) file of version 1.0. < iso100> and the like indicate the index of the sensitivity segment, < blu >1</blu > indicates the degree of blur corresponding to the sensitivity segment.
As another example: since the second camera, which captures the second image, has a larger focal length than the first camera and typically better noise suppression, the second image remains clearer than the first image as image noise grows with increasing ambient brightness (LV). Hence, with the two images captured at similar ambient brightness, the higher the ambient brightness, the larger the sharpness difference between the first image and the second image may be. Therefore, in the embodiment of the present application, the preset rule for blurring the second image may be to determine the corresponding blur strength from the ambient brightness corresponding to the second image, and then blur the second image with a blurring algorithm at that strength. This avoids over-blurring the second image — and thus a fused image no sharper than the first image — when the sharpness difference between the two images is small.
The ambient brightness is generally the average brightness of the ambient light as metered by the mobile phone. When the mobile phone shoots with a camera, the exposure parameters it uses are calculated from the ambient brightness; that is, the exposure parameters of an image captured by the camera are derived from the ambient brightness. Therefore, in the embodiment of the present application, the ambient brightness can conversely be recovered from the exposure parameters of the second image.
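For example, scene brightness can be recovered from exposure parameters in the conventional APEX style. The exact formula the device uses is not given in the patent, so the following is only a plausible sketch under that assumption:

```python
import math

def light_value(aperture_f: float, exposure_time_s: float, iso: float) -> float:
    """APEX-style light value from the exposure triple: brighter scenes
    need less exposure, so shorter times, narrower apertures (higher
    f-number), or lower ISO at correct exposure imply a higher LV."""
    return math.log2(aperture_f ** 2 / exposure_time_s) - math.log2(iso / 100.0)

# e.g. f/1.8, 1/100 s, ISO 400 -> log2(324) - 2, about 6.3
print(round(light_value(1.8, 0.01, 400), 1))
```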
For example, the ambient brightness may be divided into segments, and the blur strengths for the segments set from low to high following the rule that higher ambient brightness corresponds to larger blur strength. Beyond a certain blur strength, to avoid over-blurring the second image (which would leave the fused image no sharper than the first image), that strength can be taken as the maximum blur strength, with all higher ambient-brightness segments mapped to it.
For example, the ambient brightness may be divided into 100-1000, 1000-2000, 2000-3000, 3000-4000, 4000-5000, 5000-6000, and so on, with blur strength 1 for 100-1000, 3 for 1000-2000, 5 for 2000-3000, 7 for 3000-4000, and 9 for 4000-5000. For example, the specific parameter settings for the ambient-brightness segments and their corresponding blur strengths in the above example may be as follows:
<?xml version="1.0" encoding="GB2312"?>
<lv100>
    <blur>1</blur>
</lv100>
<lv1000>
    <blur>3</blur>
</lv1000>
<lv2000>
    <blur>5</blur>
</lv2000>
<lv3000>
    <blur>7</blur>
</lv3000>
<lv4000>
    <blur>9</blur>
</lv4000>
therein, <? xml version = "1.0" encoding = "GB2312"? It is shown that the parameter settings are configured in an extensible markup language (XML) file of version 1.0. < lv 100> and the like represent the index of the sensitivity segment, and < blu >1</blu > represents the blur strength corresponding to the ambient brightness segment.
Optionally, the blur parameters corresponding to the different blur strengths depend on, and are determined by, the specific blurring algorithm used.
For example, taking Gaussian blur as the blurring algorithm, its formula can be as follows:
$$G(u, v) = \frac{1}{2\pi\sigma^2}\, e^{-\frac{u^2 + v^2}{2\sigma^2}}$$

where $u$ and $v$ are the horizontal and vertical distances from the kernel center (so that $\sqrt{u^2 + v^2}$ is the blur radius) and $\sigma$ is the standard deviation of the normal distribution.
At this time, if the standard deviation is set to 1, the Gaussian matrix of the Gaussian blur with blur strength 3 (a 3×3 kernel) obtained from the above formula is:

$$\begin{bmatrix} 0.0585 & 0.0965 & 0.0585 \\ 0.0965 & 0.1592 & 0.0965 \\ 0.0585 & 0.0965 & 0.0585 \end{bmatrix}$$

(the values of $G(u, v)$ for $u, v \in \{-1, 0, 1\}$; in practice the matrix is normalized so that its weights sum to 1 before filtering).
that is, when the blurring strength is 3, the blurring process can be performed according to the gaussian matrix when gaussian blurring is adopted.
The Gaussian matrix of the Gaussian blur with blur strength 5 (a 5×5 kernel, again with $\sigma = 1$) is:

$$\begin{bmatrix}
0.0029 & 0.0131 & 0.0215 & 0.0131 & 0.0029 \\
0.0131 & 0.0585 & 0.0965 & 0.0585 & 0.0131 \\
0.0215 & 0.0965 & 0.1592 & 0.0965 & 0.0215 \\
0.0131 & 0.0585 & 0.0965 & 0.0585 & 0.0131 \\
0.0029 & 0.0131 & 0.0215 & 0.0131 & 0.0029
\end{bmatrix}$$
that is, when the blurring strength is 5, the blurring process can be performed according to the gaussian matrix when gaussian blurring is adopted.
S703: fuse the third image with the first image.
Optionally, when fusing the third image with the first image, the third image may be superimposed on the portion of the first image whose content coincides with it, or that portion of the first image may be directly replaced by the third image, or another fusion algorithm may be used. This is not limited here.
After the third image and the first image are fused, the fused image (i.e., the fourth image) may be saved as a captured image.
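A minimal sketch of the two simple fusion variants named above, assuming the third image has already been registered to its overlap region in the first image (the offset arguments and the blend weight are illustrative assumptions):

```python
import numpy as np

def fuse(first_img: np.ndarray, third_img: np.ndarray,
         x0: int, y0: int, alpha: float = 1.0) -> np.ndarray:
    """alpha = 1.0 replaces the overlapping region of the first image
    with the third image outright; alpha < 1.0 superimposes (blends)
    the third image onto it. (x0, y0) is the top-left corner of the
    overlap; registration itself is assumed and not shown."""
    fourth_img = first_img.astype(np.float32)  # working copy
    h, w = third_img.shape[:2]
    region = fourth_img[y0:y0 + h, x0:x0 + w]
    fourth_img[y0:y0 + h, x0:x0 + w] = (1.0 - alpha) * region \
        + alpha * third_img.astype(np.float32)
    return fourth_img.astype(first_img.dtype)
```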
For example, when the mobile phone adopts the system architecture shown in fig. 6, the camera algorithm module may, in accordance with the above embodiments, call the IVP, the DSP, or the CPU and send the SSIM value of the first image and the second image together with the relationship curve between SSIM value and blur strength (or the configuration parameters relating the sensitivity of the second image to the blur strength, or those relating the ambient brightness of the second image to the blur strength) to the IVP or the DSP, so that the IVP or the DSP determines from these parameters the blur strength to use for blurring the second image. The IVP or the DSP can return the determined blur strength to the camera algorithm module, which can then send the determined blur strength, the first image, and the second image to the GPU, invoking the GPU to blur the second image at the determined strength to obtain the third image and to fuse the first image with the third image to obtain the captured image. The GPU may return the captured image to the camera algorithm module, which can send it through the camera hardware abstraction layer and the framework layer to the camera application deployed in the application layer, so that the camera application can display and/or store it. Of course, the above is merely an example: in other embodiments of the present application, the camera algorithm module may flexibly invoke the IVP, the DSP, the CPU, the GPU, and the like to determine the blur strength, blur the second image, and fuse the resulting third image with the first image. How the camera algorithm module schedules the relevant hardware to implement the method of the foregoing embodiments is therefore not specifically limited here and may be set according to the processing capabilities and functions of hardware such as the IVP, DSP, CPU, and GPU.
Optionally, in this embodiment of the application, if the first image has already been digitally zoomed from the frame captured by the first camera to the user-set zoom factor (i.e., the current zoom factor), the image obtained by fusing the third image with the first image can be used directly as the final captured image. If the first image is the frame captured by the first camera at the zoom factor corresponding to the first camera's focal length, then at fusion time the first image can first be adjusted to the current zoom factor by digital zoom and then fused with the third image, so that the fused image matches the user-set zoom factor and can serve as the final captured image. There is therefore no limitation on when the digital zoom is applied to make the final image match the user-set zoom factor.
By the method of this embodiment, when the electronic device shoots through two cameras and fuses the two captured images into one, it reduces the sharpness of the sharper of the two images by blurring it. This narrows the sharpness difference between the images captured by the two cameras — a difference caused by their differing resolutions and noise-reduction capabilities. Fusing the two images with their sharpness thus brought close together yields a captured image with an inconspicuous fusion boundary and little sense of stitching.
Corresponding to the method in the foregoing embodiment, the embodiment of the present application further provides a photographing apparatus. The apparatus can be applied to the electronic device described above for implementing the method in the foregoing embodiment. The functions of the device can be realized by hardware, and can also be realized by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the above-described functions. For example, fig. 10 shows a schematic configuration of a photographing apparatus, which includes, as shown in fig. 10: a processing module 1001 and a display module 1002. The processing module 1001 and the display module 1002 may be used together to implement the related methods in the foregoing embodiments.
It should be understood that the division into units or modules (hereinafter, units) in the above apparatus is only a division of logical functions; in actual implementation they may be wholly or partially integrated into one physical entity or physically separated. The units may all be implemented as software invoked by a processing element, or all as hardware, or some as software invoked by a processing element and others as hardware.
For example, each unit may be a processing element separately set up, or may be implemented by being integrated into a chip of the apparatus, or may be stored in a memory in the form of a program, and a function of the unit may be called and executed by a processing element of the apparatus. In addition, all or part of the units can be integrated together or can be independently realized. The processing element described herein, which may also be referred to as a processor, may be an integrated circuit having signal processing capabilities. In implementation, each step of the above method or each unit above may be implemented by an integrated logic circuit of hardware in a processor element or in a form called by software through the processor element.
In one example, the units in the above apparatus may be one or more integrated circuits configured to implement the above method, such as: one or more ASICs, or one or more DSPs, or one or more FPGAs, or a combination of at least two of these integrated circuit forms.
As another example, when a unit in the apparatus is implemented in the form of a processing element scheduling a program, the processing element may be a general-purpose processor, such as a CPU or another processor capable of invoking programs. As another example, these units may be integrated together and implemented in the form of a system-on-a-chip (SoC).
In one implementation, the unit of the above apparatus for implementing each corresponding step in the above method may be implemented in the form of a processing element scheduler. For example, the apparatus may include a processing element and a memory element, the processing element calling a program stored by the memory element to perform the method described in the above method embodiments. The memory elements may be memory elements on the same chip as the processing elements, i.e. on-chip memory elements.
In another implementation, the program for performing the above method may be in a memory element on a different chip than the processing element, i.e. an off-chip memory element. At this time, the processing element calls or loads a program from the off-chip storage element onto the on-chip storage element to call and execute the method described in the above method embodiment.
For example, the embodiments of the present application may also provide an apparatus, such as an electronic device, which may include a processor and a memory for storing instructions executable by the processor. The processor is configured to execute the instructions so that the electronic device implements the photographing method implemented by the electronic device in the foregoing embodiments. The memory may be located within the electronic device or external to it, and there may be one or more processors.
In another implementation, the units of the apparatus that implement the steps of the above method may be configured as one or more processing elements provided on the electronic device described above, where the processing elements may be integrated circuits, for example: one or more ASICs, one or more DSPs, one or more FPGAs, or a combination of these types of integrated circuits. These integrated circuits may be integrated together to form a chip.
For example, the embodiment of the present application also provides a chip system, and the chip system may be applied to the electronic device. The chip system includes one or more interface circuits and one or more processors; the interface circuit and the processor are interconnected through a line; the processor receives and executes computer instructions from the memory of the electronic device through the interface circuit to implement the method related to the electronic device in the above method embodiments.
Embodiments of the present application further provide a computer program product including computer instructions which, when run on an electronic device such as the one described above, cause the electronic device to implement the photographing method of the foregoing embodiments.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be embodied in the form of a software product, such as a program. The software product is stored in a program product, such as a computer-readable storage medium, and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
For example, embodiments of the present application may also provide a computer-readable storage medium having stored thereon computer program instructions. The computer program instructions, when executed by the electronic device, cause the electronic device to implement the method of taking a picture as described in the method embodiments described above.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. A photographing method, applied to an electronic device comprising a first camera and a second camera, wherein a field angle of the first camera is different from a field angle of the second camera, the method comprising:
the electronic equipment starts a camera;
displaying a preview interface, wherein the preview interface comprises a first control;
detecting a first operation on the first control;
in response to the first operation, acquiring a first image by the first camera and acquiring a second image by the second camera, wherein the sharpness of the second image is higher than the sharpness of the first image;
blurring the second image to obtain a third image;
fusing the third image and the first image to obtain a fourth image;
saving the fourth image;
wherein the blurring of the second image to obtain the third image comprises:
determining the blur strength according to the similarity between the second image and the first image and a preset correspondence between similarity and blur strength, wherein the similarity is a structural similarity (SSIM) value and is inversely proportional to the blur strength,
and blurring the second image according to the determined blur strength; or,
determining the blur strength according to the sensitivity of the second image and a preset correspondence between sensitivity and blur strength, wherein the sensitivity is divided into segments, a segment of higher sensitivity corresponding to a larger blur strength and a segment of lower sensitivity corresponding to a smaller blur strength,
and blurring the second image according to the determined blur strength; or,
determining the blur strength according to the ambient brightness when the second image was captured and a preset correspondence between ambient brightness and blur strength, wherein the ambient brightness is divided into segments, a segment of higher ambient brightness corresponding to a larger blur strength and a segment of lower ambient brightness corresponding to a smaller blur strength,
and blurring the second image according to the determined blur strength.
2. The method of claim 1, wherein a field angle of the first camera is greater than a field angle of the second camera.
3. The method of claim 1, wherein the sensitivity is directly proportional to the blur strength.
4. The method of claim 1, wherein the ambient brightness is directly proportional to the blur strength.
5. The method according to any one of claims 1 to 4, wherein the blurring comprises any one of: Gaussian blur, surface blur, box blur, Kawase blur, dual blur, bokeh blur, tilt-shift blur, iris blur, grainy blur, radial blur, directional blur.
6. The method of claim 1, wherein the first image is obtained by digitally zooming the image captured by the first camera to the current zoom factor.
7. The method of claim 1, wherein the first image is an image directly captured by the first camera, and the fusing of the third image and the first image to obtain the fourth image comprises:
performing digital zoom on the first image to adjust the first image to the current zoom factor;
and fusing the third image and the digitally zoomed first image to obtain the fourth image.
8. An electronic device, comprising: a processor and a memory for storing instructions executable by the processor, the processor being configured to execute the instructions to cause the electronic device to implement the method of any one of claims 1 to 7.
9. A computer-readable storage medium having computer program instructions stored thereon which, when executed by an electronic device, cause the electronic device to carry out the method of any one of claims 1 to 7.
CN202110919953.8A 2021-08-11 2021-08-11 Photographing method, electronic device and storage medium Active CN113810598B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110919953.8A CN113810598B (en) 2021-08-11 2021-08-11 Photographing method, electronic device and storage medium
PCT/CN2022/093613 WO2023016025A1 (en) 2021-08-11 2022-05-18 Image capture method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110919953.8A CN113810598B (en) 2021-08-11 2021-08-11 Photographing method, electronic device and storage medium

Publications (2)

Publication Number Publication Date
CN113810598A CN113810598A (en) 2021-12-17
CN113810598B true CN113810598B (en) 2022-11-22

Family

ID=78893436

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110919953.8A Active CN113810598B (en) 2021-08-11 2021-08-11 Photographing method, electronic device and storage medium

Country Status (2)

Country Link
CN (1) CN113810598B (en)
WO (1) WO2023016025A1 (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113810598B (en) * 2021-08-11 2022-11-22 荣耀终端有限公司 Photographing method, electronic device and storage medium
CN116723394B (en) * 2022-02-28 2024-05-10 荣耀终端有限公司 Multi-shot strategy scheduling method and related equipment thereof
CN114782296B (en) * 2022-04-08 2023-06-09 荣耀终端有限公司 Image fusion method, device and storage medium
CN116245741B (en) * 2022-06-28 2023-11-17 荣耀终端有限公司 Image processing method and related device
CN116051368B (en) * 2022-06-29 2023-10-20 荣耀终端有限公司 Image processing method and related device
CN115348390A (en) * 2022-08-23 2022-11-15 维沃移动通信有限公司 Shooting method and shooting device
CN116051435B (en) * 2022-08-23 2023-11-07 荣耀终端有限公司 Image fusion method and electronic equipment
CN117835077A (en) * 2022-09-27 2024-04-05 华为终端有限公司 Shooting method, electronic equipment and medium
CN117729445A (en) * 2024-02-07 2024-03-19 荣耀终端有限公司 Image processing method, electronic device and computer readable storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9749549B2 (en) * 2015-10-06 2017-08-29 Light Labs Inc. Methods and apparatus for facilitating selective blurring of one or more image portions
CN107959778B (en) * 2017-11-30 2019-08-20 Oppo广东移动通信有限公司 Imaging method and device based on dual camera
CN110290300A (en) * 2019-06-28 2019-09-27 Oppo广东移动通信有限公司 Equipment imaging method, device, storage medium and electronic equipment
CN112188096A (en) * 2020-09-27 2021-01-05 北京小米移动软件有限公司 Photographing method and device, terminal and storage medium
CN112995467A (en) * 2021-02-05 2021-06-18 深圳传音控股股份有限公司 Image processing method, mobile terminal and storage medium
CN113012085A (en) * 2021-03-18 2021-06-22 维沃移动通信有限公司 Image processing method and device
CN113810598B (en) * 2021-08-11 2022-11-22 荣耀终端有限公司 Photographing method, electronic device and storage medium

Also Published As

Publication number Publication date
WO2023016025A1 (en) 2023-02-16
CN113810598A (en) 2021-12-17

Similar Documents

Publication Publication Date Title
CN113810598B (en) Photographing method, electronic device and storage medium
CN112150399B (en) Image enhancement method based on wide dynamic range and electronic equipment
CN111179282A (en) Image processing method, image processing apparatus, storage medium, and electronic device
CN114092364A (en) Image processing method and related device
US20240119566A1 (en) Image processing method and apparatus, and electronic device
CN116801093B (en) Image processing method, device and storage medium
CN116996762B (en) Automatic exposure method, electronic equipment and computer readable storage medium
CN114466134A (en) Method and electronic device for generating HDR image
WO2022267506A1 (en) Image fusion method, electronic device, storage medium, and computer program product
CN113096022A (en) Image blurring processing method and device, storage medium and electronic equipment
CN117061861B (en) Shooting method, chip system and electronic equipment
CN116668862B (en) Image processing method and electronic equipment
CN113810622B (en) Image processing method and device
CN115631250B (en) Image processing method and electronic equipment
US11989863B2 (en) Method and device for processing image, and storage medium
CN117395495B (en) Image processing method and electronic equipment
CN116452437B (en) High dynamic range image processing method and electronic equipment
CN116051368B (en) Image processing method and related device
CN117440253B (en) Image processing method and related device
CN117135468B (en) Image processing method and electronic equipment
CN116723408B (en) Exposure control method and electronic equipment
CN115705663B (en) Image processing method and electronic equipment
CN116668838B (en) Image processing method and electronic equipment
CN117528265A (en) Video shooting method and electronic equipment
CN117835077A (en) Shooting method, electronic equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant