WO2023016025A1 - Image capture method and device

Image capture method and device

Info

Publication number
WO2023016025A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
camera
blur
electronic device
strength
Application number
PCT/CN2022/093613
Other languages
English (en)
Chinese (zh)
Inventor
乔晓磊
肖斌
丁大钧
朱聪超
Original Assignee
荣耀终端有限公司
Application filed by 荣耀终端有限公司
Publication of WO2023016025A1


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/95 Computational photography systems, e.g. light-field imaging systems
    • H04N 23/951 Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/631 Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/45 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices

Definitions

  • the present application relates to the field of photographing, in particular to a photographing method and equipment.
  • the camera function of mobile phones is becoming more and more powerful, and more and more users use mobile phones to take pictures.
  • many mobile phones will be equipped with a conventional main camera plus an additional telephoto camera and/or an ultra-wide-angle camera, etc.
  • when taking photos, the mobile phone often selects a camera to take pictures according to the zoom factor.
  • two cameras are also used to shoot the subject at the same time, and then the images captured by the two cameras are fused, thereby improving the imaging quality of the photos taken by the mobile phone.
  • the present application provides a photographing method and device, which solves the problem that the fusion boundary of the photographed image is obvious when two images are acquired by two cameras for fusion to obtain a photographed image.
  • the present application provides a method for taking pictures, which can be applied to electronic devices.
  • the electronic device includes a first camera and a second camera, and the angle of view of the first camera is different from that of the second camera.
  • the method includes: the electronic device starts the camera; displays a preview interface, where the preview interface includes a first control; detects a first operation on the first control; in response to the first operation, acquires a first image through the first camera and a second image through the second camera, where the definition of the second image is higher than that of the first image; blurs the second image to obtain a third image; fuses the third image with the first image to obtain a fourth image; and saves the fourth image.
  • In this way, when the electronic device takes pictures with two cameras and then fuses the separately acquired images to obtain the captured image, the electronic device reduces the sharpness of the image with the higher definition among the two acquired images by blurring it. Therefore, the difference in definition between the images acquired by the two cameras, caused by the difference in resolution and noise-reduction capability between the cameras, can be reduced. Furthermore, by fusing two images with a small difference in definition, a captured image with an indistinct fusion boundary and a weak sense of splicing can be obtained.
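  • As an illustration only (not code from the application), the following is a minimal Python sketch of this flow, assuming OpenCV-style image arrays; the function name capture_fused_photo, the centered-replacement fusion, and the choice of Gaussian blur are assumptions made for the example:

```python
# Illustrative sketch only; not the application's actual implementation.
import cv2
import numpy as np

def capture_fused_photo(first_image: np.ndarray,
                        second_image: np.ndarray,
                        blur_strength: int) -> np.ndarray:
    """Blur the sharper second image, then fuse it into the first image."""
    # Step 1: reduce the sharpness of the second image so the definition
    # difference between the two images shrinks (Gaussian blur is one of
    # the blur types the text lists; the kernel grows with blur strength).
    k = 2 * blur_strength + 1  # GaussianBlur requires an odd kernel size
    third_image = cv2.GaussianBlur(second_image, (k, k), 0)

    # Step 2: fuse the third image with the first image. A real pipeline
    # would register the images and blend across the boundary; here the
    # third image simply replaces the central region it covers.
    fourth_image = first_image.copy()
    h, w = third_image.shape[:2]
    y0 = (first_image.shape[0] - h) // 2
    x0 = (first_image.shape[1] - w) // 2
    fourth_image[y0:y0 + h, x0:x0 + w] = third_image
    return fourth_image
```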
  • the field angle of the first camera is larger than the field angle of the second camera.
  • Generally, the image captured by the camera with the smaller field of view has the higher definition. Therefore, when the field of view of the first camera is larger than that of the second camera, the second image captured by the second camera is blurred. By reducing the sharpness of the second image, the difference between the sharpness of the first image and that of the blurred second image (i.e., the third image) can be reduced, thereby reducing the sense of splicing in the fused image.
  • blurring the second image to obtain the third image includes: determining the blur strength according to the similarity between the second image and the first image and the preset correspondence between similarity and blur strength; and performing blur processing on the second image according to the determined blur strength.
  • Generally, the sharpness difference between the first image and the second image can be determined according to their similarity: the higher the similarity, the smaller the sharpness difference. Therefore, determining the blur strength according to the similarity between the first image and the second image allows the degree of blur processing applied to the second image to be adjusted based on that sharpness difference. In this way, the situation is avoided in which excessive blurring makes the processed second image less sharp than the first image, leaving the sharpness of the fused image no better than that of the first image.
  • the similarity is a structural similarity SSIM value.
  • the similarity is represented by the structural similarity value, and the similarity between the first image and the second image can be more accurately measured from the perspective of image composition.
  • the similarity is inversely proportional to the blur strength.
  • In this way, when the similarity is high (that is, the sharpness difference between the first image and the second image is small), the situation is avoided in which excessive blurring makes the processed second image less sharp than the first image, leaving the sharpness of the fused image no better than that of the first image.
  • performing blur processing on the second image to obtain the third image includes: determining the blur strength according to the sensitivity of the second image and the preset correspondence between sensitivity and blur strength; and performing blur processing on the second image according to the determined blur strength.
  • Generally, a camera capable of obtaining a relatively clear image has a strong denoising capability; therefore, when image noise increases due to increased sensitivity, the second image will be clearer than the first image. That is, the higher the sensitivity, the greater the difference in definition between the second image and the first image. Therefore, determining the blur strength according to the sensitivity allows the degree of blur processing applied to the second image to be adjusted based on the sharpness difference between the second image and the first image. In this way, the situation is avoided in which excessive blurring makes the processed second image less sharp than the first image, leaving the sharpness of the fused image no better than that of the first image.
  • the sensitivity (ISO) is proportional to the blur strength.
  • In this way, when the sensitivity is low and the sharpness difference between the first image and the second image is small, the situation is avoided in which excessive blurring makes the processed second image less sharp than the first image, leaving the sharpness of the fused image no better than that of the first image.
  • performing blur processing on the second image to obtain the third image includes: determining the blur strength according to the ambient brightness corresponding to the second image and the preset correspondence between ambient brightness and blur strength; and performing blur processing on the second image according to the determined blur strength.
  • Generally, a camera capable of obtaining a relatively clear image has a strong denoising capability; therefore, when image noise increases due to increased ambient brightness, the second image will be clearer than the first image. That is, the higher the ambient brightness, the greater the difference in definition between the second image and the first image. Therefore, determining the blur strength according to the ambient brightness allows the degree of blur processing applied to the second image to be adjusted based on the sharpness difference between the second image and the first image. In this way, the situation is avoided in which excessive blurring makes the processed second image less sharp than the first image, leaving the sharpness of the fused image no better than that of the first image.
  • the ambient brightness is proportional to the blur strength.
  • In this way, when the ambient brightness is low and the sharpness difference between the first image and the second image is small, the situation is avoided in which excessive blurring makes the processed second image less sharp than the first image, leaving the sharpness of the fused image no better than that of the first image.
  • the blur processing includes any of the following: Gaussian blur, surface blur, box blur, Kawase blur, dual blur, bokeh blur, tilt-shift blur, aperture blur, grainy blur, radial blur, and directional blur.
  • the first image is an image obtained by digitally zooming the image captured by the first camera to adjust it to the current zoom factor.
  • the first image is an image directly captured by the first camera; and fusing the third image with the first image to obtain the fourth image includes: digitally zooming the first image to adjust it to the current zoom factor, and fusing the third image with the digitally zoomed first image to obtain the fourth image.
  • the present application provides a photographing device, which can be applied to an electronic device, and the electronic device includes a first camera and a second camera, and the first camera has a different field of view than the second camera.
  • the device is used to implement the method in the first aspect above.
  • the functions of the device can be realized by hardware, or by hardware executing corresponding software.
  • Hardware or software includes one or more modules corresponding to the above functions, for example, a processing module and a display module.
  • the display module can be used to display a preview interface when the electronic device starts the camera, and the preview interface includes the first control;
  • the processing module can be used to: detect the first operation on the first control; in response to the first operation, acquire the first image through the first camera and the second image through the second camera, where the definition of the second image is higher than that of the first image; blur the second image to obtain a third image; fuse the third image with the first image to obtain a fourth image; and save the fourth image.
  • the field angle of the first camera is larger than the field angle of the second camera.
  • the processing module is specifically configured to determine the blur strength according to the similarity between the second image and the first image and the preset correspondence between similarity and blur strength, and to blur the second image according to the determined blur strength.
  • the similarity is a structural similarity SSIM value.
  • the similarity is inversely proportional to the blur strength.
  • the processing module is specifically configured to determine the blur strength according to the sensitivity corresponding to the second image and the preset correspondence between sensitivity and blur strength, and to blur the second image according to the determined blur strength.
  • the sensitivity (ISO) is proportional to the blur strength.
  • the processing module is specifically configured to determine the blur strength according to the ambient brightness corresponding to the second image and the preset correspondence between ambient brightness and blur strength, and to blur the second image according to the determined blur strength.
  • the ambient brightness is proportional to the blur strength.
  • the blur processing includes any of the following: Gaussian blur, surface blur, box blur, Kawase blur, dual blur, bokeh blur, tilt-shift blur, aperture blur, grainy blur, radial blur, and directional blur.
  • the first image is an image obtained by digitally zooming the image captured by the first camera to adjust it to the current zoom factor.
  • the first image is an image directly captured by the first camera; the processing module is specifically configured to digitally zoom the first image to adjust it to the current zoom factor, and to fuse the third image with the digitally zoomed first image to obtain the fourth image.
  • an embodiment of the present application provides an electronic device, including: a processor, and a memory configured to store instructions executable by the processor.
  • When the processor executes the instructions, the electronic device implements the photographing method described in any one of the first aspect or the possible implementation manners of the first aspect.
  • an embodiment of the present application provides a computer-readable storage medium on which computer program instructions are stored.
  • When the computer program instructions are executed by an electronic device, the electronic device is made to implement the photographing method described in any one of the first aspect or the possible implementation manners of the first aspect.
  • an embodiment of the present application provides a computer program product, including computer-readable code; when the computer-readable code is run in an electronic device, the electronic device implements the photographing method described in any one of the first aspect or the possible implementation manners of the first aspect.
  • FIG. 1 is a schematic diagram of an application using dual cameras in the related art
  • FIG. 2 is a schematic diagram of an image fusion application in the related art
  • FIG. 3 is a schematic diagram of another image fusion application in the related art
  • FIG. 4 is an application schematic diagram of a photographing method provided in an embodiment of the present application.
  • FIG. 5 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
  • FIG. 6 is a schematic composition diagram of a system architecture of an electronic device provided by an embodiment of the present application.
  • FIG. 7 is a schematic flowchart of a photographing method provided in an embodiment of the present application.
  • FIG. 8 is a schematic diagram of a scene of a photographing operation provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of the relationship between structural similarity and blur strength provided by the embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of a photographing device provided by an embodiment of the present application.
  • the cameras of mobile phones are constantly upgraded, and the camera functions of mobile phones are becoming more and more powerful.
  • many mobile phones will also be equipped with additional cameras, such as telephoto cameras and ultra-wide-angle cameras, whose focal lengths differ from that of the main camera (that is, cameras with different fields of view, where the camera with the larger field of view has the shorter focal length). Therefore, when the user uses the mobile phone to take pictures, the mobile phone can provide a longer focal length through the telephoto camera to obtain a better telephoto shooting effect, and can also provide a larger field of view through the ultra-wide-angle camera to obtain a better wide-angle shooting effect.
  • Generally, the mobile phone will select the corresponding camera to take pictures according to the zoom factor. For example, when the user increases the zoom factor to enlarge the captured image, the mobile phone can choose the telephoto camera, so as to obtain a higher-quality captured image while enlarging it. For another example, when the user lowers the zoom factor to shrink the captured image, the mobile phone may choose the ultra-wide-angle camera, so as to obtain a higher-quality captured image while shrinking it. Moreover, since most cameras installed on mobile phones are currently fixed-focus cameras (i.e., cameras with fixed focal lengths), each camera can only capture images with higher imaging quality at the particular zoom factor corresponding to its focal length.
  • For example, the zoom factor corresponding to the focal length of the main camera can be set to 1.0x, the zoom factor corresponding to the focal length of the ultra-wide-angle camera can be set to 0.4x, and the zoom factor corresponding to the focal length of the telephoto camera can be set to 3.5x. That is, when the zoom factor is adjusted to 1.0x, the imaging quality of the image captured by the main camera is relatively high; when the zoom factor is adjusted to 0.4x, the imaging quality of the image captured by the ultra-wide-angle camera is relatively high; and when the zoom factor is adjusted to 3.5x, the imaging quality of the image captured by the telephoto camera is relatively high.
  • Generally, when the zoom factor adjusted by the user is greater than or equal to 0.4x and less than 1.0x, the mobile phone can use the ultra-wide-angle camera to shoot; when the zoom factor adjusted by the user is greater than or equal to 1.0x and less than 3.5x, the mobile phone can use the main camera to shoot; and when the zoom factor adjusted by the user is greater than or equal to 3.5x, the mobile phone can use the telephoto camera to shoot.
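  • As a sketch of the camera selection just described (the thresholds come from the example zoom factors above; the function and camera names are illustrative, not from the application):

```python
# Illustrative mapping from the user-adjusted zoom factor to a camera,
# using the example factors above (0.4x ultra-wide, 1.0x main, 3.5x tele).
def select_camera(zoom_factor: float) -> str:
    if 0.4 <= zoom_factor < 1.0:
        return "ultra_wide"
    if 1.0 <= zoom_factor < 3.5:
        return "main"
    return "telephoto"  # zoom_factor >= 3.5
```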
  • However, when the zoom factor adjusted by the user is not the zoom factor corresponding to the focal length of any camera, the imaging quality of the image captured by the mobile phone will be reduced.
  • For example, at a zoom factor of 1.0x, the main camera of the mobile phone can capture images with higher image quality; but when the user adjusts the zoom factor to 2.5x, although the mobile phone will continue to use the main camera, the focal length of the main camera is fixed, so the captured image is obtained by digitally zooming the image captured by the main camera (that is, enlarging the image captured by the main camera to obtain a captured image corresponding to the zoom factor), and its clarity is reduced compared with the image captured by the main camera.
  • Similarly, when the user adjusts the zoom factor to 0.9x, the mobile phone will use the ultra-wide-angle camera to shoot; but because this zoom factor is larger than the zoom factor corresponding to the focal length of the ultra-wide-angle camera, the captured image is obtained by digitally zooming the image captured by the ultra-wide-angle camera (that is, enlarging the image captured by the ultra-wide-angle camera to obtain a captured image corresponding to the zoom factor), and its clarity is reduced compared with the image captured by the ultra-wide-angle camera. That is, only at a zoom factor of 0.4x can the mobile phone use the ultra-wide-angle camera to capture images with higher imaging quality.
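  • The digital zoom described here is essentially a center crop followed by upscaling. Below is a minimal illustrative sketch, where zoom is assumed to be the ratio of the target zoom factor to the camera's native zoom factor (for example, 2.5 for the main camera at 2.5x, or 0.7/0.4 = 1.75 for the ultra-wide-angle camera at 0.7x):

```python
import cv2
import numpy as np

def digital_zoom(image: np.ndarray, zoom: float) -> np.ndarray:
    """Center-crop 1/zoom of the frame and upscale it back to full size."""
    h, w = image.shape[:2]
    ch, cw = int(h / zoom), int(w / zoom)  # size of the region to keep
    y0, x0 = (h - ch) // 2, (w - cw) // 2  # centered crop
    crop = image[y0:y0 + ch, x0:x0 + cw]
    # Upscaling interpolates new pixels, which is why the digitally zoomed
    # image is less sharp than an optical capture at the same framing.
    return cv2.resize(crop, (w, h), interpolation=cv2.INTER_LINEAR)
```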
  • To improve the imaging quality at such zoom factors, in some solutions the mobile phone will simultaneously use two cameras with adjacent focal lengths to shoot, and then fuse the images respectively captured by the two cameras to obtain the final captured image.
  • Continue to take as an example that the zoom factor corresponding to the focal length of the main camera is 1.0x, the zoom factor corresponding to the focal length of the ultra-wide-angle camera is 0.4x, and the zoom factor corresponding to the focal length of the telephoto camera is 3.5x.
  • When the user adjusts the zoom factor to 2.0x-3.5x, the mobile phone will use the telephoto camera in addition to the main camera, so as to fuse the digitally zoomed image captured by the main camera with the image captured by the telephoto camera to obtain the final captured image.
  • When the user adjusts the zoom factor to 0.6x-0.9x, the mobile phone will use the main camera in addition to the ultra-wide-angle camera, so as to fuse the digitally zoomed image captured by the ultra-wide-angle camera with the image captured by the main camera to obtain the final captured image.
  • the mobile phone when the user adjusts the zoom factor to 2.5x, the mobile phone will use the telephoto camera for shooting in addition to the main camera for shooting.
  • Since the zoom factor corresponding to the focal length of the telephoto camera is 3.5x, which is greater than the 2.5x adjusted by the user, as shown in Figure 2, the image captured by the telephoto camera (shown in (b) of Figure 2) is a part of the image obtained when the image captured by the main camera is digitally zoomed to a zoom factor of 2.5x (shown in (a) of Figure 2).
  • Therefore, the mobile phone can obtain a captured image by fusing the image captured by the main camera and digitally zoomed to a zoom factor of 2.5x with the image captured by the telephoto camera (see (c) in Figure 2).
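  • The fusion step can be pictured as pasting the telephoto image into the center of the digitally zoomed main-camera image. The sketch below uses feathered alpha blending to soften the boundary; this feathering is an illustrative stand-in, not the method claimed by the application (which instead reduces the definition gap by blurring the sharper image first):

```python
import cv2
import numpy as np

def fuse_center(base: np.ndarray, detail: np.ndarray,
                feather: int = 31) -> np.ndarray:
    """Paste `detail` (e.g. the telephoto image) into the center of `base`
    (e.g. the main-camera image digitally zoomed to 2.5x), feathering the
    mask edges so the fusion boundary is less visible."""
    h, w = detail.shape[:2]
    y0 = (base.shape[0] - h) // 2
    x0 = (base.shape[1] - w) // 2

    # Alpha mask: 1 inside, fading to 0 over a `feather`-pixel margin.
    mask = np.zeros((h, w), np.float32)
    mask[feather:h - feather, feather:w - feather] = 1.0
    mask = cv2.GaussianBlur(mask, (2 * feather + 1, 2 * feather + 1), 0)
    mask = mask[..., None]  # broadcast over the color channels

    roi = base[y0:y0 + h, x0:x0 + w].astype(np.float32)
    blended = mask * detail.astype(np.float32) + (1.0 - mask) * roi
    out = base.copy()
    out[y0:y0 + h, x0:x0 + w] = blended.astype(base.dtype)
    return out
```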
  • For another example, when the user adjusts the zoom factor to 0.7x, the mobile phone will use the main camera in addition to the ultra-wide-angle camera. Since the zoom factor corresponding to the focal length of the main camera is 1.0x, which is greater than the 0.7x adjusted by the user, the image captured by the main camera is a part of the image obtained when the image captured by the ultra-wide-angle camera is digitally zoomed to a zoom factor of 0.7x. Therefore, the mobile phone can fuse the image captured by the ultra-wide-angle camera and digitally zoomed to a zoom factor of 0.7x with the image captured by the main camera, so as to improve the sharpness of the part of the digitally zoomed ultra-wide-angle image that overlaps the image captured by the main camera, thereby improving the clarity of the final captured image.
  • an embodiment of the present application provides a photographing method, which can be applied to a scene where an electronic device with a photographing function takes pictures through multiple cameras provided.
  • In the photographing method, as shown in FIG. 4, the electronic device may use two cameras with different focal lengths (that is, cameras with different fields of view) to acquire two images, for example, a first image and a second image. Then one of the images, such as the second image, is blurred, and the blurred version of that image (i.e., the third image) is fused with the first image; the fused image can serve as the captured image (also called the fourth image).
  • Generally, the second image (the image to be blurred) is the image acquired by the camera with the relatively longer focal length (that is, the camera with the relatively smaller field of view) among the two cameras, since that image usually has the higher definition.
  • In some cases, the second image may also be the image acquired by the camera with the relatively shorter focal length (that is, the camera with the relatively larger field of view); it can be determined which of the two images has the higher definition, and the blurring is performed on the image with the higher definition.
  • Generally, the photographing method may be applied when the zoom factor adjusted by the user is not a zoom factor corresponding to the focal length of any camera provided in the electronic device. When the electronic device is provided with three or more cameras with different focal lengths, the two cameras involved in the photographing method can be determined according to the zoom factor adjusted by the user. For example, the camera whose focal length corresponds to a zoom factor greater than the user-adjusted zoom factor and the camera whose focal length corresponds to a zoom factor smaller than the user-adjusted zoom factor may be used as the two cameras involved in the photographing method.
  • the electronic device may also always use two fixed cameras with different focal lengths to perform image capture and the like. Therefore, in the embodiment of the present application, there is no limitation on when the electronic device applies the photographing method to capture images together through the first camera and the second camera, and it can be set according to actual needs.
  • the focal lengths of two cameras with different focal lengths used at the same time may be adjacent focal lengths or the like.
  • one of the two cameras is a super wide-angle camera and the other is a main camera, or one is a main camera and the other is a telephoto camera.
  • the captured image (that is, the above-mentioned fourth image) refers to an image finally captured by the mobile phone when the user uses the mobile phone to capture the image, or an image finally captured by the mobile phone and displayed to the user.
  • It should be noted that the first image used for fusion is usually an image obtained by digitally zooming the image captured by the camera with the relatively shorter focal length among the two cameras (that is, the first camera) to the zoom factor adjusted by the user (that is, the current zoom factor); it is then fused with the third image to obtain the captured image, so that the fused image matches the zoom factor adjusted by the user.
  • In the embodiment of the present application, no limitation is imposed on the specific manner of making the final captured image satisfy the zoom factor adjusted by the user.
  • In this way, when the electronic device shoots through two cameras and then fuses the separately captured images to obtain a captured image, the electronic device reduces the sharpness of the image with the higher definition among the two acquired images by blurring it. Therefore, the difference in definition between the images acquired by the two cameras, caused by the difference in resolution and noise-reduction capability between the cameras, can be reduced. Furthermore, by fusing two images with a small difference in definition, a captured image with an indistinct fusion boundary and a weak sense of splicing can be obtained.
  • the electronic device with a camera function can be a mobile phone, a tablet computer, a handheld computer, a PC, a personal digital assistant (PDA), a wearable device (such as a smart watch or a smart bracelet), a smart home device (such as a TV), an in-vehicle device (such as a car computer), a smart screen, a game console, an augmented reality (AR)/virtual reality (VR) device, etc.
  • the electronic device is provided with at least two cameras with different focal lengths (that is, two cameras with different viewing angles).
  • the electronic device is provided with a main camera (usually a wide-angle camera), a telephoto camera with a longer focal length than the main camera, and an ultra-wide-angle camera with a shorter focal length than the main camera.
  • FIG. 5 shows a schematic structural diagram of an electronic device provided by an embodiment of the present application. For example, the electronic device shown in FIG. 5 may be a mobile phone.
  • the electronic device may include a processor 510, an external memory interface 520, an internal memory 521, a universal serial bus (universal serial bus, USB) interface 530, a charging management module 540, a power management module 541, a battery 542, Antenna 1, antenna 2, mobile communication module 550, wireless communication module 560, audio module 570, speaker 570A, receiver 570B, microphone 570C, earphone jack 570D, sensor module 580, button 590, motor 591, indicator 592, camera 593, A display screen 594, and a subscriber identification module (subscriber identification module, SIM) card interface 595, etc.
  • the sensor module 580 may include a pressure sensor 580A, a gyroscope sensor 580B, an air pressure sensor 580C, a magnetic sensor 580D, an acceleration sensor 580E, a distance sensor 580F, a proximity light sensor 580G, a fingerprint sensor 580H, a temperature sensor 580J, a touch sensor 580K, an ambient light sensor 580L, a bone conduction sensor 580M, etc.
  • the structure shown in this embodiment does not constitute a specific limitation on the electronic device.
  • the electronic device may include more or fewer components than shown, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 510 may include one or more processing units. For example, the processor 510 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • a controller can be the nerve center and command center of an electronic device.
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 510 for storing instructions and data.
  • the memory in processor 510 is a cache memory.
  • the memory may hold instructions or data that the processor 510 has just used or recycled. If the processor 510 needs to use the instruction or data again, it can be called directly from the memory. Repeated access is avoided, and the waiting time of the processor 510 is reduced, thus improving the efficiency of the system.
  • processor 510 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the wireless communication function of the electronic device can be realized by the antenna 1, the antenna 2, the mobile communication module 550, the wireless communication module 560, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in an electronic device can be used to cover a single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 550 can provide wireless communication solutions including 2G/3G/4G/5G applied to electronic devices.
  • the mobile communication module 550 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 550 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 550 can also amplify the signal modulated by the modem processor, convert it into electromagnetic wave and radiate it through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 550 may be set in the processor 510 .
  • at least part of the functional modules of the mobile communication module 550 and at least part of the modules of the processor 510 may be set in the same device.
  • the wireless communication module 560 can provide wireless communication solutions applied to the electronic device, including wireless local area networks (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 560 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 560 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 510 .
  • the wireless communication module 560 can also receive the signal to be transmitted from the processor 510 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device is coupled to the mobile communication module 550, and the antenna 2 is coupled to the wireless communication module 560, so that the electronic device can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
  • the electronic device realizes the display function through the GPU, the display screen 594, and the application processor.
  • the GPU is a microprocessor for image processing, connected to the display screen 594 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 510 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 594 is used to display images, videos and the like.
  • Display 594 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the electronic device may include 1 or N display screens 594, where N is a positive integer greater than 1.
  • the electronic device can realize the shooting function through ISP, camera 593 , video codec, GPU, display screen 594 and application processor.
  • the electronic device may include 1 or N cameras 593, where N is a positive integer greater than 1.
  • the electronic device may include three cameras, one of which is a main camera, one is a telephoto camera, and one is a super wide-angle camera.
  • the internal memory 521 may be used to store computer-executable program codes including instructions.
  • the processor 510 executes various functional applications and data processing of the electronic device by executing instructions stored in the internal memory 521 .
  • the internal memory 521 may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the electronic device.
  • the internal memory 521 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like.
  • the structure of the electronic device may include fewer or more structures than those shown in Figure 5, which is not limited here.
  • the system architecture of the electronic device may include an application layer, a framework layer, a hardware abstraction layer, a driver layer, a firmware layer, and a hardware layer.
  • the application layer may be used to deploy application programs.
  • a camera application may be deployed in the application layer.
  • the framework layer may be Frame, Frameworks, or another system framework, which is not limited here.
  • the hardware abstraction layer can deploy a unified interface of each piece of hardware.
  • a camera hardware abstraction layer (Camera HAL3) can be deployed in the hardware abstraction layer.
  • the module (camera algorithm module (Libcamera algo)) for realizing the photographing method provided by the embodiment of the present application may also be deployed in the hardware abstraction layer.
  • the driver layer can be used to deploy the driver components of each hardware device.
  • the driver layer can be deployed with a video device driver (V4L2 Driver), an image video processor (IVP) driver or a DSP driver, an NPU driver, a GPU driver, etc.
  • the firmware layer can be used to deploy the firmware of each hardware device.
  • the firmware layer can deploy the Internet of Things firmware (lite-OS FW), so as to drive the image sensor, the time-of-flight (TOF) sensor, the ISP, and so on.
  • the hardware layer includes various hardware provided by the electronic device.
  • the hardware layer may include image sensor, TOF sensor, ISP, IVP or DSP, NPU, GPU, etc.
  • the module (camera algorithm module) implementing the photographing method provided by the embodiment of the present application may be initialized in the hardware abstraction layer when the user opens the camera application deployed in the application layer.
  • the camera application includes a preview interface, and the preview interface includes a first control (or called a shutter control).
  • the camera operation is the user’s first operation on the first control, such as clicking the shutter control, etc.
  • the camera application in the application layer can send the photographing instruction to the image sensor through the framework layer, the camera hardware abstraction layer, the video device driver, and the IoT firmware, so that the image sensor can acquire an image in response to the photographing instruction.
  • Generally, each camera has its own image sensor, and the camera application may send the photographing instruction to the image sensors of the corresponding cameras according to the cameras to be used.
  • For example, if the electronic device needs to use the main camera and the telephoto camera to shoot together, the camera application can send a photographing instruction to the image sensor of the main camera and the image sensor of the telephoto camera respectively; if the electronic device needs to use the main camera and the ultra-wide-angle camera to shoot together, the camera application can send a photographing instruction to the image sensor of the main camera and the image sensor of the ultra-wide-angle camera respectively.
  • the image sensor may send the image to the ISP.
  • After the ISP processes the received images according to the preset method, it can send the two processed images to the camera hardware abstraction layer through the IoT firmware and the video device driver.
  • After the camera hardware abstraction layer receives the two images, it can send them to the camera algorithm module that implements the photographing method of the embodiment of the present application.
  • After the camera algorithm module receives the two images, it can use the corresponding drivers (such as the IVP or DSP driver, the NPU driver, the GPU driver, etc.) to call the corresponding hardware (such as the IVP or DSP, the NPU, the GPU, etc.) to blur the image captured by the camera with the relatively longer focal length among the two images, and to fuse the blurred image with the other image to obtain a captured image. Finally, the camera algorithm module can obtain the captured image from the hardware that performs the fusion, and send it through the camera hardware abstraction layer and the framework layer to the camera application deployed in the application layer, so that the camera application displays and/or stores the received captured image.
  • The following takes as an example an electronic device that is a mobile phone provided with a main camera (a wide-angle camera), a telephoto camera, and an ultra-wide-angle camera, where the zoom factor corresponding to the focal length of the main camera is set to 1.0x, the zoom factor corresponding to the focal length of the ultra-wide-angle camera is set to 0.4x, and the zoom factor corresponding to the focal length of the telephoto camera is set to 3.5x.
  • When the user adjusts the zoom factor to 2.0x-3.5x, the mobile phone uses the main camera and the telephoto camera for shooting; when the user adjusts the zoom factor to 0.6x-0.9x, the mobile phone uses the ultra-wide-angle camera and the main camera for shooting. On this basis, a specific implementation of the photographing method provided in the embodiment of the present application is illustrated below.
  • FIG. 7 shows a schematic flowchart of a photographing method provided by an embodiment of the present application. As shown in FIG. 7, the photographing method may include the following steps S701-S703.
  • When the user takes a picture, the mobile phone can use the corresponding two cameras with different focal lengths to shoot, so as to obtain the images respectively captured by the two cameras.
  • That is, the mobile phone executes the following S701.
  • S701, the mobile phone acquires a first image through a first camera, and acquires a second image through a second camera.
  • The acquisition of the first image by the first camera and the acquisition of the second image by the second camera may be performed at the same time, or may be performed separately at an interval, which is not limited here.
  • the focal lengths of the first camera and the second camera are different.
  • Generally, the focal length of the second camera is greater than the focal length of the first camera (the following takes this as an example). That is, if the first camera is the main camera of the electronic device, the second camera can be the telephoto camera of the electronic device; or, if the first camera is the ultra-wide-angle camera of the electronic device, the second camera can be the main camera of the electronic device. Therefore, the second image captured by the second camera with the relatively longer focal length can be included in the first image captured by the first camera with the relatively shorter focal length, so as to facilitate subsequent fusion of the first image and the second image.
  • the camera combination composed of the first camera and the second camera corresponds to a preset range of zoom factors; that is, different preset ranges correspond to different camera combinations.
  • For example, when the preset range is 2.0x-3.5x, the first camera can be the main camera and the second camera can be the telephoto camera; when the preset range is 0.6x-0.9x, the first camera can be the ultra-wide-angle camera and the second camera can be the main camera.
  • That is, when the current zoom factor is within 2.0x-3.5x, the mobile phone can acquire the first image through the main camera (that is, the first camera is the main camera) and acquire the second image through the telephoto camera (that is, the second camera is the telephoto camera). When the current zoom factor is within 0.6x-0.9x, the mobile phone can acquire the first image through the ultra-wide-angle camera (that is, the first camera is the ultra-wide-angle camera) and acquire the second image through the main camera (that is, the second camera is the main camera).
  • the camera interface of the mobile phone may include a preview interface
  • the preview interface includes a first control (or called a shutter control, a camera control), and the user's camera operation may be the user's first operation on the first control (such as clicking operation, long press operation, etc.).
  • a preview interface is displayed on the mobile phone, and the interface includes a preview frame, a camera control 801 and a zoom control 802 .
  • the preview frame is used to display the current zoom factor and the preview image of the subject in the photographing mode.
  • the camera control 801 is used to trigger the camera action of the mobile phone.
  • the zoom control 802 can be used to adjust the zoom factor, and the current zoom factor can be displayed on the zoom control.
  • a preview image when the zoom factor is 2.5x may be displayed in the preview box.
  • the user can click the camera control 801 to perform a camera operation.
  • the user's photographing operation may also be a pressing operation on a preset key (such as a power key, a volume key, etc.). Therefore, in the embodiment of the present application, there is no restriction on the user's photographing operation, as long as the operation used to trigger the mobile phone to take a photograph is the user's photographing operation.
  • the first image captured by the mobile phone through the first camera may be an image captured by the first camera that matches the zoom factor corresponding to the focal length of the first camera (for example, an image captured by the first camera and processed by the ISP), or may be such an image further processed through digital zooming so that it matches the zoom factor adjusted by the user (that is, the current zoom factor).
  • the second image captured by the mobile phone through the second camera may be an image captured by the second camera that matches the zoom factor corresponding to the focal length of the second camera (for example, an image captured by the second camera and processed by the ISP).
  • When the first image matches the zoom factor corresponding to the focal length of the first camera, the mobile phone can digitally zoom the first image to adjust it to the zoom factor adjusted by the user (that is, the current zoom factor), and then fuse it with the third image to obtain a fused image that matches the zoom factor adjusted by the user, so that the fused image can later be used as the captured image.
  • the mobile phone may then perform the following S702.
  • S702, the mobile phone blurs the second image to obtain a third image.
  • The blur processing can include Gaussian blur, surface blur, box blur, Kawase blur, dual blur, bokeh blur, tilt-shift blur, aperture blur, grainy blur, radial blur, directional blur, etc.; that is, any of these blur types can be used to blur the second image, as in the sketch below.
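  • As an illustration of how a single blur-strength value might parameterize several of the listed blur types (the strength-to-kernel mapping and the bilateral filter as a stand-in for surface blur are assumptions of this sketch):

```python
import cv2
import numpy as np

def blur_image(img: np.ndarray, strength: int,
               kind: str = "gaussian") -> np.ndarray:
    """Apply one of the listed blur types, parameterized by one strength."""
    k = 2 * strength + 1  # odd kernel size derived from the blur strength
    if kind == "gaussian":
        return cv2.GaussianBlur(img, (k, k), 0)
    if kind == "box":
        return cv2.blur(img, (k, k))  # box (mean) blur
    if kind == "surface":
        # Surface blur smooths while preserving edges; a bilateral filter
        # behaves similarly and stands in for it here.
        return cv2.bilateralFilter(img, k, 50, 50)
    raise ValueError(f"unsupported blur type: {kind}")
```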
  • In a possible implementation, the preset rule for blurring the second image may be: determine the corresponding blur strength according to the similarity between the first image and the second image (which may be used to characterize the sharpness difference between the first image and the second image; for example, the higher the similarity, the smaller the sharpness difference, and the lower the similarity, the larger the sharpness difference), and then use a blurring algorithm to blur the second image according to the corresponding blur strength.
  • the similarity between the first image and the second image may be represented by structural similarity (SSIM).
  • The SSIM value of the first image (i.e., image x) and the second image (i.e., image y) can be calculated using the following formula:

    SSIM(x, y) = ((2·μ_x·μ_y + c_1)·(2·σ_xy + c_2)) / ((μ_x² + μ_y² + c_1)·(σ_x² + σ_y² + c_2))

  • where μ_x is the mean of x (such as the first image), μ_y is the mean of y (such as the second image), σ_x² and σ_y² are the variances of x and y, σ_xy is the covariance of x and y, c_1 = (k_1·L)² and c_2 = (k_2·L)² are constants that stabilize the division, and L is the dynamic range of the pixel values (for example, 255 for 8-bit images).
  • SSIM values range from 0 to 1. When the two images are exactly the same, the SSIM value is equal to 1.
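  • For reference, below is a direct single-window implementation of the formula above; library implementations such as skimage.metrics.structural_similarity instead average SSIM over local windows, so their values will differ:

```python
import numpy as np

def global_ssim(x: np.ndarray, y: np.ndarray, L: float = 255.0,
                k1: float = 0.01, k2: float = 0.03) -> float:
    """Single-window SSIM over whole images, mirroring the formula above."""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    c1, c2 = (k1 * L) ** 2, (k2 * L) ** 2
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    return (((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) /
            ((mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)))
```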
  • In a specific implementation, the maximum blur strength and the minimum blur strength can be calibrated based on the SSIM value of the first image and the second image, so as to obtain the correspondence between the SSIM value and the blur strength; the corresponding blur strength can then be determined from the SSIM value of the first image and the second image.
  • Specifically, a low SSIM value can be defined below which the similarity between the first image and the second image is regarded as too low (that is, when the SSIM value is smaller than the low SSIM value, the similarity between the first image and the second image can be determined to be too low), and a high SSIM value can be defined above which the similarity is regarded as high (that is, when the SSIM value is greater than the high SSIM value, the similarity between the first image and the second image can be determined to be high).
  • the maximum blur strength can be calibrated based on a low SSIM value
  • the minimum blur strength can be calibrated based on a high SSIM value.
  • For example, for a pair of images whose SSIM value equals the low SSIM value, the blur strength is adjusted linearly until the image obtained by fusing the first image with the third image (obtained by blurring the second image) shows improved sharpness compared with the first image and the fusion boundary is not obvious; the blur strength at this point can be used as the maximum blur strength.
  • Similarly, for a pair of images whose SSIM value equals the high SSIM value, the blur strength is adjusted linearly until the fused image shows a large increase in definition compared with the first image and the fusion boundary is not obvious; the blur strength at this point can be taken as the minimum blur strength.
  • Furthermore, the blur strength corresponding to SSIM values lower than the above low SSIM value can be set to the maximum blur strength, the blur strength corresponding to SSIM values higher than the above high SSIM value can be set to the minimum blur strength, and SSIM values between the low SSIM value and the high SSIM value can be mapped linearly to blur strengths between the maximum blur strength and the minimum blur strength.
  • For example, if the maximum blur strength calibrated at an SSIM value of 0.25 is 9, and the minimum blur strength calibrated at an SSIM value of 0.38 is 1, the correspondence curve between blur strength and SSIM value shown in FIG. 9 can be obtained (see the sketch below).
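  • A sketch of this piecewise-linear mapping, using the example calibration values from the text (SSIM 0.25 maps to the maximum blur strength 9, SSIM 0.38 to the minimum blur strength 1); the function name and defaults are illustrative:

```python
def blur_strength_from_ssim(ssim: float,
                            low_ssim: float = 0.25,
                            high_ssim: float = 0.38,
                            max_strength: float = 9.0,
                            min_strength: float = 1.0) -> float:
    """Clamp to the maximum strength below the low SSIM value, to the
    minimum strength above the high SSIM value, and interpolate linearly
    in between (similarity is inversely proportional to blur strength)."""
    if ssim <= low_ssim:
        return max_strength
    if ssim >= high_ssim:
        return min_strength
    t = (ssim - low_ssim) / (high_ssim - low_ssim)
    return max_strength + t * (min_strength - max_strength)
```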
  • alternatively, the preset rule for blurring the second image may be to determine the corresponding blur strength based on the sensitivity (ISO) corresponding to the second image, that is, the sensitivity of the second camera when capturing the second image, and then use a blurring algorithm to blur the second image according to the corresponding blur strength. In this way, when the sharpness difference between the first image and the second image is small, the second image is not over-blurred, which would otherwise leave the sharpness of the fused image no better than that of the first image.
  • the sensitivity range can be divided into segments, and the blur strengths corresponding to the different sensitivity segments can then be set, from low to high, following the rule that the higher the sensitivity, the greater the blur strength.
  • once the blur strength would exceed a certain value, that value can be used as the maximum blur strength, so that all higher sensitivity segments correspond to the maximum blur strength; this avoids blurring with a higher strength, which would over-blur the second image and leave the sharpness of the fused image no better than that of the first image.
  • for example, the sensitivity can be divided into segments 100-1000, 1000-2000, 2000-3000, 3000-4000, 4000-5000, 5000-6000, and so on. The corresponding blur strength can then be set to 1 for sensitivity 100-1000, 3 for 1000-2000, 5 for 2000-3000, 7 for 3000-4000, and 9 for 4000-5000.
  • the sensitivity segments and their corresponding blur strengths in the above example can be written as configuration parameters, in which <iso100> and the like index the sensitivity segment and <blur>1</blur> indicates the blur strength corresponding to that segment; a sketch of such a lookup follows.
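  • purely as an illustration, the segment lookup might be sketched as follows (the table values mirror the ISO example above; the names and table structure are assumptions, since the application only shows configuration parameters):

```python
# Upper segment bounds paired with blur strengths, mirroring the ISO
# example above; readings at or above the last bound fall back to the
# maximum blur strength.
ISO_BLUR_SEGMENTS = [
    (1000, 1),  # sensitivity up to 1000 -> blur strength 1
    (2000, 3),  # 1000-2000              -> 3
    (3000, 5),  # 2000-3000              -> 5
    (4000, 7),  # 3000-4000              -> 7
    (5000, 9),  # 4000-5000              -> 9
]

MAX_BLUR = 9  # all higher segments map to the maximum blur strength

def blur_strength_for(value: float, segments=ISO_BLUR_SEGMENTS,
                      max_blur: int = MAX_BLUR) -> int:
    """Return the blur strength for a sensitivity reading (or, with a
    different table, an ambient brightness reading)."""
    for upper, strength in segments:
        if value < upper:
            return strength
    return max_blur
```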
  • alternatively, the preset rule for blurring the second image may be to determine the corresponding blur strength according to the ambient brightness corresponding to the second image, and then use a blurring algorithm to blur the second image according to the corresponding blur strength. In this way, when the sharpness difference between the first image and the second image is small, the second image is not over-blurred, which would otherwise leave the sharpness of the fused image no better than that of the first image.
  • the ambient brightness is usually the average brightness of the ambient light obtained by the mobile phone according to the ambient light measurement.
  • the exposure parameters adopted by the mobile phone can be calculated according to the ambient brightness, that is, the exposure parameters of the image captured by the camera are calculated according to the ambient brightness. Therefore, in the embodiment of the present application, the ambient brightness can be obtained according to the exposure parameters of the second image.
  • the ambient brightness range may likewise be divided into segments, and the blur strengths corresponding to the different ambient brightness segments can be set, from low to high, following the rule that the higher the ambient brightness, the greater the blur strength.
  • once the blur strength would exceed a certain value, that value can be used as the maximum blur strength, so that all higher ambient brightness segments correspond to the maximum blur strength; this avoids blurring with a higher strength, which would over-blur the second image and leave the sharpness of the fused image no better than that of the first image.
  • for example, the ambient brightness can be divided into segments 100-1000, 1000-2000, 2000-3000, 3000-4000, 4000-5000, 5000-6000, and so on. The corresponding blur strength can then be set to 1 for ambient brightness 100-1000, 3 for 1000-2000, 5 for 2000-3000, 7 for 3000-4000, and 9 for 4000-5000.
  • the ambient brightness segments and their corresponding blur strengths in the above example can be written as configuration parameters, in which <lv100> and the like index the ambient brightness segment and <blur>1</blur> indicates the blur strength corresponding to that segment; the same lookup applies, as shown after this item.
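  • for example, reusing blur_strength_for from the sketch above with a brightness-specific table (values again mirroring the example):

```python
LV_BLUR_SEGMENTS = [(1000, 1), (2000, 3), (3000, 5), (4000, 7), (5000, 9)]

strength = blur_strength_for(2500, LV_BLUR_SEGMENTS)  # brightness 2500 -> 5
```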
  • blur parameters corresponding to different blur strengths may be determined according to specific blur algorithms.
  • taking Gaussian blur as an example, its formula can be as follows:

$$G(u, v) = \frac{1}{2\pi\sigma^2} e^{-\frac{u^2 + v^2}{2\sigma^2}}$$

  • where u and v are the horizontal and vertical offsets from the center pixel (u^2 + v^2 is the square of the blur radius), and \sigma is the standard deviation of the normal distribution.
  • for example, when the blur strength is 3, the corresponding Gaussian matrix of the Gaussian blur can be determined, and the blurring process can be performed according to that Gaussian matrix.
  • when the blur strength takes another value, a correspondingly different Gaussian matrix is determined, and the blurring process can likewise be performed according to that Gaussian matrix; a sketch of constructing such a matrix follows.
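  • as an illustration, a minimal sketch of constructing such a Gaussian matrix (how a given blur strength maps to the matrix radius and σ is not specified here; treating the blur strength as the radius, with σ derived from it, is an assumption made only for this sketch):

```python
import numpy as np

def gaussian_kernel(radius: int, sigma: float) -> np.ndarray:
    """Sample G(u, v) on a (2*radius + 1) x (2*radius + 1) grid and
    normalize so the weights sum to 1 (normalization cancels the
    1/(2*pi*sigma^2) factor)."""
    ax = np.arange(-radius, radius + 1)
    u, v = np.meshgrid(ax, ax)
    g = np.exp(-(u ** 2 + v ** 2) / (2.0 * sigma ** 2))
    return g / g.sum()

# e.g. blur strength 3 read as radius 3 (an assumption for illustration):
kernel = gaussian_kernel(radius=3, sigma=1.5)
```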
  • when merging the third image with the first image, the third image can be superimposed on the part of the first image that overlaps with the content of the third image, or the third image can directly replace the part of the first image that overlaps with the content of the third image, or other algorithms can be used for fusion; this is not limited here. A sketch of the replacement variant follows.
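  • as an illustration of the direct-replacement variant, a minimal sketch (the top/left offsets are assumed to come from an alignment step that the application does not detail here):

```python
import numpy as np

def fuse_by_replacement(first: np.ndarray, third: np.ndarray,
                        top: int, left: int) -> np.ndarray:
    """Replace the region of the first image that overlaps with the
    content of the third image."""
    fused = first.copy()
    h, w = third.shape[:2]
    fused[top:top + h, left:left + w] = third
    return fused
```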
  • the fused image (that is, the fourth image) may be saved as a captured image.
  • the camera algorithm module can call the IVP, DSP or CPU according to the above-mentioned embodiments, and send the SSIM value of the first image and the second image together with the correspondence curve between the SSIM value and the blur strength (or the configuration parameters relating the sensitivity of the second image to the blur strength, or the configuration parameters relating the ambient brightness of the second image to the blur strength) to the IVP or DSP, so that the IVP or DSP can determine from these parameters the blur strength with which to blur the second image.
  • the IVP or DSP can return the determined blur strength to the camera algorithm module, and the camera algorithm module can send the determined blur strength together with the first image and the second image to the GPU, so as to call the GPU to blur the second image according to the determined blur strength to obtain a third image, and to fuse the first image with the third image to obtain a captured image.
  • the GPU can return the captured image to the camera algorithm module, and the camera algorithm module can send it to the camera application deployed in the application layer through the camera hardware abstraction layer and the framework layer, so that the camera application can display and/or store the received captured image.
  • the camera algorithm module can also flexibly call IVP, DSP, CPU, GPU, etc.
  • the fused image of the third image and the first image is used as the final captured image.
  • if the first image is an image captured by the first camera that matches the zoom factor corresponding to the focal length of the first camera, then, when fusion is performed, the first image can first be digitally zoomed to the zoom factor adjusted by the user (that is, the current zoom factor) and then fused with the third image, so that the fused image matches the zoom factor adjusted by the user and serves as the final captured image. There is therefore no limitation on when the image is adjusted through digital zoom so that the final captured image matches the zoom factor adjusted by the user.
  • to sum up, when the electronic device takes pictures with two cameras and then fuses the separately captured images to obtain the captured image, the electronic device blurs the image with the higher resolution among the two acquired images, in a way that reduces the sharpness of that image. The difference in definition between the images captured by the two cameras, caused by differences in resolution and noise-reduction capability between the cameras, can thus be reduced. Furthermore, by fusing two images with a small difference in definition, a captured image with an indistinct fusion boundary and a weak sense of splicing can be obtained.
  • the embodiments of the present application further provide a photographing device.
  • the apparatus may be applied to the above-mentioned electronic equipment to implement the methods in the foregoing embodiments.
  • the functions of the device can be realized by hardware, or by hardware executing corresponding software.
  • the hardware or software includes one or more modules corresponding to the above-mentioned functions.
  • FIG. 10 shows a schematic structural diagram of a photographing device. As shown in FIG. 10, the device includes: a processing module 1001, a display module 1002, and the like.
  • the processing module 1001 and the display module 1002 may cooperate to implement the related methods in the foregoing embodiments.
  • the division of units in the above device is only a division of logical functions; in actual implementation, they may be fully or partially integrated into one physical entity, or may be physically separated.
  • the units in the device can all be implemented in the form of software called by the processing element; they can also be implemented in the form of hardware; some units can also be implemented in the form of software called by the processing element, and some units can be implemented in the form of hardware.
  • each unit can be a separately established processing element, or can be integrated into a certain chip of the device. A unit can also be stored in the memory in the form of a program, which is called by a certain processing element of the device to execute the function of that unit. In addition, all or part of these units can be integrated together, or implemented independently.
  • the processing element described here may also be referred to as a processor, and may be an integrated circuit with a signal processing capability. In the process of implementation, each step of the above method or each unit above may be implemented by an integrated logic circuit of hardware in the processor element or implemented in the form of software called by the processing element.
  • the units in the above device may be one or more integrated circuits configured to implement the above method, for example: one or more ASICs, or one or more DSPs, or one or more FPGAs, or a combination of at least two of these integrated circuit forms.
  • the processing element can be a general-purpose processor, such as a CPU or other processors that can call programs.
  • these units can be integrated together and implemented in the form of a system-on-a-chip (SOC).
  • the units of the above apparatus for implementing each corresponding step in the above method may be implemented in the form of a program scheduled by a processing element.
  • the apparatus may include a processing element and a storage element, and the processing element invokes a program stored in the storage element to execute the methods described in the above method embodiments.
  • the storage element may be a storage element on the same chip as the processing element, that is, an on-chip storage element.
  • the program for executing the above method may be stored in a storage element on a different chip from the processing element, that is, an off-chip storage element.
  • the processing element invokes or loads a program from the off-chip storage element to the on-chip storage element, so as to invoke and execute the methods described in the above method embodiments.
  • an embodiment of the present application may also provide an apparatus, such as an electronic device, which may include a processor, and a memory configured to store instructions executable by the processor.
  • when the processor is configured to execute the above instructions, the electronic device is caused to implement the photographing method performed by the electronic device in the foregoing embodiments.
  • the memory can be located inside the electronic device or outside the electronic device.
  • the processor includes one or more.
  • the units of the apparatus that implement the steps in the above method may be configured as one or more processing elements, and these processing elements may be provided on the corresponding electronic device described above; the processing elements may be integrated circuits, for example: one or more ASICs, or one or more DSPs, or one or more FPGAs, or a combination of these types of integrated circuits. These integrated circuits can be integrated together to form a chip.
  • an embodiment of the present application further provides a chip system, and the chip system may be applied to the above-mentioned electronic device.
  • the chip system includes one or more interface circuits and one or more processors; the interface circuits and the processors are interconnected through lines; the processor receives computer instructions from the memory of the electronic device through the interface circuits and executes them, so as to implement the methods related to the electronic device in the above method embodiments.
  • this is a possible implementation of the present application.
  • an embodiment of the present application further provides a computer program product, including computer instructions to be executed by an electronic device, such as the above-mentioned electronic device.
  • the disclosed devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division, and there may be other division methods in actual implementation.
  • multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • the unit described as a separate component may or may not be physically separated, and a component displayed as a unit may be one physical unit or multiple physical units, that is, it may be located in one place or distributed to multiple different places. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units can be implemented in the form of hardware or in the form of software functional units.
  • if the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a readable storage medium.
  • the technical solution may be embodied in the form of a software product, such as a program. The software product is stored in a program product, such as a computer-readable storage medium, and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or some of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
  • the embodiments of the present application may also provide a computer-readable storage medium on which computer program instructions are stored.
  • when the computer program instructions are executed by the electronic device, the electronic device is caused to implement the photographing method described in the foregoing method embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The present application relates to an image capture method and device, and relates to the field of image capture. The present application solves the problem of an obvious fusion boundary in the captured image when two images acquired by two cameras are fused to obtain the captured image. The specific solution is as follows: an electronic device starts a camera; a preview interface is displayed, the preview interface comprising a first control; a first operation on the first control is detected; in response to the first operation, a first camera acquires a first image and a second camera acquires a second image, the resolution of the second image being higher than that of the first image; the second image is blurred to obtain a third image; the third image is fused with the first image to obtain a fourth image; and the fourth image is saved.
PCT/CN2022/093613 2021-08-11 2022-05-18 Image capture method and device WO2023016025A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110919953.8A CN113810598B (zh) 2021-08-11 2021-08-11 Photographing method, electronic device and storage medium
CN202110919953.8 2021-08-11

Publications (1)

Publication Number Publication Date
WO2023016025A1 true WO2023016025A1 (fr) 2023-02-16

Family

ID=78893436

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/093613 WO2023016025A1 (fr) Image capture method and device

Country Status (2)

Country Link
CN (1) CN113810598B (fr)
WO (1) WO2023016025A1 (fr)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113810598B (zh) * 2021-08-11 2022-11-22 荣耀终端有限公司 Photographing method, electronic device and storage medium
CN116723394B (zh) * 2022-02-28 2024-05-10 荣耀终端有限公司 Multi-camera strategy scheduling method and related device
CN114782296B (zh) * 2022-04-08 2023-06-09 荣耀终端有限公司 Image fusion method, apparatus and storage medium
CN116245741B (zh) * 2022-06-28 2023-11-17 荣耀终端有限公司 Image processing method and related device
CN116051368B (zh) * 2022-06-29 2023-10-20 荣耀终端有限公司 Image processing method and related device
CN115348390A (zh) * 2022-08-23 2022-11-15 维沃移动通信有限公司 Shooting method and shooting apparatus
CN116051435B (zh) * 2022-08-23 2023-11-07 荣耀终端有限公司 Image fusion method and electronic device
CN117835077A (zh) * 2022-09-27 2024-04-05 华为终端有限公司 Shooting method, electronic device and medium
CN117729445A (zh) * 2024-02-07 2024-03-19 荣耀终端有限公司 Image processing method, electronic device and computer-readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180048832A1 (en) * 2015-10-06 2018-02-15 Light Labs Inc. Methods and apparatus for facilitating selective blurring of one or more image portions
CN110290300A (zh) * 2019-06-28 2019-09-27 Oppo广东移动通信有限公司 Device imaging method and apparatus, storage medium, and electronic device
CN112995467A (zh) * 2021-02-05 2021-06-18 深圳传音控股股份有限公司 Image processing method, mobile terminal and storage medium
CN113012085A (zh) * 2021-03-18 2021-06-22 维沃移动通信有限公司 Image processing method and apparatus
CN113810598A (zh) * 2021-08-11 2021-12-17 荣耀终端有限公司 Photographing method and device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107959778B (zh) * 2017-11-30 2019-08-20 Oppo广东移动通信有限公司 Dual-camera-based imaging method and apparatus
CN112188096A (zh) * 2020-09-27 2021-01-05 北京小米移动软件有限公司 Photographing method and apparatus, terminal, and storage medium

Also Published As

Publication number Publication date
CN113810598B (zh) 2022-11-22
CN113810598A (zh) 2021-12-17

Similar Documents

Publication Publication Date Title
WO2023016025A1 (fr) Image capture method and device
WO2020073959A1 (fr) Image capture method and electronic device
CN114092364B (zh) Image processing method and related device
CN112150399A (zh) Wide-dynamic-range-based image enhancement method and electronic device
WO2022262344A1 (fr) Photographing method and electronic device
CN115601244B (zh) Image processing method and apparatus, and electronic device
US20240119566A1 (en) Image processing method and apparatus, and electronic device
CN113660408B (zh) Video shooting anti-shake method and apparatus
CN112929558B (zh) Image processing method and electronic device
CN116347224B (zh) Shooting frame rate control method, electronic device, chip system, and readable storage medium
CN113630558B (zh) Camera exposure method and electronic device
CN116033275B (zh) Automatic exposure method, electronic device, and computer-readable storage medium
CN114466134A (zh) Method for generating an HDR image and electronic device
WO2022267506A1 (fr) Image fusion method, electronic device, storage medium, and computer program product
CN117061861B (zh) Shooting method, chip system, and electronic device
CN116668862B (zh) Image processing method and electronic device
CN115631250B (zh) Image processing method and electronic device
CN117395495B (zh) Image processing method and electronic device
WO2023160178A1 (fr) Exposure control method and electronic device
CN116051368B (zh) Image processing method and related device
CN116723264B (zh) Method, device, and storage medium for determining target position information
CN117135468B (zh) Image processing method and electronic device
CN115526786B (zh) Image processing method and related device
CN117528265A (zh) Video shooting method and electronic device
WO2024067071A1 (fr) Photographing method, electronic device and medium

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE